== Properties ==

=== Induced partial ordering ===
For arbitrary square matrices <math>M,</math> <math>N</math> we write <math>M \ge N</math> if <math>M - N \ge 0,</math> i.e., <math>M - N</math> is positive semi-definite. This defines a [[partially ordered set|partial ordering]] on the set of all square matrices. One can similarly define a strict partial ordering <math>M > N.</math> The ordering is called the [[Loewner order]].

=== Inverse of positive definite matrix ===
Every positive definite matrix is [[invertible matrix|invertible]] and its inverse is also positive definite.<ref>{{harvtxt|Horn|Johnson|2013}}, p. 438, Theorem 7.2.1</ref> If <math>M \geq N > 0</math> then <math>N^{-1} \geq M^{-1} > 0.</math><ref>{{harvtxt|Horn|Johnson|2013}}, p. 495, Corollary 7.7.4(a)</ref> Moreover, by the [[min-max theorem]], the {{mvar|k}}th largest eigenvalue of <math>M</math> is greater than or equal to the {{mvar|k}}th largest eigenvalue of <math>N.</math>

=== Scaling ===
If <math>M</math> is positive definite and <math>r > 0</math> is a real number, then <math>r M</math> is positive definite.<ref name="HJobs713">{{harvtxt|Horn|Johnson|2013}}, p. 430, Observation 7.1.3</ref>

=== Addition ===
* If <math>M</math> and <math>N</math> are positive-definite, then the sum <math>M + N</math> is also positive-definite.<ref name="HJobs713"/>
* If <math>M</math> and <math>N</math> are positive-semidefinite, then the sum <math>M + N</math> is also positive-semidefinite.
* If <math>M</math> is positive-definite and <math>N</math> is positive-semidefinite, then the sum <math>M + N</math> is also positive-definite.

=== Multiplication ===
* If <math>M</math> and <math>N</math> are positive definite, then the products <math>M N M</math> and <math>N M N</math> are also positive definite. If <math>M N = N M,</math> then <math>M N</math> is also positive definite.
* If <math>M</math> is positive semidefinite, then <math>A^* M A</math> is positive semidefinite for any (possibly rectangular) matrix <math>A.</math> If <math>M</math> is positive definite and <math>A</math> has full column rank, then <math>A^* M A</math> is positive definite (see the sketch below).<ref>{{harvtxt|Horn|Johnson|2013}}, p. 431, Observation 7.1.8</ref>
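The order-reversal of inverses and the congruence property can be checked numerically via the eigenvalue characterization of definiteness. The following [[NumPy]] sketch is illustrative only; the randomly generated test matrices and the helper <code>random_pd</code> are not from the cited references.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def random_pd(n):
    """Random positive definite matrix A A^T + I (illustrative construction)."""
    a = rng.standard_normal((n, n))
    return a @ a.T + np.eye(n)

n = 4
N = random_pd(n)
M = N + random_pd(n)   # M - N is positive definite, so M >= N > 0 in the Loewner order

# Order reversal of inverses: N^{-1} - M^{-1} is positive semidefinite.
print(np.linalg.eigvalsh(np.linalg.inv(N) - np.linalg.inv(M)).min() >= -1e-10)

# Congruence: A^T M A is positive definite when A has full column rank
# (a random tall matrix has full column rank with probability 1).
A = rng.standard_normal((n, 2))
print(np.linalg.eigvalsh(A.T @ M @ A).min() > 0)
</syntaxhighlight>

Both statements print <code>True</code> up to the stated numerical tolerance.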
=== Trace ===
The diagonal entries <math>m_{ii}</math> of a positive-semidefinite matrix are real and non-negative. As a consequence, the [[trace (linear algebra)|trace]] satisfies <math>\operatorname{tr}(M) \ge 0.</math> Furthermore,<ref>{{harvtxt|Horn|Johnson|2013}}, p. 430</ref> since every principal sub-matrix (in particular, 2-by-2) is positive semidefinite,
<math display="block">\left|m_{ij}\right| \leq \sqrt{m_{ii}m_{jj}} \quad \forall i, j</math>
and thus, when <math>n \ge 1,</math>
<math display="block">\max_{i,j} \left|m_{ij}\right| \leq \max_i m_{ii}.</math>

An <math>n \times n</math> Hermitian matrix <math>M</math> is positive definite if it satisfies the following trace inequalities:<ref>{{cite journal | title=Bounds for Eigenvalues using Traces | last1=Wolkowicz | first1=Henry | last2=Styan | first2=George P.H. | journal=Linear Algebra and Its Applications | issue=29 | publisher=Elsevier | year=1980 | volume=29 | pages=471–506 | doi=10.1016/0024-3795(80)90258-X }}</ref>
<math display="block">\operatorname{tr}(M) > 0 \quad \mathrm{and} \quad \frac{(\operatorname{tr}(M))^2}{\operatorname{tr}(M^2)} > n-1 .</math>

Another important result is that for any positive-semidefinite matrices <math>M</math> and <math>N,</math> <math>\operatorname{tr}(MN) \ge 0 .</math> This follows by writing <math>\operatorname{tr}(MN) = \operatorname{tr}(M^\frac{1}{2}N M^\frac{1}{2}).</math> The matrix <math>M^\frac{1}{2}N M^\frac{1}{2}</math> is positive-semidefinite and thus has non-negative eigenvalues, whose sum, the trace, is therefore also non-negative.

=== Hadamard product ===
If <math>M, N \geq 0,</math> although <math>M N</math> is not necessarily positive semidefinite, the [[Hadamard product (matrices)|Hadamard product]] is: <math>M \circ N \geq 0</math> (this result is often called the [[Schur product theorem]]).<ref>{{harvtxt|Horn|Johnson|2013}}, p. 479, Theorem 7.5.3</ref>

Regarding the Hadamard product of two positive semidefinite matrices <math>M = (m_{ij}) \geq 0,</math> <math>N \geq 0,</math> there are two notable inequalities:
* Oppenheim's inequality: <math>\det(M \circ N) \geq \det (N) \prod\nolimits_i m_{ii}.</math><ref>{{harvtxt|Horn|Johnson|2013}}, p. 509, Theorem 7.8.16</ref>
* <math>\det(M \circ N) \geq \det(M) \det(N).</math><ref name=styan1973>{{cite journal |last=Styan |first=G.P. |year=1973 |title=Hadamard products and multivariate statistical analysis |journal=[[Linear Algebra and Its Applications]] |volume=6 |pages=217–240 |doi=10.1016/0024-3795(73)90023-2 }}, Corollary 3.6, p. 227</ref>

=== Kronecker product ===
If <math>M, N \geq 0,</math> although <math>M N</math> is not necessarily positive semidefinite, the [[Kronecker product]] is: <math>M \otimes N \geq 0.</math>

=== Frobenius product ===
If <math>M, N \geq 0,</math> although <math>M N</math> is not necessarily positive semidefinite, the [[Frobenius inner product]] is non-negative: <math>M : N \geq 0</math> (Lancaster–Tismenetsky, ''The Theory of Matrices'', p. 218).
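The trace and product results above admit a short numerical check. The sketch below is illustrative only; it uses [[NumPy]], the eigenvalue test for semidefiniteness, and randomly generated matrices, none of which appear in the cited references.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

def random_psd(n):
    """Random positive semidefinite matrix A A^T (illustrative construction)."""
    a = rng.standard_normal((n, n))
    return a @ a.T

M, N = random_psd(4), random_psd(4)

# tr(MN) >= 0 for positive semidefinite M and N.
print(np.trace(M @ N) >= -1e-10)

# Schur product theorem: the Hadamard (entrywise) product is again PSD.
print(np.linalg.eigvalsh(M * N).min() >= -1e-10)   # '*' is entrywise in NumPy

# Oppenheim's inequality: det(M o N) >= det(N) * prod_i m_ii.
print(np.linalg.det(M * N) >= np.linalg.det(N) * np.prod(np.diag(M)) - 1e-8)

# The Kronecker product of positive semidefinite matrices is positive semidefinite.
print(np.linalg.eigvalsh(np.kron(M, N)).min() >= -1e-8)
</syntaxhighlight>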
=== Convexity ===
The set of positive semidefinite symmetric matrices is [[convex set|convex]]. That is, if <math>M</math> and <math>N</math> are positive semidefinite, then for any <math>\alpha</math> between {{math|0}} and {{math|1}}, <math>\alpha M + \left(1 - \alpha\right) N</math> is also positive semidefinite. For any vector <math>\mathbf{x}</math>:
<math display="block">\mathbf{x}^\mathsf{T} \left(\alpha M + \left(1 - \alpha\right)N\right)\mathbf{x} = \alpha \mathbf{x}^\mathsf{T} M\mathbf{x} + (1 - \alpha) \mathbf{x}^\mathsf{T} N\mathbf{x} \geq 0.</math>
This convexity is what makes [[semidefinite programming]] a convex optimization problem, so any local optimum is a global optimum.

=== Relation with cosine ===
The positive-definiteness of a matrix <math>A</math> expresses that the angle <math>\theta</math> between any nonzero vector <math>\mathbf{x}</math> and its image <math>A \mathbf{x}</math> is always <math>-\pi / 2 < \theta < +\pi / 2:</math>
<math display="block">\cos\theta = \frac{ \mathbf{x}^\mathsf{T} A\mathbf{x} }{\lVert \mathbf{x} \rVert \lVert A\mathbf{x} \rVert} = \frac{\langle \mathbf{x}, A\mathbf{x} \rangle}{\lVert \mathbf{x} \rVert \lVert A\mathbf{x} \rVert} , \quad \theta = \theta(\mathbf{x}, A \mathbf{x}) \equiv \widehat{\left(\mathbf{x},A\mathbf{x}\right)} \equiv</math> the angle between <math>\mathbf{x}</math> and <math>A\mathbf{x}.</math>

=== Further properties ===
# If <math>M</math> is a symmetric [[Toeplitz matrix]], i.e. the entries <math>m_{ij}</math> are given as a function of their absolute index differences: <math>m_{ij} = h(|i-j|),</math> and the ''strict'' inequality <math display="inline">\sum_{j \neq 0} \left|h(j)\right| < h(0)</math> holds, then <math>M</math> is ''strictly'' positive definite.
# Let <math>M > 0</math> and <math>N</math> Hermitian. If <math>MN + NM \ge 0</math> (resp., <math>MN + NM > 0</math>) then <math>N \ge 0</math> (resp., <math>N > 0</math>).<ref>{{Cite book | title=Positive Definite Matrices | last=Bhatia | first=Rajendra | publisher=Princeton University Press | year=2007 | isbn=978-0-691-12918-1 | location=Princeton, New Jersey | pages=8 }}</ref>
# If <math>M > 0</math> is real, then there is a <math>\delta > 0</math> such that <math>M > \delta I,</math> where <math>I</math> is the [[identity matrix]].
# If <math>M_k</math> denotes the leading <math>k \times k</math> principal submatrix, then <math>\det\left(M_k\right)/\det\left(M_{k-1}\right)</math> is the {{mvar|k}}th pivot during [[LU decomposition]] (see the sketch after this list).
# A Hermitian matrix is negative definite if its {{mvar|k}}th order leading [[principal minor]] is negative when <math>k</math> is odd, and positive when <math>k</math> is even.
# If <math>M</math> is a real positive definite matrix, then there exists a positive real number <math>m</math> such that for every vector <math>\mathbf{v},</math> <math>\mathbf{v}^\mathsf{T} M\mathbf{v} \geq m\|\mathbf{v}\|_2^{2}.</math>
# A Hermitian matrix is positive semidefinite if and only if all of its principal minors are nonnegative. It is however not enough to consider the leading principal minors only, as can be seen from the diagonal matrix with entries {{math|0}} and {{math|−1}}.
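The following sketch (illustrative only; it uses [[NumPy]] and relies on the fact that a positive definite matrix needs no row exchanges, so each pivot equals the square of the corresponding diagonal entry of the Cholesky factor) checks properties 4 and 7 above numerically:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal((5, 5))
M = a @ a.T + np.eye(5)        # real positive definite (illustrative construction)

# Property 4: det(M_k)/det(M_{k-1}) is the k-th pivot.  For a positive definite
# matrix no row exchanges occur, and each pivot equals the square of the
# corresponding diagonal entry of the Cholesky factor.
L = np.linalg.cholesky(M)
minors = [np.linalg.det(M[:k, :k]) for k in range(1, 6)]
pivots = [minors[0]] + [minors[k] / minors[k - 1] for k in range(1, 5)]
print(np.allclose(pivots, np.diag(L) ** 2))   # True

# Property 7: nonnegative *leading* principal minors alone do not imply positive
# semidefiniteness: diag(0, -1) has leading principal minors 0 and 0, yet it has
# a negative eigenvalue.
D = np.diag([0.0, -1.0])
print(np.linalg.det(D[:1, :1]), np.linalg.det(D))   # both are (numerically) zero
print(np.linalg.eigvalsh(D).min())                  # -1.0, so not positive semidefinite
</syntaxhighlight>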
=== Block matrices and submatrices ===
A positive <math>2n \times 2n</math> matrix may also be defined by [[block matrix|blocks]]:
<math display="block">M = \begin{bmatrix} A & B \\ C & D \end{bmatrix}</math>
where each block is <math>n \times n.</math> By applying the positivity condition, it immediately follows that <math>A</math> and <math>D</math> are Hermitian, and <math>C = B^*.</math>

We have that <math>\mathbf{z}^* M\mathbf{z} \ge 0</math> for all complex <math>\mathbf{z}</math> (with strict inequality for nonzero <math>\mathbf{z}</math> when <math>M</math> is positive definite), and in particular for <math>\mathbf{z} = [\mathbf{v}, 0]^\mathsf{T} .</math> Then
<math display="block">\begin{bmatrix} \mathbf{v}^* & 0 \end{bmatrix} \begin{bmatrix} A & B \\ B^* & D \end{bmatrix} \begin{bmatrix} \mathbf{v} \\ 0 \end{bmatrix} = \mathbf{v}^* A\mathbf{v} \ge 0.</math>
A similar argument can be applied to <math>D,</math> and thus we conclude that both <math>A</math> and <math>D</math> must be positive definite. The argument can be extended to show that any [[Matrix_(mathematics)#Submatrix|principal submatrix]] of <math>M</math> is itself positive definite.

Converse results can be proved with stronger conditions on the blocks, for instance, using the [[Schur complement#Conditions for positive definiteness and semi-definiteness|Schur complement]].

=== Local extrema ===
A general [[quadratic form]] <math>f(\mathbf{x})</math> on <math>n</math> real variables <math>x_1, \ldots, x_n</math> can always be written as <math>\mathbf{x}^\mathsf{T} M \mathbf{x}</math> where <math>\mathbf{x}</math> is the column vector with those variables, and <math>M</math> is a symmetric real matrix. Therefore, the matrix being positive definite means that <math>f</math> has a unique minimum (zero) when <math>\mathbf{x}</math> is zero, and is strictly positive for any other <math>\mathbf{x}.</math>

More generally, a twice-differentiable real function <math>f</math> on <math>n</math> real variables has a strict local minimum at a point where its [[gradient]] is zero and its [[Hessian matrix|Hessian]] (the matrix of all second derivatives) is positive definite; at any local minimum the Hessian is necessarily positive semi-definite. Similar statements can be made for negative definite and semi-definite matrices.

=== Covariance ===
In [[statistics]], the [[covariance matrix]] of a [[multivariate probability distribution]] is always positive semi-definite; and it is positive definite unless one variable is an exact linear function of the others. Conversely, every positive semi-definite matrix is the covariance matrix of some multivariate distribution.
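As an illustration of the covariance property (a hypothetical NumPy example, not drawn from any reference): a sample covariance matrix is always positive semi-definite, and an exact linear dependence among the variables makes it singular, hence not positive definite.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)

# 200 samples of a 3-variable distribution in which the third variable is an
# exact linear function of the first two.
x = rng.standard_normal((200, 2))
data = np.column_stack([x, x[:, 0] + 2.0 * x[:, 1]])

cov = np.cov(data, rowvar=False)       # sample covariance matrix (rows = observations)
eig = np.linalg.eigvalsh(cov)
print(eig.min() >= -1e-12)             # True: positive semi-definite
print(np.isclose(eig.min(), 0.0))      # True: singular, hence not positive definite
</syntaxhighlight>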