Definite matrix
== Decomposition ==
{{See also|Gram matrix}}
Let <math>M</math> be an <math>n \times n</math> [[Hermitian matrix]]. <math>M</math> is positive semidefinite if and only if it can be decomposed as a product <math display="block">M = B^* B</math> of a matrix <math>B</math> with its [[conjugate transpose]]. When <math>M</math> is real, <math>B</math> can be real as well and the decomposition can be written as <math display="block">M = B^\mathsf{T} B.</math> <math>M</math> is positive definite if and only if such a decomposition exists with <math>B</math> [[Invertible matrix|invertible]]. More generally, <math>M</math> is positive semidefinite with rank <math>k</math> if and only if a decomposition exists with a <math>k \times n</math> matrix <math>B</math> of full row rank (i.e. of rank <math>k</math>). Moreover, for any decomposition <math>M = B^* B,</math> <math>\operatorname{rank}(M) = \operatorname{rank}(B).</math><ref>{{harvtxt|Horn|Johnson|2013}}, p. 440, Theorem 7.2.7</ref>
{{math proof | proof = If <math>M = B^* B,</math> then <math>x^* M x = (x^* B^*) (B x) = \|B x \|^2 \geq 0,</math> so <math>M</math> is positive semidefinite. If moreover <math>B</math> is invertible then the inequality is strict for <math>x \neq 0,</math> so <math>M</math> is positive definite. If <math>B</math> is <math>k \times n</math> of rank <math>k,</math> then <math>\operatorname{rank}(M) = \operatorname{rank}(B^*) = k.</math> In the other direction, suppose <math>M</math> is positive semidefinite.
Since <math>M</math> is Hermitian, it has an [[Eigendecomposition of a matrix#Decomposition for special matrices|eigendecomposition]] <math>M = Q^{-1} D Q</math> where <math>Q</math> is [[unitary matrix|unitary]] and <math>D</math> is a diagonal matrix whose entries are the eigenvalues of <math>M.</math> Since <math>M</math> is positive semidefinite, the eigenvalues are non-negative real numbers, so one can define <math>D^{\frac{1}{2}}</math> as the diagonal matrix whose entries are the non-negative square roots of the eigenvalues. Then <math>M = Q^{-1} D Q = Q^* D Q = Q^* D^{\frac{1}{2}} D^{\frac{1}{2}} Q = Q^* D^{\frac{1}{2}*} D^{\frac{1}{2}} Q = B^* B</math> for <math>B = D^{\frac{1}{2}} Q.</math> If moreover <math>M</math> is positive definite, then the eigenvalues are (strictly) positive, so <math>D^{\frac{1}{2}}</math> is invertible, and hence <math>B = D^{\frac{1}{2}} Q</math> is invertible as well. If <math>M</math> has rank <math>k,</math> then it has exactly <math>k</math> positive eigenvalues and the others are zero, hence in <math>B = D^{\frac{1}{2}} Q</math> all but <math>k</math> rows are zero. Deleting the zero rows gives a <math>k \times n</math> matrix <math>B'</math> such that <math>B'^* B' = B^* B = M.</math> }} The columns <math>b_1, \dots, b_n</math> of <math>B</math> can be seen as vectors in the [[complex vector space|complex]] vector space <math>\mathbb{C}^k</math> or the [[real vector space]] <math>\mathbb{R}^k,</math> respectively. Then the entries of <math>M</math> are [[inner product]]s (that is, [[dot product]]s in the real case) of these vectors: <math display="block">M_{ij} = \langle b_i, b_j\rangle.</math> In other words, a Hermitian matrix <math>M</math> is positive semidefinite if and only if it is the [[Gram matrix]] of some vectors <math>b_1, \dots, b_n.</math> It is positive definite if and only if it is the Gram matrix of some [[linearly independent]] vectors.
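The construction above can be sketched numerically. A minimal illustration, assuming NumPy (the eigenvector matrix <math>V</math> returned by ''eigh'' satisfies <math>M = V D V^*,</math> so it plays the role of <math>Q^*</math> in the notation above):

```python
import numpy as np

# Illustrative sketch: build a PSD matrix, recover a decomposition M = B^* B
# from the eigendecomposition, and check the Gram-matrix interpretation.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
M = A.T @ A                       # real symmetric positive semidefinite

w, V = np.linalg.eigh(M)          # M = V @ diag(w) @ V.T with w >= 0
w = np.clip(w, 0.0, None)         # guard against tiny negative round-off
B = np.sqrt(w)[:, None] * V.T     # B = D^{1/2} Q with Q = V.T

assert np.allclose(B.T @ B, M)    # M = B^T B

# Gram-matrix view: M[i, j] is the dot product of columns b_i and b_j of B.
i, j = 0, 2
assert np.isclose(M[i, j], B[:, i] @ B[:, j])
```

Zeroing out the rows of <math>B</math> corresponding to zero eigenvalues (when <math>M</math> is rank-deficient) leaves the product <math>B^* B</math> unchanged, matching the rank-<math>k</math> statement in the proof.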
In general, the rank of the Gram matrix of vectors <math>b_1, \dots, b_n</math> equals the dimension of the space [[Linear span|spanned]] by these vectors.<ref>{{harvtxt|Horn|Johnson|2013}}, p. 441, Theorem 7.2.10</ref>

=== Uniqueness up to unitary transformations ===
The decomposition is not unique: if <math>M = B^* B</math> for some <math>k \times n</math> matrix <math>B</math> and if <math>Q</math> is any [[unitary matrix|unitary]] <math>k \times k</math> matrix (meaning <math>Q^* Q = Q Q^* = I</math>), then <math>M = B^* B = B^* Q^* Q B = A^* A</math> for <math>A = Q B.</math> However, this is the only way in which two decompositions can differ: the decomposition is unique up to [[unitary transformation]]s. More formally, if <math>A</math> is a <math>k \times n</math> matrix and <math>B</math> is an <math>\ell \times n</math> matrix such that <math>A^* A = B^* B,</math> then there is an <math>\ell \times k</math> matrix <math>Q</math> with orthonormal columns (meaning <math>Q^* Q = I_{k \times k}</math>) such that <math>B = Q A.</math><ref>{{harvtxt|Horn|Johnson|2013}}, p. 452, Theorem 7.3.11</ref> When <math>\ell = k</math> this means <math>Q</math> is [[unitary matrix|unitary]]. This statement has an intuitive geometric interpretation in the real case: let the columns of <math>A</math> and <math>B</math> be the vectors <math>a_1, \dots, a_n</math> and <math>b_1, \dots, b_n</math> in <math>\mathbb{R}^k.</math> A real unitary matrix is an [[orthogonal matrix]], which describes a [[rigid transformation]] (an isometry of Euclidean space <math>\mathbb{R}^k</math>) preserving the 0 point (i.e. [[Rotation matrix|rotations]] and [[Reflection matrix|reflections]], without translations). Therefore, the dot products <math>a_i \cdot a_j</math> and <math>b_i \cdot b_j</math> are equal if and only if some rigid transformation of <math>\mathbb{R}^k</math> transforms the vectors <math>a_1, \dots, a_n</math> to <math>b_1, \dots, b_n</math> (and 0 to 0).
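A small numerical sketch of the non-uniqueness, assuming NumPy: multiplying a factor <math>A</math> by a random orthogonal matrix <math>Q</math> leaves the Gram matrix unchanged, and for invertible <math>A</math> the relating matrix can be recovered as <math>B A^{-1}.</math>

```python
import numpy as np

# Illustrative sketch: two factors of the same M differ by an orthogonal matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
B = Q @ A

assert np.allclose(A.T @ A, B.T @ B)              # same Gram matrix M

# Conversely, when A is invertible, the relating matrix is B @ A^{-1}.
Q_rec = B @ np.linalg.inv(A)
assert np.allclose(Q_rec.T @ Q_rec, np.eye(3))    # it is orthogonal
assert np.allclose(Q_rec, Q)                      # and equals the Q used above
```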
=== Square root ===
{{main|Square root of a matrix}}
A Hermitian matrix <math>M</math> is positive semidefinite if and only if there is a positive semidefinite matrix <math>B</math> (in particular <math>B</math> is Hermitian, so <math>B^* = B</math>) satisfying <math>M = B B.</math> This matrix <math>B</math> is unique,<ref>{{harvtxt|Horn|Johnson|2013}}, p. 439, Theorem 7.2.6 with <math>k = 2</math></ref> is called the ''non-negative [[square root of a matrix|square root]]'' of <math>M,</math> and is denoted with <math>B = M^\frac{1}{2}.</math> When <math>M</math> is positive definite, so is <math>M^\frac{1}{2},</math> hence it is also called the ''positive square root'' of <math>M.</math> The non-negative square root should not be confused with other decompositions <math>M = B^* B.</math> Some authors use the name ''square root'' and <math>M^\frac{1}{2}</math> for any such decomposition, or specifically for the [[Cholesky decomposition]], or any decomposition of the form <math>M = B B;</math> others only use it for the non-negative square root. If <math>M \succ N \succ 0</math> then <math>M^\frac{1}{2} \succ N^\frac{1}{2} \succ 0.</math>

=== Cholesky decomposition ===
A Hermitian positive semidefinite matrix <math>M</math> can be written as <math>M = L L^*,</math> where <math>L</math> is lower triangular with non-negative diagonal (equivalently <math>M = B^* B</math> where <math>B = L^*</math> is upper triangular); this is the [[Cholesky decomposition]]. If <math>M</math> is positive definite, then the diagonal of <math>L</math> is positive and the Cholesky decomposition is unique. Conversely, if <math>L</math> is lower triangular with non-negative diagonal then <math>L L^*</math> is positive semidefinite. The Cholesky decomposition is especially useful for efficient numerical calculations.
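The contrast between the unique non-negative square root and the Cholesky factor can be sketched numerically, assuming NumPy (here <math>M^\frac{1}{2}</math> is computed from the eigendecomposition rather than a library routine):

```python
import numpy as np

# Illustrative sketch: M^{1/2} (symmetric PSD) versus the Cholesky factor L
# (lower triangular); both reproduce M, but they are different matrices.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
M = A.T @ A + np.eye(3)                   # positive definite

w, V = np.linalg.eigh(M)
M_half = V @ np.diag(np.sqrt(w)) @ V.T    # the unique PSD square root
assert np.allclose(M_half @ M_half, M)
assert np.allclose(M_half, M_half.T)      # M^{1/2} is itself symmetric PSD

L = np.linalg.cholesky(M)                 # lower triangular, positive diagonal
assert np.allclose(L @ L.T, M)
assert not np.allclose(L, M_half)         # two different factorizations of M
```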
A closely related decomposition is the [[Cholesky decomposition#LDL decomposition|LDL decomposition]], <math>M = L D L^*,</math> where <math>D</math> is diagonal and <math>L</math> is [[Triangular matrix#Unitriangular matrix|lower unitriangular]].

=== Williamson theorem ===
Any <math>2n \times 2n</math> real symmetric positive definite matrix <math>M</math> can be diagonalized by symplectic (real) matrices. More precisely, [[Williamson theorem|Williamson's theorem]] ensures the existence of a symplectic <math>S \in \mathbf{Sp}(2n,\mathbb{R})</math> and a diagonal real positive <math>D \in \mathbb{R}^{n \times n}</math> such that <math>S M S^\mathsf{T} = D \oplus D.</math>