Matrix decomposition
== Decompositions based on eigenvalues and related concepts ==

=== Eigendecomposition ===
{{main|Eigendecomposition (matrix)}}
*Also called ''[[Spectral decomposition (Matrix)|spectral decomposition]]''.
*Applicable to: [[square matrix]] ''A'' with linearly independent eigenvectors (not necessarily distinct eigenvalues).
*Decomposition: <math>A=VDV^{-1}</math>, where ''D'' is a [[diagonal matrix]] formed from the [[eigenvalue]]s of ''A'', and the columns of ''V'' are the corresponding [[eigenvector]]s of ''A''.
*Existence: An ''n''-by-''n'' matrix ''A'' always has ''n'' (complex) eigenvalues, which can be ordered (in more than one way) to form an ''n''-by-''n'' diagonal matrix ''D'' and a corresponding matrix of nonzero columns ''V'' that satisfies the [[eigenvalue equation]] <math>AV=VD</math>. The matrix <math>V</math> is invertible if and only if the ''n'' eigenvectors are [[Linear independence|linearly independent]] (that is, each eigenvalue has [[geometric multiplicity]] equal to its [[algebraic multiplicity]]). A sufficient (but not necessary) condition for this is that all the eigenvalues are distinct (in this case the geometric and algebraic multiplicities are all equal to 1).
*Comment: One can always normalize the eigenvectors to have length one (see the definition of the eigenvalue equation).
*Comment: Every [[normal matrix]] ''A'' (that is, a matrix for which <math>AA^*=A^*A</math>, where <math>A^*</math> is the [[conjugate transpose]] of ''A'') can be eigendecomposed. For a [[normal matrix]] ''A'' (and only for a normal matrix), the eigenvectors can also be made orthonormal (<math>VV^*=I</math>) and the eigendecomposition reads as <math>A=VDV^*</math>.
*:In particular, all [[Unitary matrix|unitary]], [[Hermitian matrix|Hermitian]], or [[Skew-Hermitian matrix|skew-Hermitian]] matrices (in the real-valued case, all [[Orthogonal matrix|orthogonal]], [[Symmetric matrix|symmetric]], or [[Skew-symmetric matrix|skew-symmetric]] matrices, respectively) are normal and therefore possess this property.
*Comment: For any real [[symmetric matrix]] ''A'', the eigendecomposition always exists and can be written as <math>A=VDV^\mathsf{T}</math>, where both ''D'' and ''V'' are real-valued.
*Comment: The eigendecomposition is useful for understanding the solution of a system of linear ordinary differential equations or linear difference equations. For example, the difference equation <math>x_{t+1}=Ax_t</math> starting from the initial condition <math>x_0=c</math> is solved by <math>x_t = A^tc</math>, which is equivalent to <math>x_t = VD^tV^{-1}c</math>, where ''V'' and ''D'' are the matrices formed from the eigenvectors and eigenvalues of ''A''. Since ''D'' is diagonal, raising it to the power ''t'', <math>D^t</math>, just involves raising each element on the diagonal to the power ''t''. This is much easier to do and to understand than raising ''A'' to the power ''t'', since ''A'' is usually not diagonal.

=== Jordan decomposition ===
The [[Jordan normal form]] and the [[Jordan–Chevalley decomposition]]:
*Applicable to: [[square matrix]] ''A''.
*Comment: The Jordan normal form generalizes the eigendecomposition to cases where there are repeated eigenvalues and the matrix cannot be diagonalized; the Jordan–Chevalley decomposition does this without choosing a basis.

=== Schur decomposition ===
{{main|Schur decomposition}}
*Applicable to: [[square matrix]] ''A''.
*Decomposition (complex version): <math>A=UTU^*</math>, where ''U'' is a [[unitary matrix]], <math>U^*</math> is the [[conjugate transpose]] of ''U'', and ''T'' is an [[upper triangular]] matrix called the complex [[Schur form]], which has the [[eigenvalue]]s of ''A'' along its diagonal.
*Comment: If ''A'' is a [[normal matrix]], then ''T'' is diagonal and the Schur decomposition coincides with the spectral decomposition.

=== Real Schur decomposition ===
*Applicable to: [[square matrix]] ''A''.
*Decomposition: This is a version of the Schur decomposition where <math>V</math> and <math>S</math> contain only real numbers. One can always write <math>A=VSV^\mathsf{T}</math>, where ''V'' is a real [[orthogonal matrix]], <math>V^\mathsf{T}</math> is the [[matrix transpose|transpose]] of ''V'', and ''S'' is a [[block matrix|block upper triangular]] matrix called the real [[Schur form]]. The blocks on the diagonal of ''S'' are of size 1×1 (in which case they represent real eigenvalues) or 2×2 (in which case they are derived from [[complex conjugate]] eigenvalue pairs).

=== QZ decomposition ===
{{main|QZ decomposition}}
*Also called: ''generalized Schur decomposition''.
*Applicable to: [[square matrix|square matrices]] ''A'' and ''B''.
*Comment: There are two versions of this decomposition: complex and real.
*Decomposition (complex version): <math>A=QSZ^*</math> and <math>B=QTZ^*</math>, where ''Q'' and ''Z'' are [[unitary matrix|unitary matrices]], the * superscript represents the [[conjugate transpose]], and ''S'' and ''T'' are [[upper triangular]] matrices.
*Comment: In the complex QZ decomposition, the ratios of the diagonal elements of ''S'' to the corresponding diagonal elements of ''T'', <math>\lambda_i = S_{ii}/T_{ii}</math>, are the generalized [[eigenvalue]]s that solve the [[Eigendecomposition of a matrix#Additional topics|generalized eigenvalue problem]] <math>A \mathbf{v} = \lambda B \mathbf{v}</math> (where <math>\lambda</math> is an unknown scalar and '''v''' is an unknown nonzero vector).
*Decomposition (real version): <math>A=QSZ^\mathsf{T}</math> and <math>B=QTZ^\mathsf{T}</math>, where ''A'', ''B'', ''Q'', ''Z'', ''S'', and ''T'' are matrices containing only real numbers.
*:In this case ''Q'' and ''Z'' are [[orthogonal matrix|orthogonal matrices]], the ''T'' superscript represents [[matrix transpose|transposition]], and ''S'' and ''T'' are [[block matrix|block upper triangular]] matrices. The blocks on the diagonal of ''S'' and ''T'' are of size 1×1 or 2×2.

=== Takagi's factorization ===
*Applicable to: square, complex, symmetric matrix ''A''.
*Decomposition: <math>A=VDV^\mathsf{T}</math>, where ''D'' is a real nonnegative [[diagonal matrix]] and ''V'' is [[unitary matrix|unitary]]. <math>V^\mathsf{T}</math> denotes the [[matrix transpose]] of ''V''.
*Comment: The diagonal elements of ''D'' are the nonnegative square roots of the eigenvalues of <math>AA^*=VD^2V^{-1}</math>.
*Comment: ''V'' may be complex even if ''A'' is real.
*Comment: This is not a special case of the eigendecomposition (see above), which uses <math>V^{-1}</math> instead of <math>V^\mathsf{T}</math>. Moreover, if ''A'' is not real, it is not Hermitian and the form using <math>V^*</math> also does not apply.

=== Singular value decomposition ===
{{main|Singular value decomposition}}
*Applicable to: ''m''-by-''n'' matrix ''A''.
*Decomposition: <math>A=UDV^*</math>, where ''D'' is a nonnegative [[diagonal matrix]], and ''U'' and ''V'' satisfy <math>U^*U = I,\ V^*V = I</math>. Here <math>V^*</math> is the [[conjugate transpose]] of ''V'' (or simply the [[matrix transpose|transpose]], if ''V'' contains only real numbers), and ''I'' denotes the identity matrix (of the appropriate dimension).
*Comment: The diagonal elements of ''D'' are called the [[singular value]]s of ''A''.
*Comment: Like the eigendecomposition above, the singular value decomposition involves finding basis directions along which matrix multiplication is equivalent to scalar multiplication, but it has greater generality since the matrix under consideration need not be square.
*Uniqueness: The singular values of <math>A</math> are always uniquely determined.
*:<math>U</math> and <math>V</math> need not be unique in general.

=== Scale-invariant decompositions ===
Refers to variants of existing matrix decompositions, such as the SVD, that are invariant with respect to diagonal scaling.
*Applicable to: ''m''-by-''n'' matrix ''A''.
*Unit-Scale-Invariant Singular-Value Decomposition: <math>A=DUSV^*E</math>, where ''S'' is a unique nonnegative [[diagonal matrix]] of scale-invariant singular values, ''U'' and ''V'' are [[unitary matrix|unitary matrices]], <math>V^*</math> is the [[conjugate transpose]] of ''V'', and ''D'' and ''E'' are positive diagonal matrices.
*Comment: Analogous to the SVD, except that the diagonal elements of ''S'' are invariant with respect to left and/or right multiplication of ''A'' by arbitrary nonsingular diagonal matrices, whereas the singular values of the standard SVD are invariant with respect to left and/or right multiplication of ''A'' by arbitrary unitary matrices.
*Comment: An alternative to the standard SVD when invariance is required with respect to diagonal rather than unitary transformations of ''A''.
*Uniqueness: The scale-invariant singular values of <math>A</math> (given by the diagonal elements of ''S'') are always uniquely determined. The diagonal matrices ''D'' and ''E'' and the unitary matrices ''U'' and ''V'' are not necessarily unique in general.
*Comment: The ''U'' and ''V'' matrices here are not the same as those from the SVD. Analogous scale-invariant decompositions can be derived from other matrix decompositions; for example, to obtain scale-invariant eigenvalues.<ref>{{citation|last=Uhlmann |first=J.K. |title=A Generalized Matrix Inverse that is Consistent with Respect to Diagonal Transformations |journal=SIAM Journal on Matrix Analysis and Applications |year=2018 |volume=39 |issue=2 |pages=781–800 |doi=10.1137/17M113890X }}</ref><ref>{{citation|last=Uhlmann |first=J.K.
|title=A Rank-Preserving Generalized Matrix Inverse for Consistency with Respect to Similarity |journal=IEEE Control Systems Letters |issn=2475-1456 |year=2018 |volume=3 |pages=91–95 |doi=10.1109/LCSYS.2018.2854240 |arxiv=1804.07334 |s2cid=5031440 }}</ref>

=== Hessenberg decomposition ===
*Applicable to: [[square matrix]] ''A''.
*Decomposition: <math>A=PHP^*</math>, where <math>H</math> is an upper [[Hessenberg matrix]] and <math>P</math> is a [[unitary matrix]].
*Comment: Often the first step in computing the Schur decomposition.

=== Complete orthogonal decomposition ===
{{main|Complete orthogonal decomposition}}
*Also known as: ''UTV decomposition'', ''ULV decomposition'', ''URV decomposition''.
*Applicable to: ''m''-by-''n'' matrix ''A''.
*Decomposition: <math>A=UTV^*</math>, where ''T'' is a [[triangular matrix]], and ''U'' and ''V'' are [[unitary matrix|unitary matrices]].
*Comment: Similar to the singular value decomposition and to the Schur decomposition.
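Several of the decompositions above can be computed numerically with standard linear-algebra libraries. The following sketch (using NumPy, with an arbitrarily chosen example matrix) illustrates the eigendecomposition <math>A=VDV^{-1}</math>, its use in solving the difference equation <math>x_{t+1}=Ax_t</math> via <math>x_t=VD^tV^{-1}c</math>, and the singular value decomposition <math>A=UDV^*</math>:

```python
import numpy as np

# Example matrix (chosen arbitrarily; it has distinct eigenvalues 5 and 2,
# so it is diagonalizable).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigendecomposition: A = V D V^{-1}.
w, V = np.linalg.eig(A)            # w: eigenvalues, columns of V: eigenvectors
D = np.diag(w)
A_eig = V @ D @ np.linalg.inv(V)   # reconstructs A

# Difference equation x_{t+1} = A x_t with x_0 = c:
# x_t = V D^t V^{-1} c, where D^t just raises each diagonal entry to the power t.
c = np.array([1.0, 0.0])
t = 5
x_t = (V @ np.diag(w**t) @ np.linalg.inv(V)) @ c

# Singular value decomposition: A = U diag(s) V* (valid for any m-by-n matrix).
U, s, Vh = np.linalg.svd(A)
A_svd = U @ np.diag(s) @ Vh        # reconstructs A

print(np.allclose(A, A_eig))                               # True
print(np.allclose(x_t, np.linalg.matrix_power(A, t) @ c))  # True
print(np.allclose(A, A_svd))                               # True
```

The Schur, real Schur, QZ, and Hessenberg decompositions are available in SciPy as `scipy.linalg.schur`, `scipy.linalg.qz`, and `scipy.linalg.hessenberg`.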