Spectral theorem
== Finite-dimensional case ==
<!-- This section is linked from [[Singular value decomposition]] -->

=== Hermitian maps and Hermitian matrices ===
We begin by considering a [[Hermitian matrix]] on <math>\mathbb{C}^n</math> (but the following discussion will be adaptable to the more restrictive case of [[symmetric matrix|symmetric matrices]] on {{nowrap|<math>\mathbb{R}^n</math>).}} We consider a [[Hermitian operator|Hermitian map]] {{math|''A''}} on a finite-dimensional [[complex number|complex]] [[inner product space]] {{math|''V''}} endowed with a [[Definite bilinear form|positive definite]] [[sesquilinear form|sesquilinear]] [[inner product]] <math>\langle\cdot,\cdot\rangle</math>. The Hermitian condition on <math>A</math> means that for all {{math|''x'', ''y'' ∈ ''V''}},
<math display="block"> \langle A x, y \rangle = \langle x, A y \rangle.</math>

An equivalent condition is that {{math|1=''A''<sup>*</sup> = ''A''}}, where {{math|''A''<sup>*</sup>}} is the [[Hermitian conjugate]] of {{math|''A''}}. In the case that {{math|''A''}} is identified with a Hermitian matrix, the matrix of {{math|''A''<sup>*</sup>}} is equal to its [[conjugate transpose]]. (If {{math|''A''}} is a [[real matrix]], then this is equivalent to {{math|1=''A''<sup>T</sup> = ''A''}}, that is, {{math|''A''}} is a [[symmetric matrix]].)

This condition implies that all eigenvalues of a Hermitian map are real: to see this, it is enough to apply it to the case when {{math|1=''x'' = ''y''}} is an eigenvector. (Recall that an [[eigenvector]] of a linear map {{math|''A''}} is a non-zero vector {{math|''v''}} such that {{math|1=''Av'' = ''λv''}} for some scalar {{math|''λ''}}. The value {{math|''λ''}} is the corresponding [[eigenvalue]], and the eigenvalues are the roots of the [[characteristic polynomial]].)

{{math theorem | math_statement = If {{math|''A''}} is Hermitian on {{math|''V''}}, then there exists an [[orthonormal basis]] of {{math|''V''}} consisting of eigenvectors of {{math|''A''}}. 
Each eigenvalue of {{math|''A''}} is real.}}

We provide a sketch of a proof for the case where the underlying field of scalars is the [[complex number]]s. By the [[fundamental theorem of algebra]], applied to the [[characteristic polynomial]] of {{math|''A''}}, there is at least one complex eigenvalue {{math|''λ''<sub>1</sub>}} and corresponding eigenvector {{math|''v''<sub>1</sub>}}, which must by definition be non-zero. Then since
<math display="block">\lambda_1 \langle v_1, v_1 \rangle = \langle A (v_1), v_1 \rangle = \langle v_1, A(v_1) \rangle = \bar\lambda_1 \langle v_1, v_1 \rangle,</math>
we find that {{math|''λ''<sub>1</sub>}} is real.

Now consider the space <math>\mathcal{K}^{n-1} = \operatorname{span}(v_1)^\perp</math>, the [[orthogonal complement]] of {{math|''v''<sub>1</sub>}}. By Hermiticity, <math>\mathcal{K}^{n-1}</math> is an [[invariant subspace]] of {{math|''A''}}. To see this, consider any <math>k \in \mathcal{K}^{n-1}</math>, so that <math>\langle k, v_1 \rangle = 0</math> by definition of <math>\mathcal{K}^{n-1}</math>. For invariance, we need to check that <math>A(k) \in \mathcal{K}^{n-1}</math>. This holds because <math>\langle A(k), v_1 \rangle = \langle k, A(v_1) \rangle = \langle k, \lambda_1 v_1 \rangle = 0</math>. Applying the same argument to the restriction of {{math|''A''}} to <math>\mathcal{K}^{n-1}</math> shows that it has at least one real eigenvalue <math>\lambda_2</math> with a corresponding eigenvector <math>v_2 \in \mathcal{K}^{n-1}</math>, which is orthogonal to <math>v_1</math>. This can be used to build another invariant subspace <math>\mathcal{K}^{n-2} = \operatorname{span}(\{v_1, v_2\})^\perp</math>. Finite induction then finishes the proof.

The matrix representation of {{math|''A''}} in a basis of eigenvectors is diagonal, and by construction the proof gives a basis of mutually orthogonal eigenvectors; by choosing them to be unit vectors one obtains an orthonormal basis of eigenvectors.
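The statement of the theorem can be checked numerically. The following sketch uses NumPy (the random test matrix is our own illustrative choice, not taken from the article): it builds a Hermitian matrix as {{math|''B'' + ''B''<sup>*</sup>}} and verifies that its eigenvalues are real and that its eigenvectors form an orthonormal basis.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary illustrative complex matrix; B + B* is always Hermitian.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T

# Hermitian condition: A equals its conjugate transpose.
assert np.allclose(A, A.conj().T)

# np.linalg.eigh is specialized to Hermitian matrices: it returns
# real eigenvalues and an orthonormal basis of eigenvectors.
eigenvalues, U = np.linalg.eigh(A)

assert np.all(np.isreal(eigenvalues))          # every eigenvalue is real
assert np.allclose(U.conj().T @ U, np.eye(4))  # eigenvectors are orthonormal
assert np.allclose(A @ U, U * eigenvalues)     # A v_i = λ_i v_i, columnwise
```

Note that `eigh` exploits Hermiticity and guarantees real eigenvalues and orthonormal eigenvectors, whereas the general-purpose `eig` does not.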
{{math|''A''}} can be written as a linear combination of pairwise orthogonal projections, called its '''spectral decomposition'''. Let
<math display="block">V_{\lambda} = \{v \in V: A v = \lambda v\}</math>
be the eigenspace corresponding to an eigenvalue <math>\lambda</math>. Note that the definition does not depend on any choice of specific eigenvectors. In general, {{math|''V''}} is the orthogonal direct sum of the spaces <math>V_{\lambda}</math>, where <math>\lambda</math> ranges over the [[Spectrum of a matrix|spectrum]] of <math>A</math>. When the matrix being decomposed is Hermitian, the spectral decomposition is a special case of the [[Schur decomposition]] (see the proof in the case of [[#Normal matrices|normal matrices]] below).

=== Spectral decomposition and the singular value decomposition ===
The spectral decomposition is a special case of the [[singular value decomposition]], which states that any matrix <math>A \in \mathbb{C}^{m \times n}</math> can be expressed as <math>A = U\Sigma V^{*}</math>, where <math>U \in \mathbb{C}^{m \times m}</math> and <math>V \in \mathbb{C}^{n \times n}</math> are [[unitary matrices]] and <math>\Sigma \in \mathbb{R}^{m \times n}</math> is a diagonal matrix with non-negative entries. The diagonal entries of <math>\Sigma</math> are uniquely determined by <math>A</math> and are known as the [[singular values]] of <math>A</math>. If <math>A</math> is Hermitian and positive semi-definite, then <math>A^* = A</math> gives <math>V \Sigma U^* = U \Sigma V^*</math>, and one may take <math>U = V</math>, so that the singular value decomposition coincides with the spectral decomposition; in general, the singular values of a Hermitian matrix are the absolute values of its eigenvalues.

=== Normal matrices ===
{{main|Normal matrix}}
The spectral theorem extends to a more general class of matrices. Let {{math|''A''}} be an operator on a finite-dimensional inner product space. {{math|''A''}} is said to be [[normal matrix|normal]] if {{math|1=''A''<sup>*</sup>''A'' = ''AA''<sup>*</sup>}}. One can show that {{math|''A''}} is normal if and only if it is unitarily diagonalizable, using the [[Schur decomposition]]. 
By the [[Schur decomposition]], any square matrix can be written as {{math|1=''A'' = ''UTU''<sup>*</sup>}}, where {{math|''U''}} is unitary and {{math|''T''}} is [[upper triangular]]. If {{math|''A''}} is normal, then one sees that {{math|1=''TT''<sup>*</sup> = ''T''<sup>*</sup>''T''}}. Therefore, {{math|''T''}} must be diagonal, since a normal upper triangular matrix is diagonal (see [[normal matrix#Consequences|normal matrix]]). The converse is obvious.

In other words, {{math|''A''}} is normal if and only if there exists a [[unitary matrix]] {{math|''U''}} such that
<math display="block">A = U D U^*,</math>
where {{math|''D''}} is a [[diagonal matrix]]. The entries of the diagonal of {{math|''D''}} are then the [[eigenvalue]]s of {{math|''A''}}, and the column vectors of {{math|''U''}} are the corresponding eigenvectors, which are orthonormal. Unlike the Hermitian case, the entries of {{math|''D''}} need not be real.
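A small numerical sketch of the normal case (using NumPy; the 90° rotation matrix is our own illustrative choice): a real rotation matrix is normal but not Hermitian, and it is unitarily diagonalizable with orthonormal eigenvectors and non-real eigenvalues.

```python
import numpy as np

# A 90-degree rotation: normal (A A* = A* A) but not Hermitian.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A is normal
assert not np.allclose(A, A.conj().T)               # A is not Hermitian

# For a normal matrix with distinct eigenvalues, the eigenvectors
# returned by np.linalg.eig are (up to rounding) orthonormal.
eigenvalues, U = np.linalg.eig(A)
D = np.diag(eigenvalues)

assert np.allclose(U.conj().T @ U, np.eye(2))  # U is unitary
assert np.allclose(U @ D @ U.conj().T, A)      # A = U D U*
assert not np.all(np.isreal(eigenvalues))      # eigenvalues are ±i, not real
```

This illustrates the contrast with the Hermitian case above: the diagonalization is still by a unitary matrix, but the diagonal entries are complex.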