==Properties==

===Main diagonal values are real===
The entries on the [[main diagonal]] (top left to bottom right) of any Hermitian matrix are [[real number|real]].

{{math proof|1=
By definition of the Hermitian matrix
<math display=block>H_{ij} = \overline{H}_{ji},</math>
so for {{math|1=''i'' = ''j''}} this gives <math>H_{ii} = \overline{H}_{ii},</math> and a number equals its own complex conjugate only if its imaginary part is zero.
}}

Only the [[main diagonal]] entries are necessarily real; Hermitian matrices can have arbitrary complex-valued entries in their [[off-diagonal element]]s, as long as diagonally opposite entries are complex conjugates.

===Symmetric===
A matrix that has only real entries is [[symmetric matrix|symmetric]] [[if and only if]] it is a Hermitian matrix. A real symmetric matrix is thus a special case of a Hermitian matrix.

{{math proof|1=
<math>H_{ij} = \overline{H}_{ji}</math> by definition. Thus <math>H_{ij} = H_{ji}</math> (matrix symmetry) if and only if <math>H_{ij} = \overline{H}_{ij}</math> (<math>H_{ij}</math> is real).
}}

Consequently, if a real anti-symmetric matrix is multiplied by a real multiple of the imaginary unit <math>i,</math> it becomes Hermitian.

===Normal===
Every Hermitian matrix is a [[normal matrix]]. That is to say, <math>AA^\mathsf{H} = A^\mathsf{H}A.</math>

{{math proof|1=<math>A = A^\mathsf{H},</math> so <math>AA^\mathsf{H} = AA = A^\mathsf{H}A.</math>}}

===Diagonalizable===
The finite-dimensional [[spectral theorem]] says that any Hermitian matrix can be [[diagonalizable matrix|diagonalized]] by a [[unitary matrix]], and that the resulting diagonal matrix has only real entries. This implies that all [[eigenvalue]]s of a Hermitian matrix {{mvar|A}} with dimension {{mvar|n}} are real, and that {{mvar|A}} has {{mvar|n}} linearly independent [[eigenvector]]s. Moreover, a Hermitian matrix has [[orthogonal]] eigenvectors for distinct eigenvalues. Even if there are degenerate eigenvalues, it is always possible to find an [[orthogonal basis]] of {{math|'''C'''<sup>''n''</sup>}} consisting of {{mvar|n}} eigenvectors of {{mvar|A}}.

===Sum of Hermitian matrices===
The sum of any two Hermitian matrices is Hermitian.

{{math proof|1=
<math display="block">(A + B)_{ij} = A_{ij} + B_{ij} = \overline{A}_{ji} + \overline{B}_{ji} = \overline{(A + B)}_{ji},</math>
as claimed.
}}

===Inverse is Hermitian===
The [[inverse matrix|inverse]] of an invertible Hermitian matrix is Hermitian as well.

{{math proof|1=
If <math>A^{-1}A = I,</math> then <math>I = I^\mathsf{H} = \left(A^{-1}A\right)^\mathsf{H} = A^\mathsf{H}\left(A^{-1}\right)^\mathsf{H} = A \left(A^{-1}\right)^\mathsf{H},</math> so <math>A^{-1} = \left(A^{-1}\right)^\mathsf{H}</math> as claimed.
}}

===Associative product of Hermitian matrices===
The [[matrix multiplication|product]] of two Hermitian matrices {{mvar|A}} and {{mvar|B}} is Hermitian if and only if {{math|1=''AB'' = ''BA''}}.

{{math proof|1=
<math display="block">(AB)^\mathsf{H} = \overline{(AB)^\mathsf{T}} = \overline{B^\mathsf{T} A^\mathsf{T}} = \overline{B^\mathsf{T}} \ \overline{A^\mathsf{T}} = B^\mathsf{H} A^\mathsf{H} = BA.</math>
Thus <math>(AB)^\mathsf{H} = AB</math> [[if and only if]] <math>AB = BA.</math> In particular, {{math|''A''<sup>''n''</sup>}} is Hermitian if {{mvar|A}} is Hermitian and {{mvar|n}} is an integer.
}}
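A simple illustration: the Hermitian matrices
<math display=block>A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \quad\text{and}\quad B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}</math>
do not commute, and their product
<math display=block>AB = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \ne \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} = BA = (AB)^\mathsf{H}</math>
is therefore not Hermitian.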
===''ABA'' Hermitian===
If ''A'' and ''B'' are Hermitian, then ''ABA'' is also Hermitian.

{{math proof|1=
<math display="block">(ABA)^\mathsf{H} = (A(BA))^\mathsf{H} = (BA)^\mathsf{H}A^\mathsf{H} = A^\mathsf{H}B^\mathsf{H}A^\mathsf{H} = ABA</math>
}}

==={{math|v<sup>H</sup>''A''v}} is real for complex {{math|v}}===
For an arbitrary complex-valued vector {{math|'''v'''}}, the product <math>\mathbf{v}^\mathsf{H} A \mathbf{v}</math> is real: it is a scalar equal to its own conjugate transpose, <math>\mathbf{v}^\mathsf{H} A \mathbf{v} = \left(\mathbf{v}^\mathsf{H} A \mathbf{v}\right)^\mathsf{H}.</math> This is especially important in quantum physics, where Hermitian matrices are operators that measure properties of a system, e.g. total [[Spin (physics)|spin]], which have to be real.

===Complex Hermitian matrices form a vector space over {{math|'''R'''}}===
The Hermitian complex {{mvar|n}}-by-{{mvar|n}} matrices do not form a [[vector space]] over the [[complex number]]s, {{math|'''C'''}}, since the identity matrix {{math|''I''<sub>''n''</sub>}} is Hermitian, but {{math|''i'' ''I''<sub>''n''</sub>}} is not. However, the complex Hermitian matrices ''do'' form a vector space over the [[real numbers]] {{math|'''R'''}}. In the {{math|2''n''<sup>2</sup>}}-[[dimension of a vector space|dimensional]] vector space of complex {{math|''n'' × ''n''}} matrices over {{math|'''R'''}}, the complex Hermitian matrices form a subspace of dimension {{math|''n''<sup>2</sup>}}. If {{math|''E''<sub>''jk''</sub>}} denotes the {{mvar|n}}-by-{{mvar|n}} matrix with a {{math|1}} in the {{math|''j'',''k''}} position and zeros elsewhere, a basis (orthonormal with respect to the Frobenius inner product) can be described as follows:
<math display=block>E_{jj} \text{ for } 1 \leq j \leq n \quad (n \text{ matrices})</math>
together with the set of matrices of the form
<math display=block>\frac{1}{\sqrt{2}}\left(E_{jk} + E_{kj}\right) \text{ for } 1 \leq j < k \leq n \quad \left( \frac{n^2 - n}{2} \text{ matrices} \right)</math>
and the matrices
<math display=block>\frac{i}{\sqrt{2}}\left(E_{jk} - E_{kj}\right) \text{ for } 1 \leq j < k \leq n \quad \left( \frac{n^2 - n}{2} \text{ matrices} \right)</math>
where <math>i</math> denotes the [[imaginary unit]], <math>i = \sqrt{-1}~.</math>

For example, the four [[Pauli matrices]] form a complete basis for the vector space of all complex 2-by-2 Hermitian matrices over {{math|'''R'''}}.

===Eigendecomposition===
If {{mvar|n}} orthonormal eigenvectors <math>\mathbf{u}_1, \dots, \mathbf{u}_n</math> of a Hermitian matrix are chosen and written as the columns of the matrix {{mvar|U}}, then one [[Eigendecomposition of a matrix|eigendecomposition]] of {{mvar|A}} is <math>A = U \Lambda U^\mathsf{H}</math> where <math>U U^\mathsf{H} = I = U^\mathsf{H} U</math> and therefore
<math display=block>A = \sum_j \lambda_j \mathbf{u}_j \mathbf{u}_j^\mathsf{H},</math>
where <math>\lambda_j</math> are the eigenvalues on the diagonal of the diagonal matrix <math>\Lambda.</math>
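For instance, the Hermitian matrix
<math display=block>A = \begin{pmatrix} 2 & i \\ -i & 2 \end{pmatrix}</math>
has one eigendecomposition
<math display=block>A = U \Lambda U^\mathsf{H}, \qquad U = \frac{1}{\sqrt{2}}\begin{pmatrix} -i & i \\ 1 & 1 \end{pmatrix}, \qquad \Lambda = \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix},</math>
so its eigenvalues {{math|1}} and {{math|3}} are real, the columns of {{mvar|U}} are orthonormal eigenvectors, and <math>A = 1\,\mathbf{u}_1 \mathbf{u}_1^\mathsf{H} + 3\,\mathbf{u}_2 \mathbf{u}_2^\mathsf{H}.</math>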
=== Singular values ===
The singular values of <math>A</math> are the absolute values of its eigenvalues. Since <math>A</math> has an eigendecomposition <math>A = U\Lambda U^\mathsf{H}</math>, where <math>U</math> is a [[unitary matrix]] (its columns are orthonormal vectors; [[Hermitian matrix#Eigendecomposition|see above]]), a [[singular value decomposition]] of <math>A</math> is <math>A = U|\Lambda|\sgn(\Lambda)U^\mathsf{H}</math>, where <math>|\Lambda|</math> and <math>\sgn(\Lambda)</math> are diagonal matrices containing the absolute values <math>|\lambda|</math> and signs <math>\sgn(\lambda)</math> of <math>A</math>'s eigenvalues, respectively. Here <math>\sgn(\Lambda)U^\mathsf{H}</math> is unitary, since left-multiplying by <math>\sgn(\Lambda)</math> only multiplies each row of <math>U^\mathsf{H}</math> by <math>\pm 1</math>, so the rows remain orthonormal. The diagonal matrix <math>|\Lambda|</math> contains the singular values of <math>A</math>, namely the absolute values of its eigenvalues.<ref>{{Cite book |last1=Trefethen |first1=Lloyd N. |last2=Bau, III |first2=David |url=http://worldcat.org/oclc/1348374386 |title=Numerical linear algebra |publisher=[[SIAM]] |year=1997 |isbn=0-89871-361-7 |location=Philadelphia, PA, USA |pages=34 |oclc=1348374386}}</ref>

===Real determinant===
The determinant of a Hermitian matrix is real:

{{math proof|1=
<math>\det(A) = \det\left(A^\mathsf{T}\right),</math> so <math>\det\left(A^\mathsf{H}\right) = \overline{\det(A)}.</math> Therefore, if <math>A = A^\mathsf{H},</math> then <math>\det(A) = \overline{\det(A)},</math> so <math>\det(A)</math> is real.
}}

(Alternatively, the determinant is the product of the matrix's eigenvalues, and as mentioned before, the eigenvalues of a Hermitian matrix are real.)
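For example,
<math display=block>\det\begin{pmatrix} 1 & 2+i \\ 2-i & 3 \end{pmatrix} = 1 \cdot 3 - (2+i)(2-i) = 3 - 5 = -2,</math>
which is real (though negative), consistent with the eigenvalues of this Hermitian matrix being real.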