Normal matrix
{{Short description|Matrix that commutes with its conjugate transpose}}
In mathematics, a [[complex number|complex]] [[square matrix]] {{mvar|A}} is '''normal''' if it [[commute (mathematics)|commute]]s with its [[conjugate transpose]] {{math|''A''{{sup|*}}}}:
:<math>A \text{ normal} \iff A^*A = AA^* .</math>

The concept of normal matrices can be extended to [[normal operator]]s on [[dimension (vector space)|infinite-dimensional]] [[normed space]]s and to normal elements in [[C*-algebra]]s. As in the matrix case, normality means commutativity is preserved, to the extent possible, in the noncommutative setting. This makes normal operators, and normal elements of C*-algebras, more amenable to analysis.

The [[spectral theorem]] states that a matrix is normal if and only if it is [[similar matrix|unitarily similar]] to a [[diagonal matrix]], and therefore any matrix {{mvar|A}} satisfying the equation {{math|1=''A''<sup>*</sup>''A'' = ''AA''<sup>*</sup>}} is [[diagonalizable matrix|diagonalizable]]. (The converse does not hold, because diagonalizable matrices may have non-orthogonal eigenspaces.) Thus <math>A = U D U^*</math> and <math>A^* = U D^* U^*</math>, where <math>D</math> is a diagonal matrix whose diagonal values are in general complex.

The left and right singular vectors in the [[singular value decomposition]] of a normal matrix <math>A = U D V^*</math> differ only in complex phase from each other and from the corresponding eigenvectors, since the phase must be factored out of the eigenvalues to form singular values.

==Special cases==
Among complex matrices, all [[unitary matrix|unitary]], [[Hermitian matrix|Hermitian]], and [[skew-Hermitian matrix|skew-Hermitian]] matrices are normal, with all eigenvalues being unit modulus, real, and imaginary, respectively.
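These eigenvalue characterizations lend themselves to a quick numerical spot-check. The following sketch uses NumPy with two illustrative matrices of our own choosing (not from the text): a small Hermitian matrix and a plane rotation, which is unitary.

```python
import numpy as np

def is_normal(a, tol=1e-10):
    """Check whether a square matrix commutes with its conjugate transpose."""
    return np.allclose(a @ a.conj().T, a.conj().T @ a, atol=tol)

# A Hermitian matrix: normal, with all eigenvalues real.
h = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert is_normal(h)
assert np.allclose(np.linalg.eigvals(h).imag, 0)

# A unitary matrix (a rotation): normal, with all eigenvalues of unit modulus.
theta = 0.7
u = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert is_normal(u)
assert np.allclose(np.abs(np.linalg.eigvals(u)), 1.0)
```

The skew-Hermitian case (purely imaginary eigenvalues) can be checked the same way.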
Likewise, among real matrices, all [[orthogonal matrix|orthogonal]], [[symmetric matrix|symmetric]], and [[skew-symmetric matrix|skew-symmetric]] matrices are normal, with all eigenvalues being complex conjugate pairs on the unit circle, real, and imaginary, respectively. However, it is ''not'' the case that all normal matrices are either unitary or (skew-)Hermitian, as their eigenvalues can in general be any complex number. For example,
<math display="block">A = \begin{bmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix}</math>
is neither unitary, Hermitian, nor skew-Hermitian, because its eigenvalues are <math>2, (1\pm i\sqrt{3})/2</math>; yet it is normal because
<math display="block">AA^* = \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} = A^*A.</math>
<!-- For the curious, the four classes \begin{bmatrix} a & b & 0 \\ 0 & a & b \\ b & 0 & a \end{bmatrix} \begin{bmatrix} a & b & 0 \\ 0 & a & -b \\ b & 0 & a \end{bmatrix} \begin{bmatrix} a & b & 0 \\ 0 & a & b \\ -b & 0 & a \end{bmatrix} \begin{bmatrix} a & b & 0 \\ 0 & a & -b \\ -b & 0 & a \end{bmatrix} are neither unitary nor skew-Hermitian for all non-zero real a and b. There are more 3×3 examples, but among 2×2 matrices, there are only ones that are multiples of unitary matrices. -->

== Consequences ==
{{math theorem | name = Proposition | math_statement = A normal [[triangular matrix]] is [[diagonal matrix|diagonal]].}}

{{math proof | proof = Let {{mvar|A}} be any normal upper triangular matrix.
Since
<math display="block">(A^* A)_{ii} = (A A^*)_{ii},</math>
using subscript notation, one can write the equivalent expression using instead the {{mvar|i}}th unit vector (<math>\hat{\mathbf{e}}_i</math>) to select the {{mvar|i}}th row and {{mvar|i}}th column:
<math display="block">\hat{\mathbf{e}}_i^\intercal \left(A^* A\right) \hat{\mathbf{e}}_i = \hat{\mathbf{e}}_i^\intercal \left(A A^*\right) \hat{\mathbf{e}}_i.</math>
The expression
<math display="block">\left( A \hat{\mathbf{e}}_i \right)^* \left( A \hat{\mathbf{e}}_i \right) = \left( A^* \hat{\mathbf{e}}_i \right)^* \left( A^* \hat{\mathbf{e}}_i \right)</math>
is equivalent, and so is
<math display="block">\left\| A \hat{\mathbf{e}}_i \right\|^2 = \left\| A^* \hat{\mathbf{e}}_i \right\|^2,</math>
which shows that the {{mvar|i}}th row must have the same norm as the {{mvar|i}}th column.{{pb}}
Consider {{math|1=''i'' = 1}}. The first entry of row 1 equals the first entry of column 1, and the rest of column 1 is zero (because of triangularity). This implies the first row must be zero for entries 2 through {{mvar|n}}. Continuing this argument for row–column pairs 2 through {{mvar|n}} shows {{mvar|A}} is diagonal. [[Q.E.D.]]}}

The concept of normality is important because normal matrices are precisely those to which the [[spectral theorem]] applies:

{{math theorem | name = Proposition | math_statement = A matrix {{mvar|A}} is normal if and only if there exist a [[diagonal matrix]] {{math|Λ}} and a [[unitary matrix]] {{mvar|U}} such that {{math|1=''A'' = ''U''Λ''U''<sup>*</sup>}}.}}

The diagonal entries of {{math|Λ}} are the [[eigenvalue]]s of {{mvar|A}}, and the columns of {{mvar|U}} are the [[eigenvector]]s of {{mvar|A}}; the eigenvalues in {{math|Λ}} appear in the same order as their eigenvectors appear as columns of {{mvar|U}}.
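This unitary diagonalization can be demonstrated numerically. The sketch below applies NumPy's general eigendecomposition to the normal example matrix from the previous section; because its eigenvalues are distinct and the matrix is normal, the returned eigenvector matrix is unitary up to rounding (for a Hermitian input one would prefer `numpy.linalg.eigh`, which guarantees an orthonormal basis).

```python
import numpy as np

# The normal (but neither Hermitian nor unitary) example matrix from above.
a = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
assert np.allclose(a @ a.conj().T, a.conj().T @ a)    # a is normal

# Eigendecomposition A = U Λ U*.  Since a is normal with distinct eigenvalues,
# its eigenvectors are mutually orthogonal, so u is unitary (up to rounding).
lam, u = np.linalg.eig(a)
assert np.allclose(u.conj().T @ u, np.eye(3))         # columns are orthonormal
assert np.allclose(u @ np.diag(lam) @ u.conj().T, a)  # reconstructs a
```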
Another way of stating the [[spectral theorem]] is to say that normal matrices are precisely those matrices that can be represented by a diagonal matrix with respect to a properly chosen [[orthonormal basis]] of {{math|'''C'''<sup>''n''</sup>}}. Phrased differently: a matrix is normal if and only if its [[eigenspace]]s span {{math|'''C'''<sup>''n''</sup>}} and are pairwise [[orthogonal]] with respect to the standard inner product of {{math|'''C'''<sup>''n''</sup>}}.

The spectral theorem for normal matrices is a special case of the more general [[Schur decomposition]], which holds for all square matrices. Let {{mvar|A}} be a square matrix. Then by Schur decomposition it is unitarily similar to an upper-triangular matrix, say, {{mvar|B}}. If {{mvar|A}} is normal, so is {{mvar|B}}. But then {{mvar|B}} must be diagonal, for, as noted above, a normal upper-triangular matrix is diagonal.

The spectral theorem permits the classification of normal matrices in terms of their spectra, for example:

{{math theorem | name = Proposition | math_statement = A normal matrix is unitary if and only if all of its eigenvalues (its spectrum) lie on the unit circle of the complex plane.}}

{{math theorem | name = Proposition | math_statement = A normal matrix is [[self-adjoint]] if and only if its spectrum is contained in [[real number|<math>\R</math>]]. In other words: a normal matrix is [[Hermitian matrix|Hermitian]] if and only if all its eigenvalues are [[real number|real]].}}

In general, the sum or product of two normal matrices need not be normal. However, the following holds:

{{math theorem | name = Proposition | math_statement = If {{mvar|A}} and {{mvar|B}} are normal with {{math|1=''AB'' = ''BA''}}, then both {{math|''AB''}} and {{math|''A'' + ''B''}} are also normal. Furthermore, there exists a unitary matrix {{mvar|U}} such that {{math|''UAU''<sup>*</sup>}} and {{math|''UBU''<sup>*</sup>}} are diagonal matrices.
In other words, {{mvar|A}} and {{mvar|B}} are [[simultaneously diagonalizable]].}}

In this special case, the columns of {{math|''U''<sup>*</sup>}} are eigenvectors of both {{mvar|A}} and {{mvar|B}} and form an orthonormal basis in {{math|'''C'''<sup>''n''</sup>}}. This follows by combining the theorems that, over an algebraically closed field, [[commuting matrices]] are [[simultaneously triangularizable]] and that a normal matrix is diagonalizable – the added result is that these can both be done simultaneously.

== Equivalent definitions ==
It is possible to give a fairly long list of equivalent definitions of a normal matrix. Let {{mvar|A}} be an {{math|''n'' × ''n''}} complex matrix. Then the following are equivalent:
# {{mvar|A}} is normal.
# {{mvar|A}} is [[diagonalizable matrix|diagonalizable]] by a unitary matrix.
# There exists a set of eigenvectors of {{mvar|A}} which forms an orthonormal basis for {{math|'''C'''<sup>''n''</sup>}}.
# <math>\left\| A \mathbf{x} \right\| = \left\| A^* \mathbf{x} \right\|</math> for every {{Math|'''x'''}}.
# The [[Frobenius norm]] of {{mvar|A}} can be computed from the eigenvalues of {{mvar|A}}: <math display="inline"> \operatorname{tr} \left(A^* A\right) = \sum_j \left| \lambda_j \right|^2 </math>.
# The [[Hermitian matrix|Hermitian]] part {{math|{{sfrac|1|2}}(''A'' + ''A''<sup>*</sup>)}} and [[Skew-Hermitian matrix|skew-Hermitian]] part {{math|{{sfrac|1|2}}(''A'' − ''A''<sup>*</sup>)}} of {{mvar|A}} commute.
# {{math|''A''<sup>*</sup>}} is a polynomial (of degree {{math|≤ ''n'' − 1}}) in {{mvar|A}}.<ref group="lower-alpha">Proof: When <math>A</math> is normal, use [[Lagrange polynomial|Lagrange's interpolation]] formula to construct a polynomial <math>P</math> such that <math>\overline{\lambda_j} = P(\lambda_j)</math>, where <math>\lambda_j</math> are the eigenvalues of <math>A</math>.</ref>
# {{math|1=''A''<sup>*</sup> = ''AU''}} for some unitary matrix {{mvar|U}}.<ref>{{Harvp|Horn|Johnson|1985|page=109}}</ref>
# {{mvar|U}} and {{mvar|P}} commute, where {{math|1=''A'' = ''UP''}} is the [[polar decomposition]] of {{mvar|A}} with a unitary matrix {{mvar|U}} and some [[positive-definite matrix|positive semidefinite matrix]] {{mvar|P}}.
# {{mvar|A}} commutes with some normal matrix {{mvar|N}} with distinct{{clarify|date=September 2023}} eigenvalues.
# {{math|1=''σ<sub>i</sub>'' = {{abs|''λ<sub>i</sub>''}}}} for all {{math|1 ≤ ''i'' ≤ ''n''}}, where {{mvar|A}} has [[singular values]] {{math|''σ''<sub>1</sub> ≥ ⋯ ≥ ''σ<sub>n</sub>''}} and eigenvalues indexed so that {{math|{{abs|''λ''<sub>1</sub>}} ≥ ⋯ ≥ {{abs|''λ<sub>n</sub>''}}}}.<ref>{{Harvp|Horn|Johnson|1991|page=[https://archive.org/details/topicsinmatrixan0000horn/page/157 157]}}</ref>

Some but not all of the above generalize to normal operators on infinite-dimensional Hilbert spaces. For example, a bounded operator satisfying (9) is only [[quasinormal operator|quasinormal]].

==Normal matrix analogy==
{{confusing|date=October 2023|analogies in the below list}}
It is occasionally useful (but sometimes misleading) to think of the relationships of special kinds of normal matrices as analogous to the relationships of the corresponding types of complex numbers of which their eigenvalues are composed.
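Several of the equivalent conditions listed above can be spot-checked numerically. The sketch below verifies conditions 4 (equal norms of {{math|''A'''''x'''}} and {{math|''A''<sup>*</sup>'''x'''}}), 5 (Frobenius norm from eigenvalues), and 11 (singular values equal eigenvalue moduli) for the normal example matrix used earlier.

```python
import numpy as np

# The normal example matrix from the Special cases section.
a = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Condition 4: ||A x|| == ||A* x|| for every x (checked on random vectors).
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
    assert np.isclose(np.linalg.norm(a @ x), np.linalg.norm(a.conj().T @ x))

# Condition 5: tr(A* A) equals the sum of squared eigenvalue moduli.
lam = np.linalg.eigvals(a)
assert np.isclose(np.trace(a.conj().T @ a).real, np.sum(np.abs(lam) ** 2))

# Condition 11: the singular values are the moduli of the eigenvalues.
assert np.allclose(np.sort(np.linalg.svd(a, compute_uv=False)),
                   np.sort(np.abs(lam)))
```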
This is because any function (that can be expressed as a power series) of a non-defective matrix acts directly on each of its eigenvalues, and the conjugate transpose of its spectral decomposition <math>VD V^*</math> is <math>VD^*V^*</math>, where <math>D</math> is the diagonal matrix of eigenvalues. Likewise, if two normal matrices commute and are therefore simultaneously diagonalizable, any operation between these matrices also acts on each corresponding pair of eigenvalues.
* The [[conjugate transpose]] is analogous to the [[complex conjugate]].
* [[Unitary matrix|Unitary matrices]] are analogous to [[complex number]]s on the [[unit circle]].
* [[Hermitian matrix|Hermitian matrices]] are analogous to [[real number]]s.
* Hermitian [[positive-definite matrix|positive definite matrices]] are analogous to [[positive real numbers]].
* [[Skew-Hermitian matrix|Skew-Hermitian matrices]] are analogous to purely [[imaginary number]]s.
* [[Inverse matrix|Invertible matrices]] are analogous to non-zero [[complex number]]s; the inverse of a matrix has each eigenvalue inverted.
* A uniform [[Scaling_(geometry)|scaling matrix]] is analogous to a constant number. In particular, the [[zero matrix|zero matrix]] is analogous to 0, and the [[identity matrix|identity]] matrix is analogous to 1.
* An [[idempotent matrix]] is an orthogonal projection with each eigenvalue either 0 or 1.
* A normal [[Involutory_matrix|involution]] has eigenvalues <math>\pm 1</math>.

As a special case, the complex numbers may be embedded in the normal 2×2 real matrices by the mapping
<math display="block">a + bi \mapsto \begin{bmatrix} a & b \\ -b & a \end{bmatrix} = a\, \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} + b\, \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix},</math>
which preserves addition and multiplication. It is easy to check that this embedding respects all of the above analogies.
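The claim that the embedding preserves addition and multiplication is easy to verify directly; a minimal sketch (the helper name `embed` is ours):

```python
import numpy as np

def embed(z):
    """Map a complex number a + bi to the real 2x2 matrix [[a, b], [-b, a]]."""
    return np.array([[z.real, z.imag],
                     [-z.imag, z.real]])

z, w = 1 + 2j, 3 - 1j
assert np.allclose(embed(z) + embed(w), embed(z + w))   # preserves addition
assert np.allclose(embed(z) @ embed(w), embed(z * w))   # preserves multiplication
assert np.allclose(embed(z) @ embed(z).T,
                   embed(z).T @ embed(z))               # the image is normal
```

The last assertion checks that matrices in the image of the embedding are indeed normal, consistent with the analogy list above.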
== See also ==
* [[Hermitian matrix]]
* [[Least-squares normal matrix]]

== Notes ==
<references group="lower-alpha" responsive="1"></references>

== Citations ==
<references group="" responsive="1"></references>

== Sources ==
* {{Citation|last1=Horn|first1=Roger Alan|title=Matrix Analysis|year=1985|publisher=[[Cambridge University Press]]|isbn=978-0-521-38632-6|last2=Johnson|first2=Charles Royal|author-link=Roger Horn|author-link2=Charles Royal Johnson}}.
* {{Cite book|last1=Horn|first1=Roger Alan|title=Topics in Matrix Analysis|last2=Johnson|first2=Charles Royal|publisher=Cambridge University Press|year=1991|isbn=978-0-521-30587-7|author-link=Roger Horn|author-link2=Charles Royal Johnson}}

{{Matrix classes}}

[[Category:Matrices (mathematics)]]

[[ja:正規作用素]]