Trace (linear algebra)
== Properties ==

=== Basic properties ===
The trace is a [[linear operator|linear mapping]]. That is,<ref name=":1" /><ref name=":2" /> <math display="block">\begin{align} \operatorname{tr}(\mathbf{A} + \mathbf{B}) &= \operatorname{tr}(\mathbf{A}) + \operatorname{tr}(\mathbf{B}) \\ \operatorname{tr}(c\mathbf{A}) &= c \operatorname{tr}(\mathbf{A}) \end{align}</math> for all square matrices {{math|'''A'''}} and {{math|'''B'''}}, and all [[scalar (mathematics)|scalar]]s {{mvar|c}}.<ref name="LipschutzLipson"/>{{rp|34}}

A matrix and its [[transpose]] have the same trace:<ref name=":1" /><ref name=":2" /><ref name="LipschutzLipson"/>{{rp|34}} <math display="block">\operatorname{tr}(\mathbf{A}) = \operatorname{tr}\left(\mathbf{A}^\mathsf{T}\right).</math> This follows immediately from the fact that transposing a square matrix does not affect the elements along the main diagonal.

=== Trace of a product ===
The trace of a square matrix that is the product of two matrices can be rewritten as the sum of the entry-wise products of their elements, i.e. as the sum of all elements of their [[Hadamard product (matrices)|Hadamard product]]. Phrased directly, if {{math|'''A'''}} and {{math|'''B'''}} are two {{math|''m'' × ''n''}} matrices, then: <math display="block"> \operatorname{tr}\left(\mathbf{A}^\mathsf{T}\mathbf{B}\right) = \operatorname{tr}\left(\mathbf{A}\mathbf{B}^\mathsf{T}\right) = \operatorname{tr}\left(\mathbf{B}^\mathsf{T}\mathbf{A}\right) = \operatorname{tr}\left(\mathbf{B}\mathbf{A}^\mathsf{T}\right) = \sum_{i=1}^m \sum_{j=1}^n a_{ij}b_{ij} \; . </math> If one views any real {{math|''m'' × ''n''}} matrix as a vector of length {{mvar|mn}} (an operation called [[Vectorization (mathematics)|vectorization]]), then the above operation on {{math|'''A'''}} and {{math|'''B'''}} coincides with the standard [[dot product]].
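These basic identities are easy to verify numerically. The following sketch (not part of the article; the matrices are arbitrary illustrations) checks linearity, invariance under transposition, and the Hadamard-sum formula with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
c = 2.5

# Linearity of the trace, checked on the square matrices A A^T and B B^T.
M = A @ A.T
N = B @ B.T
assert np.isclose(np.trace(M + N), np.trace(M) + np.trace(N))
assert np.isclose(np.trace(c * M), c * np.trace(M))

# A matrix and its transpose have the same trace.
assert np.isclose(np.trace(M), np.trace(M.T))

# tr(A^T B) equals the sum of entry-wise products (Hadamard product).
assert np.isclose(np.trace(A.T @ B), np.sum(A * B))
```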
According to the above expression, {{math|tr('''A'''<sup>⊤</sup>'''A''')}} is a sum of squares and hence is nonnegative, equal to zero if and only if {{math|'''A'''}} is zero.<ref name="HornJohnson">{{cite book |title=Matrix Analysis |edition=2nd |first1=Roger A. |last1=Horn |first2=Charles R. |last2=Johnson |isbn=9780521839402 |publisher=Cambridge University Press|year=2013}}</ref>{{rp|7}} Furthermore, as noted in the above formula, {{math|tr('''A'''<sup>⊤</sup>'''B''') {{=}} tr('''B'''<sup>⊤</sup>'''A''')}}. These demonstrate the positive-definiteness and symmetry required of an [[inner product]]; it is common to call {{math|tr('''A'''<sup>⊤</sup>'''B''')}} the [[Frobenius inner product]] of {{math|'''A'''}} and {{math|'''B'''}}. This is a natural inner product on the [[vector space]] of all real matrices of fixed dimensions. The [[norm (mathematics)|norm]] derived from this inner product is called the [[Frobenius norm]], and it satisfies a submultiplicative property, as can be proven with the [[Cauchy–Schwarz inequality]]: <math display="block">0 \leq \left[\operatorname{tr}(\mathbf{A} \mathbf{B})\right]^2 \leq \operatorname{tr}\left(\mathbf{A}^\mathsf{T} \mathbf{A}\right) \operatorname{tr}\left(\mathbf{B}^\mathsf{T} \mathbf{B}\right) ,</math> if {{math|'''A'''}} and {{math|'''B'''}} are real matrices such that {{math|'''A''' '''B'''}} is a square matrix. The Frobenius inner product and norm arise frequently in [[matrix calculus]] and [[statistics]]. The Frobenius inner product may be extended to a [[hermitian inner product]] on the [[complex vector space]] of all complex matrices of a fixed size, by replacing {{math|'''B'''}} by its [[complex conjugate]]. The symmetry of the Frobenius inner product may be phrased more directly as follows: the matrices in the trace of a product can be switched without changing the result. 
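The Frobenius inner product and the Cauchy–Schwarz bound above can likewise be checked numerically; the sketch below uses arbitrary illustrative matrices and also confirms that the norm induced by {{math|tr('''A'''<sup>⊤</sup>'''A''')}} agrees with NumPy's built-in Frobenius norm:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))
B = rng.standard_normal((5, 3))   # so that A @ B is square (3 x 3)

# Cauchy-Schwarz for the Frobenius inner product:
# 0 <= [tr(AB)]^2 <= tr(A^T A) tr(B^T B).
lhs = np.trace(A @ B) ** 2
rhs = np.trace(A.T @ A) * np.trace(B.T @ B)
assert 0 <= lhs <= rhs

# The norm derived from this inner product is the Frobenius norm.
assert np.isclose(np.sqrt(np.trace(A.T @ A)), np.linalg.norm(A, 'fro'))
```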
If {{math|'''A'''}} and {{math|'''B'''}} are {{math|''m'' × ''n''}} and {{math|''n'' × ''m''}} real or complex matrices, respectively, then<ref name=":1" /><ref name=":2" /><ref name="LipschutzLipson"/>{{rp|34}}<ref group="note">This is immediate from the definition of the [[matrix product]]: <math display="block">\operatorname{tr}(\mathbf{A}\mathbf{B}) = \sum_{i=1}^m \left(\mathbf{A}\mathbf{B}\right)_{ii} = \sum_{i=1}^m \sum_{j=1}^n a_{ij} b_{ji} = \sum_{j=1}^n \sum_{i=1}^m b_{ji} a_{ij} = \sum_{j=1}^n \left(\mathbf{B}\mathbf{A}\right)_{jj} = \operatorname{tr}(\mathbf{B}\mathbf{A}).</math> </ref> {{Equation box 1 |indent=: |title= |equation = <math>\operatorname{tr}(\mathbf{A}\mathbf{B}) = \operatorname{tr}(\mathbf{B}\mathbf{A})</math> |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA }} This is notable both for the fact that {{math|'''AB'''}} does not usually equal {{math|'''BA'''}}, and also since the trace of either does not usually equal {{math|tr('''A''')tr('''B''')}}.<ref group="note">For example, if <math display="block"> \mathbf{A} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix},\quad \mathbf{B} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, </math> then the product is <math display="block">\mathbf{AB} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},</math> and the traces are {{math|tr('''AB''') {{=}} 1 ≠ 0 ⋅ 0 {{=}} tr('''A''')tr('''B''')}}.</ref> The [[similarity invariance|similarity-invariance]] of the trace, meaning that {{math|tr('''A''') {{=}} tr('''P'''<sup>−1</sup>'''AP''')}} for any square matrix {{math|'''A'''}} and any invertible matrix {{math|'''P'''}} of the same dimensions, is a fundamental consequence. This is proved by <math display="block"> \operatorname{tr}\left(\mathbf{P}^{-1}(\mathbf{A}\mathbf{P})\right) = \operatorname{tr}\left((\mathbf{A} \mathbf{P})\mathbf{P}^{-1}\right) = \operatorname{tr}(\mathbf{A}). 
</math> Similarity invariance is the crucial property that makes it possible to define the trace of a [[linear transformation]], as discussed below. Additionally, for real column vectors <math>\mathbf{a}\in\mathbb{R}^n</math> and <math>\mathbf{b}\in\mathbb{R}^n</math>, the trace of the outer product equals the inner product: {{Equation box 1 |indent=: |title= |equation = <math>\operatorname{tr}\left(\mathbf{b}\mathbf{a}^\textsf{T}\right) = \mathbf{a}^\textsf{T}\mathbf{b}</math> |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA }}

=== Cyclic property ===
More generally, the trace is ''invariant under [[circular shift]]s'', that is, {{Equation box 1 |indent=: |title= |equation = <math>\operatorname{tr}(\mathbf{A}\mathbf{B}\mathbf{C}\mathbf{D}) = \operatorname{tr}(\mathbf{B}\mathbf{C}\mathbf{D}\mathbf{A}) = \operatorname{tr}(\mathbf{C}\mathbf{D}\mathbf{A}\mathbf{B}) = \operatorname{tr}(\mathbf{D}\mathbf{A}\mathbf{B}\mathbf{C}).</math> |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA}} This is known as the ''cyclic property''. Arbitrary permutations are not allowed: in general, <math display="block">\operatorname{tr}(\mathbf{A}\mathbf{B}\mathbf{C}\mathbf{D}) \ne \operatorname{tr}(\mathbf{A}\mathbf{C}\mathbf{B}\mathbf{D}) ~.</math> However, if products of ''three'' [[symmetric matrix|symmetric]] matrices are considered, any permutation is allowed, since <math display="block">\operatorname{tr}(\mathbf{A}\mathbf{B}\mathbf{C}) = \operatorname{tr}\left(\left(\mathbf{A}\mathbf{B}\mathbf{C}\right)^{\mathsf T}\right) = \operatorname{tr}(\mathbf{C}\mathbf{B}\mathbf{A}) = \operatorname{tr}(\mathbf{A}\mathbf{C}\mathbf{B}),</math> where the first equality holds because the traces of a matrix and of its transpose are equal. This is not true in general for more than three factors.
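A short numerical illustration of the cyclic property (the matrices are arbitrary, chosen only to make the point): circular shifts preserve the trace, a generic transposition of two factors does not, and the trace of an outer product reduces to an inner product.

```python
import numpy as np

rng = np.random.default_rng(2)
A, B, C, D = (rng.standard_normal((3, 3)) for _ in range(4))

t = np.trace(A @ B @ C @ D)
# Invariant under circular shifts...
assert np.isclose(t, np.trace(B @ C @ D @ A))
assert np.isclose(t, np.trace(C @ D @ A @ B))
assert np.isclose(t, np.trace(D @ A @ B @ C))
# ...but not under arbitrary permutations (generically tr(ABCD) != tr(ACBD)).
assert not np.isclose(t, np.trace(A @ C @ B @ D))

# For three symmetric matrices, any permutation is allowed.
S1, S2, S3 = ((X + X.T) for X in (A, B, C))
assert np.isclose(np.trace(S1 @ S2 @ S3), np.trace(S1 @ S3 @ S2))

# Trace of an outer product equals the inner product: tr(b a^T) = a^T b.
a = rng.standard_normal(3)
b = rng.standard_normal(3)
assert np.isclose(np.trace(np.outer(b, a)), a @ b)
```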
=== Trace of a Kronecker product ===
The trace of the [[Kronecker product]] of two matrices is the product of their traces: <math display="block">\operatorname{tr}(\mathbf{A} \otimes \mathbf{B}) = \operatorname{tr}(\mathbf{A})\operatorname{tr}(\mathbf{B}).</math>

=== Characterization of the trace ===
The following three properties: <math display="block">\begin{align} \operatorname{tr}(\mathbf{A} + \mathbf{B}) &= \operatorname{tr}(\mathbf{A}) + \operatorname{tr}(\mathbf{B}), \\ \operatorname{tr}(c\mathbf{A}) &= c \operatorname{tr}(\mathbf{A}), \\ \operatorname{tr}(\mathbf{A}\mathbf{B}) &= \operatorname{tr}(\mathbf{B}\mathbf{A}), \end{align}</math> characterize the trace [[up to]] a scalar multiple in the following sense: if <math>f</math> is a [[linear functional]] on the space of square matrices that satisfies <math>f(xy) = f(yx),</math> then <math>f</math> and <math>\operatorname{tr}</math> are proportional.<ref group="note">Proof: Let <math>e_{ij}</math> denote the standard basis matrices. Then <math>f\left(e_{ij}\right) = f\left(e_{i} e_{j}^\top\right) = f\left(e_i e_1^\top e_1 e_j^\top\right) = f\left(e_1 e_j^\top e_i e_1^\top\right) = f\left(0\right) = 0</math> if <math>i \neq j</math>, while <math>f\left(e_{jj}\right) = f\left(e_{11}\right)</math> for all <math>j</math>, so <math display="block">f(\mathbf{A}) = \sum_{i, j} [\mathbf{A}]_{ij} f\left(e_{ij}\right) = \sum_i [\mathbf{A}]_{ii} f\left(e_{11}\right) = f\left(e_{11}\right) \operatorname{tr}(\mathbf{A}).</math> More abstractly, this corresponds to the decomposition <math display="block">\mathfrak{gl}_n = \mathfrak{sl}_n \oplus k,</math> as <math>\operatorname{tr}(AB) = \operatorname{tr}(BA)</math> (equivalently, <math>\operatorname{tr}([A, B]) = 0</math>) defines the trace on <math>\mathfrak{sl}_n,</math> whose complement is the scalar matrices. This leaves one degree of freedom: any such map is determined by its value on scalars, so every such map is a scalar multiple of the trace, which is itself a nonzero such map.</ref> For <math>n\times n</math> matrices, imposing the normalization <math>f(\mathbf{I}) = n</math> makes <math>f</math> equal to the trace.

=== Trace as the sum of eigenvalues ===
Given any {{math|''n'' × ''n''}} matrix {{math|'''A'''}}, {{Equation box 1 |indent=: |title= |equation = <math>\operatorname{tr}(\mathbf{A}) = \sum_{i=1}^n \lambda_i</math> |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA}} where {{math|λ<sub>1</sub>, ..., λ<sub>''n''</sub>}} are the [[eigenvalue]]s of {{math|'''A'''}} counted with multiplicity. This holds even if {{math|'''A'''}} is a real matrix and some (or all) of its eigenvalues are complex numbers. It may be regarded as a consequence of the existence of the [[Jordan canonical form]], together with the similarity-invariance of the trace discussed above.

=== Trace of commutator ===
When both {{math|'''A'''}} and {{math|'''B'''}} are {{math|''n'' × ''n''}} matrices, the trace of the (ring-theoretic) [[commutator]] of {{math|'''A'''}} and {{math|'''B'''}} vanishes: {{math|1=tr(['''A''', '''B''']) = 0}}, because {{math|1=tr('''AB''') = tr('''BA''')}} and {{math|tr}} is linear. One can state this as "the trace is a map of [[Lie algebras]] {{math|gl<sub>''n''</sub> → ''k''}} from operators to scalars", as the commutator of scalars is trivial (it is an [[Abelian Lie algebra]]). In particular, using similarity invariance, it follows that the identity matrix is never similar to the commutator of any pair of matrices.
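The Kronecker-product identity, the eigenvalue-sum formula, and the vanishing trace of a commutator can all be illustrated numerically. In the sketch below (illustrative matrices only), the 90-degree rotation matrix shows that the eigenvalue sum holds even when a real matrix has complex eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(3)

# Trace of a Kronecker product is the product of the traces.
A = rng.standard_normal((2, 2))
B = rng.standard_normal((3, 3))
assert np.isclose(np.trace(np.kron(A, B)), np.trace(A) * np.trace(B))

# Trace is the sum of the eigenvalues counted with multiplicity, even
# for a real matrix with complex eigenvalues: a rotation by 90 degrees
# has eigenvalues +i and -i, which sum to its trace, 0.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.isclose(np.linalg.eigvals(R).sum(), np.trace(R))

M = rng.standard_normal((4, 4))
assert np.isclose(np.linalg.eigvals(M).sum().real, np.trace(M))

# The trace of a commutator vanishes: tr(ST - TS) = 0.
S = rng.standard_normal((3, 3))
T = rng.standard_normal((3, 3))
assert np.isclose(np.trace(S @ T - T @ S), 0.0)
```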
Conversely, any square matrix with zero trace is a linear combination of commutators of pairs of matrices.<ref group="note">Proof: <math>\mathfrak{sl}_n</math> is a [[semisimple Lie algebra]] and thus every element in it is a linear combination of commutators of some pairs of elements; otherwise the [[derived algebra]] would be a proper ideal.</ref> Moreover, any square matrix with zero trace is [[Unitary representation|unitarily equivalent]] to a square matrix whose diagonal consists entirely of zeros.

=== Traces of special kinds of matrices ===
{{bulleted list | The trace of the {{math|''n'' × ''n''}} [[identity matrix]] is the dimension of the space, namely {{mvar|n}}. <math display="block">\operatorname{tr}\left(\mathbf{I}_n\right) = n</math> This leads to [[Dimension (vector space)#Trace|generalizations of dimension using trace]]. | The trace of a [[Hermitian matrix]] is real, because the elements on the diagonal are real. | The trace of a [[permutation matrix]] is the number of [[Fixed point (mathematics)|fixed points]] of the corresponding permutation, because the diagonal term {{math|''a''<sub>''ii''</sub>}} is 1 if the {{math|''i''}}th point is fixed and 0 otherwise. | The trace of a [[Projection_(linear_algebra)|projection matrix]] is the dimension of the target space. <math display="block">\begin{align} \mathbf{P}_\mathbf{X} &= \mathbf{X}\left(\mathbf{X}^\mathsf{T} \mathbf{X}\right)^{-1} \mathbf{X}^\mathsf{T} \\[3pt] \Longrightarrow \operatorname{tr}\left(\mathbf{P}_\mathbf{X}\right) &= \operatorname{rank}(\mathbf{X}). \end{align}</math> The matrix {{math|'''P<sub>X</sub>'''}} is idempotent. | More generally, the trace of any [[idempotent matrix]], i.e. one with {{math|1='''A'''<sup>2</sup> = '''A'''}}, equals its own [[rank (linear algebra)|rank]]. | The trace of a [[nilpotent matrix]] is zero. {{pb}} When the characteristic of the base field is zero, the converse also holds: if {{math|1=tr('''A'''<sup>''k''</sup>) = 0}} for all {{mvar|k}}, then {{math|'''A'''}} is nilpotent. {{pb}} When the characteristic {{math|''n'' > 0}} of the base field is positive, the identity in {{mvar|n}} dimensions is a counterexample, as <math>\operatorname{tr}\left(\mathbf{I}_n^k\right) = \operatorname{tr}\left(\mathbf{I}_n\right) = n \equiv 0</math>, but the identity is not nilpotent. }}

=== Relationship to the characteristic polynomial ===
The trace of an <math>n \times n</math> matrix <math>A</math> is, up to sign, the coefficient of <math>t^{n-1}</math> in the [[characteristic polynomial]]; the sign depends on the convention used in the definition of the characteristic polynomial.
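Several of these special cases, together with the characteristic-polynomial coefficient, are easy to confirm numerically. The sketch below (illustrative matrices only) uses the convention <math>\det(t\mathbf{I} - \mathbf{A}) = t^n - \operatorname{tr}(\mathbf{A})\,t^{n-1} + \cdots</math>, which is the one NumPy's `np.poly` follows:

```python
import numpy as np

rng = np.random.default_rng(5)

# Identity matrix: the trace is the dimension of the space.
assert np.trace(np.eye(5)) == 5

# Permutation matrix: the trace counts the permutation's fixed points.
P = np.eye(4)[[2, 1, 0, 3]]           # swaps positions 0 and 2; fixes 1 and 3
assert np.trace(P) == 2

# Orthogonal projection onto the column space of X: trace equals rank(X).
X = rng.standard_normal((6, 3))
P_X = X @ np.linalg.inv(X.T @ X) @ X.T
assert np.allclose(P_X @ P_X, P_X)    # idempotent
assert np.isclose(np.trace(P_X), np.linalg.matrix_rank(X))

# Nilpotent matrix (strictly upper triangular): trace is zero.
N = np.triu(rng.standard_normal((4, 4)), k=1)
assert np.isclose(np.trace(N), 0.0)

# Characteristic polynomial det(tI - A): with leading coefficient 1,
# the coefficient of t^(n-1) is -tr(A).
A = rng.standard_normal((4, 4))
coeffs = np.poly(A)                   # coefficients, highest degree first
assert np.isclose(coeffs[1], -np.trace(A))
```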