Editing Trace (linear algebra) (section)
=== Trace of a product ===
The trace of a square matrix which is the product of two matrices can be rewritten as the sum of entry-wise products of their elements, i.e. as the sum of all elements of their [[Hadamard product (matrices)|Hadamard product]]. Phrased directly, if {{math|'''A'''}} and {{math|'''B'''}} are two {{math|''m'' × ''n''}} matrices, then: <math display="block"> \operatorname{tr}\left(\mathbf{A}^\mathsf{T}\mathbf{B}\right) = \operatorname{tr}\left(\mathbf{A}\mathbf{B}^\mathsf{T}\right) = \operatorname{tr}\left(\mathbf{B}^\mathsf{T}\mathbf{A}\right) = \operatorname{tr}\left(\mathbf{B}\mathbf{A}^\mathsf{T}\right) = \sum_{i=1}^m \sum_{j=1}^n a_{ij}b_{ij} \; . </math> If one views any real {{math|''m'' × ''n''}} matrix as a vector of length {{mvar|mn}} (an operation called [[Vectorization (mathematics)|vectorization]]) then the above operation on {{math|'''A'''}} and {{math|'''B'''}} coincides with the standard [[dot product]]. According to the above expression, {{math|tr('''A'''<sup>⊤</sup>'''A''')}} is a sum of squares and hence is nonnegative, equal to zero if and only if {{math|'''A'''}} is zero.<ref name="HornJohnson">{{cite book |title=Matrix Analysis |edition=2nd |first1=Roger A. |last1=Horn |first2=Charles R. |last2=Johnson |isbn=9780521839402 |publisher=Cambridge University Press|year=2013}}</ref>{{rp|7}} Furthermore, as noted in the above formula, {{math|tr('''A'''<sup>⊤</sup>'''B''') {{=}} tr('''B'''<sup>⊤</sup>'''A''')}}. These identities demonstrate the positive-definiteness and symmetry required of an [[inner product]]; it is common to call {{math|tr('''A'''<sup>⊤</sup>'''B''')}} the [[Frobenius inner product]] of {{math|'''A'''}} and {{math|'''B'''}}. This is a natural inner product on the [[vector space]] of all real matrices of fixed dimensions. 
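As a quick sanity check (illustrative only, with arbitrarily chosen small matrices), the identity {{math|tr('''A'''<sup>⊤</sup>'''B''') {{=}} Σ ''a''<sub>''ij''</sub>''b''<sub>''ij''</sub>}} can be verified numerically; the helpers below are a minimal plain-Python sketch that avoids any external library.

```python
# Minimal illustration: tr(A^T B) equals the sum of entrywise products
# of A and B (the Frobenius inner product). Example matrices are
# arbitrary; integer arithmetic keeps the comparison exact.

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2, 3],
     [4, 5, 6]]       # 2 x 3
B = [[7, 8, 9],
     [10, 11, 12]]    # 2 x 3

lhs = trace(matmul(transpose(A), B))   # tr(A^T B); A^T B is 3 x 3
rhs = sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))
assert lhs == rhs == 217               # 1*7 + 2*8 + ... + 6*12
```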
The [[norm (mathematics)|norm]] derived from this inner product is called the [[Frobenius norm]], and it satisfies a submultiplicative property, as can be proven with the [[Cauchy–Schwarz inequality]]: <math display="block">0 \leq \left[\operatorname{tr}(\mathbf{A} \mathbf{B})\right]^2 \leq \operatorname{tr}\left(\mathbf{A}^\mathsf{T} \mathbf{A}\right) \operatorname{tr}\left(\mathbf{B}^\mathsf{T} \mathbf{B}\right) ,</math> if {{math|'''A'''}} and {{math|'''B'''}} are real matrices such that {{math|'''A''' '''B'''}} is a square matrix. The Frobenius inner product and norm arise frequently in [[matrix calculus]] and [[statistics]]. The Frobenius inner product may be extended to a [[hermitian inner product]] on the [[complex vector space]] of all complex matrices of a fixed size, by replacing {{math|'''B'''}} by its [[complex conjugate]]. The symmetry of the Frobenius inner product may be phrased more directly as follows: the matrices in the trace of a product can be switched without changing the result. 
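The Cauchy–Schwarz bound above can likewise be checked numerically. This is an illustrative sketch with arbitrarily chosen integer matrices, where {{math|'''A'''}} is {{math|2 × 3}} and {{math|'''B'''}} is {{math|3 × 2}} so that {{math|'''AB'''}} is square.

```python
# Illustrative check of 0 <= [tr(AB)]^2 <= tr(A^T A) tr(B^T B)
# (Cauchy-Schwarz for the Frobenius inner product).

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2, 0],
     [3, -1, 4]]      # 2 x 3
B = [[2, 1],
     [0, 5],
     [-1, 3]]         # 3 x 2

lhs = trace(matmul(A, B)) ** 2   # [tr(AB)]^2 = 12^2 = 144
rhs = trace(matmul(transpose(A), A)) * trace(matmul(transpose(B), B))
assert 0 <= lhs <= rhs           # 144 <= 31 * 40 = 1240
```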
If {{math|'''A'''}} and {{math|'''B'''}} are {{math|''m'' × ''n''}} and {{math|''n'' × ''m''}} real or complex matrices, respectively, then<ref name=":1" /><ref name=":2" /><ref name="LipschutzLipson"/>{{rp|34}}<ref group="note">This is immediate from the definition of the [[matrix product]]: <math display="block">\operatorname{tr}(\mathbf{A}\mathbf{B}) = \sum_{i=1}^m \left(\mathbf{A}\mathbf{B}\right)_{ii} = \sum_{i=1}^m \sum_{j=1}^n a_{ij} b_{ji} = \sum_{j=1}^n \sum_{i=1}^m b_{ji} a_{ij} = \sum_{j=1}^n \left(\mathbf{B}\mathbf{A}\right)_{jj} = \operatorname{tr}(\mathbf{B}\mathbf{A}).</math> </ref> {{Equation box 1 |indent=: |title= |equation = <math>\operatorname{tr}(\mathbf{A}\mathbf{B}) = \operatorname{tr}(\mathbf{B}\mathbf{A})</math> |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA }} This is notable both for the fact that {{math|'''AB'''}} does not usually equal {{math|'''BA'''}}, and also since the trace of either does not usually equal {{math|tr('''A''')tr('''B''')}}.<ref group="note">For example, if <math display="block"> \mathbf{A} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix},\quad \mathbf{B} = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, </math> then the product is <math display="block">\mathbf{AB} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},</math> and the traces are {{math|tr('''AB''') {{=}} 1 ≠ 0 ⋅ 0 {{=}} tr('''A''')tr('''B''')}}.</ref> The [[similarity invariance|similarity-invariance]] of the trace, meaning that {{math|tr('''A''') {{=}} tr('''P'''<sup>−1</sup>'''AP''')}} for any square matrix {{math|'''A'''}} and any invertible matrix {{math|'''P'''}} of the same dimensions, is a fundamental consequence. This is proved by <math display="block"> \operatorname{tr}\left(\mathbf{P}^{-1}(\mathbf{A}\mathbf{P})\right) = \operatorname{tr}\left((\mathbf{A} \mathbf{P})\mathbf{P}^{-1}\right) = \operatorname{tr}(\mathbf{A}). 
</math> Similarity invariance is the crucial property of the trace that allows the trace of a [[linear transformation]] to be defined independently of a choice of basis, as discussed below. Additionally, for real column vectors <math>\mathbf{a}\in\mathbb{R}^n</math> and <math>\mathbf{b}\in\mathbb{R}^n</math>, the trace of the outer product equals the inner product: {{Equation box 1 |indent=: |title= |equation = <math>\operatorname{tr}\left(\mathbf{b}\mathbf{a}^\textsf{T}\right) = \mathbf{a}^\textsf{T}\mathbf{b}</math> |cellpadding= 6 |border |border colour = #0073CF |background colour=#F5FFFA }}
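The identities above (the cyclic property {{math|tr('''AB''') {{=}} tr('''BA''')}}, similarity invariance, and the outer-product formula) can be illustrated together in a short plain-Python sketch; the example matrices and vectors are chosen arbitrarily, with {{math|det '''P''' {{=}} 1}} so the inverse has integer entries and every check stays exact.

```python
# Illustrative checks: tr(AB) = tr(BA) for non-square factors,
# similarity invariance tr(P^-1 M P) = tr(M), and tr(b a^T) = a^T b.

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2], [3, 4], [5, 6]]     # 3 x 2
B = [[7, 8, 9], [10, 11, 12]]    # 2 x 3
# AB is 3 x 3 and BA is 2 x 2, yet their traces agree.
assert trace(matmul(A, B)) == trace(matmul(B, A)) == 212

# Similarity invariance with a hand-picked invertible P (det P = 1).
M = [[1, 2], [3, 4]]
P = [[2, 1], [1, 1]]
P_inv = [[1, -1], [-1, 2]]       # inverse of P
assert trace(matmul(matmul(P_inv, M), P)) == trace(M) == 5

# Trace of the outer product b a^T equals the inner product a^T b.
a = [1, 2, 3]
b = [4, 5, 6]
outer = [[bi * aj for aj in a] for bi in b]    # b a^T, a 3 x 3 matrix
assert trace(outer) == sum(ai * bi for ai, bi in zip(a, b)) == 32
```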