Euler's rotation theorem
==Matrix proof==
A spatial rotation is a linear map in one-to-one correspondence with a {{nowrap|3 × 3}} [[rotation matrix]] {{math|'''R'''}} that transforms a coordinate [[Vector (mathematics and physics)|vector]] {{math|'''x'''}} into {{math|'''X'''}}, that is {{math|'''Rx''' {{=}} '''X'''}}. Therefore, another version of Euler's theorem is that for every rotation {{math|'''R'''}}, there is a nonzero vector {{math|'''n'''}} for which {{math|'''Rn''' {{=}} '''n'''}}; this is exactly the claim that {{math|'''n'''}} is an [[eigenvector]] of {{math|'''R'''}} associated with the [[eigenvalue]] 1. Hence it suffices to prove that 1 is an eigenvalue of {{math|'''R'''}}; the rotation axis of {{math|'''R'''}} will then be the line {{math|''μ'''''n'''}}, where {{math|'''n'''}} is the eigenvector with eigenvalue 1.

A rotation matrix has the fundamental property that its inverse is its transpose, that is
:<math>
\mathbf{R}^\mathsf{T}\mathbf{R} = \mathbf{R}\mathbf{R}^\mathsf{T} = \mathbf{I},
</math>
where {{math|'''I'''}} is the {{nowrap|3 × 3}} identity matrix and superscript T indicates the transposed matrix.

Computing the determinant of this relation shows that a rotation matrix has [[determinant]] ±1. In particular,
:<math>\begin{align}
1 = \det(\mathbf{I}) &= \det\left(\mathbf{R}^\mathsf{T}\mathbf{R}\right) = \det\left(\mathbf{R}^\mathsf{T}\right)\det(\mathbf{R}) = \det(\mathbf{R})^2 \\
\Longrightarrow\qquad \det(\mathbf{R}) &= \pm 1.
\end{align}</math>

A rotation matrix with determinant +1 is a proper rotation, and one with determinant −1 is an ''improper rotation'', that is, a reflection combined with a proper rotation. It will now be shown that a proper rotation matrix {{math|'''R'''}} has at least one invariant vector {{math|'''n'''}}, i.e., {{math|'''Rn''' {{=}} '''n'''}}.
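The two facts just derived can be checked numerically. The following sketch uses a sample proper rotation of 90° about the ''z''-axis; the specific matrix is an illustrative choice, not taken from the article.

```python
import numpy as np

# Sample proper rotation: 90 degrees about the z-axis (illustrative choice).
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([
    [c,  -s,  0.0],
    [s,   c,  0.0],
    [0.0, 0.0, 1.0],
])

# R^T R = R R^T = I: the inverse of a rotation matrix is its transpose.
assert np.allclose(R.T @ R, np.eye(3))
assert np.allclose(R @ R.T, np.eye(3))

# det(R) = +1 for a proper rotation (an improper one would give -1).
assert np.isclose(np.linalg.det(R), 1.0)
```
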
Because this requires that {{math|('''R''' − '''I''')'''n''' {{=}} 0}}, we see that the vector {{math|'''n'''}} must be an [[eigenvector]] of the matrix {{math|'''R'''}} with eigenvalue {{math|''λ'' {{=}} 1}}. Thus, this is equivalent to showing that {{math|det('''R''' − '''I''') {{=}} 0}}. Use the two relations
:<math>
\det(-\mathbf{A}) = (-1)^{3} \det(\mathbf{A}) = - \det(\mathbf{A})
</math>
for any {{nowrap|3 × 3}} matrix {{math|'''A'''}} and
:<math>
\det\left(\mathbf{R}^{-1} \right) = 1
</math>
(since {{math|det('''R''') {{=}} 1}}) to compute
:<math>\begin{align}
&\det(\mathbf{R} - \mathbf{I}) = \det\left((\mathbf{R} - \mathbf{I})^\mathsf{T}\right) \\
{}={} &\det\left(\mathbf{R}^\mathsf{T} - \mathbf{I}\right) = \det\left(\mathbf{R}^{-1} - \mathbf{R}^{-1}\mathbf{R}\right) \\
{}={} &\det\left(\mathbf{R}^{-1}(\mathbf{I} - \mathbf{R})\right) = \det\left(\mathbf{R}^{-1}\right) \, \det(-(\mathbf{R} - \mathbf{I})) \\
{}={} &-\det(\mathbf{R} - \mathbf{I}) \\[3pt]
\Longrightarrow\ 0 ={} &\det(\mathbf{R} - \mathbf{I}).
\end{align}</math>

This shows that {{math|''λ'' {{=}} 1}} is a root (solution) of the [[Characteristic polynomial|characteristic equation]], that is,
:<math>
\det(\mathbf{R} - \lambda \mathbf{I}) = 0 \quad \hbox{for} \quad \lambda = 1.
</math>

In other words, the matrix {{math|'''R''' − '''I'''}} is singular and has a non-zero [[Kernel (matrix)|kernel]], that is, there is at least one non-zero vector, say {{math|'''n'''}}, for which
:<math>
(\mathbf{R} - \mathbf{I}) \mathbf{n} = \mathbf{0} \quad \Longleftrightarrow \quad \mathbf{R}\mathbf{n} = \mathbf{n}.
</math>

The line {{math|''μ'''''n'''}} for real {{mvar|μ}} is invariant under {{math|'''R'''}}, i.e., {{math|''μ'''''n'''}} is a rotation axis. This proves Euler's theorem.
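The singularity of {{math|'''R''' − '''I'''}} and the resulting invariant axis can likewise be confirmed numerically. The sketch below reuses a sample 90° rotation about the ''z''-axis (an illustrative choice) and recovers its axis as the eigenvector for eigenvalue 1.

```python
import numpy as np

# Sample proper rotation: 90 degrees about the z-axis (illustrative choice).
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([
    [c,  -s,  0.0],
    [s,   c,  0.0],
    [0.0, 0.0, 1.0],
])

# det(R - I) = 0: the matrix R - I is singular, as the proof shows.
assert np.isclose(np.linalg.det(R - np.eye(3)), 0.0)

# An eigenvector n for eigenvalue 1 spans the invariant line, i.e. the axis.
eigvals, eigvecs = np.linalg.eig(R)
k = np.argmin(np.abs(eigvals - 1.0))   # index of the eigenvalue closest to 1
n = np.real(eigvecs[:, k])

# Rn = n: the vector n is fixed by the rotation (here the z-axis, as expected).
assert np.allclose(R @ n, n)
```
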
<!-- This result is often used as a showcase for the elementary theory of zeroes of polynomials, proving by a process of elimination that there are no other possibilities than that of the characteristic polynomial having 1 as a zero, but that ultimately results in a longer argument than the one above. First, since the matrix is real and has side 3, its characteristic polynomial must be real and of odd degree, hence it has a real zero. Second, multiplication by the matrix preserves the length of all vectors, and thus all eigenvalues must have absolute value 1. This leaves only 1 and −1 as candidates for being the real eigenvalue. Because complex eigenvalues occur in conjugate pairs, they give no net contribution to the determinant. Hence, the fact that the determinant is 1 means the multiplicity of −1 as a zero of the characteristic equation must be even. If the multiplicity is 2 then the third zero must be 1. If the multiplicity of −1 is 0 then the real zero must be 1. Either way, 1 is a zero of the characteristic equation. -->
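The eigenvalue bookkeeping behind the proof can also be illustrated numerically: all eigenvalues of a rotation matrix lie on the unit circle, their product equals {{math|det '''R''' {{=}} 1}}, and one of them is 1. The sample matrix below (90° about the ''z''-axis) is an illustrative choice.

```python
import numpy as np

# Sample proper rotation: 90 degrees about the z-axis (illustrative choice).
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([
    [c,  -s,  0.0],
    [s,   c,  0.0],
    [0.0, 0.0, 1.0],
])

eigvals = np.linalg.eig(R)[0]   # for this R: i, -i, and 1, in some order

# Every eigenvalue has absolute value 1, since R preserves vector lengths.
assert np.allclose(np.abs(eigvals), 1.0)

# The product of the eigenvalues equals det(R) = 1; the complex-conjugate
# pair contributes |lambda|^2 = 1, forcing the remaining real eigenvalue to be 1.
assert np.isclose(np.prod(eigvals).real, 1.0)
assert np.any(np.isclose(eigvals, 1.0))
```
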