====Determining the axis====
[[File:Rotation decomposition.png|thumb|A rotation {{mvar|R}} around axis {{math|'''u'''}} can be decomposed using 3 endomorphisms {{math|'''P'''}}, {{math|('''I''' − '''P''')}}, and {{math|'''Q'''}} (click to enlarge).]]

Given a {{nowrap|3 × 3}} rotation matrix {{mvar|R}}, a vector {{math|'''u'''}} parallel to the rotation axis must satisfy
:<math>R\mathbf{u} = \mathbf{u},</math>
since the rotation of {{math|'''u'''}} around the rotation axis must result in {{math|'''u'''}}. The equation above may be solved for {{math|'''u'''}}, which is unique up to a scalar factor unless {{mvar|R}} is the [[identity matrix]] {{mvar|I}}. Further, the equation may be rewritten
:<math>R\mathbf{u} = I \mathbf{u} \implies \left(R - I\right) \mathbf{u} = 0,</math>
which shows that {{math|'''u'''}} lies in the [[null space]] of {{math|''R'' − ''I''}}. Viewed in another way, {{math|'''u'''}} is an [[eigenvector]] of {{mvar|R}} corresponding to the [[eigenvalue]] {{math|''λ'' {{=}} 1}}. Every rotation matrix must have this eigenvalue, the other two eigenvalues being [[complex conjugate]]s of each other. It follows that a general rotation matrix in three dimensions has, up to a multiplicative constant, only one real eigenvector.
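As a numerical illustration of the eigenvector characterization above, the axis can be recovered by asking NumPy for the eigenvector of {{mvar|R}} with eigenvalue 1. This is a minimal sketch, not part of the article's derivation; the 90° rotation about the ''z''-axis is an illustrative choice of {{mvar|R}}.

```python
import numpy as np

# An example rotation matrix: 90 degrees about the z-axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

# u lies in the null space of R - I, i.e. it is the eigenvector of R
# corresponding to the eigenvalue 1 (the only real eigenvector for a
# general 3D rotation, up to a multiplicative constant).
eigvals, eigvecs = np.linalg.eig(R)
axis = np.real(eigvecs[:, np.isclose(eigvals, 1.0)]).ravel()
print(axis)  # proportional to (0, 0, 1), the z-axis, up to sign
```

NumPy returns unit-norm eigenvectors, so `axis` is determined only up to sign, matching the "unique up to a scalar factor" statement above.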
One way to determine the rotation axis is by showing that:
:<math>\begin{align}
0 &= R^\mathsf{T} 0 + 0 \\
  &= R^\mathsf{T}\left(R - I\right) \mathbf{u} + \left(R - I\right) \mathbf{u} \\
  &= \left(R^\mathsf{T}R - R^\mathsf{T} + R - I\right) \mathbf{u} \\
  &= \left(I - R^\mathsf{T} + R - I\right) \mathbf{u} \\
  &= \left(R - R^\mathsf{T}\right) \mathbf{u}
\end{align}</math>
Since {{math|(''R'' − ''R''<sup>T</sup>)}} is a [[skew-symmetric matrix]], we can choose {{math|'''u'''}} such that
:<math>[\mathbf u]_{\times} = \left(R - R^\mathsf{T}\right).</math>
The matrix–vector product becomes a [[cross product]] of a vector with itself, ensuring that the result is zero:
:<math>\left(R - R^\mathsf{T}\right) \mathbf{u} = [\mathbf u]_{\times} \mathbf{u} = \mathbf{u} \times \mathbf{u} = 0\,</math>
Therefore, if
:<math>R = \begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \\ \end{bmatrix},</math>
then
:<math>\mathbf{u} = \begin{bmatrix} h-f \\ c-g \\ d-b \\ \end{bmatrix}.</math>
The magnitude of {{math|'''u'''}} computed this way is {{math|{{norm|'''u'''}} {{=}} 2 sin ''θ''}}, where {{mvar|θ}} is the angle of rotation.

This '''does not work''' if {{mvar|R}} is symmetric. Above, if {{math|''R'' − ''R''<sup>T</sup>}} is zero, then all subsequent steps are invalid. In this case, the angle of rotation is 0° or 180° and any nonzero column of {{math|''I'' + ''R''}} is an eigenvector of {{mvar|R}} with eigenvalue 1 because {{math|''R''(''I'' + ''R'') {{=}} ''R'' + ''R''<sup>2</sup> {{=}} ''R'' + ''RR''<sup>T</sup> {{=}} ''I'' + ''R''}}.<ref>{{Cite journal |last1=Palais |first1=Bob |last2=Palais |first2=Richard |date=2007-12-20 |title=Euler's fixed point theorem: The axis of a rotation |journal=Journal of Fixed Point Theory and Applications |language=en |volume=2 |issue=2 |pages=215–220 |doi=10.1007/s11784-007-0042-5 |issn=1661-7738 |mr=2372984 |doi-access=free}}</ref>