Orthogonal matrix
==Properties==

===Matrix properties===
A real square matrix is orthogonal [[if and only if]] its columns form an [[orthonormal basis]] of the [[Euclidean space]] {{math|'''R'''<sup>''n''</sup>}} with the ordinary Euclidean [[dot product]], which is the case if and only if its rows form an orthonormal basis of {{math|'''R'''<sup>''n''</sup>}}. It might be tempting to suppose a matrix with orthogonal (not orthonormal) columns would be called an orthogonal matrix, but such matrices have no special interest and no special name; they only satisfy {{math|1=''M''<sup>T</sup>''M'' = ''D''}}, with {{mvar|D}} a [[diagonal matrix]].

The [[determinant]] of any orthogonal matrix is +1 or −1. This follows from basic facts about determinants: <math display="block">1=\det(I)=\det\left(Q^\mathrm{T}Q\right)=\det\left(Q^\mathrm{T}\right)\det(Q)=\bigl(\det(Q)\bigr)^2 .</math>

The converse is not true; having a determinant of ±1 is no guarantee of orthogonality, even with orthogonal columns, as shown by the following counterexample. <math display="block">\begin{bmatrix} 2 & 0 \\ 0 & \frac{1}{2} \end{bmatrix}</math>

With permutation matrices the determinant matches the [[even and odd permutations|signature]], being +1 or −1 as the parity of the permutation is even or odd, for the determinant is an alternating function of the rows.

Stronger than the determinant restriction is the fact that an orthogonal matrix can always be [[diagonalizable matrix|diagonalized]] over the [[complex number]]s to exhibit a full set of [[Eigenvalues and eigenvectors|eigenvalues]], all of which must have (complex) [[absolute value|modulus]] 1.

===Group properties===
The inverse of every orthogonal matrix is again orthogonal, as is the matrix product of two orthogonal matrices. In fact, the set of all {{math|''n'' × ''n''}} orthogonal matrices satisfies all the axioms of a [[group (mathematics)|group]].
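These properties are easy to confirm numerically. The following sketch (using NumPy, which is an assumption of this illustration, not something the article prescribes) checks the orthonormal-column criterion and the determinant facts, including the counterexample above, as well as closure under products and inverses.

```python
import numpy as np

# A rotation matrix is orthogonal: its columns are orthonormal,
# so Q^T Q = I, and its determinant is +1.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.isclose(np.linalg.det(Q), 1.0)

# The counterexample from the text: orthogonal (but not orthonormal)
# columns and determinant +1, yet M^T M = D is diagonal rather than
# the identity, so M is not an orthogonal matrix.
M = np.diag([2.0, 0.5])
assert np.isclose(np.linalg.det(M), 1.0)
assert not np.allclose(M.T @ M, np.eye(2))

# Group properties: the product of two orthogonal matrices is
# orthogonal, and the inverse of an orthogonal matrix is its transpose.
phi = 1.1
Q2 = np.array([[np.cos(phi), -np.sin(phi)],
               [np.sin(phi),  np.cos(phi)]])
P = Q @ Q2
assert np.allclose(P.T @ P, np.eye(2))
assert np.allclose(np.linalg.inv(Q), Q.T)
```

The checks use floating-point tolerances (`allclose`, `isclose`) because orthogonality is only preserved up to rounding error in finite precision.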
It is a [[compact space|compact]] [[Lie group]] of dimension {{math|{{sfrac|''n''(''n'' − 1)|2}}}}, called the [[orthogonal group]] and denoted by {{math|O(''n'')}}. The orthogonal matrices whose determinant is +1 form a [[connected space|path-connected]] [[normal subgroup]] of {{math|O(''n'')}} of [[index of a subgroup|index]] 2, the [[special orthogonal group]] {{math|SO(''n'')}} of rotations. The [[quotient group]] {{math|O(''n'')/SO(''n'')}} is isomorphic to {{math|O(1)}}, with the projection map choosing [+1] or [−1] according to the determinant. Orthogonal matrices with determinant −1 do not include the identity, and so do not form a subgroup but only a [[coset]]; it is also (separately) connected. Thus each orthogonal group falls into two pieces; and because the projection map [[exact sequence|splits]], {{math|O(''n'')}} is a [[semidirect product]] of {{math|SO(''n'')}} by {{math|O(1)}}.

In practical terms, a comparable statement is that any orthogonal matrix can be produced by taking a rotation matrix and possibly negating one of its columns, as we saw with {{nowrap|2 × 2}} matrices. If {{mvar|n}} is odd, then the semidirect product is in fact a [[direct product of groups|direct product]], and any orthogonal matrix can be produced by taking a rotation matrix and possibly negating all of its columns. This follows from the property of determinants that negating a column negates the determinant, so negating an odd (but not an even) number of columns negates the determinant.

Now consider {{math|(''n'' + 1) × (''n'' + 1)}} orthogonal matrices with bottom right entry equal to 1. The remainder of the last column (and last row) must be zeros, and the product of any two such matrices has the same form. The rest of the matrix is an {{math|''n'' × ''n''}} orthogonal matrix; thus {{math|O(''n'')}} is a subgroup of {{math|O(''n'' + 1)}} (and of all higher groups).
<math display="block">\begin{bmatrix} & & & 0\\ & \mathrm{O}(n) & & \vdots\\ & & & 0\\ 0 & \cdots & 0 & 1 \end{bmatrix}</math>

Since an elementary reflection in the form of a [[Householder matrix]] can reduce any orthogonal matrix to this constrained form, a series of such reflections can bring any orthogonal matrix to the identity; thus an orthogonal group is a [[reflection group]]. The last column can be fixed to any unit vector, and each choice gives a different copy of {{math|O(''n'')}} in {{math|O(''n'' + 1)}}; in this way {{math|O(''n'' + 1)}} is a [[fiber bundle|bundle]] over the unit sphere {{math|''S''<sup>''n''</sup>}} with fiber {{math|O(''n'')}}.

Similarly, {{math|SO(''n'')}} is a subgroup of {{math|SO(''n'' + 1)}}; and any special orthogonal matrix can be generated by [[Givens rotation|Givens plane rotations]] using an analogous procedure. The bundle structure persists: {{math|SO(''n'') ↪ SO(''n'' + 1) → ''S''<sup>''n''</sup>}}. A single rotation can produce a zero in the first row of the last column, and a series of {{math|''n'' − 1}} rotations will zero all but the last row of the last column of an {{math|''n'' × ''n''}} rotation matrix. Since the planes are fixed, each rotation has only one degree of freedom, its angle. By induction, {{math|SO(''n'')}} therefore has <math display="block">(n-1) + (n-2) + \cdots + 1 = \frac{n(n-1)}{2}</math> degrees of freedom, and so does {{math|O(''n'')}}.

Permutation matrices are simpler still; they form, not a Lie group, but only a finite group, the order [[factorial|{{math|''n''!}}]] [[symmetric group]] {{math|S<sub>''n''</sub>}}. By the same kind of argument, {{math|S<sub>''n''</sub>}} is a subgroup of {{math|S<sub>''n'' + 1</sub>}}. The even permutations produce the subgroup of permutation matrices of determinant +1, the order {{math|{{sfrac|''n''!|2}}}} [[alternating group]].
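A Givens plane rotation and the dimension count can be sketched concretely. This is an illustrative NumPy fragment (the function name `givens` and the test values are choices of this sketch, not notation from the article): one rotation, with its single angular degree of freedom, zeroes one chosen entry, and summing one angle per plane over the inductive construction recovers {{math|''n''(''n'' − 1)/2}}.

```python
import numpy as np

def givens(a, b):
    # 2x2 plane rotation G with G @ (a, b) = (r, 0), where r = hypot(a, b)
    r = np.hypot(a, b)
    c, s = (1.0, 0.0) if r == 0 else (a / r, b / r)
    return np.array([[c, s], [-s, c]])

# One rotation, one degree of freedom (its angle): zero a chosen entry.
v = np.array([3.0, 4.0])
G = givens(v[0], v[1])
assert np.allclose(G @ v, [5.0, 0.0])
assert np.isclose(np.linalg.det(G), 1.0)   # a rotation, det = +1

# (n-1) + (n-2) + ... + 1 rotations in the inductive reduction,
# one angle each, matching dim SO(n) = n(n-1)/2.
n = 6
assert sum(range(n)) == n * (n - 1) // 2
```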
===Canonical form===
More broadly, the effect of any orthogonal matrix separates into independent actions on orthogonal two-dimensional subspaces. That is, if {{mvar|Q}} is special orthogonal then one can always find an orthogonal matrix {{mvar|P}}, a (rotational) [[change of basis]], that brings {{mvar|Q}} into block diagonal form: <math display="block">P^\mathrm{T}QP = \begin{bmatrix} R_1 & & \\ & \ddots & \\ & & R_k \end{bmatrix}\ (n\text{ even}), \ P^\mathrm{T}QP = \begin{bmatrix} R_1 & & & \\ & \ddots & & \\ & & R_k & \\ & & & 1 \end{bmatrix}\ (n\text{ odd}),</math> where the matrices {{math|''R''<sub>1</sub>, ..., ''R''<sub>''k''</sub>}} are {{nowrap|2 × 2}} rotation matrices and the remaining entries are zero. Exceptionally, a rotation block may be diagonal, {{math|±''I''}}. Thus, negating one column if necessary, and noting that a {{nowrap|2 × 2}} reflection diagonalizes to a +1 and −1, any orthogonal matrix can be brought to the form <math display="block">P^\mathrm{T}QP = \begin{bmatrix} \begin{matrix}R_1 & & \\ & \ddots & \\ & & R_k\end{matrix} & 0 \\ 0 & \begin{matrix}\pm 1 & & \\ & \ddots & \\ & & \pm 1\end{matrix} \\ \end{bmatrix}.</math>

The matrices {{math|''R''<sub>1</sub>, ..., ''R''<sub>''k''</sub>}} give conjugate pairs of eigenvalues lying on the unit circle in the [[complex number|complex plane]]; so this decomposition confirms that all [[Eigenvalues and eigenvectors|eigenvalues]] have [[absolute value]] 1. If {{mvar|n}} is odd, there is at least one real eigenvalue, +1 or −1; for a {{nowrap|3 × 3}} rotation, the eigenvector associated with +1 is the rotation axis.

===Lie algebra===
Suppose the entries of {{mvar|Q}} are differentiable functions of {{mvar|t}}, and that {{math|1=''t'' = 0}} gives {{math|1=''Q'' = ''I''}}.
Differentiating the orthogonality condition <math display="block">Q^\mathrm{T} Q = I </math> yields <math display="block">\dot{Q}^\mathrm{T} Q + Q^\mathrm{T} \dot{Q} = 0 .</math> Evaluation at {{math|1=''t'' = 0}} ({{math|1=''Q'' = ''I''}}) then implies <math display="block">\dot{Q}^\mathrm{T} = -\dot{Q} .</math>

In Lie group terms, this means that the [[Lie algebra]] of an orthogonal matrix group consists of [[skew-symmetric matrix|skew-symmetric matrices]]. Going the other direction, the [[matrix exponential]] of any skew-symmetric matrix is an orthogonal matrix (in fact, special orthogonal).

For example, the three-dimensional object physics calls [[angular velocity]] is a differential rotation, thus a vector in the Lie algebra <math>\mathfrak{so}(3)</math> tangent to {{math|SO(3)}}. Given {{math|1='''ω''' = (''xθ'', ''yθ'', ''zθ'')}}, with {{math|1='''v''' = (''x'', ''y'', ''z'')}} being a unit vector, the correct skew-symmetric matrix form of {{mvar|'''ω'''}} is <math display="block"> \Omega = \begin{bmatrix} 0 & -z\theta & y\theta \\ z\theta & 0 & -x\theta \\ -y\theta & x\theta & 0 \end{bmatrix} .</math>

The exponential of this is the orthogonal matrix for rotation around axis {{math|'''v'''}} by angle {{mvar|θ}}; setting {{math|1=''c'' = cos {{sfrac|''θ''|2}}}}, {{math|1=''s'' = sin {{sfrac|''θ''|2}}}}, <math display="block">\exp(\Omega) = \begin{bmatrix} 1 - 2s^2 + 2x^2 s^2 & 2xy s^2 - 2z sc & 2xz s^2 + 2y sc\\ 2xy s^2 + 2z sc & 1 - 2s^2 + 2y^2 s^2 & 2yz s^2 - 2x sc\\ 2xz s^2 - 2y sc & 2yz s^2 + 2x sc & 1 - 2s^2 + 2z^2 s^2 \end{bmatrix}.</math>
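The correspondence between skew-symmetric matrices and rotations can be verified numerically. The sketch below (NumPy again, with a hand-rolled truncated power series standing in for a library matrix exponential; `expm_series` is a helper invented for this illustration, adequate only for small ‖Ω‖) exponentiates a skew-symmetric Ω, checks that the result is special orthogonal with all eigenvalues of modulus 1, and compares it to the closed form above with {{math|1=''c'' = cos {{sfrac|''θ''|2}}}}, {{math|1=''s'' = sin {{sfrac|''θ''|2}}}}.

```python
import numpy as np

def expm_series(A, terms=30):
    # Truncated power series exp(A) = I + A + A^2/2! + ...
    # (converges quickly when ||A|| is small, as here)
    E, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        E = E + term
    return E

theta = 0.7
x, y, z = 0.0, 0.0, 1.0                 # unit axis v = (0, 0, 1)
Omega = theta * np.array([[ 0, -z,  y],
                          [ z,  0, -x],
                          [-y,  x,  0]])
R = expm_series(Omega)

assert np.allclose(R.T @ R, np.eye(3))          # exp of skew-symmetric is orthogonal
assert np.isclose(np.linalg.det(R), 1.0)        # in fact special orthogonal
assert np.allclose(np.abs(np.linalg.eigvals(R)), 1.0)  # eigenvalues on the unit circle

# Matches the closed form from the text, entry by entry.
c, s = np.cos(theta / 2), np.sin(theta / 2)
closed = np.array([
    [1 - 2*s**2 + 2*x*x*s**2, 2*x*y*s**2 - 2*z*s*c,    2*x*z*s**2 + 2*y*s*c],
    [2*x*y*s**2 + 2*z*s*c,    1 - 2*s**2 + 2*y*y*s**2, 2*y*z*s**2 - 2*x*s*c],
    [2*x*z*s**2 - 2*y*s*c,    2*y*z*s**2 + 2*x*s*c,    1 - 2*s**2 + 2*z*z*s**2]])
assert np.allclose(R, closed)
```

With the axis chosen along {{math|''z''}}, the result is the familiar rotation by {{mvar|θ}} in the {{math|''xy''}} plane, since {{math|1=1 − 2''s''<sup>2</sup> = cos ''θ''}} and {{math|1=2''sc'' = sin ''θ''}}.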