Orthogonal matrix
==Overview==
[[File:Matrix multiplication transpose.svg|thumb|275px|Visual understanding of multiplication by the transpose of a matrix. If A is an orthogonal matrix, the ij-th element of the product AA<sup>T</sup> vanishes when i ≠ j, because the i-th row of A is orthogonal to the j-th row of A.]]

An orthogonal matrix is the real specialization of a unitary matrix, and thus always a [[normal matrix]]. Although we consider only real matrices here, the definition can be used for matrices with entries from any [[field (mathematics)|field]]. However, orthogonal matrices arise naturally from [[dot product]]s, and for matrices of complex numbers that leads instead to the unitary requirement. Orthogonal matrices preserve the dot product,<ref>[http://tutorial.math.lamar.edu/Classes/LinAlg/OrthogonalMatrix.aspx "Paul's online math notes"]{{Full citation needed|date=January 2013|note=See talk page.}}, Paul Dawkins, [[Lamar University]], 2008. Theorem 3(c)</ref> so, for vectors {{math|'''u'''}} and {{math|'''v'''}} in an {{mvar|n}}-dimensional real [[Euclidean space]], <math display="block">{\mathbf u} \cdot {\mathbf v} = \left(Q {\mathbf u}\right) \cdot \left(Q {\mathbf v}\right) </math> where {{mvar|Q}} is an orthogonal matrix.

To see the inner product connection, consider a vector {{math|'''v'''}} in an {{mvar|n}}-dimensional real [[Euclidean space]]. Written with respect to an orthonormal basis, the squared length of {{math|'''v'''}} is {{math|'''v'''<sup>T</sup>'''v'''}}. If a linear transformation, in matrix form {{math|''Q'''''v'''}}, preserves vector lengths, then <math display="block">{\mathbf v}^\mathrm{T}{\mathbf v} = (Q{\mathbf v})^\mathrm{T}(Q{\mathbf v}) = {\mathbf v}^\mathrm{T} Q^\mathrm{T} Q {\mathbf v} .</math> Thus [[dimension (vector space)|finite-dimensional]] linear isometries—rotations, reflections, and their combinations—produce orthogonal matrices. The converse is also true: every orthogonal matrix defines an orthogonal transformation.
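The preservation of dot products and lengths described above can be checked numerically. This is a minimal sketch (not from the article) using NumPy, with an arbitrary 2×2 rotation standing in for {{mvar|Q}}:

```python
# Sketch: an orthogonal matrix Q satisfies Q^T Q = I and preserves
# dot products, u . v == (Qu) . (Qv), hence also squared lengths.
import numpy as np

theta = 0.7  # arbitrary rotation angle (assumed for illustration)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # 2x2 rotation, hence orthogonal

# Q^T Q = I characterizes orthogonality
assert np.allclose(Q.T @ Q, np.eye(2))

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])

# the dot product is preserved: u . v == (Qu) . (Qv)
assert np.isclose(u @ v, (Q @ u) @ (Q @ v))
# so the squared length v^T v is preserved as well
assert np.isclose(v @ v, (Q @ v) @ (Q @ v))
```

Any rotation or reflection matrix would serve equally well here; the assertions fail for a generic (non-orthogonal) matrix.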
However, linear algebra includes orthogonal transformations between spaces which may be neither finite-dimensional nor of the same dimension, and these have no orthogonal matrix equivalent.

Orthogonal matrices are important for a number of reasons, both theoretical and practical. The {{math|''n'' × ''n''}} orthogonal matrices form a [[group (mathematics)|group]] under matrix multiplication, the [[orthogonal group]] denoted by {{math|O(''n'')}}, which—with its subgroups—is widely used in mathematics and the physical sciences. For example, the [[point group]] of a molecule is a subgroup of O(3). Because floating-point versions of orthogonal matrices have advantageous properties, they are key to many algorithms in numerical linear algebra, such as [[QR decomposition|{{mvar|QR}} decomposition]]. As another example, with appropriate normalization the [[discrete cosine transform]] (used in [[MP3]] compression) is represented by an orthogonal matrix.
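Two of the claims above—the group structure of O(''n'') and the role of orthogonal matrices in {{mvar|QR}} decomposition—can be illustrated in a few lines of NumPy. This is an assumed sketch, not part of the article:

```python
# Sketch: (1) numpy.linalg.qr factors a matrix as A = QR with Q orthogonal;
# (2) orthogonal matrices form a group: products and inverses stay orthogonal.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Q, R = np.linalg.qr(A)

# the Q factor of the QR decomposition is orthogonal: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(4))

# group closure: the product of two orthogonal matrices is orthogonal
theta = 0.3
P = np.eye(4)
P[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]  # rotation in one 2D plane of R^4
QP = Q @ P
assert np.allclose(QP.T @ QP, np.eye(4))

# the inverse of an orthogonal matrix is simply its transpose
assert np.allclose(np.linalg.inv(Q), Q.T)
```

The last assertion reflects the numerically advantageous property alluded to in the text: inverting an orthogonal matrix costs only a transpose and introduces no additional rounding error.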