== Special kinds ==
{| class="wikitable" style="float:right; margin:0ex 0ex 2ex 2ex;"
|-
! Name !! Example with ''n'' = 3
|-
| [[Diagonal matrix]] || style="text-align:center;" | <math>
\begin{bmatrix}
a_{11} & 0 & 0 \\
0 & a_{22} & 0 \\
0 & 0 & a_{33}
\end{bmatrix}
</math>
|-
| [[Lower triangular matrix]] || style="text-align:center;" | <math>
\begin{bmatrix}
a_{11} & 0 & 0 \\
a_{21} & a_{22} & 0 \\
a_{31} & a_{32} & a_{33}
\end{bmatrix}
</math>
|-
| [[Upper triangular matrix]] || style="text-align:center;" | <math>
\begin{bmatrix}
a_{11} & a_{12} & a_{13} \\
0 & a_{22} & a_{23} \\
0 & 0 & a_{33}
\end{bmatrix}
</math>
|}

=== Diagonal or triangular matrix ===
If all entries outside the main diagonal are zero, <math>A</math> is called a [[diagonal matrix]]. If all entries below (resp. above) the main diagonal are zero, <math>A</math> is called an upper (resp. lower) [[triangular matrix]].

=== Identity matrix ===
The [[identity matrix]] <math>I_n</math> of size <math>n</math> is the <math>n \times n</math> matrix in which all the elements on the [[main diagonal]] are equal to 1 and all other elements are equal to 0, e.g.
<math display="block">
I_1 = \begin{bmatrix} 1 \end{bmatrix},\ 
I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix},\ 
\ldots,\ 
I_n = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}.
</math>
It is a square matrix of order {{nowrap|<math>n</math>,}} and also a special kind of [[diagonal matrix]]. The term ''identity matrix'' refers to the property of matrix multiplication that
<math display="block">I_m A = A I_n = A</math>
for any <math>m \times n</math> matrix {{nowrap|<math>A</math>.}}

=== Invertible matrix and its inverse ===
A square matrix <math>A</math> is called ''[[invertible matrix|invertible]]'' or ''non-singular'' if there exists a matrix <math>B</math> such that<ref>{{Harvard citations |last1=Brown |year=1991 |nb=yes |loc=Definition I.2.28}}</ref><ref>{{Harvard citations |last1=Brown |year=1991 |nb=yes |loc=Definition I.5.13}}</ref>
<math display="block">AB = BA = I_n.</math>
If <math>B</math> exists, it is unique and is called the ''[[inverse matrix]]'' of {{nowrap|<math>A</math>,}} denoted {{nowrap|<math>A^{-1}</math>.}}
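For instance, a <math>2 \times 2</math> matrix
<math display="block">A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}</math>
is invertible if and only if {{nowrap|<math>ad - bc \neq 0</math>,}} in which case
<math display="block">A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix},</math>
as can be verified by computing the product <math>A A^{-1}</math> directly.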
=== Symmetric or skew-symmetric matrix ===
A square matrix <math>A</math> that is equal to its transpose, i.e., {{nowrap|<math>A^{\mathsf T} = A</math>,}} is a [[symmetric matrix]]. If instead {{nowrap|<math>A^{\mathsf T} = -A</math>,}} then <math>A</math> is called a [[skew-symmetric matrix]].

For a complex square matrix {{nowrap|<math>A</math>,}} often the appropriate analogue of the transpose is the [[conjugate transpose]] {{nowrap|<math>A^*</math>,}} defined as the transpose of the [[complex conjugate]] of {{nowrap|<math>A</math>.}} A complex square matrix <math>A</math> satisfying <math>A^* = A</math> is called a [[Hermitian matrix]]. If instead {{nowrap|<math>A^* = -A</math>,}} then <math>A</math> is called a [[skew-Hermitian matrix]].

By the [[spectral theorem]], real symmetric (or complex Hermitian) matrices have an orthogonal (or unitary) [[eigenbasis]]; i.e., every vector is expressible as a [[linear combination]] of eigenvectors. In both cases, all eigenvalues are real.<ref>{{Harvard citations |last1=Horn |last2=Johnson |year=1985 |nb=yes |loc=Theorem 2.5.6}}</ref>

=== Definite matrix ===
{| class="wikitable" style="float:right; text-align:center; margin:0ex 0ex 2ex 2ex;"
|-
! [[Positive definite matrix|Positive definite]] !! [[Indefinite matrix|Indefinite]]
|-
| <math>\begin{bmatrix} 1/4 & 0 \\ 0 & 1 \end{bmatrix}</math>
| <math>\begin{bmatrix} 1/4 & 0 \\ 0 & -1/4 \end{bmatrix}</math>
|-
| {{math|1=''Q''(''x'', ''y'') = 1/4 ''x''<sup>2</sup> + ''y''<sup>2</sup>}}
| {{math|1=''Q''(''x'', ''y'') = 1/4 ''x''<sup>2</sup> − 1/4 ''y''<sup>2</sup>}}
|-
| [[File:Ellipse in coordinate system with semi-axes labelled.svg|150px]] <br> Points such that {{math|1=''Q''(''x'', ''y'') = 1}} <br> ([[Ellipse]]).
| [[File:Hyperbola2 SVG.svg|100x100px]] <br> Points such that {{math|1=''Q''(''x'', ''y'') = 1}} <br> ([[Hyperbola]]).
|}
A symmetric {{math|''n'' × ''n''}} matrix is called ''[[positive-definite matrix|positive-definite]]'' (respectively negative-definite; indefinite), if for all nonzero vectors <math>x \in \mathbb{R}^n</math> the associated [[quadratic form]] given by
<math display="block" id="quadratic_forms">Q(\mathbf{x}) = \mathbf{x}^\mathsf{T} A \mathbf{x}</math>
takes only positive values (respectively only negative values; both some negative and some positive values).<ref>{{Harvard citations |last1=Horn |last2=Johnson |year=1985 |nb=yes |loc=Chapter 7}}</ref> If the quadratic form takes only non-negative (respectively only non-positive) values, the symmetric matrix is called positive-semidefinite (respectively negative-semidefinite); hence the matrix is indefinite precisely when it is neither positive-semidefinite nor negative-semidefinite.

A symmetric matrix is positive-definite if and only if all its eigenvalues are positive.<ref>{{Harvard citations |last1=Horn |last2=Johnson |year=1985 |nb=yes |loc=Theorem 7.2.1}}</ref> The table at the right shows two possibilities for {{nowrap|2 × 2}} matrices.

Allowing as input two different vectors instead yields the [[bilinear form]] associated to {{mvar|A}}:<ref>{{Harvard citations |last1=Horn |last2=Johnson |year=1985 |nb=yes |loc=Example 4.0.6, p. 169}}</ref>
<math display="block">B_A(\mathbf{x}, \mathbf{y}) = \mathbf{x}^\mathsf{T} A \mathbf{y}.</math>

=== Orthogonal matrix ===
An ''[[orthogonal matrix]]'' is a [[Matrix (mathematics)#Square matrices|square matrix]] with [[real number|real]] entries whose columns and rows are [[orthogonal]] [[unit vector]]s (i.e., [[orthonormality|orthonormal]] vectors). Equivalently, a matrix ''A'' is orthogonal if its [[transpose]] is equal to its [[inverse matrix|inverse]]:
<math display="block">A^\textsf{T} = A^{-1},</math>
which entails
<math display="block">A^\textsf{T} A = A A^\textsf{T} = I,</math>
where ''I'' is the [[identity matrix]].

An orthogonal matrix {{mvar|A}} is necessarily [[Invertible matrix|invertible]] (with inverse {{math|1=''A''<sup>−1</sup> = ''A''<sup>T</sup>}}), [[Unitary matrix|unitary]] ({{math|1=''A''<sup>−1</sup> = ''A''*}}), and [[Normal matrix|normal]] ({{math|1=''A''*''A'' = ''AA''*}}). The [[determinant]] of any orthogonal matrix is either +1 or −1. The [[special orthogonal group]] <math>\operatorname{SO}(n)</math> consists of the {{math|''n'' × ''n''}} orthogonal matrices with [[determinant]] +1. The [[complex number|complex]] analogue of an orthogonal matrix is a [[unitary matrix]].
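For example, every plane [[rotation matrix]]
<math display="block">R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}</math>
is orthogonal: the identity <math>R_\theta^\mathsf{T} R_\theta = I</math> follows from <math>\cos^2\theta + \sin^2\theta = 1</math>, and since <math>\det R_\theta = 1</math>, the matrix <math>R_\theta</math> belongs to <math>\operatorname{SO}(2)</math>.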
=== Normal matrix ===
A real or complex square matrix <math>A</math> is called ''[[normal matrix|normal]]'' if {{nowrap|<math>A^* A = AA^*</math>.}} If a real square matrix is symmetric, skew-symmetric, or orthogonal, then it is normal. If a complex square matrix is Hermitian, skew-Hermitian, or unitary, then it is normal. Normal matrices are of interest mainly because they include the types of matrices just listed and form the broadest class of matrices for which the [[spectral theorem]] holds.<ref>Artin, ''Algebra'', 2nd edition, Pearson, 2018, section 8.6.</ref>
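A normal matrix need not itself be of one of the types listed above. For instance, the real matrix
<math display="block">A = \begin{bmatrix} 1 & -1 \\ 1 & 1 \end{bmatrix}</math>
is neither symmetric, skew-symmetric, nor orthogonal, yet it is normal, since
<math display="block">A A^\mathsf{T} = A^\mathsf{T} A = \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}.</math>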