== Properties ==

=== Invertible matrix theorem ===
Let {{math|'''A'''}} be a square {{mvar|n}}-by-{{mvar|n}} matrix over a [[field (mathematics)|field]] {{mvar|K}} (e.g., the field {{tmath|\mathbb R}} of real numbers). The following statements are equivalent, i.e., they are either all true or all false for any given matrix:<ref>{{Cite web|last=Weisstein|first=Eric W.|title=Invertible Matrix Theorem|url=https://mathworld.wolfram.com/InvertibleMatrixTheorem.html|access-date=2020-09-08|website=mathworld.wolfram.com|language=en}}</ref>
* {{math|'''A'''}} is invertible, i.e., it has an inverse under matrix multiplication, i.e., there exists a {{math|'''B'''}} such that {{math|1='''AB''' = '''I'''{{sub|''n''}} = '''BA'''}}. (In this statement, "invertible" can equivalently be replaced with "left-invertible" or "right-invertible", in which one-sided inverses are considered.)
* The linear transformation mapping {{math|'''x'''}} to {{math|'''Ax'''}} is invertible, i.e., it has an inverse under function composition. (There, again, "invertible" can equivalently be replaced with either "left-invertible" or "right-invertible".)
* The [[transpose]] {{math|'''A'''<sup>T</sup>}} is an invertible matrix.
* {{math|'''A'''}} is [[Row equivalence|row-equivalent]] to the {{mvar|n}}-by-{{mvar|n}} [[identity matrix]] {{math|'''I'''{{sub|''n''}}}}.
* {{math|'''A'''}} is [[Row equivalence|column-equivalent]] to the {{mvar|n}}-by-{{mvar|n}} identity matrix {{math|'''I'''{{sub|''n''}}}}.
* {{math|'''A'''}} has {{mvar|n}} [[pivot position]]s.
* {{math|'''A'''}} has full [[Rank (linear algebra)|rank]]: {{math|1=rank '''A''' = ''n''}}.
* {{math|'''A'''}} has a trivial [[Kernel (linear algebra)|kernel]]: {{math|1=ker('''A''') = {'''0'''}.}}
* The linear transformation mapping {{math|'''x'''}} to {{math|'''Ax'''}} is bijective; that is, the equation {{math|1='''Ax''' = '''b'''}} has exactly one solution for each {{math|'''b'''}} in {{mvar|K{{sup|n}}}}. (There, "bijective" can equivalently be replaced with "[[injective]]" or "[[surjective]]".)
* The columns of {{math|'''A'''}} form a [[basis of a vector space|basis]] of {{mvar|K{{sup|n}}}}. (In this statement, "basis" can equivalently be replaced with either "linearly independent set" or "spanning set".)
* The rows of {{math|'''A'''}} form a basis of {{mvar|K{{sup|n}}}}. (Similarly, here, "basis" can equivalently be replaced with either "linearly independent set" or "spanning set".)
* The [[determinant]] of {{math|'''A'''}} is nonzero: {{math|det '''A''' ≠ 0}}. In general, a square matrix over a [[commutative ring]] is invertible if and only if its determinant is a [[Unit (ring theory)|unit]] (i.e., a multiplicatively invertible element) of that ring.
* The number 0 is not an [[eigenvalue]] of {{math|'''A'''}}. (More generally, a number <math>\lambda</math> is an eigenvalue of {{math|'''A'''}} if the matrix <math>\mathbf{A}-\lambda \mathbf{I}</math> is singular, where {{math|'''I'''}} is the identity matrix.)
* The matrix {{math|'''A'''}} can be expressed as a finite product of [[Elementary matrix|elementary matrices]].
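Several of these equivalent conditions can be checked numerically. The following sketch (not part of the article; it uses NumPy and an arbitrarily chosen matrix for illustration) verifies the determinant, rank, and unique-solution conditions against an explicitly computed inverse:

```python
import numpy as np

# A hypothetical 3x3 matrix chosen only for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# Three of the theorem's equivalent statements:
is_invertible = not np.isclose(np.linalg.det(A), 0.0)  # det A != 0
full_rank = np.linalg.matrix_rank(A) == n              # rank A = n
B = np.linalg.inv(A)                                   # AB = I_n = BA
assert is_invertible and full_rank
assert np.allclose(A @ B, np.eye(n)) and np.allclose(B @ A, np.eye(n))

# Ax = b has exactly one solution for each b in R^n:
b = np.array([1.0, 0.0, -1.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```

Floating-point checks like these are heuristics, not proofs: `np.isclose` on the determinant can misjudge matrices that are invertible but nearly singular.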
=== Other properties ===
Furthermore, the following properties hold for an invertible matrix {{math|'''A'''}}:
* <math>(\mathbf A^{-1})^{-1} = \mathbf A</math>
* <math>(k \mathbf A)^{-1} = k^{-1} \mathbf A^{-1}</math> for nonzero scalar {{mvar|k}}
* <math>(\mathbf{Ax})^+ = \mathbf x^+ \mathbf A^{-1}</math> if {{math|'''A'''}} has orthonormal columns, where {{math|{{sup|+}}}} denotes the [[Moore–Penrose inverse]] and {{math|'''x'''}} is a vector
* <math>(\mathbf A^\mathrm{T})^{-1} = (\mathbf A^{-1})^\mathrm{T}</math>
* For any invertible {{mvar|n}}-by-{{mvar|n}} matrices {{math|'''A'''}} and {{math|'''B'''}}, <math>(\mathbf{AB})^{-1} = \mathbf B^{-1} \mathbf A^{-1}.</math> More generally, if <math>\mathbf A_1, \dots, \mathbf A_k</math> are invertible {{mvar|n}}-by-{{mvar|n}} matrices, then <math>(\mathbf A_1 \mathbf A_2 \cdots \mathbf A_{k-1} \mathbf A_k)^{-1} = \mathbf A_k^{-1} \mathbf A_{k-1}^{-1} \cdots \mathbf A_2^{-1} \mathbf A_1^{-1}.</math>
* <math>\det \mathbf A^{-1} = (\det \mathbf A)^{-1}.</math>

The rows of the inverse matrix {{math|'''V'''}} of a matrix {{math|'''U'''}} are [[orthonormal]] to the columns of {{math|'''U'''}} (and vice versa interchanging rows for columns). To see this, suppose that {{math|1='''UV''' = '''VU''' = '''I'''}}, where the rows of {{math|'''V'''}} are denoted as <math>v_i^{\mathrm{T}}</math> and the columns of {{math|'''U'''}} as <math>u_j</math> for <math>1 \leq i,j \leq n.</math> Then clearly, the [[Dot product|Euclidean inner product]] of any two satisfies <math>v_i^{\mathrm{T}} u_j = \delta_{i,j}.</math> This property can also be useful in constructing the inverse of a square matrix in some instances, where a set of [[orthogonal]] vectors (but not necessarily orthonormal vectors) to the columns of {{math|'''U'''}} is known. In that case, one can apply the iterative [[Gram–Schmidt process]] to this initial set to determine the rows of the inverse {{math|'''V'''}}.
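The identities above, and the row–column orthonormality relation <math>v_i^{\mathrm{T}} u_j = \delta_{i,j}</math>, can be spot-checked numerically. A minimal sketch (illustrative only, using random matrices that are invertible with probability 1 over the reals):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# (AB)^-1 = B^-1 A^-1
assert np.allclose(np.linalg.inv(A @ B),
                   np.linalg.inv(B) @ np.linalg.inv(A))
# (A^T)^-1 = (A^-1)^T
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)
# det(A^-1) = (det A)^-1
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))

# Rows v_i of V = A^-1 against columns u_j of A: entry (i, j) of VA
# is exactly v_i^T u_j, so VA = I encodes v_i^T u_j = delta_ij.
V = np.linalg.inv(A)
assert np.allclose(V @ A, np.eye(4))
```

Random Gaussian matrices are used here purely for convenience; any invertible matrices would do.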
A matrix that is its own inverse (i.e., a matrix {{math|'''A'''}} such that {{math|1='''A''' = '''A'''{{sup|−1}}}} and consequently {{math|1='''A'''{{sup|2}} = '''I'''}}) is called an [[involutory matrix]].

=== In relation to its adjugate ===
The [[Adjugate matrix|adjugate]] of a matrix {{math|'''A'''}} can be used to find the inverse of {{math|'''A'''}} as follows: If {{math|'''A'''}} is an invertible matrix, then
: <math>\mathbf{A}^{-1} = \frac{1}{\det(\mathbf{A})} \operatorname{adj}(\mathbf{A}).</math>

=== In relation to the identity matrix ===
It follows from the [[associativity]] of matrix multiplication that if
: <math>\mathbf{AB} = \mathbf{I} \ </math>
for ''finite square'' matrices {{math|'''A'''}} and {{math|'''B'''}}, then also
: <math>\mathbf{BA} = \mathbf{I}\ </math><ref>{{Cite book | last1=Horn | first1=Roger A. | last2=Johnson | first2=Charles R. | title=Matrix Analysis | publisher=[[Cambridge University Press]] | isbn=978-0-521-38632-6 | year=1985 | page=14 }}.</ref>

=== Density ===
Over the field of real numbers, the set of singular {{mvar|n}}-by-{{mvar|n}} matrices, considered as a [[subset]] of {{tmath|\mathbb R^{n \times n},}} is a [[null set]], that is, has [[Lebesgue measure]] zero. That is true because singular matrices are the roots of the [[determinant]] function, which is a [[continuous function]] because it is a [[polynomial]] in the entries of the matrix. Thus in the language of [[measure theory]], [[almost all]] {{mvar|n}}-by-{{mvar|n}} matrices are invertible.

Furthermore, the set of {{mvar|n}}-by-{{mvar|n}} invertible matrices is [[open set|open]] and [[dense set|dense]] in the [[topological space]] of all {{mvar|n}}-by-{{mvar|n}} matrices. Equivalently, the set of singular matrices is [[closed set|closed]] and [[nowhere dense]] in the space of {{mvar|n}}-by-{{mvar|n}} matrices. In practice, however, non-invertible matrices may be encountered.
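The adjugate formula above can be implemented directly from the cofactor definition. A sketch (illustrative only; the helper `adjugate` is not a standard library function, and this cofactor expansion is far slower than Gaussian elimination for large matrices):

```python
import numpy as np

def adjugate(M):
    """Adjugate of a square matrix: transpose of its cofactor matrix."""
    n = M.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then take the determinant.
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# A^-1 = adj(A) / det(A)
A_inv = adjugate(A) / np.linalg.det(A)
assert np.allclose(A @ A_inv, np.eye(2))
assert np.allclose(A_inv @ A, np.eye(2))  # BA = I follows from AB = I
```

The last assertion illustrates the identity-matrix property: for finite square matrices, a one-sided inverse is automatically two-sided.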
In [[numerical analysis|numerical calculations]], matrices that are invertible but close to a non-invertible matrix may still be problematic and are said to be [[Condition number#Matrices|ill-conditioned]].
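Ill-conditioning can be observed directly: the matrix below is invertible (its determinant is {{math|10{{sup|−10}} ≠ 0}}) but lies very close to a singular matrix, so its condition number is enormous. A small sketch with illustrative values:

```python
import numpy as np

# Invertible, but a tiny perturbation away from the singular
# all-ones matrix: det A = eps != 0.
eps = 1e-10
A = np.array([[1.0, 1.0],
              [1.0, 1.0 + eps]])

assert not np.isclose(np.linalg.det(A), 0.0, atol=1e-12)  # invertible
cond = np.linalg.cond(A)
assert cond > 1e8  # huge condition number: A is ill-conditioned
```

A condition number of roughly {{math|10{{sup|''k''}}}} means up to about {{mvar|k}} decimal digits of accuracy can be lost when solving {{math|1='''Ax''' = '''b'''}} in floating point.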