==Eigenvalues and eigenvectors==
{{main|Eigenvalues and eigenvectors|Generalized eigenvector}}
Given an {{math|''n'' × ''n''}} [[Square matrix#Square matrices|square matrix]] {{math|''A''}} of [[Real number|real]] or [[Complex number|complex]] numbers, an ''[[eigenvalue]]'' {{math|''λ''}} and its associated ''[[generalized eigenvector]]'' {{math|'''v'''}} are a pair obeying the relation<ref name="Axler">{{Citation | last = Axler | first = Sheldon | author-link = Sheldon Axler | title = Down with Determinants! | journal = American Mathematical Monthly | volume = 102 | issue = 2 | pages = 139–154 | url = http://www.axler.net/DwD.pdf | year = 1995 | doi = 10.2307/2975348 | jstor = 2975348 | access-date = 2012-07-31 | archive-url = https://web.archive.org/web/20120913111605/http://www.axler.net/DwD.pdf | archive-date = 2012-09-13 }}</ref>
:<math>\left(A - \lambda I\right)^k {\mathbf v} = 0,</math>
where {{math|'''v'''}} is a nonzero {{math|''n'' × 1}} column vector, {{math|''I''}} is the {{math|''n'' × ''n''}} [[identity matrix]], {{math|''k''}} is a positive integer, and both {{math|''λ''}} and {{math|'''v'''}} are allowed to be complex even when {{math|''A''}} is real. When {{math|1=''k'' = 1}}, the vector is called simply an ''[[eigenvector]]'', and the pair is called an ''eigenpair''. In this case, {{math|1=''A'''''v''' = ''λ'''''v'''}}. Any eigenvalue {{math|''λ''}} of {{math|''A''}} has ordinary<ref group="note">The term "ordinary" is used here only to emphasize the distinction between "eigenvector" and "generalized eigenvector".</ref> eigenvectors associated to it, for if {{math|''k''}} is the smallest integer such that {{math|1=(''A'' − ''λI'')<sup>''k''</sup> '''v''' = 0}} for a generalized eigenvector {{math|'''v'''}}, then {{math|1=(''A'' − ''λI'')<sup>''k''−1</sup> '''v'''}} is an ordinary eigenvector. The value {{math|''k''}} can always be taken as less than or equal to {{math|''n''}}.
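The {{math|1=''k'' = 1}} case of this relation can be checked numerically. The following sketch (an illustrative aside using the NumPy library, not part of the article's exposition) computes the eigenpairs of a real 2 × 2 rotation matrix, whose eigenvalues are the complex pair ±''i'', and verifies that each pair satisfies {{math|1=(''A'' − ''λI'')'''v''' = 0}}:

```python
import numpy as np

# A real 2 x 2 rotation by 90 degrees; its eigenvalues are the complex
# pair +/- i, showing that lambda and v may be complex even for real A.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# np.linalg.eig returns ordinary eigenpairs (the k = 1 case).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v of `eigenvectors` satisfies (A - lambda I) v = 0.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose((A - lam * np.eye(2)) @ v, 0.0)
```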
In particular, {{math|1=(''A'' − ''λI'')<sup>''n''</sup> '''v''' = 0}} for all generalized eigenvectors {{math|'''v'''}} associated with {{math|''λ''}}. For each eigenvalue {{math|''λ''}} of {{math|''A''}}, the [[kernel (matrix)|kernel]] {{math|ker(''A'' − ''λI'')}} consists of all eigenvectors associated with {{math|''λ''}} (along with 0), called the ''[[eigenspace]]'' of {{math|''λ''}}, while the vector space {{math|ker((''A'' − ''λI'')<sup>''n''</sup>)}} consists of all generalized eigenvectors, and is called the ''[[generalized eigenspace]]''. The ''[[geometric multiplicity]]'' of {{math|''λ''}} is the dimension of its eigenspace. The ''[[algebraic multiplicity]]'' of {{math|''λ''}} is the dimension of its generalized eigenspace. The latter terminology is justified by the equation
:<math>p_A\left(z\right) = \det\left( zI - A \right) = \prod_{i=1}^k (z - \lambda_i)^{\alpha_i},</math>
where {{math|det}} is the [[determinant]] function, the {{math|''λ''<sub>''i''</sub>}} are all the distinct eigenvalues of {{math|''A''}} and the {{math|''α''<sub>''i''</sub>}} are the corresponding algebraic multiplicities. The function {{math|1=''p<sub>A</sub>''(''z'')}} is the ''[[characteristic polynomial]]'' of {{math|''A''}}. So the algebraic multiplicity is the multiplicity of the eigenvalue as a [[Properties of polynomial roots|zero]] of the characteristic polynomial. Since any eigenvector is also a generalized eigenvector, the geometric multiplicity is less than or equal to the algebraic multiplicity. The algebraic multiplicities sum up to {{math|''n''}}, the degree of the characteristic polynomial. The equation {{math|1=''p<sub>A</sub>''(''z'') = 0}} is called the ''characteristic equation'', as its roots are exactly the eigenvalues of {{math|''A''}}.
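A minimal numerical sketch of the two multiplicities (illustrative only, assuming NumPy): for the defective matrix {{math|''A''}} below, the characteristic polynomial is {{math|(''z'' − 2)<sup>2</sup>}}, so the eigenvalue 2 has algebraic multiplicity 2, while its eigenspace {{math|ker(''A'' − 2''I'')}} is only one-dimensional:

```python
import numpy as np

# A defective matrix: eigenvalue 2 has algebraic multiplicity 2
# but geometric multiplicity 1.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

# Characteristic polynomial p_A(z) = det(zI - A) = z^2 - 4z + 4.
coeffs = np.poly(A)       # approximately [1., -4., 4.]

# Its roots are the eigenvalues, repeated with algebraic multiplicity:
# z = 2, twice (up to rounding error).
roots = np.roots(coeffs)

# Geometric multiplicity = dim ker(A - 2I) = n - rank(A - 2I).
geometric = 2 - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
assert geometric == 1
```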
By the [[Cayley–Hamilton theorem]], {{math|''A''}} itself obeys the same equation: {{math|1=''p<sub>A</sub>''(''A'') = 0}}.<ref group="note">where the constant term is multiplied by the identity matrix {{math|''I''}}.</ref> As a consequence, the columns of the matrix <math display="inline">\prod_{i \ne j} (A - \lambda_iI)^{\alpha_i}</math> must be either 0 or generalized eigenvectors of the eigenvalue {{math|''λ''<sub>''j''</sub>}}, since they are annihilated by <math>(A - \lambda_jI)^{\alpha_j}</math>. In fact, the [[column space]] is the generalized eigenspace of {{math|''λ''<sub>''j''</sub>}}. Any collection of generalized eigenvectors of distinct eigenvalues is linearly independent, so a basis for all of {{math|'''C'''<sup>''n''</sup>}} can be chosen consisting of generalized eigenvectors. More particularly, this basis {{math|{'''v'''<sub>''i''</sub>}{{su|p=''n''|b=''i''=1}}}} can be chosen and organized so that
* if {{math|'''v'''<sub>''i''</sub>}} and {{math|'''v'''<sub>''j''</sub>}} have the same eigenvalue, then so does {{math|'''v'''<sub>''k''</sub>}} for each {{math|''k''}} between {{math|''i''}} and {{math|''j''}}, and
* if {{math|'''v'''<sub>''i''</sub>}} is not an ordinary eigenvector, and if {{math|''λ''<sub>''i''</sub>}} is its eigenvalue, then {{math|1=(''A'' − ''λ''<sub>''i''</sub>''I'')'''v'''<sub>''i''</sub> = '''v'''<sub>''i''−1</sub>}} (in particular, {{math|'''v'''<sub>1</sub>}} must be an ordinary eigenvector).
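Both the Cayley–Hamilton relation and the column-space claim can be verified on a small example (an illustrative sketch assuming NumPy, not part of the article): for the matrix below, with eigenvalues 2 (algebraic multiplicity 2) and 3, {{math|1=''p<sub>A</sub>''(''A'') = 0}}, and the columns of {{math|(''A'' − 2''I'')<sup>2</sup>}} are annihilated by {{math|''A'' − 3''I''}}:

```python
import numpy as np

# Upper-triangular, so the eigenvalues sit on the diagonal: 2, 2, 3.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# Characteristic polynomial p_A(z) = (z-2)^2 (z-3) = z^3 - 7z^2 + 16z - 12.
c = np.poly(A)
I = np.eye(3)

# Cayley-Hamilton: p_A(A) = A^3 - 7A^2 + 16A - 12I = 0
# (the constant term multiplies the identity matrix).
pA = c[0] * A @ A @ A + c[1] * A @ A + c[2] * A + c[3] * I
assert np.allclose(pA, 0.0)

# The columns of (A - 2I)^2 lie in the generalized eigenspace of
# lambda = 3: they are annihilated by (A - 3I)^1.
P = np.linalg.matrix_power(A - 2.0 * I, 2)
assert np.allclose((A - 3.0 * I) @ P, 0.0)
```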
If these basis vectors are placed as the column vectors of a matrix {{math|1=''V'' = ['''v'''<sub>1</sub> '''v'''<sub>2</sub> ⋯ '''v'''<sub>''n''</sub>]}}, then {{math|''V''}} can be used to convert {{math|''A''}} to its [[Jordan normal form]]:
:<math>V^{-1}AV = \begin{bmatrix} \lambda_1 & \beta_1 & 0 & \ldots & 0 \\ 0 & \lambda_2 & \beta_2 & \ldots & 0 \\ 0 & 0 & \lambda_3 & \ldots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \ldots & \lambda_n \end{bmatrix},</math>
where the {{math|''λ''<sub>''i''</sub>}} are the eigenvalues, {{math|1=''β''<sub>''i''</sub> = 1}} if {{math|1=(''A'' − ''λ''<sub>''i''+1</sub>''I'')'''v'''<sub>''i''+1</sub> = '''v'''<sub>''i''</sub>}} and {{math|1=''β''<sub>''i''</sub> = 0}} otherwise. More generally, if {{math|''W''}} is any invertible matrix, and {{math|''λ''}} is an eigenvalue of {{math|''A''}} with generalized eigenvector {{math|'''v'''}}, then {{math|1=(''W''{{i sup|−1}}''AW'' − ''λI'')<sup>''k''</sup> ''W''{{i sup|−''k''}}'''v''' = 0}}. Thus {{math|''λ''}} is an eigenvalue of {{math|''W''{{i sup|−1}}''AW''}} with generalized eigenvector {{math|''W''{{i sup|−''k''}}'''v'''}}. That is, [[similar matrices]] have the same eigenvalues.

===Normal, Hermitian, and real-symmetric matrices===
{{main|Adjoint matrix|Normal matrix|Hermitian matrix}}
The [[conjugate transpose|adjoint]] {{math|''M''<sup>*</sup>}} of a complex matrix {{math|''M''}} is the transpose of the conjugate of {{math|''M''}}: {{math|1=''M'' <sup>*</sup> = {{overline|''M''}} <sup>T</sup>}}. A square matrix {{math|''A''}} is called ''[[Normal matrix|normal]]'' if it commutes with its adjoint: {{math|1=''A''<sup>*</sup>''A'' = ''AA''<sup>*</sup>}}. It is called ''[[Hermitian matrix|Hermitian]]'' if it is equal to its adjoint: {{math|1=''A''<sup>*</sup> = ''A''}}. All Hermitian matrices are normal.
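These definitions translate directly into code (an illustrative sketch assuming NumPy, not part of the article): the adjoint is the conjugate transpose `M.conj().T`, a Hermitian matrix equals its own adjoint, and a real rotation matrix shows that a matrix can be normal without being Hermitian:

```python
import numpy as np

# A Hermitian matrix: equal to its adjoint (conjugate transpose).
H = np.array([[2.0,        1.0 - 1.0j],
              [1.0 + 1.0j, 3.0       ]])
assert np.allclose(H, H.conj().T)

# Every Hermitian matrix is normal: it commutes with its adjoint.
assert np.allclose(H @ H.conj().T, H.conj().T @ H)

# A normal matrix that is not Hermitian: a real 90-degree rotation.
# Here N* = N^T = -N, yet N N* = N* N = I.
N = np.array([[0.0, -1.0],
              [1.0,  0.0]])
assert np.allclose(N @ N.T, N.T @ N)
assert not np.allclose(N, N.T)
```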
If {{math|''A''}} has only real elements, then the adjoint is just the transpose, and {{math|''A''}} is Hermitian if and only if it is [[symmetric matrix|symmetric]]. When applied to column vectors, the adjoint can be used to define the canonical inner product on {{math|'''C'''<sup>''n''</sup>}}: {{math|1='''w''' ⋅ '''v''' = '''w'''<sup>*</sup> '''v'''}}.<ref group="note">This ordering of the inner product (with the conjugate-linear position on the left) is preferred by physicists. Algebraists often place the conjugate-linear position on the right: {{math|1='''w''' ⋅ '''v''' = '''v'''<sup>*</sup> '''w'''}}.</ref> Normal, Hermitian, and real-symmetric matrices have several useful properties:
* Every generalized eigenvector of a normal matrix is an ordinary eigenvector.
* Any normal matrix is similar to a diagonal matrix, since its Jordan normal form is diagonal.
* Eigenvectors of distinct eigenvalues of a normal matrix are orthogonal.
* The null space and the image (or column space) of a normal matrix are orthogonal to each other.
* For any normal matrix {{math|''A''}}, {{math|'''C'''<sup>''n''</sup>}} has an orthonormal basis consisting of eigenvectors of {{math|''A''}}. The corresponding matrix of eigenvectors is [[Unitary matrix|unitary]].
* The eigenvalues of a Hermitian matrix are real, since {{math|1=({{overline|''λ''}} − ''λ'')'''v''' = (''A''<sup>*</sup> − ''A'')'''v''' = (''A'' − ''A'')'''v''' = 0}} for a non-zero eigenvector {{math|'''v'''}}.
* If {{math|''A''}} is real, there is an orthonormal basis for {{math|'''R'''<sup>''n''</sup>}} consisting of eigenvectors of {{math|''A''}} if and only if {{math|''A''}} is symmetric.
It is possible for a real or complex matrix to have all real eigenvalues without being Hermitian. For example, a real [[triangular matrix]] has its eigenvalues along its diagonal, but in general is not symmetric.
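The real-symmetric case can be sketched numerically (illustrative only, assuming NumPy): `np.linalg.eigh`, which is specialized to Hermitian and real-symmetric input, returns real eigenvalues and an orthonormal eigenvector basis, giving the factorization {{math|1=''A'' = ''QΛQ''<sup>T</sup>}} with {{math|''Q''}} orthogonal:

```python
import numpy as np

# Symmetrize a random matrix to obtain a real symmetric A.
rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = (M + M.T) / 2

# eigh is specialized to Hermitian/real-symmetric input: it returns
# real eigenvalues (in ascending order) and orthonormal eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

assert eigenvalues.dtype == np.float64                 # eigenvalues are real
assert np.allclose(Q.T @ Q, np.eye(3))                 # columns are orthonormal
assert np.allclose(A, Q @ np.diag(eigenvalues) @ Q.T)  # A = Q Lambda Q^T
```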