Tridiagonal matrix
==Properties==

A tridiagonal matrix is a matrix that is both upper and lower [[Hessenberg matrix]].<ref>{{cite book|first1=Roger A.| last1=Horn | first2=Charles R. | last2=Johnson | title = Matrix Analysis | publisher= Cambridge University Press |year= 1985 |isbn = 0521386322 | page=28}}</ref> In particular, a tridiagonal matrix is a [[direct sum]] of ''p'' 1-by-1 and ''q'' 2-by-2 matrices such that {{nowrap|1=''p'' + 2''q'' = ''n''}}, the dimension of the tridiagonal matrix. Although a general tridiagonal matrix is not necessarily [[Symmetric matrix|symmetric]] or [[Hermitian matrix|Hermitian]], many of those that arise when solving linear algebra problems have one of these properties. Furthermore, if a real tridiagonal matrix ''A'' satisfies ''a''<sub>''k'',''k''+1</sub> ''a''<sub>''k''+1,''k''</sub> > 0 for all ''k'', so that the signs of its entries are symmetric, then it is [[similar (linear algebra)|similar]] to a Hermitian matrix, by a diagonal change of basis matrix; hence its [[eigenvalue]]s are real. If the strict inequality is relaxed to ''a''<sub>''k'',''k''+1</sub> ''a''<sub>''k''+1,''k''</sub> ≥ 0, then by continuity the eigenvalues are still guaranteed to be real, but the matrix need no longer be similar to a Hermitian matrix.<ref>Horn & Johnson, page 174</ref>

The [[Set (mathematics)|set]] of all ''n'' × ''n'' tridiagonal matrices forms a (3''n'' − 2)-[[dimensional]] [[vector space]].

Many linear algebra [[algorithm]]s require significantly less [[Computational complexity theory|computational effort]] when applied to diagonal matrices, and this improvement often carries over to tridiagonal matrices as well.

===Determinant===
{{Main|Continuant (mathematics)}}

The [[determinant]] of a tridiagonal matrix ''A'' of order ''n'' can be computed from a three-term [[recurrence relation]].<ref>{{Cite journal | last1 = El-Mikkawy | first1 = M. E. A.
| title = On the inverse of a general tridiagonal matrix | doi = 10.1016/S0096-3003(03)00298-4 | journal = Applied Mathematics and Computation | volume = 150 | issue = 3 | pages = 669–679 | year = 2004 }}</ref> Write ''f''<sub>1</sub> = |''a''<sub>1</sub>| = ''a''<sub>1</sub> (i.e., ''f''<sub>1</sub> is the determinant of the 1-by-1 matrix consisting only of ''a''<sub>1</sub>), and let

:<math>f_n = \begin{vmatrix} a_1 & b_1 \\ c_1 & a_2 & b_2 \\ & c_2 & \ddots & \ddots \\ & & \ddots & \ddots & b_{n-1} \\ & & & c_{n-1} & a_n \end{vmatrix}.</math>

The sequence (''f''<sub>''i''</sub>) is called the [[continuant (mathematics)|continuant]] and satisfies the recurrence relation

:<math>f_n = a_n f_{n-1} - c_{n-1}b_{n-1}f_{n-2}</math>

with initial values ''f''<sub>0</sub> = 1 and ''f''<sub>−1</sub> = 0. The cost of computing the determinant of a tridiagonal matrix using this formula is linear in ''n'', while the cost is cubic for a general matrix.

===Inversion===

The [[inverse matrix|inverse]] of a non-singular tridiagonal matrix ''T''

:<math>T = \begin{pmatrix} a_1 & b_1 \\ c_1 & a_2 & b_2 \\ & c_2 & \ddots & \ddots \\ & & \ddots & \ddots & b_{n-1} \\ & & & c_{n-1} & a_n \end{pmatrix}</math>

is given by

:<math>(T^{-1})_{ij} = \begin{cases} (-1)^{i+j}b_i \cdots b_{j-1} \theta_{i-1} \phi_{j+1}/\theta_n & \text{ if } i < j\\ \theta_{i-1} \phi_{j+1}/\theta_n & \text{ if } i = j\\ (-1)^{i+j}c_j \cdots c_{i-1} \theta_{j-1} \phi_{i+1}/\theta_n & \text{ if } i > j\\ \end{cases}</math>

where the ''θ''<sub>''i''</sub> satisfy the recurrence relation

:<math>\theta_i = a_i \theta_{i-1} - b_{i-1}c_{i-1}\theta_{i-2} \qquad i=2,3,\ldots,n</math>

with initial conditions ''θ''<sub>0</sub> = 1, ''θ''<sub>1</sub> = ''a''<sub>1</sub>, and the ''φ''<sub>''i''</sub> satisfy

:<math>\phi_i = a_i \phi_{i+1} - b_i c_i \phi_{i+2} \qquad i=n-1,\ldots,1</math>

with initial conditions ''φ''<sub>''n''+1</sub> = 1 and ''φ''<sub>''n''</sub> = ''a''<sub>''n''</sub>.<ref>{{Cite journal | last1 = Da Fonseca | first1 = C. M. | doi = 10.1016/j.cam.2005.08.047 | title = On the eigenvalues of some tridiagonal matrices | journal = Journal of Computational and Applied Mathematics | volume = 200 | pages = 283–286 | year = 2007 | doi-access = free }}</ref><ref>{{Cite journal | last1 = Usmani | first1 = R. A. | doi = 10.1016/0024-3795(94)90414-6 | title = Inversion of a tridiagonal Jacobi matrix | journal = Linear Algebra and Its Applications | volume = 212–213 | pages = 413–414 | year = 1994 | doi-access = free }}</ref>

Closed-form solutions can be computed for special cases such as [[symmetric matrix|symmetric matrices]] with all diagonal and off-diagonal elements equal<ref>{{Cite journal | last1 = Hu | first1 = G. Y. | last2 = O'Connell | first2 = R. F. | doi = 10.1088/0305-4470/29/7/020 | title = Analytical inversion of symmetric tridiagonal matrices | journal = Journal of Physics A: Mathematical and General | volume = 29 | issue = 7 | pages = 1511 | year = 1996 | bibcode = 1996JPhA...29.1511H }}</ref> or [[Toeplitz matrices]]<ref>{{Cite journal | last1 = Huang | first1 = Y. | last2 = McColl | first2 = W. F.
| doi = 10.1088/0305-4470/30/22/026 | title = Analytical inversion of general tridiagonal matrices | journal = Journal of Physics A: Mathematical and General | volume = 30 | issue = 22 | pages = 7919 | year = 1997 | bibcode = 1997JPhA...30.7919H }}</ref> and for the general case as well.<ref>{{Cite journal | last1 = Mallik | first1 = R. K. | doi = 10.1016/S0024-3795(00)00262-7 | title = The inverse of a tridiagonal matrix | journal = Linear Algebra and Its Applications | volume = 325 | pages = 109–139 | year = 2001 | issue = 1–3 | doi-access = free }}</ref><ref>{{Cite journal | last1 = Kılıç | first1 = E. | doi = 10.1016/j.amc.2007.07.046 | title = Explicit formula for the inverse of a tridiagonal matrix by backward continued fractions | journal = Applied Mathematics and Computation | volume = 197 | pages = 345–357 | year = 2008 }}</ref>

In general, the inverse of a tridiagonal matrix is a [[semiseparable matrix]] and vice versa.<ref name="VandebrilBarel2008">{{cite book|author1=Raf Vandebril|author2=Marc Van Barel|author3=Nicola Mastronardi|title=Matrix Computations and Semiseparable Matrices. Volume I: Linear Systems|year=2008|publisher=JHU Press|isbn=978-0-8018-8714-7|at=Theorem 1.38, p. 41}}</ref> The inverse of a symmetric tridiagonal matrix can be written as a [[single-pair matrix]] (a.k.a.
''generator-representable semiseparable matrix'') of the form<ref name="Meurant1992">{{cite journal |last1=Meurant |first1=Gerard |title=A review on the inverse of symmetric tridiagonal and block tridiagonal matrices |journal=SIAM Journal on Matrix Analysis and Applications |date=1992 |volume=13 |issue=3 |pages=707–728 |doi=10.1137/0613045 |url=https://doi.org/10.1137/0613045 |url-access=subscription }}</ref><ref>{{cite journal |last1=Bossu |first1=Sebastien |title=Tridiagonal and single-pair matrices and the inverse sum of two single-pair matrices |journal=Linear Algebra and Its Applications |date=2024 |volume=699 |pages=129–158 |doi=10.1016/j.laa.2024.06.018 |url=https://authors.elsevier.com/a/1jOTP5YnCtZEc |arxiv=2304.06100 }}</ref>

:<math>\begin{pmatrix} \alpha_1 & -\beta_1 \\ -\beta_1 & \alpha_2 & -\beta_2 \\ & \ddots & \ddots & \ddots & \\ & & \ddots & \ddots & -\beta_{n-1} \\ & & & -\beta_{n-1} & \alpha_n \end{pmatrix}^{-1} = \begin{pmatrix} a_1 b_1 & a_1 b_2 & \cdots & a_1 b_n \\ a_1 b_2 & a_2 b_2 & \cdots & a_2 b_n \\ \vdots & \vdots & \ddots & \vdots \\ a_1 b_n & a_2 b_n & \cdots & a_n b_n \end{pmatrix} = \left( a_{\min(i,j)} b_{\max(i,j)} \right)</math>

where

:<math>\begin{cases} \displaystyle a_i = \frac{\beta_{i} \cdots \beta_{n-1}}{\delta_i \cdots \delta_n\,b_n} \\ \displaystyle b_i = \frac{\beta_1 \cdots \beta_{i-1}}{d_1 \cdots d_i}\end{cases}</math>

with

:<math>\begin{cases} d_n = \alpha_n,\quad d_{i-1} = \alpha_{i-1} - \frac{\beta_{i-1}^2}{d_{i}}, & i = n, n-1, \ldots, 2, \\ \delta_1 = \alpha_1, \quad \delta_{i+1} = \alpha_{i+1} - \frac{\beta_{i}^2}{\delta_{i}}, & i = 1, 2, \ldots, n-1. \end{cases}</math>

===Solution of linear system===
{{Main|tridiagonal matrix algorithm}}

When ''A'' is tridiagonal, a system of equations ''Ax'' = ''b'' for <math>b\in \R^n</math> can be solved by an efficient form of Gaussian elimination known as the [[tridiagonal matrix algorithm]], requiring ''O''(''n'') operations.<ref>{{cite book|first1=Gene H.|last1=Golub|author-link1=Gene H. Golub |first2=Charles F. |last2=Van Loan|author-link2=Charles F. Van Loan| title =Matrix Computations| edition=3rd|publisher=The Johns Hopkins University Press|year=1996|isbn=0-8018-5414-8}}</ref>

===Eigenvalues===

When a tridiagonal matrix is also [[Toeplitz matrix|Toeplitz]], there is a simple closed-form solution for its eigenvalues, namely<ref>{{Cite journal | doi = 10.1002/nla.1811| title = Tridiagonal Toeplitz matrices: Properties and novel applications| journal = Numerical Linear Algebra with Applications| volume = 20| issue = 2| pages = 302| year = 2013| last1 = Noschese | first1 = S. | last2 = Pasquini | first2 = L. | last3 = Reichel | first3 = L. }}</ref><ref>This can also be written as <math> a + 2 \sqrt{bc} \cos(k \pi / {(n+1)}) </math> because <math> \cos(x) = -\cos(\pi-x) </math>, as is done in: {{Cite journal | last1 = Kulkarni | first1 = D. | last2 = Schmidt | first2 = D. | last3 = Tsui | first3 = S. K. | title = Eigenvalues of tridiagonal pseudo-Toeplitz matrices | doi = 10.1016/S0024-3795(99)00114-7 | journal = Linear Algebra and Its Applications | volume = 297 | pages = 63–80 | year = 1999 | issue = 1–3 | url = https://hal.archives-ouvertes.fr/hal-01461924/file/KST.pdf }}</ref>

:<math> a - 2 \sqrt{bc} \cos \left (\frac{k\pi}{n+1} \right ), \qquad k=1, \ldots, n. </math>

A real [[symmetric matrix|symmetric]] tridiagonal matrix has real eigenvalues, and all the eigenvalues are [[Eigenvalues and eigenvectors#Algebraic multiplicity|distinct (simple)]] if all off-diagonal elements are nonzero.<ref>{{Cite book | last1 = Parlett | first1 = B.N.
| title = The Symmetric Eigenvalue Problem |orig-year = 1980 | publisher =SIAM |date=1997 |oclc= 228147822 |series=Classics in applied mathematics |volume=20 |isbn=0-89871-402-8 }}</ref> Numerous methods exist for the numerical computation of the eigenvalues of a real symmetric tridiagonal matrix to arbitrary finite precision, typically requiring <math>O(n^2)</math> operations for a matrix of size <math>n\times n</math>, although fast algorithms exist which (without parallel computation) require only <math>O(n\log n)</math>.<ref>{{Cite journal |last1 = Coakley |first1= E.S. |last2=Rokhlin | first2=V. | title =A fast divide-and-conquer algorithm for computing the spectra of real symmetric tridiagonal matrices | doi = 10.1016/j.acha.2012.06.003 |journal = [[Applied and Computational Harmonic Analysis]] |volume = 34 |issue = 3 |pages = 379–414 |year =2012 |doi-access = free }}</ref> An ''unreduced'' symmetric tridiagonal matrix is one in which all off-diagonal elements are nonzero; its eigenvalues are distinct, and its eigenvectors are unique up to a scale factor and mutually orthogonal.<ref>{{cite thesis |last1=Dhillon |first1=Inderjit Singh |title=A New O(n<sup>2</sup>) Algorithm for the Symmetric Tridiagonal Eigenvalue/Eigenvector Problem |page=8 |type=PhD |publisher=University of California, Berkeley |id=CSD-97-971, ADA637073 |date=1997 |url=http://www.cs.utexas.edu/~inderjit/public_papers/thesis.pdf}}</ref>

===Similarity to symmetric tridiagonal matrix===

For ''unsymmetric'' (or ''nonsymmetric'') tridiagonal matrices, one can compute the eigendecomposition using a similarity transformation. Given a real tridiagonal, nonsymmetric matrix

:<math> T = \begin{pmatrix} a_1 & b_1 \\ c_1 & a_2 & b_2 \\ & c_2 & \ddots & \ddots \\ & & \ddots & \ddots & b_{n-1} \\ & & & c_{n-1} & a_n \end{pmatrix} </math>

where <math>b_i \neq c_i</math>.
Assume that each product of off-diagonal entries is {{em|strictly}} positive, <math>b_i c_i > 0</math>, and define a transformation matrix <math>D</math> by<ref name=kreer_operator>{{Cite journal | last1 = Kreer | first1 = M. | title = Analytic birth-death processes: a Hilbert space approach | doi = 10.1016/0304-4149(94)90112-0 | journal = Stochastic Processes and Their Applications | volume = 49 | issue = 1 | pages = 65–74 | year = 1994 }}</ref>

:<math> D := \operatorname{diag}(\delta_1 , \dots, \delta_n) \quad \text{for} \quad \delta_i := \begin{cases} 1 & , \, i=1 \\ \sqrt{\frac{c_{i-1} \dots c_1}{b_{i-1} \dots b_1}} & , \, i=2,\dots,n \,. \end{cases} </math>

The [[Matrix_similarity|similarity transformation]] <math>D^{-1} T D</math> yields a ''symmetric'' tridiagonal matrix <math>J</math>:<ref>{{cite web |first=Gérard |last=Meurant |title=Tridiagonal matrices |date=October 2008 |url=http://www.math.hkbu.edu.hk/ICM/LecturesAndSeminars/08OctMaterials/1/Slide3.pdf |via=Institute for Computational Mathematics, Hong Kong Baptist University}}</ref><ref name=kreer_operator/>

:<math> J := D^{-1} T D = \begin{pmatrix} a_1 & \sgn b_1 \, \sqrt{b_1 c_1} \\ \sgn b_1 \, \sqrt{b_1 c_1} & a_2 & \sgn b_2 \, \sqrt{b_2 c_2} \\ & \sgn b_2 \, \sqrt{b_2 c_2} & \ddots & \ddots \\ & & \ddots & \ddots & \sgn b_{n-1} \, \sqrt{b_{n-1} c_{n-1}} \\ & & & \sgn b_{n-1} \, \sqrt{b_{n-1} c_{n-1}} & a_n \end{pmatrix} \,. </math>

Note that <math>T</math> and <math>J</math> have the same eigenvalues.
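The three-term continuant recurrence in the Determinant subsection translates directly into a linear-time routine. The following Python sketch (the function name tridiag_det is illustrative, not from any cited source) keeps only the last two continuant values:

```python
def tridiag_det(a, b, c):
    """Determinant of the tridiagonal matrix with diagonal a (length n),
    superdiagonal b and subdiagonal c (each length n-1), via the continuant
    recurrence f_k = a_k * f_{k-1} - c_{k-1} * b_{k-1} * f_{k-2}
    with f_0 = 1 and f_{-1} = 0.  O(n) time, O(1) extra space."""
    f_prev, f_curr = 0.0, 1.0          # f_{-1}, f_0
    for k in range(len(a)):
        term = c[k - 1] * b[k - 1] * f_prev if k > 0 else 0.0
        f_prev, f_curr = f_curr, a[k] * f_curr - term
    return f_curr
```

For the 3-by-3 matrix with diagonal (1, 2, 3) and unit off-diagonals, the recurrence gives f1 = 1, f2 = 1, f3 = 2, which matches the determinant computed by cofactor expansion.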
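The θ/φ formulas in the Inversion subsection can likewise be transcribed directly. Below is a minimal Python sketch (the name tridiag_inverse is illustrative); it computes the entry-wise products naively, so it is a transcription of the formula rather than an optimized inverter, and it assumes θ<sub>n</sub> = det T ≠ 0:

```python
def tridiag_inverse(a, b, c):
    """Inverse of the non-singular tridiagonal matrix T with diagonal a
    (length n), superdiagonal b and subdiagonal c (each length n-1),
    via the theta (forward) and phi (backward) recurrences."""
    n = len(a)
    theta = [0.0] * (n + 1)
    theta[0], theta[1] = 1.0, a[0]
    for i in range(2, n + 1):
        theta[i] = a[i - 1] * theta[i - 1] - b[i - 2] * c[i - 2] * theta[i - 2]
    phi = [0.0] * (n + 2)
    phi[n + 1], phi[n] = 1.0, a[n - 1]
    for i in range(n - 1, 0, -1):
        phi[i] = a[i - 1] * phi[i + 1] - b[i - 1] * c[i - 1] * phi[i + 2]
    det = theta[n]                      # determinant of T, assumed nonzero
    inv = [[0.0] * n for _ in range(n)]
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if i < j:
                prod = 1.0
                for k in range(i, j):   # b_i ... b_{j-1}
                    prod *= b[k - 1]
                inv[i - 1][j - 1] = (-1) ** (i + j) * prod * theta[i - 1] * phi[j + 1] / det
            elif i == j:
                inv[i - 1][j - 1] = theta[i - 1] * phi[j + 1] / det
            else:
                prod = 1.0
                for k in range(j, i):   # c_j ... c_{i-1}
                    prod *= c[k - 1]
                inv[i - 1][j - 1] = (-1) ** (i + j) * prod * theta[j - 1] * phi[i + 1] / det
    return inv
```

As a sanity check, multiplying T by the returned matrix should reproduce the identity up to rounding error.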
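The tridiagonal matrix algorithm from the Solution of linear system subsection (also known as the Thomas algorithm) can be sketched as follows. This version (the name thomas_solve is illustrative) performs no pivoting, so it assumes the eliminations are stable, e.g. that T is diagonally dominant:

```python
def thomas_solve(a, b, c, d):
    """Solve T x = d for tridiagonal T with diagonal a (length n),
    superdiagonal b and subdiagonal c (each length n-1), in O(n):
    forward elimination followed by back substitution.  No pivoting."""
    n = len(a)
    if n == 1:
        return [d[0] / a[0]]
    cp = [0.0] * n                      # modified superdiagonal
    dp = [0.0] * n                      # modified right-hand side
    cp[0] = b[0] / a[0]
    dp[0] = d[0] / a[0]
    for i in range(1, n):
        m = a[i] - c[i - 1] * cp[i - 1]     # pivot after eliminating c_{i-1}
        cp[i] = b[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - c[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

For the diagonally dominant system with diagonal (2, 2, 2), unit off-diagonals, and right-hand side (3, 4, 3), the exact solution is (1, 1, 1).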
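The closed-form eigenvalue formula for the tridiagonal Toeplitz case in the Eigenvalues subsection is a one-liner. A small Python sketch (the name toeplitz_tridiag_eigs is illustrative), assuming bc > 0 so the square root is real:

```python
import math

def toeplitz_tridiag_eigs(a, b, c, n):
    """Eigenvalues of the n-by-n tridiagonal Toeplitz matrix with constant
    diagonal a, superdiagonal b and subdiagonal c, assuming b*c > 0:
    lambda_k = a - 2*sqrt(b*c)*cos(k*pi/(n+1)),  k = 1, ..., n."""
    root = 2.0 * math.sqrt(b * c)
    return [a - root * math.cos(k * math.pi / (n + 1)) for k in range(1, n + 1)]
```

For example, for the 2-by-2 matrix with a = 2 and b = c = 1 the formula gives 2 − 2cos(π/3) = 1 and 2 − 2cos(2π/3) = 3, matching the eigenvalues 1 and 3 found from the characteristic polynomial.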
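Because the entries of J = D⁻¹TD above are available in closed form, D never needs to be formed explicitly. A minimal Python sketch (the helper name symmetrize_tridiagonal is illustrative), assuming every product b<sub>i</sub>c<sub>i</sub> is strictly positive; math.copysign implements the sgn b<sub>i</sub> factor:

```python
import math

def symmetrize_tridiagonal(a, b, c):
    """Given tridiagonal T with diagonal a, superdiagonal b and subdiagonal c
    (all products b_i * c_i > 0), return the diagonal and off-diagonal of the
    similar symmetric matrix J = D^{-1} T D: the diagonal of J equals that of
    T, and its off-diagonal entries are sgn(b_i) * sqrt(b_i * c_i)."""
    off = [math.copysign(math.sqrt(bi * ci), bi) for bi, ci in zip(b, c)]
    return list(a), off
```

Since J is similar to T, the two share a characteristic polynomial; e.g. for T = [[1, 4], [1, 2]] the sketch yields J = [[1, 2], [2, 2]], which has the same trace (3) and determinant (−2) as T.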