{{Short description|Matrix with non-zero elements only in a diagonal band}}
In [[mathematics]], particularly [[Matrix (mathematics)|matrix theory]], a '''band matrix''' or '''banded matrix''' is a [[sparse matrix]] whose non-zero entries are confined to a diagonal ''band'', comprising the [[main diagonal]] and zero or more diagonals on either side.

==Band matrix==<!-- [[Bandwidth (sparse matrix)]], [[matrix bandwidth]], [[bandwidth (matrix)]], [[bandwidth (matrix theory)]] redirect here -->
===Bandwidth===
Formally, consider an ''n''×''n'' matrix ''A'' = (''a''<sub>''i'',''j''</sub>). If all matrix elements are zero outside a diagonally bordered band whose range is determined by constants ''k''<sub>1</sub> and ''k''<sub>2</sub>:
:<math>a_{i,j}=0 \quad\mbox{if}\quad j<i-k_1 \quad\mbox{or}\quad j>i+k_2; \quad k_1, k_2 \ge 0,</math>
then the quantities ''k''<sub>1</sub> and ''k''<sub>2</sub> are called the '''{{visible anchor|lower bandwidth}}''' and '''{{visible anchor|upper bandwidth}}''', respectively.{{sfn|Golub|Van Loan|1996|loc=§1.2.1}} The '''{{visible anchor|bandwidth}}''' of the matrix is the maximum of ''k''<sub>1</sub> and ''k''<sub>2</sub>; in other words, it is the smallest number ''k'' such that <math>a_{i,j}=0</math> whenever <math>|i-j| > k</math>.{{sfn|Atkinson|1989|p=527}}

==Examples==
*A band matrix with ''k''<sub>1</sub> = ''k''<sub>2</sub> = 0 is a [[diagonal matrix]], with bandwidth 0.
*A band matrix with ''k''<sub>1</sub> = ''k''<sub>2</sub> = 1 is a [[tridiagonal matrix]], with bandwidth 1.
*For ''k''<sub>1</sub> = ''k''<sub>2</sub> = 2 one has a pentadiagonal matrix, and so on.
*[[triangular matrix|Triangular matrices]]:
**For ''k''<sub>1</sub> = 0, ''k''<sub>2</sub> = ''n''−1, one obtains the definition of an upper [[triangular matrix]];
**similarly, for ''k''<sub>1</sub> = ''n''−1, ''k''<sub>2</sub> = 0 one obtains a lower triangular matrix.
*Upper and lower [[Hessenberg matrix|Hessenberg matrices]]
*[[Toeplitz matrices]] when bandwidth is limited
*[[block-diagonal matrix|Block diagonal matrices]]
*[[shift matrix|Shift matrices]] and [[shear matrix|shear matrices]]
*Matrices in [[Jordan normal form]]
*A [[skyline matrix]], also called a "variable band matrix"{{snd}}a generalization of the band matrix
*The inverses of [[Lehmer matrix|Lehmer matrices]] are constant tridiagonal matrices and are thus band matrices.

==Applications==
In [[numerical analysis]], matrices arising from [[finite element]] or [[finite difference]] problems are often banded. Such matrices can be viewed as descriptions of the coupling between the problem variables; the banded property corresponds to the fact that variables are not coupled over arbitrarily large distances. Such matrices can be further divided{{snd}}for instance, banded matrices exist in which every element in the band is nonzero.

Problems in higher dimensions also lead to banded matrices, in which case the band itself also tends to be sparse. For instance, a partial differential equation on a square domain (discretized using central differences) yields a matrix with a bandwidth equal to the [[square root]] of the matrix dimension, but inside the band only 5 diagonals are nonzero. Unfortunately, applying [[Gaussian elimination]] (or equivalently an [[LU decomposition]]) to such a matrix results in the band being filled in by many non-zero elements.

==Band storage==
Band matrices are usually stored by keeping only the diagonals in the band; the remaining entries are implicitly zero. For example, a [[tridiagonal matrix]] has bandwidth 1.
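The bandwidth definitions above can be checked numerically: the lower and upper bandwidths of a dense array follow directly from the positions of its nonzero entries. The following is a minimal sketch using NumPy (an assumption; any array library would do), where the helper name <code>bandwidths</code> is ours, not a library function:

```python
import numpy as np

def bandwidths(A):
    """Return (k1, k2): the lower and upper bandwidth of a square matrix."""
    i, j = np.nonzero(np.asarray(A))
    if i.size == 0:                   # the zero matrix has no nonzero band
        return 0, 0
    k1 = max(int((i - j).max()), 0)   # furthest nonzero below the main diagonal
    k2 = max(int((j - i).max()), 0)   # furthest nonzero above the main diagonal
    return k1, k2

# A tridiagonal matrix has k1 = k2 = 1, hence bandwidth 1.
T = np.diag([2.0] * 4) + np.diag([1.0] * 3, 1) + np.diag([1.0] * 3, -1)
print(bandwidths(T))  # (1, 1)
```

Consistent with the examples above, a diagonal matrix gives (0, 0) and an upper triangular ''n''×''n'' matrix gives (0, ''n''−1).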
The 6-by-6 matrix
:<math> \begin{bmatrix} B_{11} & B_{12} & 0 & \cdots & \cdots & 0 \\ B_{21} & B_{22} & B_{23} & \ddots & \ddots & \vdots \\ 0 & B_{32} & B_{33} & B_{34} & \ddots & \vdots \\ \vdots & \ddots & B_{43} & B_{44} & B_{45} & 0 \\ \vdots & \ddots & \ddots & B_{54} & B_{55} & B_{56} \\ 0 & \cdots & \cdots & 0 & B_{65} & B_{66} \end{bmatrix} </math>
is stored as the 6-by-3 matrix
:<math> \begin{bmatrix} 0 & B_{11} & B_{12}\\ B_{21} & B_{22} & B_{23} \\ B_{32} & B_{33} & B_{34} \\ B_{43} & B_{44} & B_{45} \\ B_{54} & B_{55} & B_{56} \\ B_{65} & B_{66} & 0 \end{bmatrix}. </math>

A further saving is possible when the matrix is symmetric. For example, consider a symmetric 6-by-6 matrix with an upper bandwidth of 2:
:<math> \begin{bmatrix} A_{11} & A_{12} & A_{13} & 0 & \cdots & 0 \\ & A_{22} & A_{23} & A_{24} & \ddots & \vdots \\ & & A_{33} & A_{34} & A_{35} & 0 \\ & & & A_{44} & A_{45} & A_{46} \\ & \mathrm{sym} & & & A_{55} & A_{56} \\ & & & & & A_{66} \end{bmatrix}. </math>
This matrix is stored as the 6-by-3 matrix:
:<math> \begin{bmatrix} A_{11} & A_{12} & A_{13} \\ A_{22} & A_{23} & A_{24} \\ A_{33} & A_{34} & A_{35} \\ A_{44} & A_{45} & A_{46} \\ A_{55} & A_{56} & 0 \\ A_{66} & 0 & 0 \end{bmatrix}. </math>

==Band form of sparse matrices==
From a computational point of view, working with band matrices is always preferable to working with similarly dimensioned dense [[square matrices]]. A band matrix can be likened in complexity to a rectangular matrix whose row dimension is equal to the bandwidth of the band matrix. Thus the work involved in performing operations such as multiplication falls significantly, often leading to huge savings in terms of calculation time and [[computational complexity|complexity]].
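The row-wise band storage pictured above, and the reduced operation count that motivates it, can be sketched in NumPy as follows. This is a minimal illustration, not a library routine (the helper names <code>to_band</code> and <code>band_matvec</code> are ours); row ''i'' of the packed array holds the band of row ''i'', padded with zeros at the corners, exactly as in the 6-by-3 example:

```python
import numpy as np

def to_band(A, k1, k2):
    """Pack an n-by-n band matrix into n-by-(k1+k2+1) row-wise storage:
    row i holds A[i, i-k1 .. i+k2], zero-padded where indices fall outside."""
    n = A.shape[0]
    ab = np.zeros((n, k1 + k2 + 1))
    for i in range(n):
        for d in range(-k1, k2 + 1):
            j = i + d
            if 0 <= j < n:
                ab[i, d + k1] = A[i, j]
    return ab

def band_matvec(ab, k1, k2, x):
    """y = A @ x using only the stored band:
    O(n*(k1+k2+1)) work instead of O(n^2) for a dense matrix-vector product."""
    n = ab.shape[0]
    y = np.zeros(n)
    for i in range(n):
        for d in range(-k1, k2 + 1):
            j = i + d
            if 0 <= j < n:
                y[i] += ab[i, d + k1] * x[j]
    return y
```

For a tridiagonal matrix (''k''<sub>1</sub> = ''k''<sub>2</sub> = 1) this reproduces the 6-by-3 layout shown above, with a zero in the first slot of the first row and the last slot of the last row. Note that LAPACK-style band storage is a transposed variant of this scheme, storing each diagonal as a row.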
As sparse matrices lend themselves to more efficient computation than dense matrices, as well as to more efficient use of computer storage, there has been much research focused on finding ways to minimise the bandwidth (or directly minimise the fill-in) by applying permutations to the matrix, or other such equivalence or similarity transformations.{{sfn|Davis|2006|loc=§7.7}}

The [[Cuthill–McKee algorithm]] can be used to reduce the bandwidth of a sparse [[symmetric matrix]]. There are, however, matrices for which the [[reverse Cuthill–McKee algorithm]] performs better. Many other methods are in use. The problem of finding a representation of a matrix with minimal bandwidth by means of permutations of rows and columns is [[NP-hard]].{{sfn|Feige|2000}}

==See also==
*[[Diagonal matrix]]
*[[Graph bandwidth]]

==Notes==
{{Reflist}}

==References==
*{{Citation | last = Atkinson | first = Kendall E. | title = An Introduction to Numerical Analysis | year = 1989 | publisher = John Wiley & Sons | isbn = 0-471-62489-6 }}.
*{{Citation | last = Davis | first = Timothy A. | title = Direct Methods for Sparse Linear Systems | year = 2006 | publisher = Society for Industrial and Applied Mathematics | isbn = 978-0-898716-13-9 }}.
*{{Citation | last = Feige | first = Uriel | contribution = Coping with the NP-Hardness of the Graph Bandwidth Problem | title = Algorithm Theory - SWAT 2000 | series = Lecture Notes in Computer Science | volume = 1851 | year = 2000 | pages = 129–145 | doi = 10.1007/3-540-44985-X_2 }}.
*{{Citation | first1=Gene H. | last1=Golub | author1-link=Gene H. Golub | first2=Charles F. | last2=Van Loan | author2-link=Charles F. Van Loan | year=1996 | title=Matrix Computations | edition=3rd | publisher=Johns Hopkins | place=Baltimore | isbn=978-0-8018-5414-9 }}.
*{{Citation|last1=Press|first1=WH|last2=Teukolsky|first2=SA|last3=Vetterling|first3=WT|last4=Flannery|first4=BP|year=2007|title=Numerical Recipes: The Art of Scientific Computing|edition=3rd|publisher=Cambridge University Press|publication-place=New York|isbn=978-0-521-88068-8|chapter=Section 2.4|chapter-url=http://apps.nrbook.com/empanel/index.html?pg=56}}.

==External links==
*[http://www.netlib.org/lapack/lug/node124.html Information pertaining to LAPACK and band matrices]
*[http://www.netlib.org/linalg/html_templates/node89.html#SECTION00930000000000000000 A tutorial on banded matrices and other sparse matrix formats]

{{Matrix classes}}
{{Authority control}}

[[Category:Sparse matrices]]