Hankel matrix
{{Short description|Square matrix in which each ascending skew-diagonal from left to right is constant}}
In [[linear algebra]], a '''Hankel matrix''' (or '''[[catalecticant]] matrix'''), named after [[Hermann Hankel]], is a square matrix in which each ascending skew-diagonal from left to right is constant. For example,
<math display=block>\qquad\begin{bmatrix} a & b & c & d & e \\ b & c & d & e & f \\ c & d & e & f & g \\ d & e & f & g & h \\ e & f & g & h & i \\ \end{bmatrix}.</math>

More generally, a '''Hankel matrix''' is any <math>n \times n</math> [[matrix (mathematics)|matrix]] <math>A</math> of the form
<math display=block>A = \begin{bmatrix} a_0 & a_1 & a_2 & \ldots & a_{n-1} \\ a_1 & a_2 & & &\vdots \\ a_2 & & & & a_{2n-4} \\ \vdots & & & a_{2n-4} & a_{2n-3} \\ a_{n-1} & \ldots & a_{2n-4} & a_{2n-3} & a_{2n-2} \end{bmatrix}.</math>

In terms of the components, if the <math>i,j</math> element of <math>A</math> is denoted <math>A_{ij}</math>, and assuming <math>i \le j</math>, then <math>A_{i,j} = A_{i+k,j-k}</math> for all <math>k = 0, \ldots, j-i.</math>

==Properties==
* Any Hankel matrix is [[symmetric matrix|symmetric]].
* Let <math>J_n</math> be the <math>n \times n</math> [[exchange matrix]]. If <math>H</math> is an <math>m \times n</math> Hankel matrix, then <math>H = T J_n</math>, where <math>T</math> is an <math>m \times n</math> [[Toeplitz matrix]].
** If <math>T</math> is [[real number|real]] symmetric, then <math>H = T J_n</math> has the same [[eigenvalue]]s as <math>T</math> up to sign.<ref name="simax1">{{cite journal | last = Yasuda | first = M. | title = A Spectral Characterization of Hermitian Centrosymmetric and Hermitian Skew-Centrosymmetric K-Matrices | journal = SIAM J. Matrix Anal. Appl. | volume = 25 | issue = 3 | pages = 601–605 | year = 2003 | doi = 10.1137/S0895479802418835}}</ref>
* The [[Hilbert matrix]] is an example of a Hankel matrix.
* The [[determinant]] of a Hankel matrix is called a [[catalecticant]].
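The defining relation <math>A_{i,j} = a_{i+j}</math> and the factorization <math>H = T J_n</math> can be checked directly. The following pure-Python sketch (illustrative only, not drawn from the article's references; the helper names are ad hoc) builds a small Hankel matrix, verifies its symmetry, and confirms that right-multiplying a suitable Toeplitz matrix by the exchange matrix, which simply reverses the columns, recovers it:

```python
# Construct an n x n Hankel matrix H with H[i][j] = a[i+j] from a sequence
# a_0, ..., a_{2n-2}, then verify two properties stated above: symmetry,
# and the factorization H = T J_n with T Toeplitz and J_n the exchange matrix.

def hankel(a, n):
    """n x n Hankel matrix built from the 2n-1 leading entries of a."""
    return [[a[i + j] for j in range(n)] for i in range(n)]

def toeplitz_for(a, n):
    """Toeplitz matrix T with T[i][k] = a[i - k + n - 1], chosen so T J_n = H."""
    return [[a[i - k + n - 1] for k in range(n)] for i in range(n)]

n = 4
a = list(range(1, 2 * n))          # a_0, ..., a_{2n-2} = 1, ..., 7
H = hankel(a, n)

# Symmetry: every square Hankel matrix equals its transpose.
assert all(H[i][j] == H[j][i] for i in range(n) for j in range(n))

# H = T J_n: right-multiplication by the exchange matrix reverses T's columns.
T = toeplitz_for(a, n)
TJ = [row[::-1] for row in T]
assert TJ == H
```

Note that the entries of <math>T</math> depend only on <math>i - k</math>, which is exactly the Toeplitz (constant-diagonal) condition.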
==Hankel operator==
Given a [[formal Laurent series]]
<math display="block"> f(z) = \sum_{n=-\infty}^N a_n z^n, </math>
the corresponding '''Hankel operator''' is defined as<ref>{{harvnb|Fuhrmann|2012|loc=§8.3}}</ref>
<math display="block"> H_f : \mathbf C[z] \to z^{-1} \mathbf C[[z^{-1}]]. </math>
This takes a [[polynomial]] <math>g \in \mathbf C[z]</math> and sends it to the product <math>fg</math>, but discards all powers of <math>z</math> with a non-negative exponent, so as to give an element in <math>z^{-1} \mathbf C[[z^{-1}]]</math>, the [[formal power series]] with strictly negative exponents. The map <math>H_f</math> is in a natural way <math>\mathbf C[z]</math>-linear, and its matrix with respect to the elements <math>1, z, z^2, \dots \in \mathbf C[z]</math> and <math>z^{-1}, z^{-2}, \dots \in z^{-1}\mathbf C[[z^{-1}]]</math> is the Hankel matrix
<math display=block>\begin{bmatrix} a_1 & a_2 & \ldots \\ a_2 & a_3 & \ldots \\ a_3 & a_4 & \ldots \\ \vdots & \vdots & \ddots \end{bmatrix}.</math>
Any Hankel matrix arises in this way. A [[theorem]] due to [[Kronecker]] says that the [[rank (linear algebra)|rank]] of this matrix is finite precisely when <math>f</math> is a [[rational function]], that is, a fraction of two polynomials
<math display="block"> f(z) = \frac{p(z)}{q(z)}. </math>

==Approximations==
We are often interested in approximating a Hankel operator, possibly by an operator of low order. To measure the error of such an approximation, we can use the spectral norm (operator 2-norm), which suggests the [[singular value decomposition]] as a technique for approximating the action of the operator. Note that the matrix <math>A</math> does not have to be finite; if it is infinite, traditional methods of computing individual singular vectors will not work directly. We also require the approximation itself to be a Hankel matrix, which can be achieved using [[AAK theory]].
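Kronecker's theorem can be illustrated numerically. In the sketch below (illustrative only; the choice of <math>f</math>, the indexing of its expansion coefficients, and the helper `rank` are assumptions made for the example), the function <math>f(z) = \tfrac{1}{z-2} + \tfrac{1}{z-3}</math> has <math>1/z</math>-expansion coefficients <math>h_k = 2^k + 3^k</math>, and the Hankel matrices built from them stay at rank 2 (one per pole) no matter how large they grow:

```python
# Sketch of Kronecker's theorem: the Hankel matrix of the expansion
# coefficients of a rational function has finite rank. Here the rank is 2
# for every matrix size, because H[i][j] = 2**(i+j) + 3**(i+j) is a sum of
# two rank-one outer products. Exact arithmetic avoids floating-point rank
# estimation issues.
from fractions import Fraction

def rank(M):
    """Matrix rank over the rationals via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    rk = 0
    for col in range(cols):
        piv = next((r for r in range(rk, rows) if M[r][col] != 0), None)
        if piv is None:
            continue
        M[rk], M[piv] = M[piv], M[rk]
        for r in range(rows):
            if r != rk and M[r][col] != 0:
                f = M[r][col] / M[rk][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[rk])]
        rk += 1
    return rk

h = [2**k + 3**k for k in range(20)]   # coefficients of a rational f
for n in (2, 4, 8, 10):
    H = [[h[i + j] for j in range(n)] for i in range(n)]
    assert rank(H) == 2                # finite rank, bounded by the number of poles
```

A generic (non-rational) coefficient sequence would instead give full-rank Hankel matrices of every size.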
==Hankel matrix transform==
{{Distinguish|Hankel transform}}
The '''Hankel matrix transform''', or simply '''Hankel transform''', of a [[sequence]] <math>b_k</math> is the sequence of the determinants of the Hankel matrices formed from <math>b_k</math>. Given an integer <math>n > 0</math>, define the corresponding <math>(n \times n)</math>-dimensional Hankel matrix <math>B_n</math> as having the matrix elements <math>[B_n]_{i,j} = b_{i+j}.</math> Then the sequence <math>h_n</math> given by
<math display="block"> h_n = \det B_n </math>
is the Hankel transform of the sequence <math>b_k.</math>

The Hankel transform is invariant under the [[binomial transform]] of a sequence. That is, if one writes
<math display="block"> c_n = \sum_{k=0}^n {n \choose k} b_k </math>
as the binomial transform of the sequence <math>b_n</math>, and <math>C_n</math> denotes the Hankel matrix formed from <math>c_k</math>, then one has <math>\det B_n = \det C_n.</math>

== Applications of Hankel matrices ==
Hankel matrices are formed when, given a sequence of output data, a realization of an underlying state-space or [[hidden Markov model]] is desired.<ref>{{cite book |first=Masanao |last=Aoki |author-link=Masanao Aoki |chapter=Prediction of Time Series |title=Notes on Economic Time Series Analysis : System Theoretic Perspectives |location=New York |publisher=Springer |year=1983 |isbn=0-387-12696-1 |pages=38–47 |chapter-url=https://books.google.com/books?id=l_LsCAAAQBAJ&pg=PA38 }}</ref> The singular value decomposition of the Hankel matrix provides a means of computing the ''A'', ''B'', and ''C'' matrices which define the state-space realization.<ref>{{cite book |first=Masanao |last=Aoki |chapter=Rank determination of Hankel matrices |title=Notes on Economic Time Series Analysis : System Theoretic Perspectives |location=New York |publisher=Springer |year=1983 |isbn=0-387-12696-1 |pages=67–68 |chapter-url=https://books.google.com/books?id=l_LsCAAAQBAJ&pg=PA67 }}</ref> The Hankel matrix formed from a signal has been found useful for the decomposition of non-stationary signals and for time-frequency representation.

=== Method of moments for polynomial distributions ===
The [[Method of moments (statistics)|method of moments]] applied to polynomial distributions results in a Hankel matrix that needs to be [[inverse matrix|inverted]] in order to obtain the weight parameters of the polynomial distribution approximation.<ref name="PolyD2">{{cite journal |first1=J. |last1=Munkhammar |first2=L. |last2=Mattsson |first3=J. |last3=Rydén |year=2017 |title=Polynomial probability distribution estimation using the method of moments |journal=PLoS ONE |volume=12 |issue=4 |pages=e0174573 |doi=10.1371/journal.pone.0174573}}</ref>

=== Positive Hankel matrices and the Hamburger moment problem ===
{{Further|Hamburger moment problem}}

==See also==
* [[Cauchy matrix]]
* [[Jacobi operator]]
* [[Toeplitz matrix]], an "upside down" (that is, row-reversed) Hankel matrix
* [[Vandermonde matrix]]

== Notes ==
{{Reflist}}

== References ==
* [[Richard P. Brent|Brent, R.P.]] (1999), "Stability of fast algorithms for structured linear systems", ''Fast Reliable Algorithms for Matrices with Structure'' (eds. T. Kailath, A.H. Sayed), ch. 4 ([[Society for Industrial and Applied Mathematics|SIAM]]).
* {{cite book | last = Fuhrmann | first = Paul A. | title = A polynomial approach to linear algebra | edition = 2 | series = Universitext | year = 2012 | publisher = Springer | location = New York, NY | isbn = 978-1-4614-0337-1 | doi = 10.1007/978-1-4614-0338-8 | zbl = 1239.15001 }}
* {{cite book | title=Structured matrices and polynomials: unified superfast algorithms | author=Victor Y. Pan | author-link=Victor Pan | publisher=[[Birkhäuser]] | year=2001 | isbn=0817642404 }}
* {{cite book | title=An introduction to Hankel operators | author=J.R. Partington | author-link=Jonathan Partington | series=LMS Student Texts | volume=13 | publisher=[[Cambridge University Press]] | year=1988 | isbn=0-521-36791-3 }}

{{Matrix classes}}
{{Authority control}}

[[Category:Matrices (mathematics)]]
[[Category:Transforms]]
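As a worked check of the Hankel matrix transform, the sketch below (illustrative only; the helper names are ad hoc) computes the Hankel transform of the [[Catalan number]]s, which is the all-ones sequence, and verifies the binomial-transform invariance stated above, all in exact rational arithmetic:

```python
# Hankel matrix transform: h_n = det B_n with [B_n]_{i,j} = b_{i+j}.
# The Hankel transform of the Catalan numbers is 1, 1, 1, ..., and it is
# unchanged by the binomial transform of the underlying sequence.
from fractions import Fraction
from math import comb

def det(M):
    """Determinant via Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    n, d = len(M), Fraction(1)
    for i in range(n):
        piv = next((r for r in range(i, n) if M[r][i] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != i:
            M[i], M[piv] = M[piv], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    return d

def hankel_transform(b, nmax):
    """First nmax terms h_1, ..., h_nmax of the Hankel transform of b."""
    return [det([[b[i + j] for j in range(n)] for i in range(n)])
            for n in range(1, nmax + 1)]

catalan = [comb(2 * k, k) // (k + 1) for k in range(12)]        # 1, 1, 2, 5, 14, ...
binom_tf = [sum(comb(n, k) * catalan[k] for k in range(n + 1))  # binomial transform
            for n in range(12)]

assert hankel_transform(catalan, 5) == [1] * 5                  # all-ones sequence
assert hankel_transform(binom_tf, 5) == hankel_transform(catalan, 5)
```

The Catalan numbers are one of the few well-known sequences whose Hankel transform is constant, which is why they make a convenient test case here.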