Perron–Frobenius theorem
==Applications==
Numerous books have been written on the subject of non-negative matrices, and Perron–Frobenius theory is invariably a central feature. The following examples only scratch the surface of its vast application domain.

===Non-negative matrices===
The Perron–Frobenius theorem does not apply directly to non-negative matrices. Nevertheless, any reducible square matrix ''A'' may be written in upper-triangular block form (known as the '''normal form of a reducible matrix''')<ref>{{harvnb|Varga|2002|p=2.43 (page 51)}}</ref>
::::''PAP''<sup>−1</sup> = <math> \left( \begin{smallmatrix} B_1 & * & * & \cdots & * \\ 0 & B_2 & * & \cdots & * \\ \vdots & \vdots & \vdots & & \vdots \\ 0 & 0 & 0 & \cdots & * \\ 0 & 0 & 0 & \cdots & B_h \end{smallmatrix} \right)</math>
where ''P'' is a permutation matrix and each ''B<sub>i</sub>'' is a square matrix that is either irreducible or zero. Now if ''A'' is non-negative then so too is each block of ''PAP''<sup>−1</sup>; moreover, the spectrum of ''A'' is just the union of the spectra of the ''B<sub>i</sub>''.

The invertibility of ''A'' can also be studied. The inverse of ''PAP''<sup>−1</sup> (if it exists) must have diagonal blocks of the form ''B<sub>i</sub>''<sup>−1</sup>, so if any ''B<sub>i</sub>'' is not invertible then neither is ''PAP''<sup>−1</sup> nor ''A''. Conversely, let ''D'' be the block-diagonal matrix corresponding to ''PAP''<sup>−1</sup>, in other words ''PAP''<sup>−1</sup> with the asterisks zeroised. If each ''B<sub>i</sub>'' is invertible then so is ''D'', and ''D''<sup>−1</sup>(''PAP''<sup>−1</sup>) is equal to the identity plus a nilpotent matrix. But such a matrix is always invertible (if ''N''<sup>''k''</sup> = 0, the inverse of 1 − ''N'' is 1 + ''N'' + ''N''<sup>2</sup> + ... + ''N''<sup>''k''−1</sup>), so ''PAP''<sup>−1</sup> and ''A'' are both invertible.

Therefore, many of the spectral properties of ''A'' may be deduced by applying the theorem to the irreducible ''B<sub>i</sub>''.
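The inversion identity used in the argument above (if ''N''<sup>''k''</sup> = 0 then 1 − ''N'' has inverse 1 + ''N'' + ... + ''N''<sup>''k''−1</sup>) can be checked numerically. The following pure-Python sketch uses an illustrative 3×3 strictly upper-triangular (hence nilpotent) matrix; all names are ad hoc, not from any library.

```python
# Check: if N^3 = 0 then (I - N)^{-1} = I + N + N^2.
# Illustrative 3x3 example with hand-picked entries.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matadd(A, B):
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# Strictly upper-triangular, hence nilpotent: N^3 = 0.
N = [[0, 2, 5], [0, 0, 3], [0, 0, 0]]

N2 = matmul(N, N)
assert matmul(N2, N) == [[0] * 3 for _ in range(3)]  # N^3 = 0

inv = matadd(matadd(I, N), N2)  # candidate inverse: I + N + N^2
IminusN = [[I[i][j] - N[i][j] for j in range(3)] for i in range(3)]
assert matmul(IminusN, inv) == I  # (I - N)(I + N + N^2) = I
```

The same finite geometric series works for any nilpotency index ''k''; only the number of summed powers changes.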
For example, the Perron root is the maximum of the ρ(''B<sub>i</sub>''). While there will still be eigenvectors with non-negative components, it is quite possible that none of these will be positive.

===Stochastic matrices===
A row (column) [[stochastic matrix]] is a square matrix each of whose rows (columns) consists of non-negative real numbers whose sum is unity. The theorem cannot be applied directly to such matrices because they need not be irreducible. If ''A'' is row-stochastic then the column vector with each entry 1 is an eigenvector corresponding to the eigenvalue 1, which is also ρ(''A'') by the remark above. It might not be the only eigenvalue on the unit circle, and the associated eigenspace can be multi-dimensional. If ''A'' is row-stochastic and irreducible then the Perron projection is also row-stochastic and all its rows are equal.

===Algebraic graph theory===
The theorem has particular use in [[algebraic graph theory]]. The "underlying graph" of a nonnegative ''n''-square matrix is the graph with vertices numbered 1, ..., ''n'' and arc ''ij'' if and only if ''A<sub>ij</sub>'' ≠ 0. If the underlying graph of such a matrix is strongly connected, then the matrix is irreducible, and thus the theorem applies. In particular, the [[adjacency matrix]] of a [[strongly connected component|strongly connected graph]] is irreducible.<ref>{{cite book |author-link=Richard A. Brualdi |first1=Richard A.
|last1=Brualdi |first2=Dragos |last2=Cvetkovic |title=A Combinatorial Approach to Matrix Theory and Its Applications |publisher=CRC Press |location=Boca Raton, FL |year=2009 |isbn=978-1-4200-8223-4 }}</ref>

===Finite Markov chains===
The theorem has a natural interpretation in the theory of finite [[Markov chain]]s, where it is the matrix-theoretic equivalent of the convergence of an irreducible aperiodic finite Markov chain to its stationary distribution, formulated in terms of the transition matrix of the chain; see, for example, the article on the [[subshift of finite type]].

===Compact operators===
{{main|Krein–Rutman theorem}}
More generally, the theorem can be extended to the case of non-negative [[compact operator]]s, which, in many ways, resemble finite-dimensional matrices. These are commonly studied in physics under the name of [[transfer operator]]s, or sometimes '''Ruelle–Perron–Frobenius operators''' (after [[David Ruelle]]). In this case, the leading eigenvalue corresponds to the [[thermodynamic equilibrium]] of a [[dynamical system]], and the lesser eigenvalues to the decay modes of a system that is not in equilibrium. Thus, the theory offers a way of discovering the [[arrow of time]] in what would otherwise appear to be reversible, deterministic dynamical processes, when examined from the point of view of [[point-set topology]].<ref>{{cite book |first=Michael C. |last=Mackey |title=Time's Arrow: The origins of thermodynamic behaviour |location=New York |publisher=Springer-Verlag |year=1992 |isbn=978-0-387-97702-7 }}</ref>
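The last three subsections can be illustrated in one small computation. The sketch below uses an illustrative 3×3 row-stochastic transition matrix (all names are ad hoc): irreducibility is verified via strong connectivity of the underlying graph, and the powers ''A''<sup>''k''</sup> are then seen to converge to the Perron projection, whose identical rows give the stationary distribution of the corresponding Markov chain.

```python
# Irreducibility via the underlying graph, then convergence of A^k
# (illustrative 3-state row-stochastic matrix, hand-picked entries).

def is_irreducible(A):
    """A is irreducible iff every vertex reaches every other vertex in
    the underlying directed graph (arc i -> j whenever A[i][j] != 0)."""
    n = len(A)
    def reachable(start):
        seen, stack = {start}, [start]
        while stack:
            i = stack.pop()
            for j in range(n):
                if A[i][j] != 0 and j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen
    return all(len(reachable(i)) == n for i in range(n))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0.5, 0.25, 0.0],
     [0.5, 0.5, 0.5],
     [0.0, 0.25, 0.5]]  # column-stochastic? No: read as rows below
A = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]   # row-stochastic: each row sums to 1

assert is_irreducible(A)

P = A
for _ in range(100):     # P -> A^101, close to the Perron projection
    P = matmul(P, A)

# All rows of P agree; each approximates the stationary distribution,
# here (0.25, 0.5, 0.25).
for row in P:
    assert all(abs(x - y) < 1e-9 for x, y in zip(row, P[0]))
    assert abs(sum(row) - 1.0) < 1e-9
```

Because this chain is also aperiodic (its diagonal entries are positive), 1 is the only eigenvalue on the unit circle and the powers converge; for a periodic irreducible chain the limit of ''A''<sup>''k''</sup> would not exist, though the Perron projection still does.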