
In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues.<ref>Template:Cite book</ref> More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by <math>\rho(\cdot)</math>.

Definition

Matrices

Let <math>\lambda_1, \dotsc, \lambda_n</math> be the eigenvalues of a matrix <math>A \in \Complex^{n \times n}</math>. The spectral radius of <math>A</math> is defined as

<math>\rho(A) = \max \left \{ |\lambda_1|, \dotsc, |\lambda_n| \right \}.</math>

The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, <math> \rho(A) \leqslant \|A\| </math> for every natural matrix norm <math>\|\cdot\|</math>; and on the other hand, Gelfand's formula states that <math> \rho(A) = \lim_{k\to\infty} \|A^k\|^{1/k} </math>. Both of these results are shown below.
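Both facts can be illustrated numerically. The following is a minimal sketch (assuming NumPy is available; the matrix <math>A</math> is an arbitrary example chosen here) that checks <math>\rho(A) \leqslant \|A\|</math> for the standard induced norms and the convergence of <math>\|A^k\|^{1/k}</math> to <math>\rho(A)</math>:

<syntaxhighlight lang="python">
# Illustrative sketch: rho(A) <= ||A|| for induced norms, and ||A^k||^(1/k) -> rho(A).
import numpy as np

A = np.array([[0.5, 2.0],
              [0.0, 0.3]])

rho = max(abs(np.linalg.eigvals(A)))          # spectral radius, here 0.5
print("rho(A) =", rho)

for p in (1, 2, np.inf):                      # induced 1-, 2- and infinity-norms
    print(f"||A||_{p} =", np.linalg.norm(A, p), ">=", rho)

for k in (1, 5, 20, 100):                     # Gelfand's formula: convergence from above
    print(k, np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1 / k))
</syntaxhighlight>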

However, the spectral radius does not necessarily satisfy <math> \|A\mathbf{v}\| \leqslant \rho(A) \|\mathbf{v}\| </math> for arbitrary vectors <math> \mathbf{v} \in \mathbb{C}^n </math>. To see why, let <math>r > 1</math> be arbitrary and consider the matrix

<math> C_r = \begin{pmatrix} 0 & r^{-1} \\ r & 0 \end{pmatrix} </math>.

The characteristic polynomial of <math> C_r </math> is <math> \lambda^2 - 1 </math>, so its eigenvalues are <math>\{-1, 1\}</math> and thus <math>\rho(C_r) = 1</math>. However, <math>C_r \mathbf{e}_1 = r \mathbf{e}_2</math>. As a result,

<math> \| C_r \mathbf{e}_1 \| = r > 1 = \rho(C_r) \|\mathbf{e}_1\|. </math>

As an illustration of Gelfand's formula, note that <math>\|C_r^k\|^{1/k} \to 1</math> as <math>k \to \infty</math>, since <math>C_r^k = I</math> if <math>k</math> is even and <math>C_r^k = C_r</math> if <math>k</math> is odd.
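A quick numerical check of this example (NumPy assumed; the value <math>r = 4</math> is chosen arbitrarily) confirms that <math>\rho(C_r) = 1</math>, that <math>\|C_r \mathbf{e}_1\| = r</math>, and that <math>\|C_r^k\|^{1/k}</math> still tends to 1:

<syntaxhighlight lang="python">
# Sketch: the C_r counterexample from the text, with r = 4.
import numpy as np

r = 4.0
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])

print("rho(C_r) =", max(abs(np.linalg.eigvals(C))))                # 1.0
print("||C_r e_1|| =", np.linalg.norm(C @ np.array([1.0, 0.0])))   # r = 4.0
for k in (1, 2, 9, 10, 99, 100):                                   # Gelfand: tends to 1
    print(k, np.linalg.norm(np.linalg.matrix_power(C, k), 2) ** (1 / k))
</syntaxhighlight>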

A special case in which <math> \|A\mathbf{v}\| \leqslant \rho(A) \|\mathbf{v}\| </math> for all <math> \mathbf{v} \in \mathbb{C}^n </math> is when <math>A</math> is a Hermitian matrix and <math> \|\cdot\| </math> is the Euclidean norm. This is because any Hermitian matrix is diagonalizable by a unitary matrix, and unitary matrices preserve vector length. As a result,

<math> \|A\mathbf{v}\| = \|U^*DU\mathbf{v}\| = \|DU\mathbf{v}\| \leqslant \rho(A) \|U\mathbf{v}\| = \rho(A) \|\mathbf{v}\| .</math>
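A short numerical check of this special case (NumPy assumed; the Hermitian matrix and the test vector are generated at random) is:

<syntaxhighlight lang="python">
# Sketch: for a Hermitian matrix and the Euclidean norm, ||A v|| <= rho(A) ||v||.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T                       # Hermitian matrix
rho = max(abs(np.linalg.eigvalsh(A)))    # eigenvalues of a Hermitian matrix are real

v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
print(np.linalg.norm(A @ v) <= rho * np.linalg.norm(v) + 1e-12)   # True
</syntaxhighlight>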

Bounded linear operators

In the context of a bounded linear operator <math>A</math> on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values <math>\lambda</math> for which <math>A - \lambda I</math> is not bijective. We denote the spectrum by

<math>\sigma(A) = \left\{ \lambda \in \Complex: A - \lambda I \; \text{is not bijective} \right\}</math>

The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

<math>\rho(A) = \sup_{\lambda \in \sigma(A)} |\lambda|</math>

Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting <math>\|\cdot\|</math> denote the operator norm, we have

<math>\rho(A) = \lim_{k \to \infty}\|A^k\|^{\frac{1}{k}}=\inf_{k\in\mathbb{N}^*} \|A^k\|^{\frac{1}{k}}.</math>

A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.
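For a finite normal matrix this coincidence can be observed numerically. The sketch below (NumPy assumed; the diagonal matrix is an arbitrary example) approximates the numerical radius via the standard identity <math>w(A) = \max_{\theta} \lambda_{\max}\!\left(\tfrac{1}{2}\left(e^{i\theta}A + e^{-i\theta}A^*\right)\right)</math>, sampled on a grid of angles, and compares it with the spectral radius:

<syntaxhighlight lang="python">
# Sketch: for a normal matrix the numerical radius equals the spectral radius.
import numpy as np

A = np.diag([1.0, -2.0, 1.0 + 2.0j])          # a normal (here diagonal) matrix
rho = max(abs(np.linalg.eigvals(A)))

thetas = np.linspace(0.0, 2.0 * np.pi, 2000)   # grid approximation of the numerical radius
w = max(np.linalg.eigvalsh((np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2).max()
        for t in thetas)

print(rho, w)   # both approximately sqrt(5) = 2.236...
</syntaxhighlight>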

Graphs

The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.

This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number <math>C</math> such that the degree of every vertex of the graph is smaller than <math>C</math>). In this case, for the graph <math>G</math>, define:

<math> \ell^2(G) = \left \{ f : V(G) \to \mathbf{R} \ : \ \sum\nolimits_{v \in V(G)} \left |f(v) \right |^2 < \infty \right \}.</math>

Let <math>\gamma</math> be the adjacency operator of <math>G</math>:

<math> \begin{cases} \gamma : \ell^2(G) \to \ell^2(G) \\ (\gamma f)(v) = \sum_{(u,v) \in E(G)} f(u) \end{cases}</math>

The spectral radius of <math>G</math> is defined to be the spectral radius of the bounded linear operator <math>\gamma</math>.
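As a concrete finite example (NumPy assumed), the 4-cycle has adjacency spectrum <math>\{2, 0, 0, -2\}</math>, so its spectral radius is 2:

<syntaxhighlight lang="python">
# Sketch: spectral radius of a finite graph = spectral radius of its adjacency matrix.
import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)      # adjacency matrix of the 4-cycle

print(max(abs(np.linalg.eigvalsh(A))))          # 2.0
</syntaxhighlight>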

Upper bounds

Upper bounds on the spectral radius of a matrix

The following proposition gives simple yet useful upper bounds on the spectral radius of a matrix.

Proposition. Let <math>A \in \Complex^{n \times n}</math> with spectral radius <math>\rho(A)</math>, and let <math>\|\cdot\|</math> be a sub-multiplicative matrix norm. Then for each integer <math>k \geqslant 1</math>:

<math>\rho(A)\leq \|A^k\|^{\frac{1}{k}}.</math>

Proof

Let <math>(\mathbf{v}, \lambda)</math> be an eigenvector-eigenvalue pair for the matrix <math>A</math>. By the sub-multiplicativity of the matrix norm, we get:

<math>|\lambda|^k\|\mathbf{v}\| = \|\lambda^k \mathbf{v}\| = \|A^k \mathbf{v}\| \leq \|A^k\|\cdot\|\mathbf{v}\|.</math>

Since <math>\mathbf{v} \neq 0</math>, we have

<math>|\lambda|^k \leq \|A^k\|</math>

and therefore

<math>\rho(A)\leq \|A^k\|^{\frac{1}{k}}.</math>

concluding the proof.
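The proposition is easy to verify numerically. The sketch below (NumPy assumed; the spectral norm plays the role of the sub-multiplicative norm, and the matrix is an arbitrary example) confirms <math>\rho(A) \leq \|A^k\|^{1/k}</math> for several values of <math>k</math>:

<syntaxhighlight lang="python">
# Sketch: rho(A) <= ||A^k||^(1/k) for every k >= 1, using the spectral norm.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 0.5]])
rho = max(abs(np.linalg.eigvals(A)))           # = 2

for k in (1, 2, 5, 10, 50):
    bound = np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1 / k)
    print(k, bound, bound >= rho)              # the comparison is always True
</syntaxhighlight>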

Upper bounds for the spectral radius of a graph

There are many upper bounds for the spectral radius of a graph in terms of its number n of vertices and its number m of edges. For instance, if

<math>\frac{(k-2)(k-3)}{2} \leq m-n \leq \frac{k(k-3)}{2}</math>

where <math>3 \le k \le n</math> is an integer, then<ref>Template:Cite journal</ref>

<math>\rho(G) \leq \sqrt{2 m-n-k+\frac{5}{2}+\sqrt{2 m-2 n+\frac{9}{4}}}</math>
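As an illustration (NumPy assumed), for the complete graph <math>K_5</math> one has <math>n = 5</math> and <math>m = 10</math>, so <math>m - n = 5</math> and <math>k = 5</math> satisfies the stated condition; the bound then evaluates to 4, which is exactly <math>\rho(K_5)</math>:

<syntaxhighlight lang="python">
# Sketch: evaluating the stated bound for the complete graph K_5, where it is attained.
import numpy as np

n, m, k = 5, 10, 5
A = np.ones((n, n)) - np.eye(n)                        # adjacency matrix of K_5
rho = max(abs(np.linalg.eigvalsh(A)))                  # = n - 1 = 4

bound = np.sqrt(2 * m - n - k + 5 / 2 + np.sqrt(2 * m - 2 * n + 9 / 4))
print(rho, bound)                                      # 4.0  4.0
</syntaxhighlight>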

Symmetric matrices

For real-valued matrices <math>A</math> the inequality <math>\rho(A) \leq {\|A\|}_{2}</math> holds in particular, where <math>{\|\cdot\|}_{2}</math> denotes the spectral norm. In the case where <math>A</math> is symmetric, this inequality is tight:

Theorem. Let <math>A \in \mathbb{R}^{n \times n}</math> be symmetric, i.e., <math>A = A^T.</math> Then it holds that <math>\rho(A) = {\|A\|}_{2}.</math>

Proof

Let <math>(v_i, \lambda_i)_{i=1}^{n}</math> be the eigenpairs of A. Due to the symmetry of A, all <math>v_i</math> and <math>\lambda_i</math> are real-valued and the eigenvectors <math>v_i</math> can be chosen orthonormal. By the definition of the spectral norm, there exists an <math>x \in \mathbb{R}^{n}</math> with <math>{\|x\|}_{2} = 1</math> such that <math>{\|A\|}_{2} = {\| A x \|}_{2}.</math> Since the eigenvectors <math>v_i</math> form a basis of <math>\mathbb{R}^{n},</math> there exist coefficients <math>\delta_{1}, \ldots, \delta_{n} \in \mathbb{R}</math> such that <math>\textstyle x = \sum_{i = 1}^{n} \delta_{i} v_{i},</math> which implies that

<math>A x = \sum_{i = 1}^{n}\delta_{i} A v_{i} = \sum_{i = 1}^{n} \delta_{i} \lambda_{i} v_{i}.</math>

From the orthonormality of the eigenvectors <math>v_i</math> it follows that

<math>{\| A x\|}_{2}^{2} = \left\| \sum_{i = 1}^{n} \delta_{i} \lambda_{i} v_{i}\right\|_{2}^{2} = \sum_{i = 1}^{n} \delta_{i}^{2} \lambda_{i}^{2} </math>

and

<math>{\|x\|}_{2}^{2} = \left\| \sum_{i = 1}^{n} \delta_{i} v_{i} \right\|_{2}^{2} = \sum_{i = 1}^{n} \delta_{i}^{2}.</math>

Since <math>x</math> is chosen such that it maximizes <math>{\|Ax\|}_{2}</math> while satisfying <math>{\|x\|}_{2} = 1,</math> the values of <math>\delta_{i}</math> must be such that they maximize <math>\textstyle \sum_{i = 1}^{n} \delta_{i}^{2} \lambda_{i}^{2}</math> while satisfying <math>\textstyle \sum_{i = 1}^{n} \delta_{i}^{2} = 1.</math> This is achieved by setting <math>\delta_{k} = 1</math> for <math>k = \mathrm{arg\,max}_{i} {|\lambda_i|}</math> and <math>\delta_{i} = 0</math> otherwise, yielding <math>{\|Ax\|}_{2} = {|\lambda_k|} = \rho(A).</math>
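A quick numerical check of the theorem (NumPy assumed, using a random real symmetric matrix):

<syntaxhighlight lang="python">
# Sketch: rho(A) = ||A||_2 for a real symmetric matrix.
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                                     # real symmetric matrix

rho = max(abs(np.linalg.eigvalsh(A)))
print(rho, np.linalg.norm(A, 2))                      # equal up to rounding
</syntaxhighlight>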

Power sequence

The spectral radius is closely related to the convergence behavior of the power sequence of a matrix, as shown by the following theorem.

Theorem. Let <math>A \in \Complex^{n \times n}</math> with spectral radius <math>\rho(A)</math>. Then <math>\rho(A) < 1</math> if and only if

<math>\lim_{k \to \infty} A^k = 0.</math>

On the other hand, if <math>\rho(A) > 1</math>, <math>\lim_{k \to \infty} \|A^k\| = \infty</math>. The statement holds for any choice of matrix norm on <math>\Complex^{n \times n}</math>.

Proof

Assume that <math>A^k</math> goes to zero as <math>k</math> goes to infinity. We will show that <math>\rho(A) < 1</math>. Let <math>(\mathbf{v}, \lambda)</math> be an eigenvector-eigenvalue pair for <math>A</math>. Since <math>A^k\mathbf{v} = \lambda^k\mathbf{v}</math>, we have

<math>\begin{align}
 0 &= \left(\lim_{k \to \infty} A^k \right) \mathbf{v} \\
   &= \lim_{k \to \infty} \left(A^k\mathbf{v} \right ) \\
   &= \lim_{k \to \infty} \lambda^k\mathbf{v} \\
   &= \mathbf{v} \lim_{k \to \infty} \lambda^k
\end{align}</math>

Since <math>\mathbf{v} \neq 0</math> by hypothesis, we must have

<math>\lim_{k \to \infty}\lambda^k = 0,</math>

which implies <math>|\lambda| < 1</math>. Since this must be true for any eigenvalue <math>\lambda</math>, we can conclude that <math>\rho(A) < 1</math>.

Now, assume the spectral radius of <math>A</math> is less than <math>1</math>. From the Jordan normal form theorem, we know that for all <math>A \in \Complex^{n \times n}</math>, there exist <math>V, J \in \Complex^{n \times n}</math> with <math>V</math> non-singular and <math>J</math> block diagonal such that:

<math>A = VJV^{-1}</math>

with

<math>J=\begin{bmatrix}
J_{m_1}(\lambda_1) & 0 & 0 & \cdots & 0 \\ 0 & J_{m_2}(\lambda_2) & 0 & \cdots & 0 \\ \vdots & \cdots & \ddots & \cdots & \vdots \\ 0 & \cdots & 0 & J_{m_{s-1}}(\lambda_{s-1}) & 0 \\ 0 & \cdots & \cdots & 0 & J_{m_s}(\lambda_s) \end{bmatrix}</math>

where

<math>J_{m_i}(\lambda_i)=\begin{bmatrix}
\lambda_i & 1 & 0 & \cdots & 0 \\ 0 & \lambda_i & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i & 1 \\ 0 & 0 & \cdots & 0 & \lambda_i \end{bmatrix}\in \mathbf{C}^{m_i \times m_i}, 1\leq i\leq s.</math>

It is easy to see that

<math>A^k=VJ^kV^{-1}</math>

and, since Template:Mvar is block-diagonal,

<math>J^k=\begin{bmatrix}
J_{m_1}^k(\lambda_1) & 0 & 0 & \cdots & 0 \\ 0 & J_{m_2}^k(\lambda_2) & 0 & \cdots & 0 \\ \vdots & \cdots & \ddots & \cdots & \vdots \\ 0 & \cdots & 0 & J_{m_{s-1}}^k(\lambda_{s-1}) & 0 \\ 0 & \cdots & \cdots & 0 & J_{m_s}^k(\lambda_s) \end{bmatrix}</math>

Now, a standard result on the <math>k</math>-th power of an <math>m_i \times m_i</math> Jordan block states that, for <math>k \geq m_i-1</math>:

<math>J_{m_i}^k(\lambda_i)=\begin{bmatrix}
\lambda_i^k & {k \choose 1}\lambda_i^{k-1} & {k \choose 2}\lambda_i^{k-2} & \cdots & {k \choose m_i-1}\lambda_i^{k-m_i+1} \\ 0 & \lambda_i^k & {k \choose 1}\lambda_i^{k-1} & \cdots & {k \choose m_i-2}\lambda_i^{k-m_i+2} \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_i^k & {k \choose 1}\lambda_i^{k-1} \\ 0 & 0 & \cdots & 0 & \lambda_i^k \end{bmatrix}</math>

Thus, if <math>\rho(A) < 1</math>, then <math>|\lambda_i| < 1</math> for all <math>i</math>. Hence for all <math>i</math> we have:

<math>\lim_{k \to \infty}J_{m_i}^k=0</math>

which implies

<math>\lim_{k \to \infty} J^k = 0.</math>

Therefore,

<math>\lim_{k \to \infty}A^k=\lim_{k \to \infty}VJ^kV^{-1}=V \left (\lim_{k \to \infty}J^k \right )V^{-1}=0</math>

On the other hand, if <math>\rho(A)>1</math>, there is at least one entry of <math>J^k</math> that does not remain bounded as <math>k</math> increases, thereby proving the second part of the statement.
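The dichotomy in the theorem is easy to observe numerically. The sketch below (NumPy assumed, with two arbitrary triangular examples) shows <math>\|A^k\|</math> decaying when <math>\rho(A) < 1</math> and diverging when <math>\rho(A) > 1</math>:

<syntaxhighlight lang="python">
# Sketch: A^k -> 0 exactly when rho(A) < 1; the powers blow up when rho(A) > 1.
import numpy as np

def power_norms(A, ks=(1, 5, 20, 50)):
    return [np.linalg.norm(np.linalg.matrix_power(A, k), 2) for k in ks]

A_small = np.array([[0.9, 1.0], [0.0, 0.8]])     # rho = 0.9 < 1
A_big   = np.array([[1.1, 1.0], [0.0, 0.8]])     # rho = 1.1 > 1

print(power_norms(A_small))    # decays to 0 (after an initial transient)
print(power_norms(A_big))      # grows without bound
</syntaxhighlight>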

Gelfand's formula

Gelfand's formula, named after Israel Gelfand, gives the spectral radius as a limit of matrix norms.

Theorem

For any matrix norm <math>\|\cdot\|</math> we have<ref>The formula holds for any Banach algebra; see Lemma IX.1.8 in Template:Harvnb and Template:Harvnb</ref>

<math>\rho(A)=\lim_{k \to \infty} \left \|A^k \right \|^{\frac{1}{k}}</math>.

Moreover, in the case of a consistent matrix norm, <math>\lim_{k \to \infty} \left \|A^k \right \|^{\frac{1}{k}}</math> approaches <math>\rho(A)</math> from above (indeed, in that case <math>\rho(A) \leq \left \|A^k \right \|^{\frac{1}{k}}</math> for all <math>k</math>).

Proof

For any <math>\varepsilon > 0</math>, let us define the two following matrices:

<math>A_{\pm}= \frac{1}{\rho(A) \pm\varepsilon}A.</math>

Thus,

<math>\rho \left (A_{\pm} \right ) = \frac{\rho(A)}{\rho(A) \pm \varepsilon}, \qquad \rho (A_+) < 1 < \rho (A_-).</math>

We start by applying the previous theorem on limits of power sequences to <math>A_+</math>:

<math>\lim_{k \to \infty} A_+^k=0.</math>

This shows the existence of <math>N_+ \in \mathbb{N}</math> such that, for all <math>k \geq N_+</math>,

<math>\left\|A_+^k \right \| < 1.</math>

Therefore,

<math>\left \|A^k \right \|^{\frac{1}{k}} < \rho(A)+\varepsilon.</math>

Similarly, the theorem on power sequences implies that <math>\|A_-^k\|</math> is not bounded and that there exists <math>N_- \in \mathbb{N}</math> such that, for all <math>k \geq N_-</math>,

<math>\left\|A_-^k \right \| > 1.</math>

Therefore,

<math>\left\|A^k \right\|^{\frac{1}{k}} > \rho(A)-\varepsilon.</math>

Let <math>N = \max(N_+, N_-)</math>. Then,

<math>\forall \varepsilon>0\quad \exists N\in\mathbf{N} \quad \forall k\geq N \quad \rho(A)-\varepsilon < \left \|A^k \right \|^{\frac{1}{k}} < \rho(A)+\varepsilon,</math>

that is,

<math>\lim_{k \to \infty} \left \|A^k \right \|^{\frac{1}{k}} = \rho(A).</math>

This concludes the proof.

Corollary

Gelfand's formula yields a bound on the spectral radius of a product of commuting matrices: if <math>A_1, \ldots, A_n</math> are matrices that all commute, then

<math>\rho(A_1 \cdots A_n) \leq \rho(A_1) \cdots \rho(A_n).</math>
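A minimal numerical illustration of the corollary (NumPy assumed; the commuting pair is taken to be an arbitrary matrix <math>M</math> and a polynomial in <math>M</math>, which always commute):

<syntaxhighlight lang="python">
# Sketch: rho(A1 A2) <= rho(A1) rho(A2) for a commuting pair A1 = M, A2 = 3I - M.
import numpy as np

M  = np.array([[0.0, 1.0], [-2.0, 3.0]])       # eigenvalues 1 and 2
A1 = M
A2 = 3 * np.eye(2) - M                          # eigenvalues 2 and 1; commutes with M

rho = lambda X: max(abs(np.linalg.eigvals(X)))
print(rho(A1 @ A2), "<=", rho(A1) * rho(A2))    # 2.0 <= 4.0 (strict inequality here)
</syntaxhighlight>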

Numerical example

[[File:Gelfand's formula for a 3x3 matrix.svg|thumb|The convergence of all three matrix norms to the spectral radius.]]

Consider the matrix

<math>A=\begin{bmatrix}
9 & -1 & 2\\ -2 & 8 & 4\\ 1 & 1 & 8 \end{bmatrix}</math>

whose eigenvalues are <math>5, 10, 10</math>; by definition, <math>\rho(A) = 10</math>. The values of <math>\|A^k\|^{\frac{1}{k}}</math> for the four most used norms can be tabulated against several increasing values of <math>k</math> (note that, due to the particular form of this matrix, <math>\|\cdot\|_1=\|\cdot\|_\infty</math>):
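A short sketch (NumPy assumed) that computes these values; all four columns approach <math>\rho(A) = 10</math> as <math>k</math> grows:

<syntaxhighlight lang="python">
# Sketch: ||A^k||^(1/k) for the 1-, 2-, infinity- and Frobenius norms of the example matrix.
import numpy as np

A = np.array([[ 9, -1, 2],
              [-2,  8, 4],
              [ 1,  1, 8]], dtype=float)
print("rho(A) =", max(abs(np.linalg.eigvals(A))))     # 10

for k in (1, 2, 5, 10, 50):
    Ak = np.linalg.matrix_power(A, k)
    print(k, [np.linalg.norm(Ak, p) ** (1 / k) for p in (1, 2, np.inf, 'fro')])
</syntaxhighlight>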

Notes and references

Template:Reflist
