{{Short description|Mathematical field of study}}
In [[mathematics]], '''operator theory''' is the study of [[linear operator]]s on [[function space]]s, beginning with [[differential operator]]s and [[integral operator]]s. The operators may be presented abstractly by their characteristics, such as [[bounded linear operator]]s or [[closed operator]]s, and consideration may be given to [[nonlinear operator]]s. The study, which depends heavily on the [[topology]] of function spaces, is a branch of [[functional analysis]].

If a collection of operators forms an [[algebra over a field]], then it is an [[operator algebra]]. The description of operator algebras is part of operator theory.

==Single operator theory==
Single operator theory deals with the properties and classification of operators, considered one at a time. For example, the classification of [[normal operator]]s in terms of their [[spectrum of an operator|spectra]] falls into this category.

===Spectrum of operators===
{{Main article|Spectral theorem}}
The '''spectral theorem''' is any of a number of results about [[linear operator]]s or about [[matrix (mathematics)|matrices]].<ref>[[V. S. Sunder|Sunder, V.S.]] ''Functional Analysis: Spectral Theory'' (1997), Birkhäuser Verlag.</ref> In broad terms the spectral [[theorem]] provides conditions under which an [[Operator (mathematics)|operator]] or a matrix can be [[Diagonalizable matrix|diagonalized]] (that is, represented as a [[diagonal matrix]] in some basis). This concept of diagonalization is relatively straightforward for operators on [[finite-dimensional]] spaces, but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of [[linear operator]]s that can be modelled by [[multiplication operator]]s, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about [[commutative ring|commutative]] [[C*-algebra]]s. See also [[spectral theory]] for a historical perspective.

Examples of operators to which the spectral theorem applies are [[self-adjoint operator]]s or more generally [[normal operator]]s on [[Hilbert space]]s. The spectral theorem also provides a [[canonical form|canonical]] decomposition, called the '''spectral decomposition''', '''eigenvalue decomposition''', or '''[[eigendecomposition of a matrix|eigendecomposition]]''', of the underlying vector space on which the operator acts.

====Normal operators====
{{main article|Normal operator}}
A '''normal operator''' on a [[complex number|complex]] [[Hilbert space]] <math>H</math> is a [[continuous function (topology)|continuous]] [[linear operator]] <math>N\colon H \rightarrow H</math> that [[commutator|commutes]] with its [[hermitian adjoint]] <math>N^{\ast}</math>, that is: <math>NN^{\ast} = N^{\ast}N</math>.<ref>{{citation | last1 = Hoffman | first1 = Kenneth | last2 = Kunze | first2 = Ray | author2-link = Ray Kunze | edition = 2nd | location = Englewood Cliffs, N.J. | mr = 0276251 | page = 312 | publisher = Prentice-Hall, Inc. | title = Linear algebra | year = 1971}}</ref>

Normal operators are important because the [[spectral theorem]] holds for them. Today, the class of normal operators is well understood.
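For a concrete finite-dimensional illustration (the matrix below is chosen here purely for illustration and is not taken from the sources cited above), consider the operator on <math>\mathbb{C}^{2}</math> given by
<math display="block">N = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}, \qquad N^{\ast} = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}, \qquad NN^{\ast} = N^{\ast}N = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix},</math>
so <math>N</math> is normal, even though it is neither self-adjoint (<math>N^{\ast} \neq N</math>) nor unitary (<math>N^{\ast}N \neq I</math>); its eigenvalues <math>1 \pm i</math> are not real.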
Examples of normal operators are
* [[unitary operator]]s: <math>U^{\ast} = U^{-1}</math>
* [[Hermitian operator]]s (i.e., self-adjoint operators): <math>N^{\ast} = N</math> (also, anti-selfadjoint operators: <math>N^{\ast} = -N</math>)
* [[positive operator]]s: <math>N = M^{\ast}M</math>, where <math>M</math> is any operator
* [[Normal matrix|normal matrices]], which can be seen as normal operators if one takes the Hilbert space to be <math>\mathbb{C}^{n}</math>.

The spectral theorem extends to a more general class of matrices. Let <math>A</math> be an operator on a finite-dimensional [[inner product space]]. <math>A</math> is said to be [[normal matrix|normal]] if <math>A^{\ast}A = AA^{\ast}</math>. One can show that <math>A</math> is normal if and only if it is unitarily diagonalizable: by the [[Schur decomposition]], we have <math>A = UTU^{\ast}</math>, where <math>U</math> is unitary and <math>T</math> is [[upper triangular]]. Since <math>A</math> is normal, <math>T^{\ast}T = TT^{\ast}</math>. Therefore, <math>T</math> must be diagonal, since normal upper triangular matrices are diagonal. The converse is obvious.

In other words, <math>A</math> is normal if and only if there exists a [[unitary matrix]] <math>U</math> such that
<math display="block">A = U D U^* </math>
where <math>D</math> is a [[diagonal matrix]]. Then, the entries of the diagonal of <math>D</math> are the [[eigenvalue]]s of <math>A</math>. The column vectors of <math>U</math> are the [[eigenvector]]s of <math>A</math> and they are [[orthonormal]]. Unlike the Hermitian case, the entries of <math>D</math> need not be real.
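This equivalence can also be checked numerically. The following is a minimal sketch (assuming NumPy and SciPy are available; the matrix is the illustrative one introduced above, not an example from the cited references): for a normal matrix the complex Schur form already has a diagonal triangular factor, so the Schur decomposition is a unitary diagonalization.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import schur

A = np.array([[1.0, -1.0],
              [1.0,  1.0]])                     # the illustrative normal matrix from above

# normality: A commutes with its adjoint
assert np.allclose(A @ A.conj().T, A.conj().T @ A)

T, U = schur(A, output='complex')               # Schur decomposition A = U T U*
assert np.allclose(A, U @ T @ U.conj().T)

# since A is normal, the upper triangular factor T is in fact diagonal,
# so A = U T U* is a unitary diagonalization
assert np.allclose(T, np.diag(np.diag(T)))

print(np.diag(T))                               # eigenvalues 1+1j and 1-1j (not real)
</syntaxhighlight>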
===Polar decomposition===
{{Main article|Polar decomposition}}
The '''polar decomposition''' of any [[bounded linear operator]] ''A'' between complex [[Hilbert space]]s is a canonical factorization as the product of a [[partial isometry]] and a non-negative operator.<ref>{{citation|title=A Course in Operator Theory | series=[[Graduate Studies in Mathematics]]|first=John B. |last=Conway|publisher=American Mathematical Society|year= 2000 | isbn=0821820656}}</ref>

The polar decomposition for matrices generalizes as follows: if ''A'' is a bounded linear operator then there is a unique factorization of ''A'' as a product ''A'' = ''UP'' where ''U'' is a partial isometry, ''P'' is a non-negative self-adjoint operator and the initial space of ''U'' is the closure of the range of ''P''.

The operator ''U'' must be weakened to a partial isometry, rather than unitary, because of the following issue. If ''A'' is the [[shift operator|one-sided shift]] on ''l''{{i sup|2}}('''N'''), then |''A''| = (''A*A'')<sup>1/2</sup> = ''I''. So if ''A'' = ''U'' |''A''|, ''U'' must be ''A'', which is not unitary.

The existence of a polar decomposition is a consequence of [[Douglas' lemma]]:

{{math theorem | name = Lemma | math_statement = If ''A'', ''B'' are bounded operators on a Hilbert space ''H'', and ''A*A'' ≤ ''B*B'', then there exists a contraction ''C'' such that ''A'' = ''CB''. Furthermore, ''C'' is unique if Ker(''B*'') ⊂ Ker(''C'').}}

The operator ''C'' can be defined by {{math|1=''C''(''Bh'') = ''Ah''}}, extended by continuity to the closure of Ran(''B''), and by zero on the orthogonal complement of {{math|Ran(''B'')}}. The operator ''C'' is well-defined since {{math|''A*A'' ≤ ''B*B''}} implies {{math|Ker(''B'') ⊂ Ker(''A'')}}. The lemma then follows.

In particular, if {{math|1=''A*A'' = ''B*B''}}, then ''C'' is a partial isometry, which is unique if {{math|Ker(''B*'') ⊂ Ker(''C'')}}. In general, for any bounded operator ''A'',
<math display="block">A^*A = (A^*A)^{\frac{1}{2}} (A^*A)^{\frac{1}{2}},</math>
where (''A*A'')<sup>1/2</sup> is the unique positive square root of ''A*A'' given by the usual [[functional calculus]]. So by the lemma, we have
<math display="block">A = U (A^*A)^{\frac{1}{2}}</math>
for some partial isometry ''U'', which is unique if Ker(''A'') ⊂ Ker(''U''). (Note that {{math|1=Ker(''A'') = Ker(''A*A'') = Ker(''B'') = Ker(''B*'')}}, where {{math|1=''B'' = ''B*'' = (''A*A'')<sup>1/2</sup>}}.) Take ''P'' to be (''A*A'')<sup>1/2</sup> to obtain the polar decomposition ''A'' = ''UP''. Notice that an analogous argument can be used to show ''A'' = ''P′U′'', where ''P′'' is positive and ''U′'' is a partial isometry.

When ''H'' is finite-dimensional, ''U'' can be extended to a unitary operator; this is not true in general (see the one-sided shift example above). Alternatively, the polar decomposition can be shown using the operator version of the [[singular value decomposition#Bounded operators on Hilbert spaces|singular value decomposition]].

By a property of the [[continuous functional calculus]], |''A''| is in the [[C*-algebra]] generated by ''A''. A similar but weaker statement holds for the partial isometry: the polar part ''U'' is in the [[von Neumann algebra]] generated by ''A''. If ''A'' is invertible, ''U'' will be in the [[C*-algebra]] generated by ''A'' as well.
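In the finite-dimensional case the factorization can be computed from the singular value decomposition. The following is a minimal numerical sketch (assuming NumPy; the matrix is chosen here purely for illustration and is not from the cited references), in which the polar factor comes out unitary, matching the remark above about finite-dimensional ''H''.

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 1.0]])                     # an arbitrary invertible 2x2 example

W, S, Vh = np.linalg.svd(A)                    # A = W @ np.diag(S) @ Vh
U = W @ Vh                                      # polar factor; unitary because A is square and invertible
P = Vh.conj().T @ np.diag(S) @ Vh               # positive factor, equal to (A*A)^(1/2)

assert np.allclose(A, U @ P)                    # the polar decomposition A = U P
assert np.allclose(U.conj().T @ U, np.eye(2))   # U is unitary
assert np.all(np.linalg.eigvalsh(P) > 0)        # P is positive definite since A is invertible
</syntaxhighlight>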
===Connection with complex analysis===
Many operators that are studied are operators on Hilbert spaces of [[holomorphic function]]s, and the study of the operator is intimately linked to questions in function theory. For example, [[Beurling's theorem]] describes the [[invariant subspace]]s of the unilateral shift in terms of inner functions, which are bounded holomorphic functions on the unit disk with unimodular boundary values almost everywhere on the circle. Beurling interpreted the unilateral shift as multiplication by the independent variable on the [[Hardy space]].<ref>{{citation|first=Nikolai|last=Nikolski|authorlink = Nikolai Nikolski|title=A treatise on the shift operator|publisher=Springer-Verlag|year=1986| isbn=0-387-90176-0}}. A sophisticated treatment of the connections between operator theory and function theory in the [[Hardy space]].</ref> The success in studying multiplication operators, and more generally [[Toeplitz operator]]s (which are multiplication, followed by projection onto the Hardy space), has inspired the study of similar questions on other spaces, such as the [[Bergman space]].

==Operator algebras==
The theory of [[operator algebra]]s brings [[algebra over a field|algebra]]s of operators such as [[C*-algebra]]s to the fore.

===C*-algebras===
{{Main article|C*-algebra}}
A C*-algebra, ''A'', is a [[Banach algebra]] over the field of [[complex number]]s, together with a [[Map (mathematics)|map]] {{math|1=* : ''A'' → ''A''}}. One writes ''x*'' for the image of an element ''x'' of ''A''. The map * has the following properties:<ref>{{citation |first=William | last=Arveson|authorlink = William Arveson|title=An Invitation to C*-Algebras| publisher=Springer-Verlag | year=1976 |isbn=0-387-90176-0}}. An excellent introduction to the subject, accessible for those with a knowledge of basic [[functional analysis]].</ref>
* It is an [[Semigroup with involution|involution]]: for every ''x'' in ''A'', <math display="block"> x^{**} = (x^*)^* = x .</math>
* For all ''x'', ''y'' in ''A'': <math display="block"> (x + y)^* = x^* + y^* ,</math> <math display="block"> (x y)^* = y^* x^* .</math>
* For every λ in '''C''' and every ''x'' in ''A'': <math display="block"> (\lambda x)^* = \overline{\lambda} x^* .</math>
* For all ''x'' in ''A'': <math display="block"> \|x^* x \| = \left\|x\right\| \left\|x^*\right\|.</math>

'''Remark.''' The first three identities say that ''A'' is a [[*-algebra]]. The last identity is called the '''C* identity''' and is equivalent to
<math display="block">\|xx^*\| = \|x\|^2.</math>

The C*-identity is a very strong requirement. For instance, together with the [[spectral radius|spectral radius formula]], it implies that the C*-norm is uniquely determined by the algebraic structure:
<math display="block"> \|x\|^2 = \|x^* x\| = \sup\{|\lambda| : x^* x - \lambda \,1 \text{ is not invertible} \}.</math>

==See also==
* [[Invariant subspace]]
* [[Functional calculus]]
* [[Spectral theory]]
** [[Resolvent formalism]]
* [[Compact operator]]
** [[Fredholm theory]] of [[integral equation]]s
*** [[Integral operator]]
*** [[Fredholm operator]]
* [[Self-adjoint operator]]
* [[Unbounded operator]]
** [[Differential operator]]
* [[Umbral calculus]]
* [[Contraction mapping]]
* [[Positive operator]] on a [[Hilbert space]]
* [[Perron–Frobenius theorem#See also|Nonnegative operator]] on a [[ordered vector space|partially ordered vector space]]

==References==
{{reflist}}

==Further reading==
* [[John B. Conway|Conway, J. B.]]: ''A Course in Functional Analysis'', 2nd edition, Springer-Verlag, 1994, {{isbn|0-387-97245-5}}
* {{cite book | isbn = 978-0582237438 | title = Introduction to Operator Theory | last1 = Yoshino | first1 = Takashi | year = 1993 | publisher = Chapman and Hall/CRC }}

==External links==
* [http://www.mathphysics.com/opthy/OpHistory.html History of Operator Theory]

{{Functional Analysis}}
{{Authority control}}

[[Category:Operator theory| ]]