==Linear systems==
In the case of a [[system of linear equations]], the two main classes of iterative methods are the '''stationary iterative methods''' and the more general [[Krylov subspace]] methods.

===Stationary iterative methods===

====Introduction====
Stationary iterative methods solve a linear system with an [[Operator (mathematics)|operator]] approximating the original one, and based on a measurement of the error in the result ([[Residual (numerical analysis)|the residual]]), form a "correction equation" for which this process is repeated. While these methods are simple to derive, implement, and analyze, convergence is only guaranteed for a limited class of matrices.

====Definition====
An ''iterative method'' is defined by
:<math> \mathbf{x}^{k+1} := \Psi ( \mathbf{x}^k ), \quad k \geq 0, </math>
and, for a given linear system <math> A\mathbf x = \mathbf b </math> with exact solution <math> \mathbf{x}^* </math>, the ''error'' by
:<math> \mathbf{e}^k := \mathbf{x}^k - \mathbf{x}^*, \quad k \geq 0. </math>
An iterative method is called ''linear'' if there exists a matrix <math> C \in \R^{n\times n} </math> such that
:<math> \mathbf{e}^{k+1} = C \mathbf{e}^k \quad \forall k \geq 0, </math>
and this matrix is called the ''iteration matrix''. An iterative method with a given iteration matrix <math> C </math> is called ''convergent'' if the following holds:
:<math> \lim_{k\rightarrow \infty} C^k = 0. </math>
An important theorem states that an iterative method with iteration matrix <math> C </math> is convergent if and only if its [[spectral radius]] <math> \rho(C) </math> is smaller than unity, that is,
:<math> \rho(C) < 1. </math>

The basic iterative methods work by [[Matrix splitting|splitting]] the matrix <math> A </math> into
:<math> A = M - N, </math>
where the matrix <math> M </math> should be easily [[Invertible matrix|invertible]]. The iterative methods are then defined as
:<math> M \mathbf{x}^{k+1} = N \mathbf{x}^k + \mathbf{b}, \quad k \geq 0, </math>
or, equivalently,
:<math> \mathbf{x}^{k+1} = \mathbf{x}^k + M^{-1} (\mathbf{b} - A \mathbf{x}^k), \quad k \geq 0. </math>
From this it follows that the iteration matrix is given by
:<math> C = I - M^{-1}A = M^{-1}N. </math>

====Examples====
Basic examples of stationary iterative methods use a splitting of the matrix <math> A </math> such as
:<math> A = D + L + U, \quad D := \operatorname{diag}( (a_{ii})_i ), </math>
where <math> D </math> is the diagonal part of <math> A </math>, <math> L </math> is its strict lower [[Triangular matrix|triangular part]], and <math> U </math> is its strict upper triangular part. A minimal code sketch of the Jacobi method follows this list.
* [[Modified Richardson iteration|Richardson method]]: <math> M := \frac{1}{\omega} I \quad (\omega \neq 0) </math>
* [[Jacobi method]]: <math> M := D </math>
* [[Jacobi method#Weighted Jacobi method|Damped Jacobi method]]: <math> M := \frac{1}{\omega} D \quad (\omega \neq 0) </math>
* [[Gauss–Seidel method]]: <math> M := D + L </math>
* [[Successive over-relaxation|Successive over-relaxation method]] (SOR): <math> M := \frac{1}{\omega} D + L \quad (\omega \neq 0) </math>
* [[Symmetric successive over-relaxation]] (SSOR): <math> M := \frac{1}{\omega (2-\omega)} (D + \omega L) D^{-1} (D + \omega U) \quad (\omega \not\in \{0,2\}) </math>
Linear stationary iterative methods are also called [[Relaxation (iterative method)|relaxation methods]].
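As an illustration of the splitting framework above, the following is a minimal sketch (in Python with NumPy) of the Jacobi method, where <math> M = D </math>. The 3×3 matrix, right-hand side, and iteration count are made-up choices for the example; the matrix is strictly diagonally dominant, a standard sufficient condition for Jacobi convergence.

<syntaxhighlight lang="python">
import numpy as np

# Made-up example system: A is strictly diagonally dominant,
# so the Jacobi iteration converges for any initial guess.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# Jacobi splitting A = M - N with M = D (the diagonal part of A).
M_inv = np.diag(1.0 / np.diag(A))

# Iteration matrix C = I - M^{-1} A; the method converges iff rho(C) < 1.
C = np.eye(3) - M_inv @ A
print("spectral radius:", max(abs(np.linalg.eigvals(C))))

x = np.zeros(3)                    # initial guess x^0
for k in range(50):
    # x^{k+1} = x^k + M^{-1} (b - A x^k)
    x = x + M_inv @ (b - A @ x)

print("residual norm:", np.linalg.norm(b - A @ x))
</syntaxhighlight>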
===Krylov subspace methods===
Krylov subspace methods<ref>Charles George Broyden and Maria Teresa Vespucci: ''Krylov Solvers for Linear Algebraic Systems: Krylov Solvers'', Elsevier, ISBN 0-444-51474-0, (2004).</ref> work by forming a [[basis (linear algebra)|basis]] of the sequence of successive matrix powers times the initial residual (the '''Krylov sequence'''). The approximations to the solution are then formed by minimizing the residual over the subspace formed. The prototypical method in this class is the [[conjugate gradient method]] (CG), which assumes that the system matrix <math> A </math> is [[Symmetric matrix|symmetric]] [[Positive-definite matrix|positive-definite]]. For symmetric (and possibly indefinite) <math> A </math> one works with the [[minimal residual method]] (MINRES). For non-symmetric matrices, methods such as the [[generalized minimal residual method]] (GMRES) and the [[biconjugate gradient method]] (BiCG) have been derived.

====Convergence of Krylov subspace methods====
Since these methods form a basis, it is evident that, in exact arithmetic, the method converges in at most ''N'' iterations, where ''N'' is the system size. However, in the presence of rounding errors this statement does not hold; moreover, in practice ''N'' can be very large, and the iterative process reaches sufficient accuracy much earlier. The analysis of these methods is hard, as convergence depends on a complicated function of the [[spectrum of an operator|spectrum]] of the operator.

===Preconditioners===
The approximating operator that appears in stationary iterative methods can also be incorporated in Krylov subspace methods such as [[GMRES]] (alternatively, [[preconditioning|preconditioned]] Krylov methods can be considered as accelerations of stationary iterative methods), where it becomes a transformation of the original operator to a presumably better-conditioned one. The construction of preconditioners is a large research area.
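To make the combination of a Krylov method with a preconditioner concrete, the following is a minimal sketch (in Python with NumPy) of the preconditioned conjugate gradient method using a simple Jacobi (diagonal) preconditioner. The test matrix is a made-up symmetric positive-definite example, and the function name <code>pcg</code>, tolerances, and sizes are illustrative choices; production code should use a tested library routine.

<syntaxhighlight lang="python">
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=None):
    """Sketch of preconditioned CG for symmetric positive-definite A.
    M_inv_diag holds the inverse of a diagonal (Jacobi) preconditioner."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                    # initial residual
    z = M_inv_diag * r               # preconditioned residual
    p = z.copy()                     # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)        # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:  # stop once the residual is small
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p    # next A-conjugate direction
        rz = rz_new
    return x

# Made-up SPD test problem: A = B^T B + n I is symmetric positive-definite
# and well conditioned, so CG converges in few iterations.
n = 100
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n))
A = B.T @ B + n * np.eye(n)
b = rng.standard_normal(n)

x = pcg(A, b, M_inv_diag=1.0 / np.diag(A))
print("residual norm:", np.linalg.norm(b - A @ x))
</syntaxhighlight>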