==System of linear differential equations==
{{Main|Matrix differential equation}}
{{see also|system of differential equations}}

A system of linear differential equations consists of several linear differential equations that involve several unknown functions. In general one restricts the study to systems such that the number of unknown functions equals the number of equations.

An arbitrary linear ordinary differential equation and a system of such equations can be converted into a first order system of linear differential equations by adding variables for all but the highest order derivatives. That is, if {{tmath| y', y'', \ldots, y^{(k)} }} appear in an equation, one may replace them by new unknown functions {{tmath|y_1, \ldots, y_k }} that must satisfy the equations {{tmath|1=y'=y_1}} and {{tmath|1=y_i'=y_{i+1},}} for {{math|1=''i'' = 1, ..., ''k'' − 1}}.

A linear system of the first order, which has {{mvar|n}} unknown functions and {{mvar|n}} differential equations, may normally be solved for the derivatives of the unknown functions. If this is not the case, it is a [[differential-algebraic system of equations|differential-algebraic system]], which belongs to a different theory. Therefore, the systems that are considered here have the form
<math display="block">\begin{align}
y_1'(x) &= b_1(x) +a_{1,1}(x)y_1+\cdots+a_{1,n}(x)y_n\\[1ex]
&\;\;\vdots\\[1ex]
y_n'(x) &= b_n(x) +a_{n,1}(x)y_1+\cdots+a_{n,n}(x)y_n,
\end{align}</math>
where the {{tmath|b_i}} and the {{tmath|a_{i,j} }} are functions of {{mvar|x}}. In matrix notation, this system may be written (omitting "{{math|(''x'')}}")
<math display="block">\mathbf{y}' = A\mathbf{y}+\mathbf{b}.</math>

The solving method is similar to that of a single first order linear differential equation, but with complications stemming from noncommutativity of matrix multiplication.

Let
<math display="block">\mathbf{u}' = A\mathbf{u}</math>
be the homogeneous equation associated to the above matrix equation. Its solutions form a [[vector space]] of dimension {{mvar|n}}, and are therefore the columns of a [[square matrix]] of functions {{tmath|U(x)}}, whose [[determinant]] is not the zero function. If {{math|1=''n'' = 1}}, or {{mvar|A}} is a matrix of constants, or, more generally, if {{mvar|A}} commutes with its [[antiderivative]] {{tmath|1=\textstyle B=\int Adx}}, then one may choose {{mvar|U}} equal to the [[matrix exponential|exponential]] of {{mvar|B}}. In fact, in these cases, one has
<math display="block">\frac{d}{dx}\exp(B) = A\exp (B).</math>
In the general case there is no closed-form solution for the homogeneous equation, and one has to use either a [[numerical method]], or an approximation method such as the [[Magnus expansion]].

Knowing the matrix {{mvar|U}}, the general solution of the non-homogeneous equation is
<math display="block">\mathbf{y}(x) = U(x)\mathbf{y_0} + U(x)\int U^{-1}(x)\mathbf{b}(x)\,dx,</math>
where the column matrix <math>\mathbf{y_0}</math> is an arbitrary [[constant of integration]].

If initial conditions are given as
<math display="block">\mathbf y(x_0)=\mathbf y_0,</math>
the solution that satisfies these initial conditions is
<math display="block">\mathbf{y}(x) = U(x)U^{-1}(x_0)\mathbf{y_0} + U(x)\int_{x_0}^x U^{-1}(t)\mathbf{b}(t)\,dt.</math>
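For example, the homogeneous constant-coefficient system
<math display="block">\mathbf{y}' = A\mathbf{y},\qquad A=\begin{pmatrix}0&1\\-1&0\end{pmatrix},</math>
that is, {{tmath|1=y_1'=y_2}} and {{tmath|1=y_2'=-y_1}}, falls in the case where {{mvar|A}} commutes with its antiderivative {{tmath|1=B=Ax}}. Since {{tmath|1=A^2=-I}}, the matrix exponential reduces to
<math display="block">U(x)=\exp(Ax)=I\cos x + A\sin x=\begin{pmatrix}\cos x&\sin x\\-\sin x&\cos x\end{pmatrix},</math>
and the general solution is {{tmath|1=\mathbf y(x)=U(x)\mathbf y_0}}, that is, {{tmath|1=y_1=c_1\cos x+c_2\sin x}} and {{tmath|1=y_2=-c_1\sin x+c_2\cos x}}, which is the familiar solution of the equivalent second-order equation {{tmath|1=y''=-y}} together with its derivative.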