{{Short description|Matrix of partial derivatives of a vector-valued function}}
{{Redirect|Jacobian matrix|the operator|Jacobi matrix (operator)}}
{{Calculus|Multivariable}}
In [[vector calculus]], the '''Jacobian matrix''' ({{IPAc-en|dʒ|ə|ˈ|k|əʊ|b|i|ə|n}},<ref>{{cite web|url=https://en.oxforddictionaries.com/definition/jacobian|title=Jacobian - Definition of Jacobian in English by Oxford Dictionaries|website=Oxford Dictionaries - English|access-date=2 May 2018|url-status=dead|archive-url=https://web.archive.org/web/20171201043633/https://en.oxforddictionaries.com/definition/jacobian|archive-date=1 December 2017}}</ref><ref>{{cite web|url=http://www.dictionary.com/browse/jacobian|title=the definition of jacobian|website=Dictionary.com|access-date=2 May 2018|url-status=live|archive-url=https://web.archive.org/web/20171201040801/http://www.dictionary.com/browse/jacobian|archive-date=1 December 2017}}</ref><ref>{{cite web|url=https://forvo.com/word/jacobian/|title=Jacobian pronunciation: How to pronounce Jacobian in English|author=Forvo Team|website=forvo.com|access-date=2 May 2018}}</ref> {{IPAc-en|dʒ|ᵻ|-|,_|j|ᵻ|-}}) of a [[vector-valued function]] of several variables is the [[matrix (mathematics)|matrix]] of all its first-order [[partial derivative]]s. If this matrix is [[square matrix|square]], that is, if the number of variables equals the number of [[Euclidean_vector#Decomposition|components]] of function values, then its [[determinant]] is called the '''Jacobian determinant'''. Both the matrix and (if applicable) the determinant are often referred to simply as the '''Jacobian'''.<ref>{{cite web|url=http://mathworld.wolfram.com/Jacobian.html|title=Jacobian|last=Weisstein|first=Eric W.|website=mathworld.wolfram.com|access-date=2 May 2018|url-status=live|archive-url=https://web.archive.org/web/20171103144419/http://mathworld.wolfram.com/Jacobian.html|archive-date=3 November 2017}}</ref> They are named after [[Carl Gustav Jacob Jacobi]].

The Jacobian matrix generalizes the [[derivative]] and the [[differential of a function|differential]] of a function of a single variable to vector-valued functions of several variables. This generalization includes generalizations of the [[inverse function theorem]] and the [[implicit function theorem]], where the condition that the derivative is nonzero is replaced by the condition that the Jacobian determinant is nonzero, and the [[multiplicative inverse]] of the derivative is replaced by the [[inverse of a matrix|inverse]] of the Jacobian matrix. The Jacobian determinant plays a fundamental role in changes of variables in [[multiple integral]]s.

== Definition ==
Let <math display="inline">\mathbf{f}: \mathbb{R}^n \to \mathbb{R}^m</math> be a function such that each of its first-order partial derivatives exists on <math display="inline">\mathbb{R}^n</math>. This function takes a point {{tmath|1=\mathbf x =(x_1,\ldots,x_n)\in \mathbb{R}^n}} as input and produces the vector {{tmath|1=\mathbf f(\mathbf x) = (f_1(\mathbf x), \ldots, f_m(\mathbf x)) \in \mathbb{R}^m}} as output.
Then the Jacobian matrix of {{math|'''f'''}}, denoted {{math|'''J<sub>f</sub>'''}}, is the {{tmath|m\times n}} matrix whose {{math|(''i'', ''j'')}} entry is <math display="inline">\frac{\partial f_i}{\partial x_j};</math> explicitly
<math display="block">\mathbf{J_f} = \begin{bmatrix} \dfrac{\partial \mathbf{f}}{\partial x_1} & \cdots & \dfrac{\partial \mathbf{f}}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \nabla^{\mathsf{T}} f_1 \\ \vdots \\ \nabla^{\mathsf{T}} f_m \end{bmatrix} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n}\\ \vdots & \ddots & \vdots\\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix}</math>
where <math>\nabla^{\mathsf{T}} f_i</math> is the transpose (row vector) of the [[gradient]] of the <math>i</math>-th component.

The Jacobian matrix, whose entries are functions of {{math|'''x'''}}, is denoted in various ways; other common notations include {{math|''D'''''f'''}}, <math>\nabla \mathbf{f}</math>, and <math display="inline">\frac{\partial(f_1,\ldots,f_m)}{\partial(x_1,\ldots,x_n)}</math>.<ref>{{Cite book |last1=Holder |first1=Allen |title=An Introduction to computational science |last2=Eichholz |first2=Joseph |date=2019 |publisher=Springer |isbn=978-3-030-15679-4 |series=International Series in Operations Research & Management Science |location=Cham, Switzerland |pages=53}}</ref><ref>{{Cite book |last=Lovett |first=Stephen |url=https://books.google.com/books?id=G1bGDwAAQBAJ |title=Differential Geometry of Manifolds |date=2019-12-16 |publisher=CRC Press |isbn=978-0-429-60782-0 |pages=16 |language=en}}</ref> Some authors define the Jacobian as the [[transpose]] of the form given above.

The Jacobian matrix [[Matrix_(mathematics)#Linear_transformations|represents]] the [[total derivative|differential]] of {{math|'''f'''}} at every point where {{math|'''f'''}} is differentiable. In detail, if {{math|'''h'''}} is a [[displacement vector]] represented by a [[column matrix]], the [[matrix product]] {{math|'''J'''('''x''') ⋅ '''h'''}} is another displacement vector, which is the best linear approximation of the change of {{math|'''f'''}} in a [[neighborhood (mathematics)|neighborhood]] of {{math|'''x'''}}, provided that {{math|'''f'''}} is [[Differentiable function|differentiable]] at {{math|'''x'''}}.{{efn|Differentiability at {{math|'''x'''}} implies, but is not implied by, the existence of all first-order partial derivatives at {{math|'''x'''}}, and hence is a stronger condition.}} This means that the function that maps {{math|'''y'''}} to {{math|'''f'''('''x''') + '''J'''('''x''') ⋅ ('''y''' − '''x''')}} is the best [[linear approximation]] of {{math|'''f'''('''y''')}} for all points {{math|'''y'''}} close to {{math|'''x'''}}. The [[linear map]] {{math|'''h''' → '''J'''('''x''') ⋅ '''h'''}} is known as the ''derivative'' or the [[total derivative|''differential'']] of {{math|'''f'''}} at {{math|'''x'''}}.

When <math display="inline">m=n</math>, the Jacobian matrix is square, so its [[determinant]] is a well-defined function of {{math|'''x'''}}, known as the '''Jacobian determinant''' of {{math|'''f'''}}. It carries important information about the local behavior of {{math|'''f'''}}.
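In practice, the matrix and its determinant can be computed symbolically. The following is a minimal sketch, assuming the Python library SymPy is available, applied to the function of Example 1 below:

<syntaxhighlight lang="python">
import sympy as sp

x, y = sp.symbols('x y')
# The function of Example 1 below: f(x, y) = (x^2 y, 5x + sin y).
f = sp.Matrix([x**2 * y, 5*x + sp.sin(y)])

J = f.jacobian([x, y])   # the 2x2 Jacobian matrix of f
print(J)                 # Matrix([[2*x*y, x**2], [5, cos(y)]])
print(J.det())           # the Jacobian determinant: 2*x*y*cos(y) - 5*x**2
</syntaxhighlight>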
In particular, a continuously differentiable function {{math|'''f'''}} has a differentiable [[inverse function]] in a neighborhood of a point {{math|'''x'''}} if and only if the Jacobian determinant is nonzero at {{math|'''x'''}} (see [[inverse function theorem]] for an explanation of this and [[Jacobian conjecture]] for a related problem of ''global'' invertibility). The Jacobian determinant also appears when changing the variables in [[multiple integral]]s (see [[Integration_by_substitution#Substitution_for_multiple_variables|substitution rule for multiple variables]]).

When <math display="inline">m=1</math>, that is, when <math display="inline"> f: \mathbb{R}^n \to \mathbb{R}</math> is a [[scalar field|scalar-valued function]], the Jacobian matrix reduces to the [[row vector]] <math>\nabla^{\mathsf{T}} f</math>; this row vector of all first-order partial derivatives of {{tmath|f}} is the transpose of the [[gradient]] of {{tmath|f}}, i.e. <math>\mathbf{J}_{f} = \nabla^{\mathsf{T}} f</math>. Specializing further, when <math display="inline">m=n=1</math>, that is, when <math display="inline">f: \mathbb{R} \to \mathbb{R}</math> is a [[scalar field|scalar-valued function]] of a single variable, the Jacobian matrix has a single entry; this entry is the derivative of the function {{tmath|f}}.

These concepts are named after the [[mathematician]] [[Carl Gustav Jacob Jacobi]] (1804–1851).

== Jacobian matrix ==
The Jacobian of a vector-valued function in several variables generalizes the [[gradient]] of a [[scalar (mathematics)|scalar]]-valued function in several variables, which in turn generalizes the derivative of a scalar-valued function of a single variable. In other words, the Jacobian matrix of a scalar-valued [[multivariate function|function of several variables]] is (the transpose of) its gradient, and the gradient of a scalar-valued function of a single variable is its derivative.

At each point where a function is differentiable, its Jacobian matrix can also be thought of as describing the amount of "stretching", "rotating" or "transforming" that the function imposes locally near that point. For example, if {{math|(''x''′, ''y''′) {{=}} '''f'''(''x'', ''y'')}} is used to smoothly transform an image, the Jacobian matrix {{math|'''J'''<sub>'''f'''</sub>(''x'', ''y'')}} describes how the image in the neighborhood of {{math|(''x'', ''y'')}} is transformed.

If a function is differentiable at a point, its differential is given in coordinates by the Jacobian matrix. However, a function does not need to be differentiable for its Jacobian matrix to be defined, since only its first-order [[partial derivative]]s are required to exist. If {{math|'''f'''}} is [[derivative|differentiable]] at a point {{math|'''p'''}} in {{math|'''R'''<sup>''n''</sup>}}, then its [[Total derivative#The total derivative as a linear map|differential]] is represented by {{math|'''J'''<sub>'''f'''</sub>('''p''')}}.
In this case, the [[linear transformation]] represented by {{math|'''J'''<sub>'''f'''</sub>('''p''')}} is the best [[linear approximation]] of {{math|'''f'''}} near the point {{math|'''p'''}}, in the sense that
<math display="block">\mathbf f(\mathbf x) - \mathbf f(\mathbf p) = \mathbf J_{\mathbf f}(\mathbf p)(\mathbf x - \mathbf p) + o(\|\mathbf x - \mathbf p\|) \quad (\text{as } \mathbf{x} \to \mathbf{p}),</math>
where {{math|''o''(‖'''x''' − '''p'''‖)}} is a [[Big O notation#Little-o notation|quantity]] that approaches zero much faster than the [[Euclidean distance|distance]] between {{math|'''x'''}} and {{math|'''p'''}} does as {{math|'''x'''}} approaches {{math|'''p'''}}. This approximation specializes to the approximation of a scalar function of a single variable by its [[Taylor polynomial]] of degree one, namely
<math display="block">f(x) - f(p) = f'(p) (x - p) + o(x - p) \quad (\text{as } x \to p).</math>
In this sense, the Jacobian may be regarded as a kind of "[[derivative|first-order derivative]]" of a vector-valued function of several variables. In particular, this means that the [[gradient]] of a scalar-valued function of several variables may also be regarded as its "first-order derivative".

Composable differentiable functions {{math|'''f''' : '''R'''<sup>''n''</sup> → '''R'''<sup>''m''</sup>}} and {{math|'''g''' : '''R'''<sup>''m''</sup> → '''R'''<sup>''k''</sup>}} satisfy the [[Chain_rule#General_rule|chain rule]], namely <math>\mathbf{J}_{\mathbf{g} \circ \mathbf{f}}(\mathbf{x}) = \mathbf{J}_{\mathbf{g}}(\mathbf{f}(\mathbf{x})) \mathbf{J}_{\mathbf{f}}(\mathbf{x})</math> for {{math|'''x'''}} in {{math|'''R'''<sup>''n''</sup>}}.

The Jacobian of the gradient of a scalar function of several variables has a special name: the [[Hessian matrix]], which in a sense is the "[[second derivative]]" of the function in question.

== Jacobian determinant ==
[[File:Jacobian_determinant_and_distortion.svg|thumb|400px|A nonlinear map <math>f \colon \mathbb{R}^{2} \to \mathbb{R}^{2}</math> sends a small square (left, in red) to a distorted parallelogram (right, in red). The Jacobian at a point gives the best linear approximation of the distorted parallelogram near that point (right, in translucent white), and the Jacobian determinant gives the ratio of the area of the approximating parallelogram to that of the original square.]]
If {{math|1=''m'' = ''n''}}, then {{math|'''f'''}} is a function from {{math|'''R'''<sup>''n''</sup>}} to itself and the Jacobian matrix is a [[square matrix]]. We can then form its [[determinant]], known as the '''Jacobian determinant'''. The Jacobian determinant is sometimes simply referred to as "the Jacobian".

The Jacobian determinant at a given point gives important information about the behavior of {{math|'''f'''}} near that point. For instance, the [[continuously differentiable function]] {{math|'''f'''}} is [[invertible]] near a point {{math|'''p''' ∈ '''R'''<sup>''n''</sup>}} if the Jacobian determinant at {{math|'''p'''}} is non-zero. This is the [[inverse function theorem]]. Furthermore, if the Jacobian determinant at {{math|'''p'''}} is [[positive number|positive]], then {{math|'''f'''}} preserves [[Orientation (vector space)|orientation]] near {{math|'''p'''}}; if it is [[negative number|negative]], {{math|'''f'''}} reverses orientation.
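Both the sign and the magnitude of the Jacobian determinant can be checked numerically; the volume interpretation of the magnitude is made precise in the next paragraph. The following sketch, assuming NumPy and again using the function of Example 1 below, maps the two edge vectors of a tiny square at a point {{math|'''p'''}} and compares the signed area of the image with the Jacobian determinant:

<syntaxhighlight lang="python">
import numpy as np

def f(point):
    # The function of Example 1 below: f(x, y) = (x^2 y, 5x + sin y).
    x, y = point
    return np.array([x**2 * y, 5.0 * x + np.sin(y)])

p = np.array([1.0, 2.0])
eps = 1e-5
# Images under f of the two edge vectors of a tiny square with corner p.
u = f(p + np.array([eps, 0.0])) - f(p)
v = f(p + np.array([0.0, eps])) - f(p)

signed_area_ratio = (u[0] * v[1] - u[1] * v[0]) / eps**2
print(signed_area_ratio)  # about -6.664, i.e. det J_f(1, 2) = 4*cos(2) - 5
# The negative sign shows that f reverses orientation near p; the magnitude
# is the local area scale factor.
</syntaxhighlight>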
The [[absolute value]] of the Jacobian determinant at {{math|'''p'''}} gives the factor by which the function {{math|'''f'''}} expands or shrinks [[volume]]s near {{math|'''p'''}}; this is why it occurs in the general [[substitution rule]].

The Jacobian determinant is used when making a [[Integration by substitution#Substitution for multiple variables|change of variables]] when evaluating a [[multiple integral]] of a function over a region within its domain. To account for the change of coordinates, the magnitude of the Jacobian determinant arises as a multiplicative factor within the integral. This is because the {{math|''n''}}-dimensional {{math|''dV''}} element is in general a [[parallelepiped]] in the new coordinate system, and the {{math|''n''}}-volume of a parallelepiped is the absolute value of the determinant of its edge vectors.

The Jacobian can also be used to determine the stability of [[equilibrium point|equilibria]] for [[matrix differential equation|systems of differential equations]] by approximating behavior near an equilibrium point.

== Inverse ==
According to the [[inverse function theorem]], the [[Invertible matrix|matrix inverse]] of the Jacobian matrix of an [[invertible function]] {{math|'''f''' : '''R'''<sup>''n''</sup> → '''R'''<sup>''n''</sup>}} is the Jacobian matrix of the ''inverse'' function. That is, the Jacobian matrix of the inverse function at a point {{math|'''p'''}} is
<math display="block">\mathbf J_{\mathbf{f}^{-1}}(\mathbf{p}) = {\mathbf J^{-1}_{\mathbf{f}}(\mathbf{f}^{-1}(\mathbf{p}))},</math>
and the Jacobian determinant is
<math display="block">\det(\mathbf{J}_{\mathbf{f}^{-1}}(\mathbf{p})) = \frac{1}{\det(\mathbf{J}_{\mathbf{f}}(\mathbf{f}^{-1}(\mathbf{p})))}.</math>
If the Jacobian matrix is continuous and nonsingular at the point {{math|'''p'''}} in {{math|'''R'''<sup>''n''</sup>}}, then {{math|'''f'''}} is invertible when restricted to some [[Neighbourhood (mathematics)|neighbourhood]] of {{math|'''p'''}}. In other words, if the Jacobian determinant is not zero at a point, then the function is ''locally invertible'' near this point.

The (unproved) [[Jacobian conjecture]] is related to global invertibility in the case of a polynomial function, that is, a function defined by ''n'' [[polynomial]]s in ''n'' variables. It asserts that, if the Jacobian determinant is a non-zero constant (or, equivalently, if it has no complex zero), then the function is invertible and its inverse is a polynomial function.

== Critical points ==
{{main|Critical point (mathematics)|l1=Critical point}}
If {{math|'''f''' : '''R'''<sup>''n''</sup> → '''R'''<sup>''m''</sup>}} is a [[differentiable function]], a ''critical point'' of {{math|'''f'''}} is a point where the [[rank (linear algebra)|rank]] of the Jacobian matrix is not maximal. This means that the rank at the critical point is lower than the rank at some neighbouring point. In other words, let {{math|''k''}} be the maximal dimension of the [[open ball]]s contained in the image of {{math|'''f'''}}; then a point is critical if all [[minor (linear algebra)|minor]]s of order {{math|''k''}} of the Jacobian matrix of {{math|'''f'''}} are zero.

In the case where {{math|1=''m'' = ''n'' = ''k''}}, a point is critical if the Jacobian determinant is zero.
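As an illustration, the following sketch (assuming SymPy; the map, the complex squaring {{math|''z'' ↦ ''z''<sup>2</sup>}} read as a map of {{math|'''R'''<sup>2</sup>}}, is chosen only as an example) locates the critical points of a map by computing where the rank of its Jacobian matrix drops:

<syntaxhighlight lang="python">
import sympy as sp

x, y = sp.symbols('x y', real=True)
# Example map: the complex squaring z -> z^2, read as a map R^2 -> R^2.
f = sp.Matrix([x**2 - y**2, 2 * x * y])

J = f.jacobian([x, y])
print(J)        # Matrix([[2*x, -2*y], [2*y, 2*x]])
print(J.det())  # 4*x**2 + 4*y**2, which vanishes only at the origin
# At the sole critical point (0, 0) the rank drops below the maximal rank 2:
print(J.subs({x: 0, y: 0}).rank())  # 0
</syntaxhighlight>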
== Examples ==

=== Example 1 ===
Consider a function {{math|'''f''' : '''R'''<sup>2</sup> → '''R'''<sup>2</sup>,}} with {{math|(''x'', ''y'') ↦ (''f''<sub>1</sub>(''x'', ''y''), ''f''<sub>2</sub>(''x'', ''y'')),}} given by
<math display="block">\mathbf f\left(\begin{bmatrix} x\\y\end{bmatrix}\right) = \begin{bmatrix} f_1(x,y)\\f_2(x,y)\end{bmatrix} = \begin{bmatrix} x^2 y \\5 x + \sin y \end{bmatrix}.</math>
Then we have
<math display="block">f_1(x, y) = x^2 y</math>
and
<math display="block">f_2(x, y) = 5 x + \sin y.</math>
The Jacobian matrix of {{math|'''f'''}} is
<math display="block">\mathbf J_{\mathbf f}(x, y) = \begin{bmatrix} \dfrac{\partial f_1}{\partial x} & \dfrac{\partial f_1}{\partial y}\\[1em] \dfrac{\partial f_2}{\partial x} & \dfrac{\partial f_2}{\partial y} \end{bmatrix} = \begin{bmatrix} 2 x y & x^2 \\ 5 & \cos y \end{bmatrix}</math>
and the Jacobian determinant is
<math display="block">\det(\mathbf J_{\mathbf f}(x, y)) = 2 x y \cos y - 5 x^2.</math>

=== Example 2: polar-Cartesian transformation ===
The transformation from [[polar coordinate system|polar coordinates]] {{math|(''r'', ''φ'')}} to [[Cartesian coordinate system|Cartesian coordinates]] {{math|(''x'', ''y'')}} is given by the function {{math|'''F''': '''R'''<sup>+</sup> × [0, 2{{pi}}) → '''R'''<sup>2</sup>}} with components
<math display="block">\begin{align} x &= r \cos \varphi ; \\ y &= r \sin \varphi . \end{align}</math>
The Jacobian matrix of this transformation is
<math display="block">\mathbf J_{\mathbf F}(r, \varphi) = \begin{bmatrix} \frac{\partial x}{\partial r} & \frac{\partial x}{\partial\varphi}\\[0.5ex] \frac{\partial y}{\partial r} & \frac{\partial y}{\partial\varphi} \end{bmatrix} = \begin{bmatrix} \cos\varphi & - r\sin \varphi \\ \sin\varphi & r\cos \varphi \end{bmatrix}</math>
The Jacobian determinant is equal to {{math|''r''}}. This can be used to transform integrals between the two coordinate systems:
<math display="block">\iint_{\mathbf F(A)} f(x, y) \,dx \,dy = \iint_A f(r \cos \varphi, r \sin \varphi) \, r \, dr \, d\varphi .</math>

=== Example 3: spherical-Cartesian transformation ===
The transformation from [[spherical coordinate system|spherical coordinates]] {{math|(''ρ'', ''φ'', ''θ'')}}<ref>Joel Hass, Christopher Heil, and Maurice Weir. ''Thomas' Calculus Early Transcendentals, 14e''. Pearson, 2018, p. 959.</ref> to [[Cartesian coordinate system|Cartesian coordinates]] {{math|(''x'', ''y'', ''z'')}} is given by the function {{math|'''F''': '''R'''<sup>+</sup> × [0, ''π'') × [0, 2''π'') → '''R'''<sup>3</sup>}} with components
<math display="block">\begin{align} x &= \rho \sin \varphi \cos \theta ; \\ y &= \rho \sin \varphi \sin \theta ; \\ z &= \rho \cos \varphi . \end{align}</math>
The Jacobian matrix for this coordinate change is
<math display="block">\mathbf J_{\mathbf F}(\rho, \varphi, \theta) = \begin{bmatrix} \dfrac{\partial x}{\partial \rho} & \dfrac{\partial x}{\partial \varphi} & \dfrac{\partial x}{\partial \theta} \\[1em] \dfrac{\partial y}{\partial \rho} & \dfrac{\partial y}{\partial \varphi} & \dfrac{\partial y}{\partial \theta} \\[1em] \dfrac{\partial z}{\partial \rho} & \dfrac{\partial z}{\partial \varphi} & \dfrac{\partial z}{\partial \theta} \end{bmatrix} = \begin{bmatrix} \sin \varphi \cos \theta & \rho \cos \varphi \cos \theta & -\rho \sin \varphi \sin \theta \\ \sin \varphi \sin \theta & \rho \cos \varphi \sin \theta & \rho \sin \varphi \cos \theta \\ \cos \varphi & - \rho \sin \varphi & 0 \end{bmatrix}.</math>
The [[determinant]] is {{math|''ρ''<sup>2</sup> sin ''φ''}}.
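This determinant can be verified symbolically; a minimal sketch, again assuming SymPy:

<syntaxhighlight lang="python">
import sympy as sp

rho, phi, theta = sp.symbols('rho phi theta', positive=True)
# The spherical-to-Cartesian transformation (x, y, z) defined above.
F = sp.Matrix([rho * sp.sin(phi) * sp.cos(theta),
               rho * sp.sin(phi) * sp.sin(theta),
               rho * sp.cos(phi)])

J = F.jacobian([rho, phi, theta])
print(sp.simplify(J.det()))  # rho**2*sin(phi)
</syntaxhighlight>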
Since {{math|''dV'' {{=}} ''dx'' ''dy'' ''dz''}} is the volume for a rectangular differential volume element (because the volume of a rectangular prism is the product of its sides), we can interpret {{math|''dV'' {{=}} ''ρ''<sup>2</sup> sin ''φ'' ''dρ'' ''dφ'' ''dθ''}} as the volume of the spherical [[differential volume element]]. Unlike the volume of the rectangular differential volume element, the volume of this differential volume element is not constant: it varies with the coordinates {{math|''ρ''}} and {{math|''φ''}}. The Jacobian determinant can be used to transform integrals between the two coordinate systems:
<math display="block">\iiint_{\mathbf F(U)} f(x, y, z) \,dx \,dy \,dz = \iiint_U f(\rho \sin \varphi \cos \theta, \rho \sin \varphi\sin \theta, \rho \cos \varphi) \, \rho^2 \sin \varphi \, d\rho \, d\varphi \, d\theta .</math>

=== Example 4 ===
The Jacobian matrix of the function {{math|'''F''' : '''R'''<sup>3</sup> → '''R'''<sup>4</sup>}} with components
<math display="block">\begin{align} y_1 &= x_1 \\ y_2 &= 5 x_3 \\ y_3 &= 4 x_2^2 - 2 x_3 \\ y_4 &= x_3 \sin x_1 \end{align}</math>
is
<math display="block">\mathbf J_{\mathbf F}(x_1, x_2, x_3) = \begin{bmatrix} \dfrac{\partial y_1}{\partial x_1} & \dfrac{\partial y_1}{\partial x_2} & \dfrac{\partial y_1}{\partial x_3} \\[1em] \dfrac{\partial y_2}{\partial x_1} & \dfrac{\partial y_2}{\partial x_2} & \dfrac{\partial y_2}{\partial x_3} \\[1em] \dfrac{\partial y_3}{\partial x_1} & \dfrac{\partial y_3}{\partial x_2} & \dfrac{\partial y_3}{\partial x_3} \\[1em] \dfrac{\partial y_4}{\partial x_1} & \dfrac{\partial y_4}{\partial x_2} & \dfrac{\partial y_4}{\partial x_3} \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 5 \\ 0 & 8 x_2 & -2 \\ x_3\cos x_1 & 0 & \sin x_1 \end{bmatrix}.</math>
This example shows that the Jacobian matrix need not be a square matrix.

=== Example 5 ===
The Jacobian determinant of the function {{math|'''F''' : '''R'''<sup>3</sup> → '''R'''<sup>3</sup>}} with components
<math display="block">\begin{align} y_1 &= 5x_2 \\ y_2 &= 4x_1^2 - 2 \sin (x_2 x_3) \\ y_3 &= x_2 x_3 \end{align}</math>
is
<math display="block">\begin{vmatrix} 0 & 5 & 0 \\ 8 x_1 & -2 x_3 \cos(x_2 x_3) & -2 x_2 \cos (x_2 x_3) \\ 0 & x_3 & x_2 \end{vmatrix} = -8 x_1 \begin{vmatrix} 5 & 0 \\ x_3 & x_2 \end{vmatrix} = -40 x_1 x_2.</math>
From this we see that {{math|'''F'''}} reverses orientation near those points where {{math|''x''<sub>1</sub>}} and {{math|''x''<sub>2</sub>}} have the same sign; the function is [[locally]] invertible everywhere except near points where {{math|''x''<sub>1</sub> {{=}} 0}} or {{math|''x''<sub>2</sub> {{=}} 0}}. Intuitively, if one starts with a tiny object around the point {{math|(1, 2, 3)}} and applies {{math|'''F'''}} to that object, one obtains a resulting object with approximately {{math|40 × 1 × 2 {{=}} 80}} times the volume of the original one, with orientation reversed.

== Other uses ==

=== Dynamical systems ===
Consider a [[dynamical system]] of the form <math>\dot{\mathbf{x}} = F(\mathbf{x})</math>, where <math>\dot{\mathbf{x}}</math> is the (component-wise) derivative of <math>\mathbf{x}</math> with respect to the [[evolution parameter]] <math>t</math> (time), and <math>F \colon \mathbb{R}^{n} \to \mathbb{R}^{n}</math> is differentiable. If <math>F(\mathbf{x}_{0}) = 0</math>, then <math>\mathbf{x}_{0}</math> is a [[stationary point]] (also called a [[steady state]]).
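The following sketch (assuming NumPy; the system, a damped pendulum, is an arbitrary example) computes the Jacobian of <math>F</math> at a stationary point and its eigenvalues; the role of these eigenvalues in determining stability is explained next.

<syntaxhighlight lang="python">
import numpy as np

# Example system (a damped pendulum): x1' = x2, x2' = -sin(x1) - 0.5*x2.
def F(x):
    return np.array([x[1], -np.sin(x[0]) - 0.5 * x[1]])

def J_F(x):
    # Jacobian of F, computed by hand from the definition above.
    return np.array([[0.0, 1.0],
                     [-np.cos(x[0]), -0.5]])

x0 = np.array([0.0, 0.0])          # F(x0) = 0, so x0 is a stationary point
print(np.linalg.eigvals(J_F(x0)))  # about -0.25 +/- 0.97j; both real parts
                                   # are negative, so (per the criterion
                                   # below) the equilibrium is stable
</syntaxhighlight>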
By the [[Hartman–Grobman theorem]], the behavior of the system near a stationary point is related to the [[eigenvalue]]s of <math>\mathbf{J}_{F} \left( \mathbf{x}_{0} \right)</math>, the Jacobian of <math>F</math> at the stationary point.<ref>{{cite book |first1=D. K. |last1=Arrowsmith |first2=C. M. |last2=Place |title=Dynamical Systems: Differential Equations, Maps, and Chaotic Behaviour |chapter=The Linearization Theorem |publisher=Chapman & Hall |location=London |year=1992 |isbn=0-412-39080-9 |pages=77–81 |chapter-url=https://books.google.com/books?id=8qCcP7KNaZ0C&pg=PA77 }}</ref> Specifically, if the eigenvalues all have real parts that are negative, then the system is stable near the stationary point. If any eigenvalue has a real part that is positive, then the point is unstable. If the largest real part of the eigenvalues is zero, the Jacobian matrix does not allow for an evaluation of the stability.<ref>{{cite book |first1=Morris |last1=Hirsch |first2=Stephen |last2=Smale |title=Differential Equations, Dynamical Systems and Linear Algebra |year=1974 |isbn=0-12-349550-4 }}</ref>

=== Newton's method ===
A square system of coupled nonlinear equations can be solved iteratively by [[Newton's method#Systems of equations|Newton's method]]. This method uses the Jacobian matrix of the system of equations.

=== Regression and least squares fitting ===
The Jacobian serves as a linearized [[design matrix]] in statistical [[regression analysis|regression]] and [[curve fitting]]; see [[non-linear least squares]]. The Jacobian also appears in the study of random matrices, moments, local sensitivity, and statistical diagnostics.<ref>{{cite journal|last1=Liu|first1=Shuangzhe|last2=Leiva|first2=Victor|last3=Zhuang|first3=Dan|last4=Ma|first4=Tiefeng|last5=Figueroa-Zúñiga|first5=Jorge I.|date=March 2022|title=Matrix differential calculus with applications in the multivariate linear model and its diagnostics|journal=Journal of Multivariate Analysis |volume=188|pages=104849|doi=10.1016/j.jmva.2021.104849|doi-access=free}}</ref><ref>{{Cite journal|last1=Liu|first1=Shuangzhe|last2=Trenkler|first2=Götz|last3=Kollo|first3=Tõnu|last4=von Rosen|first4=Dietrich|last5=Baksalary|first5=Oskar Maria|date=2023|title=Professor Heinz Neudecker and matrix differential calculus|journal=Statistical Papers |volume=65 |issue=4 |pages=2605–2639 |language=en |doi=10.1007/s00362-023-01499-w|s2cid=263661094}}</ref>

== See also ==
* [[Center manifold]]
* [[Hessian matrix]]
* [[Pushforward (differential)]]

== Notes ==
{{notelist}}

== References ==
{{Reflist}}

== Further reading ==
* {{cite book |last=Gandolfo |first=Giancarlo |author-link=Giancarlo Gandolfo |title=Economic Dynamics |location=Berlin |publisher=Springer |edition=Third |year=1996 |isbn=3-540-60988-1 |pages=305–330 |chapter=Comparative Statics and the Correspondence Principle |chapter-url=https://books.google.com/books?id=ZMwXi67nhHQC&pg=PA305 }}
* {{cite book |first1=Murray H. |last1=Protter |author-link=Murray H. Protter |first2=Charles B. Jr. |last2=Morrey |author-link2=Charles B. Morrey Jr.
|title=Intermediate Calculus |location=New York |publisher=Springer |edition=Second |year=1985 |isbn=0-387-96058-9 |chapter=Transformations and Jacobians |pages=412–420 }}

== External links ==
* {{springer|title=Jacobian|id=p/j054080}}
* [http://mathworld.wolfram.com/Jacobian.html Mathworld] – a more technical explanation of Jacobians

{{Matrix classes}}

[[Category:Multivariable calculus]]
[[Category:Differential calculus]]
[[Category:Generalizations of the derivative]]
[[Category:Determinants]]
[[Category:Matrices (mathematics)]]
[[Category:Differential operators]]