== Applications ==

=== Continuum mechanics ===
Important examples are provided by [[continuum mechanics]]. The stresses inside a solid body or [[fluid]]<ref>{{cite book |last1=Schobeiri |first1=Meinhard T. |date=2021 |title=Fluid Mechanics for Engineers |publisher=Springer |pages=11–29 |chapter=Vector and Tensor Analysis, Applications to Fluid Mechanics}}</ref> are described by a tensor field. The [[Stress (mechanics)|stress tensor]] and [[strain tensor]] are both second-order tensor fields, and in a general linear elastic material they are related by a fourth-order [[elasticity tensor]] field.

In detail, the tensor quantifying stress in a 3-dimensional solid object has components that can be conveniently represented as a 3 × 3 array. Three mutually perpendicular faces of a cube-shaped infinitesimal volume segment of the solid are each subject to some given force, and each such force has three vector components, so 3 × 3 = 9 components are required to describe the stress at this infinitesimal segment. The stress varies from point to point within the solid, with 9 such components needed at each point, so the stress is described by a second-order tensor field.

If a particular [[Volume form|surface element]] inside the material is singled out, the material on one side of the surface will apply a force on the other side. In general, this force will not be orthogonal to the surface, but it will depend on the orientation of the surface in a linear manner. In [[linear elasticity]] this is described by a tensor of [[type of a tensor|type {{nowrap|(2, 0)}}]], or more precisely by a tensor field of type {{nowrap|(2, 0)}}, since the stresses may vary from point to point.

=== Other examples from physics ===
Common applications include:
* [[Electromagnetic tensor]] (or Faraday tensor) in [[electromagnetism]]
* [[Finite deformation tensors]] for describing deformations and the [[strain tensor]] for [[Strain (materials science)|strain]] in [[continuum mechanics]]
* [[Permittivity]] and [[electric susceptibility]], which are tensors in [[anisotropic]] media
* [[Four-tensors]] in [[general relativity]] (e.g. the [[stress–energy tensor]]), used to represent [[momentum]] [[flux]]es
* Spherical tensor operators, the eigenfunctions of the quantum [[angular momentum operator]] in [[spherical coordinates]]
* Diffusion tensors, the basis of [[diffusion tensor imaging]], which represent rates of diffusion in biological environments
* Tensor products, used in [[quantum mechanics]] and [[quantum computing]] to combine quantum states

=== Computer vision and optics ===
The concept of a tensor of order two is often conflated with that of a matrix. Tensors of higher order do, however, capture ideas important in science and engineering, as has been shown repeatedly in numerous areas as they have developed. This happens, for instance, in the field of [[computer vision]], with the [[trifocal tensor]] generalizing the [[fundamental matrix (computer vision)|fundamental matrix]].

The field of [[nonlinear optics]] studies the changes to material [[Polarization density#Relation between P and E in various materials|polarization density]] under extreme electric fields. The polarization waves generated are related to the generating [[electric field]]s through the nonlinear susceptibility tensor. If the polarization '''P''' is not linearly proportional to the electric field '''E''', the medium is termed ''nonlinear''. To a good approximation (for sufficiently weak fields, assuming no permanent dipole moments are present), '''P''' is given by a [[Taylor series]] in '''E''' whose coefficients are the nonlinear susceptibilities:
:<math> \frac{P_i}{\varepsilon_0} = \sum_j \chi^{(1)}_{ij} E_j + \sum_{jk} \chi_{ijk}^{(2)} E_j E_k + \sum_{jk\ell} \chi_{ijk\ell}^{(3)} E_j E_k E_\ell + \cdots.</math>
Here <math>\chi^{(1)}</math> is the linear susceptibility, <math>\chi^{(2)}</math> gives the [[Pockels effect]] and [[second harmonic generation]], and <math>\chi^{(3)}</math> gives the [[Kerr effect]]. This expansion shows the way higher-order tensors arise naturally in the subject matter.
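The contractions in this expansion can be evaluated directly once components of the susceptibility tensors are given. The following Python (NumPy) fragment is only an illustrative sketch: it uses arbitrary, physically meaningless susceptibility values and keeps just the first two terms of the series.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative susceptibility components (not values for any real material).
chi1 = rng.standard_normal((3, 3))      # linear susceptibility chi^(1), a second-order tensor
chi2 = rng.standard_normal((3, 3, 3))   # second-order susceptibility chi^(2), a third-order tensor

E = np.array([0.1, 0.0, 0.2])           # components of the electric field E

# P_i / eps_0 = sum_j chi1_ij E_j + sum_jk chi2_ijk E_j E_k + ...
P_over_eps0 = (np.einsum('ij,j->i', chi1, E)
               + np.einsum('ijk,j,k->i', chi2, E, E))

print(P_over_eps0)   # components of P / eps_0, truncated after the chi^(2) term
</syntaxhighlight>

Each further term of the series contracts a susceptibility tensor of one order higher with an additional copy of the field.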
=== Machine learning ===
The properties of [[Tensor (machine learning)|tensors]], especially [[tensor decomposition]], have enabled their use in [[machine learning]] to embed higher-dimensional data in [[artificial neural networks]]. This notion of tensor differs significantly from the one used in other areas of mathematics and physics: here a tensor is usually regarded as a numerical quantity in a fixed basis, and the dimensions of the spaces along the different axes of the tensor need not be the same.
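For illustration (a minimal sketch using NumPy with arbitrary data, not tied to any particular model or library), a third-order tensor in this sense is a numerical array whose axes may have different lengths, and a rank-1 term of the kind used in tensor decompositions is an outer product of one vector per axis:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# A "tensor" in the machine-learning sense: a numerical multidimensional array
# in a fixed basis; the three axes here have different lengths.
T = rng.standard_normal((4, 5, 6))

# A rank-1 tensor, the building block of CP-style tensor decompositions,
# is the outer product of one vector per axis.
a = rng.standard_normal(4)
b = rng.standard_normal(5)
c = rng.standard_normal(6)
rank1 = np.einsum('i,j,k->ijk', a, b, c)

# Flattening ("unfolding") all but the first axis gives a 4 x 30 matrix,
# which lets ordinary matrix methods be applied to the tensor data.
T_flat = T.reshape(T.shape[0], -1)

print(T.shape, rank1.shape, T_flat.shape)   # (4, 5, 6) (4, 5, 6) (4, 30)
</syntaxhighlight>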