== Applications of NLDR ==
High-dimensional data can be hard for machines to work with, requiring significant time and space for analysis. It also presents a challenge for humans, since it is hard to visualize or understand data in more than three dimensions. Reducing the dimensionality of a data set, while keeping its essential features relatively intact, can make algorithms more efficient and allow analysts to visualize trends and patterns.

[[File:Plot of two-dimensional points resulting from NLDR algorithm.jpg|thumb|right|500px|Plot of the two-dimensional points that result from using an NLDR algorithm. In this case, Manifold Sculpting is used to reduce the data into just two dimensions (rotation and scale).]]

The reduced-dimensional representations of data are often referred to as "intrinsic variables". This description implies that these are the values from which the data was produced. For example, consider a dataset that contains images of a letter 'A' which has been scaled and rotated by varying amounts. Each image has 32×32 pixels, so it can be represented as a vector of 1024 pixel values. Each such vector is a sample on a two-dimensional manifold in 1024-dimensional space (a [[Hamming space]]). The [[intrinsic dimension]]ality is two, because two variables (rotation and scale) were varied in order to produce the data. Information about the shape or look of the letter 'A' is not part of the intrinsic variables, because it is the same in every instance. Nonlinear dimensionality reduction discards the correlated information (the letter 'A') and recovers only the varying information (rotation and scale). The image to the right shows sample images from this dataset (to save space, not all input images are shown) and a plot of the two-dimensional points that result from using an NLDR algorithm (in this case, Manifold Sculpting) to reduce the data to just two dimensions.
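A minimal sketch of the setup described above, assuming NumPy: it renders a crude stand-in glyph (a triangular wedge, not an actual letter 'A') at varying rotations and scales, flattening each 32×32 image into a 1024-dimensional vector. The function name <code>make_image</code> and the glyph shape are illustrative choices, not part of any standard dataset.

```python
import numpy as np

def make_image(angle, scale, size=32):
    """Render a simple wedge glyph, rotated by `angle` and scaled by `scale`.

    The glyph is a hypothetical stand-in for the letter 'A' in the
    article's example; only rotation and scale vary between images.
    """
    ys, xs = np.mgrid[0:size, 0:size]
    # Centre the pixel grid, then undo the rotation and scale so we can
    # test membership against a fixed reference shape.
    x = (xs - size / 2) / scale
    y = (ys - size / 2) / scale
    c, s = np.cos(-angle), np.sin(-angle)
    xr, yr = c * x - s * y, s * x + c * y
    # A crude triangular wedge: narrow at the top, wide at the base.
    inside = (np.abs(xr) <= (yr + 10) * 0.4) & (yr >= -10) & (yr <= 6)
    return inside.astype(float).ravel()  # flatten to a 1024-vector

rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 200)
scales = rng.uniform(0.6, 1.4, 200)
X = np.stack([make_image(a, s) for a, s in zip(angles, scales)])
print(X.shape)  # (200, 1024): 200 samples on a 2-D manifold in 1024-D space
```

Even though each sample lives in a 1024-dimensional space, the dataset has only two degrees of freedom (the angle and scale used to generate it), which is exactly the intrinsic dimensionality an NLDR algorithm aims to recover.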
[[File:Letters pca.png|thumb|right|500px|When PCA (a linear dimensionality reduction algorithm) is used to reduce this same dataset into two dimensions, the resulting values are not so well organized.]]

By comparison, if [[principal component analysis]], a linear dimensionality reduction algorithm, is used to reduce this same dataset into two dimensions, the resulting values are not so well organized. This demonstrates that the high-dimensional vectors (each representing a letter 'A') that sample this manifold vary in a non-linear manner.

NLDR therefore has several applications in the field of computer vision. For example, consider a robot that uses a camera to navigate in a closed, static environment. The images obtained by that camera can be considered to be samples on a manifold in high-dimensional space, and the intrinsic variables of that manifold will represent the robot's position and orientation.

[[Invariant manifold]]s are of general interest for model order reduction in [[dynamical systems]]. In particular, if there is an attracting invariant manifold in the phase space, nearby trajectories will converge onto it and stay on it indefinitely, rendering it a candidate for dimensionality reduction of the dynamical system.
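The linear baseline in the comparison above can be sketched in a few lines of NumPy: PCA computed via the singular value decomposition, projecting centred data onto its top two principal components. This is a generic illustration of the technique, not the specific experiment behind the figure; the random data below merely stands in for the flattened image vectors.

```python
import numpy as np

def pca_2d(X):
    """Project rows of X onto their top two principal components."""
    Xc = X - X.mean(axis=0)  # centre the data
    # The rows of Vt are the principal directions (right singular vectors).
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T     # linear projection onto the top two components

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 1024))  # stand-in for the 1024-D image vectors
Y = pca_2d(X)
print(Y.shape)  # (100, 2)
```

Because PCA can only apply a single linear projection, points that lie on a curved manifold (like the rotated-and-scaled images) are flattened rather than unrolled, which is why the resulting 2-D values are less well organized than those produced by a nonlinear method.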
While such manifolds are not guaranteed to exist in general, the theory of [[spectral submanifold|spectral submanifolds (SSM)]] gives conditions for the existence of unique attracting invariant objects in a broad class of dynamical systems.<ref>{{cite journal | doi=10.1007/s11071-016-2974-z | title=Nonlinear normal modes and spectral submanifolds: Existence, uniqueness and use in model reduction | year=2016 | last1=Haller | first1=George | last2=Ponsioen | first2=Sten | journal=Nonlinear Dynamics | volume=86 | issue=3 | pages=1493–1534 | arxiv=1602.00560 | bibcode=2016NonDy..86.1493H | s2cid=44074026 }}</ref> Active research in NLDR seeks to unfold the observation manifolds associated with dynamical systems to develop modeling techniques.<ref>{{cite conference |last1=Gashler |first1=M. |last2=Martinez |first2=T. |url=http://axon.cs.byu.edu/papers/gashler2011ijcnn2.pdf |title=Temporal Nonlinear Dimensionality Reduction |conference=Proceedings of the International Joint Conference on Neural Networks IJCNN'11 |pages=1959–66 |year=2011 }}</ref> Some of the more prominent '''nonlinear dimensionality reduction''' techniques are listed below.