=== Laplacian eigenmaps ===
{{see also|Manifold regularization}}

Laplacian eigenmaps uses spectral techniques to perform dimensionality reduction.<ref>{{cite journal |first1=Mikhail |last1=Belkin |author-link2=Partha Niyogi |first2=Partha |last2=Niyogi |title=Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering |journal=Advances in Neural Information Processing Systems |volume=14 |year=2001 |pages=585–591 |publisher=MIT Press |oclc=52710683 |isbn=0-262-27173-7 |url=http://www.its.caltech.edu/~matilde/BelkinNiyogiLaplacian.pdf }}</ref> The technique rests on the assumption that the data lie on a low-dimensional manifold embedded in a high-dimensional space.<ref name="Belkin">{{cite thesis |first=Mikhail |last=Belkin |title=Problems of Learning on Manifolds |type=PhD |publisher=Department of Mathematics, The University of Chicago |date=August 2003 |url=https://web.cse.ohio-state.edu/~belkin.8/papers/papers.html#thesis }} Matlab code for Laplacian Eigenmaps can be found in algorithms at [https://www.cse.ohio-state.edu/~mbelkin/algorithms/algorithms.html Ohio-state.edu]</ref> The algorithm cannot embed out-of-sample points, but techniques based on [[Reproducing kernel Hilbert space]] regularization exist for adding this capability.<ref>{{cite book |last1=Bengio |first1=Yoshua |first2=Jean-Francois |last2=Paiement |first3=Pascal |last3=Vincent |first4=Olivier |last4=Delalleau |first5=Nicolas |last5=Le Roux |first6=Marie |last6=Ouimet |chapter-url=https://papers.nips.cc/paper/2461-out-of-sample-extensions-for-lle-isomap-mds-eigenmaps-and-spectral-clustering.pdf |chapter=Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering |title=Advances in Neural Information Processing Systems |year=2004 |isbn=0-262-20152-6 |volume=16 |publisher=MIT Press }}</ref> Such techniques can be applied to other nonlinear dimensionality reduction algorithms as well.

Traditional techniques like principal component analysis do not consider the intrinsic geometry of the data. Laplacian eigenmaps instead builds a graph from the neighborhood information of the data set. Each data point serves as a node of the graph, and connectivity between nodes is governed by the proximity of neighboring points (determined, e.g., by the [[k-nearest neighbor algorithm]]). The graph thus generated can be considered a discrete approximation of the low-dimensional manifold in the high-dimensional space. Minimizing a cost function based on the graph ensures that points close to each other on the manifold are mapped close to each other in the low-dimensional space, preserving local distances. The eigenfunctions of the [[Laplace–Beltrami operator]] on the manifold serve as the embedding dimensions, since under mild conditions the eigenfunctions of this operator form a countable basis for square-integrable functions on the manifold (compare to [[Fourier series]] on the unit circle manifold). Attempts to place Laplacian eigenmaps on solid theoretical ground have met with some success: under certain nonrestrictive assumptions, the graph Laplacian matrix has been shown to converge to the Laplace–Beltrami operator as the number of points goes to infinity.<ref name="Belkin" />
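The procedure above can be summarized in a short sketch. The following [[Python (programming language)|Python]] code is illustrative only, not a reference implementation: the function name <code>laplacian_eigenmaps</code>, its parameters, and the choice of a connectivity-based ''k''-nearest-neighbor graph with the unnormalized Laplacian are assumptions made for this example.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10):
    """Illustrative sketch: embed the rows of X with Laplacian eigenmaps."""
    # Step 1: build the k-nearest-neighbor graph and symmetrize it, so the
    # adjacency matrix W describes an undirected graph approximating the manifold.
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode='connectivity')
    W = np.asarray(0.5 * (W + W.T).todense())

    # Step 2: form the degree matrix D and the unnormalized graph Laplacian
    # L = D - W, the discrete analogue of the Laplace-Beltrami operator.
    D = np.diag(W.sum(axis=1))
    L = D - W

    # Step 3: solve the generalized eigenproblem L f = lambda D f; this
    # minimizes the graph-based cost function. Assumes the graph is
    # connected, so that D is positive definite.
    eigenvalues, eigenvectors = eigh(L, D)

    # Step 4: discard the constant eigenvector (eigenvalue 0) and use the
    # eigenvectors of the next n_components smallest eigenvalues as the
    # low-dimensional coordinates.
    return eigenvectors[:, 1:n_components + 1]
</syntaxhighlight>

In practice, scikit-learn's <code>SpectralEmbedding</code> implements a variant of this procedure with a sparse eigensolver, which scales better than the dense solver used in this sketch.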