===Non-metric multidimensional scaling (NMDS)===
In contrast to metric MDS, non-metric MDS finds both a [[non-parametric]] [[monotonic]] relationship between the dissimilarities in the item-item matrix and the Euclidean distances between items, and the location of each item in the low-dimensional space.

Let <math>d_{ij}</math> be the dissimilarity between points <math>i, j</math>, and let <math>\hat d_{ij} = \| x_i - x_j\|</math> be the Euclidean distance between the embedded points <math>x_i, x_j</math>. Now, for each choice of the embedded points <math>x_i</math> and each monotonically increasing function <math>f</math>, define the "stress" function:

:<blockquote><math>S(x_1, ..., x_n; f)=\sqrt{\frac{\sum_{i<j}\bigl(f(d_{ij})-\hat d_{ij}\bigr)^2}{\sum_{i<j} \hat d_{ij}^2}}.</math></blockquote>

The factor of <math>\sum_{i<j} \hat d_{ij}^2</math> in the denominator is necessary to prevent a "collapse": if we instead defined <math>S=\sqrt{\textstyle\sum_{i<j}\bigl(f(d_{ij})-\hat d_{ij}\bigr)^2}</math>, the stress could be trivially minimized by setting <math>f = 0</math> and collapsing every point to the same location. A few variants of this cost function exist. MDS programs automatically minimize stress in order to obtain the MDS solution.

The core of a non-metric MDS algorithm is a twofold optimization process. First, the optimal monotonic transformation of the proximities has to be found. Second, the points of the configuration have to be optimally arranged, so that their distances match the scaled proximities as closely as possible. NMDS needs to optimize these two objectives simultaneously. This is usually done iteratively:
:# Initialize <math>x_i</math> randomly, e.g. by sampling from a normal distribution.
:# Do until a stopping criterion is met (for example, <math>S < \epsilon</math>):
:## Solve for <math>f = \arg\min_f S(x_1, ..., x_n ; f)</math> by [[isotonic regression]].
:## Solve for <math>x_1, ..., x_n = \arg\min_{x_1, ..., x_n} S(x_1, ..., x_n ; f)</math> by gradient descent or other methods.
:# Return <math>x_i</math> and <math>f</math>.

[[Louis Guttman]]'s smallest space analysis (SSA) is an example of a non-metric MDS procedure.
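The alternating scheme above can be sketched in Python. This is a minimal illustration, not a production implementation: the function name <code>nmds</code>, the learning rate, and the iteration count are all illustrative choices, and plain gradient descent with a fixed step is used for the configuration update (practical software typically uses more robust optimizers).

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def nmds(D, dim=2, n_iter=200, lr=0.05, seed=0):
    """Minimal non-metric MDS: alternate isotonic regression (fit the
    monotone transform f) with gradient descent on the embedded points."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.normal(size=(n, dim))          # step 1: random initialization
    iu = np.triu_indices(n, k=1)           # upper-triangle pairs i < j
    d = D[iu]                              # input dissimilarities d_ij
    for _ in range(n_iter):
        diff = X[:, None, :] - X[None, :, :]
        dist = np.sqrt((diff ** 2).sum(-1))        # embedded distances
        dhat = dist[iu]
        # step 2a: optimal monotone f(d_ij) via isotonic regression
        disparities = IsotonicRegression().fit_transform(d, dhat)
        # step 2b: one gradient step on the (unnormalized) squared stress
        full = np.zeros((n, n))
        full[iu] = disparities
        full += full.T                             # symmetric disparity matrix
        ratio = np.where(dist > 0, (dist - full) / np.where(dist > 0, dist, 1.0), 0.0)
        grad = (ratio[:, :, None] * diff).sum(axis=1)
        X -= lr * grad
    # Kruskal-style normalized stress from the final iteration
    stress = np.sqrt(((disparities - dhat) ** 2).sum() / (dhat ** 2).sum())
    return X, stress
```

Because the stress compares embedded distances only with the fitted disparities <math>f(d_{ij})</math>, the embedding recovers the rank order of the dissimilarities rather than their absolute values.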