Multidimensional scaling
==Details==

The data to be analyzed is a collection of <math>M</math> objects (colors, faces, stocks, ...) on which a ''distance function'' is defined:

:<math>d_{i,j} :=</math> distance between the <math>i</math>-th and <math>j</math>-th objects.

These distances are the entries of the ''dissimilarity matrix''

:<math> D := \begin{pmatrix} d_{1,1} & d_{1,2} & \cdots & d_{1,M} \\ d_{2,1} & d_{2,2} & \cdots & d_{2,M} \\ \vdots & \vdots & & \vdots \\ d_{M,1} & d_{M,2} & \cdots & d_{M,M} \end{pmatrix}. </math>

The goal of MDS is, given <math>D</math>, to find <math>M</math> vectors <math>x_1,\ldots,x_M \in \mathbb{R}^N</math> such that

:<math>\|x_i - x_j\| \approx d_{i,j}</math> for all <math>i,j\in \{1,\dots,M\}</math>,

where <math>\|\cdot\|</math> is a [[Norm (mathematics)|vector norm]]. In classical MDS, this norm is the [[Euclidean distance]], but, in a broader sense, it may be a [[metric (mathematics)|metric]] or an arbitrary distance function.<ref name="Kruskal">[[Joseph Kruskal|Kruskal, J. B.]], and Wish, M. (1978), ''Multidimensional Scaling'', Sage University Paper series on Quantitative Application in the Social Sciences, 07-011. Beverly Hills and London: Sage Publications.</ref> For example, when dealing with mixed-type data that contain numerical as well as categorical descriptors, [[Gower's distance]] is a common alternative.{{cn|date=June 2024}}

In other words, MDS attempts to find a mapping from the <math>M</math> objects into <math>\mathbb{R}^N</math> such that distances are preserved. If the dimension <math>N</math> is chosen to be 2 or 3, we may plot the vectors <math>x_i</math> to obtain a visualization of the similarities between the <math>M</math> objects. Note that the vectors <math>x_i</math> are not unique: with the Euclidean distance, they may be arbitrarily translated, rotated, and reflected, since these transformations do not change the pairwise distances <math>\|x_i - x_j\|</math>.
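The non-uniqueness noted above can be checked directly. The following sketch (an illustrative example using NumPy, not part of the article's sources; the points and the rotation angle are arbitrary choices) builds the dissimilarity matrix <math>D</math> from a small point set and verifies that a rigid motion of the points leaves every entry <math>d_{i,j}</math> unchanged:

```python
import numpy as np

# Hypothetical example: M = 4 objects represented as points in R^2.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

# Dissimilarity matrix with entries d_ij = ||x_i - x_j|| (Euclidean norm).
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

assert np.allclose(D, D.T)           # D is symmetric
assert np.allclose(np.diag(D), 0.0)  # zero diagonal: d_ii = 0

# Rotate by an arbitrary angle and translate: a rigid motion of the points.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Y = X @ R.T + np.array([3.0, -2.0])

# The pairwise distances, and hence D, are unchanged.
D2 = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
assert np.allclose(D, D2)
```

Because only the pairwise distances enter the problem, any embedding returned by MDS is determined at best up to such rigid motions (and reflections).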
(Note: The symbol <math>\mathbb{R}</math> indicates the set of [[real numbers]], and the notation <math>\mathbb{R}^N</math> refers to the Cartesian product of <math>N</math> copies of <math>\mathbb{R}</math>, which is an <math>N</math>-dimensional vector space over the field of the real numbers.)

There are various approaches to determining the vectors <math>x_i</math>. Usually, MDS is formulated as an [[optimization (mathematics)|optimization problem]], where <math>(x_1,\ldots,x_M)</math> is found as a minimizer of some cost function, for example,

:<math> \underset{x_1,\ldots,x_M}{\mathrm{argmin}} \sum_{i<j} \left( \|x_i - x_j\| - d_{i,j} \right)^2. </math>

A solution may then be found by numerical optimization techniques. For some particularly chosen cost functions, minimizers can be stated analytically in terms of matrix [[Eigendecomposition of a matrix|eigendecompositions]].<ref name="borg" />
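The eigendecomposition route mentioned above can be sketched for the classical (Euclidean) case. The sketch below (an editorial illustration in NumPy; the function name and the test points are made up for this example) double-centers the squared dissimilarities to obtain a Gram matrix <math>B = -\tfrac12 J D^{(2)} J</math>, takes the top-<math>N</math> eigenpairs of <math>B</math>, and scales the eigenvectors by the square roots of the eigenvalues; when <math>D</math> really is a Euclidean distance matrix, the recovered configuration reproduces <math>D</math> exactly:

```python
import numpy as np

def classical_mds(D, N=2):
    """Embed an M x M dissimilarity matrix D into R^N via the
    eigendecomposition of the double-centred matrix B = -1/2 J D^2 J."""
    M = D.shape[0]
    J = np.eye(M) - np.ones((M, M)) / M   # centring matrix
    B = -0.5 * J @ (D ** 2) @ J           # Gram matrix of the centred points
    w, V = np.linalg.eigh(B)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:N]         # indices of the top-N eigenpairs
    scale = np.sqrt(np.maximum(w[idx], 0.0))
    return V[:, idx] * scale              # rows are the vectors x_1, ..., x_M

# Sanity check on a known Euclidean configuration (arbitrary test points).
X = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 1.0], [3.0, 4.0]])
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Z = classical_mds(D, N=2)
Dz = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
assert np.allclose(D, Dz)  # distances are reproduced (up to rigid motion of Z)
```

For non-Euclidean dissimilarities, or for cost functions such as the one displayed above, no closed form is available in general and iterative numerical minimization is used instead.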