===Classical multidimensional scaling===
It is also known as '''Principal Coordinates Analysis''' (PCoA), Torgerson Scaling or Torgerson–Gower scaling. It takes an input matrix giving dissimilarities between pairs of items and outputs a coordinate matrix whose configuration minimizes a [[loss function]] called ''strain'',<ref name="borg"/> which is given by
<math display=block>\text{Strain}_D(x_1,x_2,...,x_n)=\Biggl(\frac{ \sum_{i,j} \bigl( b_{ij} - x_i^T x_j \bigr)^2}{\sum_{i,j}b_{ij}^2} \Biggr)^{1/2},</math>
where <math>x_{i}</math> denote vectors in ''N''-dimensional space, <math>x_i^T x_j </math> denotes the scalar product between <math>x_{i}</math> and <math>x_{j}</math>, and <math>b_{ij}</math> are the elements of the matrix <math>B</math> defined in step 2 of the following algorithm, which are computed from the distances.

: '''Steps of a Classical MDS algorithm:'''
: Classical MDS uses the fact that the coordinate matrix <math>X</math> can be derived by [[Eigendecomposition of a matrix|eigenvalue decomposition]] from <math display="inline">B=XX'</math>, and the matrix <math display="inline">B</math> can be computed from the proximity matrix <math display="inline">D</math> by using double centering.<ref>Wickelmaier, Florian. "An introduction to MDS." ''Sound Quality Research Unit, Aalborg University, Denmark'' (2003): 46</ref>
:# Set up the squared proximity matrix <math display="inline">D^{(2)}=[d_{ij}^2]</math>.
:# Apply double centering: <math display="inline">B=-\frac{1}{2}CD^{(2)}C</math> using the [[centering matrix]] <math display="inline">C=I-\frac{1}{n}J_n</math>, where <math display="inline">n</math> is the number of objects, <math display="inline">I</math> is the <math display="inline">n \times n</math> identity matrix, and <math display="inline">J_{n}</math> is an <math display="inline">n\times n</math> matrix of all ones.
:# Determine the <math display="inline">m</math> largest [[Eigenvalues and eigenvectors|eigenvalues]] <math display="inline">\lambda_1,\lambda_2,...,\lambda_m</math> and corresponding [[Eigenvalues and eigenvectors|eigenvectors]] <math display="inline">e_1,e_2,...,e_m</math> of <math display="inline">B</math> (where <math display="inline">m</math> is the number of dimensions desired for the output).
:# Now, <math display="inline">X=E_m\Lambda_m^{1/2}</math>, where <math display="inline">E_m</math> is the matrix of <math display="inline">m</math> eigenvectors and <math display="inline">\Lambda_m</math> is the [[diagonal matrix]] of <math display="inline">m</math> eigenvalues of <math display="inline">B</math>.
: Classical MDS assumes metric distances, so it is not applicable to direct dissimilarity ratings.
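The four steps above can be sketched directly in NumPy. This is a minimal illustration, not a production implementation; the function name <code>classical_mds</code> and the example data are chosen here for demonstration only.

```python
import numpy as np

def classical_mds(D, m=2):
    """Classical MDS (PCoA): embed n objects in m dimensions,
    given an n x n matrix D of pairwise metric distances."""
    n = D.shape[0]
    # Step 1: squared proximity matrix D^(2)
    D2 = D ** 2
    # Step 2: double centering, B = -1/2 * C D^(2) C with C = I - (1/n) J
    C = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * C @ D2 @ C
    # Step 3: the m largest eigenvalues/eigenvectors of the symmetric matrix B
    eigvals, eigvecs = np.linalg.eigh(B)   # eigh returns ascending order
    idx = np.argsort(eigvals)[::-1][:m]    # indices of the m largest
    L = eigvals[idx]
    E = eigvecs[:, idx]
    # Step 4: coordinates X = E_m * Lambda_m^(1/2)
    # (clip tiny negative eigenvalues caused by floating-point error)
    return E * np.sqrt(np.maximum(L, 0.0))

# Example: four points on a line, recovered from their distance matrix.
pts = [0.0, 1.0, 2.0, 4.0]
D = np.abs(np.subtract.outer(pts, pts))
X = classical_mds(D, m=1)
# The pairwise distances of X reproduce D (up to reflection/translation).
```

Because the input distances in the example are exactly Euclidean, the one-dimensional embedding reproduces the original spacing of the points; for non-Euclidean dissimilarities, some eigenvalues of <math>B</math> become negative and the embedding is only approximate.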