==== Reduced Mixing Problem ====

'''Independent component analysis''' ('''ICA''') addresses the problem of recovering a set of unobserved source signals <math>s_i = (s_{i1}, s_{i2}, \dots, s_{im})^T</math> from observed mixed signals <math>x_i = (x_{i1}, x_{i2}, \dots, x_{im})^T</math>, based on the linear mixing model

:<math>x_i = A\,s_i,</math>

where <math>A</math> is an <math>m \times m</math> invertible matrix called the '''mixing matrix''', <math>s_i</math> is the ''m''-dimensional vector of source values at time <math>t_i</math>, and <math>x_i</math> is the corresponding vector of observed values at time <math>t_i</math>. The goal is to estimate both <math>A</math> and the source signals <math>\{s_i\}</math> solely from the observed data <math>\{x_i\}</math>.

After centering the observations to form <math>X^*</math>, the Gram matrix is diagonalized as

:<math>(X^*)^T X^* = Q\,D\,Q^T,</math>

where <math>D</math> is a diagonal matrix with positive entries (assuming <math>X^*</math> has maximum rank) and <math>Q</math> is an orthogonal matrix.<ref name="Springer"/> Because the normalized source signals are taken to be orthonormal, this Gram matrix equals <math>AA^T</math>. Writing the singular value decomposition of the mixing matrix as <math>A = U \Sigma V^T</math>, so that <math>AA^T = U \Sigma^2 U^T</math>, and comparing the two factorizations gives <math>U = Q</math> and <math>\Sigma = D^{1/2}</math>; hence the mixing matrix has the form

:<math>A = Q\,D^{1/2}\,V^T.</math>

The normalized source values therefore satisfy <math>s_i^* = V\,y_i^*</math>, where <math>y_i^* = D^{-\tfrac12}Q^T x_i^*</math> are the whitened observations. Thus, ICA reduces to finding the orthogonal matrix <math>V</math>, which can be computed by optimization, for example via projection pursuit methods (see [[#Projection pursuit|Projection Pursuit]]).<ref name="Springer"/>

Well-known algorithms for ICA include [[infomax]], [[FastICA]], [[JADE (ICA)|JADE]], and [[kernel-independent component analysis]], among others. In general, ICA cannot identify the actual number of source signals, a uniquely correct ordering of the source signals, or the proper scaling (including sign) of the source signals.

ICA is important to [[blind signal separation]] and has many practical applications. It is closely related to (or even a special case of) the search for a [[factorial code]] of the data, i.e., a new vector-valued representation of each data vector such that it is uniquely encoded by the resulting code vector (loss-free coding) while the code components are statistically independent.
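The reduction above amounts to centering and whitening the observed data before searching for <math>V</math>. The following is a minimal NumPy sketch of that step, assuming the observations are stored row-wise in an array; the function name <code>whiten</code> and the toy Laplace-distributed sources are illustrative only and do not come from any particular ICA library.

<syntaxhighlight lang="python">
import numpy as np

def whiten(X):
    """Reduce the ICA problem to finding an orthogonal matrix V.

    X is an (n, m) array whose rows are the observed vectors x_i.
    Returns the whitened rows y_i^* together with Q and diag(D)
    from the diagonalization (X*)^T X* = Q D Q^T.
    """
    # Center the observations (subtract each column's mean).
    X_star = X - X.mean(axis=0)

    # Eigendecomposition of the Gram matrix of the centered data;
    # eigh is used because the Gram matrix is symmetric.
    eigvals, Q = np.linalg.eigh(X_star.T @ X_star)

    # Maximum-rank assumption: every eigenvalue must be positive.
    if np.any(eigvals <= 0):
        raise ValueError("centered data does not have full column rank")

    # Whitened coordinates y_i^* = D^{-1/2} Q^T x_i^*, applied row-wise.
    Y = (X_star @ Q) / np.sqrt(eigvals)
    return Y, Q, eigvals

# Toy example: mix three independent non-Gaussian sources.
rng = np.random.default_rng(0)
S = rng.laplace(size=(1000, 3))    # rows are source vectors s_i
A = rng.normal(size=(3, 3))        # unknown mixing matrix
X = S @ A.T                        # observed mixtures, x_i = A s_i

Y, Q, d = whiten(X)
print(np.allclose(Y.T @ Y, np.eye(3)))  # True: whitened Gram matrix is the identity
</syntaxhighlight>

After this step only the orthogonal matrix <math>V</math> remains to be estimated, for example by projection pursuit or an algorithm such as [[FastICA]].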