{{Short description|Covariance and correlation}}
{{Correlation and covariance}}
[[File:Comparison convolution correlation.svg|thumb|400px|Visual comparison of [[convolution]], cross-correlation and [[autocorrelation]]. For the operations involving function {{mvar|f}}, and assuming the height of {{mvar|f}} is 1.0, the value of the result at 5 different points is indicated by the shaded area below each point. Also, the vertical symmetry of {{mvar|f}} is the reason <math>f*g</math> and <math>f \star g</math> are identical in this example.]]
In [[signal processing]], '''cross-correlation''' is a [[Similarity measure|measure of similarity]] of two series as a function of the displacement of one relative to the other. This is also known as a ''sliding [[dot product]]'' or ''sliding inner-product''. It is commonly used for searching a long signal for a shorter, known feature. It has applications in [[pattern recognition]], [[single particle analysis]], [[electron tomography]], [[averaging]], [[cryptanalysis]], and [[neurophysiology]]. The cross-correlation is similar in nature to the [[convolution]] of two functions. In an [[autocorrelation]], which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.

In [[probability]] and [[statistics]], the term ''cross-correlations'' refers to the [[covariance and correlation|correlations]] between the entries of two [[Multivariate random variable|random vectors]] <math>\mathbf{X}</math> and <math>\mathbf{Y}</math>, while the ''correlations'' of a random vector <math>\mathbf{X}</math> are the correlations between the entries of <math>\mathbf{X}</math> itself, which together form the [[correlation matrix]] of <math>\mathbf{X}</math>.
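As an illustrative sketch (not part of the article), the "sliding dot product" view of cross-correlation can be demonstrated with NumPy: a short known feature is slid along a longer noisy signal, and the lag with the largest correlation locates the feature. The signal, feature shape, and offset below are arbitrary example choices.

```python
import numpy as np

# A long noisy signal containing a known short feature at offset 40.
rng = np.random.default_rng(0)
feature = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
signal = 0.1 * rng.standard_normal(100)
signal[40:45] += feature

# Cross-correlation as a sliding dot product: at each lag, the feature
# is multiplied elementwise with a window of the signal and summed.
# np.correlate with mode="valid" computes exactly this.
cc = np.correlate(signal, feature, mode="valid")

# The largest peak marks the lag where the feature best matches.
print(np.argmax(cc))  # 40
```

Because the feature's energy dominates the weak background noise here, the peak of the cross-correlation lands exactly at the true offset.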
If each of <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> is a scalar random variable which is realized repeatedly in a [[time series]], then the correlations of the various temporal instances of <math>\mathbf{X}</math> are known as ''autocorrelations'' of <math>\mathbf{X}</math>, and the cross-correlations of <math>\mathbf{X}</math> with <math>\mathbf{Y}</math> across time are temporal cross-correlations. In probability and statistics, the definition of correlation always includes a standardising factor in such a way that correlations have values between −1 and +1.

If <math>X</math> and <math>Y</math> are two [[independent (probability)|independent]] [[random variable]]s with [[probability density function]]s <math>f</math> and <math>g</math>, respectively, then the probability density of the difference <math>Y - X</math> is formally given by the cross-correlation (in the signal-processing sense) <math>f \star g</math>; however, this terminology is not used in probability and statistics. In contrast, the [[convolution]] <math>f * g</math> (equivalent to the cross-correlation of <math>f(t)</math> and <math>g(-t)</math>) gives the probability density function of the sum <math>X + Y</math>.
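A small discrete sketch (my own example, not from the article) makes the sum/difference distinction concrete: for independent integer-valued variables with probability mass functions f and g, convolution yields the distribution of the sum, while cross-correlation (convolution with one input time-reversed) yields the distribution of the difference.

```python
import numpy as np

# PMFs of two independent integer-valued random variables
# X ~ f on {0, 1, 2} and Y ~ g on {0, 1, 2} (arbitrary example values).
f = np.array([0.5, 0.3, 0.2])
g = np.array([0.2, 0.3, 0.5])

# Convolution f*g: distribution of the sum X + Y on {0, ..., 4}.
p_sum = np.convolve(f, g)

# Cross-correlation: distribution of the difference Y - X on {-2, ..., 2}.
# p_diff[k] = P(Y - X = k - 2); e.g. p_diff[2] = sum_x f[x] * g[x].
p_diff = np.correlate(g, f, mode="full")

# Cross-correlation equals convolution with one input time-reversed,
# mirroring f*g being the cross-correlation of f(t) and g(-t).
assert np.allclose(p_diff, np.convolve(g, f[::-1]))

print(p_sum.sum(), p_diff.sum())  # both 1.0
```

Both results are valid probability mass functions (they sum to 1), and P(Y − X = 0) here is 0.5·0.2 + 0.3·0.3 + 0.2·0.5 = 0.29.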