In statistics, the Bhattacharyya distance is a quantity which represents a notion of similarity between two probability distributions.<ref>Template:Cite book</ref> It is closely related to the Bhattacharyya coefficient, which is a measure of the amount of overlap between two statistical samples or populations.

It is not a metric, despite being named a "distance", since it does not obey the triangle inequality.

History

Both the Bhattacharyya distance and the Bhattacharyya coefficient are named after Anil Kumar Bhattacharyya, a statistician who worked in the 1930s at the Indian Statistical Institute.<ref name=":0">Template:Cite journal</ref> He developed the measure through a series of papers.<ref name=":1" /><ref name=":3">Template:Cite journal</ref><ref name=":2">Template:Cite journal</ref> He first devised a method to measure the distance between two non-normal distributions and illustrated it with the classical multinomial populations;<ref name=":1">Template:Cite journal</ref> although this work was submitted for publication in 1941, it appeared almost five years later in Sankhya.<ref name=":1" /><ref name=":0" /> Bhattacharyya then worked toward a distance measure for probability distributions that are absolutely continuous with respect to the Lebesgue measure, publishing his progress in 1942 in the Proceedings of the Indian Science Congress<ref name=":3" /> and the final work in 1943 in the Bulletin of the Calcutta Mathematical Society.<ref name=":2" />

Definition

For probability distributions <math>P</math> and <math>Q</math> on the same domain <math>\mathcal{X}</math>, the Bhattacharyya distance is defined as

<math>D_B(P,Q) = -\ln \left( BC(P,Q) \right)</math>

where

<math>BC(P,Q) = \sum_{x\in \mathcal{X}} \sqrt{P(x) Q(x)}</math>

is the Bhattacharyya coefficient for discrete probability distributions.
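
For example, the discrete coefficient and distance can be computed directly from two probability vectors. The following is a minimal Python sketch (the function names are illustrative, not from any particular library):

<syntaxhighlight lang="python">
import numpy as np

def bhattacharyya_coefficient(p, q):
    """Bhattacharyya coefficient of two discrete distributions given as probability vectors."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(np.sqrt(p * q)))

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance D_B(P, Q) = -ln BC(P, Q)."""
    return -np.log(bhattacharyya_coefficient(p, q))

p = [0.1, 0.2, 0.3, 0.4]
q = [0.4, 0.3, 0.2, 0.1]
print(bhattacharyya_coefficient(p, q))  # ≈ 0.89; equals 1 only when p == q
print(bhattacharyya_distance(p, q))     # ≈ 0.12; equals 0 only when p == q
</syntaxhighlight>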

For continuous probability distributions, with <math>P(dx) = p(x)dx</math> and <math>Q(dx) = q(x) dx</math> where <math>p(x)</math> and <math>q(x)</math> are the probability density functions, the Bhattacharyya coefficient is defined as

<math>BC(P,Q) = \int_{\mathcal{X}} \sqrt{p(x) q(x)}\, dx</math>.
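
When the densities have a closed form, this integral can be evaluated numerically. A minimal sketch, assuming SciPy is available, using two exponential densities whose coefficient is known analytically to be <math>2\sqrt{\lambda_1\lambda_2}/(\lambda_1+\lambda_2)</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

def bc_continuous(p, q, lo, hi):
    """Bhattacharyya coefficient of two densities by numerical integration over [lo, hi]."""
    value, _ = quad(lambda x: np.sqrt(p(x) * q(x)), lo, hi)
    return value

lam1, lam2 = 1.0, 3.0
p = lambda x: lam1 * np.exp(-lam1 * x)  # Exp(1) density on [0, inf)
q = lambda x: lam2 * np.exp(-lam2 * x)  # Exp(3) density on [0, inf)

print(bc_continuous(p, q, 0.0, np.inf))          # numerical value
print(2 * np.sqrt(lam1 * lam2) / (lam1 + lam2))  # analytic value, ≈ 0.866
</syntaxhighlight>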

More generally, given two probability measures <math>P, Q</math> on a measurable space <math>(\mathcal X, \mathcal B)</math>, let <math>\lambda</math> be a (σ-finite) measure such that <math>P</math> and <math>Q</math> are absolutely continuous with respect to <math>\lambda</math>, i.e. such that <math>P(dx) = p(x)\lambda(dx)</math> and <math>Q(dx) = q(x)\lambda(dx)</math> for probability density functions <math>p, q</math> with respect to <math>\lambda</math>, defined <math>\lambda</math>-almost everywhere. Such a measure, even such a probability measure, always exists, e.g. <math>\lambda = \tfrac12(P + Q)</math>. Then define the Bhattacharyya measure on <math>(\mathcal X, \mathcal B)</math> by

<math> bc(dx |P,Q) = \sqrt{p(x)q(x)}\, \lambda(dx) = \sqrt{\frac{P(dx)}{\lambda(dx)}(x)\frac{Q(dx)}{\lambda(dx)}(x)}\lambda(dx).</math>

The Bhattacharyya measure does not depend on the choice of <math>\lambda</math>: if <math>\mu</math> is another measure with respect to which both <math>\lambda</math> and an alternative choice <math>\lambda'</math> are absolutely continuous, i.e. <math>\lambda(dx) = l(x)\mu(dx)</math> and <math>\lambda'(dx) = l'(x) \mu(dx)</math>, then

<math>P(dx) = p(x)\lambda(dx) = p'(x)\lambda'(dx) = p(x)l(x) \mu(dx) = p'(x)l'(x)\mu(dx)</math>,

and similarly for <math>Q</math>. We then have

<math>bc(dx |P,Q) = \sqrt{p(x) q(x)}\, \lambda(dx) = \sqrt{p(x)q(x)}\, l(x)\,\mu(dx) = \sqrt{p(x)l(x)\, q(x)l(x)}\,\mu(dx) = \sqrt{p'(x)l'(x)\, q'(x)l'(x)}\, \mu(dx) = \sqrt{p'(x)q'(x)}\,\lambda'(dx)</math>.

We finally define the Bhattacharyya coefficient

<math> BC(P,Q) = \int_{\mathcal X} bc(dx|P,Q) = \int_{\mathcal{X}} \sqrt{p(x) q(x)}\, \lambda(dx)</math>.

By the above, the quantity <math>BC(P,Q)</math> does not depend on <math>\lambda</math>, and by the Cauchy–Schwarz inequality <math>0\le BC(P,Q) \le 1 </math>. Using <math>P(dx) = p(x)\lambda(dx)</math> and <math>Q(dx) = q(x)\lambda(dx)</math>, <math display="block">BC(P, Q) = \int_{\mathcal X} \sqrt{\frac{p(x)}{q(x)}}\, Q(dx) = \int_{\mathcal X} \sqrt{\frac{P(dx)}{Q(dx)}}\, Q(dx) = E_Q\left[\sqrt{\frac{P(dx)}{Q(dx)}}\right]. </math>
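
The last expression suggests a simple Monte Carlo estimator of the coefficient: draw samples from <math>Q</math> and average <math>\sqrt{p(x)/q(x)}</math>. A minimal sketch, assuming both densities can be evaluated pointwise:

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Example densities: P = N(0, 1) and Q = N(1, 2^2).
p = norm(loc=0.0, scale=1.0)
q = norm(loc=1.0, scale=2.0)

x = q.rvs(size=100_000, random_state=rng)       # samples drawn from Q
bc_mc = np.mean(np.sqrt(p.pdf(x) / q.pdf(x)))   # E_Q[ sqrt(dP/dQ) ]
print(bc_mc)  # ≈ 0.85 for this pair (cf. the Gaussian closed form below)
</syntaxhighlight>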

Gaussian case

Let <math>p\sim\mathcal{N}(\mu_p,\sigma_p^2)</math>, <math>q\sim\mathcal{N}(\mu_q,\sigma_q^2)</math>, where <math>{\mathcal {N}}(\mu ,\sigma ^{2})</math> is the normal distribution with mean <math>\mu</math> and variance <math>\sigma^2</math>; then

<math>D_B(p,q) = \frac{1}{4} \frac{(\mu_p-\mu_q)^{2}}{\sigma_p^2+\sigma_q^2} + \frac 1 2 \ln\left(\frac{\sigma^2_p + \sigma^2_q}{2\sigma_p\sigma_q}\right)</math>.
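
A minimal Python sketch of this closed form (the function name is illustrative):

<syntaxhighlight lang="python">
import numpy as np

def bhattacharyya_gaussian(mu_p, sigma_p, mu_q, sigma_q):
    """Bhattacharyya distance between N(mu_p, sigma_p^2) and N(mu_q, sigma_q^2)."""
    var_p, var_q = sigma_p ** 2, sigma_q ** 2
    mean_term = 0.25 * (mu_p - mu_q) ** 2 / (var_p + var_q)
    var_term = 0.5 * np.log((var_p + var_q) / (2.0 * sigma_p * sigma_q))
    return mean_term + var_term

print(bhattacharyya_gaussian(0.0, 1.0, 1.0, 2.0))  # ≈ 0.16; zero when the two normals coincide
</syntaxhighlight>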

And in general, given two multivariate normal distributions <math>p_i=\mathcal{N}(\boldsymbol\mu_i,\,\boldsymbol\Sigma_i)</math>,

<math>D_B(p_1, p_2)={1\over 8}(\boldsymbol\mu_1-\boldsymbol\mu_2)^T \boldsymbol\Sigma^{-1}(\boldsymbol\mu_1-\boldsymbol\mu_2)+{1\over 2}\ln \,\left({\det \boldsymbol\Sigma \over \sqrt{\det \boldsymbol\Sigma_1 \, \det \boldsymbol\Sigma_2} }\right)</math>,

where <math>\boldsymbol\Sigma={\boldsymbol\Sigma_1+\boldsymbol\Sigma_2 \over 2}.</math><ref>Template:Cite journal</ref> Note that the first term is one-eighth of the squared Mahalanobis distance between the means, computed with the averaged covariance <math>\boldsymbol\Sigma</math>.
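
A corresponding sketch for the multivariate formula, using NumPy (log-determinants are used for numerical stability):

<syntaxhighlight lang="python">
import numpy as np

def bhattacharyya_mvn(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two multivariate normal distributions."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov = 0.5 * (cov1 + cov2)                      # averaged covariance Sigma
    diff = mu1 - mu2
    mean_term = 0.125 * diff @ np.linalg.solve(cov, diff)
    _, logdet = np.linalg.slogdet(cov)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    cov_term = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
    return mean_term + cov_term

mu1, cov1 = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
mu2, cov2 = [1.0, 1.0], [[2.0, 0.3], [0.3, 1.0]]
print(bhattacharyya_mvn(mu1, cov1, mu2, cov2))
</syntaxhighlight>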

Properties

<math>0 \le BC \le 1</math> and <math>0 \le D_B \le \infty</math>.

<math>D_B</math> does not obey the triangle inequality, though the Hellinger distance <math>\sqrt{1-BC(p,q)}</math> does.

Bounds on Bayes error

The Bhattacharyya distance can be used to upper and lower bound the Bayes error rate:

<math display='block'> \frac{1}{2} - \frac{1}{2}\sqrt{1-4\rho^2} \leq L^* \leq \rho</math>

where <math>\rho = \mathbb E \sqrt {\eta(X)(1-\eta(X))}</math> and <math>\eta(X) = \mathbb P(Y=1 | X)</math> is the posterior probability.<ref>Devroye, L., Gyorfi, L. & Lugosi, G. A Probabilistic Theory of Pattern Recognition. Discrete Appl Math 73, 192–194 (1997).</ref>
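
For two equally likely classes with densities <math>p_0</math> and <math>p_1</math>, one has <math>\eta(x)(1-\eta(x)) = p_0(x)p_1(x)/(p_0(x)+p_1(x))^2</math> and the marginal density is <math>\tfrac12(p_0+p_1)</math>, so <math>\rho = \tfrac12 BC(p_0, p_1)</math>. A minimal numeric check of the bounds, assuming two unit-variance Gaussian classes, for which the exact Bayes error is <math>\Phi(-\Delta\mu/2)</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import norm

# Two equally likely classes: N(0, 1) and N(delta, 1).
delta = 2.0

# Closed-form Bhattacharyya coefficient for equal unit variances: BC = exp(-delta^2 / 8),
# and rho = BC / 2 for equal class priors.
bc = np.exp(-delta ** 2 / 8.0)
rho = 0.5 * bc

lower = 0.5 - 0.5 * np.sqrt(1.0 - 4.0 * rho ** 2)
upper = rho
bayes_error = norm.cdf(-delta / 2.0)   # exact Bayes error for this toy problem
print(lower, bayes_error, upper)       # lower <= L* <= upper holds: ≈ 0.10 <= 0.16 <= 0.30
</syntaxhighlight>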

Applications

The Bhattacharyya coefficient quantifies the "closeness" of two random statistical samples.

Given two sequences of samples drawn from distributions <math>P</math> and <math>Q</math>, bin them into <math>n</math> buckets, and let the relative frequency of samples from <math>P</math> in bucket <math>i</math> be <math>p_i</math>, and similarly for <math>q_i</math>; then the sample Bhattacharyya coefficient is

<math>BC(\mathbf{p},\mathbf{q}) = \sum_{i=1}^n \sqrt{p_i q_i},</math>

which is an estimator of <math>BC(P, Q)</math>. The quality of the estimate depends on the choice of buckets: too few buckets tend to overestimate <math>BC(P, Q)</math>, while too many tend to underestimate it.
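
A minimal sketch of this histogram estimator, with the number of buckets as the tuning parameter discussed above:

<syntaxhighlight lang="python">
import numpy as np

def sample_bhattacharyya(samples_p, samples_q, n_bins=20):
    """Estimate BC(P, Q) from two samples using a shared set of histogram bins."""
    lo = min(samples_p.min(), samples_q.min())
    hi = max(samples_p.max(), samples_q.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    p_counts, _ = np.histogram(samples_p, bins=edges)
    q_counts, _ = np.histogram(samples_q, bins=edges)
    p_i = p_counts / p_counts.sum()   # relative frequencies per bucket
    q_i = q_counts / q_counts.sum()
    return float(np.sum(np.sqrt(p_i * q_i)))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5_000)
y = rng.normal(1.0, 2.0, size=5_000)
print(sample_bhattacharyya(x, y))  # compare with the closed-form BC ≈ 0.85 for these two normals
</syntaxhighlight>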

A common task in classification is estimating the separability of classes. Up to a multiplicative factor, the squared Mahalanobis distance is a special case of the Bhattacharyya distance when the two classes are normally distributed with the same variances. When two classes have similar means but significantly different variances, the Mahalanobis distance would be close to zero, while the Bhattacharyya distance would not be.
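
With the univariate Gaussian formula above, this contrast can be made concrete: for equal means the first (Mahalanobis-type) term vanishes, but the variance term still separates the two distributions.

<syntaxhighlight lang="python">
import numpy as np

# Same mean, very different spreads.
mu, sigma_p, sigma_q = 0.0, 1.0, 5.0
mean_term = 0.25 * (mu - mu) ** 2 / (sigma_p ** 2 + sigma_q ** 2)   # = 0 (Mahalanobis-type term)
var_term = 0.5 * np.log((sigma_p ** 2 + sigma_q ** 2) / (2 * sigma_p * sigma_q))
print(mean_term, var_term)  # 0.0 and ≈ 0.48: D_B is clearly nonzero
</syntaxhighlight>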

The Bhattacharyya coefficient is used in the construction of polar codes.<ref>Template:Cite journal</ref>

The Bhattacharyya distance is used in feature extraction and selection,<ref>Euisun Choi, Chulhee Lee, "Feature extraction based on the Bhattacharyya distance", Pattern Recognition, Volume 36, Issue 8, August 2003, Pages 1703–1709</ref> image processing,<ref name="Goudail">François Goudail, Philippe Réfrégier, Guillaume Delyon, "Bhattacharyya distance as a contrast parameter for statistical processing of noisy optical images", JOSA A, Vol. 21, Issue 7, pp. 1231−1240 (2004)</ref> speaker recognition,<ref name="You">Chang Huai You, "An SVM Kernel With GMM-Supervector Based on the Bhattacharyya Distance for Speaker Recognition", Signal Processing Letters, IEEE, Vol 16, Is 1, pp. 49-52</ref> phone clustering,<ref name="Mak">Mak, B., "Phone clustering using the Bhattacharyya distance", Spoken Language, 1996. ICSLP 96. Proceedings., Fourth International Conference on, Vol 4, pp. 2005–2008 vol.4, 3−6 Oct 1996</ref> and in genetics.<ref>Template:Cite journal</ref>


References

Template:Reflist
