{{Short description|Statistical test comparing two probability distributions}}
{{CS1 config|mode=cs1}}
[[File:KS Example.png|thumb|300px|Illustration of the Kolmogorov–Smirnov statistic. The red line is a model [[Cumulative distribution function|CDF]], the blue line is an [[Empirical distribution function|empirical CDF]], and the black arrow is the KS statistic.]]
In [[statistics]], the '''Kolmogorov–Smirnov test''' (also '''K–S test''' or '''KS test''') is a [[nonparametric statistics|nonparametric test]] of the equality of continuous (or discontinuous, see [[#Discrete and mixed null distribution|Section 2.2]]), one-dimensional [[probability distribution]]s. It can be used to test whether a [[random sample|sample]] came from a given reference probability distribution (one-sample K–S test), or to test whether two samples came from the same distribution (two-sample K–S test). Intuitively, it provides a method to quantitatively answer the question "How likely is it that we would see a collection of samples like this if they were drawn from that probability distribution?" or, in the second case, "How likely is it that we would see two sets of samples like this if they were drawn from the same (but unknown) probability distribution?". It is named after [[Andrey Kolmogorov]] and [[Nikolai Smirnov (mathematician)|Nikolai Smirnov]].

The Kolmogorov–Smirnov statistic quantifies a [[metric (mathematics)|distance]] between the [[empirical distribution function]] of the sample and the [[cumulative distribution function]] of the reference distribution, or between the empirical distribution functions of two samples. The [[null distribution]] of this statistic is calculated under the [[null hypothesis]] that the sample is drawn from the reference distribution (in the one-sample case) or that the samples are drawn from the same distribution (in the two-sample case). In the one-sample case, the distribution considered under the null hypothesis may be continuous (see [[#Kolmogorov distribution|Section 2]]), purely discrete or mixed (see [[#Discrete and mixed null distribution|Section 2.2]]). In the two-sample case (see [[#Two-sample Kolmogorov–Smirnov test|Section 3]]), the distribution considered under the null hypothesis is a continuous distribution but is otherwise unrestricted. The two-sample K–S test is one of the most useful and general [[Nonparametric statistics|nonparametric methods]] for comparing two samples, as it is sensitive to differences in both location and shape of the empirical cumulative distribution functions of the two samples.

The Kolmogorov–Smirnov test can be modified to serve as a [[goodness of fit]] test. In the special case of testing for [[Normal distribution|normality]] of the distribution, samples are standardized and compared with a standard normal distribution. This is equivalent to setting the mean and variance of the reference distribution equal to the sample estimates, and it is known that using these to define the specific reference distribution changes the null distribution of the test statistic (see [[#Test with estimated parameters|Test with estimated parameters]]). Various studies have found that, even in this corrected form, the test is less [[Power_of_a_test|powerful]] for testing normality than the [[Shapiro–Wilk test]] or [[Anderson–Darling test]].<ref>{{cite journal | first = M. A.
| last = Stephens | year = 1974 | title = EDF Statistics for Goodness of Fit and Some Comparisons | journal = Journal of the American Statistical Association | volume = 69 | issue = 347| pages = 730–737 | jstor =2286009 | doi = 10.2307/2286009 }}</ref> However, these other tests have their own disadvantages. For instance, the Shapiro–Wilk test is known not to work well in samples with many identical values.

==One-sample Kolmogorov–Smirnov statistic==
The [[empirical distribution function]] ''F''<sub>''n''</sub> for ''n'' [[Independent and identically distributed random variables|independent and identically distributed]] (i.i.d.) ordered observations ''X<sub>i</sub>'' is defined as
<math display="block"> F_{n}(x)=\frac{\text{number of elements in the sample} \leq x}{n}=\frac{1}{n} \sum_{i=1}^{n} 1_{(-\infty,x]}(X_{i}), </math>
where <math>1_{(-\infty,x]}(X_i)</math> is the [[indicator function]], equal to 1 if <math>X_i \leq x</math> and equal to 0 otherwise.

The Kolmogorov–Smirnov [[statistic]] for a given [[cumulative distribution function]] ''F''(''x'') is
<math display="block">D_n= \sup_x |F_n(x)-F(x)|</math>
where sup<sub>''x''</sub> is the [[supremum]] of the set of distances. Intuitively, the statistic takes the largest absolute difference between the two distribution functions across all ''x'' values.

By the [[Glivenko–Cantelli theorem]], if the sample comes from the distribution ''F''(''x''), then ''D''<sub>''n''</sub> converges to 0 [[almost surely]] in the limit when <math>n</math> goes to infinity. Kolmogorov strengthened this result by effectively providing the rate of this convergence (see [[#Kolmogorov distribution|Kolmogorov distribution]]). [[Donsker's theorem]] provides a yet stronger result.

In practice, the statistic requires a relatively large number of data points (in comparison to other goodness-of-fit criteria such as the [[Anderson–Darling test]] statistic) to properly reject the null hypothesis.

==Kolmogorov distribution==
[[File:KolmogorovDistrPDF.png|thumb|600px|Illustration of the Kolmogorov distribution's [[probability density function|PDF]]]]
The Kolmogorov distribution is the distribution of the [[random variable]]
<math display="block">K=\sup_{t\in[0,1]}|B(t)|</math>
where ''B''(''t'') is the [[Brownian bridge]]. The [[cumulative distribution function]] of ''K'' is given by<ref>{{Cite journal |vauthors=Marsaglia G, Tsang WW, Wang J |year=2003 |title=Evaluating Kolmogorov's Distribution |journal=Journal of Statistical Software |volume=8 |issue=18 |pages=1–4 |doi=10.18637/jss.v008.i18 |doi-access=free }}</ref>
<math display="block">\begin{align} \operatorname{Pr}(K\leq x) &= 1-2\sum_{k=1}^\infty (-1)^{k-1} e^{-2k^2 x^2} \\ &=\frac{\sqrt{2\pi}}{x}\sum_{k=1}^\infty e^{-(2k-1)^2\pi^2/(8x^2)}, \end{align}</math>
which can also be expressed by the [[Jacobi theta function]] <math>\vartheta_{01}(z=0;\tau=2ix^2/\pi)</math>. Both the form of the Kolmogorov–Smirnov test statistic and its asymptotic distribution under the null hypothesis were published by [[Andrey Kolmogorov]],<ref name=AK>{{Cite journal |author=Kolmogorov A |year=1933 |title=Sulla determinazione empirica di una legge di distribuzione |journal=G. Ist. Ital.
Attuari |volume=4 |pages=83–91}}</ref> while a table of the distribution was published by [[Nikolai Smirnov (mathematician)|Nikolai Smirnov]].<ref>{{Cite journal |author=Smirnov N |year=1948 |title=Table for estimating the goodness of fit of empirical distributions |journal=[[Annals of Mathematical Statistics]] |volume=19 |issue=2 |pages=279–281 |doi=10.1214/aoms/1177730256|doi-access=free }}</ref> Recurrence relations for the distribution of the test statistic in finite samples are available.<ref name=AK/>

Under the null hypothesis that the sample comes from the hypothesized distribution ''F''(''x''),
<math display="block">\sqrt{n}D_n\xrightarrow{n\to\infty}\sup_t |B(F(t))|</math>
[[convergence of random variables|in distribution]], where ''B''(''t'') is the Brownian bridge. If ''F'' is continuous, then under the null hypothesis <math>\sqrt{n}D_n</math> converges to the Kolmogorov distribution, which does not depend on ''F''. This result may also be known as the Kolmogorov theorem.

The accuracy of this limit as an approximation to the exact CDF of <math>\sqrt{n}D_n</math> when <math>n</math> is finite is not very impressive: even when <math>n=1000</math>, the corresponding maximum error is about <math>0.9~\%</math>; this error increases to <math>2.6~\%</math> when <math>n=100</math> and to a totally unacceptable <math>7~\%</math> when <math>n=10</math>. However, a very simple expedient of replacing <math>x</math> by
<math display="block">x+\frac{1}{6\sqrt{n}}+ \frac{x-1}{4n}</math>
in the argument of the Jacobi theta function reduces these errors to <math>0.003~\%</math>, <math>0.027~\%</math>, and <math>0.27~\%</math> respectively; such accuracy would usually be considered more than adequate for all practical applications.<ref>{{Cite journal |vauthors=Vrbik, Jan |year=2018 |title=Small-Sample Corrections to Kolmogorov–Smirnov Test Statistic |journal=Pioneer Journal of Theoretical and Applied Statistics |volume=15 |issue=1–2 |pages=15–23}}</ref>

The ''goodness-of-fit'' test, or the Kolmogorov–Smirnov test, can be constructed by using the critical values of the Kolmogorov distribution. This test is asymptotically valid when <math>n \to\infty.</math> It rejects the null hypothesis at level <math>\alpha</math> if
<math display="block">\sqrt{n}D_n>K_\alpha,\,</math>
where ''K''<sub>''α''</sub> is found from
<math display="block">\operatorname{Pr}(K\leq K_\alpha)=1-\alpha.\,</math>
The asymptotic [[statistical power|power]] of this test is 1.
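For illustration, both <math>D_n</math> and the critical value <math>K_\alpha</math> can be computed directly from the definitions above. The following Python sketch uses only NumPy and SciPy; the helper names <code>ks_statistic</code> and <code>kolmogorov_cdf</code> are illustrative rather than part of any library (in practice, <code>scipy.stats.kstest</code> performs the whole test):
<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import norm

def ks_statistic(sample, cdf):
    """One-sample KS statistic D_n = sup_x |F_n(x) - F(x)| for a continuous F."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    f = cdf(x)
    # For continuous F the supremum is attained at a sample point,
    # approaching it from above (D+) or from below (D-).
    d_plus = np.max(np.arange(1, n + 1) / n - f)
    d_minus = np.max(f - np.arange(0, n) / n)
    return max(d_plus, d_minus)

def kolmogorov_cdf(x, terms=100):
    """Pr(K <= x) via the alternating series given above."""
    k = np.arange(1, terms + 1)
    return 1 - 2 * np.sum((-1.0) ** (k - 1) * np.exp(-2 * k**2 * x**2))

rng = np.random.default_rng(0)
sample = rng.normal(size=1000)
d_n = ks_statistic(sample, norm.cdf)

# Solve Pr(K <= K_alpha) = 1 - alpha by bisection (K_0.05 is about 1.358).
alpha, lo, hi = 0.05, 0.5, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if kolmogorov_cdf(mid) < 1 - alpha else (lo, mid)
k_alpha = (lo + hi) / 2

print(np.sqrt(len(sample)) * d_n > k_alpha)  # True -> reject H0 at level alpha
</syntaxhighlight>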
Fast and accurate algorithms to compute the cdf <math>\operatorname{Pr}(D_n \leq x)</math> or its complement for arbitrary <math>n</math> and <math>x</math> are available:
* for continuous null distributions,<ref name=SL2011>{{Cite journal |vauthors=Simard R, L'Ecuyer P |year=2011 |title=Computing the Two-Sided Kolmogorov–Smirnov Distribution |journal=Journal of Statistical Software |volume=39 |issue=11 |pages=1–18 |doi=10.18637/jss.v039.i11 |doi-access=free }}</ref><ref>{{Cite journal |vauthors=Moscovich A, Nadler B |year=2017 |title=Fast calculation of boundary crossing probabilities for Poisson processes |journal=Statistics and Probability Letters |volume=123 |pages=177–182 |doi=10.1016/j.spl.2016.11.027|arxiv=1503.04363 |s2cid=12868694 }}</ref> with code in C and Java;<ref name=SL2011/>
* for purely discrete, mixed or continuous null distributions,<ref name=DKT2019>{{Cite journal |vauthors=Dimitrova DS, Kaishev VK, Tan S |year=2020 |title=Computing the Kolmogorov–Smirnov Distribution when the Underlying cdf is Purely Discrete, Mixed or Continuous |journal=Journal of Statistical Software |volume=95 |issue=10 |pages=1–42 |doi= 10.18637/jss.v095.i10 |doi-access=free }}</ref> implemented in the KSgeneral package<ref name=KSgeneral>{{Cite web|url=https://CRAN.R-project.org/package=KSgeneral |title=KSgeneral: Computing P-Values of the One-Sample K-S Test and the Two-Sample K-S and Kuiper Tests for (Dis)Continuous Null Distribution|last1=Dimitrova|first1=Dimitrina |last2=Yun|first2=Jia | last3=Kaishev| first3=Vladimir | last4=Tan|first4=Senren|website=CRAN.R-project.org/package=KSgeneral|date=21 May 2024}}</ref> of the [[R (programming language)|R project for statistical computing]], which for a given sample also computes the KS test statistic and its p-value; an alternative C++ implementation is also available.<ref name=DKT2019/>

===Test with estimated parameters===
If either the form or the parameters of ''F''(''x'') are determined from the data ''X''<sub>''i''</sub>, the critical values determined in this way are invalid. In such cases, [[Monte Carlo method|Monte Carlo]] or other methods may be required, but tables have been prepared for some cases. Details for the required modifications to the test statistic and for the critical values for the [[normal distribution]] and the [[exponential distribution]] have been published,<ref name="Pearson & Hartley">{{cite book |title= Biometrika Tables for Statisticians |editor=Pearson, E. S. |editor2=Hartley, H. O. |year=1972 |volume=2 |publisher=Cambridge University Press |isbn=978-0-521-06937-3 |pages=117–123, Tables 54, 55}}</ref> and later publications also include the [[Gumbel distribution]].<ref name="Shorak & Wellner">{{cite book |title=Empirical Processes with Applications to Statistics |first1=Galen R. |last1=Shorack |first2=Jon A. |last2=Wellner |year=1986 |isbn=978-0-471-86725-8 |publisher=Wiley |page=239}}</ref> The [[Lilliefors test]] represents a special case of this for the normal distribution. A logarithmic transformation may help in cases where the data do not seem to fit the assumption of having come from the normal distribution.

When parameters are estimated, the question arises which estimation method should be used. Usually this would be the [[Maximum likelihood estimation|maximum likelihood method]], but, for example, for the normal distribution the MLE of σ has a large bias.
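The Monte Carlo approach mentioned above can be sketched as a parametric bootstrap, in which the parameters are re-estimated on every simulated sample exactly as on the real data. The following Python sketch assumes a normal null with both parameters estimated (the [[Lilliefors test]] tabulates this particular null distribution); the function name <code>monte_carlo_ks_pvalue</code> is illustrative, not from any library:
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

def monte_carlo_ks_pvalue(data, n_boot=2000, seed=0):
    """KS p-value for normality with mean and sd estimated from the data.

    A sketch of the parametric bootstrap: the plain Kolmogorov critical
    values would be invalid here because the reference F uses estimates."""
    data = np.asarray(data)
    n = len(data)
    mu, sigma = data.mean(), data.std(ddof=1)
    d_obs = stats.kstest(data, 'norm', args=(mu, sigma)).statistic

    rng = np.random.default_rng(seed)
    d_null = np.empty(n_boot)
    for b in range(n_boot):
        sim = rng.normal(mu, sigma, size=n)
        # Re-estimate on each simulated sample, mirroring the real analysis,
        # so the simulated null distribution of D_n is the relevant one.
        d_null[b] = stats.kstest(sim, 'norm',
                                 args=(sim.mean(), sim.std(ddof=1))).statistic
    return np.mean(d_null >= d_obs)
</syntaxhighlight>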
Using a moment fit or KS minimization instead has a large impact on the critical values, and also some impact on test power. For example, if the KS test is used to decide whether Student's ''t'' data with df = 2 could be normal, an ML estimate based on H<sub>0</sub> (the data are normal, so the standard deviation is used for scale) would give a much larger KS distance than a fit with minimum KS. In this case H<sub>0</sub> is typically rejected, as is often the case with MLE, because the sample standard deviation can be very large for ''t''<sub>2</sub> data; with KS minimization, however, the KS distance may still be too low to reject H<sub>0</sub>. In the Student's ''t'' case, a modified KS test with a KS estimate instead of the MLE makes the KS test indeed slightly worse. However, in other cases such a modified KS test leads to slightly better test power.{{Citation needed|date=May 2022}}

===Discrete and mixed null distribution===
Under the assumption that <math>F</math> is non-decreasing and right-continuous, with a countable (possibly infinite) number of jumps, the KS test statistic can be expressed as:
<math display="block">D_n= \sup_x |F_n(x)-F(x)| = \sup_{0 \leq t \leq 1} |F_n(F^{-1}(t)) - F(F^{-1}(t))|.</math>
From the right-continuity of <math>F</math>, it follows that <math>F(F^{-1}(t)) \geq t</math> and <math>F^{-1}(F(x)) \leq x</math> and hence, the distribution of <math>D_{n}</math> depends on the null distribution <math>F</math>, i.e., it is no longer distribution-free as in the continuous case. Therefore, a fast and accurate method has been developed to compute the exact and asymptotic distribution of <math>D_{n}</math> when <math>F</math> is purely discrete or mixed,<ref name=DKT2019/> implemented in C++ and in the KSgeneral package<ref name=KSgeneral/> of the [[R (programming language)|R language]]. The functions <code>disc_ks_test()</code>, <code>mixed_ks_test()</code> and <code>cont_ks_test()</code> also compute the KS test statistic and p-values for purely discrete, mixed or continuous null distributions and arbitrary sample sizes. The KS test and its p-values for discrete null distributions and small sample sizes are also computed in <ref name=arnold-emerson>{{Cite journal |first1=Taylor B. |last1=Arnold |first2=John W. |last2=Emerson |year=2011 |title=Nonparametric Goodness-of-Fit Tests for Discrete Null Distributions |journal=The R Journal |volume=3 |issue=2 |pages=34–39 |url=http://journal.r-project.org/archive/2011-2/RJournal_2011-2_Arnold+Emerson.pdf |doi=10.32614/rj-2011-016|doi-access=free }}</ref> as part of the dgof package of the R language.
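In the absence of an exact routine such as those above, the <math>F</math>-dependent null distribution of <math>D_n</math> for a discrete <math>F</math> can also be approximated by simulation. A minimal Python sketch follows; the helper name <code>ks_stat_discrete</code> is illustrative, and it assumes every sample value lies in the given support:
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

def ks_stat_discrete(sample, support, cdf_vals):
    """D_n = sup_x |F_n(x) - F(x)| for a purely discrete F.

    The supremum is attained at a support point, comparing the ECDF with F
    both at each point and just below it (where F jumps); sample values are
    assumed to lie in `support`."""
    s = np.sort(np.asarray(sample))
    n = len(s)
    fn_at = np.searchsorted(s, support, side='right') / n     # F_n(x_j)
    fn_below = np.searchsorted(s, support, side='left') / n   # F_n(x_j^-)
    cdf_below = np.concatenate(([0.0], cdf_vals[:-1]))        # F(x_j^-)
    return max(np.max(np.abs(fn_at - cdf_vals)),
               np.max(np.abs(fn_below - cdf_below)))

# The null distribution of D_n under, e.g., Binomial(5, 0.4) depends on F,
# so it is simulated instead of being read off the Kolmogorov distribution.
rng = np.random.default_rng(1)
n, support = 30, np.arange(6)
cdf_vals = stats.binom.cdf(support, 5, 0.4)
d_sims = np.array([ks_stat_discrete(rng.binomial(5, 0.4, size=n),
                                    support, cdf_vals)
                   for _ in range(5000)])
crit_95 = np.quantile(d_sims, 0.95)  # simulated 5%-level critical value for this F
</syntaxhighlight>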
Major statistical packages, among which [[SAS (software)|SAS]] (<code>PROC NPAR1WAY</code>)<ref>{{cite web|url=https://support.sas.com/documentation/cdl/en/statug/68162/HTML/default/viewer.htm#statug_npar1way_toc.htm|title=SAS/STAT(R) 14.1 User's Guide|website=support.sas.com|access-date=14 April 2018}}</ref> and [[Stata]] (<code>ksmirnov</code>),<ref>{{cite web|url=https://www.stata.com/manuals15/rksmirnov.pdf|title=ksmirnov — Kolmogorov–Smirnov equality-of-distributions test|website=stata.com|access-date=14 April 2018}}</ref> implement the KS test under the assumption that <math>F(x)</math> is continuous, which is more conservative if the null distribution is actually not continuous (see <ref name=Noether63>{{Cite journal |vauthors=Noether GE |year=1963|title=Note on the Kolmogorov Statistic in the Discrete Case |journal=Metrika |volume=7 |issue=1 |pages=115–116|doi=10.1007/bf02613966|s2cid=120687545}}</ref><ref name=Slakter65>{{Cite journal |vauthors=Slakter MJ |year=1965|title=A Comparison of the Pearson Chi-Square and Kolmogorov Goodness-of-Fit Tests with Respect to Validity |journal=Journal of the American Statistical Association |volume=60 |issue=311 |pages=854–858 |doi=10.2307/2283251|jstor=2283251}}</ref><ref name=Walsh63>{{Cite journal |vauthors=Walsh JE |year=1963 |title=Bounded Probability Properties of Kolmogorov–Smirnov and Similar Statistics for Discrete Data |journal=Annals of the Institute of Statistical Mathematics |volume=15 |issue=1 |pages=153–158|doi=10.1007/bf02865912|s2cid=122547015 }}</ref>).

==Two-sample Kolmogorov–Smirnov test==
[[File:KS2 Example.png|thumb|300px|Illustration of the two-sample Kolmogorov–Smirnov statistic. Red and blue lines each correspond to an empirical distribution function, and the black arrow is the two-sample KS statistic.]]
The Kolmogorov–Smirnov test may also be used to test whether two underlying one-dimensional probability distributions differ. In this case, the Kolmogorov–Smirnov statistic is
<math display="block">D_{n,m}=\sup_x |F_{1,n}(x)-F_{2,m}(x)|,</math>
where <math>F_{1,n}</math> and <math>F_{2,m}</math> are the [[empirical distribution function]]s of the first and the second sample respectively, and <math>\sup</math> is the [[Infimum and supremum|supremum function]].

For large samples, the null hypothesis is rejected at level <math>\alpha</math> if
<math display="block">D_{n,m}>c(\alpha)\sqrt{\frac{n + m}{n\cdot m}},</math>
where <math>n</math> and <math>m</math> are the sizes of the first and second sample respectively. The value of <math>c({\alpha})</math> is given in the table below for the most common levels of <math>\alpha</math>:
{| class="wikitable"
|-
! <math>\alpha</math>
| 0.20 || 0.15 || 0.10 || 0.05 || 0.025 || 0.01 || 0.005 || 0.001
|-
! <math>c({\alpha})</math>
| 1.073 || 1.138 || 1.224 || 1.358 || 1.48 || 1.628 || 1.731 || 1.949
|}
In general,<ref>Eq. (15) in Section 3.3.1 of Knuth, D.E., The Art of Computer Programming, Volume 2 (Seminumerical Algorithms), 3rd Edition, Addison Wesley, Reading Mass, 1998.</ref> <math>c(\alpha)</math> is given by
<math display="block">c\left(\alpha\right)=\sqrt{-\ln\left(\tfrac{\alpha}{2}\right)\cdot \tfrac{1}{2}},</math>
so that the condition reads
<math display="block">D_{n,m}>\sqrt{-\ln\left(\tfrac{\alpha}{2}\right)\cdot \tfrac{1 + \tfrac{m}{n}}{2m}}.</math>
Here, again, the larger the sample sizes, the more sensitive the minimal bound: for a given ratio of sample sizes (e.g. <math>m=n</math>), the minimal bound scales in the size of either of the samples according to its inverse square root.
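The two-sample statistic and the large-sample rejection rule above can be checked directly; the following short Python sketch uses <code>scipy.stats.ks_2samp</code> (the sample sizes and distributions are arbitrary illustrations):
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, size=200)  # n = 200
y = rng.normal(0.5, 1.0, size=300)  # m = 300

res = stats.ks_2samp(x, y)  # D_{n,m} and a p-value

# Large-sample threshold c(alpha) * sqrt((n + m) / (n * m)),
# with c(alpha) = sqrt(-ln(alpha / 2) / 2) as in the formula above.
alpha = 0.05
c_alpha = np.sqrt(-np.log(alpha / 2) / 2)  # about 1.358, matching the table
threshold = c_alpha * np.sqrt((len(x) + len(y)) / (len(x) * len(y)))
print(res.statistic > threshold, res.pvalue)
</syntaxhighlight>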
Note that the two-sample test checks whether the two data samples come from the same distribution. This does not specify what that common distribution is (e.g. whether it is normal or not). Again, tables of critical values have been published. A shortcoming of the univariate Kolmogorov–Smirnov test is that it is not very powerful because it is devised to be sensitive to all possible types of differences between two distribution functions. Some argue<ref>{{cite journal |last1=Marozzi |first1=Marco |title=Some Notes on the Location-Scale Cucconi Test |journal=Journal of Nonparametric Statistics |date=2009 |volume=21 |issue=5 |pages=629–647 |doi=10.1080/10485250902952435 |s2cid=120038970 }}</ref><ref>{{cite journal |last1=Marozzi |first1=Marco |title=Nonparametric Simultaneous Tests for Location and Scale Testing: a Comparison of Several Methods |journal=Communications in Statistics – Simulation and Computation |date=2013 |volume=42 |issue=6 |pages=1298–1317 |doi=10.1080/03610918.2012.665546 |s2cid=28146102 }}</ref> that the [[Cucconi test]], originally proposed for simultaneously comparing location and scale, can be much more powerful than the Kolmogorov–Smirnov test when comparing two distribution functions. Two-sample KS tests have been applied in economics to detect asymmetric effects and to study natural experiments.<ref>{{cite journal |last1=Monge |first1=Marco |title=Two-Sample Kolmogorov-Smirnov Tests as Causality Tests. A narrative of Latin American inflation from 2020 to 2022. |date=2023 |volume=17 |issue=1 |pages=68–78 |url=https://rches.utem.cl/articulos/two-sample-kolmogorov-smirnov-tests-as-causality-tests-a-narrative-of-latin-american-inflation-from-2020-to-2022/|journal=Revista Chilena de Economía y Sociedad }}</ref>

==Setting confidence limits for the shape of a distribution function==
{{main article|Dvoretzky–Kiefer–Wolfowitz inequality}}
While the Kolmogorov–Smirnov test is usually used to test whether a given ''F''(''x'') is the underlying probability distribution of ''F''<sub>''n''</sub>(''x''), the procedure may be inverted to give confidence limits on ''F''(''x'') itself. If one chooses a critical value of the test statistic ''D''<sub>''α''</sub> such that P(''D''<sub>''n''</sub> > ''D''<sub>''α''</sub>) = ''α'', then a band of width ±''D''<sub>''α''</sub> around ''F''<sub>''n''</sub>(''x'') will entirely contain ''F''(''x'') with probability 1 − ''α''.

==The Kolmogorov–Smirnov statistic in more than one dimension==
A distribution-free multivariate Kolmogorov–Smirnov goodness-of-fit test has been proposed by [[Ana Justel|Justel]], Peña and Zamar (1997).<ref>{{cite journal |last1=Justel |first1=A.|author1-link=Ana Justel |last2=Peña |first2=D. |last3=Zamar |first3=R. |year=1997 |title=A multivariate Kolmogorov–Smirnov test of goodness of fit |journal=Statistics & Probability Letters |volume=35 |issue=3 |pages=251–259 |doi=10.1016/S0167-7152(97)00020-5 |citeseerx=10.1.1.498.7631 }}</ref> The test uses a statistic which is built using Rosenblatt's transformation, and an algorithm is developed to compute it in the bivariate case. An approximate test that can be easily computed in any dimension is also presented.

The Kolmogorov–Smirnov test statistic needs to be modified if a similar test is to be applied to [[multivariate statistics|multivariate data]].
This is not straightforward because the maximum difference between two joint [[cumulative distribution function]]s is not generally the same as the maximum difference of any of the complementary distribution functions. Thus the maximum difference will differ depending on which of <math>\Pr(X < x \land Y < y)</math> or <math>\Pr(X < x \land Y > y)</math> or any of the other two possible arrangements is used. One might require that the result of the test should not depend on which choice is made.

One approach to generalizing the Kolmogorov–Smirnov statistic to higher dimensions which meets the above concern is to compare the cdfs of the two samples with all possible orderings, and take the largest of the set of resulting KS statistics. In ''d'' dimensions, there are 2<sup>''d''</sup> − 1 such orderings. One such variation is due to Peacock<ref name="Peacock">{{cite journal |author = Peacock J.A. |title = Two-dimensional goodness-of-fit testing in astronomy |journal = [[Monthly Notices of the Royal Astronomical Society]] |volume = 202 |issue = 3 |pages = 615–627 |year = 1983 |bibcode = 1983MNRAS.202..615P |doi=10.1093/mnras/202.3.615|doi-access = free }}</ref> (see also Gosset<ref>{{cite journal |author = Gosset E. |title = A three-dimensional extended Kolmogorov–Smirnov test as a useful tool in astronomy |journal = Astronomy and Astrophysics |volume = 188 |issue = 1 |pages = 258–264 |year = 1987 |bibcode = 1987A&A...188..258G }}</ref> for a 3D version) and another to Fasano and Franceschini<ref name="Fasano">{{cite journal |author=Fasano, G. |author2=Franceschini, A. |year=1987 |title= A multidimensional version of the Kolmogorov–Smirnov test |journal= Monthly Notices of the Royal Astronomical Society |issn=0035-8711 |volume= 225 |pages= 155–170 |bibcode=1987MNRAS.225..155F |doi=10.1093/mnras/225.1.155|doi-access= free }}</ref> (see Lopes et al. for a comparison and computational details).<ref name="Lopes">{{cite conference |author=Lopes, R.H.C. |author2=Reid, I. |author3=Hobson, P.R. |title= The two-dimensional Kolmogorov–Smirnov test |conference= XI International Workshop on Advanced Computing and Analysis Techniques in Physics Research |date= 23–27 April 2007 |location= Amsterdam, the Netherlands |url= http://dspace.brunel.ac.uk/bitstream/2438/1166/1/acat2007.pdf }}</ref> Critical values for the test statistic can be obtained by simulations, but depend on the dependence structure in the joint distribution.

==Implementations==
The Kolmogorov–Smirnov test is implemented in many software programs. Most of these implement both the one-sample and the two-sample test.
* [[Mathematica]] has [https://reference.wolfram.com/language/ref/KolmogorovSmirnovTest.html KolmogorovSmirnovTest].
* [[MATLAB]]'s Statistics Toolbox has [https://de.mathworks.com/help/stats/kstest.html kstest] and [https://nl.mathworks.com/help/stats/kstest2.html kstest2] for one-sample and two-sample Kolmogorov–Smirnov tests, respectively.
* The [[R (programming language)|R]] package "KSgeneral"<ref name=KSgeneral/> computes the KS test statistic and its p-value under an arbitrary, possibly discrete, mixed or continuous null distribution.
* [[R (programming language)|R]]'s base "stats" package implements the test as [https://stat.ethz.ch/R-manual/R-patched/library/stats/html/ks.test.html ks.test {stats}].
* [[SAS (software)|SAS]] implements the test in its PROC NPAR1WAY procedure.
* In [[Python (programming language)|Python]], the [[SciPy]] package implements the test in the scipy.stats.kstest function.<ref>{{cite web |url= https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.kstest.html |title=scipy.stats.kstest |work=SciPy v1.7.1 Manual |publisher=The Scipy community |access-date= 26 October 2021}}</ref>
* [[SYSTAT (statistics)|SYSTAT]] (SPSS Inc., Chicago, IL)
* [[Java (programming language)|Java]] has an implementation of this test provided by [[Apache Commons]].<ref>{{cite web |url=https://commons.apache.org/proper/commons-math/javadocs/api-3.5/org/apache/commons/math3/stat/inference/KolmogorovSmirnovTest.html |title=KolmogorovSmirnovTest|access-date= 18 June 2019}}</ref>
* [[KNIME]] has a node implementing this test based on the above Java implementation.<ref>{{cite web |url=https://www.knime.com/whats-new-in-knime-37#new-statistics-nodes |title=New statistics nodes |access-date= 25 June 2020}}</ref>
* [[Julia (programming language)|Julia]] has the package [https://juliastats.org/HypothesisTests.jl/stable/ HypothesisTests.jl] with the function ExactOneSampleKSTest(x::AbstractVector{<:Real}, d::UnivariateDistribution).<ref>{{Cite web|url=https://juliastats.org/HypothesisTests.jl/stable/nonparametric/#Kolmogorov-Smirnov-test-1|title = Nonparametric tests · HypothesisTests.jl}}</ref>
* [[StatsDirect]] (StatsDirect Ltd, Manchester, UK) implements [https://www.statsdirect.com/help/nonparametric_methods/smirnov.htm all common variants].
* [[Stata]] (Stata Corporation, College Station, TX) implements the test in its ksmirnov (Kolmogorov–Smirnov equality-of-distributions test) command.<ref>{{ cite web |url=https://www.stata.com/manuals15/rksmirnov.pdf|title=ksmirnov — Kolmogorov–Smirnov equality-of-distributions test |access-date= 18 June 2019}}</ref>
* [[PSPP]] implements the test in its [https://www.gnu.org/software/pspp/manual/html_node/KOLMOGOROV_002dSMIRNOV.html KOLMOGOROV-SMIRNOV] command (or using the KS shortcut function).
* The Real Statistics Resource Pack for [[Microsoft Excel|Excel]] runs the test as KSCRIT and KSPROB.<ref>{{cite web |url=http://www.real-statistics.com/tests-normality-and-symmetry/statistical-tests-normality-symmetry/kolmogorov-smirnov-test/|title=Kolmogorov–Smirnov Test for Normality Hypothesis Testing | access-date= 18 June 2019}}</ref>

==See also==
*[[Lepage test]]
*[[Cucconi test]]
*[[Kuiper's test]]
*[[Shapiro–Wilk test]]
*[[Anderson–Darling test]]
*[[Cramér–von Mises criterion|Cramér–von Mises test]]
*[[Wasserstein metric]]

==References==
{{Reflist|30em}}

==Further reading==
* {{cite book |last=Daniel |first=Wayne W. |chapter=Kolmogorov–Smirnov one-sample test |title=Applied Nonparametric Statistics |location=Boston |publisher=PWS-Kent |edition=2nd |year=1990 |isbn=978-0-534-91976-4 |pages=319–330 |chapter-url=https://books.google.com/books?id=0hPvAAAAMAAJ&pg=PA319 }}
* {{cite book | last = Eadie | first = W.T. |author2=D. Drijard |author3=F.E. James |author4=M. Roos |author5=B. Sadoulet | title = Statistical Methods in Experimental Physics | publisher = North-Holland | year = 1971 | location = Amsterdam | pages = 269–271 | isbn = 978-0-444-10117-4 }}
* {{cite book | last1 = Stuart | first1 = Alan | first2 = Keith | last2 = Ord | first3=Steven [F.]
| last3=Arnold | title=Classical Inference and the Linear Model | edition=Sixth | series = Kendall's Advanced Theory of Statistics | volume = 2A | year = 1999 | publisher = Arnold | location = London | isbn=978-0-340-66230-4 | mr=1687411 | pages = 25.37–25.43 }} *{{cite book |last1=Corder |first1=G. W. |last2=Foreman |first2=D. I. |year=2014 |title=Nonparametric Statistics: A Step-by-Step Approach |publisher=Wiley |isbn=978-1-118-84031-3 }} *{{cite journal |last=Stephens |first=M. A. |year=1979 |title=Test of fit for the logistic distribution based on the empirical distribution function |journal=Biometrika |volume=66 |issue=3 |pages=591–595 |doi=10.1093/biomet/66.3.591 }} *{{cite journal |last1=Kesemen |first1=O. |last2=Tiryaki |first2=B.K. |last3=Tezel |first3=Ö. |last4=Özkul |first4=E.|year=2021 |title=A new goodness of fit test for multivariate normality |journal=Hacettepe Journal of Mathematics and Statistics |volume=50 |issue=3 |pages=872–894 |doi=10.15672/hujms.644516 |doi-access=free }} ==External links== *{{springer|title=Kolmogorov–Smirnov test|id=p/k055740}} *[https://web.archive.org/web/20050710021649/http://www.physics.csbsju.edu/stats/KS-test.html Short introduction] *[http://www.itl.nist.gov/div898/handbook/eda/section3/eda35g.htm KS test explanation] *[http://www.ciphersbyritter.com/JAVASCRP/NORMCHIK.HTM JavaScript implementation of one- and two-sided tests] *[http://jumk.de/statistic-calculator/ Online calculator with the KS test] * Open-source C++ code to compute the [https://web.archive.org/web/20171204072150/http://root.cern.ch/root/html/TMath.html#TMath:KolmogorovProb Kolmogorov distribution] and perform the [https://web.archive.org/web/20171204072150/http://root.cern.ch/root/html/TMath.html#TMath:KolmogorovTest KS test] *Paper on [http://www.jstatsoft.org/v08/i18/paper Evaluating Kolmogorov's Distribution]; contains C implementation. This is the method used in [[Matlab]]. *Paper on [http://www.jstatsoft.org/v39/i11/paper Computing the Two-Sided Kolmogorov–Smirnov Distribution]; computing the cdf of the KS statistic in C or Java. *Paper [http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0085777 powerlaw: A Python Package for Analysis of Heavy-Tailed Distributions]; Jeff Alstott, Ed Bullmore, Dietmar Plenz. Among others, it also performs the Kolmogorov–Smirnov test. Source code and installers of powerlaw package are available at [https://pypi.python.org/pypi/powerlaw PyPi]. {{Statistics}} {{Use dmy dates|date=February 2020}} {{DEFAULTSORT:Kolmogorov-Smirnov Test}} [[Category:Statistical distance]] [[Category:Nonparametric statistics]] [[Category:Normality tests]]