
File:1755 Lisbon earthquake.jpg
Caption: Extreme value theory is used to model the risk of extreme, rare events, such as the 1755 Lisbon earthquake.

Extreme value theory or extreme value analysis (EVA) is the study of extremes in statistical distributions.

It is widely used in many disciplines, such as structural engineering, finance, economics, earth sciences, traffic prediction, and geological engineering. For example, EVA might be used in the field of hydrology to estimate the probability of an unusually large flooding event, such as the 100-year flood. Similarly, for the design of a breakwater, a coastal engineer would seek to estimate the 50-year wave and design the structure accordingly.

Data analysis

Two main approaches exist for practical extreme value analysis.

The first method relies on deriving block maxima (minima) series as a preliminary step. In many situations it is customary and convenient to extract the annual maxima (minima), generating an annual maxima series (AMS).

The second method relies on extracting, from a continuous record, the peak values reached during any period in which values exceed a certain threshold (or fall below a certain threshold). This method is generally referred to as the peak-over-threshold (POT) method.<ref>Template:Cite journal</ref>

For AMS data, the analysis may partly rely on the results of the Fisher–Tippett–Gnedenko theorem, leading to the generalized extreme value distribution being selected for fitting.<ref>Template:Harvp</ref><ref>Template:Harvp</ref> However, in practice, various procedures are applied to select between a wider range of distributions. The theorem here relates to the limiting distributions for the minimum or the maximum of a very large collection of independent random variables from the same distribution. Given that the number of relevant random events within a year may be rather limited, it is unsurprising that analyses of observed AMS data often lead to distributions other than the generalized extreme value distribution (GEVD) being selected.<ref>Template:Harvp</ref>
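As an illustrative sketch of the block-maxima approach (not drawn from the references above), the following Python code generates a synthetic daily series, extracts the annual maxima, fits a GEV distribution with SciPy, and reads off an approximate 100-year return level; the data, its Gumbel parameters, and the 60-year span are hypothetical.

<syntaxhighlight lang="python">
# Minimal sketch of the annual-maxima (AMS) approach on synthetic data.
import numpy as np
import pandas as pd
from scipy.stats import genextreme

rng = np.random.default_rng(0)
dates = pd.date_range("1960-01-01", "2019-12-31", freq="D")
daily_flow = pd.Series(rng.gumbel(loc=100.0, scale=20.0, size=len(dates)), index=dates)

# Block maxima: one maximum per calendar year gives the annual maxima series (AMS).
annual_maxima = daily_flow.groupby(daily_flow.index.year).max()

# Fit the generalized extreme value distribution to the annual maxima.
# Note: SciPy's shape parameter c equals minus the xi commonly used for the GEV.
c, loc, scale = genextreme.fit(annual_maxima.values)

# The 100-year return level is the level exceeded with probability 1/100 in any year.
return_level_100 = genextreme.ppf(1.0 - 1.0 / 100.0, c, loc=loc, scale=scale)
print(f"Estimated 100-year return level: {return_level_100:.1f}")
</syntaxhighlight>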

For POT data, the analysis may involve fitting two distributions: one for the number of events in the time period considered, and a second for the size of the exceedances.

A common assumption for the first is the Poisson distribution, with the generalized Pareto distribution being used for the exceedances. A tail-fitting can be based on the Pickands–Balkema–de Haan theorem.<ref>Template:Harvp</ref><ref>Template:Harvp</ref>

Novak (2011) reserves the term "POT method" for the case where the threshold is non-random, and distinguishes it from the case where one deals with exceedances of a random threshold.<ref>Template:Harvp</ref>
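As a companion sketch for the POT approach, again on synthetic, hypothetical data: excesses over a fixed, non-random threshold are fitted with a generalized Pareto distribution in SciPy, and the mean annual number of exceedances provides the rate that a Poisson model for the event counts would use.

<syntaxhighlight lang="python">
# Minimal sketch of the peaks-over-threshold (POT) approach on synthetic data.
# The exponential daily series, 50-year span, and 95% threshold are hypothetical.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
n_years = 50
observations = rng.exponential(scale=10.0, size=365 * n_years)
year = np.repeat(np.arange(n_years), 365)

threshold = np.quantile(observations, 0.95)          # fixed, non-random threshold
excesses = observations[observations > threshold] - threshold

# Fit the generalized Pareto distribution to the excesses;
# the location is fixed at zero because excesses are measured above the threshold.
shape, _, scale = genpareto.fit(excesses, floc=0)

# Poisson rate: average number of threshold exceedances per year.
rate = np.mean([np.sum(observations[year == y] > threshold) for y in range(n_years)])
print(f"GPD shape = {shape:.3f}, scale = {scale:.2f}, exceedances per year = {rate:.1f}")
</syntaxhighlight>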

Applications

Applications of extreme value theory include predicting the probability distribution of rare, extreme events in the disciplines mentioned above, for example unusually large floods in hydrology, extreme losses in finance, and extreme performances in athletics and other sporting disciplines.

History

The field of extreme value theory was pioneered by L. Tippett (1902–1985). Tippett was employed by the British Cotton Industry Research Association, where he worked to make cotton thread stronger. In his studies, he realized that the strength of a thread was controlled by the strength of its weakest fibres. With the help of R. A. Fisher, Tippett obtained three asymptotic limits describing the distributions of extremes assuming independent variables. E. J. Gumbel (1958)<ref>Template:Harvp</ref> codified this theory. These results can be extended to allow for slight correlations between variables, but the classical theory does not extend to strong correlations of the order of the variance. One universality class of particular interest is that of log-correlated fields, where the correlations decay logarithmically with the distance.

Univariate theory

The theory for extreme values of a single variable is governed by the extreme value theorem, also called the Fisher–Tippett–Gnedenko theorem, which describes which of the three possible limiting distributions for extreme values applies to a particular statistical variable <math>X</math>.
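In combined form, the three limiting laws (the Gumbel, Fréchet, and Weibull families) can be written as the generalized extreme value distribution with cumulative distribution function

<math>G(x) = \exp\left\{ -\left[ 1 + \xi \left( \frac{x - \mu}{\sigma} \right) \right]^{-1/\xi} \right\},</math>

defined for <math>1 + \xi (x - \mu)/\sigma > 0</math>, where <math>\mu</math> is a location parameter, <math>\sigma > 0</math> a scale parameter, and <math>\xi</math> a shape parameter. The case <math>\xi = 0</math> is interpreted as the limit <math>G(x) = \exp\{-\exp[-(x - \mu)/\sigma]\}</math> and gives the Gumbel family, while <math>\xi > 0</math> and <math>\xi < 0</math> give the Fréchet and Weibull families, respectively.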

Multivariate theory

Extreme value theory in more than one variable introduces additional issues that have to be addressed. One problem that arises is that one must specify what constitutes an extreme event.<ref name=Morton-Bowers-1996> Template:Cite journal </ref> Although this is straightforward in the univariate case, there is no unambiguous way to do this in the multivariate case. The fundamental problem is that although it is possible to order a set of real-valued numbers, there is no natural way to order a set of vectors.

As an example, in the univariate case, given a set of observations <math>\ x_i\ </math> it is straightforward to find the most extreme event simply by taking the maximum (or minimum) of the observations. However, in the bivariate case, given a set of observations <math>\ ( x_i, y_i )\ </math>, it is not immediately clear how to find the most extreme event. Suppose that one has measured the values <math>\ (3, 4)\ </math> at a specific time and the values <math>\ (5, 2)\ </math> at a later time. Which of these events would be considered more extreme? There is no universal answer to this question.

Another issue in the multivariate case is that the limiting model is not as fully prescribed as in the univariate case. In the univariate case, the model (GEV distribution) contains three parameters whose values are not predicted by the theory and must be obtained by fitting the distribution to the data. In the multivariate case, the model not only contains unknown parameters, but also a function whose exact form is not prescribed by the theory. However, this function must obey certain constraints.<ref> Template:Cite book </ref><ref> Template:Cite book </ref> It is not straightforward to devise estimators that obey such constraints, though some have recently been constructed.<ref name=dC2014> Template:Cite journal </ref><ref name=hanson2017> Template:Cite journal</ref><ref name=dC2013> Template:Cite journal </ref>
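In the bivariate case, for example, one standard way of describing the limiting dependence structure is through the Pickands dependence function <math>A(w)</math>, which must be convex and satisfy <math>\max(w, 1 - w) \le A(w) \le 1</math> for <math>w \in [0, 1]</math>; the upper bound <math>A \equiv 1</math> corresponds to independent extremes, and the lower bound to completely dependent extremes.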

As an example of an application, bivariate extreme value theory has been applied to ocean research.<ref name=Morton-Bowers-1996/><ref> Template:Cite journal </ref>

Non-stationary extremes

Statistical modeling for non-stationary time series was developed in the 1990s.<ref name=dS1990> Template:Cite journal </ref> Methods for non-stationary multivariate extremes have been introduced more recently.<ref name=dC2012> Template:Cite book </ref> The latter can be used for tracking how the dependence between extreme values changes over time, or over another covariate.<ref name=castro2018> Template:Cite journal </ref><ref name=mhalla2019> Template:Cite journal </ref><ref name=EB2018> Template:Cite journal </ref>

See also


Extreme value distributions



References

Template:Reflist


Software

  • Package for extreme value statistics in R.
  • Package for extreme value statistics in Julia.
