Editing Propagation of uncertainty (section)
==Example formulae==
This table shows the variances and standard deviations of simple functions of the real variables <math>A, B</math> with standard deviations <math>\sigma_A, \sigma_B,</math> [[Covariance and correlation|covariance]] <math>\sigma_{AB} = \rho_{AB} \sigma_A \sigma_B,</math> and correlation <math>\rho_{AB}.</math> The real-valued coefficients <math>a</math> and <math>b</math> are assumed exactly known (deterministic), i.e., <math>\sigma_a = \sigma_b = 0.</math> In the right-hand columns of the table, <math>A</math> and <math>B</math> are [[expected value|expectation values]], and <math>f</math> is the value of the function calculated at those values.

{| class="wikitable"
! Function !! Variance !! Standard deviation
|-
| <math>f = aA\,</math>
| <math>\sigma_f^2 = a^2\sigma_A^2</math>
| <math>\sigma_f = |a|\sigma_A</math>
|-
| <math>f = A + B</math>
| <math>\sigma_f^2 = \sigma_A^2 + \sigma_B^2 + 2\sigma_{AB}</math>
| <math>\sigma_f = \sqrt{\sigma_A^2 + \sigma_B^2 + 2\sigma_{AB}}</math>
|-
| <math>f = A - B</math>
| <math>\sigma_f^2 = \sigma_A^2 + \sigma_B^2 - 2\sigma_{AB}</math>
| <math>\sigma_f = \sqrt{\sigma_A^2 + \sigma_B^2 - 2\sigma_{AB}}</math>
|-
| <math>f = aA + bB</math>
| <math>\sigma_f^2 = a^2\sigma_A^2 + b^2\sigma_B^2 + 2ab\,\sigma_{AB}</math>
| <math>\sigma_f = \sqrt{a^2\sigma_A^2 + b^2\sigma_B^2 + 2ab\,\sigma_{AB}}</math>
|-
| <math>f = aA - bB</math>
| <math>\sigma_f^2 = a^2\sigma_A^2 + b^2\sigma_B^2 - 2ab\,\sigma_{AB}</math>
| <math>\sigma_f = \sqrt{a^2\sigma_A^2 + b^2\sigma_B^2 - 2ab\,\sigma_{AB}}</math>
|-
| <math>f = AB</math>
| <math>\sigma_f^2 \approx f^2 \left[\left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 + 2\frac{\sigma_{AB}}{AB} \right]</math><ref>{{cite web |url=http://ipl.physics.harvard.edu/wp-uploads/2013/03/PS3_Error_Propagation_sp13.pdf |title=A Summary of Error Propagation |page=2 |access-date=2016-04-04 |archive-url=https://web.archive.org/web/20161213135602/http://ipl.physics.harvard.edu/wp-uploads/2013/03/PS3_Error_Propagation_sp13.pdf |archive-date=2016-12-13 |url-status=dead}}</ref><ref>{{cite web |url=http://web.mit.edu/fluids-modules/www/exper_techniques/2.Propagation_of_Uncertaint.pdf |title=Propagation of Uncertainty through Mathematical Operations |page=5 |access-date=2016-04-04}}</ref>
| <math>\sigma_f \approx \left| f \right| \sqrt{ \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 + 2\frac{\sigma_{AB}}{AB} }</math>
|-
| <math>f = \frac{A}{B}</math>
| <math>\sigma_f^2 \approx f^2 \left[\left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 - 2\frac{\sigma_{AB}}{AB} \right]</math><ref>{{cite web |url=http://www.sagepub.com/upm-data/6427_Chapter_4__Lee_%28Analyzing%29_I_PDF_6.pdf |title=Strategies for Variance Estimation |page=37 |access-date=2013-01-18}}</ref>
| <math>\sigma_f \approx \left| f \right| \sqrt{ \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 - 2\frac{\sigma_{AB}}{AB} }</math>
|-
| <math>f = \frac{A}{A+B}</math>
| <math>\sigma_f^2 \approx \frac{f^2}{\left(A+B\right)^2} \left(\frac{B^2}{A^2}\sigma_A^2 +\sigma_B^2 - 2\frac{B}{A} \sigma_{AB} \right)</math>
| <math>\sigma_f \approx \left|\frac{f}{A+B}\right| \sqrt{\frac{B^2}{A^2}\sigma_A^2 +\sigma_B^2 - 2\frac{B}{A} \sigma_{AB} }</math>
|-
| <math>f = a A^b</math>
| <math>\sigma_f^2 \approx \left( {a}{b}{A}^{b-1}{\sigma_A} \right)^2 = \left( \frac{{f}{b}{\sigma_A}}{A} \right)^2 </math>
| <math>\sigma_f \approx \left| {a}{b}{A}^{b-1}{\sigma_A} \right| = \left| \frac{{f}{b}{\sigma_A}}{A} \right| </math>
|-
| <math>f = a \ln(bA)</math>
| <math>\sigma_f^2 \approx \left(a \frac{\sigma_A}{A} \right)^2</math><ref name=harris2003>{{citation |first1=Daniel C. |last1=Harris |title=Quantitative chemical analysis |edition=6th |publisher=Macmillan |year=2003 |isbn=978-0-7167-4464-1 |page=56 |url=https://books.google.com/books?id=csTsQr-v0d0C&pg=PA56 }}</ref>
| <math>\sigma_f \approx \left| a \frac{\sigma_A}{A}\right|</math>
|-
| <math>f = a \log_{10}(bA)</math>
| <math>\sigma_f^2 \approx \left(a \frac{\sigma_A}{A \ln(10)} \right)^2</math><ref name=harris2003/>
| <math>\sigma_f \approx \left| a \frac{\sigma_A}{A \ln(10)} \right|</math>
|-
| <math>f = a e^{bA}</math>
| <math>\sigma_f^2 \approx f^2 \left( b\sigma_A \right)^2</math><ref>{{cite web |url=http://www.foothill.edu/psme/daley/tutorials_files/10.%20Error%20Propagation.pdf |date=October 9, 2009 |title=Error Propagation tutorial |work=Foothill College |access-date=2012-03-01}}</ref>
| <math>\sigma_f \approx \left| f \right| \left| b\sigma_A \right|</math>
|-
| <math>f = a^{bA}</math>
| <math>\sigma_f^2 \approx f^2 (b\ln(a)\sigma_A)^2</math>
| <math>\sigma_f \approx \left| f \right| \left| b \ln(a) \sigma_A \right|</math>
|-
| <math>f = a \sin(bA)</math>
| <math>\sigma_f^2 \approx \left[ a b \cos(b A) \sigma_A \right]^2</math>
| <math>\sigma_f \approx \left| a b \cos(b A) \sigma_A \right|</math>
|-
| <math>f = a \cos \left( b A \right)\,</math>
| <math>\sigma_f^2 \approx \left[ a b \sin(b A) \sigma_A \right]^2</math>
| <math>\sigma_f \approx \left| a b \sin(b A) \sigma_A \right|</math>
|-
| <math>f = a \tan \left( b A \right)\,</math>
| <math>\sigma_f^2 \approx \left[ a b \sec^2(b A) \sigma_A \right]^2</math>
| <math>\sigma_f \approx \left| a b \sec^2(b A) \sigma_A \right|</math>
|-
| <math>f = A^B</math>
| <math>\sigma_f^2 \approx f^2 \left[ \left( \frac{B}{A}\sigma_A \right)^2 +\left( \ln(A)\sigma_B \right)^2 + 2 \frac{B \ln(A)}{A} \sigma_{AB} \right]</math>
| <math>\sigma_f \approx \left| f \right| \sqrt{ \left( \frac{B}{A}\sigma_A \right)^2 +\left( \ln(A)\sigma_B \right)^2 + 2 \frac{B \ln(A)}{A} \sigma_{AB} } </math>
|-
| <math>f = \sqrt{aA^2 \pm bB^2}</math>
| <math>\sigma_f^2 \approx \left(\frac{A}{f}\right)^2 a^2\sigma_A^2 + \left(\frac{B}{f}\right)^2 b^2\sigma_B^2 \pm 2ab\frac{AB}{f^2}\,\sigma_{AB}</math>
| <math>\sigma_f \approx \sqrt{\left(\frac{A}{f}\right)^2 a^2\sigma_A^2 + \left(\frac{B}{f}\right)^2 b^2\sigma_B^2 \pm 2ab\frac{AB}{f^2}\,\sigma_{AB}}</math>
|}

For uncorrelated variables (<math>\rho_{AB} = 0</math>, <math>\sigma_{AB} = 0</math>), expressions for more complicated functions can be derived by combining simpler functions. For example, repeated multiplication, assuming no correlation, gives
<math display="block">f = ABC; \qquad \left(\frac{\sigma_f}{f}\right)^2 \approx \left(\frac{\sigma_A}{A}\right)^2 + \left(\frac{\sigma_B}{B}\right)^2 + \left(\frac{\sigma_C}{C}\right)^2.</math>

For the case <math>f = AB</math> we also have Goodman's expression<ref name="Goodman1960"/> for the exact variance: for the uncorrelated case it is
<math display="block">\operatorname{V}[XY] = \operatorname{E}[X]^2 \operatorname{V}[Y] + \operatorname{E}[Y]^2 \operatorname{V}[X] + \operatorname{E}\left[\left(X - \operatorname{E}(X)\right)^2 \left(Y - \operatorname{E}(Y)\right)^2\right],</math>
and therefore we have
<math display="block">\sigma_f^2 = A^2\sigma_B^2 + B^2\sigma_A^2 + \sigma_A^2\sigma_B^2.</math>

===Effect of correlation on differences===
If ''A'' and ''B'' are uncorrelated, their difference ''A'' − ''B'' will have more variance than either of them. An increasing positive correlation (<math>\rho_{AB} \to 1</math>) will decrease the variance of the difference, converging to zero variance for perfectly correlated variables with the [[homoscedastic|same variance]]. On the other hand, a negative correlation (<math>\rho_{AB} \to -1</math>) will further increase the variance of the difference, compared to the uncorrelated case.
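The table's formula for a difference, <math>\sigma_f^2 = \sigma_A^2 + \sigma_B^2 - 2\rho_{AB}\sigma_A\sigma_B</math>, can be checked numerically. The following is a minimal Monte Carlo sketch (not part of the article's sources); the zero means, unit standard deviation, and sample size are illustrative assumptions:

```python
import numpy as np

# Monte Carlo check of sigma_f^2 = sigma_A^2 + sigma_B^2 - 2*rho*sigma_A*sigma_B
# for f = A - B, sweeping the correlation rho. Means, sigma, and n are illustrative.
rng = np.random.default_rng(0)
n = 1_000_000
sigma = 1.0

results = {}
for rho in (1.0, 0.5, 0.0, -1.0):
    # Covariance matrix of (A, B) with equal variances and correlation rho.
    cov = [[sigma**2, rho * sigma**2],
           [rho * sigma**2, sigma**2]]
    A, B = rng.multivariate_normal([0.0, 0.0], cov, n).T
    predicted = sigma**2 + sigma**2 - 2 * rho * sigma**2
    results[rho] = (predicted, (A - B).var())
    print(f"rho={rho:+.1f}: predicted {predicted:.3f}, sampled {results[rho][1]:.3f}")
```

As the prose above states, the sampled variance shrinks toward zero as <math>\rho \to 1</math> and grows toward <math>4\sigma^2</math> as <math>\rho \to -1</math>.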
For example, the self-subtraction ''f'' = ''A'' − ''A'' has zero variance <math>\sigma_f^2 = 0</math> only if the variate is perfectly [[autocorrelation|autocorrelated]] (<math>\rho_A = 1</math>). If ''A'' is uncorrelated, <math>\rho_A = 0,</math> then the output variance is twice the input variance, <math>\sigma_f^2 = 2\sigma^2_A.</math> And if ''A'' is perfectly anticorrelated, <math>\rho_A = -1,</math> then the input variance is quadrupled in the output, <math>\sigma_f^2 = 4 \sigma^2_A</math> (notice <math>1 - \rho_A = 2</math> for ''f'' = ''aA'' − ''aA'' in the table above).
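Similarly, the first-order product formula in the table can be compared against Goodman's exact variance given earlier. This is an illustrative numerical sketch (the means and standard deviations are arbitrary choices, not from the article):

```python
import numpy as np

# Compare the first-order (linearized) variance of f = A*B from the table with
# Goodman's exact result V[XY] = E[X]^2 V[Y] + E[Y]^2 V[X] + V[X] V[Y]
# for independent inputs. All numeric parameters are illustrative.
rng = np.random.default_rng(1)
n = 1_000_000

mu_A, s_A = 5.0, 0.4
mu_B, s_B = 3.0, 0.7
A = rng.normal(mu_A, s_A, n)
B = rng.normal(mu_B, s_B, n)

# Table formula: sigma_f^2 ~ f^2 [(s_A/mu_A)^2 + (s_B/mu_B)^2], uncorrelated case.
var_linear = (mu_A * mu_B) ** 2 * ((s_A / mu_A) ** 2 + (s_B / mu_B) ** 2)
# Goodman's exact variance adds the cross term s_A^2 * s_B^2.
var_exact = mu_A**2 * s_B**2 + mu_B**2 * s_A**2 + s_A**2 * s_B**2
var_sample = (A * B).var()

print(f"first-order: {var_linear:.4f}")  # 13.6900
print(f"Goodman:     {var_exact:.4f}")   # 13.7684
print(f"sampled:     {var_sample:.4f}")
```

The sampled variance agrees with Goodman's exact expression; the linearized table formula undershoots by the small term <math>\sigma_A^2 \sigma_B^2</math>.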