{{Short description|Method in physics used to deal with infinities}}
{{Use American English|date=January 2019}}
{{Use mdy dates|date=January 2019}}
{{Renormalization and regularization}}
{{Quantum field theory|cTopic=Tools}}
'''Renormalization''' is a collection of techniques in [[quantum field theory]], [[statistical field theory]], and the theory of [[self-similarity|self-similar]] geometric structures that are used to treat [[infinity|infinities]] arising in calculated quantities by altering the values of these quantities to compensate for effects of their '''self-interactions'''<!--boldface per WP:R#PLA; 'Self-interaction' and 'Self-interactions' redirect here-->. Even if no infinities arose in the [[One-loop Feynman diagram|loop diagrams]] of quantum field theory, it could be shown that it would still be necessary to renormalize the mass and fields appearing in the original [[Lagrangian (field theory)|Lagrangian]].<ref>See e.g., Weinberg vol I, chapter 10.</ref>

For example, an [[electron]] theory may begin by postulating an electron with an initial mass and charge. In [[quantum field theory]], a cloud of [[virtual particle]]s, such as [[photon]]s, [[positron]]s, and others, surrounds and interacts with the initial electron. Accounting for the interactions of the surrounding particles (e.g., collisions at different energies) shows that the electron system behaves as if it had a different mass and charge than initially postulated. Renormalization, in this example, mathematically replaces the initially postulated mass and charge of an electron with the experimentally observed mass and charge. Calculation and experiment show that positrons and more massive particles such as [[proton]]s exhibit precisely the same observed charge as the electron, even in the presence of much stronger interactions and more intense clouds of virtual particles.

Renormalization specifies relationships between parameters in the theory when parameters describing large distance scales differ from parameters describing small distance scales (schematic relations are sketched below). Physically, the pileup of contributions from an infinity of scales involved in a problem may then result in further infinities. When describing [[spacetime]] as a continuum, certain statistical and quantum-mechanical constructions are not [[well-defined]]. To define them, or make them unambiguous, a [[continuum limit]] must carefully remove the "construction scaffolding" of lattices at various scales. Renormalization procedures are based on the requirement that certain physical quantities (such as the mass and charge of an electron) equal observed (experimental) values. That is, the experimental values of these quantities are what enter practical calculations, but because they are fixed empirically rather than derived, they mark aspects of quantum field theory that still call for a deeper theoretical derivation.

Renormalization was first developed in [[quantum electrodynamics]] (QED) to make sense of [[infinity|infinite]] integrals in [[perturbation theory (quantum mechanics)|perturbation theory]]. Initially viewed as a suspect, provisional procedure even by some of its originators, renormalization was eventually embraced as an important and [[self-consistent]] mechanism of scale physics in several fields of [[physics]] and [[mathematics]]. Despite his later skepticism, it was [[Paul Dirac]] who pioneered renormalization.<ref>{{Cite journal |last1=Sanyuk |first1=Valerii I. |last2=Sukhanov |first2=Alexander D. |date=2003-09-01 |title=Dirac in 20th century physics: a centenary assessment |url=https://ufn.ru/en/articles/2003/9/c/ |journal=Physics-Uspekhi |language=en |volume=46 |issue=9 |pages=937–956 |doi=10.1070/PU2003v046n09ABEH001165 |issn=1063-7869}}</ref><ref name=":62">{{Cite thesis |last=Kar |first=Arnab |title=Renormalization from Classical to Quantum Physics |date=2014 |publisher=University of Rochester |url=https://inspirehep.net/literature/1411969}}</ref>

Today, the point of view has shifted: on the basis of the breakthrough [[renormalization group]] insights of [[Nikolay Bogolyubov]] and [[Kenneth G. Wilson|Kenneth Wilson]], the focus is on the variation of physical quantities across contiguous scales, while distant scales are related to each other through "effective" descriptions. All scales are linked in a broadly systematic way, and the physics pertinent to each scale is extracted with the computational techniques appropriate to it. Wilson clarified which variables of a system are crucial and which are redundant.

Renormalization is distinct from [[Regularization (physics)|regularization]], another technique for controlling infinities by assuming the existence of new, unknown physics at new scales.
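Schematically, and suppressing details that depend on the particular theory and renormalization scheme, the two ideas sketched above can be summarized by a pair of standard relations: the observed mass is the bare mass of the Lagrangian plus a self-energy correction, and the variation of a coupling <math>g</math> with the observation scale <math>\mu</math> is governed by a [[Beta function (physics)|beta-function]] equation of the renormalization group:

<math display="block">m_{\text{obs}} = m_0 + \delta m, \qquad \mu \frac{\mathrm{d}g(\mu)}{\mathrm{d}\mu} = \beta\bigl(g(\mu)\bigr),</math>

where <math>m_0</math> is the bare mass, <math>\delta m</math> is the (regularization-dependent) self-energy correction, and <math>\beta</math> encodes how the effective coupling changes from one scale to the next.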