== History ==
Several approaches have been explored to adapt linear regression methods to a domain where the output is a probability value <math>(0, 1)</math>, instead of any real number <math>(-\infty, +\infty)</math>. In many cases, such efforts have focused on modeling this problem by mapping the range <math>(0, 1)</math> to <math>(-\infty, +\infty)</math> and then running the linear regression on these transformed values.<ref name="Cramer2003"/>

In 1934, [[Chester Ittner Bliss]] used the cumulative normal distribution function to perform this mapping and called his model [[probit]], an abbreviation for "'''prob'''ability un'''it'''". This is, however, computationally more expensive.<ref name="Cramer2003">{{Cite web |url=http://www.cambridge.org/resources/0521815886/1208_default.pdf |title=The origins and development of the logit model |first=J. S. |last=Cramer |year=2003 |publisher=Cambridge UP |archive-url=https://web.archive.org/web/20240919043104/https://www.cambridge.org/resources/0521815886/1208_default.pdf |archive-date=19 September 2024 |url-status=dead }}</ref>

In 1944, [[Joseph Berkson]] used the log of odds and called this function ''logit'', an abbreviation for "'''log'''istic un'''it'''", following the analogy for probit:

{{quote|"I use this term [logit] for <math>\ln p/q</math> following Bliss, who called the analogous function which is linear on {{tmath|x}} for the normal curve 'probit'."|Joseph Berkson (1944){{sfn|Berkson|1944|loc=p. 361, footnote 2}}}}

Log odds was used extensively by [[Charles Sanders Peirce]] (late 19th century).<ref>{{cite book |title=The history of statistics : the measurement of uncertainty before 1900 |last=Stigler |first=Stephen M. |author-link=Stephen M. Stigler |year=1986 |publisher=Belknap Press of Harvard University Press |location=Cambridge, Massachusetts |isbn=978-0-674-40340-6 |url-access=registration |url=https://archive.org/details/historyofstatist00stig }}</ref> [[G. A. Barnard]] in 1949 coined the commonly used term ''log-odds'';<ref>{{citation|title=Logistic Regression Models|first=Joseph M.|last=Hilbe|authorlink=Joseph Hilbe|publisher=CRC Press|year=2009|isbn=9781420075779|page=3|url=https://books.google.com/books?id=tmHMBQAAQBAJ&pg=PA3}}.</ref>{{sfn|Barnard|1949|p=120}} the log-odds of an event is the logit of the probability of the event.<ref>{{citation|title=Logit Models from Economics and Other Fields|first=J. S.|last=Cramer|publisher=Cambridge University Press|year=2003|isbn=9781139438193|page=13|url=https://books.google.com/books?id=1Od2d72pPXUC&pg=PA13}}.</ref> Barnard also coined the term ''lods'' as an abstract form of "log-odds",{{sfn|Barnard|1949|p=120,128}} but suggested that "in practice the term 'odds' should normally be used, since this is more familiar in everyday life".{{sfn|Barnard|1949|p=136}}
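In modern notation, the two mappings described above may be written as the log of the odds <math>p/(1-p)</math> and the inverse of the standard normal cumulative distribution function <math>\Phi</math>, each carrying a probability <math>p \in (0, 1)</math> to the real line <math>(-\infty, +\infty)</math>:

<math display="block">\operatorname{logit}(p) = \ln\frac{p}{1-p}, \qquad \operatorname{probit}(p) = \Phi^{-1}(p).</math>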