Operant conditioning
==History==
[[File:PSM V80 D211 Edward Lee Thorndike.png|thumb|264x264px|[[Edward Thorndike|Edward Lee Thorndike]] in 1912]]

===Thorndike's law of effect===
{{Main|Law of effect}}
Operant conditioning, sometimes called ''instrumental learning'', was first extensively studied by [[Edward L. Thorndike]] (1874–1949), who observed the behavior of cats trying to escape from home-made puzzle boxes.<ref name=":0">{{cite journal|last1=Thorndike|first1=E.L.|year=1901|title=Animal intelligence: An experimental study of the associative processes in animals|journal=Psychological Review Monograph Supplement|volume=2|pages=1–109}}</ref> A cat could escape from the box by a simple response such as pulling a cord or pushing a pole, but when first constrained the cats took a long time to get out. With repeated trials, ineffective responses occurred less frequently and successful responses occurred more frequently, so the cats escaped more and more quickly.<ref name=":0" /> Thorndike generalized this finding in his [[law of effect]], which states that behaviors followed by satisfying consequences tend to be repeated, while those that produce unpleasant consequences are less likely to be repeated. In short, some consequences ''strengthen'' behavior and some consequences ''weaken'' behavior. By plotting escape time against trial number, Thorndike produced the first known animal [[learning curve]]s.<ref>Miltenberger, R. G. ''Behavior Modification: Principles and Procedures''. [[Thomson/Wadsworth]], 2008. p. 9.</ref> Humans appear to learn many simple behaviors through the sort of process studied by Thorndike, now called operant conditioning. That is, responses are retained when they lead to a successful outcome and discarded when they do not, or when they produce aversive effects.
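Thorndike's learning curves fell gradually rather than dropping all at once, consistent with the successful response being strengthened a little on each trial. A minimal sketch of this idea (an illustrative model, not Thorndike's data or procedure): suppose the cat picks among four candidate responses in proportion to their strengths, and each escape doubles the strength of the one effective response; the expected number of attempts per trial then declines smoothly toward one.

```python
# Illustrative model of a falling learning curve (not Thorndike's data):
# four candidate responses, chosen in proportion to their strengths;
# each successful trial doubles the strength of the one effective response.

def expected_attempts(effective_strength, other_strength_total=3.0):
    """Expected tries per trial when choice is strength-proportional."""
    p_success = effective_strength / (effective_strength + other_strength_total)
    return 1.0 / p_success  # mean of a geometric distribution

# Strength of the effective response after trial k is 2**k: 1, 2, 4, ...
curve = [expected_attempts(2.0 ** trial) for trial in range(10)]
print(curve)  # starts at 4 attempts, declines smoothly toward 1
```

The monotone but ever-flattening decline is the characteristic shape of the curves Thorndike plotted.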
Such learning usually happens without being planned by any "teacher", but operant conditioning has been used by parents in teaching their children for thousands of years.<ref name="parenting">Miltenberger, R. G., & Crosland, K. A. (2014). Parenting. In ''The Wiley Blackwell Handbook of Operant and Classical Conditioning'' (pp. 509–531). Wiley-Blackwell. {{doi|10.1002/9781118468135.ch20}}</ref>

===B. F. Skinner===
[[File:B.F. Skinner at Harvard circa 1950.jpg|thumb|219x219px|B.F. Skinner at the Harvard Psychology Department, circa 1950]]
{{Main|B. F. Skinner}}
[[B.F. Skinner]] (1904–1990) is often referred to as the father of operant conditioning, and his work is frequently cited in connection with the topic. His 1938 book ''The Behavior of Organisms: An Experimental Analysis''<ref>{{cite book |last1=Skinner |first1=B. F. |title=The Behavior of Organisms: An Experimental Analysis |date=1938 |publisher=Appleton-Century-Crofts |location=New York |url=https://openlibrary.org/works/OL1725489W/The_behavior_of_organisms_an_experimental_analysis}}</ref> initiated his lifelong study of operant conditioning and its application to human and animal behavior. Following the ideas of [[Ernst Mach]], Skinner rejected Thorndike's appeal to unobservable mental states such as satisfaction, building his analysis instead on observable behavior and its equally observable consequences.<ref>{{cite journal|last1=Skinner|first1=B. F.|s2cid=17811847|year=1950|title=Are theories of learning necessary?|journal=Psychological Review |volume=57|issue=4|pages=193–216|doi=10.1037/h0054367|pmid=15440996}}</ref> Skinner believed that classical conditioning was too simplistic to describe something as complex as human behavior; operant conditioning, in his view, described human behavior better because it examined the causes and effects of intentional behavior.
To implement his empirical approach, Skinner invented the [[operant conditioning chamber]], or "''Skinner box''", in which subjects such as pigeons and rats were isolated and could be exposed to carefully controlled stimuli. Unlike Thorndike's puzzle box, this arrangement allowed the subject to make one or two simple, repeatable responses, and the rate of such responses became Skinner's primary behavioral measure.<ref>Schacter, Daniel L., Daniel T. Gilbert, and Daniel M. Wegner. "B. F. Skinner: The Role of Reinforcement and Punishment", subsection in: ''Psychology''; Second Edition. New York: Worth, Incorporated, 2011, 278–288.</ref> Another invention, the cumulative recorder, produced a graphical record from which these response rates could be estimated. These records were the primary data that Skinner and his colleagues used to explore the effects of various reinforcement schedules on response rate.<ref name="ReferenceA">Ferster, C. B. & Skinner, B. F. ''Schedules of Reinforcement''. New York: Appleton-Century-Crofts, 1957.</ref> A reinforcement schedule may be defined as "any procedure that delivers reinforcement to an organism according to some well-defined rule".<ref>{{cite journal|last=Staddon|first=J. E. R|author2=D. T Cerutti|date=February 2003|title=Operant Conditioning|journal=Annual Review of Psychology|volume=54|issue=1|pages=115–144|doi=10.1146/annurev.psych.54.101601.145124|pmc=1473025|pmid=12415075}}</ref> The effects of schedules became, in turn, the basic findings from which Skinner developed his account of operant conditioning. He also drew on many less formal observations of human and animal behavior.<ref>Mecca Chiesa (2004). ''Radical Behaviorism: The Philosophy and the Science''.</ref>

Many of Skinner's writings are devoted to the application of operant conditioning to human behavior.<ref>Skinner, B. F. ''Science and Human Behavior''. New York: Macmillan, 1953.</ref> In 1948 he published ''[[Walden Two]]'', a fictional account of a peaceful, happy, productive community organized around his conditioning principles.<ref>Skinner, B.F. (1948). ''Walden Two''. Indianapolis: Hackett.</ref> In 1957 Skinner published ''[[Verbal Behavior (book)|Verbal Behavior]]'',<ref>Skinner, B. F. ''Verbal Behavior''. New York: Appleton-Century-Crofts, 1957.</ref> which extended the principles of operant conditioning to language, a form of human behavior that had previously been analyzed quite differently by linguists and others. Skinner defined new functional relationships such as "mands" and "tacts" to capture some essentials of language, but he introduced no new principles, treating verbal behavior like any other behavior controlled by its consequences, which included the reactions of the speaker's audience.
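The definition of a reinforcement schedule as "any procedure that delivers reinforcement to an organism according to some well-defined rule" can be made concrete with a sketch of two classic rules from the operant literature, fixed-ratio and fixed-interval. This is an illustrative sketch, not code from any cited source:

```python
# Sketch of two classic reinforcement-schedule rules (illustrative only):
# a schedule is simply a well-defined rule mapping responses to reinforcers.

class FixedRatio:
    """Reinforce every n-th response (e.g. FR-5: every 5th lever press)."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count == self.n:
            self.count = 0
            return True   # deliver reinforcer
        return False

class FixedInterval:
    """Reinforce the first response after a fixed time has elapsed."""
    def __init__(self, interval):
        self.interval = interval
        self.last_reinforced = 0.0

    def respond(self, t):
        if t - self.last_reinforced >= self.interval:
            self.last_reinforced = t
            return True   # deliver reinforcer
        return False

fr5 = FixedRatio(5)
outcomes = [fr5.respond() for _ in range(10)]
print(outcomes)  # reinforcement on the 5th and 10th responses only
```

Ferster and Skinner's basic finding was that each such rule produces its own characteristic pattern of responding, e.g. the post-reinforcement pause under fixed-ratio schedules.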