== Principles ==
Generative grammar is an umbrella term for a variety of approaches to linguistics. What unites these approaches is the goal of uncovering the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge.<ref name="WasowHandbookUmbrella">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=296, 311|quote="...generative grammar is not so much a theory as a family of theories, or a school of thought... [having] shared assumptions and goals, widely used formal devices, and generally accepted empirical results"}}</ref><ref name=carnie_p5>{{Cite book|last=Carnie|first=Andrew|title=Syntax: A Generative Introduction|author-link=Andrew Carnie|publisher=Wiley-Blackwell|year=2002|isbn=978-0-631-22543-0|page=5}}</ref>

=== Cognitive science ===
Generative grammar studies language as part of [[cognitive science]].
Thus, research in the generative tradition involves formulating and testing hypotheses about the mental processes that allow humans to use language.<ref>{{Cite book|last=Carnie|first=Andrew|author-link=Andrew Carnie|title=Syntax: A Generative Introduction|publisher=Wiley-Blackwell|year=2002|isbn=978-0-631-22543-0|pages=4–6, 8}}</ref><ref name="WasowHandbookMental">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=295–296, 299–300}}</ref><ref name="AdgerCogSci">{{cite book|last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|page=14|isbn=978-0199243709}}</ref> Like other approaches in linguistics, generative grammar engages in [[linguistic description]] rather than [[linguistic prescriptivism|linguistic prescription]].<ref>{{Cite book|last=Carnie|first=Andrew|author-link=Andrew Carnie|title=Syntax: A Generative Introduction|publisher=Wiley-Blackwell|year=2002|isbn=978-0-631-22543-0|page=8}}</ref><ref name="WasowHandbookPreDes">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=295, 297}}</ref>

=== Explicitness and generality ===
Generative grammar proposes models of language consisting of explicit rule systems, which make testable, [[falsifiability|falsifiable]] predictions.
This differs from [[traditional grammar]], where grammatical patterns are often described more loosely.<ref name="WasowHandbookExpGen">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=298–300}}</ref><ref name="AdgerExpGen">{{cite book|last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|pages=14–15|isbn=978-0199243709}}</ref> These models are intended to be parsimonious, capturing generalizations in the data with as few rules as possible. For example, because English [[imperative mood|imperative]] [[tag questions]] obey the same restrictions that second-person [[future tense|future]] [[declarative mood|declarative]] tags do, [[Paul Postal]] proposed that the two constructions are derived from the same underlying structure. By adopting this hypothesis, he was able to capture the restrictions on tags with a single rule.
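The idea that an explicit rule system generates a precise, checkable set of strings can be sketched with a toy grammar (the categories and rules below are invented for illustration, not any published analysis):

```python
from itertools import product

# A toy context-free grammar: explicit rules that generate an exact set
# of strings. The rules and vocabulary are illustrative only.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["cat"], ["dog"]],
    "V":   [["chased"], ["saw"]],
}

def expand(symbol):
    """Yield every terminal string the grammar derives from `symbol`."""
    if symbol not in RULES:  # terminal word
        yield [symbol]
        return
    for rhs in RULES[symbol]:
        # Combine the expansions of each right-hand-side symbol
        for parts in product(*(list(expand(s)) for s in rhs)):
            yield [word for part in parts for word in part]

sentences = {" ".join(words) for words in expand("S")}

# Because the rules are explicit, the grammar's predictions are falsifiable:
# it generates exactly these 32 strings and no others.
print("the cat chased a dog" in sentences)   # True: the grammar generates it
print("cat the chased dog a" in sentences)   # False: it does not
```

A grammar like this is falsified either by a grammatical sentence it fails to generate or by an ungrammatical string it wrongly includes, which is the sense in which explicit rule systems make testable predictions.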
This kind of reasoning is commonplace in generative research.<ref name="WasowHandbookExpGen"/> Particular theories within generative grammar have been expressed using a variety of [[formal system]]s, many of which are modifications or extensions of [[context-free grammar]]s.<ref name="WasowHandbookExpGen"/>

=== Competence versus performance ===
Generative grammar generally distinguishes [[linguistic competence]] from [[linguistic performance]].<ref name="WasowHandbookCompPerf">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|pages=297–298}}</ref> Competence is the collection of subconscious rules that one knows when one knows a language; performance is the system that puts these rules to use.<ref name="WasowHandbookCompPerf"/><ref>{{cite book|last=Pritchett|first=Bradley|year=1992|title=Grammatical competence and parsing performance|publisher=University of Chicago Press|page=2|isbn=0-226-68442-3}}</ref> This distinction is related to the broader notion of [[David Marr (neuroscientist)#Levels_of_analysis|Marr's levels]] used in other cognitive sciences, with competence corresponding to Marr's computational level.<ref>{{cite book|last=Marr|first=David|author-link=David Marr (neuroscientist)|year=1982|title=Vision|publisher=MIT Press|isbn=978-0262514620|page=28}}</ref> For example, generative theories generally provide competence-based explanations for why [[English language|English]] speakers would judge the sentence in (1)<!--This refers to "*That cats is eating the mouse". Please update labels if necessary.--> as [[acceptability (linguistics)|odd]].
In these explanations, the sentence would be [[ungrammatical]] because the rules of English only generate sentences where [[demonstrative]]s [[Agreement (linguistics)|agree]] with the [[grammatical number]] of their associated [[noun]].<ref name="AdgerCompPerf">{{cite book|last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|pages=4–7, 17|isbn=978-0199243709}}</ref>

:(1) *That cats is eating the mouse.

By contrast, generative theories generally provide performance-based explanations for the oddness of [[center embedding]] sentences like the one in (2).<!--This refers to "The cat that the dog that the man fed chased meowed." Please update labels if necessary.--> According to such explanations, the grammar of English could in principle generate such sentences, but doing so in practice is so taxing on [[working memory]] that the sentence ends up being [[parsing|unparsable]].<ref name="AdgerCompPerf"/><ref name="DillonMomaSlides">{{citation|url=https://shotam.github.io/LING611_slides/LING611_day1.pdf|last1=Dillon|first1=Brian|last2=Momma|first2=Shota|title=Psychological background to linguistic theories|year=2021|type=Course notes}}</ref>

:(2) *The cat that the dog that the man fed chased meowed.

In general, performance-based explanations deliver a simpler theory of grammar at the cost of additional assumptions about memory and parsing.
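The competence-based account of (1) amounts to a rule that every generated sentence must satisfy. A deliberately simplified sketch (the demonstrative lists are real English, but the plural test is a crude stand-in for genuine morphological analysis):

```python
# A minimal sketch of a competence-style agreement rule, for illustration:
# English demonstratives must agree in number with their noun, so a grammar
# encoding this rule simply never generates strings like "*that cats".

SINGULAR_DEMONSTRATIVES = {"this", "that"}
PLURAL_DEMONSTRATIVES = {"these", "those"}

def demonstrative_agrees(demonstrative: str, noun: str) -> bool:
    """Check number agreement between a demonstrative and its noun.

    Treating any noun ending in "s" as plural is a toy heuristic,
    not a real analysis of English morphology.
    """
    noun_is_plural = noun.endswith("s")
    if demonstrative in SINGULAR_DEMONSTRATIVES:
        return not noun_is_plural
    if demonstrative in PLURAL_DEMONSTRATIVES:
        return noun_is_plural
    raise ValueError(f"not a demonstrative: {demonstrative!r}")

print(demonstrative_agrees("that", "cat"))    # True: "that cat" is generated
print(demonstrative_agrees("that", "cats"))   # False: "*that cats" is excluded
```

On this kind of account the oddness of (1) needs no appeal to memory or parsing: the grammar itself never produces the string, which is what makes it a competence-based rather than performance-based explanation.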
As a result, the choice between a competence-based explanation and a performance-based explanation for a given phenomenon is not always obvious and can require investigating whether the additional assumptions are supported by independent evidence.<ref name="DillonMomaSlides"/><ref>{{cite encyclopedia|title=Deriving competing predictions from grammatical approaches and reductionist approaches to island effects|encyclopedia=Experimental syntax and island effects|year=2013|last1=Sprouse|first1=Jon|last2=Wagers|first2=Matt|last3=Phillips|first3=Colin|author-link3=Colin Phillips|editor-last1=Sprouse|editor-first1=Jon|editor-last2=Hornstein|editor-first2=Norbert|editor-link2=Norbert Hornstein|publisher=Cambridge University Press|doi=10.1017/CBO9781139035309.002}}</ref> For example, while many generative models of syntax explain [[syntactic island|island effects]] by positing constraints within the grammar, it has also been argued that some or all of these constraints are in fact the result of limitations on performance.<ref>{{cite encyclopedia|title=On the nature of island constraints I: Language processing and reductionist accounts|encyclopedia=Experimental syntax and island effects|year=2013|last=Phillips|first=Colin|editor-last1=Sprouse|editor-first1=Jon|editor-last2=Hornstein|editor-first2=Norbert|publisher=Cambridge University Press|url=http://www.colinphillips.net/wp-content/uploads/2014/08/phillips2013_islands1.pdf|doi=10.1017/CBO9781139035309.005}}</ref><ref>{{cite encyclopedia|title=Islands in the grammar? Standards of evidence|encyclopedia=Experimental syntax and island effects|year=2013|last1=Hofmeister|first1=Philip|last2=Staum Casasanto|first2=Laura|last3=Sag|first3=Ivan|author-link3=Ivan Sag|editor-last1=Sprouse|editor-first1=Jon|editor-last2=Hornstein|editor-first2=Norbert|publisher=Cambridge University Press|doi=10.1017/CBO9781139035309.004}}</ref> Non-generative approaches often do not posit any distinction between competence and performance. 
For instance, [[usage-based models of language]] assume that grammatical patterns arise as the result of usage.<ref>{{cite book|last1=Evans|first1=Vyvyan|author-link=Vyvyan Evans|last2=Green|first2=Melanie|year=2006|title=Cognitive Linguistics: An Introduction|publisher=Edinburgh University Press|pages=108–111|isbn=0-7486-1832-5}}</ref>

=== Innateness and universality ===
A major goal of generative research is to determine which aspects of linguistic competence are innate and which are not. Within generative grammar, it is generally accepted that at least some [[domain-specific]] aspects are innate, and the term "universal grammar" is often used as a placeholder for whichever those turn out to be.<ref name="WasowHandbookUniversality">{{cite encyclopedia|title=Generative Grammar|encyclopedia=The Handbook of Linguistics|year=2003|last=Wasow|first=Thomas|author-link=Tom Wasow|editor-last1=Aronoff|editor-first1=Mark|editor-last2=Rees-Miller|editor-first2=Janie|publisher=Blackwell|url=https://www.blackwellpublishing.com/content/BPL_Images/Content_store/WWW_Content/9780631204978/12.pdf|doi=10.1002/9780470756409.ch12|page=299}}</ref><ref name="PesetskyUG">{{cite encyclopedia|title=Linguistic universals and universal grammar|encyclopedia=The MIT encyclopedia of the cognitive sciences|year=1999|last=Pesetsky|first=David|author-link=David Pesetsky|editor-last1=Wilson|editor-first1=Robert|editor-last2=Keil|editor-first2=Frank|publisher=MIT Press|doi=10.7551/mitpress/4660.001.0001|pages=476–478}}</ref> The idea that at least some aspects are innate is motivated by [[poverty of the stimulus]] arguments.<ref name="AdgerPOS">{{cite book|last=Adger|first=David|author-link=David Adger|year=2003|title=Core syntax: A minimalist approach|publisher=Oxford University Press|pages=8–11|isbn=978-0199243709}}</ref><ref name="Lasnik&LidzPOS">{{cite encyclopedia|title=The Argument from the Poverty of the Stimulus|last1=Lasnik|first1=Howard|author-link1=Howard Lasnik|last2=Lidz|first2=Jeffrey|author-link2=Jeffrey Lidz|encyclopedia=The Oxford Handbook of Universal Grammar|year=2017|editor-last=Roberts|editor-first=Ian|editor-link=Ian Roberts (linguist)|publisher=Oxford University Press|url=https://jefflidz.com/Docs/LasnikLidz2016.pdf}}</ref> For example, one famous poverty of the stimulus argument concerns the acquisition of [[yes-no question]]s in English. This argument starts from the observation that children only make mistakes compatible with rules targeting [[hierarchical structure (linguistics)|hierarchical structure]], even though the examples they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange [[constituent (linguistics)|constituents]] in [[Tree (data structure)|tree structure]]s. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they still have to figure out what those rules are.<ref name="AdgerPOS"/><ref name="Lasnik&LidzPOS"/><ref>{{cite journal|last1=Crain|first1=Stephen|author-link1=Stephen Crain|last2=Nakayama|first2=Mineharu|year=1987|title=Structure dependence in grammar formation|journal=Language|volume=63|issue=3|doi=10.2307/415004}}</ref>

The empirical basis of poverty of the stimulus arguments has been challenged by [[Geoffrey Pullum]] and others, leading to back-and-forth debate in the [[language acquisition]] literature.<ref name="PullumScholz">{{cite journal|last1=Pullum|first1=Geoff|author-link1=Geoff Pullum|last2=Scholz|first2=Barbara|author-link2=Barbara Scholz|date=2002|title=Empirical assessment of stimulus poverty arguments|journal=The Linguistic Review|volume=19|issue=1–2|pages=9–50|doi=10.1515/tlir.19.1-2.9}}</ref><ref name="LegateYang">{{cite journal|last1=Legate|first1=Julie Anne|author-link1=Julie Anne Legate|last2=Yang|first2=Charles|author-link2=Charles Yang (linguist)|date=2002|title=Empirical re-assessment of stimulus poverty arguments|journal=The Linguistic Review|volume=19|issue=1–2|pages=151–162|doi=10.1515/tlir.19.1-2.9|url=https://www.ling.upenn.edu/~ycharles/papers/tlr-final.pdf}}</ref> Recent work has also suggested that some [[recurrent neural network]] architectures are able to learn hierarchical structure without an explicit constraint.<ref>{{cite journal|last1=McCoy|first1=R. Thomas|last2=Frank|first2=Robert|last3=Linzen|first3=Tal|year=2018|title=Revisiting the poverty of the stimulus: hierarchical generalization without a hierarchical bias in recurrent neural networks|journal=Proceedings of the 40th Annual Conference of the Cognitive Science Society|pages=2093–2098|url=https://tallinzen.net/media/papers/mccoy_frank_linzen_2018_cogsci.pdf}}</ref>

Within generative grammar, there are a variety of theories about what universal grammar consists of.
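The structure-dependence contrast behind the yes-no question argument above can be sketched by comparing the two candidate rules directly (an illustrative toy: a real structure-dependent rule operates on a syntactic parse tree, so the main-clause auxiliary's position is supplied by hand here rather than computed):

```python
# Two candidate yes-no question rules for English, applied to word lists.
# The declarative sentence is a standard example from this literature.

def linear_rule(words):
    """Linear-order rule: front the first 'is', ignoring structure."""
    i = words.index("is")
    return [words[i]] + words[:i] + words[i + 1:]

def structural_rule(words, main_aux_index):
    """Structure-dependent rule: front the main clause's auxiliary.

    The index would come from a syntactic parse; it is hand-supplied
    in this sketch because the code has no parser.
    """
    i = main_aux_index
    return [words[i]] + words[:i] + words[i + 1:]

declarative = "the man who is tall is happy".split()

# The linear rule fronts the embedded auxiliary, yielding an
# ungrammatical string; the structural rule fronts the main-clause
# auxiliary (index 5), yielding the correct question.
print(" ".join(linear_rule(declarative)))         # is the man who tall is happy
print(" ".join(structural_rule(declarative, 5)))  # is the man who is tall happy
```

The two rules agree on simple sentences like "the man is happy" and diverge only on sentences with an embedded clause, which is why the argument turns on how often children encounter the diverging cases.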
One notable hypothesis, proposed by [[Hagit Borer]], holds that the fundamental syntactic operations are universal and that all variation arises from different [[feature (linguistics)|feature]]-specifications in the [[mental lexicon|lexicon]].<ref name="PesetskyUG"/><ref>{{cite encyclopedia|title=Parameters|encyclopedia=The Oxford Handbook of Linguistic Minimalism|year=2012|last=Gallego|first=Ángel|editor-last=Boeckx|editor-first=Cedric|publisher=Oxford University Press|doi=10.1093/oxfordhb/9780199549368.013.0023}}</ref> On the other hand, a strong hypothesis adopted in some variants of [[Optimality Theory]] holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.<ref name="PesetskyUG"/><ref name="McCarthyOT">{{cite book|last=McCarthy|first=John|year=2008|title=Doing optimality theory|publisher=Wiley|pages=1–3|isbn=978-1-4051-5136-8}}</ref> In a 2002 paper, [[Noam Chomsky]], [[Marc Hauser]] and [[W. Tecumseh Fitch]] proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.<ref>{{cite journal|last1=Hauser|first1=Marc|author-link1=Marc Hauser|last2=Chomsky|first2=Noam|author-link2=Noam Chomsky|last3=Fitch|first3=W. Tecumseh|author-link3=W. Tecumseh Fitch|year=2002|title=The faculty of language: what is it, who has it, and how did it evolve?|journal=Science|volume=298|pages=1569–1579|doi=10.1126/science.298.5598.1569}}</ref>

In day-to-day research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.<ref name="WasowHandbookUniversality"/>