Computational linguistics
==Modeling language acquisition==
During [[language acquisition]], children are largely exposed only to positive evidence:<ref>Bowerman, M. (1988). [http://pubman.mpdl.mpg.de/pubman/item/escidoc:468143:4/component/escidoc:532427/bowerman_1988_The-No.pdf The "no negative evidence" problem: How do children avoid constructing an overly general grammar]. In ''Explaining Language Universals''.</ref> they receive evidence for which forms are correct, but no evidence for which forms are incorrect.<ref name="autogenerated1971">Braine, M.D.S. (1971). On two types of models of the internalization of grammars. In D.I. Slobin (Ed.), ''The ontogenesis of grammar: A theoretical perspective''. New York: Academic Press.</ref> This was a limitation for models of the late 1980s, since the [[deep learning]] models available today had not yet been developed.<ref name="powers1989">Powers, D.M.W. & Turk, C.C.R. (1989). ''Machine Learning of Natural Language''. Springer-Verlag. {{ISBN|978-0-387-19557-5}}.</ref> It has been shown that languages can be learned when simple input is presented incrementally, as the child develops better memory and a longer attention span,<ref name="autogenerated1993">{{cite journal|title= Learning and development in neural networks: The importance of starting small|journal= Cognition|volume= 48|issue= 1|pages= 71–99|doi= 10.1016/0010-0277(93)90058-4|pmid= 8403835|year= 1993|last1= Elman|first1= Jeffrey L.|s2cid= 2105042|citeseerx= 10.1.1.135.4937}}</ref> which would explain the long period of [[language acquisition]] in human infants and children.<ref name="autogenerated1993"/>

Robots have been used to test linguistic theories.<ref>{{cite journal | last1 = Salvi | first1 = G. | last2 = Montesano | first2 = L. | last3 = Bernardino | first3 = A. | last4 = Santos-Victor | first4 = J. | year = 2012 | title = Language bootstrapping: learning word meanings from the perception-action association | journal = IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics | volume = 42 | issue = 3| pages = 660–71 | doi = 10.1109/TSMCB.2011.2172420 | pmid = 22106152 | arxiv = 1711.09714 | s2cid = 977486 }}</ref> Designed to learn as children might, these robots were based on an [[affordance]] model in which mappings between actions, perceptions, and effects were created and linked to spoken words. Crucially, the robots were able to acquire functioning word-to-meaning mappings without needing grammatical structure.

Using the [[Price equation]] and [[Pólya urn]] dynamics, researchers have created a system that not only predicts future linguistic evolution but also gives insight into the evolutionary history of modern-day languages.<ref>{{cite journal|author1=Gong, T.|author2=Shuai, L.|author3=Tamariz, M.|author4=Jäger, G.|name-list-style=amp|year=2012|title=Studying Language Change Using Price Equation and Pólya-urn Dynamics|editor=E. Scalas|journal=PLOS ONE|volume=7|issue=3|page=e33171|doi=10.1371/journal.pone.0033171|pmid=22427981|pmc=3299756|bibcode=2012PLoSO...733171G|doi-access=free}}</ref>
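The reinforcement dynamic behind a [[Pólya urn]] can be illustrated with a short simulation (an illustrative sketch only, not the model from the cited study): each time a speaker "hears" a linguistic variant drawn in proportion to its current frequency, a copy is added back to the urn, so variants that happen to be used often early on become progressively more likely to be reused.

```python
import random

def polya_urn(counts, steps, rng=None):
    """Simulate a Pólya urn: counts[i] is the number of past uses
    of linguistic variant i; each step draws a variant in proportion
    to its count and adds one more copy (rich-get-richer dynamics)."""
    rng = rng or random.Random(0)
    counts = list(counts)
    for _ in range(steps):
        r = rng.randrange(sum(counts))
        # Pick a variant with probability proportional to its count.
        for i, c in enumerate(counts):
            if r < c:
                counts[i] += 1  # the drawn variant is reinforced
                break
            r -= c
    return counts

# Two competing variants, each heard once initially.
final = polya_urn([1, 1], 10_000)
freqs = [c / sum(final) for c in final]
```

Runs of this process from the same starting state converge to different stable variant frequencies, which is why urn dynamics are used to model how competing linguistic forms can drift toward divergent outcomes in different speech communities.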