=== Costs and selection ===
Connectors can have an optional [[floating-point]] cost markup, so that some are "cheaper" to use than others, thus giving preference to certain parses over others.<ref name="intro"/> That is, the total cost of a parse is the sum of the individual costs of the connectors that were used; the cheapest parse indicates the most likely parse. This is used to rank multiple ambiguous parses. The fact that the costs are local to the connectors, rather than being a global property of the algorithm, makes them essentially [[Markov property|Markovian]] in nature.<ref>{{cite conference |author1=John Lafferty |author2=Daniel Sleator |author3=Davey Temperley |title=Grammatical Trigrams: a Probabilistic Model of Link Grammar |conference=Proceedings of the AAAI Conference on Probabilistic Approaches to Natural Language |year=1992 |url=https://www.cs.cmu.edu/afs/cs.cmu.edu/project/link/pub/www/papers/ps/gram3gram.pdf}}</ref><ref>{{cite arXiv |author=Ramon Ferrer-i-Cancho |year=2013 |title=Hubiness, length, crossings and their relationships in dependency trees |class=cs.CL |eprint=1304.4086}}</ref><ref>{{cite journal |author=D. Temperley |year=2008 |title=Dependency length minimization in natural and artificial languages |journal=Journal of Quantitative Linguistics |volume=15 |number=3 |pages=256–282 |doi=10.1080/09296170802159512}}</ref><ref>{{cite book |author=E. Gibson |year=2000 |chapter=The dependency locality theory: A distance-based theory of linguistic complexity |editor1=Marantz, A. |editor2=Miyashita, Y. |editor3=O'Neil, W. |title=Image, Language, Brain: Papers from the first Mind Articulation Project Symposium |publisher=MIT Press |place=Cambridge, MA}}</ref><ref>{{cite journal |author=Haitao Liu |url=http://www.lingviko.net/JCS.pdf |title=Dependency distance as a metric of language comprehension difficulty |year=2008 |journal=Journal of Cognitive Science |volume=9 |number=2 |pages=159–191 |doi=10.17791/jcs.2008.9.2.159}}</ref><ref>{{cite journal |author1=Richard Futrell |author2=Kyle Mahowald |author3=Edward Gibson |title=Large-scale evidence of dependency length minimization in 37 languages |year=2015 |journal=PNAS |volume=112 |number=33 |pages=10336–10341 |doi=10.1073/pnas.1502134112 |doi-access=free |pmid=26240370 |pmc=4547262 |bibcode=2015PNAS..11210336F}}</ref> The assignment of a log-likelihood to linkages allows link grammar to implement the [[Selection (linguistics)|semantic selection]] of predicate-argument relationships. That is, certain constructions, although syntactically valid, are extremely unlikely. In this way, link grammar embodies some of the ideas present in [[operator grammar]]. Because the costs are additive, they behave like the logarithm of the probability (since log-likelihoods are additive), or equivalently, somewhat like the [[Entropy (information theory)|entropy]] (since entropies are additive). This makes link grammar compatible with machine learning techniques such as [[hidden Markov model]]s and the [[Viterbi algorithm]], because the link costs correspond to the link weights in [[Markov network]]s or [[Bayesian network]]s.
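The cost arithmetic described above can be illustrated with a small sketch. This is not the Link Grammar parser's API; the parse names and connector costs below are hypothetical, and the sketch only shows the two properties the text relies on: the total cost of a parse is the sum of its connector costs, and additive costs behave like negative log-probabilities, so the cheapest parse is the most likely one.

```python
import math

# Hypothetical connector costs for three candidate linkages of one sentence.
parses = {
    "parse_a": [0.0, 0.5, 0.0],
    "parse_b": [1.0, 0.5, 2.0],
    "parse_c": [0.0, 0.0, 3.1],
}

def total_cost(costs):
    """Total cost of a parse is the sum of its connector costs."""
    return sum(costs)

def rank(candidates):
    """Cheapest (lowest total cost) parse first, i.e. most likely first."""
    return sorted(candidates, key=lambda p: total_cost(candidates[p]))

def likelihoods(candidates):
    """Read additive costs as negative log-probabilities: P ∝ exp(−cost),
    normalized so the probabilities sum to 1."""
    weights = {p: math.exp(-total_cost(c)) for p, c in candidates.items()}
    z = sum(weights.values())
    return {p: w / z for p, w in weights.items()}

print(rank(parses))  # → ['parse_a', 'parse_c', 'parse_b']
```

Because costs (log-likelihoods) are additive, a dynamic-programming ranker like the Viterbi algorithm can minimize the total cost by summing local connector weights, which is the compatibility with Markov-style models that the text points out.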