Probabilistic context-free grammar
In [[theoretical linguistics]] and [[computational linguistics]], '''probabilistic context-free grammars''' ('''PCFGs''') extend [[context-free grammar]]s, similar to how [[hidden Markov model]]s extend [[regular grammar]]s. Each [[Formal grammar#The syntax of grammars|production]] is assigned a probability, and the probability of a derivation (parse) is the product of the probabilities of the productions used in that derivation. These probabilities can be viewed as parameters of the model, and for large problems it is convenient to learn them via [[machine learning]]. A probabilistic grammar's validity is constrained by the context of its training dataset.

PCFGs originated in [[grammar theory]] and have applications in areas as diverse as [[natural language processing]], the study of the structure of [[RNA]] molecules, and the design of [[programming language]]s. Designing an efficient PCFG involves weighing scalability against generality, and issues such as grammar ambiguity must be resolved. The grammar design affects the accuracy of results, and parsing algorithms differ in their time and memory requirements.
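As a minimal sketch of the product-of-rule-probabilities definition above, assuming a toy grammar with hand-picked, purely illustrative rule probabilities, the probability of one parse can be computed by multiplying the probabilities of the rules it uses:

<syntaxhighlight lang="python">
# Toy PCFG: each production (left-hand side, right-hand side) carries a probability.
# The probabilities below are hypothetical, chosen only for illustration;
# note that the rules for each nonterminal sum to 1.
pcfg = {
    ("S",   ("NP", "VP")): 1.0,
    ("NP",  ("she",)):     0.6,
    ("NP",  ("Det", "N")): 0.4,
    ("VP",  ("V", "NP")):  1.0,
    ("Det", ("the",)):     1.0,
    ("N",   ("book",)):    1.0,
    ("V",   ("reads",)):   1.0,
}

def derivation_probability(rules):
    """Probability of a derivation = product of the probabilities of its rules."""
    p = 1.0
    for rule in rules:
        p *= pcfg[rule]
    return p

# One derivation of "she reads the book":
parse = [
    ("S",   ("NP", "VP")),
    ("NP",  ("she",)),
    ("VP",  ("V", "NP")),
    ("V",   ("reads",)),
    ("NP",  ("Det", "N")),
    ("Det", ("the",)),
    ("N",   ("book",)),
]

print(derivation_probability(parse))  # 1.0 * 0.6 * 1.0 * 1.0 * 0.4 * 1.0 * 1.0 = 0.24
</syntaxhighlight>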