=== Knowledge representation ===
[[File:General Formal Ontology.svg|thumb|upright=1.2|An ontology represents knowledge as a set of concepts within a domain and the relationships between those concepts.]]
[[Knowledge representation]] and [[knowledge engineering]]<ref>[[Knowledge representation]] and [[knowledge engineering]]: {{Harvtxt|Russell|Norvig|2021|loc=chpt. 10}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=23–46, 69–81, 169–233, 235–277, 281–298, 319–345}}, {{Harvtxt|Luger|Stubblefield|2004|pp=227–243}}, {{Harvtxt|Nilsson|1998|loc=chpt. 17.1–17.4, 18}}</ref> allow AI programs to answer questions intelligently and make deductions about real-world facts. Formal knowledge representations are used in content-based indexing and retrieval,{{Sfnp|Smoliar|Zhang|1994}} scene interpretation,{{Sfnp|Neumann|Möller|2008}} clinical decision support,{{Sfnp|Kuperman|Reichley|Bailey|2006}} knowledge discovery (mining "interesting" and actionable inferences from large [[database]]s),{{Sfnp|McGarry|2005}} and other areas.{{Sfnp|Bertini|Del Bimbo|Torniai|2006}} A [[knowledge base]] is a body of knowledge represented in a form that can be used by a program. An [[ontology (information science)|ontology]] is the set of objects, relations, concepts, and properties used by a particular domain of knowledge.{{Sfnp|Russell|Norvig|2021|p=272}} Knowledge bases need to represent things such as objects, properties, categories, and relations between objects;<ref>Representing categories and relations: [[Semantic network]]s, [[description logic]]s, [[Inheritance (object-oriented programming)|inheritance]] (including [[Frame (artificial intelligence)|frames]] and [[Scripts (artificial intelligence)|scripts]]): {{Harvtxt|Russell|Norvig|2021|loc=§10.2 & 10.5}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=174–177}}, {{Harvtxt|Luger|Stubblefield|2004|pp=248–258}}, {{Harvtxt|Nilsson|1998|loc=chpt. 18.3}}</ref> situations, events, states, and time;<ref>Representing events and time: [[Situation calculus]], [[event calculus]], [[fluent calculus]] (including solving the [[frame problem]]): {{Harvtxt|Russell|Norvig|2021|loc=§10.3}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=281–298}}, {{Harvtxt|Nilsson|1998|loc=chpt. 18.2}}</ref> causes and effects;<ref>[[Causality#Causal calculus|Causal calculus]]: {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=335–337}}</ref> knowledge about knowledge (what we know about what other people know);<ref>Representing knowledge about knowledge: Belief calculus, [[modal logic]]s: {{Harvtxt|Russell|Norvig|2021|loc=§10.4}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=275–277}}</ref> [[default reasoning]] (things that humans assume are true until they are told differently and will remain true even when other facts are changing);<ref name="Default reasoning">[[Default reasoning]], [[frame problem]], [[default logic]], [[non-monotonic logic]]s, [[circumscription (logic)|circumscription]], [[closed world assumption]], [[abductive reasoning|abduction]]: {{Harvtxt|Russell|Norvig|2021|loc=§10.6}}, {{Harvtxt|Poole|Mackworth|Goebel|1998|pp=248–256, 323–335}}, {{Harvtxt|Luger|Stubblefield|2004|pp=335–363}}, {{Harvtxt|Nilsson|1998|loc=~18.3.3}} (Poole ''et al.'' place abduction under "default reasoning"; Luger ''et al.'' place it under "uncertain reasoning").</ref> and many other aspects and domains of knowledge.
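The ideas above — a knowledge base of objects, categories, and relations, with deductions drawn from them — can be illustrated with a minimal sketch. This is not code from any cited system; the class name, relation names, and the example facts are all illustrative assumptions. It stores (subject, relation, object) triples and inherits properties along "is_a" links, the kind of inference a semantic network or frame system supports:

```python
# Illustrative sketch (hypothetical names): a tiny knowledge base of
# (subject, relation, object) triples with property inheritance along
# "is_a" links, as in a simple semantic network.

class KnowledgeBase:
    def __init__(self):
        self.triples = set()

    def tell(self, subj, rel, obj):
        """Assert a fact."""
        self.triples.add((subj, rel, obj))

    def ask(self, subj, rel):
        """Objects related to subj by rel, inherited through is_a links."""
        results = {o for s, r, o in self.triples if s == subj and r == rel}
        # Recurse into parent categories (assumes the is_a graph is acyclic).
        for s, r, parent in self.triples:
            if s == subj and r == "is_a":
                results |= self.ask(parent, rel)
        return results

kb = KnowledgeBase()
kb.tell("Tweety", "is_a", "bird")
kb.tell("bird", "is_a", "animal")
kb.tell("bird", "can", "fly")
kb.tell("animal", "needs", "food")

print(kb.ask("Tweety", "can"))    # {'fly'} -- inherited from bird
print(kb.ask("Tweety", "needs"))  # {'food'} -- inherited via bird -> animal
```

Note that this sketch cannot retract an inherited conclusion: adding a penguin that cannot fly would require the non-monotonic machinery (default logic, circumscription) referenced above.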
Among the most difficult problems in knowledge representation are the breadth of [[Commonsense knowledge (artificial intelligence)|commonsense knowledge]] (the set of atomic facts that the average person knows is enormous);<ref name="Breadth of commonsense knowledge">Breadth of commonsense knowledge: {{Harvtxt|Lenat|Guha|1989|loc=Introduction}}, {{Harvtxt|Crevier|1993|pp=113–114}}, {{Harvtxt|Moravec|1988|p=13}}, {{Harvtxt|Russell|Norvig|2021|pp=241, 385, 982}} ([[qualification problem]])</ref> and the sub-symbolic form of most commonsense knowledge (much of what people know is not represented as "facts" or "statements" that they could express verbally).<ref name="Psychological evidence of the prevalence of sub"/> There is also the difficulty of [[knowledge acquisition]], the problem of obtaining knowledge for AI applications.{{Efn|It is among the reasons that [[expert system]]s proved to be inefficient for capturing knowledge.{{Sfnp|Newquist|1994|p=296}}{{Sfnp|Crevier|1993|pp=204–208}}}}