===Mathematical question answering===
An open-source, math-aware question answering system called [[MathQA]], based on [[Ask Platypus]] and [[Wikidata]], was published in 2018.<ref name="SchubotzScharpf2018">{{cite journal|author1=Moritz Schubotz|author2=Philipp Scharpf|display-authors=et al.|title=Introducing MathQA: a Math-Aware question answering system|doi=10.1108/IDD-06-2018-0022|arxiv=1907.01642|date=12 September 2018|journal=Information Discovery and Delivery|volume=46|issue=4|pages=214–224|publisher=Emerald Publishing Limited|doi-access=free}}</ref> MathQA takes an English or Hindi natural-language question as input and returns a mathematical formula retrieved from Wikidata as a succinct answer, translated into a computable form that allows the user to insert values for the variables. The system retrieves names and values of variables and common constants from Wikidata where available. The authors claim that the system outperforms a commercial computational mathematical knowledge engine on a test set.<ref name="SchubotzScharpf2018" /> MathQA is hosted by Wikimedia at https://mathqa.wmflabs.org/. In 2022, it was extended to answer 15 math question types.<ref>Scharpf, P., Schubotz, M., Gipp, B. [https://www.gipp.com/wp-content/papercite-data/pdf/scharpf2022.pdf Mining Mathematical Documents for Question Answering via Unsupervised Formula Labeling]. ACM/IEEE Joint Conference on Digital Libraries, 2022.</ref> MathQA methods need to combine natural language and formula language. One possible approach is supervised annotation via [[Entity Linking]].
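The "computable form" step described above can be sketched as follows. This is a minimal illustration, not MathQA's actual interface: the formula string is hard-coded here, whereas MathQA retrieves it from Wikidata, and the function name is an assumption made for this sketch.

```python
# Minimal sketch of translating a retrieved formula into a computable form
# and substituting user-supplied values for its variables. The formula is
# hard-coded here; MathQA itself retrieves it from Wikidata. All names are
# illustrative assumptions, not MathQA's actual API.

def evaluate_formula(rhs_expr: str, values: dict) -> float:
    """Evaluate the right-hand side of a formula with the given variable values."""
    # eval with an empty builtins namespace is sufficient for this trusted sketch.
    return eval(rhs_expr, {"__builtins__": {}}, dict(values))

# Mass-energy equivalence E = m*c**2 with m = 2 kg and c in m/s:
energy = evaluate_formula("m * c**2", {"m": 2.0, "c": 299_792_458})
```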
The "ARQMath Task" at [[CLEF]] 2020<ref name=":1">{{Citation|last1=Zanibbi|first1=Richard|date=2020|url=http://dx.doi.org/10.1007/978-3-030-58219-7_15|pages=169–193|place=Cham|publisher=Springer International Publishing|isbn=978-3-030-58218-0|access-date=2021-06-09|last2=Oard|first2=Douglas W.|last3=Agarwal|first3=Anurag|last4=Mansouri|first4=Behrooz|title=Experimental IR Meets Multilinguality, Multimodality, and Interaction |chapter=Overview of ARQMath 2020: CLEF Lab on Answer Retrieval for Questions on Math |series=Lecture Notes in Computer Science |volume=12260 |doi=10.1007/978-3-030-58219-7_15|s2cid=221351064 |url-access=subscription}}</ref> was launched to address the problem of linking newly posted questions on the platform Math [[Stack Exchange]] to existing ones that the community has already answered. Providing hyperlinks to already answered, semantically related questions helps users get answers sooner, but it is a challenging problem because determining semantic relatedness is not trivial.<ref name=":0">{{Cite book |last=Scharpf |display-authors=etal |url=http://worldcat.org/oclc/1228449497 |title=ARQMath Lab: An Incubator for Semantic Formula Search in zbMATH Open? |date=2020-12-04 |oclc=1228449497}}</ref> The lab was motivated by the fact that 20% of mathematical queries in general-purpose search engines are expressed as well-formed questions.<ref>{{Cite book|last1=Mansouri|first1=Behrooz|last2=Zanibbi|first2=Richard|last3=Oard|first3=Douglas W.|title=2019 ACM/IEEE Joint Conference on Digital Libraries (JCDL) |chapter=Characterizing Searches for Mathematical Concepts |date=June 2019|chapter-url=http://dx.doi.org/10.1109/jcdl.2019.00019|pages=57–66|publisher=IEEE|doi=10.1109/jcdl.2019.00019|isbn=978-1-7281-1547-4|s2cid=198972305}}</ref> The challenge comprised two separate sub-tasks: Task 1, "Answer retrieval", matching old post answers to newly posed questions, and Task 2, "Formula retrieval", matching old post formulae to new questions.
Starting with the domain of mathematics, which involves formula language, the goal is to later extend the task to other domains (e.g., STEM disciplines such as chemistry and biology) that employ other types of special notation (e.g., chemical formulae).<ref name=":1" /><ref name=":0" /> The inverse of mathematical question answering, mathematical question generation, has also been researched. The PhysWikiQuiz physics question generation and test engine retrieves mathematical formulae from Wikidata together with semantic information about their constituent identifiers (names and values of variables).<ref>{{Cite journal |last1=Scharpf |first1=Philipp |last2=Schubotz |first2=Moritz |last3=Spitz |first3=Andreas |last4=Greiner-Petter |first4=Andre |last5=Gipp |first5=Bela |date=2022 |title=Collaborative and AI-aided Exam Question Generation using Wikidata in Education |doi=10.13140/RG.2.2.30988.18568 |arxiv=2211.08361|s2cid=253270181 }}</ref> The formulae are then rearranged to generate a set of formula variants. Subsequently, the variables are substituted with random values to generate a large number of different questions suitable for individual student tests. PhysWikiQuiz is hosted by Wikimedia at https://physwikiquiz.wmflabs.org/.
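The rearrange-then-substitute pipeline described above can be sketched as follows, using speed = distance/time as a worked example. The hand-written rearrangements and function names are illustrative assumptions for this sketch; PhysWikiQuiz itself retrieves formulae and identifier semantics from Wikidata and rearranges them automatically.

```python
import random

# Sketch of PhysWikiQuiz-style question generation for v = s / t
# (speed, distance, time). The rearranged variants are hand-written here
# for illustration; the real system derives them from a Wikidata formula.
VARIANTS = [
    ("v = s / t", "v", lambda known: known["s"] / known["t"]),
    ("s = v * t", "s", lambda known: known["v"] * known["t"]),
    ("t = s / v", "t", lambda known: known["s"] / known["v"]),
]

def generate_question(rng: random.Random):
    """Pick a formula variant, substitute random values for the known
    variables, and return (question text, expected numeric answer)."""
    formula, unknown, solve = rng.choice(VARIANTS)
    values = {"v": rng.randint(1, 20), "s": rng.randint(1, 100), "t": rng.randint(1, 10)}
    known = {name: val for name, val in values.items() if name != unknown}
    question = f"Given {known}, compute {unknown} using {formula}."
    return question, solve(known)
```

Seeding the generator makes each student's question set reproducible while still covering many value combinations.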