{{short description|Knowledge assessment tool}}
A '''concept inventory''' is a [[criterion-referenced test]] designed to help determine whether a student has an accurate working [[knowledge]] of a specific set of concepts. Historically, concept inventories have taken the form of [[multiple-choice test]]s in order to aid interpretability and facilitate administration in large classes. Unlike a typical, teacher-authored multiple-choice test, questions and response choices on concept inventories are the subject of extensive research. The aims of the research include ascertaining (a) the range of what individuals think a particular question is asking and (b) the most common responses to the questions. Concept inventories are evaluated to ensure test [[reliability (statistics)|reliability]] and [[validity (statistics)|validity]]. In its final form, each question includes one correct answer and several distractors.

Ideally, a score on a criterion-referenced test reflects a test taker's degree of proficiency with one or more KSAs (knowledge, skills, and abilities), and results may be reported as a single unidimensional score and/or multiple sub-scores. Criterion-referenced tests differ from [[norm-referenced tests]] in that (in theory) the former report proficiency relative to a pre-determined level, while the latter report standing relative to other test takers. Criterion-referenced tests may be used to determine whether a student has reached a predetermined level of proficiency (i.e., scored above some [[cutoff score]]) and may therefore move on to the next unit or level of study.

The distractors are incorrect or irrelevant answers that are usually (but not always) based on students' commonly held misconceptions.<ref>"Development and Validation of Instruments to Measure Learning of Expert-Like Thinking." W. K. Adams & C. E. Wieman, 2010. International Journal of Science Education, 1–24. iFirst, {{doi|10.1080/09500693.2010.512369}}</ref> Test developers often research student misconceptions by examining students' responses to open-ended essay questions and conducting "think-aloud" interviews with students. The distractors chosen by students help researchers understand student thinking and give instructors insights into students' prior knowledge (and, sometimes, firmly held beliefs). This foundation in research underlies instrument construction and design, and plays a role in helping educators obtain clues about students' ideas, [[scientific misconceptions]], and ''didaskalogenic'' ("teacher-induced" or "teaching-induced") confusions and conceptual [[Lacuna model|lacunae]] that interfere with learning.

==Concept inventories in use==
Concept inventories are education-related diagnostic tests.<ref name="Tregust, 1988">{{cite journal | last=Treagust | first=David F. | title=Development and use of diagnostic tests to evaluate students' misconceptions in science | journal=International Journal of Science Education | publisher=Informa UK Limited | volume=10 | issue=2 | year=1988 | issn=0950-0693 | doi=10.1080/0950069880100204 | pages=159–169 | bibcode=1988IJSEd..10..159T }}</ref> In 1985 Halloun and Hestenes introduced a "multiple-choice mechanics diagnostic test" to examine students' concepts about motion.<ref name="Halloun and Hestenes">Halloun, I. A., & Hestenes, D. (1985). [https://davidhestenes.net/modeling/R&E/Hestenes_CommonSenseConcept.pdf Common sense concepts about motion]. American Journal of Physics, 53, 1043–1055.</ref> It evaluates student understanding of basic concepts in classical (macroscopic) mechanics. A little later, the [[Force Concept Inventory]] (FCI), another concept inventory, was developed.<ref name="Halloun and Hestenes"/><ref>{{cite journal | last1=Hestenes | first1=David | last2=Wells | first2=Malcolm | last3=Swackhamer | first3=Gregg | title=Force concept inventory | journal=The Physics Teacher | publisher=American Association of Physics Teachers (AAPT) | volume=30 | issue=3 | year=1992 | issn=0031-921X | doi=10.1119/1.2343497 | pages=141–158 | bibcode=1992PhTea..30..141H | s2cid=12311835 | url=https://davidhestenes.net/modeling/R&E/FCI.PDF }}</ref><ref>{{cite journal | last=Hestenes | first=David | title=Who needs physics education research!? | journal=American Journal of Physics | publisher=American Association of Physics Teachers (AAPT) | volume=66 | issue=6 | year=1998 | issn=0002-9505 | doi=10.1119/1.18898 | pages=465–467 | bibcode=1998AmJPh..66..465H | url=https://davidhestenes.net/modeling/R&E/WhoNeedsPER.pdf }}</ref> The FCI was designed to assess student understanding of the [[Classical mechanics|Newtonian]] concepts of force. Hestenes (1998) found that while "nearly 80% of the [students completing introductory college physics courses] could state [[Newton's laws of motion|Newton's Third Law]] at the beginning of the course, FCI data showed that less than 15% of them fully understood it at the end". These results have been replicated in a number of studies involving students at a range of institutions (see the sources listed below). That said, there remain questions as to what exactly the FCI measures.<ref name="Huffman">{{cite journal | last1=Huffman | first1=Douglas | last2=Heller | first2=Patricia |author2-link=Patricia Heller | title=What does the force concept inventory actually measure? | journal=The Physics Teacher | publisher=American Association of Physics Teachers (AAPT) | volume=33 | issue=3 | year=1995 | issn=0031-921X | doi=10.1119/1.2344171 | pages=138–143 | bibcode=1995PhTea..33..138H | url=http://www.physics.emory.edu/faculty/weeks//journal/hestenes-tpt95c.pdf}}</ref> Results using the FCI have led to greater recognition in the [[science education]] community of the importance of students' "interactive engagement" with the materials to be mastered.<ref name="Hake">{{cite journal | last=Hake | first=Richard R. | title=Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses | journal=American Journal of Physics | publisher=American Association of Physics Teachers (AAPT) | volume=66 | issue=1 | year=1998 | issn=0002-9505 | doi=10.1119/1.18809 | pages=64–74 | bibcode=1998AmJPh..66...64H | s2cid=14835931 }}</ref>
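Hake's survey summarized such pre- and post-instruction results using the average normalized gain, the fraction of the maximum possible improvement that a class actually achieves. With class mean scores expressed as percentages,
:<math>\langle g \rangle = \frac{\langle \mathrm{post} \rangle - \langle \mathrm{pre} \rangle}{100 - \langle \mathrm{pre} \rangle}.</math>
For example, a class averaging 40% on the FCI before instruction and 55% afterward has <math>\langle g \rangle = (55 - 40)/(100 - 40) = 0.25</math>, i.e., it realized a quarter of its possible gain.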
Since the development of the FCI, other physics instruments have been developed. These include the ''Force and Motion Conceptual Evaluation''<ref name="Thornton">{{cite journal | last1=Thornton | first1=Ronald K. | last2=Sokoloff | first2=David R. | title=Assessing student learning of Newton's laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula | journal=American Journal of Physics | publisher=American Association of Physics Teachers (AAPT) | volume=66 | issue=4 | year=1998 | issn=0002-9505 | doi=10.1119/1.18863 | pages=338–352 | bibcode=1998AmJPh..66..338T }}</ref> and the ''Brief Electricity and Magnetism Assessment'' (BEMA).<ref name="Ding">{{cite journal |title=Evaluating an electricity and magnetism assessment tool: Brief electricity and magnetism assessment |journal=Physical Review Special Topics - Physics Education Research |volume=2 |issue=1 |pages=010105 |doi=10.1103/PhysRevSTPER.2.010105 |year=2006 |last1=Ding |first1=Lin |last2=Chabay |first2=Ruth |author2-link=Ruth Chabay |last3=Sherwood |first3=Bruce |last4=Beichner |first4=Robert |bibcode=2006PRPER...2a0105D |doi-access=free }}</ref> For a discussion of how a number of concept inventories were developed, see Beichner.<ref>{{cite journal | last=Beichner | first=Robert J. | title=Testing student interpretation of kinematics graphs | journal=American Journal of Physics | publisher=American Association of Physics Teachers (AAPT) | volume=62 | issue=8 | year=1994 | issn=0002-9505 | doi=10.1119/1.17449 | pages=750–762 | bibcode=1994AmJPh..62..750B }}</ref>

In addition to physics, concept inventories have been developed in [[statistics]],<ref name="Allen">Allen, K. (2006). The Statistics Concept Inventory: Development and Analysis of a Cognitive Assessment Instrument in Statistics. Doctoral dissertation, The University of Oklahoma. [https://web.archive.org/web/20120315000000*/https://engineering.purdue.edu/SCI/pubs/Kirk%20Allen%20dissertation.pdf]</ref> [[chemistry]],<ref>{{Cite web |url=http://jchemed.chem.wisc.edu/JCEDlib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html |title=The Chemical Concepts Inventory. Visited Feb. 14, 2011 |access-date=2007-07-30 |archive-url=https://web.archive.org/web/20070718142732/http://jchemed.chem.wisc.edu/JCEDLib/QBank/collection/CQandChP/CQs/ConceptsInventory/CCIIntro.html |archive-date=2007-07-18 |url-status=dead }}</ref><ref name="Wright"/> [[astronomy]],<ref>Astronomy Diagnostic Test (ADT) Version 2.0. [http://solar.physics.montana.edu/aae/adt/] Visited Feb. 14, 2011.</ref> basic [[biology]],<ref name="Garvin-Doxas & Klymkowsky, 2008">{{cite journal | last1=Garvin-Doxas | first1=Kathy | last2=Klymkowsky | first2=Michael W. | editor-last=Alberts | editor-first=Bruce | title=Understanding Randomness and its Impact on Student Learning: Lessons Learned from Building the Biology Concept Inventory (BCI) | journal=CBE: Life Sciences Education | publisher=American Society for Cell Biology (ASCB) | volume=7 | issue=2 | year=2008 | issn=1931-7913 | doi=10.1187/cbe.07-08-0063 | pages=227–233 | pmid=18519614 | pmc=2424310 }}</ref><ref>{{cite journal | last=D'Avanzo | first=Charlene | title=Biology Concept Inventories: Overview, Status, and Next Steps | journal=BioScience | publisher=Oxford University Press (OUP) | volume=58 | issue=11 | year=2008 | issn=1525-3244 | doi=10.1641/b581111 | pages=1079–1085 | doi-access=free}}</ref><ref>D'Avanzo C, Anderson CW, Griffith A, Merrill J. 2010. Thinking like a biologist: Using diagnostic questions to help students reason with biological principles. (17 January 2010; www.biodqc.org/)</ref><ref>{{cite journal | last1=Wilson | first1=Christopher D. | last2=Anderson | first2=Charles W. | last3=Heidemann | first3=Merle | last4=Merrill | first4=John E. | last5=Merritt | first5=Brett W. | last6=Richmond | first6=Gail | last7=Sibley | first7=Duncan F. | last8=Parker | first8=Joyce M. |display-authors=5 | title=Assessing Students' Ability to Trace Matter in Dynamic Systems in Cell Biology | journal=CBE: Life Sciences Education | publisher=American Society for Cell Biology (ASCB) | volume=5 | issue=4 | year=2006 | issn=1931-7913 | doi=10.1187/cbe.06-02-0142 | pages=323–331 | pmid=17146039 | pmc=1681358 | doi-access=free }}</ref> [[natural selection]],<ref name="AFN">{{cite journal | last1=Anderson | first1=Dianne L. | last2=Fisher | first2=Kathleen M. | last3=Norman | first3=Gregory J. | title=Development and evaluation of the conceptual inventory of natural selection | journal=Journal of Research in Science Teaching | publisher=Wiley | volume=39 | issue=10 | date=2002-11-14 | issn=0022-4308 | doi=10.1002/tea.10053 | pages=952–978 | bibcode=2002JRScT..39..952A | doi-access=free }}</ref><ref name="Nehm & Schonfeld 2008">{{cite journal | last1=Nehm | first1=Ross H. | last2=Schonfeld | first2=Irvin Sam | title=Measuring knowledge of natural selection: A comparison of the CINS, an open-response instrument, and an oral interview | journal=Journal of Research in Science Teaching | publisher=Wiley | volume=45 | issue=10 | year=2008 | issn=0022-4308 | doi=10.1002/tea.20251 | pages=1131–1160 | bibcode=2008JRScT..45.1131N | url=http://www1.ccny.cuny.edu/prospective/socialsci/psychology/faculty/upload/Nehm-Schonfeld-2008-JRST.pdf |archive-url=https://web.archive.org/web/20110517013711/http://www1.ccny.cuny.edu/prospective/socialsci/psychology/faculty/upload/Nehm-Schonfeld-2008-JRST.pdf |archive-date=2011-05-17 }}</ref><ref name="Nehm & Schonfeld 2010">Nehm R & Schonfeld IS (2010). The future of natural selection knowledge measurement: A reply to Anderson et al. (2010). Journal of Research in Science Teaching, 47, 358–362. [http://www1.ccny.cuny.edu/prospective/socialsci/psychology/faculty/upload/Nehm-and-Schonfeld-2010-JRST.pdf] {{Webarchive|url=https://web.archive.org/web/20110719182444/http://www1.ccny.cuny.edu/prospective/socialsci/psychology/faculty/upload/Nehm-and-Schonfeld-2010-JRST.pdf |date=2011-07-19 }}</ref> [[genetics]],<ref>{{cite journal | last1=Smith | first1=Michelle K. | last2=Wood | first2=William B. | last3=Knight | first3=Jennifer K. | editor-last=Ebert-May | editor-first=Diane | title=The Genetics Concept Assessment: A New Concept Inventory for Gauging Student Understanding of Genetics | journal=CBE: Life Sciences Education | publisher=American Society for Cell Biology (ASCB) | volume=7 | issue=4 | year=2008 | issn=1931-7913 | doi=10.1187/cbe.08-08-0045 | pages=422–430 | pmid=19047428 | pmc=2592048 }}</ref> [[engineering]],<ref>Concept Inventory Assessment Instruments for Engineering Science. Visited Feb. 14, 2011. [http://www.foundationcoalition.org/home/keycomponents/assessment_evaluation.html]</ref> [[geoscience]],<ref name="Libarkin and Anderson">[[Julie Libarkin|Libarkin, J.C.]], Ward, E.M.G., Anderson, S.W., Kortemeyer, G., Raeburn, S.P., 2011, Revisiting the Geoscience Concept Inventory: A call to the community: GSA Today, v. 21, n. 8, p. 26–28.
[http://geoscienceconceptinventory.wikispaces.com/home] {{Webarchive|url=https://web.archive.org/web/20130726065637/http://geoscienceconceptinventory.wikispaces.com/home |date=2013-07-26 }}</ref> and [[computer science]].<ref name="Caceffo, Wolfman, Booth and Azevedo (2016)">Caceffo, R.; Wolfman, S.; Booth, K.; Azevedo, R. (2016). Developing a Computer Science Concept Inventory for Introductory Programming. In Proceedings of the 47th ACM Technical Symposium on Computing Science Education (SIGCSE '16). ACM, New York, NY, USA, 364–369. DOI=https://dx.doi.org/10.1145/2839509.2844559 [http://dl.acm.org/citation.cfm?id=2844559]</ref> In many areas, foundational scientific concepts transcend disciplinary boundaries. An example of an inventory that assesses knowledge of such concepts is an instrument developed by Odom and Barrow (1995) to evaluate understanding of [[diffusion]] and [[osmosis]].<ref name="Odom">Odom, A.L., & Barrow, L.H. (1995). Development and application of a two-tier diagnostic test measuring college biology students' understanding of diffusion and osmosis after a course of instruction. Journal of Research in Science Teaching, 32, 45–61.</ref> In addition, there are non-multiple-choice conceptual instruments, such as the ''essay-based approach''<ref name="Wright">{{cite journal | last1=Wampold | first1=Bruce E. | last2=Wright | first2=John C. | last3=Williams | first3=Paul H. | last4=Millar | first4=Susan B. | last5=Koscuik | first5=Steve A. | last6=Penberthy | first6=Debra L. | title=A Novel Strategy for Assessing the Effects of Curriculum Reform on Student Competence | journal=Journal of Chemical Education | publisher=American Chemical Society (ACS) | volume=75 | issue=8 | year=1998 | issn=0021-9584 | doi=10.1021/ed075p986 | pages=986–992 | bibcode=1998JChEd..75..986W | url=http://bioliteracy.colorado.edu/Readings/Wright.pdf}}</ref> and essay and oral exams used to measure student understanding of Lewis structures in chemistry.<ref name="Nehm & Schonfeld 2008"/><ref>{{cite journal | last1=Cooper | first1=Melanie M. | last2=Underwood | first2=Sonia M. | last3=Hilley | first3=Caleb Z. | title=Development and validation of the implicit information from Lewis structures instrument (IILSI): do students connect structures with properties? | journal=Chem. Educ. Res. Pract. | publisher=Royal Society of Chemistry (RSC) | volume=13 | issue=3 | year=2012 | issn=1109-4028 | doi=10.1039/c2rp00010e | pages=195–200}}</ref>

==Caveats associated with concept inventory use==
Some concept inventories are problematic. The concepts tested may not be fundamental or important in a particular discipline, the concepts involved may not be explicitly taught in a class or curriculum, or answering a question correctly may require only a superficial understanding of a topic. It is therefore possible either to over-estimate or to under-estimate student content mastery. A concept inventory designed to identify trends in student thinking may not be useful for monitoring learning gains that result from pedagogical interventions, and disciplinary mastery may not be the variable that a particular instrument actually measures. Users should be careful to ensure that concept inventories are actually testing conceptual understanding, rather than test-taking ability, language skills, or other abilities that can influence test performance.

The use of multiple-choice exams as concept inventories is not without controversy. The very structure of multiple-choice concept inventories raises questions about the extent to which complex, often nuanced situations and ideas must be simplified or clarified to produce unambiguous responses. For example, a multiple-choice exam designed to assess knowledge of key concepts in natural selection<ref name="AFN"/> does not meet a number of standards of quality control.<ref name="Nehm & Schonfeld 2010"/> One problem with the exam is that the two members of each of several pairs of parallel items, with each pair designed to measure exactly one key concept in natural selection, sometimes have very different levels of difficulty.<ref name="Nehm & Schonfeld 2008"/> Another problem is that the multiple-choice exam overestimates knowledge of natural selection relative to student performance on a diagnostic essay exam and a diagnostic oral exam, two instruments with reasonably good [[construct validity]].<ref name="Nehm & Schonfeld 2008"/> Although scoring concept inventories in the form of essay or oral exams is labor-intensive, costly, and difficult to implement with large numbers of students, such exams can offer a more realistic appraisal of students' actual levels of conceptual mastery, as well as of their misconceptions.<ref name="Wright"/><ref name="Nehm & Schonfeld 2008"/> Recently, however, computer technology has been developed that can [[Automated essay scoring|score essay responses]] on concept inventories in biology and other domains,<ref>{{cite journal | last1=Nehm | first1=RH | last2=Ha | first2=M | last3=Mayfield | first3=E | title=Transforming Biology Assessment with Machine Learning: Automated Scoring of Written Evolutionary Explanations | journal=Journal of Science Education and Technology | volume=21 | issue=1 | year=2012 | doi=10.1007/s10956-011-9300-9 | pages=183–196 | s2cid=254747549 }}</ref> promising to facilitate the scoring of concept inventories organized as (transcribed) oral exams as well as essays.
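Quality-control concerns such as the unequal difficulty of paired items can be examined with classical item analysis. The following is a minimal sketch (using hypothetical data, not drawn from any published inventory) of how two standard item statistics are computed from a 0/1 response matrix: item difficulty, the proportion of test takers answering an item correctly, and an item–rest [[Point-biserial correlation coefficient|point-biserial correlation]], a common discrimination index.

<syntaxhighlight lang="python">
import numpy as np

def item_analysis(responses):
    """Classical item statistics for a 0/1 response matrix.

    responses: shape (n_students, n_items); 1 = correct, 0 = incorrect.
    Returns (difficulty, discrimination), one entry per item.
    """
    responses = np.asarray(responses, dtype=float)
    n_students, n_items = responses.shape

    # Difficulty: proportion of students answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Item-rest point-biserial: correlation between an item's scores and
    # the total score on the remaining items (higher = more discriminating).
    totals = responses.sum(axis=1)
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = totals - responses[:, j]  # total score excluding item j
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical responses: 8 students x 4 items (illustration only).
data = [[1, 1, 0, 1],
        [1, 0, 0, 1],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0],
        [1, 0, 1, 1],
        [0, 1, 0, 1],
        [1, 1, 1, 1]]
difficulty, discrimination = item_analysis(data)
print("difficulty:    ", np.round(difficulty, 2))
print("discrimination:", np.round(discrimination, 2))
</syntaxhighlight>

Parallel items written to probe the same concept should show similar difficulty values; large gaps between them are the kind of quality-control problem reported for the natural selection inventory above.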
==See also==
{{div col|colwidth=20em|rules=yes}}
* {{annotated link|Authentic assessment}}
* {{annotated link|Classical test theory}}
* {{annotated link|Concept map}}
* {{annotated link|Conceptual question}}
* {{annotated link|Confidence-based learning}}
* {{annotated link|Construct validity}}
* {{annotated link|Constructive alignment}}
* {{annotated link|Criterion-referenced test|only=explicit}}
* {{annotated link|Educational assessment}}
* {{annotated link|Item response theory}}
* {{annotated link|Norm-referenced test}}
* {{annotated link|Ontology (information science)}}
* {{annotated link|Psychometrics}}
* {{annotated link|Rubric (academic)|Rubrics for assessment}}
* {{annotated link|Standards-based education reform in the United States}}
* {{annotated link|Standardized test}}
* {{annotated link|Standards-based assessment}}
{{div col end}}

==References==
{{reflist}}

==External links==
* [http://solar.physics.montana.edu/aae/adt/ Astronomy]
* [https://edstools.colorado.edu/input/i-multi.php?inv=bci&cond=0/ Biology Concept Inventory]
* {{usurped|1=[https://archive.today/20130414081711/http://www.biodqc.org/ Bio-Diagnostic Question Clusters]}}
* [https://web.archive.org/web/20070713020533/http://www.flaguide.org/cat/diagnostic/diagnostic5.php Classroom Concepts and Diagnostic Tests]
* [https://web.archive.org/web/20190129194618/http://dqc.crcstl.msu.edu/ Diagnostic Question Clusters in Biology]
* [http://www.foundationcoalition.org/home/keycomponents/assessment_evaluation.html Engineering]
* [https://web.archive.org/web/20110917034304/http://www.evolutionassessment.org/ Evolution Assessment]
* [https://web.archive.org/web/20041212202058/http://modeling.la.asu.edu/R%26E/Research.html Force Concept Inventory]
* [http://www.lifescied.org/cgi/content/full/7/4/422?maxtoshow=&hits=10&RESULTFORMAT=1&author1=smith&author2=knight&andorexacttitle=and&andorexacttitleabs=and&andorexactfulltext=and&searchid=1&FIRSTINDEX=0&sortspec=relevance&resourcetype=HWCIT,HWELTR Genetics]
* [http://geoscienceconceptinventory.wikispaces.com/home Geosciences]
* [https://web.archive.org/web/20110310120759/http://www.lifescinventory.edu.au/ Molecular Life Sciences Concept Inventory]
* [http://www.ncsu.edu/per/TestInfo.html Physics]
* [https://engineering.purdue.edu/SCI Statistics]
* {{usurped|1=[https://archive.today/20130414082107/http://www.biodqc.org/contacts Thinking Like a Biologist]}}

{{DEFAULTSORT:Concept Inventory}}
[[Category:Tests]]
[[Category:Science education]]
[[Category:Physics education]]