=== Knowledge value measurement fallacy ===
The increasing availability and circulation of [[big data]] are driving a proliferation of new metrics for scholarly authority,<ref name="Meho">{{Cite journal |last=Meho |first=Lokman I. |year=2007 |title=The Rise and Rise of Citation Analysis |journal=[[Physics World]] |volume=January |pages=32–36 |arxiv=physics/0701012 |bibcode=2007physics...1012M |doi=10.1088/2058-7058/20/1/33 |s2cid=16532275}}</ref><ref>{{Cite news |last=Jensen |first=Michael |date=June 15, 2007 |title=The New Metrics of Scholarly Authority |work=[[The Chronicle of Higher Education]] |publisher=The Chron |editor-last=Riley |editor-first=Michael G. |url=http://chronicle.com/article/The-New-Metrics-of-Scholarly/5449 |access-date=28 October 2013 |issn=0009-5982 |oclc=1554535}}</ref> and there is lively discussion regarding the relative usefulness of such metrics for measuring the value of knowledge production in the context of an "information tsunami".<ref name="Baveye 2010 191–215">{{Cite journal |last=Baveye |first=Phillippe C. |year=2010 |title=Sticker Shock and Looming Tsunami: The High Cost of Academic Serials in Perspective |journal=[[Journal of Scholarly Publishing]] |volume=41 |issue=2 |pages=191–215 |doi=10.1353/scp.0.0074 |s2cid=145424660}}<!--|access-date=28 October 2013--></ref> For example, [[anchoring]] fallacies can occur when unwarranted weight is given to data generated by metrics that the arguers themselves acknowledge are flawed. The limitations of the [[journal impact factor]] (JIF), for instance, are well documented,<ref>{{Cite book |last=National Communication Journal |url=http://www.natcom.org/uploadedFiles/More_Scholarly_Resources/CCA%20Impact%20Factor%20Report%20Final.pdf |title=Impact Factors, Journal Quality, and Communication Journals: A Report for the Council of Communication Associations |publisher=National Communication Association |year=2013 |location=Washington, D.C. |access-date=2016-02-22 |archive-url=https://web.archive.org/web/20160404212454/http://www.natcom.org/uploadedFiles/More_Scholarly_Resources/CCA%20Impact%20Factor%20Report%20Final.pdf |archive-date=April 4, 2016 |url-status=dead}}</ref> and even JIF pioneer Eugene Garfield notes that "while citation data create new tools for analyses of research performance, it should be stressed that they supplement rather than replace other quantitative and qualitative indicators".<ref>{{Cite journal |last=Garfield |first=Eugene |year=1993 |title=What Citations Tell us About Canadian Research |journal=Canadian Journal of Library and Information Science |volume=18 |issue=4 |page=34}}</ref> To the extent that arguers jettison the acknowledged limitations of JIF-generated data in evaluative judgments, or leave behind Garfield's "supplement rather than replace" caveat, they commit anchoring fallacies. The [[observational interpretation fallacy]] is the [[cognitive bias]] in which associations identified in observational studies are misinterpreted as [[causal relationship]]s.
A [[naturalistic fallacy]] can occur, for example, in the case of sheer quantity metrics based on the premise "more is better"<ref name="Baveye 2010 191–215" /> or, in the case of developmental assessment in the field of psychology, "higher is better".<ref>{{Cite journal |last=Stein |first=Zachary |date=October 2008 |title=Myth Busting and Metric Making: Refashioning the Discourse about Development |url=http://www.archive-ilr.com/archives-2008/2008-10/2008-10-article-stein.php |journal=Integral Leadership Review |volume=8 |issue=5 |archive-url=https://archive.today/20131030094158/http://www.archive-ilr.com/archives-2008/2008-10/2008-10-article-stein.php |archive-date=October 30, 2013 |access-date=October 28, 2013}}</ref> A [[false analogy]] occurs when claims are supported by unsound comparisons between data points. For example, the [[Scopus]] and [[Web of Science]] bibliographic databases have difficulty distinguishing between citations of scholarly work that are arm's-length endorsements, ceremonial citations, or negative citations (indicating the citing author withholds endorsement of the cited work).<ref name=Meho/> Hence, measurement-based value claims premised on the uniform quality of all citations may be questioned on false analogy grounds. As another example, consider the [[Faculty Scholarly Productivity Index]] of Academic Analytics. This tool purports to measure overall faculty productivity, yet it does not capture data based on citations in books. This creates a possibility that low productivity measurements using the tool commit [[argument from silence]] fallacies, to the extent that such measurements are supported by the absence of book citation data. [[Ecological fallacy|Ecological fallacies]] can be committed when one measures the scholarly productivity of a sub-group of individuals (e.g., "Puerto Rican" faculty) via reference to aggregate data about a larger and different group (e.g., "Hispanic" faculty).<ref>{{Cite journal |last=Allen |first=Henry L. |year=1997 |title=Faculty Workload and Productivity: Ethnic and Gender Disparities |url=http://www.nea.org/assets/img/PubAlmanac/ALM_97_04.pdf |url-status=dead |journal=NEA 1997 Almanac of Higher Education |page=39 |archive-url=https://web.archive.org/web/20150707001555/http://www.nea.org/assets/img/PubAlmanac/ALM_97_04.pdf |archive-date=July 7, 2015 |access-date=October 29, 2013}}</ref>