Impact factor
===Assumed correlation between impact factor and quality===
The journal impact factor was originally designed by Eugene Garfield as a metric to help librarians decide which journals were worth indexing, as the JIF aggregates the number of citations to articles published in each journal. Since then, the JIF has come to be regarded as a mark of journal "quality", and has gained widespread use in the evaluation of research and researchers instead, even at the institutional level. It thus has a significant influence on steering research practices and behaviours.<ref name="Lariviere 2018">{{cite arXiv |title=The Journal Impact Factor: A Brief History, Critique, and Discussion of Adverse Effects |eprint=1801.08992 |vauthors=Gargouri Y, Hajjem C, Lariviere V, Gingras Y, Carr L, Brody T, Harnad S |year=2018 |class=cs.DL}}</ref><ref name="Curry 2018">{{cite journal |vauthors=Curry S |title=Let's move beyond the rhetoric: it's time to change how we judge research |journal=Nature |volume=554 |issue=7691 |page=147 |date=February 2018 |pmid=29420505 |doi=10.1038/d41586-018-01642-w |bibcode=2018Natur.554..147C}}</ref><ref>{{cite journal |vauthors=Al-Hoorie A, Vitta JP |year=2019 |title=The seven sins of L2 research: A review of 30 journals' statistical quality and their CiteScore, SJR, SNIP, JCR Impact Factors |journal=Language Teaching Research |volume=23 |issue=6 |pages=727–744 |doi=10.1177/1362168818767191 |s2cid=149857357}}</ref> By 2010, national and international research funding institutions were already pointing out that numerical indicators such as the JIF should not be considered a measure of quality.{{NoteTag |{{cite press release |url=https://www.dfg.de/en/service/press/press_releases/2010/pressemitteilung_nr_07/ |title='Quality Not Quantity' – DFG Adopts Rules to Counter the Flood of Publications in Research |id=DFG Press Release No.
7 |publisher=Deutsche Forschungsgemeinschaft ([[German Research Foundation]]) |date=2010}}}} In fact, research had already indicated that the JIF is a highly manipulated metric,<ref name="Falagas 2008">{{cite journal |vauthors=Falagas ME, Alexiou VG |title=The top-ten in journal impact factor manipulation |journal=Archivum Immunologiae et Therapiae Experimentalis |volume=56 |issue=4 |pages=223–6 |year=2008 |pmid=18661263 |doi=10.1007/s00005-008-0024-5 |s2cid=7482376}}</ref><ref name="Tort 2012">{{cite journal |vauthors=Tort AB, Targino ZH, Amaral OB |title=Rising publication delays inflate journal impact factors |journal=PLOS ONE |volume=7 |issue=12 |page=e53374 |year=2012 |pmid=23300920 |pmc=3534064 |doi=10.1371/journal.pone.0053374 |bibcode=2012PLoSO...753374T |doi-access=free}}</ref><ref name="Fong 2017">{{cite journal |vauthors=Fong EA, Wilhite AW |title=Authorship and citation manipulation in academic research |journal=PLOS ONE |volume=12 |issue=12 |page=e0187394 |year=2017 |pmid=29211744 |pmc=5718422 |doi=10.1371/journal.pone.0187394 |bibcode=2017PLoSO..1287394F |doi-access=free}}</ref> and the justification for its continued widespread use beyond its original narrow purpose seems to stem from its simplicity as an easily calculated and compared number, rather than from any actual relationship to research quality.<ref name="Adler 2008">{{cite journal |vauthors=Adler R, Ewing J, Taylor P |title=Citation Statistics: A Report from the International Mathematical Union (IMU) in Cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS) |journal=Statistical Science |date=2009 |volume=24 |issue=1 |pages=1–14 |doi=10.1214/09-STS285 |url=https://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf |arxiv=0910.3529 |jstor=20697661 |s2cid=219477 |issn=0883-4237}}</ref><ref name="Brembs 2018">{{cite journal |vauthors=Brembs B |title=Prestigious Science Journals Struggle to Reach Even Average Reliability
|journal=Frontiers in Human Neuroscience |volume=12 |page=37 |year=2018 |pmid=29515380 |pmc=5826185 |doi=10.3389/fnhum.2018.00037 |doi-access=free}}</ref><ref name="Lariviere 2009">{{cite arXiv |title=The Impact Factor's Matthew Effect: A Natural Experiment in Bibliometrics |eprint=0908.3177 |vauthors=Gargouri Y, Hajjem C, Lariviere V, Gingras Y, Carr L, Brody T, Harnad S |year=2009 |class=physics.soc-ph}}</ref> Empirical evidence shows that the misuse of the JIF—and journal ranking metrics in general—has a number of negative consequences for the scholarly communication system. These include gaps between the reach of a journal and the quality of its individual papers<ref name="Brembs_2013">{{cite journal |last1=Brembs |first1=Björn |last2=Button |first2=Katherine |last3=Munafò |first3=Marcus |date=2013-06-24 |title=Deep impact: unintended consequences of journal rank |journal=[[Frontiers Media|Frontiers in Human Neuroscience]] |volume=7 |page=291 |arxiv=1301.3748 |bibcode=2013arXiv1301.3748B |doi=10.3389/fnhum.2013.00291 |issn=1662-5161 |pmc=3690355 |pmid=23805088 |doi-access=free}}</ref> and insufficient coverage of social sciences and humanities as well as research outputs from across Latin America, Africa, and South-East Asia.<ref>{{Cite journal |last1=Severin |first1=Anna |last2=Strinzel |first2=Michaela |last3=Egger |first3=Matthias |last4=Barros |first4=Tiago |last5=Sokolov |first5=Alexander |last6=Mouatt |first6=Julia Vilstrup |last7=Müller |first7=Stefan |date=2023-08-29 |editor-last=Dirnagl |editor-first=Ulrich |title=Relationship between journal impact factor and the thoroughness and helpfulness of peer reviews |journal=[[PLOS Biology]] |language=en |volume=21 |issue=8 |pages=e3002238 |doi=10.1371/journal.pbio.3002238 |issn=1544-9173 |eissn=1545-7885 |lccn=2003212293 |oclc=1039259630 |pmc=10464996 |pmid=37643173 |doi-access=free}}</ref> Additional drawbacks include the marginalization of research in [[vernacular language]]s and on locally relevant topics 
and inducement to unethical authorship and citation practices. More generally, the impact factor fosters a reputation economy, in which scientific success is based on publishing in prestigious journals rather than on actual research qualities such as rigorous methods, replicability, and social impact. Using journal prestige and the JIF to cultivate a competition regime in academia has been shown to have deleterious effects on research quality.<ref name="Vessuri 2014">{{cite journal |last1=Vessuri |first1=Hebe |last2=Guédon |first2=Jean-Claude |last3=Cetto |first3=Ana María |date=September 2014 |title=Excellence or quality? Impact of the current competition regime on science and scientific publishing in Latin America and its implications for development |url=https://journals.sagepub.com/doi/10.1177/0011392113512839 |journal=[[Current Sociology]] |language=en |volume=62 |issue=5 |pages=647–665 |doi=10.1177/0011392113512839 |issn=0011-3921 |eissn=1461-7064 |s2cid=25166127 |via=[[Sage Publishing]]}}</ref> A number of regional and international initiatives now provide and recommend alternative research assessment systems, including key documents such as the [[Leiden Manifesto]]{{NoteTag |{{cite web |url=http://www.leidenmanifesto.org/ |title=The Leiden Manifesto for Research Metrics |year=2015}}}} and the [[San Francisco Declaration on Research Assessment]] (DORA).
[[Plan S]] calls for broader adoption and implementation of such initiatives, alongside fundamental changes in the scholarly communication system.{{NoteTag |{{cite web |url=https://www.coalition-s.org/feedback/ |title=Plan S implementation guidelines |date=February 2019}}}} As appropriate measures of quality for authors and research, concepts of research excellence should be remodelled around transparent workflows and accessible research results.<ref name="Moore 2017">{{cite journal |title='Excellence R Us': University Research and the Fetishisation of Excellence |journal=Palgrave Communications |volume=3 |doi=10.1057/palcomms.2016.105 |year=2017 |vauthors=Moore S, Neylon C, Eve MP, O'Donnell DP, Pattinson D |doi-access=free}}</ref><ref name="Owen 2012">{{cite journal |title=Responsible Research and Innovation: From Science in Society to Science for Society, with Society |journal=Science and Public Policy |volume=39 |issue=6 |pages=751–760 |doi=10.1093/scipol/scs093 |year=2012 |vauthors=Owen R, Macnaghten P, Stilgoe J}}</ref><ref name="Hicks 2015" /> JIFs are still regularly used to evaluate research in many countries, which remains problematic given the opacity of the metric and the fact that it is often negotiated by publishers.<ref name="Guédon 2008">{{cite journal |title=Open Access and the Divide between 'Mainstream' and 'Peripheral' |journal=Como Gerir e Qualificar Revistas Científicas |pages=1–25}}</ref><ref name="Alperin 2018">{{cite journal |vauthors=Alperin JP, Muñoz Nieves C, Schimanski LA, Fischman GE, Niles MT, McKiernan EC |title=How significant are the public dimensions of faculty work in review, promotion and tenure documents?
|journal=eLife |volume=8 |date=February 2019 |pmid=30747708 |pmc=6391063 |doi=10.7554/eLife.42254 |doi-access=free }}</ref><ref name="Rossner 2007">{{cite journal |vauthors=Rossner M, Van Epps H, Hill E |title=Show me the data |journal=The Journal of Cell Biology |volume=179 |issue=6 |pages=1091–2 |date=December 2007 |pmid=18086910 |pmc=2140038 |doi=10.1083/jcb.200711140}}</ref>
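Part of the metric's appeal, as noted above, is how easily it is calculated. A minimal sketch of the standard two-year JIF definition follows; the journal and all figures are hypothetical, and this ignores the real-world complications (what counts as a "citable item" is itself negotiated with publishers):

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Two-year journal impact factor: citations received in year Y
    to items published in years Y-1 and Y-2, divided by the number
    of citable items published in Y-1 and Y-2."""
    if citable_items <= 0:
        raise ValueError("citable_items must be positive")
    return citations / citable_items

# Hypothetical journal: 600 citations in 2023 to its 2021-2022 output,
# which comprised 200 citable items.
print(impact_factor(600, 200))  # → 3.0
```

The simplicity is the point of the sketch: a single division yields a number that is trivial to rank journals by, which helps explain its widespread use despite the weak link to the quality of individual papers.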