=== Efficiency of funding ===
{{See also|Economics of science|Economics of scientific knowledge}}
The traditional measurements of funding efficiency are [[Scientific literature|publication]] output, [[citation impact]], number of [[patent]]s, number of [[Doctor of Philosophy|PhDs]] awarded, etc. However, the use of the [[Impact factor|journal impact factor]] has generated a [[Publish or perish|publish-or-perish]] culture, and a theoretical model has been established whose simulations suggest that [[peer review]] and over-competitive research funding drive mainstream opinion toward monopoly.<ref>{{cite journal |last1=Fang |first1=H. |year=2011 |title=Peer review and over-competitive research funding fostering mainstream opinion to monopoly |journal=Scientometrics |volume=87 |issue=2 |pages=293–301 |doi=10.1007/s11192-010-0323-4 |s2cid=24236419}}</ref> Calls have been made to reform research assessment, most notably in the [[San Francisco Declaration on Research Assessment]]<ref>{{cite web |title=Read the Declaration |url=https://sfdora.org/read/ |access-date=2022-03-28 |website=DORA |language=en-US |archive-date=2022-03-30 |archive-url=https://web.archive.org/web/20220330042049/https://sfdora.org/read/ |url-status=live }}</ref> and the [[Leiden Manifesto|Leiden Manifesto for research metrics]].<ref>{{Cite journal |last1=Hicks |first1=Diana |last2=Wouters |first2=Paul |last3=Waltman |first3=Ludo |last4=de Rijcke |first4=Sarah |last5=Rafols |first5=Ismael |date=2015-04-23 |title=Bibliometrics: The Leiden Manifesto for research metrics |journal=Nature |language=en |volume=520 |issue=7548 |pages=429–431 |bibcode=2015Natur.520..429H |doi=10.1038/520429a |issn=0028-0836 |pmid=25903611 |s2cid=4462115|doi-access=free |hdl=10261/132304 |hdl-access=free }}</ref> The current system is also limited in its ability to measure research excellence in the Global South.<ref>{{Cite journal |last1=Tijssen |first1=Robert |last2=Kraemer-Mbula |first2=Erika |date=2018-06-01 |title=Research excellence in Africa: 
Policies, perceptions, and performance |url=https://academic.oup.com/spp/article/45/3/392/4600842 |journal=Science and Public Policy |language=en |volume=45 |issue=3 |pages=392–403 |doi=10.1093/scipol/scx074 |issn=0302-3427 |doi-access=free |hdl=1887/65584 |hdl-access=free |archive-date=2023-04-30 |access-date=2022-04-09 |archive-url=https://web.archive.org/web/20230430002914/https://academic.oup.com/spp/article/45/3/392/4600842 |url-status=live }}</ref><ref>{{Cite book |last1=Wallace |first1=L. |url=https://www.worldcat.org/oclc/1156814189 |title=Transforming research excellence. |last2=Tijssen |first2=Robert |date=2019 |isbn=978-1-928502-07-4 |location=Cape Town |oclc=1156814189}}</ref> Novel measurement systems such as Research Quality Plus have been put forward to better emphasize local knowledge and contextualization in the evaluation of excellence.<ref>{{Cite journal |last1=Lebel |first1=Jean |last2=McLean |first2=Robert |date=July 2018 |title=A better measure of research from the global south |journal=Nature |language=en |volume=559 |issue=7712 |pages=23–26 |bibcode=2018Natur.559...23L |doi=10.1038/d41586-018-05581-4 |issn=0028-0836 |pmid=29973734 |s2cid=49692425|doi-access=free }}</ref> A wide range of interventions has been proposed to improve science funding.<ref name="a038">{{cite journal | last=Gigerenzer | first=Gerd | last2=Allen | first2=Colin | last3=Gaillard | first3=Stefan | last4=Goldstone | first4=Robert L. | last5=Haaf | first5=Julia | last6=Holmes | first6=William R. 
| last7=Kashima | first7=Yoshihisa | last8=Motz | first8=Benjamin | last9=Musslick | first9=Sebastian | last10=Stefan | first10=Angelika | title=Alternative models of funding curiosity-driven research | journal=Proceedings of the National Academy of Sciences | volume=122 | issue=5 | date=4 February 2025 | issn=0027-8424 | pmid=39869812 | pmc=11804678 | doi=10.1073/pnas.2401237121 | doi-access=free | page=}}</ref><ref name="b807">{{cite journal | last=Aczel | first=Balazs | last2=Barwich | first2=Ann-Sophie | last3=Diekman | first3=Amanda B. | last4=Fishbach | first4=Ayelet | last5=Goldstone | first5=Robert L. | last6=Gomez | first6=Pablo | last7=Gundersen | first7=Odd Erik | last8=von Hippel | first8=Paul T. | last9=Holcombe | first9=Alex O. | last10=Lewandowsky | first10=Stephan | last11=Nozari | first11=Nazbanou | last12=Pestilli | first12=Franco | last13=Ioannidis | first13=John P. A. | title=The present and future of peer review: Ideas, interventions, and evidence | journal=Proceedings of the National Academy of Sciences | volume=122 | issue=5 | date=4 February 2025 | issn=0027-8424 | pmid=39869808 | pmc=11804526 | doi=10.1073/pnas.2401232121 | doi-access=free | url=https://www.pnas.org/doi/pdf/10.1073/pnas.2401232121 | access-date=9 March 2025 | page=}}</ref> [[Open peer review]] can improve the quality of [[scholarly peer review]].<ref name="x870"/> A systematic review found a scarcity of [[randomized controlled trial]]s on [[peer review]] interventions.<ref name="x870">{{cite journal | last=Bruce | first=Rachel | last2=Chauvin | first2=Anthony | last3=Trinquart | first3=Ludovic | last4=Ravaud | first4=Philippe | last5=Boutron | first5=Isabelle | title=Impact of interventions to improve the quality of peer review of biomedical journals: a systematic review and meta-analysis | journal=BMC Medicine | volume=14 | issue=1 | date=2016 | issn=1741-7015 | pmid=27287500 | pmc=4902984 | doi=10.1186/s12916-016-0631-5 | doi-access=free | page=}}</ref> Another question 
is how to allocate funds to different disciplines, institutions, or researchers. A 2018 study by Wayne Wahls found that "prestigious institutions had on average 65% higher grant application success rates and 50% larger award sizes, whereas less-prestigious institutions produced 65% more publications and had a 35% higher citation impact per dollar of funding."<ref>{{cite web |title=Research Dollars Go Farther at Less-Prestigious Institutions: Study |url=https://www.the-scientist.com/news-opinion/research-dollars-go-farther-at-less-prestigious-institutions--study-64529 |access-date=2018-07-23 |website=The Scientist Magazine® |language=en |archive-date=2018-07-27 |archive-url=https://web.archive.org/web/20180727163354/https://www.the-scientist.com/news-opinion/research-dollars-go-farther-at-less-prestigious-institutions--study-64529 |url-status=live }}</ref><ref>{{Cite journal |last=Wahls |first=Wayne P. |date=2018-07-13 |title=High cost of bias: Diminishing marginal returns on NIH grant funding to institutions |url=https://www.biorxiv.org/content/early/2018/07/13/367847 |journal=bioRxiv |language=en |page=367847 |doi=10.1101/367847 |doi-access=free |archive-date=2018-10-03 |access-date=2018-07-23 |archive-url=https://web.archive.org/web/20181003235821/https://www.biorxiv.org/content/early/2018/07/13/367847 |url-status=live }}</ref>
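The per-dollar comparison above is simple normalization arithmetic. As a minimal sketch (with made-up figures, not data from the cited study), the following shows how dividing citation counts by total funding can invert an institution-level ranking: the institution with more citations in absolute terms can still have lower citation impact per dollar.

```python
# Hypothetical illustration of funding-efficiency arithmetic.
# Figures are invented for demonstration; they are not from the
# Wahls (2018) study cited above.

def citations_per_dollar(total_citations: int, total_funding: float) -> float:
    """Citation impact normalized by funding: citations per dollar."""
    return total_citations / total_funding

institutions = {
    "prestigious":      {"citations": 12_000, "funding": 10_000_000},
    "less prestigious": {"citations": 8_000,  "funding": 4_000_000},
}

for name, inst in institutions.items():
    eff = citations_per_dollar(inst["citations"], inst["funding"])
    # Report per $1M of funding for readability.
    print(f"{name}: {eff * 1_000_000:.0f} citations per $1M")
```

Here the "prestigious" institution has 50% more citations overall, yet the "less prestigious" one yields more citations per dollar, which is the kind of inversion the quoted finding describes.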