===Reproducible research in practice===
Psychology has seen a renewal of internal concerns about irreproducible results (see the entry on [[replicability crisis]] for empirical results on success rates of replications). A 2006 study found that, of 141 authors of empirical articles published in American Psychological Association (APA) journals, 103 (73%) did not share their data over a six-month period.<ref>{{Cite journal|last1=Wicherts |first1=J. M. |last2=Borsboom |first2=D. |last3=Kats |first3=J. |last4=Molenaar |first4=D. |title=The poor availability of psychological research data for reanalysis |doi=10.1037/0003-066X.61.7.726 |journal=American Psychologist |volume=61 |issue=7 |pages=726–728 |year=2006 |pmid=17032082}}</ref> A follow-up study published in 2015 found that 246 of 394 contacted authors of papers in APA journals (62%) did not share their data upon request.<ref>{{Cite journal|last1=Vanpaemel |first1=W. |last2=Vermorgen |first2=M. |last3=Deriemaecker |first3=L. |last4=Storms |first4=G. |title=Are we wasting a good crisis? The availability of psychological research data after the storm |doi=10.1525/collabra.13 |journal=Collabra |volume=1 |issue=1 |pages=1–5 |year=2015 |doi-access=free}}</ref> A 2012 paper suggested that researchers should publish data along with their works, and released a dataset alongside as a demonstration.<ref>{{Cite journal|last1=Wicherts |first1=J. M. |last2=Bakker |first2=M. |doi=10.1016/j.intell.2012.01.004 |title=Publish (your data) or (let the data) perish! Why not publish your data too? 
|journal=Intelligence |volume=40 |issue=2 |pages=73–76 |year=2012}}</ref> In 2017, an article published in ''[[Scientific Data (journal)|Scientific Data]]'' suggested that this may not be sufficient and that the whole analysis context should be disclosed.<ref>{{cite journal|last1=Pasquier|first1=Thomas|last2=Lau|first2=Matthew K.|last3=Trisovic|first3=Ana|last4=Boose|first4=Emery R.|last5=Couturier|first5=Ben|last6=Crosas|first6=Mercè|last7=Ellison|first7=Aaron M.|last8=Gibson|first8=Valerie|last9=Jones|first9=Chris R.|last10=Seltzer|first10=Margo|title=If these data could talk|journal=Scientific Data|date=5 September 2017|volume=4|issue=1 |pages=170114|doi=10.1038/sdata.2017.114|pmid=28872630|pmc=5584398|bibcode=2017NatSD...470114P}}</ref>

In economics, concerns have been raised about the credibility and reliability of published research. In other sciences, reproducibility is regarded as fundamental and is often a prerequisite for publication; in economics, however, it has not been treated as a high priority. Most peer-reviewed economics journals do not take substantive measures to ensure that published results are reproducible, although the top economics journals have been moving to adopt mandatory data and code archives.<ref>{{cite journal |last1=McCullough |first1=Bruce |title=Open Access Economics Journals and the Market for Reproducible Economic Research |journal=Economic Analysis and Policy |date=March 2009 |volume=39 |issue=1 |pages=117–126 |doi=10.1016/S0313-5926(09)50047-1|doi-access= }}</ref> There are few or no incentives for researchers to share their data, and authors would have to bear the costs of compiling data into reusable forms. Economic research is often not reproducible, as only a portion of journals have adequate disclosure policies for datasets and program code, and even where such policies exist, authors frequently do not comply with them or publishers do not enforce them. 
A study of 599 articles published in 37 peer-reviewed journals found that, while some journals achieved high compliance rates, a significant portion complied only partially or not at all. At the article level, the average compliance rate was 47.5%; at the journal level, the average compliance rate was 38%, ranging from 13% to 99%.<ref>{{cite journal |last1=Vlaeminck |first1=Sven |last2=Podkrajac |first2=Felix |title=Journals in Economic Sciences: Paying Lip Service to Reproducible Research? |journal=IASSIST Quarterly |date=2017-12-10 |volume=41 |issue=1–4 |page=16 |doi=10.29173/iq6 |url=https://iassistquarterly.com/index.php/iassist/article/view/6/905|hdl=11108/359 |s2cid=96499437 |hdl-access=free }}</ref> A 2018 study published in the journal ''[[PLOS ONE]]'' found that 14.4% of a sample of public health statistics researchers had shared their data or code, or both.<ref>{{Cite journal|date=2018|title=Use of reproducible research practices in public health: A survey of public health analysts.|journal=PLOS ONE|volume=13|issue=9|pages=e0202447|issn=1932-6203|oclc=7891624396|bibcode=2018PLoSO..1302447H|last1=Harris|first1=Jenine K.|last2=Johnson|first2=Kimberly J.|last3=Carothers|first3=Bobbi J.|last4=Combs|first4=Todd B.|last5=Luke|first5=Douglas A.|last6=Wang|first6=Xiaoyan|doi=10.1371/journal.pone.0202447|pmid=30208041|pmc=6135378|doi-access=free}}</ref>

There have been initiatives to improve reporting, and hence reproducibility, in the medical literature for many years, beginning with the [[Consolidated Standards of Reporting Trials|CONSORT]] initiative, which is now part of a wider initiative, the [[EQUATOR Network]]. 
This group has recently turned its attention to how better reporting might reduce waste in research,<ref>{{Cite web|title=Research Waste/EQUATOR Conference {{!}} Research Waste |url=http://researchwaste.net/research-wasteequator-conference/ |website=researchwaste.net |url-status=dead |archive-url=https://web.archive.org/web/20161029015313/http://researchwaste.net:80/research-wasteequator-conference/ |archive-date=29 October 2016}}</ref> especially biomedical research.

Reproducible research is key to new discoveries in [[pharmacology]]. A Phase I discovery is followed by Phase II reproductions as a drug develops towards commercial production. In recent decades, Phase II success rates have fallen from 28% to 18%. A 2011 study found that 65% of medical studies were inconsistent when re-tested, and only 6% were completely reproducible.<ref>{{Cite journal|last1=Prinz |first1=F. |last2=Schlange |first2=T. |last3=Asadullah |first3=K. |doi=10.1038/nrd3439-c1 |title=Believe it or not: How much can we rely on published data on potential drug targets? |journal=Nature Reviews Drug Discovery |volume=10 |issue=9 |page=712 |year=2011 |pmid=21892149 |doi-access=free}}</ref>

Some efforts have been made to increase replicability beyond the social and biomedical sciences. Studies in the humanities tend to rely more on expertise and hermeneutics, which may make replicability more difficult. Nonetheless, some efforts have been made to call for more transparency and documentation in the humanities.<ref>{{Cite journal |last1=Van Eyghen |first1=Hans |last2= Van den Brink |first2=Gijsbert |last3= Peels |first3= Rik |title=Brooke on the Merton Thesis: A Direct Replication of John Hedley Brooke's Chapter on Scientific and Religious Reform |journal=Zygon: Journal of Religion and Science |volume=59 |issue=2 |year=2024|url=https://www.zygonjournal.org/article/id/11497/| doi=10.16995/zygon.11497|doi-access=free }}</ref>