Scientometrics



Scientometrics is a subfield of informetrics that studies quantitative aspects of scholarly literature.<ref>Template:Cite encyclopedia</ref> Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.<ref name="ScientometricsLeydesdorff">Leydesdorff, L. and Milojevic, S., "Scientometrics" arXiv:1208.4566 (2013), forthcoming in: Lynch, M. (editor), International Encyclopedia of Social and Behavioral Sciences subsection 85030. (2015)</ref> In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that overreliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.

Historical development

Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. The latter created the Science Citation Index<ref name="ScientometricsLeydesdorff" /> and founded the Institute for Scientific Information, whose citation databases are heavily used for scientometric analysis. A dedicated academic journal, Scientometrics, was established in 1978. The industrialization of science increased the number of publications and research outcomes, and the rise of computers allowed effective analysis of this data.<ref>De Solla Price, D., editorial statement. Scientometrics Volume 1, Issue 1 (1978)</ref> While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications.<ref name="ScientometricsLeydesdorff" /> Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes.<ref name="papers.ssrn.com">Template:Cite journal</ref><ref name="Lowry, Paul Benjamin 2013">Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; and Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars' journal basket via bibliometric measures: Do expert journal assessments add value?," MIS Quarterly (MISQ), vol. 37(4), 993–1012. Also, see a YouTube video narrative of this paper at: https://www.youtube.com/watch?v=LZQIDkA-ke0.</ref>

The International Society for Scientometrics and Informetrics, founded in 1993, is an association of professionals in the field.<ref>{{#invoke:citation/CS1|citation |CitationClass=web }}</ref>

Later, around the turn of the century, the evaluation and ranking of scientists and institutions came into the spotlight. Based on bibliometric analysis of scientific publications and citations, the Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by Shanghai Jiao Tong University. Impact factors became an important tool for choosing between journals, and rankings such as the Academic Ranking of World Universities and the Times Higher Education World University Rankings (THE ranking) became indicators of the status of universities. The h-index became an important indicator of the productivity and impact of a scientist's work, although alternative author-level metrics have been proposed.<ref>Template:Cite journal</ref><ref>Template:Cite journal</ref>
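The h-index mentioned above has a simple operational definition: a scientist has index h if h of their papers have each received at least h citations. A minimal sketch in Python (illustrative only, not tied to any particular citation database):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    have received at least h citations each."""
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cited, start=1):
        if count >= rank:
            h = rank  # this paper still supports an h of `rank`
        else:
            break
    return h

# An author with papers cited 10, 8, 5, 4 and 3 times has h = 4:
# four papers each have at least four citations, but not five.
print(h_index([10, 8, 5, 4, 3]))
```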

Around the same time, governments' interest in evaluating research to assess the impact of science funding increased. As investments in scientific research were included in the U.S. American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like STAR METRICS were set up to assess whether the anticipated positive impact on the economy would actually occur.<ref name="Lane1">Template:Cite journal</ref>

Methods and findings

Methods of research include qualitative, quantitative and computational approaches. The main focus of studies has been on institutional productivity comparisons, institutional research rankings, journal rankings,<ref name="papers.ssrn.com"/><ref name="Lowry, Paul Benjamin 2013"/><ref>Template:Cite journal Recipient of the Rudolph Joenk Award for Best Paper Published in IEEE Transactions on Professional Communication in 2007.</ref> establishing faculty productivity and tenure standards,<ref>Template:Cite journal</ref> assessing the influence of top scholarly articles,<ref>Template:Cite journal</ref> and developing profiles of top authors and institutions in terms of research performance.<ref>Template:Cite journal</ref>

One significant finding in the field is the principle of cost escalation: achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. However, new algorithmic methods in search, machine learning and data mining are showing that this is not the case for many information retrieval and extraction problems.

More recent methods rely on open source and open data to ensure transparency and reproducibility in line with modern open science requirements. For instance, the Unpaywall index and attendant research on open access trends is based on data retrieved from OAI-PMH endpoints of thousands of open archives provided by libraries and institutions worldwide.<ref>Template:Cite bioRxiv</ref>
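OAI-PMH endpoints like those behind Unpaywall expose article metadata as harvestable XML. As an illustration of the kind of data such studies work from, the following sketch parses a fragment of an OAI-PMH ListRecords response using only Python's standard library; the sample record is fabricated for exposition, and a real harvester would additionally fetch pages over HTTP and follow `resumptionToken` elements:

```python
import xml.etree.ElementTree as ET

# Standard OAI-PMH and Dublin Core namespaces.
NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

# Fabricated minimal ListRecords fragment for illustration.
SAMPLE = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>An open access study</dc:title>
          <dc:date>2018-05-04</dc:date>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def parse_records(xml_text):
    """Extract (title, date) pairs from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_text)
    out = []
    for rec in root.findall(".//oai:record", NS):
        title = rec.findtext(".//dc:title", default="", namespaces=NS)
        date = rec.findtext(".//dc:date", default="", namespaces=NS)
        out.append((title, date))
    return out

print(parse_records(SAMPLE))
```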

Recommendations to avoid common errors in scientometrics include: selecting topics with sufficient data; using data mining and web scraping; combining methods; and eliminating "false positives".<ref>Han, J., Kamber, M., Pei, J. 2012. Data Mining: Concepts and Techniques. Morgan Kaufmann, Waltham, MA, USA.</ref><ref>Template:Cite journal</ref> It is also necessary to understand the limits of search engines (e.g. Web of Science, Scopus and Google Scholar), which fail to index thousands of studies published in small journals or in developing countries.<ref>Template:Cite journal</ref>

Common scientometric indexes

Indexes may be classified as article-level metrics, author-level metrics, and journal-level metrics depending on which feature they evaluate.

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI).
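Concretely, the standard two-year impact factor for year Y is the number of citations received in Y by items published in Y−1 and Y−2, divided by the number of citable items published in those two years. A minimal sketch (the function name and inputs are for exposition only):

```python
def impact_factor(cites_y1, cites_y2, items_y1, items_y2):
    """Two-year journal impact factor for year Y.

    cites_y1, cites_y2: citations received in Y to articles from Y-1 and Y-2.
    items_y1, items_y2: citable items the journal published in Y-1 and Y-2.
    """
    citable = items_y1 + items_y2
    if citable == 0:
        raise ValueError("journal published no citable items in the window")
    return (cites_y1 + cites_y2) / citable

# 30 + 20 citations to 10 + 15 citable items gives an IF of 2.0.
print(impact_factor(30, 20, 10, 15))
```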

Science Citation Index

The Science Citation Index (SCI) is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield. It was officially launched in 1964. It is now owned by Clarivate Analytics (previously the Intellectual Property and Science business of Thomson Reuters).<ref name=dimension> Template:Cite journal</ref><ref name=evolve> Template:Cite journal</ref><ref name=gOverview> {{#invoke:citation/CS1|citation |CitationClass=web }}</ref><ref name=history-cite-indexing> {{#invoke:citation/CS1|citation |CitationClass=web }}</ref> The larger version (Science Citation Index Expanded) covers more than 8,500 notable and significant journals, across 150 disciplines, from 1900 to the present. These are alternatively described as the world's leading journals of science and technology, because of a rigorous selection process.<ref name=Expanded> {{#invoke:citation/CS1|citation |CitationClass=web }}</ref><ref name=wetland> Template:Cite journal</ref><ref name=shan> Template:Cite journal</ref>

Acknowledgment index


An acknowledgment index (British acknowledgement index)<ref name="gram">{{#invoke:citation/CS1|citation |CitationClass=web }}</ref> is a method for indexing and analyzing acknowledgments in the scientific literature and, thus, quantifying the impact of acknowledgments. Typically, a scholarly article has a section in which the authors acknowledge entities such as funding bodies, technical staff, and colleagues that have contributed materials or knowledge or have influenced or inspired their work. Like a citation index, it measures influences on scientific work, but in a different sense; it measures institutional and economic influences as well as informal influences of individual people, ideas, and artifacts. Unlike the impact factor, it does not produce a single overall metric, but analyzes the components separately. However, the total number of acknowledgments to an acknowledged entity can be measured, and so can the number of citations to the papers in which the acknowledgment appears. The ratio of this total number of citations to the total number of papers in which the acknowledged entity appears can be construed as the impact of that acknowledged entity.<ref>Template:Cite conference</ref><ref>Template:Cite journal</ref>
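The ratio described above is straightforward to compute: given the citation counts of the papers that acknowledge an entity, the entity's impact is their total citations divided by the number of acknowledging papers. A minimal sketch (function and variable names are illustrative):

```python
def acknowledgment_impact(citation_counts):
    """Impact of an acknowledged entity: total citations received by the
    papers acknowledging it, divided by the number of such papers."""
    if not citation_counts:
        return 0.0  # entity is acknowledged in no indexed papers
    return sum(citation_counts) / len(citation_counts)

# A funder acknowledged in three papers cited 10, 0 and 5 times
# has an impact of 15 / 3 = 5.0.
print(acknowledgment_impact([10, 0, 5]))
```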

Altmetrics

In scholarly and scientific publishing, altmetrics are nontraditional bibliometrics<ref name="PLOSCollections">{{#invoke:citation/CS1|citation |CitationClass=web }}</ref> proposed as an alternative<ref>"The "alt" does indeed stand for "alternative"" Jason Priem, leading author in the Altmetrics Manifesto comment 592</ref> or complement<ref name=":2">Template:Cite journal</ref> to more traditional citation impact metrics, such as impact factor and h-index.<ref>Template:Cite journal</ref> The term altmetrics was proposed in 2010,<ref name="Altmetrics-Manifesto-2011">Template:Cite journal</ref> as a generalization of article level metrics,<ref name="UCBerkeley-BinfieldTalk-2009">{{#invoke:citation/CS1|citation |CitationClass=web }}Template:Cbignore</ref> and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms.
Altmetrics did not originally cover citation counts,<ref name="OpeningScience-2014">Template:Cite book</ref> but calculate scholarly impact based on diverse online research output, such as social media, online news media, online reference managers and so on.<ref name="IEEESpectrum-MeasuringImpact-2012">Template:Cite journal</ref><ref name="SerialsReview-Rethinking-2013">Template:Cite journal</ref> They demonstrate both the impact and the detailed composition of that impact.<ref name="Altmetrics-Manifesto-2011" /> Altmetrics can be applied to research filtering,<ref name="Altmetrics-Manifesto-2011" /> promotion and tenure dossiers, grant applications,<ref>Template:Cite journal</ref><ref>Template:Cite conference</ref> and the ranking of newly published articles in academic search engines.<ref name=":6">Template:Cite conference</ref>

Criticisms

Critics have argued that overreliance on scientometrics has created a publish or perish environment with perverse incentives that lead to low-quality research.<ref>Template:Cite journal</ref><ref>Template:Cite journal</ref>

In popular culture

The main character in Michael Frayn’s novel Skios is a Professor of Scientometrics.

See also


Journals

References and footnotes

Template:Reflist

External links
