{"title":"Technological Impact of Funded Research: A Case Study of Non-Patent References","authors":"Justin Quemener, Luis Miotti, Abdelghani Maddi","doi":"10.1162/qss_a_00281","DOIUrl":"https://doi.org/10.1162/qss_a_00281","url":null,"abstract":"\u0000 Research funding is essential to expand knowledge, foster innovation, and address the complex challenges that shape our future. The scientific literature has extensively addressed the relationship between research funding and the academic impact. More recently, several studies have analyzed the technological impact of funded research as measured through citations in patents, known as Non-Patent References (NPRs). But there remains much to know about NPRs and the multiplication of case studies is necessary to characterize them. Here we analyze a sample of 7065 publications funded by the French Health Research Foundation (FRM) and the citations of these publications in patents. This study shows the high scientific and technological impacts of FRM funding. Indeed, the publications funded by FRM, which are cited in patents, are 3.5 times more frequently cited by other publications than the global average (for funded publications in the entire database, this ratio is 2.6). Furthermore, our results also indicate that USPTO patents citing these publications exhibit high-quality indicators. Moreover, five of these patents have led to approved drug products by the USA Food and Drug Administration (FDA). This study provides further evidence of the positive influence that research funding can have on both scientific and technological advancements.\u0000 \u0000 \u0000 https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00281\u0000","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"110 22","pages":""},"PeriodicalIF":6.4,"publicationDate":"2024-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139388015","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Socio-cultural factors and academic openness of world countries","authors":"Mahmood Khosrowjerdi, Silje Hernæs Linhart","doi":"10.1162/qss_a_00278","DOIUrl":"https://doi.org/10.1162/qss_a_00278","url":null,"abstract":"\u0000 Academic openness (i.e., the extent of collaborative academic activities of nations with external actors in the science communication system) has been regarded as a major contributor to strong science. We used several nation-level datasets to explore the associations of socio-cultural factors with the academic openness of world nations. In order to check the robustness of relationships, two distinct datasets for academic openness of nations were used in this research. Our findings showed the strong relationships of some dimensions of national culture and (economic and human) resources with both academic openness indicators. The findings were discussed considering Schwartz's national culture theory, and the implications were presented in conclusion.\u0000 \u0000 \u0000 https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00278\u0000","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"52 40","pages":""},"PeriodicalIF":6.4,"publicationDate":"2023-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138593007","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The rise of responsible metrics as a professional reform movement: A collective action frames account","authors":"Alex Rushforth, Björn Hammarfelt","doi":"10.1162/qss_a_00280","DOIUrl":"https://doi.org/10.1162/qss_a_00280","url":null,"abstract":"\u0000 Recent years have seen a rise in awareness around “responsible metrics” and calls for research assessment reforms internationally. Yet within the field of quantitative science studies and in research policy contexts, concerns about the limitations of evaluative bibliometrics are almost as old as the tools themselves. Given that many of the concerns articulated in recent reform movements go back decades, why has momentum for change grown only in the past ten years? In this paper, we draw on analytical insights from the sociology of social movements on collective action frames to chart the emergence, development, and expansion of “responsible metrics” as a professional reform movement. Through reviewing important texts that have shaped reform efforts, we argue that hitherto, three framings have underpinned the responsible metrics reform agenda: the metrics scepticism framing, the professional-expert framing, and the reflexivity framing. We suggest that while these three framings have co-existed within the responsible metrics movement to date, co-habitation between these framings may not last indefinitely, especially as the responsible metrics movement extends into wider research assessment reform movements.\u0000 \u0000 \u0000 https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00280\u0000","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"47 2","pages":""},"PeriodicalIF":6.4,"publicationDate":"2023-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138593854","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Scope and limitations of library metrics for the assessment of ebook usage: COUNTER R5 and link resolver","authors":"Mercedes Echeverria, Yacelli Bustamante","doi":"10.1162/qss_a_00279","DOIUrl":"https://doi.org/10.1162/qss_a_00279","url":null,"abstract":"\u0000 Data is at the heart of electronic resource management in academic libraries. Assessing the usage data of electronic resources has become a prevalent approach to demonstrate the value of digital collections, justify library expenditures, and gain insights into how users interact with library materials. This study analyses the usage statistics of electronic books (ebooks) generated locally by the OpenURL link resolver in an academic library, and statistics collected by platform vendors based on Release 5 of the Counting Online Usage of Networked Electronic Resource (COUNTER R5). Three content provider platforms (Cambridge Core, EBSCOhost and ScienceDirect) were analysed as data sources. The COUNTER and link resolver statistics were examined to determine the degree of association between these two metrics. The Spearman correlation coefficient was moderate (rs > 0.561 and <0.678) and statistically significant (p <0.01). This suggests that these metrics capture different aspects of the usage of ebooks in different contexts. Other factors, such as the types of access to electronic resources and the units of content delivered, were also examined. The study concludes with a discussion regarding the scope and limitations of link resolver and COUNTER R5 as library metrics for measuring the usage of ebooks.\u0000 \u0000 \u0000 https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00279\u0000","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"46 3","pages":""},"PeriodicalIF":6.4,"publicationDate":"2023-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138593382","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"New methodologies for the digital age? How methods (re-)organize research using social media data","authors":"Yangliu Fan, Sune Lehmann, Anders Blok","doi":"10.1162/qss_a_00271","DOIUrl":"https://doi.org/10.1162/qss_a_00271","url":null,"abstract":"Abstract As “big and broad” social media data continues to expand and become a more prevalent source for research, much remains to be understood about its epistemological and methodological implications. Drawing on an original dataset of 12,732 research articles using social media data, we employ a novel dictionary-based approach to map the use of methods. Specifically, our approach draws on a combination of manual coding and embedding-enhanced query expansion. We cluster journals in groups of densely connected research communities to investigate how heterogeneous these groups are in terms of the methods used. First, our results indicate that research in this domain is largely organized by methods. Some communities tend to have a mono-method culture, while others combine methods in novel ways. Comparing practices across communities, we observe that computational methods have penetrated many research areas but not the research space surrounding ethnography. Second, we identify two core axes of variation—social sciences vs. computer science and methodological individualism vs. relationalism—that organize the domain as a whole, suggesting new methodological divisions and debates. Peer Review https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00271","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"10 6","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135874855","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Measuring university size. A comparison of Academic Personnel versus Scientific Talent Pool data","authors":"Benedetto Lepori, Lutz Bornmann, Félix de Moya Anegón","doi":"10.1162/qss_a_00273","DOIUrl":"https://doi.org/10.1162/qss_a_00273","url":null,"abstract":"Abstract This paper compares two measures of organizational size of Higher Education Institutions widely used in the literature: the number of Academic Personnel (AP) measured according to definitions from international education statistics, and the Scientific Talent Pool (STP), i.e. the number of unique authors affiliated to the HEI as derived from the Scopus database. Based on their definitions and operationalizations, we derive expectations on the factors generating differences between these two measures, as related to the HEI’s research orientation and subject mix, as well as to the presence of a university hospital. We test these expectations on a sample of more than 1,500 HEIs in Europe by combining data from the European Tertiary Education Register and from the Scimago Institutions Ranking. Our results provide support to the expected relationships and also highlight cases where the institutional perimeter of HEIs is systematically different between the two sources. We conclude that these two indicators provide complementary measures of institutional size, one more focused on the organizational perimeter as defined by employment relationships, the other on the persons who contribute to the HEI’s scientific visibility. Comparing the two indicators therefore is likely to provide a more in-depth understanding of the HEI resources available. Peer Review https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00273","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"10 5","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135874856","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Scientific reform, citation politics and the bureaucracy of oblivion","authors":"Berna Devezer, Bart Penders","doi":"10.1162/qss_c_00274","DOIUrl":"https://doi.org/10.1162/qss_c_00274","url":null,"abstract":"Abstract Current reform movements in science seek to change how researchers do science, the tools and infrastructure they use to so, and how they assess each others’ work in terms of quality and value. Here, we argue that openness and replicability are quickly becoming key indicators for such quality assessments and they sometimes operate through citation strategies that actively pursue (some degree of) oblivion for non-reformed science. We do not oppose a genuine pursuit of transparency and methodological quality, but are concerned by how uncritical and oversimplified interpretations of both are skewing the collective memory of the scholarly community. Peer Review https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_c_00274","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"10 4","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135874857","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Novel utilization of a paper-level classification system for the evaluation of journal impact: An update of the CAS Journal Ranking","authors":"Sichao Tong, Fuyou Chen, Liying Yang, Zhesi Shen","doi":"10.1162/qss_a_00270","DOIUrl":"https://doi.org/10.1162/qss_a_00270","url":null,"abstract":"Abstract Since its first release in 2004, the CAS Journal Ranking, a ranking system of journals based on a citation impact indicator, has been widely used both in selecting journals when submitting manuscripts and conducting research evaluation in China This paper introduces an upgraded version of the CAS Journal Ranking released in 2020 and the corresponding improvements. We will discuss the following improvements: (1) the CWTS paper-level classification system, a fine-grained classification system, utilized for field normalization, (2) the Field Normalized Citation Success Index (FNCSI), an indicator which is robust against not only extremely highly cited publications, but also wrongly assigned document types, and (3) document type difference. In addition, this paper will present part of the ranking results and an interpretation of the features of the FNCSI indicator. Peer Review https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00270","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"10 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135874858","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Retracted articles use less free and open-source software and cite it worse","authors":"David Schindler, Erjia Yan, Sascha Spors, Frank Krüger","doi":"10.1162/qss_a_00275","DOIUrl":"https://doi.org/10.1162/qss_a_00275","url":null,"abstract":"Abstract As an essential mechanism of scientific self-correction, articles are retracted for many reasons including errors in processing data and computation of results. In today’s data-driven science, the validity of research data and results significantly depends on the software employed. We investigate the relationship between software usage and research validity, eventually leading to article retraction, by analyzing software mentioned across 1,924 retraction notices and 3,271 retracted articles. We systematically compare software mentions and related information with control articles sampled by Coarsened Exact Matching by recognizing publication year, scientific domain, and journal rank. We identify article retractions caused by software errors or misuse and find that retracted articles use fewer free and open-source software hampering reproducible research and quality control. Moreover, such differences are also present concerning software citation, where retracted articles less frequently follow software citation guidelines regarding free and open-source software. Peer Review https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00275","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"10 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135874859","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Oligopoly’s Shift to Open Access. How the Big Five Academic Publishers Profit from Article Processing Charges","authors":"Leigh-Ann Butler, Lisa Matthias, Marc-André Simard, Philippe Mongeon, Stefanie Haustein","doi":"10.1162/qss_a_00272","DOIUrl":"https://doi.org/10.1162/qss_a_00272","url":null,"abstract":"Abstract This study aims to estimate the total amount of article processing charges (APCs) paid to publish open access (OA) in journals controlled by the five large commercial publishers Elsevier, Sage, Springer-Nature, Taylor & Francis and Wiley between 2015 and 2018. Using publication data from WoS, OA status from Unpaywall and annual APC prices from open datasets and historical fees retrieved via the Internet Archive Wayback Machine, we estimate that globally authors paid $1.06 billion in publication fees to these publishers from 2015–2018. Revenue from gold OA amounted to $612.5 million, while $448.3 million was obtained for publishing OA in hybrid journals. Among the five publishers, Springer-Nature made the most revenue from OA ($589.7 million), followed by Elsevier ($221.4 million), Wiley ($114.3 million), Taylor & Francis ($76.8 million) and Sage ($31.6 million). With Elsevier and Wiley making most of APC revenue from hybrid fees and others focusing on gold, different OA strategies could be observed between publishers. Peer Review https://www.webofscience.com/api/gateway/wos/peer-review/10.1162/qss_a_00272","PeriodicalId":34021,"journal":{"name":"Quantitative Science Studies","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135874860","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}