Abstract
This study focuses on a recently introduced type of indicator that measures disruptiveness in science. Disruptive research diverges from current lines of research by opening up new ones. In the current study, we included the initially proposed indicator of this new type (DI1; Funk & Owen-Smith, 2017; Wu, Wang, & Evans, 2019) and several variants of it: DI5, DI1n, DI5n, and DEP. Since indicators should measure what they propose to measure, we investigated the convergent validity of these indicators. We used a list of milestone papers, selected and published by the editors of Physical Review Letters, and investigated whether this expert-based list is related to the values of the various disruption indicator variants and, if so, which variants correlate most strongly with the expert judgements. We used bivariate statistics, multiple regression models, and coarsened exact matching (CEM) to investigate the convergent validity of the indicators. The results show that the indicators correlate differently with the milestone paper assignments by the editors. It is not the initially proposed disruption index (DI1) that performed best, but the variant DI5, which was introduced by Bornmann, Devarakonda, Tekles, and Chacko (2020a). In the CEM analysis of this study, the DEP variant, introduced by Bu, Waltman, and Huang (in press), also showed favorable results.
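For orientation, the original disruption index DI1 (Funk & Owen-Smith, 2017; Wu, Wang, & Evans, 2019) relates, for a focal paper, the citing papers that ignore its cited references to those that also cite them. The sketch below states this standard definition; it is background rather than part of the abstract. The variants named above adjust the counting rules (for example, DI5 applies a threshold on how many of the focal paper's cited references a citing paper must cite, and DEP takes a dependency-based perspective), as described in Bornmann et al. (2020a) and Bu et al. (in press).

```latex
% Standard definition of the original disruption index DI1 for a focal paper p
% (a background sketch following Funk & Owen-Smith, 2017, and Wu, Wang, & Evans, 2019;
% not quoted from the abstract above).
\[
  DI_1(p) \;=\; \frac{N_F - N_B}{N_F + N_B + N_R}
\]
% where
%   N_F = number of papers citing p but none of p's cited references,
%   N_B = number of papers citing p and at least one of p's cited references,
%   N_R = number of papers citing at least one of p's cited references but not p.
```

DI1 ranges from -1 (citing papers consistently also cite the focal paper's references, i.e., consolidating research) to 1 (citing papers ignore the focal paper's references, i.e., disruptive research).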
Document type: | Journal article |
---|---|
Faculty: | Social Sciences > Department: Institut für Soziologie |
Subject areas: | 300 Social sciences > 300 Social sciences, sociology |
ISSN: | 1751-1577 |
Language: | English |
Document ID: | 97161 |
Date of publication on Open Access LMU: | 05 Jun 2023, 15:25 |
Last modified: | 05 Jun 2023, 15:25 |