Abramo, G., D'Angelo, C.A. (2016). A comparison of university performance scores and ranks by MNCS and FSS. JOURNAL OF INFORMETRICS, 10(4), 889-901 [10.1016/j.joi.2016.07.004].

A comparison of university performance scores and ranks by MNCS and FSS

Abramo, Giovanni; D'Angelo, Ciriaco Andrea
2016-01-01

Abstract

In a previous article of ours, we explained the reasons why the MNCS and all similar per-publication citation indicators should not be used to measure research performance, whereas efficiency indicators (output to input) such as the FSS are valid indicators of performance. The problem frequently indicated in measuring efficiency indicators lies in the availability of input data. If we accept that such data are inaccessible, and instead resort to per-publication citation indicators, the question arises as to what extent institution performance rankings by MNCS are different from those by FSS (and so what effects such results could have on policy-makers, managers and other users of the rankings). Contrasting the 2008–2012 performance by MNCS and FSS of Italian universities in the Sciences, we try to answer that question at field, discipline, and overall university level. We present the descriptive statistics of the shifts in rank, and the correlations of both scores and ranks. The analysis reveals strong correlations in many fields but weak correlations in others. The extent of rank shifts is never negligible: a number of universities shift from top to non-top quartile ranks.
2016
Published
International relevance
Article
Anonymous peer review
Sector ING-IND/35 - Management Engineering
English
Bibliometrics; Productivity; Research evaluation; Universities; Statistics and Probability; Modeling and Simulation; Computer Science Applications; Computer Vision and Pattern Recognition; Management Science and Operations Research; Applied Mathematics
http://www.journals.elsevier.com/journal-of-informetrics/
Abramo, G.; D'Angelo, C.A.
Journal article
Files in this record:
File: 1-s2.0-S1751157716301730-main.pdf (Adobe PDF, 715.7 kB)
Access: authorized users only (copy available on request)
License: Publisher's copyright

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/166674
Citations
  • PMC: n/a
  • Scopus: 7
  • Web of Science: 7