RSS West Midlands: Judging a book by its cover: How much of REF 'research quality' is really 'journal reputation'?
Date: No date given
Speakers: David Selby and Professor David Firth, University of Warwick
The Research Excellence Framework (REF) is a periodic UK-wide assessment of the quality of published research in universities. The most recent REF was in 2014, and the next will be in 2021. The published results of REF2014 include a categorical 'quality profile' for each unit of assessment (typically a university department), reporting what percentage of the unit's REF-submitted research outputs were assessed as being at each of four quality levels (labelled 4*, 3*, 2* and 1*). Also in the public domain are the original submissions made to REF2014, which include — for each unit of assessment — publication details of the REF-submitted research outputs.
In this study we address the question: to what extent can a REF quality profile for research outputs be attributed to the journals in which (most of) those outputs were published?
The data are the published submissions and results from REF2014. The main statistical challenge comes from the fact that REF quality profiles are available only at the aggregated level of whole units of assessment: the REF panel's assessment of each individual research output is not made public. Our research question is thus an 'ecological inference' problem, which demands special care in model formulation and methodology. The analysis is based on logit models in which journal-specific parameters are regularized via prior 'pseudo-data'. We develop a lack-of-fit measure for the extent to which REF scores appear to depend on publication venues rather than research quality or institution-level differences. Results are presented for several research fields.
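The detailed model is not given in the abstract; as a hedged illustration of the ecological-inference setup it describes, the sketch below estimates journal-specific 4* rates from unit-level aggregate counts only, with journal parameters pulled towards a neutral value by prior 'pseudo-data'. All names, the simulated data, and the quasi-binomial likelihood are assumptions for illustration, not the authors' actual method.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)

# Hypothetical setup: 30 units of assessment, 5 journals.
n_units, n_journals = 30, 5
true_p = np.array([0.1, 0.3, 0.5, 0.7, 0.9])  # assumed journal-level 4* rates

# N[u, j]: number of outputs unit u published in journal j
N = rng.poisson(4, size=(n_units, n_journals))
# Only aggregates are observed: total 4* outputs per unit, not per output
y = np.array([rng.binomial(N[u], true_p).sum() for u in range(n_units)])

def neg_penalised_ll(beta, pseudo=2.0):
    """Quasi-binomial fit to the aggregated counts. The 'pseudo-data'
    regularisation gives each journal `pseudo` imaginary outputs,
    half of them 4*, shrinking its rate towards 0.5."""
    p = expit(beta)                       # journal-level 4* probabilities
    mu = N @ p                            # expected 4* count per unit
    tot = np.maximum(N.sum(axis=1), 1)    # outputs per unit (guard zeros)
    eps = 1e-9
    ll = np.sum(y * np.log(mu / tot + eps)
                + (tot - y) * np.log(1 - mu / tot + eps))
    # pseudo-data: pseudo/2 successes and pseudo/2 failures per journal
    ll += (pseudo / 2) * np.sum(np.log(p + eps) + np.log(1 - p + eps))
    return -ll

res = minimize(neg_penalised_ll, x0=np.zeros(n_journals), method="BFGS")
print(np.round(expit(res.x), 2))  # recovered journal-level 4* rates
```

The pseudo-data term plays the role of a prior: with few outputs in a journal, its estimated rate stays near 0.5; with many, the data dominate.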
Keywords: HDRUK
Venue: The Royal Statistical Society
City: London
Country: United Kingdom
Postcode: EC1Y 8LX
Organizer: Royal Statistical Society
Event types:
- Workshops and courses