Research Excellence Framework

The Research Excellence Framework (REF) is a research impact evaluation of British higher education institutions. It is the successor to the Research Assessment Exercise and was first used in 2014 to assess the period 2008–2013.[1][2] The REF is undertaken by the four UK higher education funding bodies: Research England, the Scottish Funding Council (SFC), the Higher Education Funding Council for Wales (HEFCW), and the Department for the Economy, Northern Ireland (DfE).

Its stated aims are to provide accountability for public investment in research, to establish "reputational yardsticks",[3] and thereby to achieve an efficient allocation of resources. Critics argue, inter alia, that there is too much focus on the impact of research outside the university system, and that such impact has little real relevance to the quality of the research. It has also been suggested that the REF encourages mediocrity in published research and discourages research that might have value in the long term.

The next iteration of the REF was scheduled for 2021, continuing the previous assessment model of focusing on research outputs, research impact and research environment.[4] However, the process was delayed because of the COVID-19 pandemic.[5]

History

In June 2007 the Higher Education Funding Council for England (HEFCE) issued a circular letter announcing that a new framework for assessing research quality in UK universities would replace the Research Assessment Exercise (RAE), following the 2008 RAE.[6] The following quote from the letter indicates some of the original motivation:

Our key aims for the new framework will be:

  • to produce robust UK-wide indicators of research excellence for all disciplines which can be used to benchmark quality against international standards and to drive the Council's funding for research
  • to provide a basis for distributing funding primarily by reference to research excellence, and to fund excellent research in all its forms wherever it is found
  • to reduce significantly the administrative burden on institutions in comparison to the RAE
  • to avoid creating any undesirable behavioural incentives
  • to promote equality and diversity
  • to provide a stable framework for our continuing support of a world-leading research base within HE.

The letter also set out a timetable for the development of the REF. HEFCE undertook a consultation exercise during September–December 2009, soliciting responses from stakeholders on the proposals.[7] Responses included, for example, those of Universities UK[8] and the University and College Union.[9]

In July 2010 (following the May 2010 general election), the Universities and Science minister David Willetts announced that the REF would be delayed by a year in order to assess the efficacy of the impact measure.[10]

In July 2016, Lord Nicholas Stern's review was published, setting out general guidelines for the next REF in 2021.[11] The review was broadly supportive of the methodology used in 2014 to evaluate universities' research, but it emphasised the need for more engagement with the general public and for an increase in the number of case studies taking an interdisciplinary approach.[11] The Research-impact.org team at Loughborough University's School of Business and Economics has been experimenting with crowdfunding for research in order to increase the university's researchers' public engagement.[12]

Research Impact

For the purposes of the REF, impact was defined as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia".[13]

Grading criteria

Submissions are assessed according to the following criteria:[14]

  • Four star: Quality that is world-leading in originality, significance and rigour.
  • Three star: Quality that is internationally excellent in originality, significance and rigour but which falls short of the highest standards of excellence.
  • Two star: Quality that is recognised internationally in originality, significance and rigour.
  • One star: Quality that is recognised nationally in originality, significance and rigour.
  • Unclassified: Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment.

Performance rankings

Two publishers, The Guardian[15] and Times Higher Education,[16] produce overall rankings of multidisciplinary universities based on power and quality (GPA).

Power rankings aim to show universities with a breadth of quality, while Quality rankings aim to show the depth of quality.

The Guardian's Power rankings count only research graded Four and Three star, while Times Higher Education's Power rankings count research across all gradings.

An additional Quality ranking orders institutions by the proportion of their research graded "Four star", that is, submitted research graded as "Quality that is world-leading in originality, significance and rigour".[17]
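As a rough illustration, a quality (GPA) score of the kind used in these rankings can be computed as a weighted average of an institution's quality profile. The exact weighting each publisher applies is not given in this article, so the function below is only a sketch, assuming the common convention of weighting each star level by its numeric value:

```python
# Hedged sketch: computing a REF-style quality score (GPA) from a quality
# profile. Assumes each star level is weighted by its numeric value
# (4* -> 4, ..., unclassified -> 0); publishers' exact methods may differ.

def ref_gpa(profile):
    """profile: percentages of research rated 4*, 3*, 2*, 1*, unclassified."""
    p4, p3, p2, p1, p0 = profile
    return (4 * p4 + 3 * p3 + 2 * p2 + 1 * p1 + 0 * p0) / 100

# Hypothetical profile: 30% 4*, 45% 3*, 20% 2*, 5% 1*, 0% unclassified
print(ref_gpa((30, 45, 20, 5, 0)))  # 3.0
```

Under this convention a GPA of 4.0 would mean all submitted research was graded "world-leading", and a higher share of Four-star work raises the score proportionally.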

Rank | THE Research Power (Top 10) | The Guardian Research Power (Top 10) | THE Quality/GPA (Top 10) | Highest % 'world-leading' research (Top 10)
1 | University College London | University of Oxford | Imperial College London | London School of Economics
2 | University of Oxford | University College London | London School of Economics | University of Oxford
3 | University of Cambridge | University of Cambridge | University of Oxford | University of Cambridge
4 | University of Edinburgh | University of Edinburgh | University of Cambridge | Imperial College London
5 | University of Manchester | University of Manchester | Cardiff University | University College London
6 | King's College London | Imperial College London | King's College London | Cardiff University
7 | University of Nottingham | King's College London | University College London | King's College London
8 | Imperial College London | University of Nottingham | University of Warwick | University of Edinburgh
9 | University of Bristol | University of Bristol | University of Edinburgh | University of Warwick
10 | University of Leeds | University of Leeds | University of Bristol | University of Bristol

Since the percentage of eligible staff submitted to the REF differs significantly between universities, Times Higher Education also provides a research intensity ranking, which takes into account the proportion of eligible staff submitted.[18] In this research intensity REF ranking, the top thirty universities, excluding three specialist institutions, are as follows.
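The intensity adjustment described above can be sketched as follows. Times Higher Education's exact formula is not reproduced in this article, so this assumes the simple approach of multiplying the quality GPA by the fraction of eligible staff actually submitted:

```python
# Hedged sketch: weighting a quality GPA by research intensity, i.e. the
# proportion of eligible staff actually submitted to the REF. THE's precise
# method may differ from this simple multiplication.

def intensity_weighted_gpa(gpa, staff_submitted, staff_eligible):
    intensity = staff_submitted / staff_eligible
    return gpa * intensity

# Hypothetical institution: GPA of 3.2 with 95 of 100 eligible staff submitted
print(round(intensity_weighted_gpa(3.2, 95, 100), 2))  # 3.04
```

The effect is that an institution submitting only a small, selected fraction of its staff cannot reach the top of the intensity ranking on the strength of its GPA alone.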

Ranking THE Research Intensity Top 30
1 University of Cambridge
2 Imperial College London
3 University College London
4 University of Bristol
5 University of Oxford
6 London School of Economics and Political Science
7 Queen's University Belfast
7 University of Southampton
9 University of Warwick
10 University of Edinburgh
11 Loughborough University
12 University of Glasgow
13 University of St Andrews
14 King’s College London
15 University of Strathclyde
16 University of Exeter
16 University of Kent
16 University of Reading
19 University of Essex
20 University of Birmingham
21 Durham University
21 Goldsmiths, University of London
23 Newcastle University
23 University of Manchester
25 University of Nottingham
26 Lancaster University
27 Birkbeck, University of London
28 Royal Holloway, University of London
29 University of York
30 University of Sheffield

Controversies and criticism

A particular source of criticism has been the element of the REF that addresses the "impact" of research. The articles below raise two objections. The main one is that "impact" has been defined to mean impact outside the academy. If researchers were required to pursue this form of impact, it would undermine academic freedom. The other is that impact—as currently construed—is hard to measure in any way that would be regarded as fair and impartial.[19][20][21]

The Higher Education Funding Council for England argued that its measure of "impact" is a broad one, encompassing impact upon the "economy, society, public policy, culture and the quality of life".[19] However, the structure of the assessment narrows what impact can practically be claimed: case studies are limited to four pages, with no method section, a maximum of ten impact references and ten research references, and only one page each to summarise the research and the impact. These strict discursive constraints, together with the REF's dated model of how research impact works (impact on teaching is excluded, and a linear model of impact is assumed), restrict which kinds of impact are practically suited to the assessment.

Another area of criticism, which the REF inherited from the structure of the RAE, is that for most full-time staff members a submission normally consists of four published 'research output items', with no recognition of the difference in research value between a book and an article. The REF system therefore discourages long-term projects that strive for excellence. This problem is particularly evident in the humanities, where much ground-breaking research is traditionally not published in articles. Many researchers are thus pushed towards relatively mediocre activity that allows them to produce one or two books during the assessment period, but not the kind of monograph that would normally need four or five years of research and writing.

Moreover, the system of four published items discourages long-term projects with relatively high research risk in the sciences as well, since researchers are reluctant to engage in projects or experiments that may not be successful and may not lead to a publication. Since most ground-breaking research in the sciences arises from precisely such risky and imaginative projects, the type of research activity encouraged by the REF structure is quite conservative. Moreover, in the history of both the sciences and the humanities it is not unusual for the full impact of a discovery to become apparent only after considerable time, whereas the present system has a horizon of only four or five years.

Times Higher Education also revealed that some universities appeared to be "gaming" the REF system. This included "REF poaching", in which staff with established research records were headhunted from their universities immediately before the REF, giving the poaching institution full credit for their publications without having taken the risk of supporting the researcher. It also included employing large numbers of staff on 0.2 FTE contracts, the lowest level of employment that qualifies them for REF submission.[22]

In addition to such concerns about what can really be measured by four research output items, and how impact may be measured, the whole system is often criticised as unnecessarily complex and expensive, whereas quality evaluation in the digital age could be much simpler and more effective.[23]

The system, with its associated financial implications, has also been criticised for diverting resources from teaching. As a result, increases in student fees may often not have resulted in more staff time being spent on teaching.

In July 2016, Lord Nicholas Stern's review was published, setting out general guidelines for the next REF in 2021.[24] One of its recommendations was to increase public engagement with research, meaning both enhancing the delivery of benefits from research and making the public more aware of research findings and their implications. One mechanism for public engagement is crowdfunding for research, where dedicated platforms host crowdfunding campaigns for university research across a range of topics. Crowdfunding for research has two advantages: it is a funding source with a relatively high success rate, of around 50 per cent, and it is a very effective tool for engaging with the general public.[12]

One problem the Stern review did not address in relation to the research impact assessment is that the case study template on which impact is assessed contains no method section, which turns the assessment of claimed impact into a rhetorical game of who can claim the most (cf. Brauer, 2018).[25] Grand claims are thereby incentivised by the assessment structure. The problem arises because qualitative judgments of the significance and reach of impact, made without an account of the underlying method, cement contemporary values into the assessment: "[…] call it socially constructed, mutual learning, social practice whatever, the key is that we can’t separate characteristics of Impact from the process imposed on value and recognise it as such." (Derrick, 2018:160)[26] When the references supporting impact claims were checked, they were either inaccessible (e.g. the relevant websites had been taken down), cited in a way that obscured self-authorship, or consisted of testimonials from individuals connected to the researcher (Brauer, 2018:142–147). Similarly, Sayer (2014)[27] criticises the overall peer review of the REF process, describing it as a poor simulacrum of standard academic quality assurance, further complicated by the sheer workload of the assessment (p. 35). On a similar note, a RAND study found that the majority of references were never consulted, that certain assessment panels were discouraged from using the internet, and that the REF's reference-retrieval process sometimes took two weeks to produce the associated references.[28] The external impact focus thereby disciplines the assessment into focusing on external values.[29]

In 2018, commentators argued that the REF has negative effects on research in the humanities.[30]

References

  1. "Results & submissions : REF 2014". Retrieved 22 December 2014.
  2. Atkinson, Peter M. (11 December 2014). "Assess the real cost of research assessment". World View. Nature (paper). 516 (7530): 145. doi:10.1038/516145a.
  3. "What is the REF?". REF2021. Retrieved 24 July 2018.
  4. Higher Education Funding Council for England. "2017: Funding bodies confirm shape of REF 2021 – REF 2021". www.ref.ac.uk. Retrieved 2018-06-29.
  5. "Further update on coronavirus (COVID-19) and REF timetable". REF 2021. https://www.ref.ac.uk/publications/further-update-on-coronavirus-covid-19-and-ref-timetable/
  6. Eastwood, David (6 March 2007). "Future framework for research assessment and funding". HEFCE. circular letter number 06/2007. Archived from the original on 2 February 2010.
  7. "Research Excellence Framework: Second consultation on the assessment and funding of research". HEFCE. September 2009. 2009/38. Retrieved 10 January 2015.
  8. "Universities UK response to HEFCE consultation on the Research Excellence Framework (REF)". Universities UK. 13 December 2009. Archived from the original (.doc) on 16 July 2011.
  9. "Response to the Research Excellence Framework: Second consultation on the assessment and funding of research" (PDF). University and College Union. December 2009.
  10. Baker, Simon (8 July 2010). "REF postponed while Willetts waits for impact 'consensus'". Times Higher Education.
  11. Stern, Lord Nicholas; et al. (July 2016). "Building on Success and Learning from Experience" (PDF). gov.uk. UK Government. Retrieved 3 January 2017.
  12. Rubin, Tzameret (2017). "Is it possible to get the crowd to fund research, isn't it the government's role?". AESIS. Retrieved 2016-12-23.
  13. McLellan, Timothy (2020-08-25). "Impact, theory of change, and the horizons of scientific practice". Social Studies of Science: 030631272095083. doi:10.1177/0306312720950830. ISSN 0306-3127.
  14. "Assessment framework and guidance on submission" (PDF). Research Excellence Framework. July 2011. p. 43. REF 02.2011.
  15. "University Research Excellence Framework 2014 – the full rankings". The Guardian. ISSN 0261-3077. Retrieved 2019-05-03.
  16. "REF 2014: results by subject". Times Higher Education (THE). 2014-12-18. Retrieved 2019-05-03.
  17. Coughlan, Sean (2014-12-18). "London overtaking Oxbridge domination". BBC. Retrieved 2019-05-03.
  18. "REF 2014: winners and losers in 'intensity' ranking". Times Higher Education (THE). 2014-12-19. Retrieved 2019-05-03.
  19. Shepherd, Jessica (13 October 2009). "Humanities research threatened by demands for 'economic impact'". Education. The Guardian. London.
  20. Oswald, Andrew (26 November 2009). "REF should stay out of the game". The Independent. London.
  21. Fernández-Armesto, Felipe (3 December 2009). "Poisonous Impact". Times Higher Education.
  22. Jump, Paul (26 September 2013). "Twenty per cent contracts rise in run-up to REF". Times Higher Education.
  23. Dunleavy, Patrick (10 June 2011). "The Research Excellence Framework is lumbering and expensive. For a fraction of the cost, a digital census of academic research would create unrivalled and genuine information about UK universities' research performance". London School of Economics.
  24. Stern, L. (2016). Building on Success and Learning from Experience: An Independent Review of the Research Excellence Framework.
  25. Brauer, R. (2018). What research impact? Tourism and the changing UK research ecosystem. Guildford: University of Surrey (PhD thesis). Available at: http://epubs.surrey.ac.uk/id/eprint/846043
  26. Derrick, G. (2018). The evaluators’ eye: Impact assessment and academic peer review. Berlin: Springer.
  27. Sayer, D. (2014). Rank hypocrisies: The insult of the REF. Sage.
  28. "Evaluating the Submission Process for the Impact Element of REF". www.rand.org. Retrieved 2019-05-03.
  29. "Measuring the Societal Impact and Value of Research". www.rand.org. Retrieved 2019-05-03.
  30. Study International Staff (December 7, 2018). "Beware the 'Research Excellence Framework' ranking in the humanities". SI News. Retrieved September 19, 2019.