ProspectIP provides an overview of impact in the UK research sector, particularly in the context of the Research Excellence Framework, and outlines how E-lucid, a new tool developed at UCL Business (UCLB), the commercialisation company of University College London (UCL), can support the promotion, use and tracking of research for impact.
What is the Research Excellence Framework?
The Research Excellence Framework (REF) is a periodic exercise for assessing the quality of research in UK higher education institutions (HEIs). The REF matters because it compares research quality across comparable subjects in HEIs and determines the level of Quality Related (QR) research funding available to universities from the government via the UK higher education funding bodies.
Up to and including the 2008 Research Assessment Exercise (RAE), government assessments of research focused only on outputs and the associated environment. However, in 2011, HEFCE (now Research England) introduced impact as a third strand of assessment, with the first submissions made for REF 2014. For that round, impact was worth 20% of the overall score; for REF 2021, this will be increased to 25%.
What is ‘impact’?
For the purposes of the Research Excellence Framework (REF), impact is defined as an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia (HEFCE et al 2012a). Put simply, it is the provable benefits of research in the real world (Bayley and Phipps, 2019).
Impact is also of increasing importance to funders, in particular UKRI and Research England. Since 2007, the UK research councils (now part of UKRI) have required researchers to describe the potential impact of their research and what will be done to enhance the likelihood of the research being used to create wider benefits to society.
A report from the Policy Institute at King’s College London (2015) suggested it takes on average between 3 and 9 years for research to have an impact on society, and that the speed at which that impact happens varies by discipline.
The challenge for universities therefore lies in converting opportunities created from research activity into demonstrable economic and/or broader societal impact. Identifying the potential societal benefits of your research output enables it to be shaped into an Impact Case Study (ICS) for inclusion in the REF assessment.
REF Impact Case Studies
A REF Impact Case Study is a document describing how research carried out over a defined period at a named institution resulted in a change or benefit (impact) to culture, the economy, the environment, health, well-being, public policy, quality of life or society, corroborated by qualitative and quantitative evidence. The impacts must have occurred during the specified REF period (for REF 2021, 1 August 2013 to 31 July 2020).
An Impact Case Study contains:
- Summary of the impact
- Details of the underpinning research
- References to the research
- Details of the impact
- Evidence/sources to corroborate the impact
For REF 2021, the specific rules include:
- Underpinning research to have taken place between 1 January 2000 – 31 December 2020
- Impact to be demonstrated between 1 August 2013 – 31 July 2020
- Audit evidence of impact to be submitted by January 2021.
Understanding reach and significance
For REF, impact is assessed in terms of reach and significance. What do these terms mean?
- Reach is the extent and/or diversity of the beneficiaries of the impact, as relevant to its nature, and is assessed in terms of the extent to which the potential number or groups of beneficiaries have been reached.
- Significance is the degree to which the impact has enabled, enriched, influenced, informed or changed the performance, policies, practices, products, services, understanding, awareness or wellbeing of the beneficiaries.
How much is an Impact Case Study worth to a university?
Case studies are reviewed by a panel of experts and rated on the following scale:

| Rating | Description |
| --- | --- |
| 4* | Outstanding impacts in terms of their reach and significance |
| 3* | Very considerable impacts in terms of their reach and significance |
| 2* | Considerable impacts in terms of their reach and significance |
| 1* | Recognised but modest impacts in terms of their reach and significance |
| Unclassified | The impact is of little or no reach and significance; or the impact was not eligible; or the impact was not underpinned by excellent research produced by the submitted unit. |
Money is awarded to those scoring 3* or 4*. Whilst actual figures vary by subject area (‘Units of Assessment’), work carried out by Reed and Kerridge (2017) found that:
- A 4* impact case study was worth £46,311 to the university on average (range: £25,932–£83,226) in 2016/17, and
- A 3* impact case study was worth £10,704 on average (range: £4,504–£18,830) in 2016/17.
Therefore, over the period 2015/16 – 2021/22, a 4* rating could be worth in the region of £324,000 to a university, and a 3* rating around £75,000.
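The estimates above can be reproduced with a simple calculation. This is an illustrative sketch only: it assumes the 2016/17 per-year averages from Reed and Kerridge (2017) hold flat across the seven funding years 2015/16 to 2021/22, which is a simplification since QR allocations vary year to year and by Unit of Assessment.

```python
# Rough QR funding attributable to a single impact case study, assuming
# the 2016/17 averages (Reed and Kerridge, 2017) apply in every year.
ANNUAL_VALUE = {"4*": 46_311, "3*": 10_704}  # average GBP per year
FUNDING_YEARS = 7  # 2015/16 through 2021/22 inclusive


def case_study_value(rating: str, years: int = FUNDING_YEARS) -> int:
    """Estimated total QR funding for a case study at the given rating."""
    return ANNUAL_VALUE[rating] * years


print(case_study_value("4*"))  # 46,311 x 7 = 324,177, i.e. "in the region of £324,000"
print(case_study_value("3*"))  # 10,704 x 7 = 74,928, i.e. "around £75,000"
```

Multiplying the per-year averages by seven funding years recovers the approximate totals quoted above.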
Areas of impact and examples
There is no one-size-fits-all for impact, and the nature of impact varies by many factors, including discipline and type of research. The table below, referenced from the Panel criteria and working methods (2019/02) – REF 2021, illustrates a range of areas of impact and example types:
| Areas of impact | Examples of types of impact |
| --- | --- |
| Impacts on understanding, learning and participation | Enhanced cultural understanding of issues and phenomena; shaping or informing public attitudes and values. Contributing to widening public access to and participation in the political process. |
| Impacts on creativity, culture and society | Generating new ways of thinking that influence creative practice, its artistic quality or its audience reach. Research-led engagement with marginalised, under-engaged and/or diverse audiences leads to increased cultural participation. |
| Impact on social welfare | Improved social welfare, equality, social inclusion; improved access to justice and other opportunities (including employment and education). Engagement with research has enhanced policy and practice for securing poverty alleviation. |
| Impact on commerce and the economy | Contributing to innovation and entrepreneurial activity through the design and delivery of new products or services. Gains in productivity have been realised as a result of research-led changes in practice. |
| Impact on public policy, law and services | Policy decisions or changes to legislation, regulations or guidelines have been informed by research evidence. The quality, accessibility, acceptability or cost-effectiveness of a public service has been improved. |
| Impact on health, well-being and animal welfare | Outcomes for patients/users or related groups have improved. A new diagnostic or clinical technology has been adopted. |
| Impact on production | Production yields or quality have been enhanced, or levels of waste have been reduced. Decisions by regulatory authorities have been influenced by research. |
| Impact on the environment | New methods, models, monitoring or techniques have been developed that have led to changes or benefits. Improved design or implementation of environmental policy or regulation. |
| Impact on practitioners and professional services | Professional standards, guidelines or training have been influenced by research. Practitioners/professionals/lawyers have used research findings in conducting their work. |
Indicators/evidence of impact
For REF, impact can only be claimed where there is corroborating evidence of its effects. Evidence can take many forms (see examples below, referenced from the Panel criteria and working methods (2019/02) – REF 2021), as long as it proves, or provides testimony to, a change having occurred. This requires monitoring and tracking of any benefits achieved as a result of research being implemented, and the evidence must collectively demonstrate the significance and reach being claimed. When collating evidence for impact, demonstrating its significance and reach is key to a successful ICS: significance through the nature of any benefits, and reach through the scale of the benefits.
Examples of impact evidence (REF 2021):

- Measures of improved clinical outcomes, public behaviour or health services.
- Evaluative reviews in the media.
- Measures of improved social equality, welfare or inclusion.
- Sales of new products/services.
- Employment figures.
- Evidence of improved sustainability.
- Documented changes to working guidelines.
- Documented changes to professional standards, performance or behaviour.
- New or modified technical standards or protocols.
- Verifiable influence on particular projects or processes which bring environmental benefits.
Cross-disciplinary considerations
The nature of, and pathways towards, impact vary between disciplines. Although there are no hard lines, the outputs of Science, Technology, Engineering and Mathematics (STEM) based research or disciplines may often be more naturally supported by university Technology Transfer teams and connected to commercial and economic benefits. Associated impacts may include sales of a new product/service, business performance measures, licences awarded and brought to market, commercial adoption of a new technology, process or knowledge, jobs created, evidence from clinical trials, etc.
In contrast, for non-STEM disciplines such as Social Sciences, Humanities and the Arts, research may be co-produced with society and less amenable to more commercially focused models of support. This raises challenges not only for supporting staff, but also in identifying what impact to track and how.
E-lucid: a new platform to support the promotion, usage and tracking of research products
For STEM and non-STEM researchers alike, it is clear that increasing visibility and access to research ‘products’ (interventions, frameworks, workbooks etc) supports the pursuit of impact. However, it can be challenging for universities to achieve this across multiple and varied projects within existing staff capacity.
UCL Business has developed E-lucid, an automated transaction platform, to support Technology Transfer and Impact objectives. Research outputs such as software, healthcare tools, copyright materials, biological or chemical research materials, questionnaires, assessment tools, images and videos, teaching materials, datasets and publications (e.g. manuals) can be uploaded by the Technology Transfer Office, Enterprise Office or equivalent onto a branded E-lucid storefront, which is delivered as a managed service by the UCLB team. Reporting features provide impact evidence such as the number of licensees or users of a piece of software, questionnaire or process; the geographical distribution and organisational categories of licensees or users (e.g. SME, research institution, private individual); and the intended use of the licensed material.
The E-lucid platform is already used by a number of UK and US universities and supplements the traditional and existing resources academics use to promote their research outputs whilst offering the potential for collecting licence fees where appropriate.
In summary, from an impact perspective, E-lucid manages the promotion of research outputs and the collection of impact evidence to support the preparation of Impact Case Studies, which translate into QR funding for 3* and 4* submissions.
ProspectIP is currently working with University College London Business to implement the system at other UK institutions to support the capture and monitoring of impact evidence to support ICS.
For further information contact ProspectIP (email@example.com)
References
HEFCE, SFC, HEFCW and DELNI (2012a) Assessment framework and guidance on submissions (02.2011 updated version). www.ref.ac.uk/media/ref/content/pub/assessmentframeworkandguidanceonsubmissions/GOS%20including%20addendum.pdf
Bayley and Phipps (2019) Extending the concept of research impact literacy: levels of literacy, institutional role and ethical considerations.
Reed and Kerridge (2017) How much was an impact case study worth in the UK Research Excellence Framework?
REF 2021 (2019) Panel criteria and working methods (2019/02).
King’s College London and Digital Science (2015) The nature, scale and beneficiaries of research impact: An initial analysis of Research Excellence Framework (REF) 2014 impact case studies. Research Report.