Outcomes from the REF consultation exercise have now been announced. The results show overwhelming support for expert review, with the quality of research outputs as the main element of the REF. There was support in principle for assessing the social and economic impact of research, although there were reservations about how this could be done in practice. A pilot exercise is currently looking at how to assess impact and will report in autumn 2010.
A summary of the consultation responses and initial decisions can be found at http://www.hefce.ac.uk/ref
In a wide-ranging report, The Scientific Century: Securing Our Future Prosperity, the Royal Society makes its case for government investment in scientific research and for improving education in the sciences.
Warning of the dangers of any crude assessment of research impact, the report nonetheless regards recent clarifications from HEFCE regarding this element of the Research Excellence Framework (REF) as reassuring. It goes on to warn against the effects of any cuts in funding to academic scientific research, with a reminder of how cuts in the mid-1980s left researchers struggling “to remain at the cutting edge of their disciplines, using old equipment that they could not afford to replace”. Investing in scientific research now, they argue, is vital: “Science and innovation are investments that are essential to short-term recovery and, more importantly, to long-term prosperity and growth.”
The report’s recommendations align strategically with investment in: interdisciplinary research (including a call for a reform of research funding and assessment in relation to this); overseas collaboration; improved skills training (including ‘transferable skills’) for PhD students in the sciences; and a call to create “strong global challenge research programmes, led by RCUK, to align scientific, commercial and public interests”.
The report can be read in full on the Royal Society website at: http://royalsociety.org/the-scientific-century/
In a recent Q&A on eGov monitor, the Shadow Minister for Science and Innovation, Adam Afriyie, said that he understood why researchers felt frustrated. He said that he wants to “work towards a clearer definition of the Haldane Principle” in order to preserve the independence of researchers and “blue skies research”.
As to the REF, he said that his party, if elected, “would postpone the Research Excellence Framework to ensure we are defining, measuring and applying impact correctly”.
You can read the full Q&A at http://www.egovmonitor.com/node/33680
News that HEFCE have just published Capturing Research Impacts: A review of international practice is of particular interest for Swansea University since we will be participating in the HEFCE Pilot Exercise on Research Impact.
This is the report HEFCE commissioned from RAND Europe to bring together knowledge gained from international experience in assessing research impact. The review will help inform the development of this new aspect of the REF. Notably, the review “suggests that the work of the Australian RQF Working Group on Impact Assessment might provide a basis for developing an approach to impact in the REF”. (See previous entries on this blog regarding the RQF.)
RAND Europe has established experience in this area. See, for example, Exploring the impact of arthritis research.
Full report and executive summary are available at: http://www.hefce.ac.uk/pubs/rdreports/2009/rd23_09/
There has been much debate about this element of the REF proposals. Nobel laureates have raised objections. The Russell Group have stated that they are “broadly supportive of introducing a measure of the economic and social impact of research, provided that this is underpinned by a robust methodology which commands the confidence of the academic community”. A petition against the use of research impact in the allocation of funding has been created on the No. 10 website. An earlier petition, set up in June 2009 before the REF proposals were published, opposed the imposition of impact statements, which funding councils now often require as part of a grant application. Knowledge transfer staff in universities have complained that they are being swamped with requests to write these “impact statements”.
Sir Alan Langlands, chief executive of the Higher Education Funding Council for England, and David Sweeney, Director of Research & Innovation at HEFCE, have both indicated that research funding is limited and that there is a need to evaluate the impact of research. The policy context they cite is the Science & Innovation Investment Framework, 2004-14. There is a need, says HEFCE, to “strengthen links between investment in research and the economic and social benefits it delivers”. The same message is to be found in the Research Councils Economic Impact Group’s 2006 paper, Increasing the Economic Impact of the Research Councils.
HEFCE has set out proposed methods for evaluating impact and is running a pilot exercise. The methodology to be employed in evaluating research impact seems similar to that proposed for the now defunct Australian RQF exercise. Australia’s Group of Eight research-intensive universities opposed impact evaluation, which they felt would be overly bureaucratic and time-consuming. On the other hand, universities represented by the Australian Technology Network were supportive of the proposals. The ATN produced an interesting brief report based upon their experience of the Australian research impact pilot exercise, discussing methodology and case study examples.
Here in the UK, UNICO, the Knowledge Transfer professionals’ organisation, have proposed the development of a metrics system to measure the impact of UK publicly funded research. These metrics, they suggest, could be used in attracting industry funding for research and to encourage international research collaborations with universities and with industry.
The Research Information Network (RIN) recently published a report on how measures to evaluate the worth of research affect the publishing and citing behaviour of academic researchers. The report, “Communicating knowledge: how and why UK researchers publish and disseminate their findings”, finds that many are confused as to what the various funding bodies expect of them in terms of communicating their research. Most would appreciate better guidance as to how this will affect any assessment of their work. The study addressed questions including:
- What factors influence decisions on the timing of publication and dissemination of research? Is it better to publish in the high status journal or to communicate more directly with the people most interested in the topic of the research?
- How do patterns vary across the disciplines?
- What place have the perceived requirements for research assessment occupied in the full range of factors that have influenced publication and citation behaviour?
A variety of methods, including focus groups and online questionnaires, were used to consider these and other questions. The report found that many researchers are apparently considering citing their colleagues’ work more often because of the introduction of bibliometrics. Some researchers also said they felt they were being pressurised into publishing too much, too soon.
On 15th October, HEPI (the Higher Education Policy Institute) published its response to the HEFCE proposals for the Research Excellence Framework. On the whole, HEPI is supportive of the proposals but is critical of plans to assess the impact of research. The introduction of impact assessment – i.e. non-academic impact – is something new and experimental. HEFCE proposes that 25% of the assessment score should be given to “impact”. The HEPI response views this as rather high given the experimental nature of the impact assessment element.
The report is available to download.
The debate about assessing the impact of research in the Humanities continues to rumble on in the Guardian today.
David Sweeney, Director of Research & Innovation at HEFCE, was quoted in The Australian newspaper yesterday as saying that research funds in the UK are being stretched too much to cover the volume of research in academia. “Our strategy is to improve the quality of research, not the quantity,” he is reported to have said.
For those interested in the idea of evaluating the impact of research and the issues this raises, a 2008 article by Claire Donovan (ANU and Chair of the Technical Working Group on Research Impact for the RQF), published in New Directions for Evaluation, is well worth a read. In this article, “The Australian Research Quality Framework: A live experiment in capturing the social, economic, environmental, and cultural returns of publicly funded research”, she discusses the development of the RQF’s impact rating scale. The Australians at that time planned to go beyond quantitative mechanisms measuring “investment from industry, commercialization, and technology transfer” to a broader definition of impact which included social, economic, environmental and cultural benefits. Donovan discusses the tension between government policy, keen to make academic research more business- and industry-focused, and the concerns of academics to protect pure research. The period over which impact is measured was one issue of concern, since the benefit of pure research is not always quickly realised.
In the end, a change of government in Australia led to the RQF and its research impact element being dropped. However, David Sweeney of HEFCE was quoted in yesterday’s Australian: “There are some bits we’ve pinched,” he says. “You were doing impact explicitly. You chose not to (continue). I understand why because you have other ways of doing that. But in our environment we thought it was worth trying.”