In a wide-ranging report, The Scientific Century: Securing Our Future Prosperity, the Royal Society makes its case for government investment in scientific research and for improved science education.
Warning of the dangers of any crude assessment of research impact, the report nonetheless regards recent clarifications from HEFCE on this element of the Research Excellence Framework (REF) as reassuring. It goes on to warn against the effects of any cuts in funding to academic scientific research, with a reminder of how cuts in the mid-1980s left researchers struggling “to remain at the cutting edge of their disciplines, using old equipment that they could not afford to replace”. Investing in scientific research now, the Society argues, is vital: “Science and innovation are investments that are essential to short-term recovery and, more importantly, to long-term prosperity and growth.”
The report’s recommendations call for strategic investment in: interdisciplinary research (including a call to reform research funding and assessment in this area); overseas collaboration; improved skills training (including ‘transferable skills’) for PhD students in the sciences; and the creation of “strong global challenge research programmes, led by RCUK, to align scientific, commercial and public interests”.
The report can be read in full on the Royal Society website at http://royalsociety.org/the-scientific-century/
In a recent Q&A on eGov monitor, the Shadow Minister for Science and Innovation, Adam Afriyie, said that he understood why researchers felt frustrated. He said that he wants to “work towards a clearer definition of the Haldane Principle” in order to preserve the independence of researchers and “blue skies research”.
As for the REF, he said that his party, if elected, “would postpone the Research Excellence Framework to ensure we are defining, measuring and applying impact correctly”.
You can read the full Q&A at http://www.egovmonitor.com/node/33680
Research carried out at the Universities of Exeter, Loughborough and Leeds studied how academics rated management journals. It found that academics rated a journal more highly if they had published in it, sat on its editorial board, if it strongly reflected their subject interests, or if it came from their geographical area. The study concludes that journal ranking lists are subject to bias. Such lists have been ruled out for use in the Research Excellence Framework, but many universities still use them. The full story is in the Times Higher Education.
An interesting piece from Mark Henderson on the Times Online website. Lord Drayson, Science Minister, was defending the case for research impact to young researchers opposed to its introduction in the REF: “Lord Drayson argued that if science and academia are to continue to claim significant public funding, they have to do a better job of explaining its benefits — not least to make the case to an often sceptical Treasury on the look-out for easy cuts.” Meanwhile, at the Royal Society, Henderson tells us that Lord Rees was making an eloquent case against the impact agenda, saying: “There’s a risk that current efforts to prioritise and ‘audit’ academic research will backfire, by eroding the strength of our universities and thereby weakening the UK’s competitiveness as a high-tech nation.” Read the article and watch the debate with Lord Drayson online at the THES.
Now that RAND Europe have produced their report for HEFCE on international practices in assessing the impact of research, it may be a good time to point out some of the work RAND has been doing in this area and to look at examples of their research impact assessments for a range of international clients. Those who are interested can follow this link to the RAND website.
On the 20th October, the RCUK launched its Framework document. It aims to show how publicly funded research enables the UK to have a “productive economy, healthy society and contribute to a sustainable world”. The basis of the framework rests upon these “three mutually supportive areas”. The document can be downloaded from the website and there are also links to case studies and an interview with Professor Alan J Thorpe, Chair of RCUK.
There has been much debate about this element of the REF proposals. Nobel laureates have raised objections. The Russell Group has stated that it is “broadly supportive of introducing a measure of the economic and social impact of research, provided that this is underpinned by a robust methodology which commands the confidence of the academic community”. A petition against the use of research impact in the allocation of funding has been created on the No. 10 website. An earlier petition, set up in June 2009 before the REF proposals were published, opposed the imposition of impact statements, which funding councils now often require as part of a grant application; knowledge transfer staff in universities have complained that they are being swamped with requests to write these “impact statements”.
Sir Alan Langlands, chief executive of the Higher Education Funding Council for England, and David Sweeney, Director of Research & Innovation at HEFCE, have both indicated that research funding is limited and that there is a need to evaluate the impact of research. The policy context they cite is the Science & Innovation Investment Framework, 2004-14. There is a need, says HEFCE, to “strengthen links between investment in research and the economic and social benefits it delivers”. The same message is found in the Research Councils Economic Impact Group’s 2006 paper, Increasing the Economic Impact of the Research Councils.
HEFCE has set out proposed methods for evaluating impact and is running a pilot exercise. The methodology to be employed in evaluating research impact seems similar to that proposed for the now defunct Australian RQF exercise. Australia’s top eight research-intensive universities opposed impact evaluation, which they felt would be overly bureaucratic and time-consuming. On the other hand, universities represented by the Australian Technology Network were supportive of the proposals. The ATN produced an interesting brief report based upon their experience of the Australian research impact pilot exercise, discussing methodology and case study examples.
Here in the UK, UNICO, the Knowledge Transfer professionals’ organisation, have proposed the development of a metrics system to measure the impact of UK publicly funded research. These metrics, they suggest, could be used in attracting industry funding for research and to encourage international research collaborations with universities and with industry.
On the 15th October, HEPI (the Higher Education Policy Institute) published its response to the HEFCE proposals for the Research Excellence Framework. On the whole, HEPI is supportive of the proposals but is critical of plans to assess the impact of research. The introduction of impact assessment, i.e. non-academic impact, is new and experimental. HEFCE proposes that 25% of the assessment score be given to “impact”. The HEPI response views this as rather high given the experimental nature of the “impact assessment” element.
The report is available to download.
The debate about assessing the impact of research in the Humanities continues to rumble on in the Guardian today.
David Sweeney, Director of Research & Innovation at HEFCE, was quoted in The Australian newspaper yesterday as saying that research funds in the UK are being stretched too much to cover the volume of research in academia. “Our strategy is to improve the quality of research, not the quantity,” he is reported to have said.