Making an impact: when science and politics collide

The 'impact agenda' is spreading from research councils to the higher education funding council, with increasing demands for strategic goals and public engagement
Under new plans, university departments will have to submit impact statements for researchers. Photograph: John Lund/Getty

By the time Professor Alec Jeffreys worked out how to create a unique DNA fingerprint in 1984, he had been prising apart DNA for years to see how it varies from person to person.

Scientific breakthroughs like Jeffreys's exemplify what is known in modern science as "impact". This contentious concept tiptoes along the intersection between science and society because it implies that scientific endeavours ought to yield benefits for society.

This is where the politics comes in. How much scientific research should be driven by curiosity and how much should be planned with an end use, such as DNA fingerprinting, in mind?

In the UK, this question is answered at the research councils: government agencies that manage the allocation of public science funding. The job of the civil servants at the research councils is not to decide which research to support. That is left to scientists on funding committees. But the councils do set strategic goals that interpret ministers' desires to make research contribute to economic growth.

Those strategic goals leave the research councils open to criticism. Theoretical physicist Professor Michael Duff of Imperial College London complained in March that "non-scientists trying to pick winners and constraining researchers with the straitjacket of 'impact' and 'national interests' is neither good science nor good economics."

Duff's charge was amplified by the 100 or so scientists who delivered a coffin to 10 Downing Street on 15 May as a protest against several policies – not just regarding impact – of the Engineering and Physical Sciences Research Council. Chief among their concerns was "the introduction of non-scientific and subjective criteria such as 'importance' and 'impact' to determine funding". The EPSRC replied that "scientific excellence – funding the very best ideas – is at the heart of all that we do."

So controversy over impact has become a war of rhetoric, with this month's battle fought in the blogosphere and on Twitter (which is Storified here).

University of Cambridge physicist Professor Athene Donald has tried on her blog to focus the debate on scientists' political tactics. Instead, her post inadvertently led to a bulging comment thread in which scientists quarrelled over the merits and linguistic nuances of impact.

University of Nottingham physicist Dr Philip Moriarty pulled apart the words of advice issued by research councils that grant applicants should draft an impact summary early "so that it informs the design of your research". Scientists like Moriarty and Duff are annoyed by the idea that their research should be directed by end goals. "Exploratory scientific research is no longer 'good enough'," Moriarty says.

Is research council policy too prescriptive? Donald thinks not, arguing that Moriarty's interpretation is "almost certainly a misreading". And the EPSRC has rejected Moriarty's and others' criticisms, even defending the policy on behalf of the research councils more broadly. "This is not an issue that we actually mind fronting on behalf of the research councils," says Atti Emecz, the EPSRC's director of communications, information and strategy, adding that he remembers when other research councils were the focus of the impact debate. "Perhaps it's like tag-team wrestling," he says.

So the game will go on, as the impact agenda expands. The Biotechnology and Biological Sciences Research Council announced on Wednesday that it now expects its institutes to detail impact.

But is it a fair game if the guidance on how researchers should set out their "pathways to impact" is unclear? Donald says that members of a funding committee she once chaired were uneasy when the policy came in, but found no problem in practice. "The person who writes a good grant proposal tends to be the person who writes a decent 'pathways to impact' statement," she says. Plenty of researchers do clear this hurdle, and the research councils showed how in case studies published just a few days ago.

And in any case, impact is but one criterion used by the committees. Other factors include the applicant's track record and the feasibility of their proposed project. "Sometimes you reach a point where you have to choose between two grant applications," says Professor Doreen Cantrell, who chairs a Medical Research Council committee. "Then you will tend to use the impact [statement], but it's not the main driver." Her experience is shared by those on other committees.

Among all the other boxes to tick, then, where did the idea of impact come from? Most academics trace the birth of the impact agenda to a 1993 white paper by William Waldegrave, entitled Realising Our Potential. Until then, UK scientists had operated under only loose expectations about their research: they were left to get on with their work on the assumption that basic research would eventually benefit health, national security and the economy.

Since Waldegrave's white paper, "the social contract has become rather more specific," says Professor Ben Martin, who researches science and technology policy at the University of Sussex.

"The state has had increasingly specific expectations that work should yield more direct benefits to the economy and society," Martin says. "In the early years after 1993, the term 'impact' wasn't much used. It was 'contributions or benefits to the economy'." Since 1993, the concept has expanded to include environmental factors and explaining science to the public.

Indeed, public engagement has become a significant form of impact recommended by research councils. But when scientists who are funded through an EPSRC fellowship met in London on Monday, only a minority admitted to being involved in public engagement (one source estimated it was a fifth of those attending the meeting).

So it may be recommended, but it's not easily taken up: after all, measuring the success of communicating science to the public is tricky. "How do you quantify Brian Cox's impact?" says Moriarty.

Nevertheless, bureaucrats are already creating new impact metrics, especially for next year, when the policy is due to be used in the assignment of core funds to higher education institutions. Under new plans, university departments will have to submit an impact statement for one in every 10 researchers to the Higher Education Funding Council for England (HEFCE). HEFCE will then weigh this criterion against others as it judges how much money to allocate to universities.

So although the research councils' impact agenda has come under attack from scientists, the critics may have to shift their sights onto HEFCE. It could be that "impact" in research funding is not half as problematic as it will be in higher education more generally. "My worry is that applying impact expectations to that will be much more damaging," says Martin.

But this concern could be sidelined as the science community begins to prepare for another government spending review. This, along with the future of science in parliament, will be covered in next week's article, which will be the last in this series.


Adam Smith investigates the role of science in UK politics. How do politicians receive information about scientific research? Should there be more scientists in parliament? Should scientists become more active in politics?
