What the Heitor Report gets right (and wrong) about EU science policy
If science progresses one funeral at a time, EU science policy “progresses” one Framework Programme at a time.
Former Portuguese Science Minister Manuel Heitor has recently laid to rest the 9th Framework Programme and offered the Commission recommendations for its 10th iteration.
Heitor has correctly diagnosed some of the Programme’s weaknesses:
Administrative burden: For grant applications, for grant selection, for project implementation and for administrative costs. (1)
Speed: The speed of technological change and adoption; the speed of change in the international order; the speed of decision-making processes; and the speed with which we need to act, faced with the “embedded inertia in science systems”.
Complexity of science: “Frontier research, breakthrough innovation, technology development … are intimately intertwined with many feedback loops and an ever-increasing speed of both discovery and translation … they behave as a complex adaptive system. Other countries have recognised this, and Europe’s global position is slipping due to their increased investment and performance. This is a wake up, turn around moment for Europe”.
The importance (and elusiveness) of paradigm-shifting science: Most research focuses on normal science (‘incremental’ progress in established fields), rather than “harder, rarer, and potentially more disruptive, breakthrough and transformative scientific advances and innovation to catalyse paradigm shifts”. (2)
Experimental management: Europe lacks bodies experimenting with new models of science policy management. Drawing on successes elsewhere, Heitor concludes that such a body in Europe would need “risk tolerance… and an activist, change agent, high risk high reward culture where many failures can be justified by a single success.”
But he wrongly prescribes more of the same - target-setting, road-mapping, and penny-pinching:
A focus on measurement and KPIs: Measurement leads to more bureaucracy. Bureaucracy slows down feedback loops and compounds speed- and complexity-management problems. Measurement also creates the wrong incentives for researchers. Grant applicants are encouraged to focus proposals on predictable projects whose outputs can be easily measured, rather than on riskier ventures whose outcomes could be transformative. Over-rating measurement drives researchers away from paradigm shifts and pushes them back towards more-of-the-same, normal science.
More Councils: To “ensure guidance and advice on strategy … exchanging and coordinating … commissioning evaluations and monitoring administrative efficiency and effectiveness”. You get the picture...
An overemphasis on ‘strategic’ research: Heitor acknowledges science is a complex adaptive system (so too is the EU policymaking process more generally). Change in complex adaptive systems is often non-linear and unpredictable. But Heitor seems to perceive progress as linear. He laments that R&D programmes are not aligned with broader European strategic policy priorities. He implies that more research in priority areas would lead to more progress against strategic objectives in those domains.
Investing more in science in ‘strategic’ areas might be beneficial in this regard, but there are no guarantees in a complex system. Science is not some pre-defined tech tree (3) that one can ‘speed run’ by throwing more money and research hours at it. Progress has an element of chaos, randomness, and chance. The best way to manage this is to lean into it. (4)
Reinforcing negative feedback loops: There is also a risk that obsessing over a small number of ‘strategic’ research areas reinforces Europe’s emerging inferiority complex. We may end up constantly focusing on playing catch-up in a few domains, dictated by progress in the US and China. Our neighbours end up shaping our feedback loops and setting our agendas.
It might make sense generally for Europe to pay particular attention to those critical sectors where it is lagging. But if Europe is making paradigm-shifting breakthroughs, it is, by definition, a leader in the field. Being first to pioneer a new frontier means setting the agenda and reaping the benefits of winner-takes-most. In science, there’s a strong case for over-investing in new, under-explored, not-yet-defined domains. We mustn’t neglect these and focus solely on other, more established areas where other regions seem to be doing better.
Instead, Europe needs experimental treatment:
Science is a complex adaptive system. Complex systems are managed (in so far as they can be) through leverage points. If policymakers want to manage complexity, they must identify the right leverage points and work out how to exploit them.
Heitor is right to highlight the importance of power laws (another element of complex adaptive systems) in dictating paradigm shifts. (5) Ultimately, I think Europe’s place in the global science race will be decided by the outputs of a small number of its very best researchers. (6) Following a power law distribution, most transformative scientific progress will come from this tiny number of incredibly gifted scientists.
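To make the power-law intuition concrete, here is a minimal, purely illustrative Python sketch. The Pareto shape parameter and the number of researchers are arbitrary assumptions chosen for the illustration, not estimates from Heitor’s report or any real dataset; the point is simply that under even a modest heavy-tail assumption, a tiny share of researchers accounts for a very large share of total impact.

```python
import numpy as np

# Illustrative sketch only: model per-researcher "transformative impact"
# as a heavy-tailed Pareto (power law) draw. alpha and n_researchers are
# assumptions for the illustration, not empirical estimates.
rng = np.random.default_rng(seed=42)
n_researchers = 100_000
alpha = 1.2  # smaller alpha => heavier tail

impact = rng.pareto(alpha, size=n_researchers) + 1.0

top_1_percent = np.sort(impact)[-n_researchers // 100:]
share = top_1_percent.sum() / impact.sum()
print(f"Top 1% of researchers account for ~{share:.0%} of total impact")
```

On a typical run, the top 1% end up with somewhere around half of the total impact; the exact figure moves with the assumed parameters, but the qualitative lesson does not.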
Europe should develop a small, flexible, experimental programme in FP10 to identify, recruit, and exploit the potential of this cohort.
Scout and recruit these scientists to EU universities: They are likely to be creative, young, confident scientists - not afraid to ignore tradition and not as reputationally invested in contemporary paradigms. They are likely to be active in fields with gaps and theoretically unexplained ‘anomalies’. They will be nurturing theories which go against the established orthodoxy.
Scale down central management: The maxim that “managed science can at best produce only what its managers specify” is helpful here. ‘Top-down’, ‘mission-oriented’, ‘strategic’ are red herrings. Roadmaps, measurement and KPIs lead to short-termism. They favour researchers who can best game the grants system, not those peering into the scientific unknown. Europe needs to incentivise long-term thinking and curiosity for curiosity’s sake, not more periodic hoop-jumping. As in so many other fields, Europe must get better at accepting the uncertainty and higher risk-return profile that comes with this approach.
Get out of the prescriptive, linear mindset: Scientists, like startups, must be able to explore and pivot quickly when real-world feedback signals them to do so. Grant promising young scientists unrestricted funds (still a drop in the ocean of the overall science policy budget) based on their potential, not their proposals. Require of them the absolute minimum of reporting. This will allow the next generation of scientists to follow where the science – not their rubber-stamped roadmap – leads them. (7) Europe will begin pioneering new frontiers and become the global science agenda-setter.
Leverage Europe’s competitive advantages: Condition access to these unrestricted funds on conducting the research at EU universities. This will attract the best researchers from more bureaucratic or command-driven ‘competitors’ and send a signal across the globe that Europe will back the iconoclasts. The EU could offer researchers funding, partnerships and secondments with other institutions in the single market as their research evolves and they need expertise in areas outside their own. The EU could provide free fast-track patenting through its Unitary Patent System, in return for a tiny percentage of any revenue generated from their discoveries.
A smaller, private-sector programme similar to the one I am describing turned a $40m investment into $1.25b in 10 years. (8) Financially, the programme would more than pay for itself, before we even begin measuring its geopolitical, scientific, and societal RoI.
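For what it’s worth, the arithmetic behind that claim is easy to check (taking the $40m and $1.25b figures cited in footnote 8 at face value):

```python
# Back-of-the-envelope check on the figures cited in footnote (8).
initial, final, years = 40e6, 1.25e9, 10

multiple = final / initial                  # about 31x return
annualised = multiple ** (1 / years) - 1    # about 41% compound annual growth

print(f"{multiple:.1f}x over {years} years, ~{annualised:.0%} annualised")
```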
***
(1) Heitor found that about 10% of those he surveyed faced administrative costs of more than 20% of the entire project budget. While the Commission has kept its own admin costs flat at around 6%, proposal applicants spend more and more on consultants. This outsourcing of paperwork wastes time that could be spent on research.
(2) Thomas Kuhn – The Structure of Scientific Revolutions.
(3) I borrowed the tech tree analogy from Ian Hogarth, who wrote an interesting article in the FT asking “Can Europe build its first trillion-dollar start-up?” But I don’t see progress quite as deterministically as he does.
(4) The list of breakthroughs made after scientists borrowed from other, seemingly unrelated fields is long. It includes the discovery of the DNA double helix (built upon theoretical physicist Erwin Schrödinger’s work on information) and MRI (founded on physicist Isidor Isaac Rabi’s work on nuclear magnetic resonance, initially applied to chemistry, and later refined for the biological sciences by many, including Paul Lauterbur, who came up with the idea while brainstorming at the Pittsburgh Eat'n Park Big Boy Restaurant).
(5) See Heitor’s claim that successful experimental units have a ‘high risk high reward culture where many failures can be justified by a single success’. This is the venture capital model of scientific research: dead ends in nine areas are worth it if the 10th avenue a scientist explores leads to a paradigm shift.
(6) I say “I”, but my ideas are mostly influenced by – if not lifted from – this book, this book, this book and this book; as well as the burgeoning body of progress studies research focused on metascience.
(7) Braben points out that in 1874, Max Planck was advised by his supervisor against going into theoretical physics because it was a ‘highly developed, nearly mature science’. Good thing he ignored that, because in 1900, his research paved the way for quantum physics, which has fundamentally thrown into question how we understand the universe, and is still the subject of profound debate today.
(8) Donald W. Braben – Scientific Freedom: The Elixir of Civilization.