My article in Physics World (August 2015) with John Tesh, formerly of the UK Cabinet Office, giving 12 tips for scientists to have effective dialogue with politicians.
Here is the original draft:
How Do Politicians Think? Practical Tips for Communicating Science Advice.
Len Fisher (School of Physics, University of Bristol) and John Tesh CBE (Former Deputy Director, UK Cabinet Office Civil Contingencies Secretariat; Policy Fellow, Centre for Science and Policy, Cambridge)
According to former UK Government Chief Scientific Adviser Sir John Beddington, “ensuring government is properly informed by science is something that all scientists should be involved in.”1 But how should we go about it?
Certainly there is a professed interest in science in many Government quarters. President Juncker of the European Commission, for example, has said that the EU needs to “make sure that Commission proposals and activities are based on sound scientific advice.”2 As a recent OECD study also pointed out3, “science is truly at the centre of many important policy issues and scientists are increasingly visible and, in many cases, increasingly vulnerable, in policy-making processes”. Some scientists have suggested4 that “the immediate priority is to improve policy-makers’ understanding of the imperfect nature of science.” Here we suggest that there is an even greater priority – to improve scientists’ understanding of the imperfect nature of politics.
Governments have to deal with three categories of decision-making: fulfilling a promised programme; solving problems and managing unplanned crises as they arise; and preparing for potential future problems, including securing their reputations for the long-term future. There is a role for scientists in all three, and the cause of those who wish to be involved is helped (in the U.K. at least) by the ongoing reduction in the size of the civil service. This means that there is an increasing necessity, and even a willingness, to diversify sources of advice, with the civil service itself becoming more occupied with the implementation of that advice.
We have been particularly concerned with facilitating interactions between scientists and policy-makers for the development of policies for handling slowly developing risks with potentially catastrophic economic, social or environmental consequences5. We believe from our experience that there is particular scope for engagement by scientists in the area of risk and uncertainty generally.
The management of risk (defined as “an uncertain consequence of an event or activity with regard to something humans value”6) has become a core political preoccupation. A U.K. Cabinet Office study in 20027 noted that the nature of risk had changed for two fundamental reasons. First, the increasingly rapid pace of development of new science and technology has led to concerns about “manufactured risks”. Many of these result from “the interaction of known risks (e.g. natural hazards) with previously unknown or unprepared-for vulnerabilities”8. These require governments and regulators to make judgments about the balance of benefit and risk across a huge range of technologies, many of which (from nanotechnology to large-scale developments of new and existing sources of energy) have a strong physics component.
The second new factor has been the greater interconnectedness of the world, through an integrated global economy, communications system and a shared environment, bringing huge opportunities but also exposing citizens to distant events. These systemic risks are now high on the policy agenda in many countries. Again, there are many areas for physics/mathematics involvement, especially in understanding and predicting the behaviour of networks in particular and complex systems in general.
The cause of scientists who wish to become involved is helped by the growing public intolerance of government inability to assess and manage risk, and also by the growing practice in the media of seeking independent validation of governments’ professed commitment to open, evidence-based policy making. To quote Beddington again, though, “What is more difficult is ensuring that science is brought to bear effectively on the questions which policymakers know matter but which don’t present a single decision moment, or where it is less obvious that science can help.”9 This means that individual scientists have a particular responsibility to communicate when their specialist knowledge and expertise means that they can perceive important scientific issues of which those inside the normal processes of political decision-making may not be aware.
There are pitfalls, however. The case of scientists and officials who were charged with manslaughter following the April 2009 earthquake in the Italian city of L’Aquila, which they had failed to predict, illustrates the legal perils if responsibilities are unclear between governments and their official or unofficial advisers.
To get their voices heard safely, scientists need a formal framework. Such frameworks exist in countries, such as the UK, that have recognized both the benefits and the challenges of integrating scientific advice into policy-making. More recently, the OECD’s report on ‘scientific advice for policy-making’ has articulated three essentials:
- a clear remit, with defined roles and responsibilities for its various actors, distinguishing advisory from decision-making functions, and with the involvement of scientists at all stages, including question-setting, team selection, research and advice, and above all the communication and use of the advice
- involvement of a full range of scientists, policy-makers and other stakeholders in the team
- a remit to produce advice that is sound, unbiased and legitimate.
An increasing number of governments are reviewing the terms of reference for scientific advisory bodies accordingly.
These conditions are the sine qua non for an effective and trustworthy science advisory process.
Scientists need to be aware of them when deciding whether or not to proffer advice. To get their voices heard, however, scientists need more: they need to know how to communicate effectively with politicians and others involved in decision-making processes. Here we summarize some key hints, based on examples of both success and failure across the field of science:
1) Be aware that scientists are often seen as just another lobby group.10 This one simple (if unpalatable) fact means that science advice is more than a matter of speaking truth to power. It is also about persuading those who hold the power that the advice is reliable, that the advisor does not have a hidden agenda, and that the advice is worth both listening to and acting upon11.
2) Be aware of the structure of Government policy-making. Government decision-making is a complex business, and there is a clear distinction to be drawn between political decision-making and policy-making. We do not believe that it is particularly helpful for scientists to attempt to involve themselves directly in the former (there are many unhappy examples to illustrate our point).
Nor is it easy to break through the barriers that government officials erect to protect themselves and their spheres of influence – see Anne Glover’s account12 of her time as Chief Scientific Adviser to the European Commission. But it is quite realistic for scientists to hope to contribute to an evidence base for policy-making, especially in a climate (such as the present one in the U.K.) where a government has declared that it wants open and transparent policy making, with policies based on evidence, and where the prospects for economic growth depend largely on exploiting the innovations that science can offer. Governments are increasingly adapting their policy-making machines to accommodate a more systematic scientific voice. Glover’s success in collaborating with the Director General for Connect on digital communication issues shows what can be achieved when these new structures work.
3) Be aware that science is not usually the only, or even the major, consideration. In terms of political decision-making, science can be very low in the pecking order. Ministers with science degrees are few and far between, and they tend to reach for experts in economic, legal and social issues, as well as for their political advisers. It follows that science advice is most likely to be listened to if it can be integrated with information from these other fields. But don’t expect politicians to pick out the salient political or economic advantages from a complex mess of science. Do it yourself! If you need a few clues, the IOP Report The Importance of Physics to Economic Growth (http://www.iop.org/publications/iop/2013/page_60319.html) is an excellent starting point.
4) Point out the role of your specialty in contributing to the solution of cross-disciplinary problems. The most intractable science-related problems that governments face tend to be cross-boundary, especially in the area of risk analysis, mitigation and prevention. This is where Governments need to bring teams of scientists from different areas together to develop a solution. It is not always obvious which areas may hold the key to a solution, so in cases of present or future risk, be prepared to consider whether you might have something to contribute, and don’t be shy about coming forward.
5) Be aware of the importance of personal contact. The political process is based largely on the development of trust and understanding through personal contacts. Scientists who wish to be heard should aim to develop such contacts, rather than banging the drum from the outside. The point of contact will not necessarily be a politician – it may be a particular committee chairman, a departmental science adviser (in the UK, all major government departments have one), a civil servant or other member of a government department, or even a lobbyist. The important point is to find the right conduit for communication.
6) Be aware of political priorities and the necessity for engaging with them. All too often, researchers with a passion for their subject seem to think that politicians just need to be ‘put straight’ on the science involved in a particular issue. Research11 on why certain types of advice are accepted, and others ignored, shows that this approach is ineffective unless it is framed in terms of the needs and preoccupations of the decision-maker – in this case, his or her political priorities. These may include the social context (e.g. how will voters in the decision-maker’s constituency be affected?), the economic cost or benefit, and even the practicality of taking and implementing a decision before the next election.
To be truly effective, scientists must thus make themselves aware of the political ramifications of their advice, and point these out in clear, unambiguous terms. Scientists should also realize that politicians and other policy-makers are constantly bombarded with information, and short, pithy statements are much more likely to be heeded – especially if the writer takes the time and effort to use effective words and phrases that can be “borrowed” and repeated.
7) Be aware of political timescales. Politicians are primarily concerned with the short-term. Any policy whose benefits will only be felt in the distant future is likely to assume less importance than one whose benefits can proudly be displayed before the next election. It follows that scientific advice (which is often concerned with long-term issues) is most likely to be accepted and acted on if at least some short-term benefits can be identified and “sold” to politicians. This is not cynical – it is practical (after all, a politician cannot implement a policy if he or she is not in power).
8) Offer options, not policies. There is evidence to suggest that science advice is most likely to be heeded if the scientist is perceived as an “honest broker,”11 integrating scientific knowledge and understanding with stakeholder concerns to provide even-handed advice and understanding within a policy context. By acting in such a way, scientists can help to break down the often-held political view that scientists are just another pressure group, or that they are acting to promote the interests of particular pressure groups.
9) Don’t over-claim. Hubris is as much a besetting sin among scientists as it is in other areas of specialism – perhaps more so, since in their efforts to persuade politicians and the public to sit up and take notice, scientists too often tend to overstate their case. In particular, scientists should avoid making predictions. Politicians don’t trust them (having seen so many fail), and are much more likely to be receptive to an understated, even-handed analysis of opportunity vs risk.
10) Make it as simple as possible, but not simpler. Einstein’s famous dictum is especially appropriate when it comes to providing scientific direction for policy. Politicians and other policy makers are well aware that science is complex, but do not appreciate (or trust) oversimplification any more than they appreciate over-complexity. The important point in communicating science in a policy context is to focus on those aspects that are relevant to the problem in hand.
11) Be aware that “more research” is seldom an option. The timescales of politics are such that politicians usually need fast answers to problems that are of immediate importance. It is counterproductive to use these occasions to push for more support for research, however much that support might be needed.
It cannot be said too strongly that requests (or demands) for support for further research simply reinforce most politicians’ view that scientists, like all other pressure groups, are promoting their views mainly to get a larger share of the financial cake. Such a larger slice is only likely to be forthcoming if the arguments for it are kept clearly separate from the offering of scientific advice on particular issues.
12) Look to establish long-term leverage. Scientific advice is often concerned with long-term problems, while there is rapid turnover among those responsible for its implementation (politicians and civil servants especially). One way to overcome this problem is for the scientist(s) concerned to keep a watching brief (perhaps through a scientific society or other network), and to urge that policies based on their advice should incorporate ongoing flexibility and responsiveness, so that actions can be modified as new information comes in and circumstances change.
An example that illustrates these points is the 2010 eruption of the Icelandic volcano Eyjafjallajökull. The risk of such an event causing an ash cloud and widespread disruption to air travel had been identified by the responsible government department as part of the national risk assessment process in 2005; but no-one could be found willing to estimate the likelihood of such an event causing such a disruption, because the likelihood was a function of a number of factors (the frequency and nature of an eruption; atmospheric and weather conditions) which were themselves unpredictable. The risk was held ‘in reserve’ for further study. After the 2010 eruption, the Government Chief Scientific Adviser was invited to pull together a cross-disciplinary team, including physics-based professionals such as vulcanologists, meteorologists and specialists in the behaviour of aerosols, to help policy-makers understand both the risk that such an ‘ash cloud’ might recur and what the ‘reasonable worst case scenario’ would be. A further question arose: “Is this the worst that can happen?” The answer to that was “no”, because the ash cloud risk was substantially trumped by the risk of something like a recurrence of the 1783 eruption of the Icelandic volcano Laki. That eruption produced large quantities of gases, including carbon dioxide and sulphur dioxide, which caused famine conditions in Western Europe and would, if repeated, cause problems not just for transport but for health and agriculture. The lessons here for the use of science to support resilience policy at government level were multiple:
- don’t try to predict, because wrong predictions (the most common kind) undermine trust in science, which is reasonably high in the UK;
- do try to give a best estimate, even if this is provisional upon further research, because ‘no opinion’ is easily read as ‘not a problem’;
- form a team, not just from the lead discipline but from all who have a contribution to make, including policy-makers whose job it is to understand the things that are of value to the government (lives and livelihoods, maintenance of essential services, public confidence);
- assess not just the phenomenon but its impact on these objects of value;
- use the team to build networks usable for problem-solving in other areas, as with the ‘natural hazards team’ formed by the Cabinet Office from scientific experts both within and outside government.
In summary, the systematic use of science is now part of the policy-making landscape13 and, for those who have seen how it can work, is a ‘gift that keeps on giving’. But scientists who want their advice to be heeded need to put themselves in the shoes of their policy-making audience. They should make things easy for that audience by pointing out political benefits (if there are any), making connections with other politically relevant areas, and providing appropriate words and phrases that those whom they wish to influence can pick up and use. These well-established principles of communication may seem self-evident, but if they were that obvious many more scientists would already be using them. More of us need to catch on to them if science is to take its rightful, essential place in the hierarchy of political decision-making.
References
- Sir John Beddington “The Science and Art of Effective Advice” in Robert Doubleday & James Wilsdon (eds.) “Future Directions for Scientific Advice in Whitehall” (2013) 31. (http://www.csap.cam.ac.uk/events/future-directions-scientific-advice-whitehall/).
- James Wilsdon & Robert Doubleday (eds) “Future Directions for Scientific Advice in Europe” (Cambridge: Centre for Science and Policy (2015)) 9.
- OECD, “Scientific Advice for Policy Making: The Role and Responsibility of Expert Bodies and Individual Scientists”, OECD Science, Technology and Industry Policy Papers, No. 21, OECD Publishing, Paris (2015). http://dx.doi.org/10.1787/5js33l1jcpwb-en
- W.J. Sutherland, D. Spiegelhalter & M.A. Burgman “Twenty tips for interpreting scientific claims” Nature 503 (2013) 335.
- R. Fisher “Preparing for Future Catastrophes: Governance principles for slow-developing risks that may have potentially catastrophic consequences” IRGC Concept Note (2013) (http://www.irgc.org/wp-content/uploads/2013/03/CN_Prep.-for-Future-Catastrophes_final_11March13.pdf).
- R.W. Kates, C. Hohenemser and J. Kasperson Perilous Progress: Managing the Hazards of Technology (Boulder: Westview Press (1985)).
- Cabinet Office Strategy Unit report (November 2002).
- OECD “Future Global Shocks – Improving Risk Governance” (June 2011).
- Sir John Beddington, ibid., p. 22.
- Rees Kassen “If You Want to Win the Game, You Must Join In” Nature 480 (2011) 153.
- Roger A. Pielke, Jr The Honest Broker: Making Sense of Science in Policy and Politics (Cambridge: Cambridge University Press (2007)).
- http://www.theguardian.com/science/political-science/2015/feb/05/anne-glover-on-brussels-a-moment-of-magic-realism-in-the-european-commission
- James Wilsdon & Robert Doubleday (eds) “Future Directions for Scientific Advice in Europe” (Cambridge: Centre for Science and Policy (2015)) (http://www.csap.cam.ac.uk/events/future-directions-scientific-advice-whitehall/).