Times Higher Education Supplement

For scientific pioneers, ridicule is an occupational hazard. The early aviation pioneers were ridiculed by the then President of the Royal Society, Lord Kelvin, who declared in 1895 that ‘Heavier-than-air flying machines are impossible.’ British proponents of space travel suffered similarly at the hands of the Astronomer Royal, Sir Harold Spencer Jones, who famously announced that ‘space travel is bunk’ – two weeks before the launch of Sputnik.

Distinguished experts have derided the invention of the light bulb, the aeroplane, the radio valve, the theory of relativity, the atomic bomb, and the proposed use of propellers to drive ships. Others have also joined in the hunt, including the U.S. District Attorney who prosecuted Lee de Forest, inventor of the audion tube that revolutionized wireless communication, on charges of fraudulently using the U.S. mails to sell stock in his Radio Telephone Company.

These days, the proponents of new ideas are as likely to be attacked from outside as from inside science. Proponents of nuclear power, GM crops, new methods of immunization and a host of other new technologies are attacked in the media and by pressure groups, sometimes even physically. How should they respond? What should they do?

‘You can recognize a pioneer by the arrows in his back,’ says Beverly Rubik from Temple University’s Center for Frontier Sciences in Philadelphia. The response of many pioneers in the past has been to shrug off the arrows and to respond to ridicule with perseverance. When the attack comes from outside, though, a different approach is called for. The courage of conviction is not enough; what is needed is the courage to share the process of science, to explain why we think the way that we do, and even (Heaven forbid) sometimes to accept ‘no’ for an answer.

The Center for Frontier Sciences is concerned with giving maverick ideas a fair hearing within the scientific community, where there is often good reason for treating such ideas with caution. The principle was first stated by the Scottish philosopher David Hume, and re-stated by Carl Sagan in his book ‘Cosmos’: ‘Extraordinary claims require extraordinary proof’. The criterion plays much the same role in science as a governor does in an engine, preventing it from racing away and destroying itself in pursuit of mad will-o’-the-wisp ideas. Sometimes such ideas have undermined the foundations of science and replaced them with new ones, but not often. ‘The further an idea is from established scientific wisdom, the closer it is to a Nobel Prize’ goes the popular saying – but most such ideas are closer still to an unmarked grave.

Hume’s and Sagan’s criterion has served science well. Take, for example, Jacques Benveniste’s repeated claim that pure water can carry a ‘memory’ of homeopathic molecules that were previously dissolved in it (and that the information so memorized can be transmitted over telephone lines and the Internet!). This claim, if true, would overturn our thinking about the structure of water, which many experiments have shown to be evanescent and labile on a time scale of microseconds, and quite unable to retain the kind of permanence that would allow ‘memory’. Benveniste’s is an extraordinary claim, but the extraordinary proof has never been forthcoming.

When Hume’s and Sagan’s criterion is applied to debates about whatever new technology is currently making media headlines, the results can be disconcerting to both sides. It doesn’t matter whether the debate concerns nuclear power, animal experimentation, GM crops, immunization, mobile telephones, power lines, or whatever next month’s topic is going to be. The structure of the debate is always the same. A scientist will come forward to say that the benefits far outweigh the risks, or even that the risks are ‘negligible’. An opponent (sometimes another scientist) will then be brought forward to say that the risks are greater than have been admitted, and that the process or procedure should not be undertaken until there can be an assurance that no risk is involved.

Both sides are indulging in hubris. A scientist may well be authoritative in his or her assessment of benefits and risks, but to say that the benefits will be far-reaching, or that the risks are ‘negligible’, is to make an extraordinary claim – that the scientist is able to predict the future – and one that needs extraordinary proof to back it up. Scientists are no more prescient than their fellow mortals, as the examples at the beginning of this article show, and it sometimes behooves us to show a little modesty in this regard.

The opponents of new technologies, however, need to show the same modesty, and for the same reason. In demanding that no technology should be accepted if it carries an element of risk, they are exposing themselves to the same argument that they apply to its proponents. What gives them the right to demand that we take the risk of not introducing the new technology, a risk that may be every bit as great (and as difficult to quantify) as the risk of introducing it?

The plain truth is that mankind needs to take risks in order to progress, and perhaps even to survive. Some people are so frightened of change that they will resist, even to the point of violence, any new technology that they perceive as carrying an element of risk. Scientists do themselves little service in these cases by presenting themselves as authority figures who are making a judgment on behalf of the community as to whether the risk is or is not worth taking. It does not necessarily matter that the risk is very small, especially if the community perceives that the consequences may be very large. An indeterminately small number (the perceived risk), multiplied by an indeterminately large number (the perceived consequence), gives an indeterminate answer, as all scientists know. It is up to the scientists to see that others know this as well, and for both sides to act on it.
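To see why the arithmetic settles so little, consider some purely illustrative numbers (assumptions for the sake of the example, not drawn from any particular case). Suppose the community perceives the risk as lying anywhere between one in a billion and one in ten thousand, and the consequence as touching anywhere between a thousand and a hundred million people. The ‘expected harm’ then spans ten orders of magnitude:

\[
p \times C \;=\; 10^{-9} \times 10^{3} \;=\; 10^{-6}
\quad\text{at one extreme, and}\quad
10^{-4} \times 10^{8} \;=\; 10^{4}
\quad\text{at the other,}
\]

that is, anything from utterly negligible to catastrophic. The calculation alone cannot decide the question; only better-constrained estimates of the risk and the consequence, openly shared, can narrow the range.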

It is a very difficult problem when the media and the public demand black-and-white answers, and when scientists are seen as shilly-shallying if they don’t deliver such answers. It is also a temptation for some scientists to allow themselves to be seen as omniscient, but the temptation should be resisted. The best thing, where possible, is to stick to cases and not to be drawn into generalities. Concrete examples that ask an opponent to offer a decision (e.g. if this vaccine is used, it is likely to save a certain number of lives; if that vaccine is used, it will harm a very small additional number of people, but it is likely to save many more additional lives; which would you choose?) are a scientist’s best defense. They are also our best chance of truly sharing our science with the wider community, to whom science equally belongs.

Scientists need to recognize that the statistical assessment of risks and benefits is only one part of a complex decision process that involves community feelings and attitudes. The numbers may inform these feelings, but they are seldom the sole criterion for most people. Different people will feel differently, and come up with different answers. The role of scientists is to make sure that feelings about both the risk and its consequences are as factually informed as they can be, and to keep their individual opinions separate. This is not to say that scientists should not hold opinions, or that those opinions should not be uttered in the hope of affecting community attitudes. The fact that a scientist is well-informed may help, but only if he or she is seen as impartial and if the opinion is not delivered in a spirit of absolute authority. If Lord Kelvin and the Astronomer Royal could be wrong in their predictions, so could we all.

© This article is copyright Len Fisher. Please email Len Fisher to seek permission to reproduce part or all of the above article.
