My latest Science Show broadcast for Australia's ABC Radio National, in which I argue that lazy and habitual thought processes are causing untold damage. Here is the link:

http://www.abc.net.au/radionational/programs/scienceshow/challenging-our-thought-processes,-biases-and-assumptions/9162084

and here is the original unedited transcript:

I love charity shops and second-hand bookshops. You never know what you might find.

I was browsing in the book section of a charity shop recently when I came across a little book called “The Insanity of the Over-exertion of the Brain” by a Scottish doctor with the wonderful name of J. Batty Tuke. “Batty” just about sums it up, because Dr Tuke’s thesis was that the more we use our brains, the more likely we are to go mad as a result.

Tuke published his book in 1895, and I must admit that the world seems to have gone more and more mad ever since, although perhaps not for the reasons that he advanced. In fact, I would suggest that the opposite is the case: the less we use our brains, the madder we get.

One way to avoid using one’s brain is through what psychologists call “cognitive bias,” which basically means having a pre-conceived picture of how the world works, and fitting everything to that picture without thinking it through. We all do it; even scientists like me. In fact, I was looking just last week at a map of the 188 possible types of cognitive bias, and I was disturbed to find that I seemed to have most of them.

Even Occam’s Razor was listed, in the form of a tendency to believe that the simplest explanation is most likely to be true. Now that’s not what Occam’s Razor is really about. As I once said in an ABC Radio Ockham’s Razor programme, the principle is really about being critical of your own thought processes, and that’s what I want to talk about today. J. Batty Tuke was concerned about the insanity of over-exertion of the brain. What concerns me is the insanity of under-exertion of the brain.

Under-exertion of the brain means falling for your own cognitive biases, sometimes without realizing that they are there, but more often because it feels too uncomfortable to challenge them. One of the great benefits of being in science is that you have to challenge your own biases, and your belief in your own theories, because you know damn well that if you don’t, someone else will, and that if you are wrong the results are likely to be very painful.

I well remember an occasion when I had performed what seemed to me to be some beautiful experiments, and where I pointed out to an international audience that great care had been required, because even the slightest change in conditions might have produced a different outcome. At the end of my talk a Nobel Prize winner in the audience rose to ask a question: “Why did you do it then?”

Ouch.

But the biggest ouch of all, the one that is causing so much damage to the world today, is a philosophical howler of such gigantic proportions that we would have to turn our telescopes backwards to see it. It pervades our lives with its spurious logic. Many of us live by it, and all of us have surely fallen for its pernicious charm at some stage.

It is simply this: “It sounds right, so it must be true.”

Let me give you a simple example – the idea of a “balance of Nature” – the belief that natural systems have built-in checks and balances that ensure long-term stability, so long as we don’t go interfering with them.

It sounds right, so it must be true. Right? Wrong.

The idea goes back to the Greek historian Herodotus, but its modern progenitor was an obscure self-taught Illinois biologist called Stephen Forbes. If you’ve never heard of him, you are not alone, but he was the one who came up with the notion of “the balance of nature” in the 1880s as a result of his studies of fish populations in the Illinois lakes. He observed that they formed a complex network together with their food sources – one that could “self-regulate”, so that if there was a disturbance in one part, other parts would change in response to bring the system back to equilibrium.

The idea of the balance of nature had such intuitive appeal that no one really thought to question it for the next hundred years. It became a foundational principle of ecology, and even found its way into economics in the guise of the “free market”. But recently, people have started to ask questions. Serious questions.

It took an Australian to rock the boat – my old friend and bridge partner Bob May, who later became President of the Royal Society and Chief Scientific Advisor to the British Government. Bob was a theoretical physicist at the time, but in the early 1970s he put a bomb under the world of biology by showing mathematically that, the more complex an ecological network became, the less stable it was. More recently, he showed that the same principle applied to banking networks, and just a few weeks ago I was able to surprise a group of insurance company CEOs with a similar observation about their industry.
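For the mathematically curious, the flavour of May’s argument can be captured with a toy calculation: build a random “community matrix” of interacting species and ask whether small disturbances die away or grow. The little Python sketch below is purely illustrative (the set-up and parameter values are my own choices for this post, not May’s original model), but it shows the essential result: once the combination of web size, connectance and interaction strength gets large enough, stability becomes the exception rather than the rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_stable(S, C, sigma):
    """Build one random 'community matrix' for S species, where each pair
    interacts with probability C and a strength drawn from N(0, sigma^2),
    and each species damps its own fluctuations (diagonal = -1).
    Returns True if the equilibrium is locally stable, i.e. every
    eigenvalue has negative real part."""
    A = -np.eye(S)
    mask = rng.random((S, S)) < C
    np.fill_diagonal(mask, False)
    A[mask] = rng.normal(0.0, sigma, size=int(mask.sum()))
    return np.linalg.eigvals(A).real.max() < 0

def stability_fraction(S, C, sigma, trials=200):
    """Fraction of random webs of this size that turn out to be stable."""
    return sum(is_stable(S, C, sigma) for _ in range(trials)) / trials

# May's criterion: random webs are almost surely stable only while
# sigma * sqrt(S * C) < 1, so bigger, more connected webs are more fragile.
for S in (10, 20, 40, 80):
    C, sigma = 0.3, 0.3
    complexity = sigma * np.sqrt(S * C)
    print(f"S={S:3d}  sigma*sqrt(SC)={complexity:.2f}  "
          f"stable fraction={stability_fraction(S, C, sigma):.2f}")
```

Run it and you will see the fraction of stable random webs fall from nearly one to nearly zero as the webs grow bigger and more connected, which is May’s point in miniature.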

The point is that complex networks have their own rules of behaviour, and until you go past the intuitive picture and start to delve more deeply, you won’t get to the reality. The reality for complex networks, whether they be ecological networks, economic networks or social networks, is that, yes, they do have periods of deceptively reassuring stability where something like a “balance of nature” prevails, but they are also subject to times of unexpected, and often unpredictable, turbulence and change, where the rules go out the window, the balancing mechanisms no longer work, and everything is up for grabs.

These times of change go under various names: “tipping points”, “thresholds”, or (if you are a scientist) “critical transitions”. Whatever you call them, they don’t fit with the traditional idea of a “balance of nature” – but they do fit with reality.
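To get a feel for how such a transition works, here is another purely illustrative sketch, this time of a single population under slowly increasing harvesting pressure, a standard sort of toy model used to demonstrate “critical transitions”. The population tracks a comfortable equilibrium as the pressure rises, giving every appearance of a balance of nature, until the equilibrium abruptly vanishes and the population collapses. The equations and numbers are again my own illustrative choices.

```python
def simulate_collapse(r=1.0, K=10.0, h=1.0, c_max=3.0, steps=60000, dt=0.01):
    """A population N grows logistically (rate r, capacity K) while being
    harvested at rate c*N^2/(N^2 + h^2).  The pressure c is ramped up very
    slowly; N tracks a stable equilibrium until c crosses a fold bifurcation
    (around c ~ 2.6 for these parameters), after which it collapses abruptly."""
    N = K
    trajectory = []
    for i in range(steps):
        c = c_max * i / steps                        # slowly increasing pressure
        dN = r * N * (1 - N / K) - c * N**2 / (N**2 + h**2)
        N = max(N + dt * dN, 1e-6)                   # simple Euler step, keep N > 0
        trajectory.append((c, N))
    return trajectory

# Sample the run: the population declines gently, then suddenly crashes.
for c, N in simulate_collapse()[::6000]:
    print(f"pressure c = {c:.2f}   population N = {N:.2f}")
```

Nothing in the gentle early decline hints at the crash to come, which is exactly what makes such tipping points so treacherous.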

And that’s my point. Cognitive biases, and in particular our tendency to believe that if something sounds right then it must be true (especially if it fits with our preconceptions), are our mental short-cuts. Unfortunately, as is often the way with short-cuts, they can sometimes lead us to quite the wrong destinations.

To find our way to the right conclusions in this world of sound-bites and short-cuts, we need to avoid the insanity of under-exertion of the brain. J. Batty Tuke thought that brain over-exertion damaged the complex, interconnected neuronal pathways. I believe that under-exertion can lead to an even worse situation: a failure to appreciate the amazing, and sometimes frightening, ways in which our complex, interconnected world actually works – a world where, contrary to our cognitive biases, if it sounds right, then it is more than likely to be wrong.
