Another example was produced by economists Linda Babcock and George Loewenstein, who ran an experiment in which participants were given evidence from a real court case about a motorbike accident. They were then randomly assigned to play the role of plaintiff’s attorney (arguing that the injured motorcyclist should receive $100,000 in damages) or defence attorney (arguing that the case should be dismissed or the damages should be low). The experimental subjects were given a financial incentive to argue their side of the case persuasively and to reach an advantageous settlement with the other side. They were also given a separate financial incentive to accurately guess what damages the judge in the real case had actually awarded. Their predictions should have been unrelated to their role-playing, but again, their judgement was strongly influenced by what they hoped would be true. (Location 453)
Psychologists call this ‘motivated reasoning’. Motivated reasoning is thinking through a topic with the aim, conscious or unconscious, of reaching a particular kind of conclusion. (Location 461)
Modern social science agrees with Molière and Franklin: people with deeper expertise are better equipped to spot deception, but if they fall into the trap of motivated reasoning, they are able to muster more reasons to believe whatever they really wish to believe. (Location 529)
Consider this claim about climate change: ‘human activity is causing the Earth’s climate to warm up, posing serious risks to our way of life’. Many of us have an emotional reaction to a claim like that; it’s not like a claim about the distance to Mars. Believing it or denying it is part of our identity; (Location 551)
If you doubt this, ponder the findings of a Gallup poll conducted in 2015. It found a huge gap between how much Democrats and Republicans in the United States worried about climate change. What rational reason could there be for that? Scientific evidence is scientific evidence. Our beliefs around climate change shouldn’t skew left and right. But they do.16 This gap became wider the more education people had. (Location 556)
If emotion didn’t come into it, surely more education and more information would help people to come to an agreement about what the truth is – or at least, the current best theory? But giving people more information seems actively to polarise them on the question of climate change. This fact alone tells us how important our emotions are. (Location 564)
But Lord and his colleagues discovered something more surprising: the more detail people were presented with – graphs, research methods, commentary by other fictional academics – the easier they found it to disbelieve unwelcome evidence. If doubt is the weapon, detail is the ammunition. (Location 583)
When we encounter evidence that we dislike, we ask ourselves, ‘Must I believe this?’ More detail will often give us more opportunity to find holes in the argument. And when we encounter evidence that we approve of, we ask a different question: ‘Can I believe this?’ More detail means more toeholds to which that belief can cling. (Location 586)
If we already have strong opinions, then we’ll seize upon welcome evidence, but we’ll find opposing data or arguments irritating. This ‘biased assimilation’ of new evidence means that the more we know, the more partisan we’re able to be on a fraught issue. (Location 591)
So why do many people remain sceptical? Part of the answer is a sad history of reckless publishing around the issue. But in part the doubts persist because many people have heard of children whose autism was diagnosed soon after an MMR vaccination, and whose parents think the MMR was to blame. Imagine taking your child for the vaccination, and soon afterwards receiving a diagnosis of autism. Would you connect the two? It would be hard not to wonder. In fact, the prevalence of such anecdotes is not surprising because autism tends to be diagnosed at one of two ages: early signs of the condition are observable by paediatric nurses at around the age of fifteen months; if not picked up then, diagnosis often follows a child starting school. (Location 858)
Psychologists have a name for our tendency to confuse our own perspective with something more universal: it’s called ‘naive realism’, the sense that we are seeing reality as it truly is, without filters or errors. (Location 871)
The Nobel laureate economist Friedrich Hayek had a phrase for the kind of awareness it’s hard to capture in metrics and maps: the ‘knowledge of the particular circumstances of time and place’. (Location 934)