Why no-one believes what scientists tell them

George Monbiot is a former environmental scientist turned journalist-activist. Many moons ago I studied physics, before joining the development and human rights dark/light side (depending on your point of view). So his recent meditation on the nature of science and ‘public reason’, as Amartya Sen would call it, struck a chord (and not just with me, if the 1,200 comments on the article are anything to go by). It also echoes the discussions I regularly have with NGO colleagues.

The prompt for his musings is the extraordinarily successful counterattack by climate sceptics on the scientific evidence for climate change. What Monbiot realizes is that, as with so many contentious issues of public policy, this discussion is only partly a rational argument, the outcome of which is determined by the evidence available. As I am finding with the debates on the Financial Transactions Tax, discussions may go through the motions of examining the evidence, but beliefs are fixed, emotional and largely impervious to new information or analysis.

Why is that? Monbiot thinks it is partly down to the employment structures behind public debate: ‘Views like this can be explained partly as the revenge of the humanities students. There is scarcely an editor or executive in any major media company – and precious few journalists – with a science degree, yet everyone knows that the anoraks are taking over the world.’

But he is too intelligent just to blame the media, and points to the way science itself has evolved. ‘The detail of modern science is incomprehensible to almost everyone, which means that we have to take what scientists say on trust. Yet science tells us to trust nothing, to believe only what can be demonstrated. This contradiction is fatal to public confidence.’ I would go further on this point: as Karl Popper argued, good scientists inhabit a world in which no law can ever be definitively proven to be true; it can only be proven to be false.
Until that time, they can use the law as a working hypothesis. That is hardly guaranteed to reassure a general public desperate for certainty.

[Image: “It's not him, it's us”]

Monbiot goes on to echo recent discussions on behavioural psychology from George Lakoff and others: ‘Those who see themselves as individualists and those who respect authority “tend to dismiss evidence of environmental risks, because the widespread acceptance of such evidence would lead to restrictions on commerce and industry, activities they admire”. Those with more egalitarian values are “more inclined to believe that such activities pose unacceptable risks and should be restricted”. These divisions, researchers have found, are better at explaining different responses to information than any other factor. Our ideological filters encourage us to interpret new evidence in ways that reinforce our beliefs. “As a result, groups with opposing values often become more polarised, not less, when exposed to scientifically sound information.”’

All this leads him to a deeply pessimistic conclusion: ‘Perhaps we have to accept that there is no simple solution to public disbelief in science. The battle over climate change suggests that the more clearly you spell the problem out, the more you turn people away. If they don’t want to know, nothing and no one will reach them. There goes my life’s work.’

I think the people who work for NGOs come from both sides of the divide. There are plenty of enlightenment rationalists, looking for evidence, believing in progress, both human and technological. But there is also a deep strain of those whose ‘ideological filters’ make it simply impossible for them to accept that technology X could be positive, whatever the evidence.
[Image: “Evidence-based policy making, anyone?”]

This hostility is often based (with some justification) on the ways that control over new technologies excludes poor people and exacerbates inequality, but I also think it reflects deeper belief systems. On some technologies there is a broad consensus (IT and renewables good; weapons bad). But on many others, partly to maintain internal peace, the rationalists in NGOs have to settle for (at best) agnosticism and fence-sitting on issues such as GM, nanotech, nuclear power or geo-engineering.

I am doubtful that evidence will ever allow us to reach consensus on those – the Georges (Monbiot and Lakoff) are right. That worries me, not least because of the missed opportunities to channel science for the benefit of poor people, and because we risk surrendering the issue to the bad guys. Got a feeling I might get a few comments on this one…



9 Responses to “Why no-one believes what scientists tell them”
  1. George Monbiot may be many things, and he may be right or wrong, but he is certainly not a former environmental scientist, and I don’t think he would claim to be so.
    I believe he studied biology as an undergrad, but since then he has been a journalist, writer and activist.

  2. Tord Steiro

    Hmmm, if you don’t know for sure, but fear grave consequences, well, you buy insurance, no?
    Whether the IPCC is the right body to grant a forced insurance monopoly is perhaps questionable, but the need for insurance is not!
    Same can be said about the Robin Hood Tax and financial stability…

  3. Michael

    I think you’re very close to right, and we should be examining in ourselves how to persuade under these circumstances.
    Certainly, when talking to conservatives, the need to deny evidence in order to maintain the worldview has pushed conservatism away from Enlightenment empiricism. Faced with a choice between abandoning clearly false ideals, and abandoning reality, the conservative movement has chosen the latter. In that case, arguments based on reality, rather than the needs which are being fulfilled by the ideals will fail.

  4. Stephen Jones

    Ben Goldacre’s Bad Science blog, also at the Guardian, covered this too.
    I’d be interested to know what the feedback is from Oxfam’s public campaigning on climate change (for example, the big posters on the Underground), which say that the effects are real and happening already on people in other parts of the world. Is this likewise a case of preaching to the converted?
    But in general, it’s a reminder of the strangeness of the phrase ‘evidence-based policy’ – surely policy is based on politics… which relies on appealing to people’s mix of ‘rational’ and ‘irrational’ beliefs.

  5. Eliot Whittington

    But then there is a genuine question in places about how much we WANT evidence based policy… An example that has been recently high up in the public domain is drugs-policy. It strikes me that the majority of the UK public want a drugs policy in tune with their values. They hope the evidence backs up their values but the values are the important thing. And maybe we shouldn’t fight that. If people believe that drugs are a moral bad and should be banned then it doesn’t really matter how much evidence you put in front of them showing that the social harm from drug use might be minimised by other approaches than criminalisation. If you do want to change drugs policy then it’d be more effective to go to the heart of what they believe and enter into the debate on whether or not drugs are bad. And if you don’t want to do that then maybe you need to accept the policy and work with it…

  6. Chris Jochnick

    Great blog – very relevant – but I think you miss an issue at end. What keeps us (activists) sitting on fence on things like nuclear power and GM is not just “belief system” filters but a combination of (i) very high stakes and (ii) our limited ability to predict where these things will take us, both as matter of science and politics. Fence-sitting is weak, frustrating and unsustainable, but it’s probably the most rational stand on certain issues (at certain times) under the circumstances.
    Duncan: Interesting Chris – the precautionary principle applied to policy and advocacy work. But if the high stakes are both for good and bad, at what point do we take the plunge, and on what basis? We may have to throw caution to the winds on climate change at some point. How do we go about making that decision?

  7. Thanks. Very informative and useful.
    The key lies in the human belief that we actually take action based on our prediction of what our future self will want, or what we can predict about the future in our mind, which is a flawed model (at best) and a delusion (most often). In “Stumbling on Happiness”, Gilbert illustrates well our blindspots and our ability to rewrite our own logbook to say “I meant to do that”, when in reality we simply respond to stimuli in a primitive manner and then make up a story for ourselves. The only true choice we make in real time is when we say “no” to something. Otherwise, wisdom comes from our ability to create (in response to catastrophe or desires) systems of denial (laws and fences) to keep us from doing things automatically that we wouldn’t do if we actually were able to think and control our actions. As the saying goes, “Locks are for honest people.” The scientist is important over the long term to present and record information, so that when enough stimuli are accumulated or a catastrophe happens, we can use that data to act wisely, rather than reactively without guidance. Until NYC is under water, global warming is merely a data collecting opportunity.