Society’s Acceptable Misinformation

Author: Barty Wardell (Co-President)

After a year's hiatus, we are planning to use this blog more regularly, so do stay posted!

I wanted to write about misinformation, and how misinformation is itself mischaracterised. Many people have heard about the corrosive effects of misinformation, mostly because the propagation of anti-vax ideology has shown how quickly distrust can spread – and how demonstrably harmful that mistrust can be. Many others have written far better than I am able to about this specific phenomenon and its social and psychological context, so if you are interested I would (for the purposes of this post) direct you elsewhere. However, there are some things I believe deserve greater attention.

I had a significant mindset shift after speaking to a friend I saw at New Year. They had been sharing a post about LFT swabbing from the i newspaper, long before there was any consistent data on whether or not the suggested change made LFTs more effective. The article itself consisted of two epidemiologists posting to their personal Twitter accounts, merely highlighting the possibility that LFT swabbing should be changed, with both describing their evidence as anecdotal. The i had then repackaged this as worthy of scientific or public attention, leading my friend to share it.

I was irked by this, because I thought both the newspaper and my friend, who has a science degree, should know better than to share what is clearly conjecture. Their rebuttal was to the effect that no social harm was done, which is of course only superficially true: it ignores first-exposure (anchoring) bias, and the fact that sharing such material conditions people to accept similarly low standards of information in the future. Furthermore, it builds trust in the sharer of the information, who is then likely to repeat the damage. The fact is they were simply trying to be Covid secure, but that is not what surprised me – what surprised me was that the others there agreed the article was not robust, yet also saw no harm in sharing it.

This sits in a wide array of what I call “Acceptable Misinformation”: material that is socially acceptable to support, discuss and share, but is widely known to be misinformation. This can be seen in the difference between anti-vax claims, which most people strongly reject when they come up in conversation, and ill-researched dietary advice or beauty tips, which might be casually shared by friends. An easy explanation is that one does great societal damage while the other, though certainly harmful, is mostly less so. While this is true, it ignores the psychological reality that the less you question information, the easier it becomes to believe increasingly invalid sources. There is also the fact that the same online spaces that share this low-grade misinformation often contain links to more damaging pseudoscience as well.

We all have our own unique experiences, full of events which colour and warp our perception. What is key to realise is that we all spread this “Acceptable Misinformation”, from questionable historical facts to home cleaning solutions, and we will repeat it without worry for most of our lives. The critical realisation is that, as a result, we are already primed to accept worse conspiracies than the mild ones we hold dear: we are all susceptible to misinformation. I feel this should shock many into re-estimating the damage of misinformation, and into holding more sympathy for those who fall further into conspiracy belief than perhaps the rest of us do.

Using Critical Thinking to Build Resilience Against Misinformation – Prof. John Cook

We welcomed Professor John Cook, Research Assistant Professor at the Center for Climate Change Communication at George Mason University, for our first presentation of Lent term. Prof. Cook has built on decades of research into inoculation theory in the field of behavioural psychology, and in his talk he discussed how it may be applied to climate change denial.

His talk, titled “Using Critical Thinking to Build Resilience Against Misinformation”, delved into his research into inoculation theory, a suggested framework for stopping the spread of misinformation. 

The key feature of inoculation theory is that it exposes people to a “weakened form” of misinformation. A warning is displayed before the misinformation is shown, and this is then followed by an explanation of the relevant counter-arguments. The aim of this technique is to train people to recognise the overarching characteristics of science denial when they encounter misinformation in day-to-day life. Professor Cook groups these practices into five main characteristics:

  1. Fake experts: The practice of presenting an unqualified person or institution as a source of credible, expert information.
  2. Logical fallacies: When the premises of an argument do not logically lead to its conclusion.
  3. Impossible expectations: The act of demanding unrealistic standards of certainty before acting on science, when science can never give absolute certainty on any finding. 
  4. Cherry picking: When data are carefully selected to appear to confirm one position while ignoring other data that contradict that position.
  5. Conspiracy theories: When it is assumed, without sufficient evidence, that there is a scheme planned by those with nefarious intent.

One of the highlights of the talk was the discussion of parallel argumentation – a method used effectively to demonstrate the ways in which misinformation is logically flawed. This process involves transplanting the flawed logic of a piece of misinformation into another scenario, exposing the absurdity of that logic. An example can be seen in Professor Cook’s comic strip below.

Cranky Uncle comic depicting parallel argumentation.

Prof. Cook went on to discuss the challenges faced in combating misinformation and suggested psychological and behavioural reasons for its spread in today’s society.

The “Cranky Uncle” app, developed by Prof. Cook, presents a possible solution to these problems. It hosts a game that teaches players about misinformation and encourages them to compete to be the best at spotting it. Prof. Cook believes the app has the potential to break into echo chambers and teach critical thinking skills to children if implemented in a school setting.

CUSAP is grateful to Professor Cook for joining us to share his fascinating research and innovative solutions to the global problem of misinformation. The practical application of inoculation theory demonstrates a way in which the spread of misinformation and pseudoscience can be countered and suggests we all have a role to play in inoculating ourselves and others.

Eilidh Hughes | Climate Change Awareness Officer, Students Against Pseudoscience