There are two fundamental reasons why misinformation is a bigger problem than ever before.
The first is tied to the rise of the internet and social media. Where once only the richest and most powerful could spread their ideas widely, now the vast majority of people can do so at the click of a button.
This empowerment of those who were once voiceless can be a very good thing. But now malign actors can also influence enormous numbers of people in almost all parts of the globe.
Anybody from a foreign power to a politician to a company can inject huge quantities of bad information into the system – reaching millions of people at a fraction of the time and cost it once took.
Identifying bad information is hard
This alone wouldn’t be the problem it is if it weren’t for the other factor: for many reasons, people aren’t great at figuring out what information to believe.
And I mean all people – including you and me.
This is because the task of identifying bad information is enormously difficult and the proverbial dice are loaded heavily against us.
Lies are much easier to weed out if you can check them against your own direct evidence or experience. This is why nobody lies about commonly observable things like the colour of grass or the legal driving age.
Unfortunately, most things in the media can’t be verified by us firsthand. Did Trump engineer a coup? Are vaccines safe? Is climate change real? Even if we have access to real data at scale (which hardly ever happens) almost none of us have the expertise or time to evaluate it well.
This is a really hard situation to be in. How do we handle it?
It turns out that we use a lot of cognitive shortcuts – what researchers call heuristics. One of those shortcuts is favouring information that is easy to process.
Unfortunately, that means that simple falsehoods are tempting to believe over nuanced truths.
Another common heuristic is believing information we encounter frequently. This is sensible if you’re in an environment where falsehoods are weeded out by selection – if the people who claim there isn’t a tiger outside get eaten by the tiger, after a while, nobody will claim there isn’t a tiger.
But today’s media environment has disentangled frequency from truth.
Emotion short-circuits memory
Most algorithms that drive what we see, as well as our own attention, favour information that is novel and emotionally charged. People on social media also often want to achieve social aims, like connecting with a group or advertising who they are.
Combine these tendencies with the fact that strong emotion disrupts our memory for whether something is true, and it is easy to see that false news that makes us angry or scared, or that appeals to our social group, will spread much faster and much further than a boring truth.
This makes us see it more often, and this increased frequency makes us believe it even more.
What is the role of rationality in all of this?
We are certainly capable of evaluating the truth of a claim based on whether it makes sense. But without direct access to the truth, the best we can do is analyse and interpret information from people who claim to have that data (and who might be lying).
We do that by seeing whether their information is consistent with our other beliefs, as our beliefs aren’t isolated chunks of knowledge but form a thick web of interconnected theories, ideas and premises.
When that web is broadly correct, then analysing new information by comparing it against what we already know is a very sensible thing to do.
This cognitive approach often comes undone, however, when people are exposed to information that mixes lies with truths, or when the falsehoods support an emotional need to believe some things over others (as we see with persistent beliefs that vaccines are unsafe or that elections were stolen).
In an effort to satisfy those needs, it’s possible to get sucked so far into a web of misinformation that the entire belief set is incorrect.
Be aware of your own biases
When your premises are false, only further falsehood makes sense. The end result is that people start believing in conspiracy theories and seem to inhabit an entirely different reality.
So, what can be done?
Truly, this is a wicked problem and many of the solutions are systemic. Somehow, truthful and correct things need to get shared more widely and more quickly.
There are some things we as individuals can do – the main one is to be aware of these biases in ourselves. When you find yourself feeling strong emotions, that is often a sign that you’re being manipulated. Take a moment to put yourself in the shoes of the person who wrote or shared the information; ask yourself what their agenda is.
If something sounds unbelievable or oversimplified – that is another cue to try to confirm it independently.
Recognise your own fallibility and be humble – you will get things wrong, but if your worldview isn’t entirely entrenched in falsehoods, you’ll be much more likely to get the important things right.
How do you reach someone who is fully absorbed by conspiratorial beliefs?
That too is very hard, because they don’t trust anybody who believes differently from them. If you aren’t close to them already, you probably can’t do anything. If you are, and you can, stay connected and build trust slowly.
You won’t win somebody over with facts; instead, try to understand what emotional purpose their beliefs serve and see if you can give them other ways of meeting those needs, like offering support and connection.
It’s hard, long, aggravating work, and it is much more effective to reach people before they are pulled in too far.
More broadly, be kind.
It is far easier to be led by our emotions, particularly our negative ones like fear and anger, in times (like now) when there is so much uncertainty, pain, and difficulty.
The more we can build strong bonds and trust with each other, the more we can fight the pull of the falsehoods and exaggerations that try to tear us apart.
Discover more by subscribing on Apple, Google, Spotify or wherever you get your podcasts.
Hosted by award-winning journalist Lynne Malcolm, this exciting new podcast series unpacks today’s big issues from the perspective of psychology research.