Stefan Rollnick’s roundup of platform profits, vulnerabilities and the damage of disinformation.
Moving forward from 2022, 2023 looks like the year of the battle strategy. With misinformation spreading – and, indeed, landing – in certain vulnerable communities, our Head of The Misinformation Cell, Stefan Rollnick, assesses the purpose, process and repercussions of damaging disinformation narratives in the current climate.
Why have recent misinformation narratives been spreading in the first place?
Take some of the most recent fake news stories of 2023: the Oxford LTN protests, Strep-A, or even the conspiracy theories surrounding the World Economic Forum. What do they all have in common? They’re all having a ‘real world’ impact.
Recently, for instance, Lynn Planet and The Misinformation Cell have both observed a marked increase in misinformation narratives surrounding sustainability – sustainable travel in particular. But why are people falling for these narratives? It comes back, of course, to freedom, fear and core beliefs.
First, look at denying people access to their cars. Humans inherently associate cars with freedom. Without cars, mobility feels restricted: opportunities are lost, everyday life becomes inconvenient – impossible, even, in some places. Oxford, for example, is a very old city plagued with a very new problem: traffic. In a bid to divert traffic from its narrow streets, Oxfordshire County Council introduced Low Traffic Neighbourhoods (LTNs). Their function is to minimise motor traffic on certain roads, encouraging people either to take an alternative route (like the ring road) or to choose an alternative mode of transport (like walking, cycling or taking the bus).
The Twist
Somewhat inexplicably, however, the introduction of LTNs became conflated with the idea of 15-minute cities. Where LTNs discourage people from cutting through neighbourhoods by – in some places – blocking roads with bollards and planters, 15-minute cities are designed to reduce car journeys not by blocking road access, but by ensuring that all amenities are within a 15-minute walk of your home. The hybrid interpretation pushed by conspiracy theorists is that the two amount to a global plot to trap people within city zones.
Despite the obvious mix-up, the perceived threat to life and freedom – amplified by conspiracy theorists and, interestingly, Conservative MP Nick Fletcher – has resulted in massive public pushback. In his LinkedIn post on how misinformation flips positive policies, Stefan outlines the damage disinformation can cause in pursuit of money and civil unrest, and warns local authorities accordingly: ‘The message for Local Authorities: think very carefully about how you frame these changes with your publics. It will determine how vulnerable they are to this kind of disinformation.’
With members of parliament spreading the very disinformation created to sow distrust in government, it’s little wonder that people spread it further.
The compulsion to spread misinformation.
It’s not only the endorsement of rogue MPs that spreads misinformation. It likely reached Nick Fletcher through the same social media channels through which it reaches all of us. But how? The answer, as ever, is money.
‘Here’s what you need to know,’ Stefan writes in his post regarding social media greenwashing, ‘The platforms are designed in a (profitable) way that basically nudges people towards sharing and spreading misinformation.’
Disregarding Richard Thaler’s advice to ‘always nudge for good’, social media nudges for neither good nor bad. Rather, it nudges for money. Platforms profit from ads and therefore make more money the longer we spend on them – and they do all they can to keep us there.
How they do this is old news. The human brain is biologically programmed to react positively to social recognition. Sharing news, articles and, basically, anything consumable is a quick and – thanks to social media – easy way to gain that recognition. As Gizem Ceylan explains in her recent research article, “Sharing of misinformation is habitual, not just lazy or biased”, there is a ‘reward structure on social media that encourages users to form habits of sharing news that engages others and attracts social recognition.’ More often than not, it is fear-inducing, shocking or laughable misinformation narratives that attract such recognition. And so the habitual cycle continues.
The danger of the comeback.
It is because of these algorithms that even the simple act of fighting back – of correcting the inaccurate – can become harmful. Social media algorithms make a post more visible the more interactions it gets. The truth of the matter is that by rebutting a piece of misinformation, you could be helping it reach more people.
In his article for Transport Xtra, Stefan explains how futile rebuttal can sometimes be: ‘While academic evidence shows that, in general, correcting misinformation with the facts is useful, we know that: a) it has a limited effect on behaviour change, and b) the academic research is taking place in a really simplified environment (i.e. a lab), and in reality the system is much more complex. Any time we correct a piece of misinformation publicly it can meanwhile have unforeseen knock-on consequences, so we must respond carefully.’
Responding carefully, Stefan goes on to say, involves addressing ‘deeper truths’. In other words, meeting your audience with story: ‘The truth is that all of us mere mortals, regardless of how rational we try to be, make sense of the world through storytelling – and the disinformation spreaders understand this. Strategic communicators are beginning to understand this as well – Joe Biden’s presidential campaign team in 2020 deftly operationalised a storytelling approach to fighting disinformation.’
The aftermath.
So where does all of this leave organisations that choose to fight back? How do we arm ourselves against the increasing onslaught of false news?
An easy response would be to say we must improve people’s ability to identify fake news. As Ceylan surmises, ‘older people and those with weaker or less critical reflection tendencies may fail to detect the veracity of information and thus be less discerning in their sharing’. However, as she goes on to explain, misinformation sharing may have more to do with social media habits than with education. And habits, as we know, are difficult to disrupt.
Even the misinformation ‘countermeasures’ social media platforms have devised have had little to no effect. As Stefan sums up, ‘these are a sticking plaster over a problem that the platforms deliberately engineered because it makes them money.’
Accountability for the spread of fake news has always lain, and will always lie, with social media platforms. The issue: with their majority control over news, who can possibly make enough noise for them to finally listen?