With a snap election looking highly probable and the volume of online misinformation seen during the 2017 General Election set to continue, how can we ensure the electorate is making well-informed voting decisions?

Our co-founder, Shelley Metcalfe, spoke recently at TEDx Macclesfield about the cost of careless clicks, how we are all part of the problem and what we can do about it. Watch the talk or read it below:

“Last year, the number of people in Europe infected with measles was the highest this decade: three times the previous year. Of those infected, well over half were hospitalised, and 72 died. We’ve had an effective vaccine for decades, so why are we losing people to this disease? In the UK, immunisation rates are dropping year on year because online misinformation about vaccines is fuelling safety fears. According to the World Health Organization, “vaccine hesitancy” – the reluctance or refusal to vaccinate – is now a major threat to global health – just one example of falsehoods spread online having real-world consequences.

There are plenty more. In 2016, a man became so convinced that politicians were running a child trafficking ring from a Washington DC pizzeria that he walked in with an assault rifle and opened fire.

A recent YouGov poll in the US showed that 16% of adults have doubts about whether the earth is round. For 18–24 year olds, it rises to one in three…

I’m sure you’re aware of other examples.

I spent the first fifteen years of my career working in media and communications, including for some of the UK’s biggest youth media brands… By the time my two boys started going online – to play and to learn – I had a growing, and unsettling, awareness of the enormous influence that digital media would have on how they see and engage with the world.

I wanted my kids to be prepared… but instead of wrapping them in cotton wool, I wanted to give them the tools they need to navigate the internet well. So, I co-founded the Digital Life Skills Company to help young people make sense of the information at their fingertips. Because we, as adults, are not currently equipping them with these skills:

  • Only 2% of 9-16s have the critical literacy skills to tell if a news story is real or fake.
  • 2 in 3 teachers believe that fake news is harming students’ well-being by raising anxiety levels and skewing their world view.

Why aren’t we doing more about this? And why is there so much misleading information online? Well, there are two main issues.

The first is a problem with the technology – let’s call it the Platform Problem. The technology behind the big platforms – like Google, Facebook, YouTube – is designed to engage us as much as possible so that we consume more ads, and they make more money.

So they promote what is popular and engaging over, say, what is accurate.

The interactions don’t have to be positive either. Love it or hate it, whatever gets the most views, clicks, comments and shares gets prioritised. Whether it’s make-up tutorials or Holocaust denial, there’s no such thing as bad engagement.

Which is why bogus stories can get so much exposure. If we click on something out of curiosity, comment on something we disagree with or even share something to debunk it – we’re telling the technology that this is really engaging stuff! So it sends more of this kind of content our way. With every click we are shaping the media landscape.

So what does get the most likes, shares and views? Studies tell us that it’s material that:

  • Induces fear
  • Is divisive
  • Is sensationalised
  • Provokes a strong emotional response

This isn’t new – the press has tapped into this for decades. Fear sells. Shock sells. Which leads us to the second, related problem – The People Problem.

A major study found that, on Twitter, falsehoods spread six times faster and reach significantly more people than truths. That’s staggering, and this virality is driven by us, by our human appetite for novelty. Falsehoods, the researchers found, are 70% more likely to be shared. Which is not so surprising – in a world where we are overloaded with information, falsehoods get the upper hand – after all, it’s easier to be interesting when you aren’t constrained by facts!

And more engagement means there’s money to be made from fabricated stories like a video that says you can charge your iPhone in the microwave. In Macedonia, where youth unemployment is a whopping 55%, manufacturing fake stories for easy money has become an industry. One teacher said she almost doubled her salary:

“I know it’s wrong to take a side job which consists of saying ‘Vaccines kill!’, ‘The Holocaust did not exist’ or promoting Trump, but when one is hungry, one doesn’t have the luxury to think about democratic progress.” [Source: Saska Cvetkovska, Investigative Journalist]

Others invent stories around issues known to push buttons – politics, gun control, immigration – sometimes for political reasons, sometimes for humour. The internet is awash with spoofs, parody and satire, sometimes so convincing that we miss the joke.

Even when satire features disclaimers like this, some people believe and share the hoaxes. Others deliberately remove such disclaimers or copy the stories to make money from the clicks.

Even refuting the lies can feed the system, as some extremist groups are learning. One white supremacist website revealed their strategy of claiming Taylor Swift was a secret Nazi. The media covered the sensational story which was shared by fans defending their idol and critics justifying their dislike. By joining in, both sides unknowingly helped normalise an extreme ideology.

So why do we fall for this? 

Well, we’re human! There are 3 human traits which fuel the spread of false information:

#1 Identity signalling

When we share, we are often saying something about ourselves. Take this story:

“Carly” was frustrated because her mother repeatedly shared “fake news” on Facebook. Each time, Carly would send her mother news stories and links to fact-checking sites refuting the story, but her mother persisted. Eventually, fed up with her daughter’s efforts, her mother yelled, “I don’t care if it’s false, I care that I hate Hillary Clinton, and I want everyone to know that!” [Source: Georgetown Law Technology Review]

If we’re honest, we’re all a bit like Carly’s mum. When we share, often we’re showing who or what we believe in, or trying to coax others around to our way of thinking – even if the content is dubious.

#2 The reiteration effect

We are more likely to believe things that we see or hear repeated.

This is a real problem online: as users copy, edit and rehash digital content, it feels as if things are coming from multiple sources.

And there are now readily available amplification tools, such as liker bots and software that runs armies of fake social media accounts.

Donald Trump didn’t say this, by the way.

The more we see something, the more credible it seems.

This YouTube comment sums it up – “There are 2 million flat earth videos on YouTube, this can’t be B.S.”

The more time we spend on these platforms, the more exposure we get to the sensational stories they promote. Some of the staff employed to moderate social media posts sift through so much conspiracy material that they start to believe the content they are meant to be moderating…

#3 Herd mentality

We humans have been relying on each other’s co-operation and expertise ever since we formed tribes.  We trust information that comes from people we know and like. Why research which washing machine to buy – or who to vote for – when trusted friends have all the answers?

So if a friend shares something, we’re more likely to trust it. Say I trust Emma: if she posts (incorrectly) that an actor is a tax dodger, I’m more likely to believe it. I then share that view with my close friend Jeff, so now he believes it. With three of us now sharing the same (groundless) belief, we’re all more confident we’re right. When dozens, hundreds or thousands of others share the same belief, we can’t possibly be wrong!

Our herd mentality also means that we’re more likely to trust someone who has many followers. Being part of the pack used to mean our survival, so we track our actions and beliefs against our peers and adjust to fit in. If we see a post that a friend has liked, we are 3-4 times more likely to like it ourselves.

So the platforms serve us the kind of content we’ve responded to before, repeatedly reinforcing our existing beliefs; on top of that, we see similar content shared by people we know and trust, and that leaves us feeling a bit like this:

Is anyone questioning their own habits yet? That’s good. Acceptance is the first step! With that in mind, put your hand up if you’ve ever shared an article or video.

Keep your hand up if you’ve ever shared something without clicking the link.

Keep or put your hand up if you’ve ever shared something without reading the article, or watching the video all the way to the very end…

Let me tell you, you’re not alone.

Sensationalist stories, fabricated news and propaganda are not new. But what is new is how pervasive these have become – because of the platforms’ hunger for our attention, and because of the opportunities – for anyone with a smartphone – to make money and influence opinion. But we can change. So… what can we do better?

1. Slow down.

Much of our faulty reasoning can be stopped in its tracks if we take a moment to pause and think. After all, what’s the rush? It’s not as if you’re ever going to finish reading the internet…

2. Read the story. To the end.

A Columbia University study showed that six in ten Twitter links get retweeted without users even clicking them. Check that the headline matches the content, that it hasn’t been exaggerated and that it doesn’t suggest a false connection. If it’s worth sharing, it’s worth reading first.

3. Consider the source.

Fight the tendency to trust something because it came via someone you like, or because lots of people are talking about it. Find out where it came from, and be sceptical if you can’t find a credible source!

4. Notice when our emotions are targeted.

Misleading content isn’t always factually incorrect; sometimes it’s about provoking an emotional response or painting a biased picture. Our critical defences are weak when we feel angry, afraid or upset, so ask yourself: am I being played?

5. Make active media choices.

We should seek out and support – with our attention and with our clicks, comments and shares – sources that are genuinely trying to get to the truth of the matter.

6. Report, don’t share, unacceptable content.

The tech companies need to do much, much more to stem the flow of false, hateful and harmful content. We need to tell them about it, instead of spreading our outrage.

Earlier, I asked – Why aren’t we doing more to teach young people how to deal with online misinformation? Well, the truth is that we didn’t grow up with the internet and we’re just as susceptible to its tricks as our kids.

Every day, you, me, Carly’s mum – we all shape the digital landscape, our clicks and shares influencing what we, and others, will see tomorrow. Right now, we’re lazy. We rely on what shows up in our feeds or at the top of search pages, share content we haven’t read properly, and trust others who make the same mistakes.

We all need to improve our digital literacy. To protect ourselves, and to prepare the next generation for what lies ahead.

We can’t separate the world into truth and lies, but we can show more awareness and ask more questions. Awareness of the technology that knows us better than we know ourselves, of the powerful systems manipulating our beliefs – awareness of our own fallibility. And questions – about both the content we see and our own behaviours. We are not powerless. We are now the media. We should embrace our power to change the digital landscape and choose to make it better.

On April Fool’s Day last month, one of my friends shared this:

Ah, April 1st. The only day of the year that people critically evaluate things they find on the internet before accepting them as true.

We could all do worse than treating every day as if it were April Fool’s Day.”

Watch Misinformation: The Cost of Careless Clicks, a TEDx Macclesfield 2019 talk.
