SAN FRANCISCO, California: Facebook elicited some laughs in January when it announced its grand strategy to discover which media publications are trustworthy.
Detailed in a post by Mark Zuckerberg, the plan would decide which stories would be prioritised in Facebook’s revamped newsfeed.
It was also simple: Users would be shown a list of news sources and asked two questions. Had they heard of them? And did they trust them?
The reach of fake news during the US presidential election showed beyond doubt that people are not good judges of what is trustworthy. And even after months of public debate about how to tackle the problem, fake news is still spreading fast.
Hoax videos topped US trending lists after the recent school shootings in Florida, disinformation has been rampant in the Brazilian and Italian election campaigns, and online propaganda is going viral in Myanmar.
To be fair to Zuckerberg, I do believe it is valuable to push people to think about where their news comes from.
As a reporter in San Francisco, I am often surprised by people’s lack of knowledge about how stories are discovered, reported and published.
I find myself having to explain how little is known in the first minutes and hours of a breaking news story, or how polling data can be interpreted in multiple ways and should therefore be treated with care.
LEARNING ABOUT DISINFORMATION TACTICS
A new online game called Bad News aims to help people understand the process of creating fake news.
By putting the player in the position of a “disinformation fake news tycoon”, it hopes to highlight the tactics being used to spread online propaganda, for political or financial ends.
Created by researchers at the University of Cambridge and the Dutch media collective DROG, it is based on psychological “inoculation”. The idea is that demystifying how fake news is created can help people challenge it.
It is about growing “mental antibodies” to disinformation, says Sander van der Linden, director of the university’s “social decision-making lab”.
I decided to try to improve my own mental antibodies. Within minutes of starting the game, I’ve created a fake Donald Trump account tweeting about war with North Korea and chosen the name of my own fake-news site: Honest Truth Online.
I earn a badge for impersonation - one of six tactics explored in the game - but a dial on my screen shows I still don’t have many followers and my credibility is rated as low.
As I go through the game, I’m given choices: Exploit fear or anger? Write a fake news article or create a meme? Troll a faceless international organisation or choose to target an individual?
I’m also shown the power of using bots to amplify my message. When I buy 4,000, my message spreads wider and the credibility gauge ticks up. At the end of the game, I’m told I have a #highscore.
ENCOURAGE KIDS TO THINK ABOUT RELIABILITY OF SOURCES
I met Van der Linden when he was working on the game late last year. He told me how a pilot study with Dutch high schoolers showed it could reduce how reliable they perceived fake news to be. He believes we should encourage much younger children to think about the reliability of news sources.
I agree. News literacy should be taught, to protect individuals and society as a whole.
But I fear the challenge is getting ever greater. New technologies make fakes much harder to spot.
You can recreate anyone’s voice using sample audio and build 3D models of faces from a handful of photos - so virtual impersonation could be used to trick large numbers of people.
Last year, University of Washington researchers used computer vision techniques and audio files to create an almost perfect lip-syncing video of Barack Obama. He had actually said the words they put in his mouth - but the technology could be developed to make it seem as if people had said things they hadn’t.
If we are not sure whether to trust a news site, we often search for a primary source, such as a video of a speech. These new technologies could shake our confidence in news even further - and be far harder to inoculate against.
© 2018 The Financial Times Ltd. All rights reserved. Please do not copy and paste FT articles and redistribute by email or post to the web.