Commentary: Facebook’s fake news crisis is a product of Silicon Valley culture
The Cambridge Analytica scandal is a predictable consequence of tech sector norms and American lawmakers’ “wait and see” attitude, says one observer from the NTU Wee Kim Wee School of Communication and Information.
SINGAPORE: All the elements were in place to create the Cambridge Analytica scandal.
First, there’s Facebook’s corporate culture. Facebook itself chose the name of the Silicon Valley street where its headquarters sits: “Hacker Way”.
What’s the hacker way? Facebook’s internal motto illustrates it: “Move fast and break things”. Don’t be constrained by current norms, like privacy. Norms can change.
A recently leaked memo by top Facebook executive Andrew Bosworth urges colleagues to focus on growth and connecting more people regardless of occasional “ugly” side effects like bullying and terrorism.
Second, there’s American regulators’ approach to the information technology sector: Wait and see before passing regulations, so as not to stifle innovators.
Third, there’s American freedom of speech, the “marketplace of ideas” approach: Let everyone say what they will, even if it’s false. Truth will eventually prevail.
Combine the elements, and here’s the result emerging from the allegations.
While no one was watching, British data firm Cambridge Analytica illicitly obtained data harvested from Facebook to target millions of voters with political messages, including fake news, matched to their psychological profiles.
Its clients included the Trump campaign, the Brexit campaign, Kenyan politicians and others.
Now, the fallout. American legislators contemplate next steps. Singapore’s Parliament is acting swiftly, having already concluded hearings on deliberate online falsehoods, a possible prelude to legislation.
WHO'S LOOKING ANYWAY?
The story of Cambridge Analytica, and reactions from American and Singapore lawmakers, make sense when seen in context.
A close look at Facebook shows Mark Zuckerberg was never one to follow rules. When he founded Facebook’s predecessor in his Harvard dorm, he faced a university disciplinary board for breaches of security, privacy and copyright.
What does Facebook’s “move fast and break things” mean? Roll out the innovation quickly, even with missteps.
'Iterate', as innovators say. Get feedback. Do it again.
Emblematic of Facebook’s approach is its 2006 introduction of the news feed, the list of your friends’ updates. Before then, you logged in and saw your homepage. If you wanted to see friends’ updates, you sought them by visiting their pages.
Though it’s hard to imagine now, when the news feed first appeared, some protested that it was intrusive for Facebook to broadcast a user’s updates to their friends. Zuckerberg was unrelenting. Users adapted. The news feed quickly became an indispensable feature, adopted by other social networks.
Fast forward to 2014. For years, developers had created games and other apps that plug into Facebook.
Cambridge University’s Aleksandr Kogan developed a personality test app. Hundreds of thousands of Facebook users took the test. The app also harvested their friends’ data, which Kogan had permission to collect for academic purposes.
Kogan allegedly passed all this data on millions of users to the company Cambridge Analytica, which eventually worked for the Trump campaign.
A Cambridge Analytica former employee, whistle-blower Christopher Wylie, told the BBC that the firm “absolutely” planted fake news stories.
Kogan’s alleged disclosure of data to Cambridge Analytica violated Facebook terms and conditions on transfer of the data to a third party. But who was looking?
In 2015, after the Guardian reported that Kogan transferred data to Cambridge Analytica, Facebook banned the app and asked all parties to delete data from it.
Wylie says Facebook did nothing to enforce the request beyond asking Cambridge Analytica to check a box on a form confirming it had deleted the data.
UNREGULATED, HANDS-OFF APPROACH
Facebook and other new media firms thrived in a largely unregulated American communication technology sector.
Lawmakers’ hands-off approach wasn’t new. They had taken steps to ensure platforms could grow with minimal threats of liability. In the 1990s, one of the first Internet regulations Congress passed was to insulate platforms from liability for defamation that users posted.
Another law insulated platforms from liability for users’ copyright infringement unless the platforms are notified of infringing material.
Nowhere is America’s wait-and-see approach to regulating tech more evident than in data privacy, where there is still no comprehensive law, despite years of debate.
Technology companies opposed to privacy regulations have a loud voice, backed by campaign contributions. Public pressure for such regulation is not high.
When answering surveys, Americans say data protection is important, but their conduct suggests otherwise. Like people everywhere, Americans are quick to click “agree” to terms and conditions without reading them, and to exchange privacy for convenience.
In Singapore, Parliament passed the Personal Data Protection Act (PDPA) back in 2012. It’s a comprehensive data protection regulation for the private sector, including social media companies.
Singapore’s PDPA is actually quite simple. Among its core provisions: An organisation like Facebook must notify users of how it uses their data, including whether data will be disclosed to third parties. Users must consent to such uses.
Without a PDPA counterpart, American regulators may instead be left to proceed under broad Federal Trade Commission (FTC) prohibitions of “unfair” or “deceptive” practices. Another possibility is law regulating political campaigns.
The FTC is investigating whether Facebook broke a 2011 promise, made to settle legal action over sharing more user information than it had told users it shared. In that “consent decree”, Facebook promised that for 20 years, it would get users’ consent before sharing their data with third parties.
It’s notable, however, that American law lacks a standard way to proceed in such cases. Zuckerberg is set to testify before Congress. Lawmakers may craft more targeted measures to address a crisis generating global public outrage.
OPEN PUBLIC DEBATE?
The final piece of the fake news puzzle is America’s approach to freedom of expression.
In the hearings of Singapore’s Select Committee on Deliberate Online Falsehoods, Professor Thio Li-Ann disputed the marketplace of ideas model’s proposition that truth will emerge when all voices contend in an unregulated, open debate.
To many Americans, that approach to open public debate is, ironically, not open to debate. America set its standard with the adoption of the Constitution’s First Amendment in 1791, which says Congress “shall make no law … abridging the freedom of speech, or of the press”.
Few exceptions to the First Amendment are made. Determining truth, about fake news or anything else, is considered the public’s job.
Government is seen as having no role in determining what’s true in political news, since government is a central actor in that news.
America’s central principles have contributed to a crisis with global implications. Its freedom of commerce and speech opened the door for a freewheeling Facebook to let Cambridge Analytica exploit user data and, with its clients, craft whatever messages it chose.
Singapore’s proactive, “better safe than sorry” stance was on display in recent weeks, as it moved quickly to prevent similar crises from erupting in Singapore.
Social media companies will continue to iterate using the “hacker way”. America will likely continue to abide by principles of openness in commerce and expression that have, in many ways, served it well.
It remains to be seen if American lawmakers, and others around the world, can agree on penalties for abuses of those freedoms, or ways to prevent abuses.
Mark Cenite teaches communication law at the Wee Kim Wee School of Communication and Information, Nanyang Technological University, where he is Associate Chair (Academic).