How Facebook algorithms can fight over your feed

A person looks at a smartphone with a Facebook App logo displayed on the background. (File photo: AFP)
CALIFORNIA: Over the past decade, most of us have gotten used to hearing that “the algorithm” is responsible for this or that on the Internet. Sometimes it’s something useful, like movie recommendations. Sometimes it’s not.
Facebook is under fire for algorithms that shape what nearly 3 billion users see on the world’s largest social media network, including troubling amounts of hate speech and misinformation. But internal company documents provided by a whistle-blower make clear that it’s wrong to think of “the algorithm” as a single bad actor.
Rather, they show how an algorithm designed for one purpose can struggle to integrate signals aimed at another, as when Facebook’s efforts to reduce harmful content in its News Feed ran into a system meant to keep users scrolling, sharing and posting.
1. What is an algorithm?
A set of instructions for making decisions or performing a task. Arranging names in alphabetical order is a kind of algorithm; so is a recipe for making chocolate chip cookies.
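To make that concrete, the alphabetical-ordering example can be expressed in a few lines of Python; the names below are purely illustrative.
```python
# A simple algorithm: arrange names in alphabetical order.
names = ["Zoe", "Amir", "Priya", "Carlos"]

# sorted() applies a comparison-based sorting algorithm,
# comparing items step by step until the list is in order.
print(sorted(names))  # ['Amir', 'Carlos', 'Priya', 'Zoe']
```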
But those simple formulas bear only a distant relationship to the computerised code on which companies like Facebook, Alphabet’s Google and Twitter spend billions of dollars to keep their platforms running.
2. What are those like?
Typically, algorithms are written to perform discrete tasks, but many can work in concert to produce enormously complex systems.
At Facebook and Google, the algorithms powering the platforms are constantly updated to learn from new data and better optimise for company priorities. While algorithms execute strictly according to their programming, the decisions about which signals to train the software on and which outcomes to aim for are entirely human.
As one internal Facebook memo recently revealed by the whistle-blower put it: “The mechanics of our platform are not neutral.”
3. How does Facebook use algorithms?
Facebook’s business model is centred on selling ads to be viewed by users. It had US$86 billion in revenue last year, a reflection of its huge reach and success at keeping users clicking.
One of Facebook’s most important algorithms is the one that produces the News Feed on a user’s page. Before 2009, the algorithm reflected the chronological order in which items were posted.
But starting that year, the Feed began to be ordered in a more complex way by an algorithm designed to show people the things they would find most interesting or engaging, not just the most recent.
Facebook also uses algorithms to target ads and recommend friends or groups to follow.
4. How do they work?
The billions of pieces of content posted on Facebook are all ranked in terms of an individual user’s likely reaction. That is, the algorithm makes a prediction based on each user’s characteristics and past behaviour, as well as the nature of the content and how such posts have previously been received.
Like many companies, Facebook uses algorithms powered by artificial intelligence that are designed to improve with use. Feeding and improving such algorithms is a big reason why social media companies collect, store and curate vast amounts of data about users.
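As a rough illustration of what engagement-based ranking means in practice, here is a minimal Python sketch. The posts, reaction probabilities and weights are all invented for illustration; they are not Facebook’s actual signals or model.
```python
# Hypothetical sketch of engagement-based feed ranking.
# All numbers are invented; this is not Facebook's real model.
posts = [
    {"id": 1, "p_like": 0.30, "p_comment": 0.05, "p_share": 0.02},
    {"id": 2, "p_like": 0.10, "p_comment": 0.20, "p_share": 0.15},
    {"id": 3, "p_like": 0.50, "p_comment": 0.01, "p_share": 0.01},
]

# Deeper engagement, such as sharing, is weighted more heavily
# than a simple like.
WEIGHTS = {"p_like": 1.0, "p_comment": 15.0, "p_share": 30.0}

def score(post):
    # Combine predicted reactions into a single ranking score.
    return sum(WEIGHTS[key] * post[key] for key in WEIGHTS)

# The feed surfaces the highest-scoring posts first.
ranked = sorted(posts, key=score, reverse=True)
print([p["id"] for p in ranked])  # [2, 1, 3]
```
In a real system, the predicted probabilities would come from machine-learning models trained on user data, and the weights themselves are among the settings that get tuned over time.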
5. What other roles do algorithms play?
Facebook says it also designs its system to reduce content that might be offensive or distasteful, and it adjusts the settings of various algorithmic components from time to time.
But the internal documents provided by Frances Haugen, a former Facebook product manager turned whistle-blower, show that the complexity of the algorithm means tweaking it is no simple matter. For instance, a 2018 shift to focus on “meaningful social interactions” ended up increasing polarisation on the platform.
6. Are there any checks on the ranking system?
After posts get assigned an algorithmic rank, they go through what Facebook calls its “integrity processes”, designed by its Integrity team, to automatically detect and deal with problematic content.
Information that is illegal or violates Facebook’s community standards is deleted, and otherwise offensive or questionable content is demoted.
However, the internal documents show that people on the Integrity team realised that demoting content by 50 per cent - or even 90 per cent in some cases - wouldn’t dent very highly ranked posts that in some cases were amplified 100 times over by competing algorithms designed for the News Feed.
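The arithmetic behind that finding is easy to sketch. In the Python example below, the scores and multipliers are invented, but they show why a 50 per cent demotion barely dents a post that has already been amplified 100 times.
```python
# Invented numbers illustrating the Integrity team's concern:
# a demotion applied after heavy amplification still leaves the
# post far above an ordinary, unamplified one.
base_score = 1.0                  # a typical, unamplified post
amplified = base_score * 100      # boosted 100x by ranking signals
after_demotion = amplified * 0.5  # 50 per cent integrity demotion

print(after_demotion)  # 50.0 -- still 50 times the ordinary post
```
Even a 90 per cent demotion would leave such a post at 10 times the baseline, which helps explain the frustration recorded in the documents.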
“I worry that Feed is becoming an arms race,” one Facebook employee wrote to colleagues in October 2019, describing a situation in which different aspects of the algorithm were in effect competing for influence.
7. What else did the documents show about algorithms?
A common complaint from Facebook employees was that there was no centralised vision for what kind of experience a user should have, as directed by the platform’s algorithms. A member of the Integrity team who was leaving the company wrote in a parting note that “harms fester in unwatched interactions” between different parts of the platform.
What Haugen called the company’s “flat” corporate structure made it hard to implement proposed interventions to address harmful content. For example, one internal study found that demoting “deep reshares” from people who weren’t a friend or a follower of the original poster could cut the number of times so-called civic misinformation was viewed by 25 per cent and civic photo misinformation views by 50 per cent.
Haugen said this intervention was discussed with senior management but never implemented, in part because Facebook didn’t want to lose the reader engagement driven by deep reshares. Joe Osborne, a Facebook spokesperson, said the company sometimes reduces deep reshares, but only rarely, because it is a blunt instrument that affects benign speech along with misinformation.
8. What else does Facebook say?
That it’s been trying to give people more control over the content in their News Feed.
The platform now has an easier way for users to say they want more recent posts, rather than machine-ranked content, and people can mark some friends or sources of information as favourites.
Facebook also suggests that users hide posts they don’t like, an action that would signal to the algorithm to show fewer posts like those.
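As a loose illustration of how such a signal might feed back into ranking, consider the sketch below; the topic-weight mechanism is an invented simplification, not Facebook’s actual system.
```python
# Invented simplification: hiding a post lowers the weight of
# its topic, so similar posts score lower in future rankings.
topic_weights = {"politics": 1.0, "sports": 1.0}

def hide_post(post):
    # Dampen the topic of the post the user just hid.
    topic_weights[post["topic"]] *= 0.8

hide_post({"id": 7, "topic": "politics"})
print(topic_weights)  # {'politics': 0.8, 'sports': 1.0}
```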
9. Do other social media companies have similar issues?
Facebook is far from alone. Other social media platforms, including Twitter, YouTube, TikTok, LinkedIn and Pinterest, all use similar algorithms to decide which user content to give prominence.
While the specific challenges vary, many of the issues are universal, including the handling of misinformation, hate speech and abusive posts.
10. What ideas are being proposed for dealing with algorithms?
There are two main approaches:
One would be to chip away at liability protections for online platforms offered by a 1996 US law often referred to by its relevant provision, Section 230. Facebook founder Mark Zuckerberg told Congress this year that companies should only enjoy Section 230 protections if they implement best practices for removing illegal content.
The other would involve regulating the algorithms themselves. Hypothetically, a law could place a cap on how many times a piece of content could be reposted automatically or shared by users.
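Purely as a thought experiment, such a cap could be as simple as the sketch below; the limit and the data structure are arbitrary, illustrative choices.
```python
# Hypothetical reshare cap: once a post has been passed along a
# fixed number of times, further resharing is blocked.
RESHARE_LIMIT = 2  # arbitrary illustrative limit

def can_reshare(post):
    return post["reshare_depth"] < RESHARE_LIMIT

post = {"id": 42, "reshare_depth": 2}
print(can_reshare(post))  # False -- the share chain stops here
```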
Some lawmakers have also called for algorithms to be subject to audits to see if they are fostering discrimination or other social harms.
In her testimony before Congress, Haugen said lawmakers should push social media platforms to return to a more chronological user experience, in which information would move at a slower, more human-scale pace.