Commentary: Facebook found Instagram could hurt teens’ mental health, but is it doing enough to address it?
Social media platforms generate revenue by feeding our obsessions, even when they are harmful - but safeguards may be possible, says NTU’s Mark Cenite.
SINGAPORE: Facebook says its mission is to “give people the power to build community and bring the world closer together”.
By espousing such lofty aims, the company sets itself up to be judged harshly when it falls short.
And it has - many times. For years, critics have slammed Facebook for enabling the spread of fake news and hate speech, and for contributing to political polarisation from America to the United Kingdom and Myanmar.
As the global COVID-19 pandemic rages on, Facebook only recently started taking a tougher stance on vaccine misinformation, including long-debunked mistruths.
Now the Wall Street Journal (WSJ) has reported on another casualty of one of the company’s platforms: Teen mental health.
Facebook’s research on Instagram showed, to quote an internal presentation from 2019 that an employee leaked to the WSJ, “We make body image issues worse for one in three teen girls”.
One slide said: “Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent”.
Another showed that some teen users attributed feelings of being “not good enough”, “unattractive” - and even suicidal thoughts - to the app.
The notion that social media use affects teenagers’ mental health is not new. For example, a UK study of 5,000 young people found that heavy social media use was linked to low self-esteem, especially in girls.
But it is all the more concerning when the company’s own research says so.
LOST OPPORTUNITY
To be fair, Facebook deserves some credit for conducting research on potential harms.
And no single study can prove, authoritatively, that Instagram use causes teens to feel inferior to the carefully curated images posted by celebrities, influencers and friends. Social science requires replication.
Still, it’s never ideal for whistle-blowers to beat a company to exposing its bad news. Facebook could have proactively announced its research findings, even if preliminary, and called for further study of the problem and even pre-emptive safeguards.
Instead, Facebook appears to have gone the other direction. When asked in congressional hearings in March about the platforms’ effects on children’s mental health, CEO Mark Zuckerberg said that research showed that “using social apps to connect with other people can have positive mental health benefits”.
In August, US senators asked Facebook to reveal its internal research on children’s mental health. Facebook refused, calling the research proprietary and confidential, and adding that it was unaware of any expert consensus on how much screen time is “too much”.
Such evasion is not what we’d expect from a company that commits itself, in its own words from Instagram’s Terms of Use, to “fostering a positive, inclusive and safe environment”.
ROOTS OF THE PROBLEM
“The problem with Facebook is Facebook” goes the line coined by Professor Siva Vaidhyanathan, a media scholar at the University of Virginia.
The problems, Vaidhyanathan explains, result when the platforms perform exactly as designed.
Facebook earns almost all of its revenue from serving advertising tailored to each of us. The platforms gather data on which content makes us linger, click “like”, or leave a comment, then they serve us more of the same.
The longer we stay, the more ads the platforms serve. So Facebook has an incentive to send us deeper down our own personally-tailored rabbit holes.
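To make that feedback loop concrete, here is a minimal, illustrative sketch in Python of engagement-weighted ranking. It is not Facebook’s actual system; the post attributes, signal weights and scoring function are all assumptions, chosen only to show how optimising for engagement naturally serves users more of whatever they already dwell on.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    topic: str  # e.g. "birdwatching", "influencer_lifestyle"

@dataclass
class UserProfile:
    # Accumulated engagement per topic, built from dwell time, likes and comments.
    engagement: dict = field(default_factory=dict)

    def record(self, topic: str, dwell_seconds: float, liked: bool, commented: bool):
        # Hypothetical weights: lingering, liking and commenting all count
        # as signals that the user wants more of this topic.
        signal = dwell_seconds + (5.0 if liked else 0.0) + (10.0 if commented else 0.0)
        self.engagement[topic] = self.engagement.get(topic, 0.0) + signal

def rank_feed(candidates: list[Post], user: UserProfile) -> list[Post]:
    # Score each candidate by the user's past engagement with its topic,
    # so the feed drifts toward whatever the user already lingers on:
    # the "rabbit hole" dynamic described above.
    return sorted(candidates, key=lambda p: user.engagement.get(p.topic, 0.0), reverse=True)

# Usage: a teen who lingers on influencer content sees more of it first.
user = UserProfile()
user.record("influencer_lifestyle", dwell_seconds=120, liked=True, commented=False)
user.record("birdwatching", dwell_seconds=10, liked=False, commented=False)
feed = rank_feed([Post("birdwatching"), Post("influencer_lifestyle")], user)
print([p.topic for p in feed])  # ['influencer_lifestyle', 'birdwatching']
```

Nothing in the loop asks whether the content is good for the user; it only asks whether the user engages.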
That’s great if you’re digging into your new hobby of bird watching. You’ll see more birds.
It’s not so great if you get sucked into a whirlpool of baseless conspiracy theories.
And it may be hazardous when teens are obsessing over influencers and their lifestyles. They’ll see more posts of beautiful, glamorous people. The problem is that teens are particularly vulnerable to negative social comparison - including feeling miserable when they fall short of beauty ideals.
Another leaked Facebook document from 2020 says: “Social comparison is worse on Instagram” than on rival platforms. On TikTok, the focus is on performances, like dancing, rather than bodies and lifestyles, Facebook noted.
TikTok, whose mission is “to inspire creativity and bring joy”, is nonetheless vulnerable to similar criticisms. It works the same way as Instagram.
In another investigation, Wall Street Journal reporters created TikTok accounts for fictional 13- and 15-year-old users. They used the accounts to linger on videos about sexual fetishes or drugs. The app’s algorithm flooded them with recommendations of similar content.
SAFEGUARDS TO EXPLORE
No one is suggesting heavy-handed solutions like purging models’ photos so the rest of us don’t feel bad about ourselves. Some safeguards are worth examining, however.
First, the problems must not be allowed to grow. Facebook made plans for Instagram Youth, targeted at children ages 10 to 12, despite the objections of state attorneys general and child welfare groups.
In the wake of the leaks, Facebook put the plans on hold on Monday (Sep 27), after resisting such calls as recently as July. Facebook should not extend its services to even younger, potentially more vulnerable users until it can ensure that safeguards are in place.
Second, Facebook can only fulfil its commitments to safety if it is more transparent. To gain trust, Facebook must fully and openly acknowledge problems.
Facebook’s proprietary data on users puts it in a unique position to identify possible harms and solutions. It could promptly release internal research findings and fund further investigation by external researchers, perhaps through an independent grant-giving foundation to reduce the appearance of conflicts of interest. It could share anonymised user data with researchers.
Third, to the extent the problem lies with recommendation engines, design tweaks need not be complicated. Platforms could adjust their algorithms to slow down the barrage of recommendations, especially to teens, of content associated with problems like body image issues and eating disorders. Instagram suggested as much following the WSJ report.
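As a sketch of how simple such a tweak could be, the following extends the hypothetical ranker above (reusing its Post, UserProfile and rank_feed): for accounts flagged as belonging to teens, posts in sensitive categories are capped per session. The category labels and the cap are illustrative assumptions, not Instagram’s actual design.

```python
# Illustrative tweak to the hypothetical ranker above: throttle how much
# sensitive content a teen account is recommended in one session.
SENSITIVE_TOPICS = {"body_image", "extreme_dieting"}  # assumed category labels
MAX_SENSITIVE_PER_SESSION = 2  # assumed cap

def rank_feed_for_teen(candidates: list[Post], user: UserProfile) -> list[Post]:
    ranked = rank_feed(candidates, user)
    throttled, sensitive_shown = [], 0
    for post in ranked:
        if post.topic in SENSITIVE_TOPICS:
            if sensitive_shown >= MAX_SENSITIVE_PER_SESSION:
                continue  # drop further sensitive posts this session
            sensitive_shown += 1
        throttled.append(post)
    return throttled
```

The point is not this particular cap, but that the throttling happens inside the ranking step the platform already controls.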
Facebook has also supported a non-profit that produces videos encouraging teens to recite daily affirmations such as “I am in control of my experience on Instagram”, according to the WSJ report.
How much such a campaign helps is another research question that Facebook can help answer. But such efforts may come across as half-hearted if not accompanied by firmer commitments to install safeguards.
Social media platforms like Instagram and TikTok may be toxic when they work as designed. So they can be made safer, by design - if their creators truly want to live up to their lofty mission statements.
Dr Mark Cenite is Associate Dean (Undergraduate Education) at Nanyang Technological University’s College of Humanities, Arts, & Social Sciences. He teaches communication law at the Wee Kim Wee School of Communication & Information.