Meta removes child sexual exploitation content on Facebook after being notified by Singapore authority

The Online Safety Act, passed in November 2022, makes social media platforms liable if they fail to protect users from online harms.

File photo of a person holding a mobile phone. (Photo: Jeremy Long)

SINGAPORE: The Infocomm Media Development Authority (IMDA) has notified Meta to review and remove content on Facebook containing child sexual exploitation material (CSEM), the agency said on Friday (Jun 9). 

Meta, the parent company of Facebook, Instagram and WhatsApp, took down the offending page and group within 24 hours, according to IMDA.

This is the first time IMDA has notified a social media service of such content, following amendments to the Broadcasting Act last November that introduced enhanced online safety requirements.

The police had alerted the agency to a Facebook page that was part of an online network facilitating the sharing of CSEM; IMDA subsequently uncovered a Facebook group carrying similar posts.

The Facebook page was uncovered during investigations into a 42-year-old man for possession of such material, the police told CNA on Jun 14, adding that the man had been arrested.

"The posts contained hyperlinks that led viewers to a website with CSEM," said IMDA, adding that it also directed internet service providers in Singapore to block a linked website that enabled access and distribution of similar content. 

Parliament passed the Online Safety Act last year, requiring social media sites to block access to harmful content within hours. The law empowers the IMDA to deal with harmful online content accessible to Singapore users, regardless of where the content is hosted or initiated.

Under the amended law, IMDA can direct social media companies to block or remove such content, and the agency said it would not hesitate to do so if swift action is not taken.

It can also direct an internet service provider to block users in Singapore from accessing an online communication service that refuses to take down harmful online content.

The Act took effect on Feb 1 this year. 

The police and IMDA take a serious view of the solicitation and distribution of CSEM, which is a type of egregious content as defined under the amended Broadcasting Act, said the agency.

Other forms of content that fall within the ambit of the law include content advocating terrorism, suicide and self-harm, as well as content posing a public health risk or likely to cause racial and religious disharmony in Singapore.

"Social media services, too, have a responsibility to ensure online safety for all users, particularly children," IMDA said.

"We recognise that the industry has taken active steps in recent years to combat harmful online content on social media, and urge social media services to remain vigilant in detecting and preventing the dissemination of harmful online content through their platforms and services."

IMDA added that it would continue to work closely with social media services to enhance online safety in Singapore.

Editor’s note: This article has been amended to make clear that no direction was issued by IMDA as Meta had removed the material upon notification.

Source: CNA/at(ac)