Social media services may be directed to remove harmful online content under codes of practice proposed by Government
SINGAPORE: Social media services may soon be directed to “disable access” to harmful content or disallow accounts that have such content, the Ministry of Communications and Information (MCI) said on Monday (Jun 20).
The Government is consulting on two proposed codes of practice to enhance online safety. The first is for designated social media services with high reach or high risk to have system-wide processes to enhance online safety for all users, with additional safeguards for young users under 18.
The second proposal is that the Infocomm Media Development Authority (IMDA) may direct any social media service to remove specified types of “egregious content”.
Communications and Information Minister Josephine Teo announced in March this year that the Government would introduce the new codes to combat harmful online content on services accessible to users in Singapore.
Under the first code, social media services must put in place and enforce community standards and content moderation mechanisms to “mitigate users’ exposure” to certain harmful content.
They should also provide tools for users to reduce and mitigate their exposure to unwanted content.
The social media platforms should proactively detect and remove child sexual exploitation and abuse material, as well as terrorism content.
Users should be allowed to report harmful content and unwanted interactions, and this reporting mechanism must be easy to use and permanently available. Social media platforms must also assess reported content and take appropriate action.
They must also produce an annual accountability report for publication on IMDA's website.
The second proposed code of practice covers content areas assessed to be “egregious online harms” on social media platforms.
These include safety risks relating to sexual harm, self-harm and public health. The areas of concern also cover public security and racial or religious disharmony or intolerance.
“For the proposed content code for social media services, IMDA will be granted powers to direct any social media service accessible from Singapore to disable access to specified types of egregious harmful content or disallow specified online accounts to communicate such content and/or interact with users in Singapore,” said MCI in a fact sheet.
Industry consultations started this month and will be followed by a public consultation exercise in July.
MCI said that globally, the prevalence of online harm is a “major concern”.
“While many online services have undertaken efforts to address this issue, the prevalence of online harms remains concerning,” it said, adding that these online harms are exacerbated when they are amplified on social media.
Such harms include content that endorses terrorism, extreme violence or hateful acts against certain communities, as well as content that encourages suicide and self-harm.
This kind of content threatens to "destabilise one's physical or mental well-being" through harassment, bullying or the non-consensual sharing of sexual images.
Last year, a Singaporean man impersonated a Chinese woman and posted multiple racially offensive and insensitive public posts on social media, denigrating minority communities in Singapore.
According to a poll by the Sunlight Alliance for Action in January, nearly half of the more than 1,000 Singaporeans who responded said they had experienced online harm. About 61 per cent of those who encountered gender-based online harm said it occurred mainly on popular social media services.
In a Facebook post on Monday, Mrs Teo said: "Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people.
"Over the years, social media services have put in place measures to ensure user safety on their platforms. Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society."
She added that there is a "growing global movement" pushing to enhance online safety.
"Singapore’s preferred approach in strengthening our online regulatory approach is to do so in a consultative and collaborative manner," said the minister.
"This means learning from other countries’ experiences, engaging tech companies on the latest tech developments and innovations, and understanding our people’s needs."
A spokesperson for Meta, which owns Facebook and Instagram, said that combating harmful online content is a "shared goal between governments and industry".
"At Meta, we have driven strong progress in our work on youth safety and well-being for many years now. We have also been consulting with regulators and experts around the world and we look forward to sharing our learnings and innovations in this space," the spokesperson said in response to CNA's queries.
"We welcome dialogue between youths, parents and caregivers, educators and experts to ensure teen safety and well-being, while respecting their expectations to privacy and promoting their autonomy."
TikTok's head of public policy for Southeast Asia and Singapore Teresa Tan said: "We share the same commitment as the Government to combat online harms.
"User safety is our top priority and over the years, we have made every effort to create a safe online space that prioritises age-appropriate experiences. We look forward to furthering our work to enhance online safety for our communities."