SINGAPORE: Laws that compel online content distributors to proactively stop fake news and false information from spreading on their platforms could prevent falsehoods from reaching users, said Assistant Professor Mohamed Elmie Nekmat from the National University of Singapore (NUS) on Friday (Mar 16).
These distributors should be held accountable for failing to stop methods of information distribution such as bots and spam, suggested Prof Elmie, who is from the Department of Communications and New Media.
In his written representation to the Select Committee on Deliberate Online Falsehoods, Prof Elmie said content distributors would include technology companies and organisations that facilitate the distribution of information over the Internet, including search engines and social media platforms like Facebook, Instagram and Twitter.
“By making such content distributors accountable, a clear signal is sent that stopping online falsehoods and misinformation from reaching the people is also their responsibility, and that failure to carry out such responsibilities will have serious repercussions,” he added.
Such content distributors, especially social media companies, are “major conduits” for the development and spread of online falsehoods and misinformation, he said.
He pointed to findings from a nine-nation study conducted between 2015 and 2017, involving countries such as Brazil, China and Canada, which showed that the spread of manipulated information on social media aimed at influencing public opinion is largely “supported by Facebook and Twitter’s algorithms”.
When asked for his views on content producers by committee member Seah Kian Peng, given his focus on distributors, Prof Elmie said that if present laws do not cover them, they should be strengthened.
LAWS WILL DEFINE SERIOUSNESS IN SAFEGUARDING AGAINST ONLINE FALSEHOODS
Such laws implemented on digital content distributors would set a precedent for guidelines and norms that define Singapore’s seriousness in engaging all stakeholders toward establishing the best safeguards possible against online falsehoods, he said.
He also suggested that content distributors be made to publicly furnish details on the actions and mechanisms taken to safeguard users from misinformation spreading on their platforms.
“Content distributors should be made to provide regular updates and information on the outcomes, such as the number of bots and fake accounts removed, the list of sites and sources that are banned, and the countries in which such accounts and sites were registered,” he added.
They should also be made to provide information on such content in the languages of the main ethnic groups in Singapore, he said.
MEDIA LITERACY HAS VALUE BUT IS LIMITED
Prof Elmie also stated his “strong belief” in the value of media literacy, having done research on online information credibility evaluation and literacies in the contexts of Singapore and the United States.
“The importance of nurturing media literate citizens to protect against deliberate online falsehoods can never be undermined,” he said.
More often than not, the content of deliberate online falsehoods is not necessarily false, but manipulated and twisted out of context, he said.
Having media literacy skills to “critically evaluate” the contextualised nature of information would not only protect individual users from the direct influence of such information, but would also build long-term public resilience against such online falsehoods, he added.
“Public education campaigns would help prepare social media users on how best to react to such information,” he said.
But one limitation of public education and media literacy efforts is the rapid rate at which disinformation gathers critical mass within short time spans, he admitted.
“It is thus necessary to consider more relevant regulations that can provide short-term protection against the rapid effects from online disinformation campaigns in social media,” he said.
FALSEHOODS CAN STOKE ETHNIC AND RELIGIOUS TENSIONS
The fact that no racial conflicts have resulted from misinformation belies the potential of such falsehoods to stoke ethnic and religious tensions among Singaporeans, Prof Elmie said.
"This is pertinent when we consider that deliberate misinformation can disproportionately spark racial tensions and distrust when they spread in the aftermath of national crises," he said.
He gave the example of a hoax video that had circulated on Facebook showing “moderate Pakistani Muslims” allegedly celebrating the Paris terror attacks in November 2015 that had killed 130 civilians. The video actually showed Pakistanis celebrating their country’s cricket victory in 2009.
This led to heightened Islamophobia sentiments and concerns.
To tackle the spread of misinformation targeting Singapore’s social fabric, efforts should be made to examine the impact of misinformation in different languages, especially as it is more likely to penetrate ethnic-based community networks via closed-group communication platforms such as WhatsApp, he said.
"Public debate to safeguard against online falsehoods has also tended to focus on English language examples of misinformation, blindsiding the fact that such information exist(s) in different languages."
"Language is almost always closely linked to religion, so people are more comfortable with it, and because of that, the role of language is also very influential in such falsehood dissemination," he said.
He also recommended publishing a list of websites and online sources that produce misinformation and falsehoods, especially those with racial slants.