SINGAPORE: The Select Committee looking into the issue of deliberate online falsehoods is still deliberating on the matter, but any law enacted in Singapore, along with its accompanying compliance costs, will affect all social media players and not just Facebook, said Senior Minister of State for Communications and Information Janil Puthucheary on Monday (Jun 25).
Mr Puthucheary, speaking during a panel session on fake news, said that from an engineering perspective, fake news is a “solvable problem”, but acknowledged the difficulty of tackling it through legislation.
He cited the example of a Facebook post showing a young man wearing a Guy Fawkes mask and performing a Nazi salute at Hong Lim Park. Due to the salute’s association with the Holocaust, the post had so upset a woman in the UK that she fought to have the picture removed from the social media platform.
It was later discovered that the man had made the salute to protest against what he perceived as a fascist People’s Action Party (PAP) Government, Mr Puthucheary added.
That said, any law enacted, and the compliance costs associated with it, would not target Facebook alone. “It will affect anyone that plays in this space,” he explained, including newcomers trying to enter it.
A Facebook representative on the panel, Mr Alvin Tan, had earlier pointed out that the issue of fake news and how to mitigate it is “multifaceted”, and legislation is an “imperfect tool” to deal with the issue.
Mr Tan, head of Public Policy for ASEAN, Malaysia and Singapore at Facebook, said during the discussion that it was difficult to enforce the laws of each country in which Facebook has a presence.
He cited an example raised by moderator Torben Stephan, director of Media Programme Asia at Konrad Adenauer Stiftung Singapore, which illustrates the difficulty.
Mr Stephan had noted that denial of the Holocaust is a crime in Germany, yet he still sees such assertions made on Facebook, and he wondered how the law could be enforced and who would prosecute the offenders.
During the public hearings on deliberate online falsehoods in March this year, the social media giant said it would remove content that is locally illegal if it receives a court order. However, it does not have a policy requiring all posted content to be “true, verified and accurate”, so as to avoid being put in the position of having to be the arbiter of truth.
Another panellist, Dr Cherian George, contended that big Internet companies like Facebook should not be given a free pass to avoid dealing with the issue by arguing that every market and its laws are different and that enforcement is difficult.
Dr George, a professor at the Department of Journalism at Hong Kong Baptist University, pointed out the apparent unequal treatment of the issue by Facebook using Germany and Myanmar as examples.
As Germany is a big market for the company, it has poured in resources, such as hiring more content reviewers, to comply with the country’s recent law targeting hate speech and fake news. The same cannot be said of markets like Myanmar, which, the academic said, may be in greater need of such resources given the rise in Internet use and the spread of disinformation there.
The social media platform’s role in spreading hate speech against Rohingya Muslims in the country was put under the spotlight earlier this year, and CEO Mark Zuckerberg said it was something the company was paying a lot of attention to.
He also said in a Washington Post report that the company was hiring “dozens” more Burmese-language content reviewers to look for hate speech, among other initiatives to address the issue.
Mr Tan said the company “needs to be smarter” about how it uses humans to complement machine learning algorithms to sift through things like hate speech on its platform.
This was an issue Facebook highlighted when it released its community standards enforcement report for the first time in May this year. Mr Guy Rosen, the company’s vice president of Product Management, told Channel NewsAsia then that it is “very hard” to build artificial intelligence to address hate speech.
“We’re doing as much as we can ... (but) we can’t do everything alone,” Mr Tan said.