



'Co-regulation approach' should be taken when tackling hot-button issues: Facebook VP


FILE PHOTO: Monika Bickert, Facebook's head of global policy management, is interviewed by Reuters in Washington February 2, 2016. REUTERS/Gary Cameron/File Photo

SINGAPORE: The public and private sectors should work together to tackle hot-button issues like hate speech, terrorism or child exploitation in a “co-regulatory approach”, Facebook’s vice president of Global Policy Management Monika Bickert said.

Ms Bickert, who leads the social media giant’s global content policy team, was shedding light on the company's thinking about "meaningful regulation" during an interview with Channel NewsAsia on Wednesday (Feb 20).

"Meaningful regulation" was something Facebook said it was open to following the publication earlier this week of the United Kingdom’s parliamentary report looking into tech companies like Facebook and how disinformation is spread on these platforms.

She said the company already works in a “global framework of laws and regulations”, and has been talking to governments and regulators to understand their concerns and its obligations, as well as "inform policy development" by sharing the company's challenges.

Regulations, she added, are important in fulfilling Facebook's stated values of providing a safe online space that allows for freedom of expression.

Ms Bickert, a former assistant United States attorney and former resident legal advisor at the US embassy in Bangkok, said her 11-year stint in the public sector helped shape her thinking on the policing of content on the popular social media platform.

“I’ve found that we tend to do our best, and by that I mean public and private sectors, tackling issues like hate, terrorism or child exploitation when we’re working together in a co-regulatory approach,” she said.

“Meaning we’re looking at the best ways of keeping the (online) community safe and also to protect (freedom of) expression."

She acknowledged that Facebook "isn't perfect" but said that most of its users are safe and have good experiences on the platform. The company is actively addressing the cases of users who do not, she added.

The US tech giant has belatedly come to realise that such bad experiences have profound real-life ramifications. In Sri Lanka, for instance, the company failed to take down a post that incited racial hatred - a lapse Singapore politician Edwin Tong called out last November, leading Facebook to admit it had made a "serious mistake".

Last November, it released a third-party study of its impact in Myanmar, which found it had not done enough to prevent its platform from being used to incite violence. That violence pushed more than 700,000 Muslims from Myanmar into neighbouring Bangladesh.


Her comments come as Facebook continues to face heavy regulatory scrutiny around the world. The UK committee’s report, for instance, branded the company “digital gangsters”. 

It called for a compulsory code of ethics for Facebook and other tech companies, overseen by an independent UK regulator.

Germany is in the vanguard of countries increasingly policing the way the social media platform is run, and its anti-competition watchdog this month reined in the company's data collection practices.

"In future, Facebook will no longer be allowed to force its users to agree to the practically unrestricted collection and assigning of non-Facebook data to their Facebook accounts," Federal Cartel Office chief Andreas Mundt said at the time.

The company said it will appeal the landmark ruling.

Closer to home, Singapore’s Select Committee on Deliberate Online Falsehoods recommended 22 measures to tackle this modern problem, including writing new laws to ensure tech companies contribute to a clean Internet ecosystem.

READ: Select Committee - Tech giants need to be more accountable; new laws possible

Ms Bickert said that Facebook talks to the Government "regularly" on this and other issues.

“I think we find these conversations educational for us and productive, and look forward to continuing them,” she said, adding it will provide updates when there are any to be shared.


Ms Bickert, who is in town to attend the first of a series of workshops on its proposed external oversight board, also provided more details on the ambitious project.

She said that the oversight board will have a say on specific content flagged either by Facebook users or the company.

However, takedown requests from regulators and governments will not fall under the board’s purview. Such requests are handled through a separate mechanism: they are reviewed by Facebook's legal team and assessed against the relevant country’s laws, the executive explained.

READ: ‘We do not have a policy that prohibits alleged falsehoods’: Facebook on why it did not take down States Times Review post

Founder Mark Zuckerberg announced the oversight board last November, saying that Facebook should not make so many important decisions about free expression and safety on its own.

He proposed to set up an independent oversight body so people can appeal content decisions made by the company.

“Over time, I believe this body will play an important role in our overall governance,” Mr Zuckerberg said. “Just as our board of directors is accountable to our shareholders, this body would be focused only on our community.”

Building on this, the company issued a draft charter in January outlining the body’s basic scope and structure, and identifying questions that still need answering, such as who should sit on it.

Ms Bickert said the workshops, which will run over the next six months, will help the company hammer out many of these details, with "some pilots" slated for launch in the first half of this year.

Source: CNA/kk(hm)

