SINGAPORE: Social media giant Facebook said on Thursday (Mar 22) that it “got it wrong” and it had a “moral obligation” to inform its users earlier about the breach in its policies involving political consultancy Cambridge Analytica.
This is according to its vice-president of public policy for Asia-Pacific, Simon Milner, who appeared before the Select Committee on Deliberate Online Falsehoods.
He was questioned by Home Affairs and Law Minister K Shanmugam for about three hours.
Mr Shanmugam had quizzed Mr Milner about allegations that Facebook data was misused by Cambridge Analytica and whether it was “odd” that users on the social media platform were not informed earlier.
To that, Mr Milner said “it definitely looks like a decision where we made a wrong call”.
“We should have let people know. Our CEO has owned that decision and said that we got that wrong,” he added, referring to founder Mark Zuckerberg’s post on Facebook overnight. In that post, Zuckerberg admitted that the company “made mistakes” and that “there’s more to do.”
Mr Milner also admitted that the social media giant had a “moral obligation” to inform users earlier about the breach that occurred between 2014 and 2015.
“We wouldn’t be admitting that we’ve breached people’s trust if that was not the case,” he said in response to Mr Shanmugam’s question.
However, when asked why Facebook did not do so earlier, Mr Milner said he would prefer not to speculate as he had not been involved in the decision making then.
Earlier in the exchange, which at times grew heated and which the Facebook representative acknowledged was a “tough Q&A”, Mr Shanmugam asked Mr Milner if his answers to the United Kingdom parliament in February were “full and frank”.
Mr Milner had previously told British Members of Parliament on a select committee inquiry into fake news that Cambridge Analytica did not have Facebook data.
To that, Mr Milner responded repeatedly that he had been “frank and honest” with the UK committee and that his answers then were based on what he “knew to be true at the time”.
However, he later admitted that, in hindsight, he could have “provided a fuller answer”.
“With hindsight, I could have said more. But the answers I gave were based on what I knew to be true at the time,” he said.
Mr Shanmugam also described Facebook’s behaviour as being similar to an old approach that utility and transport operators used to take with consumers.
He referred to Facebook’s statement on Mar 17, which said claims describing the incident as a data breach are “completely false”, given that “everyone involved gave their consent” knowingly.
Using the example of how train tickets were purchased in the past, Mr Shanmugam elaborated: “There used to be standard forms on these tickets which would remove all liabilities on the part of the train operator. They can be negligent and I may be left without any recourse.”
That is why the law “intervened to equalise the position a bit more and refuse the company in a dominant position to deny all liability”.
“It seems to me that the approach you take is very much similar to the approach that many of these utilities and other operators used to take with consumers - (one of an) individual versus a large corporation which seeks to deny all liability,” he added.
When asked if Facebook has a duty to make sure its users are notified more clearly and are protected against third-party app developers, Mr Milner said that Facebook accepts that it has a “very significant responsibility.”
“We’ve been absolutely clear that this was a significant breach of trust with our users and we are determined to address it,” he said. “We are clearly accepting responsibility. This should not have happened on our watch.”
FACEBOOK IS A VALUABLE PARTNER AND 'WE WANT IT TO SUCCEED'
Bringing up various news articles, Mr Shanmugam had also asked Mr Milner repeatedly about Facebook’s statements in parliamentary proceedings in other countries, which prompted the latter to ask how his line of questioning was appropriate.
“This committee is looking into the issue of deliberate online falsehoods here in Singapore,” he said. “Myself and my colleague, and other people on this panel, have come here prepared to answer questions on that and help the committee understand it.
“I don't think it's fair to ask me detailed questions about evidence given by my colleague to a different parliament in a different country about activities associated with that country.”
In response, Mr Shanmugam stressed that the questions before the UK parliament were “very relevant” in “exploring the degree to which Facebook can be trusted to answer questions”.
“And one looks at the sequence of conduct from 2015 to 2018, and the very first time you accepted responsibility for Cambridge Analytica publicly - when did that happen, why did that not happen earlier and to what extent can we take seriously all these protestations that you can be completely trusted to apply your internal guidelines,” he added. “We're all sovereign parliaments, but we look at your conduct all around the world, and we have to understand.”
Mr Shanmugam later added that he sees Facebook – and the other companies – as valuable partners whom the Government hopes to work with to find a solution to the problem of deliberate online falsehoods.
“You want to make profits, which is not a bad thing, and in the process if you can do some good, you would want to do some good as well,” he said.
“We do not have the levers that the US Government might have, or a much larger government in Europe might have. But from our perspective we want all of you to succeed, to continue to make money and do well in Singapore.”
“But in the course of the hearing and of writing our report, we will have to identify what it is that you will be able to do, what it is that we think you will not be able to do, and it would not be proper for us if we hadn’t tried, at least, to give you an opportunity to deal with some of those things that we may have to deal with in the end. So take it in that context,” he added.