SYDNEY: In the aftermath of revelations about the alleged misuse of Facebook user data by Cambridge Analytica, many social media users are educating themselves about their own digital footprint. And some are shocked at the extent of it.
Last week, one user took advantage of a Facebook feature that enables you to download all the information the company stores about you. He found his call and SMS history in the data dump – something Facebook says is an opt-in feature for those using Messenger and Facebook Lite on Android.
This highlights an issue that we don’t talk about enough when it comes to data privacy: the security of our data depends not only on our own vigilance, but also on that of the people we interact with.
EASY FOR FRIENDS TO SHARE OUR DATA
In the past, personal data was either captured in our memories or in physical objects, such as diaries or photo albums.
If a friend wanted data about us, they would have to either observe us or ask us for it. That requires effort, or our consent, and focuses on information that is both specific and meaningful.
Nowadays, data others hold about us is given away easily. That’s partly because the data apps ask for is intangible and invisible, and is described in vague rather than specific terms.
What’s more, it doesn’t seem to take much to get us to give away other people’s data in return for very little, with one study finding 98 per cent of MIT students would give away their friends’ emails when promised free pizza.
Other studies have shown that collaborating in folders on cloud services, such as Google Drive, can result in privacy losses that are 39 per cent higher than they would otherwise be, because collaborators install third-party apps you wouldn’t choose to install yourself.
Facebook’s data download tool poses another risk in that once the data is taken out of Facebook it becomes even easier to copy and distribute.
This shift from personal to interdependent online privacy reliant on our friends, family and colleagues is a seismic one for the privacy agenda.
HOW MUCH DATA ARE WE TALKING ABOUT?
With more than 3.5 million apps on Google Play alone, the collection of data about our friends via back-door methods is more common than we might think. The back door opens the moment you press “accept” on a permission request for access to your contacts while installing an app.
Then the data harvesting machinery begins its work – often in perpetuity, and without us knowing or understanding what will be done with it.
More importantly, our friends never agreed to us giving away their data. And we have a lot of friends’ data to harvest.
The average Australian has 234 Facebook friends. Large-scale data collection is easy in an interconnected world when each person who signs up for an app has 234 friends, each of them has 234 more, and so on. That’s how Cambridge Analytica was apparently able to collect information on up to 50 million users, with permission from just 270,000.
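The scale claim above is simple arithmetic. A quick sketch (the figures come from this article; the no-overlap assumption is mine, purely for an upper bound) shows why a few hundred thousand installs can reach tens of millions of profiles:

```python
# Back-of-envelope calculation using the numbers quoted in this article.
signups = 270_000       # users reported to have installed the app and consented
avg_friends = 234       # average Facebook friend count cited for Australians

# Naive upper bound on reachable profiles, assuming no two friend lists overlap
upper_bound = signups * avg_friends
print(f"{upper_bound:,}")  # 63,180,000
```

In reality friend lists overlap heavily, which is why the reported figure of up to 50 million sits below this naive 63-million upper bound.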
Add to that the fact that the average person uses nine different apps daily. Once installed, some of these apps can harvest data continuously without your friends’ knowledge, and 70 per cent of apps share it with third parties.
WE ARE MORE LIKELY TO REFUSE DATA REQUESTS THAT ARE SPECIFIC
In our own research, conducted with a sample of 287 London business students, 96 per cent of participants failed to realise the full scope of the information they were giving away.
However, this can be changed by making a data request more specific – for example, by separating out “contacts” from “photos”. When we asked participants if they had the right to give away all the data on their phone, 95 per cent said yes. But when they focused on just their contacts, this decreased to 80 per cent.
We can take this further with a thought experiment. Imagine if an app asked you for your “contacts, including your grandmother’s phone number and your daughter’s photos”. Would you be more likely to say no? The reality of what you are actually giving away in these consent agreements becomes more apparent with a specific request.
THE SILVER LINING IS MORE VIGILANCE
This new reality not only threatens moral codes and friendships, but can cause harm from hidden viruses, malware, spyware or adware. We may also be subject to prosecution, as in a recent German case in which a judge ruled that giving away your friend’s data on WhatsApp without their permission was wrong.
Although company policies on privacy can help, these are difficult to police. Facebook’s “platform policy” at the time the Cambridge Analytica data was harvested only allowed the collection of friends’ data to improve the user experience of an app, while preventing it from being sold on or used for advertising.
But this puts a huge burden on companies to police, investigate and enforce these policies. It’s a task few can afford, and even a company the size of Facebook failed.
The silver lining to the Cambridge Analytica case is that more and more people are recognising that the idea of “free” digital services is an illusion. The price we pay is not only our own privacy, but the privacy of our friends, family and colleagues.
Vincent Mitchell is professor of marketing at the University of Sydney. Andrew Stephen is L’Oréal professor of marketing and associate dean of research at the University of Oxford. Bernadette Kamleitner is professor of marketing at the Vienna University of Economics and Business. This commentary first appeared on The Conversation.