How China’s surveillance tech became my unlikely coronavirus ally
People wearing face masks in Beijing on March 19, 2020.
When coronavirus-tracking apps were rolled out by local authorities in Beijing a couple of months ago, I was cautious, wary of what an opaque algorithm might do with my data.
Now I count myself lucky to use them; in a country where anti-foreigner sentiment is rising, the app has helped me to deal with the bigger danger of human bias.
Like many parts of the country, Beijing has reported several periods of more than a week with no new local-origin infections.
It boasts that nearly all of its new cases are imports — the United Kingdom has led the list of source countries more than once. Government policy has shifted to stemming imported cases as well as preventing a local resurgence.
As a result, guards at building complexes and public parks — now open again — are cautious about foreigners, who have recently been banned from entering the country.
How do you tell if someone is a foreign passport holder? In my experience, people often resort to assumptions around race.
My white friends have been stopped, but friends who look east Asian report no such problems.
The approach makes little sense, since it doesn't distinguish between foreign nationals who have stayed here through the outbreak and, say, Chinese students returning from New York.
The latter group is much more likely to be carrying the coronavirus; indeed, despite the focus on foreigners, imported cases are mostly from Chinese nationals.
In this climate of suspicion, surveillance technology has proved an unlikely ally.
Recently, I tried to visit the foothills of the Fragrant Mountains — one of Beijing’s most popular spots.
At the roadblock on the way in, I had my temperature checked, and showed the guards the green traffic light icon issued to me on a coronavirus-tracking app.
The icon is green because I have been in Beijing for the past 14 days, as tracked by my telecoms carrier’s location data.
The guards waved me through.
But a group of foreign and Chinese friends, who had been in Beijing the same length of time as me but did not know about the app, were stopped at the same gate a day later and asked for their passports.
They did not have these with them and were turned away.
The telecoms carrier’s data is simple: it lists the cities it thinks you have been in over the preceding fortnight (though there is no clear recourse for mistakes).
But transparency is a major issue for some other apps. Take the various “health code” apps developed by local authorities — most famously, by the city government of Hangzhou, home of Alibaba.
These can generally only be used by people with a Chinese ID number and, while they ask for consent to share your location data, how that data is processed — and what other forms of data are combined with it to produce a colour code — isn’t explained.
Since your health code colour can bar you from your workplace, it’s no wonder some have complained that theirs has flipped from green to red with no obvious explanation.
Problems can often occur when technological processes affect the public but are shielded from public inspection — think of Twitter’s decision to censor certain posts but not others, or Apple’s decision to take down certain apps.
In China, government units also operate by passing on opaque algorithms from above.
“Everyone is extremely cautious about foreign passport-holders right now, whether they’ve been in Beijing or not,” the guard at one office complex told me.
By “everyone”, I knew he was referring to the general direction in which political winds are blowing — often more important here than what politicians explicitly say.
When unspoken directives are combined with human bias, the common result is confusion about what the authorities have in mind for us.
As I walked down the street in Beijing recently, a policeman yelled at me: “Are you a foreigner?”
I had been talking in English on the phone to a colleague.
“Which country are you from?” he asked, pulling out his phone and scanning my face with a police app.
I had met a similar app last year when reporting from the north-west region of Xinjiang, where surveillance is used to crack down on the Uighur Muslim population.
The officer showed me the results: There was my passport photo, my passport number and the residential address I’d registered with the police.
It was quite accurate, and there was no algorithmic bias against my face: mine is the kind of face on which Chinese facial-recognition algorithms have been trained.
Whether that accuracy contributes to a wider system of justice is another matter.

FINANCIAL TIMES
ABOUT THE AUTHOR:
Yuan Yang is the Financial Times’ China tech correspondent, based in Beijing.