Our computers and smart devices don’t leak our personal data. It’s sucked out of them by an infestation of parasite apps. They pump it to a vast landscape of info-factories. There, it’s blended, the anonymity is squeezed out of it, and it’s purified to feed a hungry swarm of advertisers and other, less obvious players.
This ecosystem is growing fast as everything becomes “smart”. A recent investigation by Which? revealed an upright vacuum cleaner whose app wanted to record audio on your mobile device, and an electric toothbrush app that wanted your exact location. “When we used a smart TV for just 15 minutes, it connected with a staggering 700 distinct addresses on the internet,” they wrote. There were plenty of security vulnerabilities, too.
In the first half of 2018, the mainstream media focused on privacy like never before. In part, that was down to Facebook’s role in the Cambridge Analytica scandal — a role that last week landed Facebook a maximum £500,000 fine from the UK’s regulator. And in part, it was Europe’s tough new General Data Protection Regulation (GDPR), which, as The Washington Post noted, made Europe, not the US, the biggest regulator of Silicon Valley.
We already knew from 2014 research that of the top 100 URLs on the web, 85% were accessed in the background for tracking and ad-serving. When GDPR came into force on May 25, USA Today served a slimmed-down version of its site to Europe, with all the tracking and advertising removed. The home page shrank from 5.2 megabytes to just 500 kilobytes, which is to say that more than 90% of what’s usually in the page was serving the surveillance economy.
In the weeks since we began working on the Prying Eyes series, we’ve also seen:
- In Europe, Facebook told users that its face-recognition technology was protecting them from strangers using their photo to impersonate them, a dark pattern that glossed over all the other potential uses;
- In the US, California passed sweeping new privacy laws in the GDPR vein, by far the toughest in the nation;
- Globally, mainstream media started reporting that Google lets app developers read people’s Gmail. That isn’t new, but as Slate reported, the media is now setting the privacy bar higher; and
- In Australia, medical appointment booking app HealthEngine was caught sharing personal information with lawyers. The company eventually backed down after a backlash from doctors and patients.
Could 2018 finally be the watershed year, when the narrative shifts against the surveillance economy?
Perhaps. As the Crikey-Roy Morgan polling showed, Australians are now deeply worried about how their personal data is used.
Perhaps not. It’s five years since your writer wondered, in the wake of the Snowden revelations, whether privacy fears would burst the dot-com bubble. It’s four years since I called for big data evangelists to be reprogrammed. “Big data is a dangerous, faith-based ideology. It’s fuelled by hubris, it’s ignorant of history, and it’s trashing decades of progress in social justice,” I wrote. All that is still true, but will the Facebook-Cambridge Analytica scandal and a bunch of new privacy laws really make that much difference?
Distinguished Professor Genevieve Bell, an anthropologist, heads up the new 3A Institute at ANU. She says that big data systems — algorithms, machine learning, artificial intelligence (AI), whichever buzzwords you want to use — seem to have crossed a line. The apparent accuracy of their predictions now feels creepy, she said in the last of her Boyer Lectures for 2017, because it “suggests these technical systems know more about me than seems right or appropriate”.
“Clearly, Netflix, Wells Fargo, Telstra and Google all know a great deal about me. And on one level, all that data, I volunteered it to them in exchange for services, access, convenience, and connectivity. But along the line it stopped being about bits of information and became about what the information revealed when accumulated and aggregated. And more significantly still, what larger patterns and rules could be discerned from my information, and your information, and a whole lot of other people’s information too. We have been reduced, in this way, to our past data and the patterns it produced.”
Toby Walsh, Scientia Professor of Artificial Intelligence at UNSW and the CSIRO’s Data61, and a world leader in AI, agrees that we are “just becom[ing] products being sold to people”. He’s particularly concerned about what he calls our “analogue data privacy”. This includes such signals as our location, already tracked in many ways, and health-related data tracked by fitness devices.
“You don’t want other people to know about things like your heartbeat. This has terrible risks,” Walsh told Crikey.
“Can you imagine what an advertiser could do if they could actually see your response to an advert by measuring your heartbeat? Or imagine, even worse, what a political party could do if they could actually see your response to a political advert and see what issues will make your heart beat faster? The thing is, you can’t lie about your analogue signals.”
Such medical-but-not-medical data is also collected outside traditional doctor-patient confidentiality. Aggregated, it can be used for research, the results of which can then be sold back to you.
“If you look at what’s happening in China you can see that is the future of surveillance,” Walsh said, referring not only to China’s ambitious plan to have 450 million video cameras installed by 2020, but also to its social rating system, something The Atlantic described as sinister.
“[China] is racing to become the first to implement a pervasive system of algorithmic surveillance. Harnessing advances in artificial intelligence and data mining and storage to construct detailed profiles on all citizens, China’s communist party-state is developing a ‘citizen score’ to incentivise ‘good’ behaviour.”
A commercially focused system would be no less sinister. The infrastructure is already in place. In a world where people can already be sacked because of their Facebook posts, or denied credit because of factors unknown, it’s only a small step to secret algorithms deciding who gets hired, which medical treatments your insurance covers, and whether your kids get into your favoured school.
Personal information — or at least the services we trade it for — is now as much a part of our day-to-day activities as money or electricity. If we can have a royal commission into banking, and if energy supplies can be a hot-button political issue, then why can’t the impact of the surveillance economy get equal attention?
At the very least, we owe it to ourselves to understand what’s going on, and to demand that our governments do the same, so we can all make informed decisions. So far, though, they’re asleep at the wheel as the world transforms around us. They’re cutting resources to agencies like the Office of the Australian Information Commissioner that could help us understand. And we’re letting them get away with it.