ANU data breach

Buried in the “Personal” section of the Telstra website is a page titled “How we keep your personal information anonymous”:

We ensure personal information is made anonymous before it’s used by removing data that can identify you, such as your name, address and telephone number. We also use a variety of statistical techniques and operational controls to anonymise data. Anonymising information is one of the tools we use to protect your privacy … We may share or sell anonymous information to our business partners for research, analysis, marketing, advertising and related uses. That process is carefully controlled so that the information is only used for the stated purpose, and so that individuals are not identified.

But of course, your personal information identifies you. You know, being, well, personal. So what does Telstra collect about you? Well …

This can include straightforward information like your name, date of birth, contact details (including address, email address, phone number or mobile telephone number), occupation, driver’s licence number, Telstra PIN, username or password and financial information (such as credit card or bank account numbers).

We may also collect more in-depth information including:

  • Billing and Credit Information related to your financial relationship with us, such as your payment history, your credit history, and your service history. 
  • Information about how you use your products and services such as:
    − How you use our internet services, such as information about websites visited
    − Your location when you are using our products and services
    − Information that allows us to identify you for verification purposes including biometric information like your fingerprints and voice patterns

    Sensitive information includes information about a person’s race, ethnic origin, political opinions, health, religious or philosophical beliefs and criminal history.

By the way, we don’t wish to single Telstra out; they’re just an illustrative example of the kind of information a big company might collect on you.

Dr Vanessa Teague, senior lecturer in the Department of Computing and Information Systems at The University of Melbourne, said it was absurd for companies to claim that a wealth of personal information about someone even could be made anonymous.

“A detailed individual record about a particular person cannot be securely anonymised while also retaining most of the value of the data,” she said. “Of course at some level you can anonymise it by removing or changing most of the information, or you can release aggregated statistics. There are techniques such as differential privacy that give clear, mathematical guarantees about the protection of privacy in data sharing. This is an active and interesting field of research, but it doesn’t produce a way of sharing a detailed individual record without removing most of the information.” 

For its part, Telstra said through a spokesman that their anonymisation techniques were “aligned with industry best practice, injecting all outputs with ‘noise’ (differential privacy) and ensuring small values are removed from data sets (k-anonymisation). These processes offer strong mathematical guarantees and prevent outputs from being matched with other data sets to identify individuals.”
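Telstra’s actual pipeline isn’t public, but the two techniques its spokesman names are standard and easy to illustrate. As a rough sketch (the function names, the privacy parameter and the suppression threshold are our own assumptions, not Telstra’s): k-anonymisation-style suppression drops groups too small to hide in, and differential privacy adds calibrated random noise to what remains.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Draw from Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def release_counts(counts: dict, epsilon: float = 1.0, threshold: int = 5) -> dict:
    """Publish per-group counts with small groups suppressed and noise added.

    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    an epsilon-differential-privacy guarantee for each released count.
    """
    released = {}
    for group, n in counts.items():
        if n < threshold:
            # Small-value suppression: too few people to blend into.
            continue
        released[group] = round(n + laplace_noise(1.0 / epsilon))
    return released
```

Note the trade-off Teague describes: this works for aggregated statistics, but it offers no way to publish a detailed individual record intact — the guarantee comes precisely from releasing noisy group totals rather than rows about people.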

Meanwhile, Teague said the government has attempted to amend the Privacy Act to make legitimate research into privacy issues a crime. In August 2016, the government released a sample of medical billing records for roughly 2 million people on data.gov.au. It had been de-identified using encryption, but Teague and her colleagues Dr Chris Culnane and Ben Rubinstein were able to re-identify the data set.

And it was easy, Teague said.

“Here’s how it works: find a few facts about an individual, such as the year they were born, the state they live in and the dates on which they had (two or three) children … Write a database query to find all matches for that information in the dataset. Generally for a few points of information, there will only be one match. If the person is in the dataset, you have re-identified their record.”
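Teague’s recipe is essentially a quasi-identifier lookup. A minimal sketch of the idea in Python, using an invented toy dataset (the record layout and field names are ours, purely for illustration — the real MBS/PBS data set was far larger and structured differently):

```python
# A toy "de-identified" data set: names and addresses removed, but
# quasi-identifiers (birth year, state, children's birth dates) remain.
records = [
    {"id": "rec-001", "birth_year": 1975, "state": "VIC",
     "child_birth_dates": {"2004-03-12", "2007-11-02"}},
    {"id": "rec-002", "birth_year": 1975, "state": "NSW",
     "child_birth_dates": {"2001-06-30"}},
    {"id": "rec-003", "birth_year": 1983, "state": "VIC",
     "child_birth_dates": {"2010-01-15"}},
]

def reidentify(records, birth_year, state, child_dates):
    """Return the unique record matching a few known facts, else None.

    This is the "database query" step: filter on everything you know
    about the target. With only a handful of facts, one match is common.
    """
    hits = [r for r in records
            if r["birth_year"] == birth_year
            and r["state"] == state
            and child_dates <= r["child_birth_dates"]]
    return hits[0] if len(hits) == 1 else None
```

For example, knowing only a 1975 birth year, a Victorian address and one child’s birth date singles out `rec-001` — at which point every other field in that record (in the real case, years of medical billing history) is attributed to a named person.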

“According to George Brandis, anyone who did identify someone would be committing a crime. A bill was written, but has never passed either house of our parliament,” she said. “I’m sure I don’t need to explain to Crikey that making it a crime to demonstrate that the government has made a mathematical error does not undo the error or improve the privacy of the affected individuals.”

Further, Culnane told Crikey, government use and sharing of data was small scale in comparison to what goes on in the private sector.

“Most companies are dependent on de-identification working for them to be able to analyse and sell the data they collect. If the data is identifiable it has far greater protection under the Privacy Act; once it is classified as de-identified there are few, if any, restrictions on how that data can be used,” he said. “There is very little oversight of the de-identification that is taking place in industry, and most data transfer agreements will include clauses that strictly prohibit re-identification, so no one is in a position to evaluate it.”

Software developer and privacy advocate Robin Doherty — who has previously helped develop a video game called Snitch Hunt to show how easily someone could be tracked, based on just a news story, someone’s metadata and a search engine — said the possibility of someone accessing and re-identifying private information was much more serious than merely receiving unwanted advertising.

“Say you were a victim of abuse and you had a motivated attacker, they could track you down,” he said. “Or, a news source or whistleblower; they take a real risk sharing that information and the government or their employer could track them down.”

“But the value of privacy is not just for the individual — it’s a value to society, for free democratic discourse, and the exposure of information in the public interest.”