“Our pilot, and the experience of ISPs in many Western democracies, shows that ISP-level filtering of a defined list of URLs can be delivered with 100% accuracy,” Senator Stephen Conroy said yesterday when announcing that mandatory internet censorship — sorry, “filtering” — is going ahead.
“It also demonstrated that it can be done with negligible impact on internet speed.”
Conroy is right on both counts, as it happens — provided you gloss over that reference to “many” unnamed democracies. I wouldn’t call a dozen countries with ISP-level filtering “many”, and in some of them filtering isn’t mandatory. And provided you restrict your aims precisely to those carefully worded factoids cherry-picked from Enex TestLab’s trial report.
And provided you never make a mistake.
Blocking a defined list of URLs [specific web addresses] such as the ACMA blacklist of Refused Classification material, even 100% of it, falls far short of “protecting” children from “inappropriate” material, to use the wording of Labor’s original cyber-safety policy.
Google’s index passed a trillion web pages a year and a half ago. ACMA’s manually compiled blacklist of a thousand-odd URLs reported by concerned citizens is a token drop in that ocean, a mere 0.0000001%.
ACMA told Senate Estimates that of the 1175 URLs on its blacklist on September 30, 54% were Refused Classification material, and only 33% of those related to child sexual abuse. The rest of the blacklist? 41% was X18+ material, and 5% was R18+ material without a “restricted access system” to prevent access by minors.
The same key problems with a filter-based approach, which Crikey has reported many times before, are confirmed by the Enex report.
If you go beyond the pre-defined ACMA blacklist to catch a wider range of content, the false positive rate — material blocked when it shouldn’t be — runs as high as 3.4%. Enex’s examples include the incorrect blocking of “sperm whales” and “robin red breast”. In the industry, this is known as the Scunthorpe Problem.
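For readers wondering how an innocuous phrase gets blocked: overblocking of this kind happens when a filter matches raw keywords inside a page without regard to context. None of the trialled vendors publish their internals, so the toy Python sketch below is purely illustrative, with a made-up keyword list, but it shows the mechanism.

    # Toy illustration only: a naive substring match, not any vendor's actual filter.
    BLOCKED_TERMS = ["sperm", "breast"]  # hypothetical keyword list

    def is_blocked(page_text: str) -> bool:
        """Flag a page if any blocked term appears anywhere in its text."""
        text = page_text.lower()
        return any(term in text for term in BLOCKED_TERMS)

    # Innocent pages trip the filter because the match ignores context:
    print(is_blocked("Sperm whales are the largest toothed predators"))  # True
    print(is_blocked("The robin red breast is a common garden bird"))    # True

That is the Scunthorpe Problem in miniature: the word, not the meaning, is what gets matched.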
Australia’s biggest telco, Telstra, wasn’t part of the official trial, but it conducted its own tests and discussed the results with Enex.
“Telstra found its filtering solution was not effective in the case of non-web based protocols such as instant messaging, peer-to-peer [file sharing like BitTorrent] or chat rooms. Enex confirms that this is also the case for all filters presented in the pilot.”
For all filters.
Telstra also reported that its filtering system could be overloaded if pages on heavy-traffic sites like YouTube ended up on the blacklist. Every request for anything on YouTube would have to be routed to the secret filter box to see whether it was listed.
“This is also the case for all filters presented in the pilot,” reports Enex.
For all filters.
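Why would one blacklisted clip drag every YouTube request through the filter? Because the hybrid “pass-by” designs in the pilot only divert traffic for domains known to host a blacklisted URL, then check the exact address. The rough Python sketch below assumes that two-stage design; the domains and blacklist entries are hypothetical, not taken from the Enex report.

    # Rough sketch of a two-stage "pass-by" filter. Details are illustrative only.
    from urllib.parse import urlparse

    BLACKLIST = {
        "http://example.com/banned-page",
        "http://youtube.example/watch?v=banned-clip",  # hypothetical entry
    }
    # Stage 1 watch list: every domain that appears anywhere in the blacklist.
    WATCHED_DOMAINS = {urlparse(u).netloc for u in BLACKLIST}

    def route(url: str) -> str:
        domain = urlparse(url).netloc
        if domain not in WATCHED_DOMAINS:
            return "pass straight through"   # most traffic never touches the filter
        # Stage 2: every request for a watched domain is diverted to the filter box,
        # even if the specific URL is harmless; this is the overload problem.
        if url in BLACKLIST:
            return "blocked"
        return "inspected, then allowed"

    print(route("http://news.example/front-page"))             # pass straight through
    print(route("http://youtube.example/watch?v=cute-cats"))   # inspected, then allowed
    print(route("http://youtube.example/watch?v=banned-clip")) # blocked

Under this design, one blacklisted page on a popular domain forces all of that domain’s traffic through the inspection stage, which is exactly the bottleneck Telstra flagged.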
In any event, as the Enex report reminds us, “A technically competent user could, if they wished, circumvent the filtering technology.” In its own tests, Telstra didn’t even bother testing circumvention because it took that as a given.
“ISP filtering reduces the risk of Australians being inadvertently exposed to RC-rated material when they are online,” Senator Conroy said.
And again Conroy is dead right. Reduces the risk. Inadvertent exposure.
But when it comes to curious kids with technically adept mates and plenty of time on their hands, or desperately secretive pedophiles trading their nasties, the filter will be nothing but a minor inconvenience.