The rest of the world has been smirking at Stephen Conroy’s ill-conceived plan to censor Australia’s Internet for a while now, but a new study published by Brooklyn Law School entitled “Filtering in Oz: Australia’s Foray Into Internet Censorship” is a serious embarrassment.
This report is important. Not only is it authored by a reputable and neutral foreign observer, but it also focuses more on the legitimacy of the scheme than on the technical concerns, and it finds some serious problems. Despite the sober language, phrases like “troubling”, “worrisome”, “politically motivated” and “unaccountable” are common.
Contrary to persistent claims by the Minister, the study finds that Australia “will likely become the first Western democracy to block access to on-line material through legislative mandate.”
But is it a legitimate experiment? The study’s author applies a process-based methodology to determining censorship’s legitimacy by asking four questions. Is the country open about its censorship plans and the reason behind them? Is it transparent about what is to be restricted? How narrow is the filtering? And finally, are the processes and decision makers behind the scheme accountable? While the Government earns praise for openness (Internet filtering was a central campaign promise), serious issues are highlighted in the other three areas.
Commentators, industry groups like Electronic Frontiers Australia and opposition political parties have consistently called for clarity on both the aims of the censorship scheme and the range of material to be targeted. Yet phrases like “other unwanted material” still represent the best information we have received from the Government. Whether or not this is a deliberate attempt to hobble debate we cannot say, but the situation was not lost on the study’s author, Bambauer:
To date, Australia’s transparency regarding its filtering has been poor. The country has vacillated on what material it will target for blocking. This uncertainty makes it difficult for citizens to assess whether the scope of material blocked is appropriate, and whether the set of targeted sites comports with the underlying rationales for censorship. The Labor government is opaque about the types of sites that will be blocked, how a site will be evaluated for filtering, and how those decisions map to larger social and political goals.
Indeed, in another part of the study the author examines the hypothetical 10,000-site blacklist floated by the Government, and wonders whether this proves they have an idea of the scope or are merely guessing. “The latter seems more likely,” he concludes.
This confusion not only robs Australians of the ability to make decisions about the merits of the scheme, but also makes it hard to measure the scheme against its stated goal of protecting children. If the target of the filter is now primarily websites accessed by adults, this suggests that the rationale for Net censorship has changed since the election promises were made. Bambauer agrees. “In short, the Rudd government’s inability, or unwillingness, to elucidate a consistent set of content categories that will be off-limits, either to all Australians or to minors, undermines citizens’ ability to compare concrete plans for filtering to the reasons for implementing it initially.”
On the issue of narrowness, the author examines the state of dynamic filters as tested by the Government, and comes to the same conclusions as the rest of the world — that such filters come with inherent under- and over-blocking. Furthermore, since commercial filtering products are developed and administered by third parties, discretion over what is blocked may be ceded to the potentially inaccurate built-in lists provided by independent software vendors:
If the country’s filtering employs vendor-supplied block lists, or allows ISPs to choose which product to implement … then Australia’s controls will inevitably be both under- and overbroad, with implications for access to legitimate information, transparency, and accountability.
Finally, the study looked at the accountability issues surrounding expanded internet censorship powers, and — no surprise — accountability is clearly not a centrepiece of the cyber-safety platform.
For a start, a lack of clarity on who controls the blacklist undermines the ability of the citizenry to ensure the scheme is fairly administered. The planned outsourcing of filtering decisions to unaccountable and overseas third parties such as the Internet Watch Foundation also raises serious issues, and may in some instances contravene existing laws by bypassing the ACMA’s complaints-based mechanism. The implications of replacing human judgement with mandated software are plain:
If filtering is implemented based on software vendors’ decisions about whether content is sexually explicit, rather than on the Classification Board’s judgements, this will decrease the Australian citizens’ ability to have a voice in what they can access on-line.
Some of these concerns could be remedied, perhaps, if the Government would lay out its plans in sufficient detail. Instead, we are left waiting for clarity until after the “live” ISP trial. (What end is served by a trial conducted in such a policy vacuum I cannot say.)
The study’s author even picks up on Conroy’s odious habit of tarring opponents as supporters of illegal material: “While hyperbolic rhetoric is common in democracies, attempts to silence dissenters or to conflate policy differences with support for unlawful behavior undermine accountability.”
Overall, the study concludes that:
Accountability problems are inherent in censorship achieved through computer technology. These challenges increase when some voices are magnified, and others silenced, in policy debates, and when content categorization is done by unaccountable (and perhaps foreign) entities.
The report is quite comprehensive and the Ministry would be well served to study it. The study does err, perhaps, in the amount of power it ascribes to Senator Steve Fielding of Family First in driving the policy. Nevertheless, it reinforces the position of the many stakeholders in Australia who have opposed the filter, not solely on technical grounds or from some misguided sense of cyber-anarchism, but on solid and fundamental policy and democratic principles. We are not the only ones who question the ability of our Government to anticipate, understand and manage the many complex issues surrounding such a radical internet policy.
In his conclusion, the study’s author makes the following observation:
Filtering looks easy and cheap, and calls to block access to material that is almost universally condemned – such as child pornography, extreme violence, or incitements to terrorism – are hard to resist. But this focus confuses means with ends.
It’s hard to disagree. The Government cannot claim a mandate for such a poorly-defined policy. If it is to have any legitimacy, the public and industry must be informed well in advance of the next stages.
The study can be downloaded here.