
Last week, 1% of Australian Google users temporarily had their access to Australian news content blocked. Australian news outlets, including Crikey, rightly framed this as a threat to their businesses ahead of negotiations over the federal government’s proposed news media bargaining code.

But few have highlighted those directly affected by the temporary erasure: ordinary internet users who were unwittingly made guinea pigs in Google’s “experiment”.

The Australian’s Adam Creighton, who was one of the locked-out users, noted it served as a “chilling” reminder of the tech giants’ “control of all digital information” and their willingness to manipulate our access to it.

While this instance may have been the most Orwellian, big tech companies frequently tweak their algorithms to change the content you see in your news feeds and test how you respond.

Rarely is it as brazen as disappearing entire catalogues, but platforms frequently adjust how content is prioritised to better map your psychological vulnerabilities for future ad targeting.

The most infamous “experiment” was Facebook’s 2014 emotions test, in which the platform manipulated the news feeds of nearly 700,000 users, unbeknownst to them, to prioritise posts expressing particular emotions. Unsurprisingly, this increased those users’ expression of the same emotions online. Facebook learned it could nudge hundreds of thousands of people into happiness, sadness or rage.

These controversies are just the most visible examples. As US-based psychology professor Katherine Sledge Moore told the BBC, “based on what Facebook does with their newsfeed all of the time and based on what we’ve agreed to by joining Facebook, this study really isn’t that out of the ordinary”.

Indeed, deliberately manufacturing rage over news stories has been the core business model of social media platforms for over a decade.

Facebook began algorithmically sorting its news feed in 2009, and testing quickly showed that feeding users content that confirms their biases and stokes partisan disagreement was the most effective way to keep eyeballs glued to the screen. This radicalised many users, particularly older conservatives, helping to facilitate the rise of Donald Trump in the US and popularise scare campaigns like “Labor’s retiree tax” here in Australia.

Forget temporary news blackouts for 1% of users: there are far bigger cohorts who rarely, if ever, see credible, fact-based journalism in their so-called “news feeds”.

Big tech is only now being forced to rein in the most egregious Frankenstein’s monsters its experiments created, banning Trump and many of his “QAnon”-inspired followers after the violent Capitol insurrection.

In Australia, pro-Trump Coalition backbenchers responded by pressuring the communications minister to curb Twitter’s censorship powers. The irony is that alt-right darlings like Trump only became so prominent because Twitter’s algorithm prioritised their hateful content while de-prioritising thousands of kinder, smarter and more constructive voices that the company’s testing undoubtedly proved less profitable.

The question Australian internet users must ask themselves, whatever the outcome of the regulatory negotiations, is this: do we trust the same former frat bros who stoked Trump’s rise to write, in secret, the rules that govern such prominent public spaces?

The news blackout and the emotions test were remarkable because they were reported on and subsequently acknowledged by the companies. Other algorithm changes are made opaquely, without notice or explanation, leaving users guessing at how the curation of their most-visited pages has been warped in potentially insidious and damaging ways.

As Digital Rights Watch chair Lizzie O’Shea writes in her book Future Histories, “If we simply wait for these problems to present themselves, or address them piecemeal as they emerge, we will miss the iceberg for the tip”. She advocates breaking open “black box algorithms” for public scrutiny.

The proposed bargaining code will require the tech giants to give media companies warning of upcoming changes to their algorithms — why not users too? Surely we have a right to know when we’re being experimented on and how it could affect us. If academics conducted experiments without their subjects’ informed consent, they could be fired.

The current policy negotiations provide a welcome opportunity to begin reshaping the digital economy to increase fairness and civic responsibility. But regardless of the outcome, there will be more work to do.

Crikey’s Guy Rundle is right to call for civic oversight boards to scrutinise social media platforms’ conduct.

But first, we need to know what it is we’re scrutinising. To do that, we need to unlock some of the most powerful forces of the digital era from permanent blackout.