(Image: Mitchell Squire/Private Media)

In the mid-’90s, when my geeky partner was snaffling a domain using my surname and trying to convince me of all the ways the internet would liberate me from the most mundane aspects of my academic and journalistic endeavours, I was a sceptic.

Not because I couldn’t use the technology; I’m old enough to remember the basic commands required to drive DOS. My worry was about quality control: “If anyone can post anything on it, how do you know what to trust?”

His answer was confident and, at the dawn of a new era in information verification, unvexed by reality: “The algorithm. It’ll push what’s best to the top, and what’s rubbish further down.”

Now, a quarter of a century later, the problem in this answer is plain for all to see: just because something is popular, that doesn’t make it right or true.

Majorities used to think that the sun circled the earth, that racial enslavement was a good idea, and that women shouldn’t study at university because it would shrivel their uteruses. A majority supported the legal fiction of terra nullius and heiled Hitler. The very last thing majorities are known for is considered or informed judgment about anything, particularly once they coalesce — in person or online — into mobs.

Yet the information production system on which we all rely is based on mob rule. The algorithms used to search the web deliver results substantially based on what is often referred to as “the wisdom of crowds”: the degree to which other sites and users have engaged with the content.

There’s nothing wrong with filtering for popularity if you’re looking for a good movie or the best place to eat on Saturday night. But asking an information system optimised for popularity to reliably tell you whether Jews are plotting global domination, if vaginoplasty is required, or even what a famous person does if they’re currently mired in controversy is a category mistake. The results are nothing more than the racist, misogynistic and cancellation cries of the crowd.

It didn’t have to be this way. Before “monetising engagement” was all anyone cared about on the web, the concept of “search” could have been bifurcated. Rather than crowds being deemed the suitable arbiter of everything, their views could have been privileged or dismissed depending on the question, in precisely the way post-Enlightenment cultures took centuries to perfect.

At the same time, the fascinating project of finding algorithmic ways of recognising truth and credibility could have been pursued, so that users searching the internet would have had a choice. Did they want search results that privilege the truth, or just what’s popular?

Instead we’ve been left with the web equivalent of the idiot box. A unitary search mechanism that produces results capable of satisfying only the most banal of human yearnings — not for the truth, or for challenging arguments based on facts, but to know what the pack is doing to stay in step and keep out of trouble.