This week, a New York Times article revealed that YouTube’s video recommendation algorithm gave paedophiles access to a tailor-made stream of content containing children.
The videos were usually innocent — family movies taken by parents of their kids playing in a swimming pool, dancing, or doing gymnastics. Nor did they violate the platform's terms of service. But they soon racked up thousands of views as the algorithm identified people with an interest in this content and kept them watching by directing them to similar videos across the platform.
The Times story further exposes just how deep the problems with YouTube and its algorithm run. Researchers and journalists have been drawing attention to the way the algorithm pushes people towards more extreme, radical content. And in the same week as the paedophile article, the platform came under attack yet again for its failure to police racist and homophobic harassment.
An online paedophile ring
YouTube has been aware of its paedophile problem for some time now. Earlier this year, it was revealed that the comments on YouTube videos were effectively providing space for an online paedophile ring.
Innocuous videos posted by children or their families amassed millions of views, while predators posted timestamps in the comments, pointing other users to the parts of the videos where children exposed genitals. They would also share their phone numbers and social media accounts, promising to distribute more hardcore, pornographic material. Often these videos and comments appeared alongside ads, leading companies like Nestlé and Disney to pull their advertising from YouTube.
The radicalisation problem
But it isn’t just paedophiles congregating on YouTube. The platform’s algorithm is an incredibly powerful radicalising tool for the far right and other extremists.
It is well documented that the algorithm recommends increasingly extreme content to viewers so as to keep them online for longer. Users watching history videos are pushed toward reactionary conspiracy theories. Searching for information about Islam leads to videos posted by radical hate preachers.
Following unrest in the German city of Chemnitz last year, people looking for information on YouTube were pointed toward videos peddling Islamophobic, anti-immigrant conspiracies, leading to a deluge of disinformation. Anti-vaxxer propaganda has also been allowed to flourish.
Content creators also struggle with the reactionary elements let loose on the platform. This week, Vox YouTube host Carlos Maza described the severe homophobic and racist harassment he had received from users, incited by a right-wing pundit, which eventually led to him being doxxed.
What has YouTube done?
When confronted with criticism, YouTube frequently responds with obfuscation and vagueness. Facing questions last year about YouTube’s radicalising power, CEO Susan Wojcicki said the company was “working on it”. In a recent interview about online radicalisation, product chief Neal Mohan appeared to dismiss the “rabbit hole effect”, where users are led to increasingly extreme videos, as a “myth”.
One important recent change, prioritising “authoritative content” during breaking news situations, proved successful at reducing the flow of misinformation. But Mohan said the company was resisting calls to broaden this approach across the platform, arguing that doing so would be too difficult.
YouTube’s responses to the paedophile issue show that more often than not the platform needs a real external push — usually from prominent media coverage — before it moves to clean up its act.
It was arguably the commercial blow of big advertisers pulling out that spurred the platform into action on the paedophile issue, with YouTube deciding to close down comments sections on videos containing minors in February. Similarly, Maza faced months of abuse but only got a response from the company when his tweets about it went viral — and even then YouTube refused to take down videos attacking him, saying that while “hurtful”, they did not violate the company’s policies.
Why it isn’t enough
Often these decisions highlight the uneasy balance YouTube feels it has to strike between generating more clicks — which bring cash for both the company and creators, some of whom have built followings in the tens of millions — and protecting children and minorities. The decision to suspend comments was a case in point: it drew a backlash from family vloggers, people who make a living through videos documenting their family lives and who worried they would be deprived of advertising revenue.
That tension means there is plenty YouTube continues not to do. Despite the changes made in February, for example, it has not switched off its recommendation system for videos involving children — a move the researchers quoted in the Times article cited as the best preventative measure.
It also means the changes YouTube does make can seem like ad-hoc, inconsistent tweaking. In response to the most recent reports, YouTube flagged that it was cracking down on material posted by people under the age of 13, even though many of the videos in question were published by parents and other older family members, meaning they would likely not be caught. And in September last year, the owner of a popular series of channels featuring and targeting tween girls was arrested for child sexual abuse. YouTube did not suspend his channels until nearly six months later, when he entered a guilty plea.
YouTube, like much of the internet, is a messy and sometimes quite ugly place. But until the platform responds to its problems proactively, rather than reactively, it will continue to be the fastest ticket to the internet’s most toxic sewers.