Does ‘Elsagate’ prove YouTube is too big to control?
Scandal of violent and sexual videos aimed at children exposes the difficulty of relying on algorithms
In February, YouTube announced it had hit a staggering milestone: visitors were now consuming the equivalent of a billion hours’ worth of video every day.
The sheer size of the platform’s userbase is equally astounding. “More than 1.5 billion people use YouTube,” says The Guardian’s Roger McNamee, giving it a global reach comparable to Islam.
But, as the saying goes, with great power comes great responsibility, and the recent “Elsagate”, a scandal involving disturbing videos aimed at children, has led some to wonder if YouTube has become a Frankenstein’s monster, beyond the control of its creators.
What is Elsagate?
Some of YouTube’s most popular channels are aimed at children, with creators specialising in nursery rhymes, colourful cartoons and the all-powerful toy-unboxing videos gaining millions of subscribers and billions of views.
But there is a problem. YouTube is “absolutely flooded with extremely violent, inappropriate, sexually suggestive videos targeted at children,” says The Inquisitr, and these videos are finding their way into autoplay lists alongside age-appropriate clips.
Journalist James Bridle delved into this unsettling phenomenon, dubbed Elsagate after the popular Frozen character who appears in many of the videos, in an article titled “Something is Wrong on the Internet”, published on Medium this month.
He found that as content creators chase viewers, successful - and originally harmless - formulas for garnering views are “endlessly repeated across the network in increasingly outlandish and distorted recombinations”.
At their most extreme, these include a legion of unsettling videos which appear to be produced, or in some cases automatically generated, in response to popular keywords. They often feature disturbing themes and sexual or violent content.
For instance, a search for “Peppa Pig dentist” returns a homemade clip in which the popular children’s character is “tortured, before turning into a series of Iron Man robots and performing the Learn Colours dance”.
Elsagate has generated a flood of responses from concerned parents, the media and internet sleuths dedicated to finding out who or what is making these disturbing videos, and why.
Has YouTube become too big to control?
At the heart of Elsagate and other controversies over YouTube’s content, such as videos promoting terrorism or violence, is the omnipotence of the platform’s algorithms.
Contrary to what some parents may believe, content on YouTube’s dedicated Kids app is not curated or even pre-screened by humans. Instead, suggested videos appear in its autoplay list automatically based on shared keywords or similar audiences.
The sheer size of YouTube’s catalogue goes beyond the capabilities of human oversight. Content is uploaded to the platform at the equivalent of 400 hours of video every minute, according to Statista.
The incredible proliferation of videos clearly produced in response to common search terms shows the extent to which YouTube is essentially run by machines, says Medium’s Bridle.
On a platform where content visibility – and thus potential for ad revenue – is controlled by algorithms, “even if you’re a human, you have to end up impersonating the machine”.
ElsaGate has exposed a long-standing truth about YouTube that can no longer be ignored, says Polygon: the filters designed to protect users of all ages from disturbing, violent or illegal content are not up to the job.
The algorithms are “ripped apart, analyzed and beaten up” by content creators, human or automated, who have become adept at gaming the system to make sure their output is seen.
In a particularly notorious example, pranksters on the 4chan message board “managed to splice pornography into children’s shows and game the algorithm to ensure the videos were monetized and seen”, says Polygon.
Can the problem be fixed?
YouTube has pledged to crack down on the plague of inappropriate children’s videos slipping past the filters on its Kids app.
Juniper Downs, YouTube’s director of policy, said a new system, which predates the Elsagate revelations, will classify videos flagged by users as age-restricted, automatically keeping them out of the YouTube Kids app.
On a wider level, there are signs that YouTube is shifting its attitude to content curation in favour of a more hands-on approach, prompted by pressure from governments concerned about online extremism.
Multiple terrorist attacks in Europe and the US have been perpetrated by individuals believed to have been self-radicalised in part by watching YouTube propaganda videos made by groups such as Islamic State and their sympathisers.
Politicians have threatened to take action against tech firms that do not strengthen measures to remove videos containing extremist content. Alphabet, YouTube’s parent company, appears to be taking the threats seriously.
Previously, “inflammatory” religious or political videos were permitted on YouTube if they did not violate the site’s rules on graphic content or promoting violence, although they were not eligible for ad revenue.
A YouTube spokesperson confirmed to Reuters that its policy has changed to prohibit any video featuring people or groups classified as “terrorist”, including material such as lectures by al-Qaeda recruiter Anwar al-Awlaki.
However, even if YouTube does develop new strategies to close the loopholes, the implications of Elsagate remain disturbing on a more profound level, says TechCrunch’s Natasha Lomas.
“The YouTube medium incentivises content factories to produce click fodder,” Lomas writes, exposing a generation to a tidal wave of “pop culture slurry” built around keywords rather than coherence. “It’s hard to imagine anything positive coming from something so intentionally base and bottom-feeding being systematically thrust in front of kids’ eyeballs.”