Elsagate 2.0

Photos of the late-2010s Elsagate channel Webs and Tiaras next to photos of a modern-day Elsagate channel, Glopy Toons

(Ruby Costa / The Puma Prensa)

By Ruby Costa, Co-Executive Editor

YouTube functioned as the internet’s camcorder for much of its early days. People would post home videos, video diaries, and other short, simple clips — most of the time, YouTube channels would only serve as archives for the poster, not the public.

Twenty years later, YouTube resembles the movie bin at the dollar store more than it does a home video camera. The oversaturation of this once-beloved website has led many to look for new platforms or to lament the simpler days, when reaching one million subscribers was a rare, amazing achievement and there weren’t ads every time you blinked.

2016 and 2017 are often cited as the end of YouTube’s “golden age.” Though there are many nuanced reasons for this, one notable culprit is what has come to be known as Elsagate.

Starting in about 2014, Elsagate brewed under the surface for years before being brought to the public eye in 2017. The scandal consisted of YouTube videos aimed at children but filled with violent, overly sexual, or frightening content. These channels used beloved kids’ characters, such as Elsa and Spider-Man, along with buzzwords in their titles and descriptions, to trick the algorithm, and the children scrolling through YouTube, into thinking they were harmless. These videos brought in a great deal of profit, with the biggest Elsagate channels pulling in billions of views each month.

In 2017, newspaper articles began drawing attention to the Elsagate problem, and YouTube began placing stricter guidelines on children's content. Many Elsagate channels were taken down, and some were even investigated for more serious allegations such as child abuse. Many Elsagate channels didn’t include any children at all, though, so the people responsible for them fell through the cracks and faced hardly any real-world repercussions.

With the channels taken down and harsher restrictions in place, it seemed like the storm was over. But, as with most things on the internet, deleting Elsagate did not remove its existence completely.

A second wave of Elsagate 

As sad as it is to admit, Elsa and Spider-Man are not the epitome of children’s entertainment anymore. That may seem like even more convincing evidence that this controversy has been eradicated, but, like a virus, Elsagate was able to evolve.

Despite YouTube’s efforts, and general increased public awareness, the 2020s brought with them a second wave of Elsagate. This time, much of the content is based on popular video games like Minecraft, Among Us, or Poppy Playtime, and an overwhelming amount of it is AI-generated.

The content itself is much the same: gore, oversexualized characters and situations, and gratuitous amounts of bodily fluids. The strategy, however, has changed for the worse. These videos no longer need to be carefully disguised or crafted just right to bypass the algorithm; they can be posted in their full perverse glory, and YouTube’s moderation won’t bat an eye. With the sheer amount of content on YouTube, sorting the good from the bad takes far more time, and YouTube now relies heavily on AI for its moderation. That reliance has been a colossal failure, though it’s not far-fetched to suggest the poor moderation is deliberate: Elsagate videos make significant amounts of money, and YouTube takes its share of the profit.

The use of AI also makes this content significantly easier to produce. While traditional Elsagate videos were often live-action skits, with costumes and props, or hand-made animations, this new wave of Elsagate is far less involved. A few prompts shoved into an AI for the video, a few more for the title and description, and a little polishing is all it takes to make Elsagate content now.

YouTube Shorts are also key: they seem to slip past moderation even more easily than long-form content, and they can be mass-produced even faster. With children’s increased screen time and decreased attention spans, it’s frighteningly easy for them to pick up their iPads and start scrolling through this disgusting slew of videos.

Given how huge the Elsagate scandal was in the late 2010s, it’s terrifying to say that this new wave may be even worse. Current Elsagate videos run rampant: they are far greater in number and easier than ever for children to access.

An Elsagate case study

To see exactly how easy it is for kids to access Elsagate content, I created a new YouTube profile: a completely blank slate.

I decided to test the sort of content kids might come across by entering four simple queries in the search bar: “Poppy Playtime,” “Among Us,” “cat,” and “kids.” This mix of specific interests, like games, and general words that might appeal to kids gave me a good sense of how common this Elsagate content is.

When I searched for “Poppy Playtime,” I only needed to scroll down ten videos before I found one with all the marks of modern Elsagate: AI generation, beloved characters placed in violent or overly sexualized situations, and an odd mix of bodily fluids and general uncleanliness. Even worse, the characters in this video were animated to look like babies or young children. The channel that uploaded it, Glopy Toons, had numerous other videos using Poppy Playtime characters, as well as generic characters like anthropomorphized shapes, all featuring the same odd, perverted content. As for YouTube Shorts, I only needed to scroll eight times before a video hypersexualizing one of the Poppy Playtime characters appeared, with absolutely no indication that it was not for kids and no age restriction of any sort.

The next search was “Among Us,” and things certainly weren’t looking up for YouTube’s moderation: the third video and the very first short had Elsagate-esque content.

The query “cat” pulled up normal long-form videos, but the second short was AI-generated: simply minutes of gore and animal cruelty using anthropomorphized cats. In fact, YouTube has made the odd choice of creating an entire highlight category, title and all, for these AI cat videos, most of which were gory, depicted pregnancy in an overly sexualized way, or were outright pornographic. Again, none of these videos were age-restricted, and any child who simply wants to watch silly cat videos would instantly be shown this category of disgusting shorts. There is no clear distinction for kids between the weird AI videos and perfectly wholesome ones.

When I searched for “kids,” a lot of the videos were, thankfully, educational songs or channels teaching manners and language. However, a few Elsagate videos still snuck in, including more from the channel Glopy Toons (whose videos all have between one hundred thousand and eight hundred thousand views).

When I returned to the YouTube home page, it was already filled with content just like everything described above. Most of it was too gruesome or perverted to even describe. When I attempted to take a screenshot of the homepage to use as an image for this article, I realized the content was far too explicit to be published on our website without heavy censoring. The thumbnails displaying this content are not allowed in a high school student-run newspaper, but they are somehow allowed to run rampant on a website that over eighty percent of parents report their children use.

For a child or their parent, the easiest way to find videos is to simply search for keywords, and, as this experiment shows, searching is also the easiest way to be exposed to Elsagate content.

The effects of Elsagate on children

Elsagate isn’t just wrong on principle — it can have a real impact on the developing minds of children. Pediatric and psychiatric research shows that children who are exposed to excessive violence or sexual content can have delayed cognitive development and behavioral problems.

Seeing gore can desensitize children to violence and also simply scare them, leading to nightmares or more serious and long-term conditions like anxiety or depression.

Sexual content can blur a child’s understanding of public behaviors, appropriate relationships, and societal norms. A child who sees sexual content online may become a perpetrator of child-on-child sexual abuse (COCSA) without even understanding why what they’re doing is wrong. When kids grow used to seeing Elsagate content, they may try to replicate it in the real world and fail to understand why that is bad. This is especially likely when the characters committing these inappropriate acts are ones a child knows and loves: children often mimic those they look up to, and Elsagate videos set the very worst example.

Even if Elsagate had no psychological effects, these videos would still offer no educational or developmental benefit for children. Most children’s shows and YouTube channels are meant to instruct in some way, whether on school subjects like English and math or on social skills like manners and problem solving. Not everything a child watches has to be educational, but by allowing their children to roam freely on YouTube, parents are essentially setting them up for failure.

In some ways, the blame for Elsagate falls more on the shoulders of lazy parents than it does on YouTube, or even on the channels creating this content. Obviously, if this content never existed in the first place, it wouldn’t be a problem, but parents should also take greater responsibility in monitoring what their kids watch.

In 2026, no parent should be able to feign innocence when it comes to the dangers of the internet.

When will it end?

As AI gets more and more prominent on the internet, Elsagate seems unbeatable from every angle — Elsagate content will only get easier and easier to make, and YouTube’s moderation will continue to rely on an AI system with a horrible track record.

When YouTube itself profits off of a problem like Elsagate, it's hard to imagine that they’ll be eager to fix it anytime soon.

Combating the Elsagate problem comes down to individual choices. It may not be easy to hold YouTube accountable or to get Elsagate channels banned when YouTube is seemingly so fond of them, but individual choices can make a huge difference.

For people with babies or young children in their lives: monitor what they do online. Streaming services or physical media can be far safer than simply allowing access to YouTube before kids know how to avoid bad content on their own.

And, for those who’ve seen how easy it is to make money off AI-generated content online: take a second to think about people other than yourself. This harms the developing minds of children. It harms smaller creators, whose painstakingly made videos are drowned out by AI. It harms our planet, which is already warming at a terrifying rate and undergoing its sixth mass extinction, this one entirely anthropogenic.

If Elsagate is allowed to continue, it won’t just be individual children facing harm. Soon it will balloon into a large, irrevocable, ugly blemish on the face of YouTube, and sweeping the scandal under the rug won’t save the platform this time. And who knows? Elsagate, one of YouTube’s greatest gold mines, could even be the nail in its coffin.
