YouTube has become the world’s babysitter, an electronic pacifier during flights and in restaurants. But the ramifications of raising our children on a diet of video content are far deeper than we realise, writes Amanda Cassidy
Letting your child watch colourful characters sing and dance on screen is a daily fixture in many households. The children are entertained, and the parents get time to put on the dinner or simply to unwind. Everyone is happy – so what is the harm?
“These videos are abusive to children”.
“An open gate for paedophiles” is how The New York Times referred to the platform after an innocent video of a little girl in a bikini went viral, attracting half a million views in a very short space of time via YouTube’s recommendation system. It spread so fast because it was being recommended to users who had been searching for other videos of “prepubescent, partially clothed children.”
The comments section was also being used by paedophiles as a guide to share content with one another.
But let’s take a step back. To be fair to parents, myself included, we try to keep screen time to a minimum. We steer children away from YouTube proper because we understand, as the case above shows, that there is a plethora of wildly disturbing content festering there: pornography, videos of beheadings, self-harm, abuse and general depravity that no growing mind should come across.
But the internet is a parenting minefield. A friend’s son, who loved trains, was watching a series of videos of trains pulling into stations. When she returned five minutes later, he was watching footage of a train accident that had started playing automatically.
So we silently cheered when YouTube Kids was born – all the video content to keep your offspring amused, without the nasties. Win-win. But that was our first mistake: trusting that the platform is, indeed, keeping the content appropriate. For the most part, the videos are collated and distributed by bots, driven by algorithms that work off popular search terms and the most-watched clips.
And it turns out that our confidence that YouTube Kids is a safe space for our children is a misconception. Here’s why: technology writer James Bridle has researched the video platform in great detail, and his conclusion is that these videos are abusive to children. He also believes that Google-owned YouTube has a responsibility to keep them away from young eyes.
At the lighter end of the scale, there are people (trolls) creating thousands upon thousands of disturbing videos in which beloved characters turn violent or dark halfway through an innocent clip – Peppa Pig drinking bleach, or a dentist pulling her teeth out. Videos recommended by the algorithm included Mickey Mouse being tortured, and highly sexualised videos of Disney princesses were easy to find.
“YouTube was never supposed to be a platform for kids.”
Bridle writes in his essay on Medium: “YouTube Kids, an official app which claims to be kid-safe but is quite obviously not, is the problem identified because it wrongly engenders trust in users. An article in the British tabloid The Sun, ‘Kids left traumatised after sick YouTube clips show Peppa Pig characters with knives and guns appear on app for kids’, takes the same line, with an added dose of right-wing technophobia and self-righteousness.
But both stories take at face value YouTube’s assertions that these results are incredibly rare and quickly removed: assertions utterly refuted by the proliferation of the stories themselves, and the growing number of social media posts, largely by concerned parents, from which they arise.
But what is concerning to me about the Peppa videos is how the obvious parodies and even the shadier knock-offs interact with the legions of algorithmic content producers until it is completely impossible to know what is going on.” In other words, the bots are running away with all of this and it is impossible for us to keep up.
“This is the tip of the iceberg when it comes to inappropriate content on the video platform curated for kids”.
Then there are the so-called family-friendly channels showing pranks – children wetting themselves, being injured or terrified. One father who filmed pranks like this is believed to have lost custody of his children as a result. And the Momo challenge hysteria aside, there are videos on YouTube Kids with suicide advice spliced into cartoons as a ‘joke’. This is the tip of the iceberg when it comes to inappropriate content on the video platform curated for kids.
YouTube has removed many of the offending videos, but they just keep coming. The company relies on a flagging system: someone has to actually see a video and report it before anything can be done. Pre-moderation is not something the platform seems keen to start.
There are filters but let’s be honest, YouTube was never supposed to be a platform for kids. I have no faith in its ability to adapt itself.
“Of course, it isn’t all damaging content but it is frighteningly stupid.”
Raised by YouTube
Besides all of this, the idea of a child passively watching other children doing things – making slime, opening toy eggs, sliding down the stairs on a tray, playing Minecraft – is all a bit weird. They’ll have enough technology in their lives as teens and beyond. These are the years we should be scooting them out of the house so they can come up with their own adventures and play their own games. Of course, it isn’t all damaging content, but it is frighteningly stupid.
“Your child is just the product.”
When Coleen Rooney was criticised on social media for allowing her one-year-old to be glued to his personalised iPad, she pointed out that what her son was watching was educational. But the most important element of learning for children under the age of two is rich interaction with humans and their actual environment.
The child-friendly videos are cheap, algorithm-driven songs or shows that exist purely to generate ad revenue. It is a largely unregulated, data-driven grab for toddlers’ attention. Your child is just the product in this case.
There is nothing wrong with having your child watch something you deem appropriate, for a short space of time, while you are in the room with them. But allowing a series of algorithms to choose the content your child’s brain is absorbing, and blindly trusting that it is safe, is just bonkers.
We’ve removed this hex from all the devices in our house. There is plenty of good content to keep children entertained elsewhere, with zero chance they are going to pick up suicide tips. I know it is not going away – I only have to watch my children pinch-zooming, their little fingers flying across screens, to know that. But until there is better moderation, tighter security and more educational choice for little minds, we’ll be outside if you need us.
Feature Image @Pexels.com
This article was originally published in May 2022.