Study finds YouTube algorithm can push violent, graphic videos to young children

Liam Ho

A recent study has found that YouTube's algorithm can automatically push violent or graphic videos to young children, even when they have not searched for that kind of content.

YouTube’s video algorithm has been a point of contention for quite some time. The mysterious inner workings of the platform’s recommended feed have dictated what content is pushed to viewers and in turn, what content gains popularity.

The algorithm, like those of most other social media platforms, takes into account what content the user is consuming before offering them similar content in order to keep them engaged. At least, that's the intention. Often, this means tuning a user's suggested videos to cater to their particular hobbies and interests.
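To illustrate the general pattern being described (and not YouTube's actual system, which is proprietary), here is a minimal Python sketch of an engagement-driven recommender: each video the user watches shifts future suggestions toward similar content. The catalog, the tags, and the similarity scoring are all hypothetical.

```python
# A minimal, illustrative sketch of engagement-driven recommendation.
# This is NOT YouTube's algorithm; it only demonstrates the feedback
# loop the article describes: rank candidate videos by how closely
# they match what the user has already watched.

from collections import Counter

# Hypothetical catalog: video id -> set of topic tags
CATALOG = {
    "minecraft_letsplay": {"gaming", "minecraft", "letsplay"},
    "fortnite_clips":     {"gaming", "fortnite", "shooter"},
    "gun_review":         {"firearms", "shooter", "review"},
    "speedrun_guide":     {"gaming", "speedrun", "guide"},
}

def recommend(watch_history: list[str], k: int = 3) -> list[str]:
    """Rank unwatched videos by tag overlap with the watch history."""
    watched_tags = Counter()
    for vid in watch_history:
        watched_tags.update(CATALOG.get(vid, set()))

    def score(vid: str) -> int:
        return sum(watched_tags[tag] for tag in CATALOG[vid])

    candidates = [v for v in CATALOG if v not in watch_history]
    return sorted(candidates, key=score, reverse=True)[:k]

# Each click feeds back into the history, so the next batch of
# suggestions drifts toward whatever the user engaged with.
history = ["minecraft_letsplay"]
history += recommend(history, k=1)   # simulate clicking the top suggestion
print(recommend(history))
```

In this toy version, once the simulated user clicks a shooter-adjacent gaming video, firearm-related content starts appearing in their suggestions, which mirrors the drift the study observed between the account that clicked recommendations and the one that did not.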

However, a new study suggests a darker side to how YouTube's algorithm functions. Research conducted by the Tech Transparency Project found that young children's YouTube suggestions were being flooded with graphic videos about school shootings, firearm training, and similar content.

YouTube’s algorithm may be doing more harm than good, a new study finds.

Study finds YouTube algorithm may be pushing violent content to children

The new study set up two YouTube accounts that simulated the behavior of nine-year-olds who like video games. The accounts were identical, except that one clicked on videos recommended by YouTube whilst the other did not. According to the report, the account that clicked on the platform's suggestions was soon flooded with graphic content.

The account that ignored the suggestions still received some gun-related videos, 34 in total. The account that engaged with the suggested videos, however, ended up receiving 382 different firearm-related videos in a month, or around 12 per day, according to the research. The study also created accounts mimicking 14-year-old boys and found similar results.

The findings of the study have drawn criticism of YouTube's algorithm, which a spokeswoman for the platform defended in response to AP News. She noted that YouTube requires users under the age of 17 to obtain a parent's permission before using the site, and that accounts for users under 13 are linked to a parental account.

“We offer a number of options for younger viewers… which are designed to create a safer experience for tweens and teens.”

The conversation about algorithms has also spread to TikTok, which has similarly defended its platform, stating that its policies prohibit users under the age of 13.

About The Author

Liam is a writer on the Australian Dexerto team, covering all things gaming with an emphasis on MMOs like Destiny and FFXIV, along with MOBAs like League. He started writing while at university studying for a Bachelor's degree in Media, and has experience writing for GGRecon and GameRant. You can contact Liam at liam.ho@dexerto.com or on Twitter at @MusicalityLH.