YouTube published a blog post on January 25 revealing its new approach to curbing conspiracy content on the platform: limiting recommendations.
In the post, YouTube took a hard stance against videos that deny well-documented historical events, promote “miracle cures” for serious illnesses, or make blatantly false claims contradicting scientific evidence (flat-earth theories, for example).
YouTube acknowledged that, while such content doesn’t violate its Community Guidelines, it can spread harmful misinformation to users across the site, which prompted the company to act.
“...we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways,” YouTube said of the issue. “...we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community.”
YouTube went on to say that the change will affect less than one percent of content on the site and will not remove the offending videos entirely.
Additionally, YouTube revealed that the new system relies on a combination of machine learning and human evaluators, who are trained using the site’s guidelines and provide “critical input.”
YouTube is set to roll out the change “gradually,” introducing the system to countries outside the US as it becomes more accurate over time.
Thanks for reporting on this spam issue! We’re in the process of implementing additional measures to prevent impersonation like this. In the meantime, your subs can protect themselves by blocking any account that is spamming them: https://t.co/uieSAi4k7Z
— Team YouTube (@TeamYouTube) January 23, 2019
YouTube’s latest response to conspiracy videos follows a string of scams across the platform, in which major content creators were impersonated by scammers who messaged fans with promises of exclusive giveaways.
YouTube has since responded to concerns over the issue, saying it is ramping up security measures to prevent such impersonations.