YouTube’s recommendation algorithm drives roughly 70 percent of what people watch on the video-hosting platform. That’s to be expected, of course, especially since YouTube’s parent company, Google, also runs the world’s largest search engine. What you might not expect is that despite Google’s ubiquity in the world of search engine optimization, YouTube’s policies and algorithms leave a lot to be desired.
According to MIT Technology Review, a new study from Mozilla has found that even though YouTube provides users with controls to adjust what the algorithm shows them, those controls do very little. Researchers analyzed seven months of activity from more than 20,000 study participants to evaluate YouTube’s “Dislike,” “Not interested,” “Remove from history,” and “Don’t recommend this channel” functions.
The verdict: each rejected video still spawned an average of 115 unwanted recommendations.
This is only the tip of the iceberg where YouTube’s issues are concerned, however. MIT Technology Review goes on to note that the company has found itself repeatedly under fire for the site’s tendency to reward controversial content and promote videos and advertisements that violate its own content policies. Mozilla speculates that the core issue is that YouTube focuses more on watch time than user enjoyment or satisfaction.
It doesn’t help that YouTube’s content policies are vague at best and draconian at worst, nor does it help that the company applies policy changes retroactively. The most recent blunder in that arena, TechDirt reports, was that “an additional change to violent content was made, with the restriction no longer applying only to IRL violence, but now depictions of violence in media content, such as video games.”
Considering that two of the top five YouTube channels worldwide publish gaming content, this move is baffling, to say the least.
But it’s also par for the course for YouTube, which has begun restricting foul language, limiting ads on or outright demonetizing videos that use profanity within the first 15 seconds. (Until November 2021, YouTube allowed the moderate use of profanity within the first 30 seconds of a video.) Again, this change was applied retroactively, and the policy is enforced unevenly across channels, with some videos repeatedly demonetized and then remonetized.
The point is: something on YouTube is broken. Between its unevenly applied content policies, its failure to communicate with its most important content creators, and its user-unfriendly algorithms, it’s baffling to think that its parent company is the same organization that has defined the world of SEO for decades.
You need only look at the swaths of prominent YouTubers forced to rely on alternative monetization methods, such as Patreon or brand sponsorships, to see that YouTube needs to make a change.
In the meantime, the only thing you can do if you publish content on the site is to optimize your videos well—and hope that the company doesn’t make an arbitrary change that causes viewership to plummet.