YouTube has released fresh stats showing the impact of its controversial new “hate speech” rules, which were rolled out in June. When the rules were introduced, history channels, independent journalists, and other channels producing innocuous content were demonetized or terminated, but YouTube did not report the scale of the removals at the time.
The new stats show that these “hate speech” rules led to more than 100,000 videos and 17,000 channels being removed in Q2 2019 – a 5x increase compared to the previous quarter. They also show that YouTube removed a staggering 500 million comments in Q2 2019, a 2x increase, which YouTube says is partially due to a large increase in hate speech removals.
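The report excerpt gives only the Q2 2019 absolutes and the quarter-over-quarter multiples, so the prior-quarter baselines have to be inferred. A minimal back-of-the-envelope sketch, using only the figures quoted above (the implied Q1 numbers are estimates, not figures YouTube published):

```python
# Back-of-the-envelope check of the quarter-over-quarter multiples cited above.
# Only the Q2 2019 absolutes and the "5x" / "2x" multiples come from the report;
# the Q1 baselines below are implied estimates, not published figures.

q2_videos_removed = 100_000        # "over 100,000 videos" (Q2 2019)
q2_channels_removed = 17_000       # "17,000 channels" (Q2 2019)
q2_comments_removed = 500_000_000  # "500 million comments" (Q2 2019)

video_multiple = 5    # "a 5x increase compared to the previous quarter"
comment_multiple = 2  # "a 2x increase"

# Implied previous-quarter removals under the same rules
implied_q1_videos = q2_videos_removed / video_multiple      # ~20,000
implied_q1_channels = q2_channels_removed / video_multiple  # ~3,400
implied_q1_comments = q2_comments_removed / comment_multiple  # ~250,000,000

print(f"Implied Q1 video removals:   ~{implied_q1_videos:,.0f}")
print(f"Implied Q1 channel removals: ~{implied_q1_channels:,.0f}")
print(f"Implied Q1 comment removals: ~{implied_q1_comments:,.0f}")
```

The gap between these implied baselines and the Q2 figures is what makes the quarter stand out, especially given that the new rules were only enforced for part of it.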
While these stats show a stark increase in content removed for violating YouTube’s “hate speech” rules, the new rules were in place for only one of the quarter’s three months. That suggests even more content could be removed in future quarters where the rules are in effect for the full three months. YouTube alludes to this itself, noting that it can take “months for us to ramp up enforcement of a new policy”:
“Our hate speech update represented one such fundamental shift in our policies. We spent months carefully developing the policy and working with our teams to create the necessary trainings and tools required to enforce it. The policy was launched in early June, and as our teams review and remove more content in line with the new policy, our machine detection will improve in tandem. Though it can take months for us to ramp up enforcement of a new policy, the profound impact of our hate speech policy update is already evident in the data released in this quarter’s Community Guidelines Enforcement Report.”
These stats also show that YouTube drastically suppresses the reach of videos that are eventually removed for policy violations: views received by such videos before removal have fallen by 80% over the last 18 months.
Another key point revealed in the stats is the dominance of machine learning in removing content on YouTube. YouTube says 87% of the videos removed in Q2 2019 were first flagged by its automated systems, and 80% of those auto-flagged videos were removed before receiving a single view.
In addition to providing the content removal stats, YouTube reaffirmed its commitment to removing what it deems to be “inappropriate content” by highlighting 48 policy updates and product changes that have been made since 2016 to increase the removal of this type of content.
The release of these stats comes less than a week after YouTube CEO Susan Wojcicki reaffirmed the company’s commitment to promoting so-called “authoritative sources” while suppressing content that comes close to breaking the rules but doesn’t actually break them. As with many of the changes YouTube has made over the last year, the new “hate speech” rules have disproportionately impacted smaller creators, while other changes have increased the reach of supposedly “authoritative sources” on the site. The collective impact of these changes is that smaller professional YouTubers are now struggling to make a living on the platform.
One possible explanation for YouTube’s shift toward pushing “authoritative sources” while neglecting independent creators is that the relative cost to YouTube of making these decisions is small. YouTube’s Community Guidelines Enforcement Report shows that “hateful or abusive” content accounted for just 0.4% of all channel removals and just 1.2% of all video removals in Q2 2019. By touting that it has removed 5x more “hate speech,” YouTube gets a major PR win with legacy media outlets by focusing on an issue that accounts for around 1% of all removals on the platform. Wojcicki alluded to this last month when she said that news commentary is a “very small” part of YouTube, and she has previously suggested that trade-offs which impact innocent creators are necessary.