YouTube has been increasing its dependence on algorithms to moderate content, Google has revealed. But the technology is far from ready to replace human reviewers: the platform has been removing videos that do not violate any of its policies.
Google said that between April and June, more than 11.4 million videos were removed from the video platform. That was a 50 percent increase compared to the previous quarter.
The company says it had to rely more on technology as the pandemic forced it to send workers home. The technology it is using to moderate content, however, has been over-enforcing policies, Google now admits.
“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,” Google wrote in a blog post alongside its Q2 transparency report.
“Because responsibility is our top priority, we chose the latter — using technology to help with some of the work normally done by reviewers,” with automated systems set to cast a “wider net.”
Of the videos removed during that period, about 325,000 were appealed. More than half of the appealed videos were reinstated after YouTube determined they did not actually violate its policies.
But by the time YouTubers get their videos reinstated, the videos' reach has often already tanked, and they earned no ad revenue during the time they were down.
The company also attributed the increased number of removals to the pandemic. With more people stuck at home, Google says, uploads have risen and so have user reports of content.