YouTube CEO admits changing the recommendation algorithm in response to media’s radicalization theory

The theory has been contradicted by research data, but YouTube still made changes in response to mainstream media articles pushing this narrative.

In July 2019, The New York Times published its infamous “The Making of a YouTube Radical” article by tech columnist Kevin Roose, which used the anecdotal experience of a single YouTube viewer to push the notion that YouTube radicalizes its users by guiding them down an “alt-right rabbit hole” and into “a far-right universe” that’s filled with “conspiracy theories, misogyny and racism.”

The article looked at the YouTube history of Caleb Cain between 2015 and 2018 and described him as being “seduced by a community of far-right creators” through the recommended videos that appear on the YouTube homepage and in the “Up Next” section.

Since the article was published, two studies that collectively looked at millions of YouTube recommendations have concluded that YouTube’s recommended videos do not lead to extreme content.

A study by Mark Ledwich and Anna Zaitsev, which looked at over 23 million YouTube recommendations for 657,000 videos, concluded that YouTube’s recommendation engine is “not a radicalization pipeline” and that YouTube’s algorithm actively discourages users from visiting content that could be considered “radicalizing.”

Another study, by Manoel Horta Ribeiro, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira Jr., which looked at over 2 million YouTube video and channel recommendations, concluded that “it is possible to find Alt-right content from recommended channels, but not from recommended videos.”

Recommended channels are much less prominent than recommended videos and can only be accessed when a viewer intentionally navigates to a creator’s channel.

Recommended videos, on the other hand, are inserted into viewers’ home and Up Next feeds and will also play automatically, depending on a viewer’s personal YouTube settings.

Beyond these studies, the conclusion of the article itself suggests that Cain’s failure to think critically, rather than YouTube’s recommendation engine, is to blame for his radicalization.

In its final paragraphs, the article describes how, through watching YouTube’s recommended videos, Cain “successfully climbed out of a right-wing YouTube rabbit hole,” then jumped into “a left-wing YouTube rabbit hole,” and ultimately conceded that “he needed to think critically about the videos he watched.”

Yet despite this evidence that contradicts the anecdotal notion that YouTube radicalizes its users, YouTube CEO Susan Wojcicki admitted that the platform changed its algorithm in response to stories like Cain’s.

Wojcicki made the comments during a recent interview on The New York Times podcast “Rabbit Hole,” which is hosted by Roose and retells Cain’s story in audio form.

On the podcast, Roose said that stories like Cain’s had led to Wojcicki acknowledging that YouTube “needed to start making some changes.”

Susan Wojcicki said YouTube had made changes in response to the mainstream media’s stories about the YouTube radicalization theory (The New York Times, The Daily Beast, NBC News)

Wojcicki also commented on Cain’s story and said that, in addition to Roose, many other people had raised concerns about how YouTube’s recommendation system works.

She then told Roose that these concerns had resulted in YouTube changing its recommendation system.

“And we have taken that seriously and I think you know we’ve made a lot of changes to how our systems work,” Wojcicki said.

While the legacy media often pressure big tech platforms to change their algorithms and rules, the CEOs of these companies rarely admit that they’re making changes in response to this pressure. In Wojcicki’s case, she usually frames such changes as part of her “responsibility efforts.”

Not only is Wojcicki’s admission rare, but it also highlights that YouTube’s willingness to respond to media pressure is so strong that it will even change its algorithms over media anecdotes and editorials that are contradicted by research data.

And when YouTube does respond to such pressure, the legacy media are often the biggest beneficiaries, seeing an increase in recommendations of their content.

Not only is YouTube changing its algorithm in response to shaky media narratives, but it’s also rewarding those narratives by giving mainstream media outlets even more views while taking views away from independent creators.

If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
