
New study further debunks “far-right” rabbit hole YouTube narrative

A new study helps debunk the myth of "radicalization" that's often used to call for censorship.


YouTube uses algorithms to suggest videos based on what you watch. There have been many stories claiming that YouTube’s recommendation algorithms have “radicalized” people by flooding their recommendations with content on a particular subject, especially “conspiracy theories.”

Yet these accusations have themselves been called a “conspiracy theory,” as several studies have debunked such claims.

A new study published Monday further suggests that video recommendations on YouTube do not radicalize people.

The study examined whether the alleged radicalization is merely anecdotal or represents a measurable trend. The results do not rule out the existence of radicalization through social media, but they strongly suggest it is not at all common.

One of the main challenges of such studies is getting people to report their video-watching habits knowingly and honestly. The researchers addressed this by obtaining data from Nielsen, which tracks people’s online activity and anonymizes the results. The study covered 300,000 viewers who collectively watched more than 21 million YouTube videos between 2016 and 2019.

The researchers classified channels by political leaning, from far-left to far-right. They also added an “anti-woke” category, defined as channels that focus on “opposition to progressive social justice movements.”

The number of far-right viewers did not increase, although the total hours they spent watching videos did. The number of mainstream-right viewers did increase, but their time spent watching followed much the same pattern as the far-right’s.

The anti-woke group registered the highest growth of any group. Its viewers spent more time watching than centrists (the second-largest group after the mainstream-left).

The far-left group was too small to analyze.

According to Ars Technica, the data does not show radicalization, given the “lack of significant growth at the two extremes.”

The outlet also noted that if the recommendation algorithms were effective radicalization tools, viewership of far-right and far-left videos would have increased significantly, which was not the case.

However, the researchers did conclude that far-right content was stickier: viewers spent more time on it even though the community did not grow significantly. Anti-woke content was also sticky; people who viewed a few anti-woke videos in a session were highly likely to keep watching them in future sessions.
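The article does not spell out how the study measured “stickiness,” but the underlying idea, that viewers who sample a category in one session tend to return to it in later sessions, can be made concrete. Below is a minimal, purely illustrative Python sketch assuming anonymized session logs of (viewer, session, channel category); the field names, sample data, and the specific ratio used are hypothetical, not taken from the study.

```python
# Illustrative sketch only: one plausible way to quantify "stickiness"
# from anonymized viewing sessions. All names and sample data are hypothetical.
from collections import defaultdict

# Each record: (viewer_id, session_id, channel_category)
views = [
    ("v1", 1, "anti-woke"), ("v1", 1, "centrist"),
    ("v1", 2, "anti-woke"), ("v1", 3, "anti-woke"),
    ("v2", 1, "centrist"), ("v2", 2, "mainstream-right"),
]

def stickiness(views, category):
    """Share of viewers who, having watched the category in one session,
    watch it again in at least one later session."""
    sessions_by_viewer = defaultdict(set)
    for viewer, session, cat in views:
        if cat == category:
            sessions_by_viewer[viewer].add(session)
    exposed = [v for v, s in sessions_by_viewer.items() if s]
    returned = [v for v, s in sessions_by_viewer.items() if len(s) > 1]
    return len(returned) / len(exposed) if exposed else 0.0

print(stickiness(views, "anti-woke"))  # 1.0 in this toy sample: v1 keeps coming back
print(stickiness(views, "centrist"))   # 0.0: neither viewer returns to the category
```

A higher value for one category than another would correspond to the “stickier” behavior the researchers describe, without the category necessarily gaining new viewers.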

