The idea that YouTube recommendations are radicalizing people to the “right wing” is a myth perpetrated as a tactic to call for censorship

We have covered before how YouTube and other tech giants are slowly working to eradicate the counterculture, which in this case, and in modern times, means conservative voices. That’s right: conservatives are now the counterculture. A recent example is YouTube’s CEO suggesting in an interview that Ben Shapiro should be banned. YouTube, however, isn’t always the main culprit here.

The New York Times published an article that could probably be summed up as a kid crying because they pooped their diaper: alternative media is growing rapidly and taking away their audience.

New York Times journalist Kevin Roose interviewed Neal Mohan, YouTube’s product chief, solely on the topic of radicalization through YouTube videos.

What does he mean by radicalization through YouTube videos? Conspiracy theories, right-wing extremists, and white supremacists.

He questions the YouTube product chief about the rabbit-hole nature of the YouTube experience, where users get recommended videos based on what they watch. He asks why Donald Trump-related videos, as well as conspiracy videos, are being recommended. He even makes the leap of arguing that if ISIS-related videos can be banned, why not right-wing extremists?

“Since the New Zealand shooting, we’ve heard this question about “Well, the platforms worked together to take down ISIS content. Why haven’t they done the same for white supremacy or violent right-wing extremism?” What’s the answer there?” Kevin Roose asks.

He pretty much exposes his root bias, and where it all comes from, right here:

“So much of what YouTube has become over the years is this kind of alternative form of media. People don’t go to YouTube because they want the same stuff they would see on TV. They go because they’ve built relationships with creators that they trust, and when Logan Paul puts out a flat-earth documentary or Shane Dawson questions whether 9/11 happened, there’s a sense that YouTube is the place where these “real” explanations are being offered, and maybe that makes this all very hard to undo.”

It’s a slippery slope to ban just one thing, as one thing can easily lead to another. That’s exactly what the New York Times wants, though: to ban viral information spreading through YouTube because it is in direct competition with their platform. Information is what the New York Times provides, and YouTube is slowly monopolizing the medium of information because, as Roose himself said, YouTube is the place where these “real” explanations are being offered.

But it’s not only the sheer bias that’s off-putting about this NYT article; it’s also that it breaks the fake news meter. It’s full of disinformation.

The main premise of the article is the deep and dark rabbit hole that supposedly exists on YouTube. Roose opens the article by calling it “one of the most powerful radicalizing instruments of the 21st century.” He then spells out what, specifically, he means by this: “I’m talking, of course, about YouTube — and, specifically, the recommendation algorithm that determines which videos the site plays after the one you’re watching.”

This is what he refers to as the rabbit hole of dark information and violence.

Obviously, YouTube recommends videos based on what you normally watch. That is in YouTube’s best interest: it generates more views and gets people to spend more time on the platform. People can get radicalized in many ways, depending on what they watch. This includes ANTIFA radicalization as well; left-wing extremists exist too, among other dangerous fringe types of content. But even this type of content contains elements of political viewpoints and speech. The line gets very blurry when you attempt to police the whole thing, unless it’s outright violent and inappropriate.

What Kevin Roose recommends banning, however, is far from outright violent. It’s mostly harmless information that he wants banned. Conspiracies and fringe so-called right-wing content producers do not deserve to be banned and have their free speech essentially taken away. Many of these people make a living off of their right to free speech, yet Kevin Roose still insists that we ban them, just because he doesn’t like it?

The real danger here, to me, is not in the fringe information but in free-speech-silencing activists like Kevin Roose. Infringing on a fundamental human right is, to me, highly dangerous. We are talking about restricting people’s voices, their fundamental need as human beings to express themselves, and potentially ruining their source of income. This is the talk of extreme authoritarianism.

So how does the YouTube algorithm actually work when it comes to recommending videos? Well, there is open-source software that tracks which videos get recommended based on the YouTube video you watch. You can essentially see the rabbit hole you will be led down based on which YouTube channels you watch.
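For illustration, here is a minimal sketch of what such a tracker does: starting from one seed video, it repeatedly collects the recommendations shown next to each video and records the resulting graph. The function get_recommendations is a hypothetical placeholder for however the data is actually gathered (scraping the watch page or calling an API); it is not part of any real tool.

```python
from collections import Counter, deque

def get_recommendations(video_id):
    """Hypothetical placeholder: return the list of video IDs YouTube
    recommends next to an anonymous (not-logged-in) viewer of video_id.
    A real tracker would scrape the watch page or call an API here."""
    raise NotImplementedError

def crawl_recommendations(seed_video_id, max_videos=200, per_video=5):
    """Breadth-first walk of the recommendation graph from one seed video.
    Returns every (source, recommended) edge observed along the way."""
    seen = {seed_video_id}
    queue = deque([seed_video_id])
    edges = []

    while queue and len(seen) < max_videos:
        current = queue.popleft()
        for rec in get_recommendations(current)[:per_video]:
            edges.append((current, rec))
            if rec not in seen:
                seen.add(rec)
                queue.append(rec)
    return edges

# Example: see which videos dominate the "rabbit hole" from one seed.
# edges = crawl_recommendations("SEED_VIDEO_ID")
# print(Counter(rec for _, rec in edges).most_common(10))
```

Counting how often each channel shows up in those recorded edges is what produces charts like the one referenced below.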

Contrary to popular belief, it’s actually much more likely for YouTube to recommend a “Left”-leaning YouTube video to people watching “Right”-leaning videos. (Source: pyt.azureedge.net)

Take a look at the chart. You will see that what you watch is what you get. Click on any of the dots, which show YouTube channels’ names, and you will see the rabbit hole of information YouTube recommends to you.

You will find it’s mostly the same type of content. Steven Crowder videos lead mostly to other Steven Crowder videos, with a high chance of similar related content like JRE, Philip DeFranco, Paul Joseph Watson, etc.

However, the lines across the political spectrum do converge, meaning you will still get left-leaning videos recommended even if you watch right-leaning videos. CNN can pop up in your feed if you watch JRE or Steven Crowder. If the issues are related, chances are these crossovers will happen. This usually occurs with policy topics that both political sides talk about.
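To put a number on that convergence, you could group the crawled recommendation edges by the rough political leaning of the source and destination channels and look at the cross-leaning shares. The channel labels and edge list below are made-up illustrations, not the dataset behind the chart.

```python
from collections import Counter

# Hypothetical labels: a rough political leaning for each channel in the crawl.
leaning = {
    "Steven Crowder": "right",
    "Paul Joseph Watson": "right",
    "CNN": "left",
    "Philip DeFranco": "center",
    "JRE": "center",
}

# Hypothetical edges: (channel of the video watched, channel of the video recommended).
edges = [
    ("Steven Crowder", "Steven Crowder"),
    ("Steven Crowder", "JRE"),
    ("Steven Crowder", "CNN"),
    ("JRE", "Philip DeFranco"),
    ("JRE", "CNN"),
]

# Count recommendations grouped by the leaning of the source and destination channels.
counts = Counter((leaning[src], leaning[dst]) for src, dst in edges)

for src_leaning in ("left", "center", "right"):
    total = sum(n for (s, _), n in counts.items() if s == src_leaning)
    if not total:
        continue
    shares = {d: round(n / total, 2) for (s, d), n in counts.items() if s == src_leaning}
    print(f"watching {src_leaning}-leaning channels -> recommended {shares}")
```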

You do not magically get led down a conspiracy-ridden rabbit hole that consumes your soul and then turns you into a mass shooter. YouTube isn’t all that bad. Most of us who use it do not have this weird problem that Kevin Roose was complaining about. It’s not really an issue worth bringing up, especially for a major news source like the NYT.

Neal Mohan, surprisingly, was pretty solid in his answers. He pretty much refuted all of the journalist’s claims and, to be perfectly blunt, flat out made Roose seem ridiculous. With answers like these, it’s hard not to see why.

“I think that even when you go to something that broad, it comes with real trade-offs. And I’m just raising the fact that there are considerations there, which is that you are then limiting political discourse to a set of preordained voices and outlets and publications. And I think that especially when it comes to something as charged and societally impactful as politics, there needs to be room for new voices to be heard.”

This completely goes against the mainstream media narrative. Neal Mohan is clearly more concerned with the user experience of his platform than with the political bias of some news publication.

The NYT trying to make this a legitimate issue by publishing articles and conducting interviews like these is a sad and pathetic attempt to selfishly garner more views for their own platform. Instead of winning the infowar by producing better-quality content, they cry endlessly about it, hoping to get the attention of some players in the industry to make it all right for them with a wave of a wand. Unfortunately for Kevin and the New York Times, this won’t be happening any time soon, because for now at least, common sense still largely prevails amongst the general public.

Kevin Roose tweeted his thoughts on the interview after the article was published, saying:

“This answer surprised me bc when I interview neo-Nazis and other extremists, I ask them how they got into the subject. Maybe 80% of the time, YouTube is involved. If there are counterexamples of people being de-radicalized by recommendations, YouTube should publicize them!”

He seems to have missed the point completely.

Number one, YouTube doesn’t feed you Nazi content unless you proactively go and search for it. No examples of this exist, and nothing in the algorithm suggests that it happens. Further evidence is provided in the research and chart above.

Number two, Neal Mohan stated that it’s not in YouTube’s interest to interfere with political speech on the platform because that ruins the user experience. Banning outright violent videos is one thing, but banning so-called radical political views is another. One is violence and the other is free speech. This quote from Neal Mohan during the interview explains it all:

“In the case of something like this, the challenges are harder because the line, as you can imagine, is sometimes blurry between what clearly might be hate speech versus what might be political speech that we might find distasteful and disagree with, but nonetheless is coming from, you know, candidates that are in elections and the like.”

The NYT article, titled “YouTube’s Product Chief on Online Radicalization and Algorithmic Rabbit Holes,” has proven to be a fake news article. “Online radicalization” and “algorithmic rabbit holes” are loaded terms; they don’t really exist and have no significant effect in the real world. The article is instead a disguised outcry from the losers of the content war. The New York Times’ last hope is, unfortunately, to try to get the plug pulled on its competitors from the back end rather than facing them head on.
