YouTube has revealed some of its latest plans to restrict content under the guise of battling misinformation.
In a blog post, the Google-owned company – notorious for some of the most shocking instances of online censorship related to the pandemic and more – outlined what it sees as the “next” steps along the same path.
And one of those steps now under consideration is disabling sharing of, and linking to, videos that YouTube decides contain “misinformation,” with the clear goal of “killing” their reach and visibility – this time, specifically outside YouTube itself.
The post is meant to be the first in a series that will allegedly provide transparency into how all of this works, but an answer to the key question – how YouTube decides what counts as “misinformation” – is unlikely to pop up at any point, at least not in any “human understandable” language, beyond self-serving corporate platitudes. One of those is referring to content that evidently does not break platform rules as “borderline” – content that, in YouTube’s words, “we won’t necessarily recommend.”
It really doesn’t get more arbitrary, or less transparent, than that.
One of the sections, “The cross-platform problem: addressing shares of misinformation,” is dedicated to this; in it, YouTube pats itself on the back for managing to suppress videos that don’t break its rules (“but we just don’t like them”) to “significantly” below one percent where recommendations are concerned.
But now an interesting “problem” occurs: how to stop those videos from being shared and recommended elsewhere? It’s interesting not only because it betrays Google/YouTube’s striving to have its tentacles all over the internet, but also as a roundabout acknowledgement that alternatives are gaining momentum, to the point where it matters to the behemoth whether or not a video it disapproves of finds an audience elsewhere.
And to stop that, YouTube would go as far as to disable the share button, “or break the link on videos that we’re already limiting in recommendations.”
The idea is so outrageous that even YouTube’s blog post pays lip service to common sense when its author notes, “but we grapple with whether preventing shares may go too far in restricting a viewer’s freedoms.”
However, with YouTube it’s best to err on the side of cynical caution, and treat posts like this less as the giant actually “grappling” with any ethical issue and more as it putting out feelers to see how the community reacts.