Facebook continues to walk a tightrope between the desire to continue with its highly lucrative business model and practices, and the fear of government regulation.
The social media giant has just announced changes to its systems that run against its own “core engineering,” the Washington Post reports. The reason to take this unusual direction? Facebook would like to better police content on its platform for billions of users, and shield them from “misinformation and sensational news.”
Once again, this is not content that is actually prohibited by Facebook, and therefore Facebook won't remove it – but the company doesn't like it, or is worried about being perceived as liking it. The result of this latest algorithm tweak is that this type of content will be pushed down users' news feeds and obscured from view.
This also means that Facebook is willing to lose a portion of user engagement to please the critics who constantly pressure the company to moderate the platform more strictly.
The article refers to this as “borderline content” – a phrase that has become a code of sorts, used frequently of late by both the media and the tech giants to describe censorship of material that is not banned by any terms of service but is deemed to have the potential to cause harm.
Another point the media and politicians like to repeat is that Facebook is pretty much never “doing enough” about “fake news.” This keeps the company on its toes and makes it seemingly ever more ready to dig itself deeper into the censorship hole.
Indeed, the Washington Post article is unsure whether Facebook is “doing enough”: is it now engaging in “tweaks on the margins or more fundamental fixes” that would put it on the path to recovering lost public trust?
But there are more ways than one to lose public trust. The newspaper mentions one: the spread of misinformation (whatever the yardstick may be). But there are many more: censorship, for one, and egregious violations of users' privacy, for another.
And the article focuses strongly on “fake news,” mentioning that “outside researchers” have found, essentially, that the pressure on Facebook is paying off: fewer Americans visited “fake news websites” during the 2018 congressional campaign than during the 2016 presidential election.