The Mozilla Foundation used to do one thing, and do it well: lead the development of the free and open-source Firefox browser. Sadly, that browser, which once held a huge chunk of the market and represented a revolutionary step up from Microsoft’s Internet Explorer, is falling by the wayside as Google’s Chrome has taken over.
Chrome and the giant behind it are riddled with unanswered questions and concerns about privacy and safety, while Mozilla has always touted itself as the opposite: an organization all about promoting those very values.
Why then, when Mozilla these days feels the need to “take on” a Google property, is the story not about the drawbacks of using Chrome, or about promoting the use of Firefox? Why is Mozilla instead virtue signaling by joining the “war on misinformation” and calling out Google’s YouTube?
And of all the things YouTube could be criticized for, Mozilla chooses to focus on how the platform recommends videos that it feels fall into the conspiracy theory category.
A blog post published in a section of the website that could be dubbed “mozillasplaining” covers the well-known fact that YouTube’s massive revenues come from an advertising business model that depends on ever more clicks and engagement to grow. This from an organization whose own (in truth, non-existent) “business model” depends directly on the hundreds of millions of dollars in Google money it receives each year through a search deal.
Perhaps that is why the post doesn’t go into the nature of the advertising business itself, one of the murkiest parts of the web today, or criticize the recommendation algorithm per se. Instead, it wants the algorithm tweaked in a way that would prevent “hatred and conspiracies” from surfacing in people’s YouTube app.
“Users can quickly fall prey to a domino effect, where one conspiracy video leads to another,” says the post, making bold claims such as that watching videos it deems to contain “hateful” content can lead people to radicalization. Neither term is defined at any point in the write-up.
As for solutions, Mozilla wants its audience to share stories of YouTube’s recommendation algorithm “leading them astray.”
And it would also like “regulators (to) step in and issue laws that begin to curb this.”