Facebook has had no shortage of whistleblowers in recent years, but most have been ignored, and sometimes vilified, by mainstream media and the authorities. Now, however, they have a “star” whistleblower, Frances Haugen, who seems to be telling them exactly what they want to hear.
And European countries now seem ready to use Haugen’s claims, and her testimony this week before the US Congress, as an excuse to promote more regulation that would force tech giants to produce annual risk assessments on issues such as misinformation and hate speech.
The gist of Haugen’s testimony, and the reason she leaked a number of internal Facebook documents beforehand, is the accusation that the social media giant has a harmful effect on society.
Such is the profile of this former product manager that straight after her congressional testimony she was on the phone with European Commissioner Thierry Breton, who was the one to inform the public about their conversation.
Breton, who is known for advocating far-reaching and strict new regulation of US tech giants, said Haugen “confirmed the importance and urgency of why we are pushing to rein in the big platforms.”
The leaked documents, first reported by the Wall Street Journal (some of which concerned the practice of white-listing celebrities and their content), now seem to be serving as a catalyst in the EU to speed up the adoption of new rules. Those rules aim to deal not only with the platforms’ alleged anticompetitive behavior stemming from their market dominance, but also to impose more stringent ways of policing their networks, often a euphemism for unchecked moderation and even censorship.
Reports suggest that Haugen and the EU officials drafting this legislation are having something of a meeting of minds, since a number of her ideas on how to contain Facebook align with what Brussels has been deliberating and debating for the past year.
One of those proposals, the Digital Services Act, would require transparency and disclosure of services, algorithms, and content moderation to both regulators and researchers, and, in the same breath, would “force Facebook and other tech giants to conduct annual risk assessments in areas such as the spread of misinformation and hateful content,” as the New York Times writes.