Following legal threats from Facebook, German-based AlgorithmWatch was forced to abandon its research project monitoring Instagram’s algorithm. This is not the first time the social media company has shut down similar projects.
The research project launched in March 2020. It relied on a browser extension that volunteers could install to collect data from their Instagram feeds. The data collected by the plugin allowed the researchers to see how Instagram prioritizes content and to better understand how the algorithm works.
AlgorithmWatch published its findings regularly. The researchers found that Instagram ranked photos with faces higher than text screenshots and that the platform promoted content showing bare skin more prominently.
Although Facebook disputed the project’s methodology, it allowed the research to run for more than a year.
In a post published this week, the researchers said that Facebook requested a meeting with the leaders of the project in May. In the meeting, Facebook accused the researchers of violating Instagram’s terms of service. The company also claimed that the project violated GDPR because it collected users’ data without their consent.
“We only collected data related to content that Facebook displayed to the volunteers who installed the add-on,” the researchers argued. “In other words, users of the plug-in were only accessing their own feed, and sharing it with us for research purposes.”
Fearing legal action from the tech giant, the researchers chose to shut down the project.
A Facebook spokesperson confirmed the meeting but denied the claim that the company had threatened legal action.
The spokesperson said the company was ready to find ways for the research to continue without compromising users’ privacy.
Facebook has a troubling pattern of shutting down research into its algorithms. In their post, the AlgorithmWatch researchers cited the NYU Ad Observatory, which was tracking political advertising on the platform before Facebook banned it a few weeks ago.
“There are probably more cases of bullying that we do not know about,” the post reads. “We hope that by coming forward, more organizations will speak up about their experiences.”
Facebook does provide ways for researchers to collect data, like the Social Science One partnership and the Ad Library. But considering its pattern of shutting down independent research, AlgorithmWatch argues Facebook’s data cannot be trusted.
“Researchers cannot rely on data provided by Facebook because the company cannot be trusted,” the researchers said. “There is no reason to believe that Facebook would provide usable data, were researchers to replace their independently collected data with the company’s.”