Meta’s attempt to restart AI training using Europeans’ public social media activity has drawn renewed resistance, as the privacy rights organization noyb threatens fresh legal action. The group has formally challenged Meta’s latest move to mine user data, asserting the tech giant is sidestepping EU privacy obligations and advancing without regulatory clearance.
Following a halt in June 2024 prompted by regulatory concerns, Meta announced in April it would resume training its language models. This time, it intends to use public posts and interactions from adults across the European Union and European Economic Area, including their exchanges with Meta AI.
The initial pause came after mounting pressure from the Irish Data Protection Commission and a wave of complaints submitted to authorities in various member states. According to Meta, a December opinion from the European Data Protection Board signaled that its approach satisfied legal standards.
“Last year, we delayed training our large language models using public content while regulators clarified legal requirements,” the company explained. “We welcome the opinion provided by the EDPB in December, which affirmed that our original approach met our legal obligations.”
Noyb sees things very differently. The Vienna-based privacy group has issued a cease-and-desist letter to Meta and is preparing a potential class-action lawsuit in Europe.