Privacy organization Noyb is urging 11 European regulators to clamp down on Meta’s new privacy policy. Because they offer no option to be forgotten within Meta’s systems, the policies governing Facebook, Instagram and Threads would violate European regulations.
Meta recently announced it would start using personal data for AI training on the basis of “legitimate interests”. Those who object can have their personal data removed from Meta’s systems. However, that does not undo the fact that their images and posts may already reside permanently in the training data of existing AI models.
Noyb
Privacy organization Noyb sees an urgent need to act against this measure. The group says it has filed complaints with 11 European DPAs (Data Protection Authorities).
The matter is pressing, as the new Meta policy comes into force on June 26. Noyb fears that, without pushback from regulators, all the personal data Meta/Facebook has been collecting since 2007 will become fair game for AI development. Needless to say, until recently users have been largely unaware of the implications of being included in AI training.
Schrems
Austrian privacy activist and lawyer Max Schrems is anything but lenient about the new Meta policy. “Meta is basically saying that it can use ‘any data from any source for any purpose and make it available to anyone in the world’, as long as it’s done via ‘AI technology’. This is clearly the opposite of GDPR compliance.”
Essential here, Schrems said, is that ‘AI technology’ is an extremely broad term. “Much like ‘using your data in databases’, it has no real legal limit. Meta doesn’t say what it will use the data for, so it could be anything from a simple chatbot to extremely aggressive personalised advertising or even a killer drone. Meta also says that user data can be made available to any ‘third party’ – which means anyone in the world.”