Poland’s data protection authority has alerted prosecutors after a teacher’s image was digitally altered using AI to appear nude and shared online.
AI-Generated Image Prompts Criminal Complaint
The President of the Office for Protection of Personal Data (UODO) has reported the incident to the Warsaw-Śródmieście District Prosecutor’s Office. The case involves an image of a teacher that was digitally manipulated using artificial intelligence to depict her as nude and then circulated on social media.
The teacher reported the incident, stating, “My photo was altered (UNDRESSED) using AI by some pervert and then posted online. I feel awful, I feel hurt and even telling myself that it wasn’t my body doesn’t help much.”
Legal Framework and Penalties
UODO believes the incident violates Article 107 of the Personal Data Protection Act. This article stipulates a fine, restriction of freedom, or imprisonment of up to two years for anyone processing personal data unlawfully or without authorization.
The President of UODO emphasized that this constitutes a public offense, prosecutable ex officio. The unlawful processing of data occurs when there is no legal basis for it, as defined by the GDPR (Articles 6 and 9), specifically lacking consent from the individual concerned.
Defining Deepfakes and the AI Act
Poland currently lacks a specific legal definition for this type of manipulation, necessitating reference to EU legislation. The definition of deepfakes is outlined in Article 3(60) of the Artificial Intelligence Act (AI Act): “images, audio or video content generated or manipulated by AI that resemble existing persons, objects, places, entities or events, which the recipient could wrongly believe to be authentic or true.”
Non-Consensual Synthetic Intimate Imagery (NSII)
The manipulation of the victim’s image occurred without her consent, utilizing publicly available photos and AI tools. This falls under the category of deepfakes, specifically identified as non-consensual synthetic intimate imagery (NSII).
Research indicates that NSII is the dominant form of deepfake abuse, comprising over 90% of cases, and overwhelmingly targets women, leading to widespread victimization and reinforcing gender stereotypes.
European Parliament Moves to Ban “Nudifier” Systems
The European Parliament recently adopted an amendment to the Digital Omnibus on AI, introducing into the AI Act a ban on “nudifier” systems – AI tools used to digitally undress people.
The ban will apply to systems that use AI to create or modify sexually explicit or intimate images of identifiable individuals without their consent.