New klicksafe topic area
What is image-based sexualized violence?
Deepfakes of Taylor Swift caused a media stir in Germany at the end of January 2024. A case in Spain in September 2023 showed how easily young people can create deepnudes: fake nude images of schoolgirls were created and distributed there. klicksafe has also been contacted by teachers and parents whose pupils or children have been affected.
However, AI-generated nude images or porn are only part of the phenomenon of image-based sexualized violence. The non-consensual creation and dissemination of real images is also an act of sexual violence and is punishable by law. Phenomena such as revenge porn, sextortion (blackmail with nude images), secondary sexting (sharing images that were originally exchanged consensually) and upskirting or downblousing with spy cams (secret recordings, e.g. in changing rooms or bathrooms) also fall under image-based sexualized violence.
One thing is certain: sexualized violence through images is a form of abuse and is highly distressing for those affected. For many victims, it does not matter whether the images are "real" or AI-generated fakes. They are at the mercy of a situation in which they are deliberately exposed and humiliated. Once such images or videos are in circulation, they can severely damage the reputation of the person concerned or even lead to social exclusion.
In our new topic area, you can find out what image-based sexualized violence is, what role artificial intelligence plays in this context and what options there are for those affected and their supporters.