- cross-posted to:
- france@jlai.lu
- feminism
Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop x 10.
This isn’t about nude photos, it’s about consent.
I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.
You’re not making the point you think you are, instead you’re just outing yourself as a creep. ¯_(ツ)_/¯
Hey, you dropped this \
¯\_(ツ)_/¯
The lack of empathy in your response is telling. People do not care for the effect this has on teenage girls. They don’t even try to be compassionate. I think this will just become the next thing girls and women will simply have to accept as part of their life and the sexism and objectification that is targeted at them. But “boys will be boys” right?
Photoshopped nude pictures of celebrities (and of people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned; only the ease of doing it has changed.
The article is about children.
The age of the victims is not really relevant. The problem would remain if the article were about adults.
The problem is very different here because they are children.
Very different to what? AI identity theft is what creates the victims, independent of age (or clothing).