- cross-posted to:
- france@jlai.lu
- feminism
Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
aard ( @aard@kyu.de ) English162•10 months agoThis was just a matter of time - and there isn’t really much that those affected can do (and in some cases, should do). Shutting down that service is the correct thing - but that’ll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and hardware to do so are within reach of the age group in question.
So in the long term we’ll see that shift to images generated at home, by kids often too young to be prosecuted - and you won’t be able to stop that unless you start outlawing most of AI image generation tools.
At least in Germany, the handling of child/youth pornography was badly botched by incompetent populists in the government - the current law would send any of those parents to jail for at least a year if they took possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get a prosecution against them started.
There’s one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying “they’re AI generated” is becoming a plausible way out.
Cethin ( @Cethin@lemmy.zip ) English10•9 months agoYeah, what I see happening is people end up not caring as much because there’s going to be so much plausible AI generated crap that any real stuff will be lost in the noise.
Turun ( @Turun@feddit.de ) English4•9 months agoSource for the law you mentioned, please. I want to read it in detail.
aard ( @aard@kyu.de ) English7•9 months agoStart with the relatively recent case here; from there you should have enough info to search for yourself what has happened over the last few years - this is exactly what was warned about back then, but anyone who approaches the hysterical lunatics, who want to hit everything remotely connected to “teenagers discovering sexuality” with criminal law, with well-reasoned arguments is immediately branded a paedophile himself.
ciko22i3 ( @ciko22i3@sopuli.xyz ) English79•10 months agoAt least now you can claim it’s AI if your real nudes leak
taladar ( @taladar@feddit.de ) English56•10 months agoIn the long term that might even lead to society stopping their freak-outs every time someone in some semi-sensitive position is discovered to have nude pictures online.
SkyeStarfall ( @SkyeStarfall@lemmy.blahaj.zone ) English19•9 months agoI hope so. We shouldn’t be ashamed of our bodies or sexuality.
rufus ( @rufus@discuss.tchncs.de ) English50•9 months agoInteresting. Replika AI, ChatGPT etc. crack down on me for writing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…
I wonder why they have no address etc. on their website and the app isn’t available in any of the proper app stores.
Obviously police should ask Instagram who is blackmailing all these girls… Teach them a proper lesson. And then stop this company. Have them fined a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.
crispy_kilt ( @crispy_kilt@feddit.de ) English3•9 months agoFined? Fuck that. CP must result in jail time.
rufus ( @rufus@discuss.tchncs.de ) English2•9 months agoI just hope they even try to catch these people. I’ve tried to look up who’s behind that and it’s a domain that’s with name.com and the server is behind Cloudflare. I’m not Anonymous, so that’s the point at which I’m at my wits’ end. Someone enraged could file a few reports at their abuse contacts… Just sayin…
There’s always the possibility they just catch the boy and punish him, letting the even more disgusting people in the background keep doing what they want, because it would be difficult to get a hold of them. This would be the easiest route for the prosecutors and the least efficient way to deal with this issue as a whole.
/home/pineapplelover ( @pineapplelover@lemm.ee ) English2•9 months agoPrison at the very least and all the inmates need to know that you engaged in CP.
rayyyy ( @rayyyy@kbin.social ) 36•10 months agoThe shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop x 10.
DessertStorms ( @DessertStorms@kbin.social ) 41•10 months agoThis isn’t about nude photos, it’s about consent.
andrai ( @andrai@feddit.de ) English43•10 months agoI can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.
DessertStorms ( @DessertStorms@kbin.social ) 12•9 months agoYou’re not making the point you think you are, instead you’re just outing yourself as a creep. ¯_(ツ)_/¯
andrai ( @andrai@feddit.de ) English19•9 months agoHey, you dropped this \
¯\_(ツ)_/¯
ParsnipWitch ( @ParsnipWitch@feddit.de ) English4•9 months agoThe lack of empathy in your response is telling. People do not care for the effect this has on teenage girls. They don’t even try to be compassionate. I think this will just become the next thing girls and women will simply have to accept as part of their life and the sexism and objectification that is targeted at them. But “boys will be boys” right?
taladar ( @taladar@feddit.de ) English29•9 months agoPhotoshopped nude pictures of celebrities (and people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned, just the ease of doing it changed a bit.
SharkEatingBreakfast ( @SharkEatingBreakfast@sopuli.xyz ) English7•9 months agoThe article is about children.
devils_advocate ( @devils_advocate@lemmy.ml ) English4•9 months agoThe age of the victims is not really relevant. The problem would remain if the article were about adults.
SharkEatingBreakfast ( @SharkEatingBreakfast@sopuli.xyz ) English3•9 months agoThe problem is very different here because they are children.
devils_advocate ( @devils_advocate@lemmy.ml ) English2•9 months agoVery different to what? AI identity theft is what creates the victims, independent of age (or clothing).
Abaixo de Cão ( @AbaixoDeCao@lemm.ee ) English29•9 months agoThat’s really, really sad, EU, please try to regulate AI.
Sigmatics ( @Sigmatics@lemmy.ca ) English22•10 months agoThe only thing new about this is that the photos are probably more realistic, but still fake. Apps to do this existed before GenAI was a thing
Margot Robbie ( @MargotRobbie@lemm.ee ) English21•9 months agoBanning diffusion models doesn’t work; the tech is already out there and you can’t put it back in the box. Fake nudes used to be done with Photoshop, and the current generative AI models only make them faster to create.
This can only be stopped on the distribution side, and any new laws should focus on that.
But the silver lining of this whole thing is that nude scandals for celebs aren’t really possible any more if you can just say it’s probably a deepfake.
GCostanzaStepOnMe ( @GCostanzaStepOnMe@feddit.de ) English5•9 months agoOther than banning those websites and apps that offer such services, I think we also need to seriously rethink our overall exposure to the internet, and especially rethink how and how much children access it.
PolarisFx ( @PolarisFx@lemmy.dbzer0.com ) English1•9 months agoYea, with 15 good headshots from different angles I can build a LoRA for anybody; hell, Civit is full of celebrity LoRAs.
Mage.space already had to switch to SFW because people were generating CP. The past couple of weeks I’ve been playing with Stable Diffusion, and some of the checkpoints easily generate content that I had to delete because the results looked REALLY young and it creeped me out.
YurkshireLad ( @YurkshireLad@lemmy.ca ) English20•10 months agoMaybe something will change as soon as people start creating and distributing fake AI nudes of that country’s leaders.
Risk ( @Risk@feddit.uk ) English16•10 months agoHonestly surprised this didn’t happen first.
Be a great way to discredit politicians in homophobic states, by showing a politician taking it up the arse.
Sabata11792 ( @Sabata11792@kbin.social ) 11•10 months agoIt’s already happened, and there is not enough bleach in the world to unsee it.
tetraodon ( @tetraodon@feddit.it ) English20•9 months agoI feel somewhat bad saying this, but the wo/man (it will be a man) who can make an Apple Vision Pro work with AI nudifiers will become rich.
TheGreenGolem ( @TheGreenGolem@lemm.ee ) English11•9 months agoYou know the old joke: if we could do anything with just our eyes, the streets would be full of dead people and pregnant women.
helixdaunting ( @helixdaunting@lemm.ee ) English5•9 months agoI’ve never heard that joke before, but that’s brilliant.
uxia ( @uxia@midwest.social ) English8•9 months agoLol then people will probably start assuming anyone wearing that technology is a pedophile and/or disgusting creep.
GCostanzaStepOnMe ( @GCostanzaStepOnMe@feddit.de ) English2•9 months agoAs they should
Skates ( @Skates@feddit.nl ) English4•9 months ago(it will be a man)
I don’t even know whether to upvote or downvote your comment because I can’t figure out if you’re trying to say that only a man would do something like this, or that no woman is technically proficient enough to do this.
Have a downvote for the ambiguity.
tetraodon ( @tetraodon@feddit.it ) English14•9 months agoJesus Christ. I feel sorry for you.
duxbellorum ( @duxbellorum@lemm.ee ) English15•10 months agoThis seems like a pretty significant overreaction. Like yes, it’s gross and it feels personal, but it’s not like any of the subjects were willing participants… their reputation is not being damaged. Would they lose their shit about a kid gluing a cutout of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?
These are schoolgirls in their teenage years. To them and their parents, this must be a nightmare.
duxbellorum ( @duxbellorum@lemm.ee ) English4•9 months agoWhy? They didn’t take or share any nudes, and nobody believes they did.
This is only a nightmare if an ignorant adult tells them that it is.
Why? They didn’t take or share any nudes, and nobody believes they did.
This is only a nightmare if an ignorant adult tells them that it is.
So you don’t have children, right?
ParsnipWitch ( @ParsnipWitch@feddit.de ) English4•9 months agoDid your picture get taken and shared when you were a teenager? Were you heavily sexualised and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should feel like a nightmare.
Take your “sexual harassment is only bad to teenage girls if you tell them” shit elsewhere.
RagnarokOnline ( @RagnarokOnline@reddthat.com ) English17•10 months agoI don’t want to bandwagon against you, but I do think it’s important that people who agree with your viewpoint have a chance to understand that the situation is a violation of privacy.
The kids’ reputation is, likely, damaged. You have an underage girl who is already dealing with the confusion and hierarchy of high school. Then (A) someone generates semi-accurate photos of what their naked body looks like and (B) distributes it to others.
Issue (A) is bad because it’s essentially CSAM and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have access to. This is a privacy violation and the ethics around it are questionable at best.
Issue (B) is that the generator didn’t stop at the violations of issue (A), but has now shared that material with other people who know the subject without the subject’s consent, and likely without her knowledge of the recipients. This means that the subject now has to perpetually wonder if every person they interact with (friends, teachers, other parents, her own parents) have seen lewd pictures of her. Hopefully you can see how this could disturb a young woman.
Now apply a different situation to it. Suppose you took a test at school or at work that shows you as dumb (like, laughably dumb; enough to make you feel self-conscious). Even if you don’t think it’s a fair test, this test exists. Now assume that someone shared this test with your friends, co-workers, and even your parents without you knowing exactly who received it. And instead of everyone saying “it’s just a dumb test, it doesn’t mean anything”, they decide it means something about you. Every hour or so, you walk by someone or interact with someone who chuckles or cracks a joke at your expense. You’re not allowed by your community to move on from this test.
Before your test was released, you could blend in. Now, you’re the person everyone is looking at and judging. Think of that added anxiety on top of everything else you have to deal with.
duxbellorum ( @duxbellorum@lemm.ee ) English4•9 months agoI appreciate your intentions, but your examples are just not up to the standard needed to treat AI generated nudes any differently than a nude magazine collage with kids’ crushes faces in it.
As uncanny as the nudes might be, they are NOT accurate. People know this, and they are going to learn one way or another to adjust their definition of “real”. No character details like moles, their actual skin tone, or anything like that will be accurately portrayed. They have no reason to think “someone has seen my naked body”. Yeah, if someone tells them to worry about it, they will, as any young person will, but why? The bigger the deal we make of it, the worse it is. The litmus test is: if we decide to ignore it and teach kids that AI-generated nudes have nothing to do with them and can safely be ignored, then they do basically zero harm.
How is your test example related to this at all? In the one case, my face and a clothed picture of me are acquired, likely with my implied permission, from social media; modifications that I did not authorize are added to it; and it is then distributed, making me look naked while having no bearing on my person or character (since the AI doesn’t actually know what I look like naked), so no conclusion anyone would draw from it constitutes a disclosure of information about me. The test example constitutes a clear disclosure with provenance to establish the validity of the information, quite a different scenario. It is true that AI chatbots can be jailbroken to release my previous questions, which might reveal things about my character that I do not wish to disclose, but that is a different issue and unrelated to these nude generators.
I’m not saying handing these nudes to a kid or blackmailing them is not criminal or harassment, just that the technology and medium should have almost no bearing on how we treat this.
RagnarokOnline ( @RagnarokOnline@reddthat.com ) English1•9 months agoBuddy, I want to let you know that I wrote a big rebuttal and then accidentally canceled my comment and it got erased. In my response I disagreed with your original argument and your rebuttal as well, but that I respected the time it took to share your thoughts. I’m so sad my dumb comment got deleted, lol
Know that I appreciate your lengthy response back to me.
Be well.
lambalicious ( @lambalicious@lemmy.sdf.org ) English3•9 months agoIssue (A) is bad because it’s essentially CSAM and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have access to. This is a privacy violation and the ethics around it are questionable at best.
That part is not a privacy violation, in the same way that someone drawing on a canvas their own impression of what a bank vault looks like on the inside does not constitute trespassing / a violation of the bank’s privacy. Unless the AI in question used actual nudes of them as a basis, but then we wouldn’t need the extra AI step for this to be a problem, right? Otherwise, I’m rather sure that the actual privacy violation starts at (B).
Ofc, none of that makes it less of a problem, but it does feel to me like it subverts a potential angle for fighting against this.
RagnarokOnline ( @RagnarokOnline@reddthat.com ) English2•9 months agoI appreciate your input and am thankful for your perspective, mate.
LordXenu ( @LordXenu@lemm.ee ) English11•10 months agoBruh, all of this sounds creepy as shit.
iByteABit [he/him] ( @iByteABit@lemm.ee ) English12•10 months agoGovernments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.
AI is way too dangerous a tool to allow free innovation and market on, it’s the number one technology right now that must be heavily regulated.
Blapoo ( @Blapoo@lemmy.ml ) English20•10 months agoWhat, exactly would they regulate? The training data? The output? What kinds of user inputs are accepted?
All of this is hackable.
pseudorandom ( @pseudorandom@kbin.social ) 19•10 months agoIt’s child porn in this case. Regulate it as such. Putting a real child’s head onto an AI generated body is sexualizing a child.
FUCKRedditMods ( @FUCKRedditMods@lemm.ee ) English6•10 months agoThat’s not what he’s saying, he’s asking what grounds and mechanism they have for regulating the platform itself.
RaivoKulli ( @RaivoKulli@sopuli.xyz ) English12•10 months agoMaking unauthorized nude images of other people, probably. The service did advertise, “undress anyone”.
jet ( @jet@hackertalks.com ) English9•10 months agoThe Philosophical question becomes, if it’s AI generated is it really a photo of them?
Let’s take it to an extreme: if you cut the face off somebody’s Polaroid and then paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person in the Polaroid?
It’s a debate that could go either way, and I’m sure we will have an exciting legal landscape with different rules in different countries.
ReversalHatchery ( @ReversalHatchery@beehaw.org ) English8•10 months agoThe Philosophical question becomes, if it’s AI generated is it really a photo of them?
That does not matter, as people can’t tell the difference, even if they wanted to.
It is a photo of them if you can recognize them on it, especially their face.
jet ( @jet@hackertalks.com ) English7•10 months agoWhat if there’s somebody who looks very similar to somebody else? Are they prevented from using their likeness in film and media?
Could an identical twin sister be forbidden from going into porn, to prevent her from besmirching the good image of her other twin sister who’s a teacher?
ReversalHatchery ( @ReversalHatchery@beehaw.org ) English4•9 months agoA lookalike doesn’t look similar intentionally. But editing images is done very much intentionally.
taladar ( @taladar@feddit.de ) English8•9 months agoI suppose you could make a Ship of Theseus like argument there too. At what point does it matter where the parts of the picture came from. Most would probably be okay with their hairstyle being added to someone else’s picture, what about their eyes, their mouth,… Where exactly is the line?
jet ( @jet@hackertalks.com ) English6•9 months agoExactly. A bunch of litigators are going to get very rich debating this.
RagnarokOnline ( @RagnarokOnline@reddthat.com ) English7•10 months agoI think it comes down to the identity of the person whose head is on the body. For instance, if the eyes had a black bar covering them or if the face was blurred out, would it be as much an invasion of privacy?
However, if the face was censored, the photo wouldn’t have the same appeal to the person who generated it. That’s the issue here.
A cutout of a person’s head on a porn star’s picture still has a sense of falsehood to it. An AI generated image that’s likely similar to the subject’s body type removes a lot of the falsehood, and thus makes the image have more power. Without the subject’s consent, this power is harmful.
You’re right about the legal battles, though. I just feel bad for the people who will have their dignity compromised in the mean time. Everyone should be entitled to dignity.
RaivoKulli ( @RaivoKulli@sopuli.xyz ) English5•10 months agoIn this sort of situation the conclusion would be easy, or in cases where we have the input photo. But absolutely, it could get iffy.
barsoap ( @barsoap@lemm.ee ) English4•10 months agoObjectively, it’s absolutely not: AIs don’t have X-ray eyes. The best they could do is infer rough body shape from a clothed example, but anything beyond that is pure guesswork. The average 14-year-old is bound to be much better at undressing people with their eyes than an AI could ever be.
Subjectively, though, of course yes it is. You’re not imagining the cutie two desks over nude because it isn’t them.
ParsnipWitch ( @ParsnipWitch@feddit.de ) English2•9 months agoHow about we teach people some baseline of respect towards other people? Punishing behaviour like that can help show that it’s not okay to treat other people like pieces of meat.
WarmSoda ( @WarmSoda@lemm.ee ) English4•10 months agoI’m pretty sure nude pictures of minors is already illegal.
RaivoKulli ( @RaivoKulli@sopuli.xyz ) English2•10 months agoI’m not sure if AI made ones count yet
WarmSoda ( @WarmSoda@lemm.ee ) English3•10 months agoYou go ahead and make AI generated kiddie porn and we’ll find out.
RaivoKulli ( @RaivoKulli@sopuli.xyz ) English3•9 months agoI’m fairly sure there are legal cases about it, so no need to encourage anyone to make kiddie porn…
WarmSoda ( @WarmSoda@lemm.ee ) English3•9 months agoThen wtf are you confused about? Lol
iByteABit [he/him] ( @iByteABit@lemm.ee ) English3•10 months agoSurely there will be loopholes, but there must be laws in the first place. Better something than nothing.
Risk ( @Risk@feddit.uk ) English17•10 months agoGood luck regulating cross borders.
I’d also prioritise regulating fossil fuel technology as the number one priority.
iByteABit [he/him] ( @iByteABit@lemm.ee ) English5•10 months agoFossil fuels is absolutely number one, I was talking about digital technology specifically
danhab99 ( @danhab99@programming.dev ) English9•9 months agoI tried the AI with a pic of me. It was incredibly inaccurate and gave me something between a dick and a vagina. Nothing truly damaging.
Rin ( @Rin@lemm.ee ) English5•9 months agoI’m morbidly curious
danhab99 ( @danhab99@programming.dev ) English1•9 months agoThen upload a picture of yourself. I think every account is allowed one free try
LoafyLemon ( @LoafyLemon@kbin.social ) 3•9 months agoSooo, intersex?
Aetherion ( @Aetherion@feddit.de ) English7•9 months agoBetter not stop posting your life on the internet, that would push people to create more child porn! /s
uxia ( @uxia@midwest.social ) English6•9 months agoWhy are men?
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English5•9 months agoReason #48373884 why generative AI should be banned
iegod ( @iegod@lemm.ee ) English19•9 months agoDefinitely not down with banning. You can imagine nudity in your mind and redraw it. Do we ban thoughts and artists too? The AI isn’t the problem.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English11•9 months agoNo amount of false equivalencies will make me or anyone else accept something as stupid, dangerous and terrible as generative AI.
It’s on you to accept you don’t have the right to have a robot think and be creative for you, and that poor girl is one of many reasons why.
iegod ( @iegod@lemm.ee ) English17•9 months agoYou’re riled up, I get it, but your statements are simply not factual, as much as you want them to be.
1984 ( @1984@lemmy.today ) English10•9 months agoI don’t know… I think in this age, you can always say any nude is AI generated, so nobody can be sure it’s a real nude.
There will come a time soon when people won’t trust what they see online because AI.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English1•9 months agoAI generated content is usually pretty obvious.
1984 ( @1984@lemmy.today ) English8•9 months agoToday sure. Tomorrow, not so sure.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English2•9 months agoFair. All the more reason to ban it.
1984 ( @1984@lemmy.today ) English2•9 months agoIn theory yeah, but we live in a world where companies will do whatever they want, and any punishment is just “cost of doing business”.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English3•9 months agoUnless we work to fix it, it will always be that way.
LoafyLemon ( @LoafyLemon@kbin.social ) 1•9 months agoYou cannot put the genie back in the lamp.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English4•9 months agoYou’re not a genie, you’re a lazy reprobate.
LoafyLemon ( @LoafyLemon@kbin.social ) 2•9 months agoIt’s impossible to ban AI once it’s allowed for public use because technology spreads rapidly, and enforcing a ban becomes impractical due to its widespread adoption and the difficulty of regulating it effectively. But hey, if you want to make an ineffective ban that will only affect one small part of the world, irrelevant to the masses, be my guest.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English4•9 months agoNo it isn’t. We can and have banned awful, terrible shit that became widespread before, and we’ll do it again, including your precious AI that you’re dumb enough to let do your thinking for you. We’ll even jail you for using the things.
That’s what laws are for and if we believe what you’re saying, then no law can exist.
LoafyLemon ( @LoafyLemon@kbin.social ) 1•9 months agoConsidering I’m not even a US resident, your government and laws cannot touch me, that’s how irrelevant your knee-jerk reactions are.
Do you think China, India, or even members of the EU will stop developing AI because one country said so? Your expectations are highly unrealistic.
pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English6•9 months agoOther countries can ban you, too.
And I do think the EU is more likely even than us to ban you, or at least heavily regulate you.
You’re living in a dream world if you think you can steal everyone else’s artwork en masse, use it to generate art for you and think you can get away with it. It’s going to happen. You’re going to get banned.
LoafyLemon ( @LoafyLemon@kbin.social ) 1•9 months agoI’m fine making art on my own, without AI, but thanks for your concern.