- cross-posted to:
- france@jlai.lu
- feminism
Police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the application that is presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
- aard ( @aard@kyu.de ) English162•1 year ago
This was just a matter of time - and there isn’t really much those affected can do (and in some cases, should do). Shutting down that service is the correct thing - but that’ll only buy a short amount of time: Training custom models is trivial nowadays, and both the skill and hardware to do so are in reach of the age group in question.
So in the long term we’ll see that shift to images generated at home, by kids often too young to be prosecuted - and you won’t be able to stop that unless you start outlawing most of AI image generation tools.
At least in Germany, the handling of child/youth pornography was badly botched by incompetent populists in the government - the law would now send any of those parents to jail for at least a year if they take possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get prosecution against them started.
There’s one blessing coming out of that mess, though: For girls who did take pictures, and had them leaked, saying “they’re AI generated” is becoming a plausible way out.
- Cethin ( @Cethin@lemmy.zip ) English10•1 year ago
Yeah, what I see happening is people end up not caring as much because there’s going to be so much plausible AI generated crap that any real stuff will be lost in the noise.
- Turun ( @Turun@feddit.de ) English4•1 year ago
Please give a source for the law you mentioned. I want to read it in detail.
- aard ( @aard@kyu.de ) English7•1 year ago
Start with the relatively recent case here, and from there you should have enough info to search for yourself what has happened over the past few years - this is exactly what was warned about back then, but anyone who brings well-reasoned arguments to the hysterical lunatics who want to hit everything even remotely related to “teenagers discovering sexuality” with criminal law is immediately branded a paedophile himself.
- ciko22i3 ( @ciko22i3@sopuli.xyz ) English79•1 year ago
At least now you can claim it’s AI if your real nudes leak
- taladar ( @taladar@feddit.de ) English56•1 year ago
In the long term that might even lead to society stopping their freak-outs every time someone in some semi-sensitive position is discovered to have nude pictures online.
- SkyeStarfall ( @SkyeStarfall@lemmy.blahaj.zone ) English19•1 year ago
I hope so. We shouldn’t be ashamed of our bodies or sexuality.
- rufus ( @rufus@discuss.tchncs.de ) English50•1 year ago
Interesting. Replika AI, ChatGPT etc crack down on me for doing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…
I wonder why they have no address etc on their website and the app isn’t available in any of the proper app-stores.
Obviously police should ask Instagram who blackmails all these girls… Teach them a proper lesson. And then stop this company. Have them fined a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.
- crispy_kilt ( @crispy_kilt@feddit.de ) English3•1 year ago
Fined? Fuck that. CP must result in jail time.
- rufus ( @rufus@discuss.tchncs.de ) English2•1 year ago
I just hope they even try to catch these people. I’ve tried to look up who’s behind that and it’s a domain that’s with name.com and the server is behind Cloudflare. I’m not Anonymous, so that’s the point at which I’m at my wits’ end. Someone enraged could file a few reports at their abuse contacts… Just sayin…
There’s always the possibility they just catch the boy and punish only him, letting the even more disgusting people in the background keep doing what they want because it would be difficult to get hold of them. That would be the easiest route for the prosecutors and the least effective way to deal with this issue as a whole.
- /home/pineapplelover ( @pineapplelover@lemm.ee ) English2•1 year ago
Prison at the very least and all the inmates need to know that you engaged in CP.
- rayyyy ( @rayyyy@kbin.social ) 36•1 year ago
The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt do AI generated pictures or movies of themselves to sell on the internet. Think of it as photoshop X 10.
- DessertStorms ( @DessertStorms@kbin.social ) 41•1 year ago
This isn’t about nude photos, it’s about consent.
- andrai ( @andrai@feddit.de ) English43•1 year ago
I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.
- DessertStorms ( @DessertStorms@kbin.social ) 12•1 year ago
You’re not making the point you think you are, instead you’re just outing yourself as a creep. ¯_(ツ)_/¯
- andrai ( @andrai@feddit.de ) English19•1 year ago
Hey, you dropped this \
¯\_(ツ)_/¯
- ParsnipWitch ( @ParsnipWitch@feddit.de ) English4•1 year ago
The lack of empathy in your response is telling. People do not care for the effect this has on teenage girls. They don’t even try to be compassionate. I think this will just become the next thing girls and women will simply have to accept as part of their life and the sexism and objectification that is targeted at them. But “boys will be boys” right?
- taladar ( @taladar@feddit.de ) English29•1 year ago
Photoshopped nude pictures of celebrities (and people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned, just the ease of doing it changed a bit.
- SharkEatingBreakfast ( @SharkEatingBreakfast@sopuli.xyz ) English7•1 year ago
The article is about children.
- devils_advocate ( @devils_advocate@lemmy.ml ) English4•1 year ago
The age of the victims is not really relevant. The problem would remain if the article were about adults.
- SharkEatingBreakfast ( @SharkEatingBreakfast@sopuli.xyz ) English3•1 year ago
The problem is very different here because they are children.
- devils_advocate ( @devils_advocate@lemmy.ml ) English2•1 year ago
Very different to what? AI identity theft is what creates the victims, independent of age (or clothing).
- Abaixo de Cão ( @AbaixoDeCao@lemm.ee ) English29•1 year ago
That’s really, really sad, EU, please try to regulate AI.
- Sigmatics ( @Sigmatics@lemmy.ca ) English22•1 year ago
The only thing new about this is that the photos are probably more realistic, but still fake. Apps to do this existed before GenAI was a thing
- Margot Robbie ( @MargotRobbie@lemm.ee ) English21•1 year ago
Banning diffusion models doesn’t work, the tech is already out there and you can’t put it back in the box. Fake nudes used to be done with PhotoShop; the current generative AI models only make them faster to produce.
This can only be stopped on the distribution side, and any new laws should focus on that.
But the silver lining of this whole thing is that nude scandals for celebs aren’t really possible any more if you can just say it’s probably a deepfake.
- GCostanzaStepOnMe ( @GCostanzaStepOnMe@feddit.de ) English5•1 year ago
Other than banning those websites and apps that offer such services, I think we also need to seriously rethink our overall exposure to the internet, and especially rethink how and how much children access it.
- PolarisFx ( @PolarisFx@lemmy.dbzer0.com ) English1•1 year ago
Yea, with 15 good headshots from different angles I can build a LoRA for anybody; hell, Civit is full of celebrity LoRAs.
Mage.space already had to switch to SFW because people were generating CP. The past couple weeks I’ve been playing with stable diffusion and some of the checkpoints easily generate content that I had to delete because they looked REALLY young and it creeped me out.
- tetraodon ( @tetraodon@feddit.it ) English20•1 year ago
I feel somewhat bad saying this, but the wo/man (it will be a man) who can make an Apple Vision Pro work with AI nudifiers will become rich.
- TheGreenGolem ( @TheGreenGolem@lemm.ee ) English11•1 year ago
You know the old joke: if we could do anything with just our eyes, the streets would be full of dead people and pregnant women.
- helixdaunting ( @helixdaunting@lemm.ee ) English5•1 year ago
I’ve never heard that joke before, but that’s brilliant.
- uxia ( @uxia@midwest.social ) English8•1 year ago
Lol then people will probably start assuming anyone wearing that technology is a pedophile and/or disgusting creep.
- GCostanzaStepOnMe ( @GCostanzaStepOnMe@feddit.de ) English2•1 year ago
As they should
- Skates ( @Skates@feddit.nl ) English4•1 year ago
(it will be a man)
I don’t even know whether to upvote or downvote your comment because I can’t figure out if you’re trying to say that only a man would do something like this, or that no woman is technically proficient enough to do this.
Have a downvote for the ambiguity.
- tetraodon ( @tetraodon@feddit.it ) English14•1 year ago
Jesus Christ. I feel sorry for you.
- YurkshireLad ( @YurkshireLad@lemmy.ca ) English20•1 year ago
Maybe something will change as soon as people start creating and distributing fake AI nudes of that country’s leaders.
- Risk ( @Risk@feddit.uk ) English16•1 year ago
Honestly surprised this didn’t happen first.
Be a great way to discredit politicians in homophobic states, by showing a politician taking it up the arse.
- Sabata11792 ( @Sabata11792@kbin.social ) 11•1 year ago
It’s already happened, and there is not enough bleach in the world to unsee it.
- duxbellorum ( @duxbellorum@lemm.ee ) English15•1 year ago
This seems like a pretty significant overreaction. Like yes, it’s gross and it feels personal, but it’s not like any of the subjects were willing participants…their reputation is not being damaged. Would they lose their shit about a kid gluing a cut out of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?
These are schoolgirls in their teenage years. To them and their parents, this must be a nightmare.
- duxbellorum ( @duxbellorum@lemm.ee ) English4•1 year ago
Why? They didn’t take or share any nudes, and nobody believes they did.
This is only a nightmare if an ignorant adult tells them that it is.
Why? They didn’t take or share any nudes, and nobody believes they did.
This is only a nightmare if an ignorant adult tells them that it is.
So you don’t have children, right?
- ParsnipWitch ( @ParsnipWitch@feddit.de ) English4•1 year ago
Did your picture get taken and shared when you were a teenager? Were you heavily sexualised and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should feel like a nightmare.
Take your “sexual harassment is only bad to teenage girls if you tell them” shit elsewhere.
- RagnarokOnline ( @RagnarokOnline@reddthat.com ) English17•1 year ago
I don’t want to bandwagon against you, but I do think it’s important that people who agree with your viewpoint have a chance to understand that the situation is a violation of privacy.
The kids’ reputation is, likely, damaged. You have an underage girl who is already dealing with the confusion and hierarchy of high school. Then (A) someone generates semi-accurate photos of what their naked body looks like and (B) distributes it to others.
Issue (A) is bad because it’s essentially CSAM and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have access to. This is a privacy violation and the ethics around it are questionable at best.
Issue (B) is that the generator didn’t stop at the violations of issue (A), but has now shared that material with other people who know the subject without the subject’s consent, and likely without her knowledge of the recipients. This means that the subject now has to perpetually wonder if every person they interact with (friends, teachers, other parents, her own parents) have seen lewd pictures of her. Hopefully you can see how this could disturb a young woman.
Now apply a different situation to it. Suppose you took a test at school or at work that shows you as dumb (like, laughably dumb; enough to make you feel self-conscious). Even if you don’t think it’s a fair test, this test exists. Now, assume that someone shared this test with your friends, co-workers, and even your parents without you knowing exactly who received it. And instead of everyone saying “it’s just a dumb test — it doesn’t mean anything”, they decide it means something about you. Every hour or so, you walk by someone or interact with someone who chuckles or cracks a joke at your expense. You’re not allowed by your community to move on from this test.
Before your test was released, you could blend in. Now, you’re the person everyone is looking at and judging. Think of that added anxiety on top of everything else you have to deal with.
- duxbellorum ( @duxbellorum@lemm.ee ) English4•1 year ago
I appreciate your intentions, but your examples are just not up to the standard needed to treat AI generated nudes any differently than a nude magazine collage with kids’ crushes faces in it.
As uncanny as the nudes might be, they are NOT accurate. People know this, and they are going to learn one way or another to adjust their definition of “real”. No characteristic details like moles, or their actual skin tone, or anything like that will be accurately portrayed. They have no reason to think “someone has seen their naked body”. Yeah, if someone tells them to worry about it, they will, as any young person would, but why? The bigger the deal we make of it, the worse it is. The litmus test is this: if we decide to ignore it and teach kids that AI generated nudes have nothing to do with them and can be safely ignored, then they do basically zero harm.
How is your test example related to this at all? In the one case, my face and a clothed picture of me are acquired, likely with my implied permission, from social media; modifications that I did not authorize are added to it; and it is then distributed, making me look naked while having no bearing on my person or character (since the AI doesn’t actually know what I look like naked), so no conclusion anyone would draw from it constitutes a disclosure of information about me. The test example constitutes a clear disclosure with provenance to establish the validity of the information, quite a different scenario. It is true that AI chatbots can be jailbroken to reveal my previous questions, which might reveal things about my character that I do not wish to disclose, but that is a different issue and unrelated to these nude generators.
I’m not saying handing these nudes to a kid or blackmailing them is not criminal or harassment, just that the technology and medium should have almost no bearing on how we treat this.
- RagnarokOnline ( @RagnarokOnline@reddthat.com ) English1•1 year ago
Buddy, I want to let you know that I wrote a big rebuttal and then accidentally canceled my comment and it got erased. In my response I disagreed with your original argument and your rebuttal as well, but that I respected the time it took to share your thoughts. I’m so sad my dumb comment got deleted, lol
Know that I appreciate your lengthy response back to me.
Be well.
- lambalicious ( @lambalicious@lemmy.sdf.org ) English3•1 year ago
Issue (A) is bad because it’s essentially CSAM and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have access to. This is a privacy violation and the ethics around it are questionable at best.
That part is not a privacy violation, the same way someone drawing on a canvas their own impression of what a bank vault looks like on the inside does not constitute trespassing / a violation of the bank’s privacy. Unless the AI in question used actual nudes of them as a basis, but then we wouldn’t need the extra AI step for this to be a problem, right? Otherwise, I’m rather sure that the actual privacy violation starts at (B).
Ofc, none of that makes it less of a problem, but it does feel to me like it subverts a potential angle for fighting against this.
- RagnarokOnline ( @RagnarokOnline@reddthat.com ) English2•1 year ago
I appreciate your input and am thankful for your perspective, mate.
- LordXenu ( @LordXenu@lemm.ee ) English11•1 year ago
Bruh, all of this sounds creepy as shit.
- iByteABit [he/him] ( @iByteABit@lemm.ee ) English12•1 year ago
Governments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.
AI is way too dangerous a tool to allow free innovation and market on, it’s the number one technology right now that must be heavily regulated.
- Blapoo ( @Blapoo@lemmy.ml ) English20•1 year ago
What, exactly would they regulate? The training data? The output? What kinds of user inputs are accepted?
All of this is hackable.
- pseudorandom ( @pseudorandom@kbin.social ) 19•1 year ago
It’s child porn in this case. Regulate it as such. Putting a real child’s head onto an AI generated body is sexualizing a child.
- FUCKRedditMods ( @FUCKRedditMods@lemm.ee ) English6•1 year ago
That’s not what he’s saying, he’s asking what grounds and mechanism they have for regulating the platform itself.
- RaivoKulli ( @RaivoKulli@sopuli.xyz ) English12•1 year ago
Making unauthorized nude images of other people, probably. The service did advertise, “undress anyone”.
- WarmSoda ( @WarmSoda@lemm.ee ) English4•1 year ago
I’m pretty sure nude pictures of minors are already illegal.
- RaivoKulli ( @RaivoKulli@sopuli.xyz ) English2•1 year ago
I’m not sure if AI made ones count yet
- WarmSoda ( @WarmSoda@lemm.ee ) English3•1 year ago
You go ahead and make AI generated kiddie porn and we’ll find out.
- RaivoKulli ( @RaivoKulli@sopuli.xyz ) English3•1 year ago
I’m fairly sure there are legal cases about it, so no need to encourage anyone to make kiddie porn…
- WarmSoda ( @WarmSoda@lemm.ee ) English3•1 year ago
Then wtf are you confused about? Lol
- iByteABit [he/him] ( @iByteABit@lemm.ee ) English3•1 year ago
Surely there will be loopholes, but there must be laws there in the first place. Better something than nothing.
- Risk ( @Risk@feddit.uk ) English17•1 year ago
Good luck regulating cross borders.
I’d also prioritise regulating fossil fuel technology as the number one priority.
- iByteABit [he/him] ( @iByteABit@lemm.ee ) English5•1 year ago
Fossil fuels are absolutely number one; I was talking about digital technology specifically.
- danhab99 ( @danhab99@programming.dev ) English9•1 year ago
I tried the AI with a pic of me. It was incredibly inaccurate and gave me something between a dick and a vagina. Nothing truly damaging.
- Rin ( @Rin@lemm.ee ) English5•1 year ago
I’m morbidly curious
- danhab99 ( @danhab99@programming.dev ) English1•1 year ago
Then upload a picture of yourself. I think every account is allowed one free try
- LoafyLemon ( @LoafyLemon@kbin.social ) 3•1 year ago
Sooo, intersex?
- Aetherion ( @Aetherion@feddit.de ) English7•1 year ago
Better not stop posting your life on the internet, that would push people to create more child porn! /s
- uxia ( @uxia@midwest.social ) English6•1 year ago
Why are men?
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English5•1 year ago
Reason #48373884 why generative AI should be banned
- iegod ( @iegod@lemm.ee ) English19•1 year ago
Definitely not down with banning. You can imagine nudity in your mind and redraw it. Do we ban thoughts and artists too? The AI isn’t the problem.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English11•1 year ago
No amount of false equivalencies will make me or anyone else accept something as stupid, dangerous and terrible as generative AI.
It’s on you to accept you don’t have the right to have a robot think and be creative for you, and that poor girl is one of many reasons why.
- iegod ( @iegod@lemm.ee ) English17•1 year ago
You’re riled up, I get it, but your statements are simply not factual, as much as you want them to be.
- 1984 ( @1984@lemmy.today ) English10•1 year ago
I don’t know… I think in this age, you can always say any nude is AI generated, so nobody can be sure it’s a real nude.
There will come a time soon when people won’t trust what they see online because AI.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English1•1 year ago
AI generated content is usually pretty obvious.
- 1984 ( @1984@lemmy.today ) English8•1 year ago
Today sure. Tomorrow, not so sure.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English2•1 year ago
Fair. All the more reason to ban it.
- 1984 ( @1984@lemmy.today ) English2•1 year ago
In theory yeah, but we live in a world where companies will do whatever they want, and any punishment is just “cost of doing business”.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English3•1 year ago
Unless we work to fix it, it will always be that way.
- LoafyLemon ( @LoafyLemon@kbin.social ) 1•1 year ago
You cannot put the genie back in the lamp.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English4•1 year ago
You’re not a genie, you’re a lazy reprobate.
- LoafyLemon ( @LoafyLemon@kbin.social ) 2•1 year ago
It’s impossible to ban AI once it’s allowed for public use because technology spreads rapidly, and enforcing a ban becomes impractical due to its widespread adoption and the difficulty of regulating it effectively. But hey, if you want to make an ineffective ban that will only affect one small part of the world, irrelevant to the masses, be my guest.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English4•1 year ago
No it isn’t. We can and have banned awful, terrible shit that became widespread before, and we’ll do it again with your precious AI, which you’re dumb enough to let do your thinking for you. We’ll even jail you for using the things.
That’s what laws are for and if we believe what you’re saying, then no law can exist.
- LoafyLemon ( @LoafyLemon@kbin.social ) 1•1 year ago
Considering I’m not even a US resident, your government and laws cannot touch me, that’s how irrelevant your knee-jerk reactions are.
Do you think China, India, or even members of the EU will stop developing AI because one country said so? Your expectations are highly unrealistic.
- pinkdrunkenelephants ( @pinkdrunkenelephants@sopuli.xyz ) English6•1 year ago
Other countries can ban you, too.
And I do think the EU is more likely even than us to ban you, or at least heavily regulate you.
You’re living in a dream world if you think you can steal everyone else’s artwork en masse, use it to generate art for you and think you can get away with it. It’s going to happen. You’re going to get banned.
- LoafyLemon ( @LoafyLemon@kbin.social ) 1•1 year ago
I’m fine making art on my own, without AI, but thanks for your concern.