The police investigation remains open. The photo of one of the minors included a fly; that is the logo of Clothoff, the app presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”

  •  aard   ( @aard@kyu.de ) 
    link
    fedilink
    English
    162
    10 months ago

    This was just a matter of time - and there isn’t really much that those affected can do (and in some cases, should do). Shutting down that service is the correct thing to do - but that’ll only buy a short amount of time: training custom models is trivial nowadays, and both the skill and the hardware to do so are in reach of the age group in question.

    So in the long term we’ll see this shift to images generated at home, by kids often too young to be prosecuted - and you won’t be able to stop that unless you start outlawing most AI image generation tools.

    At least in Germany, the handling of child/youth pornography got badly botched by incompetent populists in the government - the law would send any of those parents to jail for at least a year if they take possession of one of those generated pictures. Having one sent to their phone and going to the police to file a complaint would be sufficient to get a prosecution against them started.

    There’s one blessing coming out of that mess, though: for girls who did take pictures and had them leaked, saying “they’re AI generated” is becoming a plausible way out.

  •  rufus   ( @rufus@discuss.tchncs.de ) 
    link
    fedilink
    English
    50
    edit-2
    9 months ago

    Interesting. Replika AI, ChatGPT etc. crack down on me for writing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…

    I wonder why they have no address etc. on their website and why the app isn’t available in any of the proper app stores.

    Obviously the police should ask Instagram who is blackmailing all these girls… Teach them a proper lesson. And then stop this company. Have them fined a few million for generating and spreading synthetic CP. At the very least, write a letter to their hosting or payment providers.

      •  rufus   ( @rufus@discuss.tchncs.de ) 
        link
        fedilink
        English
        2
        edit-2
        9 months ago

        I just hope they even try to catch these people. I’ve tried to look up who’s behind it: the domain is registered with name.com and the server is behind Cloudflare. I’m not Anonymous, so that’s the point at which I’m at my wits’ end. Someone enraged could file a few reports with their abuse contacts… Just sayin’…
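
        That registrar/Cloudflare lookup is easy to script, for anyone who wants to check for themselves. A minimal sketch in Python - the domain is a placeholder (nobody here is linking the real one), and the third-party python-whois and dnspython packages are my assumptions, not tools mentioned in this thread:

        ```python
        # Minimal sketch: query a domain's WHOIS record and name servers.
        # Requires: pip install python-whois dnspython
        import whois          # python-whois
        import dns.resolver   # dnspython

        domain = "example-undressing-app.com"  # placeholder, not the real domain

        record = whois.whois(domain)
        print("Registrar:", record.registrar)           # e.g. "Name.com, Inc."
        print("Abuse contacts:", record.get("emails"))  # where to send reports

        # If the name servers are *.ns.cloudflare.com, the host is hidden
        # behind Cloudflare, and a report to Cloudflare's abuse desk applies.
        for ns in dns.resolver.resolve(domain, "NS"):
            print("Name server:", ns.target)
        ```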

        There’s always the possibility that they just catch the boy and punish only him, letting the even more disgusting people in the background keep doing what they want because it would be difficult to get hold of them. That would be the easiest route for the prosecutors and the least effective way to deal with this issue as a whole.

  • The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop × 10.

  • Banning diffusion models doesn’t work; the tech is already out there and you can’t put it back in the box. Fake nudes used to be done with Photoshop; the current generative AI models just make them faster to produce.

    This can only be stopped on the distribution side, and any new laws should focus on that.

    But the silver lining of this whole thing is that nude scandals for celebs aren’t really possible any more if you can just say it’s probably a deepfake.

    • Yea, with 15 good headshots from different angles I can build a LoRA for anybody; hell, Civit is full of celebrity LoRAs.

      Mage.space already had to switch to SFW-only because people were generating CP. The past couple of weeks I’ve been playing with Stable Diffusion, and some of the checkpoints easily generate content that I had to delete because it looked REALLY young and creeped me out.

  • This seems like a pretty significant overreaction. Like yes, it’s gross and it feels personal, but it’s not as though any of the subjects were willing participants, so their reputation is not being damaged. Would they lose their shit about a kid gluing a cutout of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?

        •  ParsnipWitch   ( @ParsnipWitch@feddit.de ) 
          link
          fedilink
          English
          4
          edit-2
          9 months ago

          Did your picture get taken and shared when you were a teenager? Were you heavily sexualised and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should feel like a nightmare.

          Take your “sexual harassment is only bad to teenage girls if you tell them” shit elsewhere.

    • I don’t want to bandwagon against you, but I do think it’s important that people who agree with your viewpoint have a chance to understand that the situation is a violation of privacy.

      The kids’ reputations are, likely, damaged. You have an underage girl who is already dealing with the confusion and hierarchy of high school. Then (A) someone generates semi-accurate photos of what her naked body looks like and (B) distributes them to others.

      Issue (A) is bad because it’s essentially CSAM and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have access to. This is a privacy violation and the ethics around it are questionable at best.

      Issue (B) is that the generator didn’t stop at the violations of issue (A), but has now shared that material with other people who know the subject, without the subject’s consent and likely without her knowledge of who received it. This means that the subject now has to perpetually wonder whether every person she interacts with (friends, teachers, other parents, her own parents) has seen lewd pictures of her. Hopefully you can see how this could disturb a young woman.

      Now apply a different situation to it. Suppose you took a test at school or at work that shows you as dumb (like, laughably dumb; enough to make you feel self-conscious). Even if you don’t think it’s a fair test, this test exists. Now assume that someone shared this test with your friends, co-workers, and even your parents, without you knowing exactly who received it. And instead of everyone saying “it’s just a dumb test, it doesn’t mean anything”, they decide it means something about you. Every hour or so, you walk by someone or interact with someone who chuckles or cracks a joke at your expense. You’re not allowed by your community to move on from this test.

      Before your test was released, you could blend in. Now, you’re the person everyone is looking at and judging. Think of that added anxiety on top of everything else you have to deal with.

      • I appreciate your intentions, but your examples are just not up to the standard needed to treat AI-generated nudes any differently than a nude magazine collage with kids’ crushes’ faces in it.

        As uncanny as the nudes might be, they are NOT accurate. People know this, and they are going to learn, one way or another, to adjust their definition of “real”. No characteristic details like moles or the subject’s actual skin tone will be accurately portrayed, so they have no reason to think “someone has seen my naked body”. Yeah, if someone tells them to worry about it, they will, as any young person would - but why? The bigger a deal we make of it, the worse it is. The litmus test is: if we decide to ignore it and teach kids that AI-generated nudes have nothing to do with them and can safely be ignored, is it still bad? If not, then they do basically zero harm.

        How is your test example related to this at all? In the one case, my face and a clothed picture of me are acquired, likely with my implied permission, from social media; modifications that I did not authorize are added to it; and the result is distributed, making me look naked while having no bearing on my person or character (since the AI doesn’t actually know what I look like naked), so no conclusion anyone would draw from it constitutes a disclosure of information about me. The test example constitutes a clear disclosure, with provenance to establish the validity of the information: quite a different scenario. It is true that AI chatbots can be jailbroken to reveal my previous questions, which might expose things about my character that I do not wish to disclose, but that is a different issue and unrelated to these nude generators.

        I’m not saying handing these nudes to a kid or blackmailing them is not criminal or harassment, just that the technology and medium should have almost no bearing on how we treat this.

        • Buddy, I want to let you know that I wrote a big rebuttal and then accidentally cancelled my comment, so it got erased. In my response I disagreed with your original argument and with your rebuttal as well, but said that I respected the time it took you to share your thoughts. I’m so sad my dumb comment got deleted, lol

          Know that I appreciate your lengthy response back to me.

          Be well.

      • Issue (A) is bad because it’s essentially CSAM and also because it’s attempting to access a view of someone that the subject likely hasn’t permitted the generator to have access to. This is a privacy violation and the ethics around it are questionable at best.

        That part is not a privacy violation, in the same way that someone drawing on a canvas their own impression of what a bank vault looks like on the inside does not constitute trespassing / a violation of the bank’s privacy. Unless the AI in question used actual nudes of them as a basis - but then we wouldn’t need the extra AI step for this to be a problem, right? Otherwise, I’m rather sure that the actual privacy violation starts at (B).

        Ofc, none of that makes it less of a problem, but it does feel to me like it subverts a potential angle for fighting against this.

  • Governments need to strike hard against all kinds of platforms like this, even if they can be used for legitimate reasons.

    AI is way too dangerous a tool to leave to free innovation and the open market; it’s the number one technology right now that must be heavily regulated.

        •  jet   ( @jet@hackertalks.com ) 
          link
          fedilink
          English
          9
          edit-2
          10 months ago

          The philosophical question becomes: if it’s AI generated, is it really a photo of them?

          Let’s take it to an extreme. If you cut the face off somebody’s Polaroid and paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person from the Polaroid?

          It’s a debate that could go either way, and I’m sure we will have an exciting legal landscape, with different countries adopting different rules.

          • The philosophical question becomes: if it’s AI generated, is it really a photo of them?

            That does not matter, as people can’t tell the difference even if they wanted to.
            It is a photo of them if you can recognize them, especially their face, in it.

            •  jet   ( @jet@hackertalks.com ) 
              link
              fedilink
              English
              7
              edit-2
              10 months ago

              What if there’s somebody who looks very similar to somebody else? Are they prevented from using their likeness in film and media?

              Could an identical twin sister be forbidden from going into porn, to prevent her from besmirching the good image of her other twin sister who’s a teacher?

          •  taladar   ( @taladar@feddit.de ) 
            link
            fedilink
            English
            8
            9 months ago

            I suppose you could make a Ship of Theseus-like argument there, too. At what point does it matter where the parts of the picture came from? Most people would probably be okay with their hairstyle being added to someone else’s picture - but what about their eyes, their mouth…? Where exactly is the line?

          • I think it comes down to the identity of the person whose head is on the body. For instance, if the eyes had a black bar covering them or if the face was blurred out, would it be as much an invasion of privacy?

            However, if the face was censored, the photo wouldn’t have the same appeal to the person who generated it. That’s the issue here.

            A cutout of a person’s head on a porn star’s picture still has a sense of falsehood to it. An AI-generated image that’s likely similar to the subject’s body type removes a lot of that falsehood, and thus gives the image more power. Without the subject’s consent, this power is harmful.

            You’re right about the legal battles, though. I just feel bad for the people who will have their dignity compromised in the meantime. Everyone should be entitled to dignity.

          •  barsoap   ( @barsoap@lemm.ee ) 
            link
            fedilink
            English
            4
            edit-2
            10 months ago

            Objectively, it’s absolutely not: AIs don’t have X-ray eyes. The best they could do is infer a rough body shape from a clothed example, but anything beyond that is pure guesswork. The average 14-year-old is bound to be much better at undressing people with their eyes than an AI could ever be.

            Subjectively, though, of course it is. You wouldn’t be imagining the cutie two desks over nude if it weren’t them.