- cross-posted to:
- legalnews@lemmy.zip
cross-posted from: https://lemmy.zip/post/15863526
Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison
Daxtron2 ( @Daxtron2@startrek.website ) English • 19 • 11 months ago
How are they abuse images if no abuse took place to create them?
Possibly linux ( @possiblylinux127@lemmy.zip ) English • 2 • 11 months ago
CSAM is illegal all around
Darkrai ( @Darkrai@kbin.social ) 15 • 11 months ago
It seems weird that the AI companies aren’t being held responsible too.
bjorney ( @bjorney@lemmy.ca ) English • 14 • 11 months ago
It’s open-source code that someone ran on their own computer; it’s not like he used paid OpenAI credits to generate the images.
It would also set a bad precedent - it would be like charging Solomons & Fryhle because someone used their (absolutely ubiquitous) organic chemistry textbook to create methamphetamine.
Demigodrick ( @Demigodrick@lemmy.zip ) English • 9 • 11 months ago
Well, the American way is not to hold the company accountable (e.g. school shootings), so yeah.
Possibly linux ( @possiblylinux127@lemmy.zip ) English • 5 • 11 months ago
I’m pretty sure you can’t hold a school liable for a school shooting
JokeDeity ( @JokeDeity@lemm.ee ) English • 5 • 11 months ago
Just to be clear, you guys think that any company that produces anything that ends up being used in a crime should face criminal charges for making the product? Yeah, makes about as much sense as anything these days.
jonne ( @jonne@infosec.pub ) English • 1 • 11 months ago
I think Stable Diffusion is an open-source AI model you can run on your own computer, so I don’t see why the developers should be held responsible for that.
Lowlee Kun ( @Obonga@feddit.de ) English • 13 • 11 months ago
13,000 images can be generated relatively fast. My PC needs about 5 seconds for a picture with SD (depending on settings, of course), so 13,000 images take roughly 18 hours - not even a day.
Also, if pedos only created their own shit to fap to, I would consider this a win.
stevedidwhat_infosec ( @stevedidwhat_infosec@infosec.pub ) English • 13 • 11 months ago
Sensitive topic - obviously.
However, these guard-rail laws and “won’t someone think about the children” cases are a reeeeally easy way for the government to remove more power from the people.
That said, I believe that, if handled correctly, banning this sort of thing is absolutely necessary to combat the mental illness that is pedophilia.
SuperSpruce ( @SuperSpruce@lemmy.zip ) English • 8 • 11 months ago
70 years for… generating AI CSAM? So that’s apparently worse than actually raping multiple children?
Onihikage ( @Onihikage@beehaw.org ) English • 5 • 11 months ago
He did more than generate it; he also sent some of it to a minor on Instagram, probably intending to get some real CSAM, or worse. For that, spending the next 70 years away from both children and computers seems appropriate to me.
onlinepersona ( @onlinepersona@programming.dev ) English • 1 • 11 months ago
Punishment over rehabilitation has always been a great solution 👍
Onihikage ( @Onihikage@beehaw.org ) English • 1 • 11 months ago
It’s not about punishing him; it’s about keeping a clear threat to children away from them for as long as is necessary. Maybe he can be rehabilitated, but I’d rather start with lifelong separation from his means and targets and go from there.
HelixDab2 ( @HelixDab2@lemm.ee ) English • 7 • 11 months ago
The basis for making CSAM illegal was that minors are harmed during the production of the material. Prior to CG, the only way to produce pornographic images involving minors was to use real, flesh-and-blood minors. But if no minors are harmed to create CSAM, then what is the basis for making that CSAM illegal?
Think of it this way: if I make a pencil drawing of a minor being sexually abused, should that be treated as a criminal act? What if it’s just stick figures, and I’ve labeled one as a minor and the others as adults? What if I produce real pornography using real adults, but use actors who appear to be underage, and I tell everyone the actors were all underage so that people believe it’s CSAM?
It seems to me that, rationally, things like this should only be illegal when real people are being harmed, and that when there is no harm, it should not be illegal. You can make an entirely reasonable argument that pornographic images created using a real person as the basis do cause harm to the person so depicted. But what if it’s not any real person?
This seems like a very bad path to head down.
Possibly linux ( @possiblylinux127@lemmy.zip ) English • 2 • 11 months ago
Of course he did. That’s the world we live in.