Interesting decision

  •  millie   ( @millie@beehaw.org ) 
    46 points · 1 year ago

    I feel like this is less of a big decision and more of a ‘duh’ sort of situation. To my understanding, this isn’t saying that all AI art violates copyright, but that AI art which does violate copyright can’t be used.

    Like if I took a picture of Darth Vader and handed it to NightCafe to fool around with, that still belongs to Disney. Steam is legally required to act if a valid DMCA notice is sent, and to adhere to the court’s ruling in the case of a dispute.

    I feel like this is a reassurance that they intend to obey copyright law rather than a restriction of all AI art. Basically they’re saying that if you DMCA someone in good faith on the basis of derivative works, they’ll play ball.

    •  Dominic   ( @Dominic@beehaw.org ) 
      20 points · edited · 1 year ago

      Right, the phrasing is “copyright-infringing AI assets” rather than a much more controversial “all AI assets, due to copyright-infringement concerns.”

      I do think there’s a bigger discussion that we need to have about the ethics and legality of AI training and generation. These models can reproduce exact copies of existing works (see: Speak, Memory: An Archaeology of Books Known to ChatGPT/GPT-4).

      •  millie   ( @millie@beehaw.org ) 
        10 points · 1 year ago

        Sure, but plagiarism isn’t unique to LLMs. I could get an AI to produce something preexisting word for word, but that’s on my use of the model, not on the LLM.

        I get the concerns about extrapolating how to create works similar to those made by humans from actual human works, but that’s how people learn to make stuff too. We experience art and learn from it in order to enrich our lives, and to progress as artists ourselves.

        To me, the power put into the hands of creators to work without the need for corporate interference is well worth the consideration of LLMs learning from the things we’re all putting out there in public.

        • That’s a fair point.

          In my eyes, the difference is the sheer volume of content that these models rip through in training. It would take many, many lifetimes for a person to read as much as an LLM “reads,” and it’s difficult to tell what an LLM is actually synthesizing versus copying.

          Now, does it really matter?

          I think the answer comes down to how much power is actually put into the hands of artists rather than the mega-corps. As it stands, the leaders of the AI race are OpenAI/Microsoft, Google, and Meta. If an open LLM comes out (a la Stable Diffusion), then artists do stand to benefit here.

          •  millie   ( @millie@beehaw.org ) 
            2 points · 1 year ago

            I mean, they are and they aren’t. OpenAI, Google, and Meta may be in control of the most popular iterations of LLMs at the moment, but the cat’s also kind of out of the bag. If we all lost access to ChatGPT and other AI stuff that’s dependent on it overnight, there’s a pretty huge incentive to fill that gap.

            They control it now because they’ve filled people’s emerging need for LLMs to assist in their workflow. If they try to choke that off as though they own it in a wider sense, they’re going to find their power over it turning to ash in their mouths and someone else will take their place.

            I’m optimistic that the trend of cracks developing in the authoritarian power structures we’re seeing on the internet won’t stay limited to the internet. LLMs could be a big part of that. Even just Stable Diffusion being open source is massive. I’m sure others will follow, and those big companies, if they’re smarter than Elon and Spez, will want to hang onto their relevance as long as possible by not shunting users over to FOSS.

            • Absolutely true. StableLM is a thing, and although it’s not as good as ChatGPT or even Bard, it’s a huge step. I’m sure there will be better free models in the months and years to come.

      • I just don’t see how this is different from “Valve won’t publish games that feature copyright-infringing assets” which is probably already true. Does it matter whether a human or an “AI” produced it?

        •  Pseu   ( @Pseu@beehaw.org ) 
          3 points · 1 year ago

          Probably not. But there is a pretty widespread belief that images generated by AI cannot possibly be infringing, because the model is somehow inherently transformative.

          This is not the case, and Valve reiterating that it is not the case might keep developers who are under the impression above from trying.

          • I have mostly seen the exact opposite position: that AI cannot possibly produce anything that isn’t copyright infringing. It’s hard to recall anyone claiming that a given artwork produced by AI could never be copyright infringing, except among, like, cryptobros.