Interesting decision

  • That’s a fair point.

    In my eyes, the difference is the sheer volume of content that these models rip through in training. It would take many, many lifetimes for a person to read as much as an LLM “reads,” and it’s difficult to tell what an LLM is actually synthesizing versus copying.

    Now, does it really matter?

    I think the answer comes down to how much power is actually put into the hands of artists rather than the mega-corps. As it stands, the leaders of the AI race are OpenAI/Microsoft, Google, and Meta. If an open LLM comes out (a la Stable Diffusion), then artists do stand to benefit here.

    •  millie   ( @millie@beehaw.org ) 
      1 year ago

      I mean, they are and they aren’t. OpenAI, Google, and Meta may be in control of the most popular iterations of LLMs at the moment, but the cat’s also kind of out of the bag. If we all lost access to ChatGPT and other AI stuff that’s dependent on it overnight, there’d be a pretty huge incentive to fill that gap.

      They control it now because they’ve filled people’s emerging need for LLMs to assist in their workflows. If they try to choke that off as though they own it in a wider sense, they’re going to find their power over it turning to ash in their mouths, and someone else will take their place.

      I’m optimistic that the trend of cracks developing in the authoritarian power structures we’re seeing online won’t stay limited to the internet. LLMs could be a big part of that. Even just Stable Diffusion being open source is massive. I’m sure others will follow, and those big companies, if they’re smarter than Elon and Spez, will want to hang onto their relevance as long as possible by not shunting users over to FOSS.

      • Absolutely true. StableLM is a thing, and although it’s not as good as ChatGPT or even Bard, it’s a huge step. I’m sure there will be better free models in the months and years to come.
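
        For anyone curious what a "free model" means in practice, here’s a rough sketch of running an open checkpoint locally with the Hugging Face transformers library. The model ID and generation settings are illustrative placeholders rather than a recommendation, and the exact StableLM checkpoint names may differ:

        ```python
        # Hypothetical example: pull open weights and generate text locally,
        # with no API key and no dependence on a hosted service.
        from transformers import AutoModelForCausalLM, AutoTokenizer

        model_id = "stabilityai/stablelm-tuned-alpha-7b"  # assumed checkpoint name

        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(model_id)

        prompt = "Explain why open model weights matter for independent artists."
        inputs = tokenizer(prompt, return_tensors="pt")

        # Generate a short completion entirely on local hardware.
        outputs = model.generate(**inputs, max_new_tokens=64)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))
        ```

        The point isn’t the specific model; it’s that once weights are published, running and fine-tuning them doesn’t depend on any one company keeping an API open.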