Speaking as a creative who has also been paid for creative work, I’m a bit flustered at how brazenly people wax poetic about the need for copyright law, especially when the creator or artist themselves is never really considered in the first place.

It’s not like ye olde piracy, which can even be ethical (like when video games go unpublished and are almost erased from history), but a new form whereby small companies get to join large publishers in screwing over the standalone creator - except this time it isn’t by way of predatory contracts, but by sidestepping the creator and farming their work as data to recreate a style and form that could have taken years, even decades, to develop.

There’s also the idea that “all work is derivative anyway, nothing is original”, but that sidesteps the point: someone spends decades forming a style and making a living off it, and then anyone can come along and undo all of that at the press of a button.

If you’re a libertarian or an anarchist, be honest about that. It seems like there are a ton of tech bros who are libertarian and subversive about it to feel smort (the GPL is important, btw). But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else’s work without paying them, and needs the mental and emotional justification to do so. That’s bad, because it justifies taking food out of somebody’s mouth, which is par for the course in the current economic system.

It’s just more proof that the capitalist system doesn’t work and will always screw the labourer in some way. It’s quite possible that in the future only the most famous artists will make money directly off their work, much as already happens with musicians.

As an aside, Jay-Z and Taylor Swift complaining about not getting enough money from Spotify is tone-deaf, because they know they get the bulk of that money anyway - even the money from an account that only ever plays the same small bands - because of Spotify’s payout model. So the big names will always, always be more “legitimate” than small artists, and at least they’ve probably already paid their writers and such… but maybe not. Looking at you, Jay-Z.
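For context on why that is: Spotify is widely reported to use a pro-rata payout model, where every subscriber’s fee goes into one big pool that is then split by global stream share, not by what that individual subscriber actually played. A toy sketch of the difference (all names and numbers here are made up for illustration):

```python
# Hedged sketch: pro-rata payout (the model Spotify is widely reported
# to use) versus a user-centric payout. All figures are illustrative.

subscription = 10.0  # what one listener pays per month

# Platform-wide play counts (everyone's listening combined)
global_streams = {"megastar": 9_000, "small_band": 1_000}

# This particular listener only ever plays the small band
my_streams = {"megastar": 0, "small_band": 100}

# Pro-rata: my fee is pooled and split by GLOBAL stream share
total = sum(global_streams.values())
pro_rata = {a: subscription * n / total for a, n in global_streams.items()}

# User-centric: my fee is split by MY OWN stream share
mine = sum(my_streams.values())
user_centric = {a: subscription * n / mine for a, n in my_streams.items()}

print("pro-rata:    ", pro_rata)      # {'megastar': 9.0, 'small_band': 1.0}
print("user-centric:", user_centric)  # {'megastar': 0.0, 'small_band': 10.0}
```

Under pro-rata, $9 of the small-band-only listener’s $10 still ends up with the megastar.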

If the copyright cases get overturned by that litigious lot known as corporate lawyers, and they manage to carve loopholes into legislation that benefit both IP farmers and corporate interests - say, by blessing models trained to stay “far enough” away from the source material - we might see a lot of people lose their livelihoods.

Make it make sense, Beehaw =(

  •  frog 🐸   ( @frog@beehaw.org ) 
    29 months ago

    Destroy all existing AI datasets, as they’re irreparably tainted. Require all AIs, regardless of whether they’re owned by a company or are open source, to build new datasets exclusively from work that is in the public domain or for which the copyright owner has been consulted and compensated. If the megacorporations want to keep the models they already have, they must compensate the creator of every single piece in the training data at market rates - if they can’t afford to do it, then they either go bankrupt or destroy the tainted dataset. If anyone, company or individual, is caught training an AI on content for which they don’t have a valid licence, issue fines starting at 10% of global revenue, to be distributed to the people whose copyright they violated. Higher fines for repeat offenders.

      •  frog 🐸   ( @frog@beehaw.org ) 
        19 months ago

        Yes, but the solution isn’t to allow everyone to rip off artists, because that leaves small creators even less powerful: instead of only having to be cautious in their dealings with large corporations, they now have to contend with every single person on the planet using their work without consent or compensation.

        Even the large corporations that own a lot of content don’t own enough to make a viable AI. These things take billions of images in the training dataset to produce a model that’s halfway usable. No company owns that many, and they’d bankrupt themselves trying to buy that many. That’s why forcing them to pay is actually a viable solution: no existing company holds copyright over billions of images.
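        As a back-of-envelope check on that claim: assume, hypothetically, a dataset at roughly LAION-5B scale and a lowball per-image licence fee (both figures are illustrative assumptions, not real quotes).

        ```python
        # Hypothetical licensing cost for a modern image training set.
        # Dataset size is roughly LAION-5B scale; the per-image fee is
        # an assumed lowball figure, not an actual market rate.
        dataset_size = 5_000_000_000  # images
        fee_per_image = 0.50          # USD, assumed

        total_cost = dataset_size * fee_per_image
        print(f"${total_cost:,.0f}")  # $2,500,000,000
        ```

        Even at fifty cents per image, that’s $2.5 billion in licensing before a single GPU is switched on.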

        Oh, and obviously the legislation would have to be written to explicitly not give the likes of Google the ability to claim that by using their services, you consent to them harvesting your content to train an AI. “Can’t pay, can’t use” would have to apply to all content, globally, in a way that can’t be signed away through a sneaky ToS.

    • That sounds feasible.

      To be more specific, I would require models to be copyleft - probably GNU GPLv3 - so that big tech companies don’t get a monopoly on good models.

      Basically you can do what you want except change the license.

      •  frog 🐸   ( @frog@beehaw.org ) 
        29 months ago

        That sounds reasonable. It also makes room for artists who feel so inclined to offer their works into a training dataset, which is fine when it’s something they’ve specifically opted into.