• I’ll admit, it was kind of long and difficult to read for some reason, so I kind of started it and then didn’t read everything; maybe I’ll try again later.

    Okay, that’s fair. I don’t want the ‘creative industrial complex’ like Disney etc. to gain more power, and sorry if I came off wrong. I can see the flaws in my argument now, but machine learning/LLMs still make me angry and upset. Sure, if a person is analysing my work, that’s fine; I just don’t particularly want that work to be used to make new work without the skills necessary to do so well; LLMs/Machine Learning cannot gain those skills because it is not alive and thus it cannot create. I actually release most of my work very permissively, but I still don’t want it to train some model. I’m happy if people are inspired by it or do it better than I can, though.

    The article mentions, though, that using things ‘without permission’ is how a lot of people became and remain(ed) poor, especially people from marginalised communities, likely at the hands of those in power, so again, I think we’re on the same page there?

    • I just don’t particularly want that work to be used to make new work without the skills necessary to do so well; LLMs/Machine Learning cannot gain those skills because it is not alive and thus it cannot create.

      This kind of sentiment saddens me. People can’t look past the model and see the person who put in their time, passion, and knowledge to make it. You’re begrudging someone who took a different path in life, spent their irreplaceable time acquiring different skills, and applied them to achieve something they wanted. And because of that, in your view, they don’t deserve it, since they didn’t do it the same way you did, with the same opportunities and materials.

      The article mentions, though, that using things ‘without permission’ is how a lot of people became and remain(ed) poor, especially people from marginalised communities, likely at the hands of those in power, so again, I think we’re on the same page there?

      We are, but that’s just one symptom of a larger exploitative system where the powerful can extract value while denying opportunities to the oppressed. AI training isn’t only for mega-corporations. We shouldn’t put up barriers that only benefit the ultra-wealthy and hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people. Mega-corporations already own datasets, and have the money to buy more. And that’s before their predatory ToS that grant them exclusive access to user data, effectively selling our own data back to us.

      Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off, with less than where we started. We need to make sure this remains a two-way street: corporations have so much to lose, and we have everything to gain. Just look at the floundering cinema industry, the weak cable and TV numbers, and print media.