I've recently noticed this opinion seems unpopular, at least on Lemmy.
There is nothing wrong with downloading public data and doing statistical analysis on it, which is pretty much what these ML models do. They are not redistributing other people's works (well, sometimes they do, unintentionally, and safeguards to prevent this are usually built in). The training data is generally much, much larger than the model itself, so it is generally not possible for the model to reconstruct arbitrary specific works. They are not creating derivative works, in the legal sense, because they do not copy and modify the original works; they generate "new" content based on probabilities.
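As a rough illustration of that scale gap (using commonly cited ballpark figures, so take the exact numbers with a grain of salt): Stable Diffusion's weights are a few gigabytes, while the LAION data it was trained on is on the order of a couple of billion image-text pairs. That works out to roughly 4 GB / 2,000,000,000 images ≈ 2 bytes of model capacity per training image, which is nowhere near enough to store the images themselves.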
My opinion on the subject is pretty much in agreement with this document from the EFF: https://www.eff.org/document/eff-two-pager-ai
I understand the hate for companies using data you would reasonably expect to be private. I understand the hate for purposely over-fitting a model on someone's work to reproduce their "likeness." I understand the hate for AI generated shit (because it is shit). I really don't understand where all this hate for using public data to build a "statistical" model that "learns" general patterns is coming from.
I can also understand the anxiety people may feel, if they believe all the AI hype, that it will eliminate jobs. I don't think AI is going to be able to directly replace people any time soon. It will probably improve productivity (with stuff like background removers, better autocomplete, etc.), which might eliminate some jobs, but that's really a problem with capitalism, and productivity increases are generally considered good.
Good question.
Ok, so let's say the artist does exactly what the AI does, in that they don't try to do anything unique: just looking around at existing content and trying to mix and match existing ideas. No developing their own style, no curiosity about art history, no humanity, nothing. In this case I would say they are mechanically doing the exact same thing as an AI. Do I think they should get paid? Yes! They spent a good chunk of their life developing this skill, they are a human, and they deserve to get their basic needs met and not die of hunger or exposure. Now, this is a strange case, because 99.99% of artists don't do this. Most develop a unique style and bring life experience into their art to generate something new.
A Software Engineer can profit off their AI model by selling it. If they make money by generating images, then they are making money off of hard-working artists who should be paid for their work. That isn't great. The outcome of allowing this is that art will no longer be something you can do to make a living. This is bad for society.
It should also be noted that a Software Engineer building an AI model from scratch accounts for maybe 0.01% of the AI actually being used. Most people using these tools are lay people who have spent very little time developing art or Software Engineering skills, and they can easily use an existing model to create "art". The result is that many talented artists who could bring new and interesting ideas to the world are being outcompeted by one guy with a web browser producing sub-par, sloppy work.