This issue is not black and white.

It is accepted, within limits, for humans to do transformative work, but it hasn't yet been established what the limits for AIs are. The open questions are primarily (IMO): 1. whether the work is transformative or not, and 2. whether the scale of transformation/distribution changes the nature of the work.

Embedding other people's work in a vector space, then sampling from the distribution at a different point in the vector space, is not a central member of the "transformative" category. The justifications for allowing transformative uses do not apply to it.
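The "embed, then sample" process being described can be sketched as a toy example (purely illustrative Python; the 2-D points, names, and interpolation are invented for illustration and do not reflect any real model's implementation):

```python
import random

# Toy sketch: each existing "work" is reduced to a point in a 2-D
# vector space; a "new" work is produced by sampling near a point
# between existing ones, rather than by independent creation.
works = {
    "work_a": (0.0, 1.0),
    "work_b": (1.0, 0.0),
}

def sample_near(p, q, noise=0.05, rng=random.Random(0)):
    """Interpolate between two embedded works and add small noise."""
    t = rng.random()
    return tuple(a + t * (b - a) + rng.uniform(-noise, noise)
                 for a, b in zip(p, q))

new_point = sample_near(works["work_a"], works["work_b"])
# new_point lies (up to noise) in the span of the originals -- the
# argument above is that such output is derived from the inputs.
```

The point of the sketch is that every sampled output is a function of the embedded inputs plus noise, which is why the comment argues this is not a central member of the "transformative" category.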

That does seem to be the plurality opinion, yes. But you are responding to someone saying that what counts as transformative hasn't been decided by saying that you have decided. We don't know how human brains do it. What if we found that humans actually do it in the same way? Would that alter the dialog, or should we still give preference to humans? If we should, why should we?

> or should we still give preference to humans? If we should, why should we?

Because of the limited scaling abilities of the human brain: you cannot plug more brains into a building to pump out massive amounts of transformative work. It takes a lot for humans to be able to do it, which creates a natural limit on the scale at which it's possible.

Scale and degree matter even if the process is 100% analogous to how humans do it. The only natural limitation for computers is compute, which requires some physical server space and electricity, both of which can be minimised with further technological advances. This completely changes the foundation of the concept of "transformative work", which before required a human being.

This is a good observation, and motivates adjusting the legal definition of "transformative" even if the previous definition did include what generative AI systems can now do.

> What if we found that humans actually do it in the same way?

We know that humans don't – or, at least, aren't limited to this approach. A quick perusal of an Organization for Transformative Works project (e.g. AO3, Fanlore) will reveal a lot of ideas which are novel. See the current featured article on Fanlore (https://fanlore.org/wiki/Stormtrooper_Rebellion), or (after heavy filtering) the Crack Treated Seriously tag on AO3 (https://archiveofourown.org/works?work_search[sort_column]=k...). You can't get stuff like this from a large language model.

People could be doing their own transformative works, and then posting them to tumblr or whatever with a “Ghibli style” tag or something.

Critiques like this dismiss AI as a bunch of multiplications, while in reality it is backed by extensive research, implementation, and data preparation. There's enormous complexity behind it, making it difficult to categorize as simply transformative or not.

The Pirate Bay is also backed by extensive research, implementation, and data preparation. I'm not dismissing anything as "a bunch of multiplications" – you'll note I talked about embedding in vector spaces, not matrix multiplication. (I do, in fact, know what I'm talking about: if you want to dismiss my criticism, please actually engage with it like the other commenters have.)

Any type of art is inspired by the art of others. It's the simplicity with which you can now generate "art" which is the issue. Stealing artists' work while also making it harder than ever for them to make a living is a deep ethical issue. AI "artists" and "art" disgust me. It's a skill you build over your whole life; taking the shortcut because you're unwilling to learn the craft is deeply insulting to real artists. Good thing traditional art is still somewhat safe from this. Thankfully, this is making it easier to leave highly addictive online platforms as I boycott AI content of any form.

> It's a skill you build over your whole life; taking the shortcut because

Doesn't this apply to the printing press?

For me the core issue is not that OpenAI can generate some copies of the art; the issue is that some artists cannot earn an honest living and that people do not generally care about artists. I wonder how many of the people commenting here have bought themselves art from an artist.

I personally doubt that AI can make a movie similar to Studio Ghibli's (whose movies I've seen a lot of, love, and have paid for), and I also wonder how much of the issue here is about corporate profit rather than poor artists (do you know who owns Studio Ghibli without looking?).

It's fine to boycott AI content, but you could also decide to boycott content produced by large corporations for profit.