I respect the desire for reciprocity, but strong copyleft isn't the only, or even the best, way to protect user freedom or public knowledge. My opinion is that permissive licensing and open access to learn from public materials have created enormous value precisely because they don't pre-empt future uses. Requiring permission for every new kind of reuse (including ML training) shrinks the commons, entrenches incumbents who already have data deals, and reduces the impact of your work. The answer to exploitation is transparency, attribution, and guardrails against republication, not copyright-enforced restrictions.
I used to be much more into the GPL than I am now. Perhaps it was much more necessary decades ago or perhaps our fears were misguided. I license all my own stuff as Apache. If companies want to use it, great. It doesn't diminish what I've done. But those who prefer GPL, I completely understand.
> as well as an academic's LGPL high performance matrix library which is developed via grants over the years.
The academic got paid with grants. So now this high performance library exists in the world, paid for by taxes, but it can't be used everywhere. Why is it bad to share this with everyone for any purpose?
> What I put out is for humans' direct consumption. Middlemen are not welcome.
Why? Why must it be direct consumption? I've used AI tools to accomplish things that I wouldn't be able to do on my own in my free time -- work that is now open source. Tons of developers this week are benefiting from what I was able to accomplish using a middleman. Not all middlemen, by definition, are bad. Middlemen can provide value. Why is that value not welcome?
> I'm not against AI/LLM/Generative technology/etc. I'm against exploitation of people, artists, musicians, software developers, other companies.
If you define AI/LLM/Generative technology/etc. as the exploitation of people, artists, musicians, software developers, and other companies, then you are against it. As software developers, our work directly affects the livelihoods of people. Everything we create is meant to automate some human task. To be a software developer and then complain that AI is going to take away jobs is to be a hypocrite.
Your whole argument is easily addressed by requiring the AI models to be open source. That way, they respect the AGPL and any other open license, and contribute to keeping the information free. Letting these companies knowingly and obviously infringe licenses and all copyright as they do today is immoral, and illegal.
AGPL doesn't pre-empt future uses or require permission for any kind of re-use. You just have to share alike. It's pretty simple.
AGPL lets you take a bunch of data and AI-train on it. You just have to release the data and source code to anyone who uses the model. Pretty simple. You don't have to rent them a bunch of GPUs.
Actually, it can be annoying because of the specific mechanism by which you have to share alike: the program has to have a link to its own source code; you can't just offer the source alongside the binary. But it's doable.
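To make that mechanism concrete, here's a minimal sketch (not a legal template) of what "the program links to its own source" can look like for a network service under AGPLv3 section 13: every response carries a pointer to the exact source the running instance was built from. The Flask app and the SOURCE_URL value are illustrative assumptions, not something from the thread above.

```python
# Minimal sketch: an AGPL-licensed web service offering its own
# Corresponding Source to network users (AGPLv3 section 13).
# Flask and the SOURCE_URL below are illustrative assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# Point this at the exact revision the running instance was built from,
# e.g. a tagged release tarball or commit URL in your repository.
SOURCE_URL = "https://example.org/myservice/releases/v1.2.3-source.tar.gz"

@app.route("/")
def index():
    # Every page served to remote users carries a visible source offer.
    return (
        "<p>My AGPL-licensed service.</p>"
        f'<p><a href="{SOURCE_URL}">Download the Corresponding Source</a></p>'
    )

@app.route("/source")
def source():
    # Machine-readable pointer to the same source archive.
    return jsonify({"license": "AGPL-3.0-or-later", "source": SOURCE_URL})

if __name__ == "__main__":
    app.run(port=8080)
```

The annoying part is keeping SOURCE_URL in sync with what's actually deployed (it has to be the modified source you're running, not just upstream), which is why people find this fiddlier than shipping a tarball next to a binary.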