You can't really argue that things work a certain way when that contradicts how the law actually works; that's a recipe for disaster. The rules have been set. You can disagree with them, but then you will be forced to litigate, which is both expensive and time-consuming. Purposefully going against the grain is only for those with extremely deep pockets (and for lawyers...).
> Besides, if we go by "the law" then we already have a court case where training an AI model is protected by fair use.
Yes, but training an AI is a completely different thing from distributing the work product generated by that AI.
Note that I don't agree with all aspects of copyright law either, but I'll happily play by the rules as set today, simply because I can't afford to be wrong and held liable for infringement. For instance, I strongly believe that the length of copyright terms is a problem (and don't get me started on patents, especially software patents). I also believe that only the original author should hold copyright, not the company they worked for, their heirs (see Ravel for a really nasty case), or anybody else. I believe copyrights should not be transferable at all.
But because I'm a nobody and not wealthy enough to challenge the likes of Disney in court I play by the rules.
As for 'this situation is going to get funny when some country decides that AI generated content does get copyright protection':
Copyright is one of the most harmonized legislative constructs in the world. Almost every country has adopted it, often without meaningful change. In practice, US courts are obviously a very important driver behind changes in copyright law, but in general those changes tend to lean towards more protection for copyright owners, not less. So far the Trump administration has not touched copyright law in its usual heavy-handed manner. I'm not sure whether this is by design or by accident, but maybe there are lines that even they cannot easily cross without massive consequences.
Some parties in the AI/copyright debate are talking out of both sides of their mouth. Microsoft, for instance, relies heavily on being able to infringe copyright at will while at the same time jealously guarding its own code. Such hypocrisy is going to be the main wedge that those in favor of strong copyright will use to reduce the chances that AI work product is granted copyright protection. After all, if the output is genuinely original and not merely derivative of the training data, then Microsoft could (and should!) train its AI on its own confidential code. But it isn't doing that; maybe it knows something you and I do not...