The "learning" isn't really learning. I mean, it might be, but if you define learning to be a human endeavor, then AI can't learn.

It's perfectly reasonable to say it's okay for humans to do something but not okay for a computer program to do the same thing. We don't have to equate AI to humans; that's a choice, and usually a bad one.

It's also perfectly reasonable to say it's okay for a program or machine to do the same thing as a human. This has been the basis for the technological revolution since the dawn of technology.

It's legal and perfectly reasonable for a human being to combine organic fuels with oxygen from the air to create energy and CO2. Any law restricting that would be the worst form of tyranny.

It would not be reasonable to allow machines to do that at unlimited scale without restrictions.

(Hopefully the fossil fuels industry won't draw inspiration from the legal arguments made by AI companies...)

> It's legal and perfectly reasonable for a human being to combine organic fuels with oxygen from the air to create energy and CO2.

Is there any line past which it becomes unreasonable?

> It would not be reasonable to allow machines to do that at unlimited scale without restrictions.

If the machines were a replacement for a damaged respiratory system in a human, would it be reasonable?

What if the machine were being used by a human to do something else that was important?

Where is the line where it becomes reasonable?

> Is there any line past which it becomes unreasonable?

That's exactly the question we should be asking about AI and fair use.

If one defines 'flying' to be a bird's endeavor, then humans can't fly.

Now, if you'll excuse me, I need to catch a metal shuttle that chucks itself through the air on wings.

Sure, as a word it can be broad; as a concept in our legal system it should be much more nuanced.

The relevant extension of your analogy is: should birds be required to obey FAA rules? Or should plane factories be protected as nesting sites?