It's a sane policy - the human is responsible for what they contribute, regardless of what tools they used in the development process.

However, the gotcha here seems to be that the developer has to certify that the code is compatible with the GPL, which seems an impossible ask: the AI models have presumably been trained on all the code their makers could find on the internet, regardless of licensing, and we know they are capable of "regenerating" (regurgitating) material they were trained on with high fidelity.

Then we get to the Code of Theseus argument: if you take a piece of code and replace every part of it with code that looks the same, is it still the original code?

Is an AI reimplementation a "clean room" implementation? What if the AI only generates pseudocode and a human writes the final code based on that? Etc etc ad infinitum.

Lawyers will be having fun with this philosophical question for a good decade.