This is fine for a moderately sized query. When your queries involve 8 joins and 20 fields per table because you're running them on Presto against 5 TB of data, not only is the AI drastically better at writing them (it doesn't mess up the field names), you can also ask it to try the query 5 different ways to help you land on the most efficient one.
This is a great example of AI tech-debt and fragility.
An eight-join query is going to be nigh on unmaintainable should the requirements change, leading to a change-break-change-break spiral as your preferred coding agent tries to fix its previous fixes.
Maybe the wise way to use AI would be to sort out the schema.
This feels wrong. 8 joins is almost certainly reporting stuff, not transactional. Contrary to what some SQL-averse devs think, 300 lines of SQL is actually more maintainable than the equivalent ~1000 lines of application code. It's also much faster. And I do think that's the real conversion ratio, because SQL is a much higher-level language than currently available application languages. It's also declarative in nature, which helps maintenance.
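A toy illustration of the declarative-vs-imperative point, using an invented two-table schema (all names here are hypothetical, and sqlite3 stands in for a real warehouse). Even for one join plus one aggregation, the SQL states *what* you want while the application code has to spell out the indexing, grouping, and sorting by hand; the gap only widens as joins pile up.

```python
import sqlite3

# Hypothetical schema -- table and column names invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0);
""")

# Declarative: one statement says WHAT we want.
sql_result = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

# Imperative equivalent: we spell out HOW -- build a lookup table,
# accumulate totals per customer, then sort, all by hand.
customers = conn.execute("SELECT id, name FROM customers").fetchall()
orders = conn.execute("SELECT customer_id, total FROM orders").fetchall()
name_by_id = {cid: name for cid, name in customers}
totals = {}
for customer_id, total in orders:
    name = name_by_id[customer_id]
    totals[name] = totals.get(name, 0.0) + total
app_result = sorted(totals.items())

assert sql_result == app_result  # [('Ada', 12.5), ('Grace', 3.0)]
```

The imperative version is already roughly twice the lines for one join, and it reimplements (badly) the hash join and group-by that the database engine gives you for free.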
A highly normalized DB can easily end up with 8 joins required for some queries. That's really not out of the question. "Sorting out" the schema then would be... denormalization, which is a thing, but you need to know why you're doing it. And I don't think 8 joins alone is enough of a reason.
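To make the point concrete, here's a sketch (invented table names, sqlite3 in memory) of a perfectly ordinary normalized schema where one unremarkable report already needs 5 joins. Add dates, currencies, and discount tables and 8 joins arrives quickly, with no schema smell in sight.

```python
import sqlite3

# Hypothetical, fully normalized schema -- all names invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE regions     (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE customers   (id INTEGER PRIMARY KEY, name TEXT, region_id INTEGER);
    CREATE TABLE orders      (id INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE categories  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE products    (id INTEGER PRIMARY KEY, name TEXT, category_id INTEGER);
    CREATE TABLE order_items (order_id INTEGER, product_id INTEGER, qty INTEGER);

    INSERT INTO regions VALUES (1, 'EU');
    INSERT INTO customers VALUES (1, 'Ada', 1);
    INSERT INTO orders VALUES (10, 1);
    INSERT INTO categories VALUES (1, 'Books');
    INSERT INTO products VALUES (100, 'SQL 101', 1);
    INSERT INTO order_items VALUES (10, 100, 2);
""")

# "Units sold per region per category" -- a routine report -- needs 5 joins
# purely because each fact lives in exactly one place, as normalization intends.
rows = conn.execute("""
    SELECT r.name, cat.name, SUM(oi.qty)
    FROM order_items oi
    JOIN orders o       ON o.id = oi.order_id
    JOIN customers c    ON c.id = o.customer_id
    JOIN regions r      ON r.id = c.region_id
    JOIN products p     ON p.id = oi.product_id
    JOIN categories cat ON cat.id = p.category_id
    GROUP BY r.name, cat.name
""").fetchall()
assert rows == [('EU', 'Books', 2)]
```

Denormalizing this into a wide sales table would shrink the join count, but at the cost of update anomalies; that trade only pays off when you've measured a problem.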