It's not an engineering culture problem lol, I promise. I have over a decade in this career and I've worked at places with fantastic and rigorous processes and at places with awful ones. The better places slacked each other a lot.

I don't see what's so hard to understand about "I need to understand the actual ramifications of my changes before I make them and no generated robotext is gonna tell me that"

I'm probably bad at explaining.

StackOverflow is a tool. You can use it to look for a solution to a bug you're investigating. You can use it to learn new techniques. You can use it to weigh the tradeoffs between different options. You can also use it to copy/paste code you don't understand and break your production service. That's not a problem with StackOverflow.

> "I need to understand the actual ramifications of my changes before I make them and no generated robotext is gonna tell me that"

Who's checking in this robotext?

* Is it some rogue AI agent? Who gave it unfettered access to your codebase, and why?

* Is it you, using an LLM to try to fix a bug? Yeah, don't check it in if you don't understand what you got back or why.

* Is it your peers, checking in code they don't understand? Then you do have a culture problem.

An LLM gives you code. It doesn't free you of the responsibility to understand the code you check in. If the only way you can use an LLM is to blindly accept what it gives you, then yeah, I guess don't use an LLM. But then you also probably shouldn't use StackOverflow. Or anything else that might give you code you'd be tempted to check in blindly.