LLM technology has no connection to reality, nor any avenue to actual understanding.

Correcting conceptual errors requires understanding.

Vomiting out large amounts of inscrutable, unmaintainable code for every change is not exactly an ideal replacement for a human.

We have not even begun to scratch the surface of the technical debt these systems are creating at lightning speed.

> We have not even begun to scratch the surface of the technical debt these systems are creating at lightning speed.

Bold of you to assume anyone cares about it, or that it’ll somehow guarantee your job security. They’ll just throw more LLMs at it.