They fail because the "scaffolding" amounts to building the complicated expert system that AI promised we would not have to build.

If I implement a strict parser and an output post-processor myself to guard against hallucinations, I have written 100% of the business logic. I can skip the LLM in the middle altogether.
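A minimal sketch of what such a guard could look like. Everything here is illustrative: the field names, the action whitelist, and the function name are hypothetical, not from any real system.

```python
import json

# Hypothetical whitelist: the only actions the business logic accepts.
ALLOWED_ACTIONS = {"create_invoice", "cancel_invoice", "refund"}

def validate_llm_output(raw: str) -> dict:
    """Strictly parse an LLM's JSON reply; reject anything unexpected."""
    data = json.loads(raw)  # raises ValueError on malformed JSON
    if set(data) != {"action", "amount"}:
        raise ValueError(f"unexpected fields: {sorted(data)}")
    if data["action"] not in ALLOWED_ACTIONS:
        raise ValueError(f"unknown action: {data['action']!r}")
    if not isinstance(data["amount"], (int, float)) or data["amount"] < 0:
        raise ValueError("amount must be a non-negative number")
    return data

print(validate_llm_output('{"action": "refund", "amount": 12.5}'))
```

Note that once the validator enumerates every acceptable input and output, it effectively *is* the expert system; the model in the middle adds little.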

> If I implement a strict parser and an output post-processor myself to guard against hallucinations, I have written 100% of the business logic. I can skip the LLM in the middle altogether.

Well said, and I could not agree more.

> If I implement a strict parser and an output post-processor myself to guard against hallucinations, I have written 100% of the business logic. I can skip the LLM in the middle altogether.

You might even be able to put a UI on it that is a lot more effective than asking the user to type text into a box.

Very interesting that you've found ways to mitigate the hallucination issue. Could you share more about what worked for you with the parser and post-processor?