Are you seriously asking why someone might want something guaranteed to behave exactly as they defined it, when they could instead have an LLM hallucinate code that touches the core of their system?

Why would anyone go with the inaccurate option?