I agree with the sibling replies, but maybe more tangentially: why is it that sometimes the point of these things is that I don't have to modify my behavior at all while the restaurant gets to pay one less person, but other times the point is all about me modifying my behavior so the company can pay one less person?

Like here: if the restaurant really wants to get rid of their intercom person, why not make it self-checkout, no AI required? What is actually saved or gained either way? There is nothing intrinsic about this situation that requires me to use natural language to order something. People order tons of food online these days anyway!

Like, I just don't think it makes sense, and I doubt the economics would work out at fast-food-restaurant scale anyway.

Again, just step back and think about it for a moment: a lot of this really doesn't make sense. The world is not full of tasks a good prompt can solve. There are a million things that aren't "produce this Python script" or "summarize this article, probably correctly."

Why can't it just be what it is? Why does it have to be everything or nothing? So much of the thinking around this feels so clearly wrongheaded that it's starting to feel truly absurd.