People have to stop talking like LLMs solved programming.

If you're someone with a background in Computer Science, you should know that we have formal languages for a reason, and that natural language is not as precise as a programming language.

But anyway, we're at peak AI hype; hitting the top of HN is worth more than a reasonable take. Reasonableness doesn't sell, after all.

So here we're seeing yet another text about how the world of software was solved by AI and being a developer is an artifact of the past.

> we have formal languages for a reason

Right? At least on HN, there's a critical mass of people loudly ignoring this these days, but no one has explained to me how replacing a formal language with an English-language-specialized chatbot - or even multiple independent chatbots (aka "an agent") - is a good tradeoff to make.

It's "good" from the standpoint of business achieving their objectives more quickly. That may not be what we think of as objectively good in some higher sense, but it's what matters most in terms of what actually happens in the world.

Should it be what matters most? Idiots leading idiots in a circle.

Yes, but the people who talk to me, as a Software Engineer, about what to build also talk to me only in natural language, not a formal language.

Does it really matter that English is not as precise, if the agent can make a consistent and plausible guess at what my intention is? And when it occasionally guesses incorrectly, I can always clarify.

You're right, of course, but you should consider that all formal language starts as an informal language idea in the mind of someone. Why shouldn't that "mind" be an LLM vs. a human?

I think mostly because an LLM is not a "mind". I'm sure there'll be an algorithm that could be considered a "mind" in the future, but present-day LLMs are not it. Not yet.

This is, in my opinion, the greatest weakness of everything LLM-related. If I care about the application I'm writing, and I believe I should if I bother writing it at all, it seems to me that I should want to be precise and concise in describing it. In a way, the code itself serves as a verification mechanism for my thoughts and for whether I understand the domain sufficiently.

English or any other natural language can of course be concise enough, but when being brief it leaves much to the imagination. Adding verbosity allows for greater precision, but I think that is exactly what formal languages are for, just as you said.
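A toy sketch of that gap (the example and all names are mine, not from anyone upthread): the English request "sort the users by name" says nothing about case sensitivity, direction, or tie-breaking, while even a few lines of typed code have to commit to each of those choices explicitly.

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

def sort_users(users: list[User]) -> list[User]:
    """Ascending, case-insensitive sort by name; ties broken by age.

    "Sort the users by name" in English specifies none of this;
    the formal version forces every ambiguity to be resolved.
    """
    return sorted(users, key=lambda u: (u.name.casefold(), u.age))

users = [User("bob", 30), User("Alice", 25), User("bob", 20)]
print([u.name for u in sort_users(users)])  # → ['Alice', 'bob', 'bob']
```

Three reasonable readers could give three different answers for where "bob" vs "Alice" lands under a naive case-sensitive sort; the code admits exactly one.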

Although, I think it's worth contemplating whether modern programming languages and environments have been insufficient in other ways: whether they're too verbose at times, whether IDEs should be databases first and language parsers second, whether we could add recommendations using far simpler but stricter patterns, given a strongly typed language.

My current gripes are auto imports STILL not working properly in the most popular IDEs, or an IDE not finding a referenced entity in a file if it's not currently open... LLMs sometimes help with that, but they are extremely slow compared to local cache resolution.

Long term, I think more value will come from directly improving the above, but we shall see. AI will stay around too, of course, but how much relevance it'll have in 10 years' time is anybody's guess. I think it'll become a commodity, the bubble will burst, and after a while we'll only use it when sensible. At least until the next generation of AI architecture arrives.