And we re-invent the wheel, basically. You have to use very specific prompts to make the computer do what you want, so why not just, you know... program it? It's not that hard.
Natural language is trying to be a new programming language, one of many, but it's the least precise one imho.
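To make the imprecision concrete, here's a toy Python sketch (made-up data, and just one of many defensible readings) of how much an instruction like "sort the users by name" leaves unsaid once you actually have to write it down:

    # "Sort the users by name" sounds precise until you have to write it down.
    # The code is forced to answer questions the English sentence never asked:
    # case sensitivity? missing names? accents? where do ties go?

    users = [
        {"name": "alice"},
        {"name": "Bob"},
        {"name": None},       # no name on file: first, last, or an error?
        {"name": "Álvaro"},   # accented: sorts after "z" by raw code point
    ]

    # One reading among many: case-insensitive, missing names last,
    # accents left to plain code-point order after casefolding.
    sorted_users = sorted(
        users,
        key=lambda u: (u["name"] is None, (u["name"] or "").casefold()),
    )

    print([u["name"] for u in sorted_users])
    # -> ['alice', 'Bob', 'Álvaro', None]

A programming language forces you to pick an answer to each of those questions; a prompt lets you ship without noticing you never did.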
> Natural language is trying to be a new programming language, one of many, but it's the least precise one imho.
I disagree that natural language is trying to be a programming language. I disagree that being less precise is a flaw.
Consider:
- https://www.ietf.org/rfc/rfc793.txt
- https://datatracker.ietf.org/doc/html/rfc2616
I think we can agree these are both documents written in natural language. They underpin the very technology we are using to have this discussion. It doesn't matter to either of us what platform we are on, or what programming language was used to implement them. That is not a flaw.
Biological evolution shows us how far you can get with "good enough". Perfection and precision are highly overrated.
Let's imagine a wild future, one where you copy-and-paste the HTML spec (a natural-language document) into a coding agent and it writes a complete, working implementation. Can you say with 100% certainty that this will not happen within your own lifetime?
In such a world, I would prefer to be an expert in writing specs rather than to be an expert in implementing them in a particular programming language.
In this world, where the LLM-written implementation has a bug that negatively impacts a human (the app could calculate a person's credit score, for example):
Who is accountable?
I couldn't even tell you who is liable right now for bugs that negatively impact humans. Can you? If I were an IC at an airplane manufacturer and a bug I wrote caused an airplane crash, who is legally responsible? Is it me? The QA team? The management team? Some 3rd-party auditor? Some insurance underwriter? I have a strong suspicion it is already very complicated without even considering LLMs.
What I can tell you is that, the last time I checked, laws are written in natural language, and they are argued for, against, and interpreted in natural language. I'm pretty confident there is applicable precedent and that the court system is already well equipped to deal with autonomous systems.
I agree with this. There's so much snake oil at the moment. Coding isn't the hard part of software development, and we already have unambiguous languages for describing computation. Human language is a bad choice for that, as we already find when writing specs for other humans. Adding more humanness to the loop isn't a good thing IMHO.
At best, an LLM is a new UI model for data. The push to get LLMs writing code is bizarre.