I wonder: how relevant is this portion of the software industry? Because I’m guessing there is also no way they can apply LLMs at scale, which is never discussed in the larger AI-at-work narrative
I work in an industry that requires reproducible binaries from source, and cryptographic hashes filed with a regulator.
It's also not aviation or medical. So perhaps it's more common than you imagine.
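The reproducible-build requirement above can be sketched concretely: two independent builds from the same source must produce byte-identical artifacts, which is checked by comparing cryptographic hashes. This is a minimal illustration, not any specific regulator's process; the file paths are hypothetical placeholders.

```python
# Minimal sketch: verify a reproducible build by hashing the artifact.
# The digest is what would be filed with a regulator; any hash mismatch
# between two builds from the same source means the build is not reproducible.
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: compare two independent builds of the same source.
# assert sha256_of("build_a/firmware.bin") == sha256_of("build_b/firmware.bin")
```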
I think my comment conveyed the wrong sentiment, my bad. I’m suggesting exactly this: there are extremely common cases in which deterministic software outcomes are needed/mandated/regulated, way more often than we think, often in boring, solved, but critical environments. Yet the entire AI industry acts as if that is an afterthought or an unimportant edge case.
LLMs aren't relevant to aviation and medical devices
Exactly! And yet they’re touted as a catch-all business case!!!
It is completely relevant, if you want reliable software that you use daily to continue running without a massive rewrite.
Before suggesting using LLMs to completely rewrite this sort of software, remember there is a reason why compilers need to be certified to operate in safety critical environments. Not every problem needs LLMs as the solution.
I would go as far as to say that using an LLM in this context is the wrong solution and is irrelevant to critical systems. Maybe some here see everything as tokens and feel every problem must be solved with LLMs.
Rewriting a toy web app using LLMs from Javascript to Typescript is great, but isn't good for safety critical systems.
Safety critical software is mostly a compliance dance that incidentally produces artifacts with lower defect rates than usual. LLMs can help with safety critical code as long as a human signs their name that they are responsible for its behavior.
When I'm sitting in the plane that has CAS firmware, I'd like to think it wasn't written by an LLM and that my death in the case of a CAS failure isn't chalked up to "some engineer somewhere gets in trouble".
There probably already is generated code in there, only it was generated from UML. I don’t think that LLM generated code will be treated differently from the point of view of the relevant regulations.
UML conversion is deterministic.
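The distinction can be sketched: template-driven code generation is a pure function of its input model, so the same model always yields byte-identical code, a property you can pin and verify in CI. The model and template below are hypothetical, not any real UML tool's format.

```python
# Minimal sketch: deterministic code generation from a model.
# No randomness, no sampling temperature: the same input model always
# produces byte-identical output, unlike LLM generation at temperature > 0.

TEMPLATE = "class {name}:\n    pass\n"

def generate(model: dict) -> str:
    # Pure function of the model: sorted iteration keeps output order stable.
    return "".join(TEMPLATE.format(name=c) for c in sorted(model["classes"]))

out1 = generate({"classes": ["Sensor", "Actuator"]})
out2 = generate({"classes": ["Sensor", "Actuator"]})
assert out1 == out2  # reproducible: identical runs, identical bytes
```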
I agree with you. The question is: how the hell is this never discussed when assessing the economic potential of AI-driven disruption? I ask because I have the impression that all the really relevant industries are resistant to the current narrative. That said, we had Claude helping bomb a school full of kids; you would guess the military would know better, but no :/