LLMs are a tool to help match human thought to what computers can do. People would like them to have exact, reproducible results, but they sit at the other end of the spectrum, more like people than tools. George correctly points out that there is a vast space closer to the compute hardware that might profitably be explored. Thanks to those same LLMs, exploring it is about to get a whole lot easier. If you marveled at the instant response of Turbo Pascal and IDEs, you're in for a whole lot more.
--- (that was the tl;dr, here's how I got there) ---
As a mapper[3], I tend to bounce everything I know against each new bit of knowledge I acquire. Here's what happens when that collides with GeoHot's observation about LLMs vs Compilers. I'm sorry this is so long; right now it's mostly stream of thought with some editing. It's an exploration of bouncing the idea of impedance matching against the things that have helped advance programming and computer science.
--
I've got a cognitive hammer that I tend to over-use, and that is seeing the world through the lens of a Ham Radio operator and impedance matching[2]. In a normal circuit, maximum power flows when the source of power and the load being driven have the same effective resistance. In radio frequency circuits (and actually any AC circuit) there's another aspect, reactance, which shifts current in time relative to the voltage. This is trickier because there are now two dimensions to consider instead of one, but most of the time a single value, VSWR, is sufficient to tell how well things are matched.
VSWR is adequate to know whether a transmitter is going to work or whether power bouncing back from the antenna might destroy equipment, but making sure it will work across a wide range of frequencies adds at least a third dimension. As time progresses, if you actually work with those additional dimensions, it slowly sinks in what works and how, and what had previously seemed like magic becomes engineering.
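To make that concrete, the whole calculation behind that single number fits in a few lines of Python. This is just a sketch assuming the usual 50-ohm reference; note how two very different loads can hide behind the same VSWR, which is exactly the information the single number throws away.

    # How the single VSWR number collapses the two dimensions
    # (resistance and reactance) of a load impedance.
    # Assumes a 50-ohm reference system, the usual ham convention.

    def vswr(z_load: complex, z0: float = 50.0) -> float:
        """Voltage standing wave ratio for a load on a line of impedance z0."""
        gamma = (z_load - z0) / (z_load + z0)   # complex reflection coefficient
        mag = abs(gamma)
        return (1 + mag) / (1 - mag) if mag < 1 else float("inf")

    # A perfect 50-ohm load, a purely resistive mismatch, and a reactive load
    # that shows the same VSWR -- the number hides *why* it's mismatched.
    print(vswr(50 + 0j))            # 1.0
    print(vswr(100 + 0j))           # 2.0
    print(vswr(complex(40, -30)))   # also 2.0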
For example, vacuum tube based transmitters have higher output resistances than almost any antenna; transformers and coupling elements that shift power back and forth between the two dimensions allow optimum transfer without losses, at the cost of complexity.
On the other hand, semiconductor based transmitters tend to have the opposite problem: their impedances are lower, so different patterns work for them, but most people still just see it as "antenna matching" and focus on the single number, ignoring the complexity.
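For the curious, the textbook lossless L-network math behind that "shifting power back and forth" is also tiny. A Python sketch, assuming ideal parts at a single spot frequency, with made-up illustrative values rather than numbers from any particular rig:

    import math

    def l_network(r_high: float, r_low: float, freq_hz: float):
        """Lossless low-pass L-network matching r_high down to r_low at one frequency.

        Returns (series_L_henries, shunt_C_farads): series inductor on the
        low-resistance side, shunt capacitor across the high-resistance side.
        The reactances are the second dimension doing the matching work.
        """
        q = math.sqrt(r_high / r_low - 1)   # loaded Q set by the resistance ratio
        x_series = q * r_low                # ohms of series inductive reactance
        x_shunt = r_high / q                # ohms of shunt capacitive reactance
        w = 2 * math.pi * freq_hz
        return x_series / w, 1 / (w * x_shunt)

    # A tube-era plate impedance of ~4000 ohms into a 50-ohm antenna at 7 MHz.
    L, C = l_network(4000, 50, 7.0e6)
    print(f"series L = {L*1e6:.2f} uH, shunt C = {C*1e12:.1f} pF")

Real matching networks (pi networks, link coupling) add bandwidth and harmonic filtering on top of this, which is where the extra dimensions come back in.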
{{Wow... this is a book, not an answer on HN... it'll get shorter after a few edits, I hope, NOPE... it's getting longer...}}
Recently, a tool that used to cost thousands of dollars, the Vector Network Analyzer, has become available for less than $100. It allows measuring resistance, reactance, and gain simultaneously across frequency. Like compilers, it brings into manageable scope things that otherwise seemed too complex. It only took a few sessions playing with a NanoVNA to understand things that previously would have taken some intense EE classwork with Smith Charts.
Similarly, $30 Software Defined Radios and GNU Radio (for $0.00) make it possible to understand digital signal processing in ways that would otherwise have required professional instruction. With these tools you can build a signal flow graph in an interactive window and, in moments, have a working radio for FM, AM, Sideband, or any other thing you can imagine. It's magic!
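As an aside, the heart of an FM receiver in one of those flow graphs is a quadrature discriminator: just the phase difference between successive complex samples. This isn't GNU Radio's code, only a NumPy sketch of the idea such a block wraps, with arbitrary sample rate and deviation chosen for the demo:

    import numpy as np

    def fm_discriminator(iq: np.ndarray) -> np.ndarray:
        """Recover the modulating signal from complex baseband FM samples.

        The instantaneous frequency is the sample-to-sample phase change,
        i.e. the angle of x[n] * conj(x[n-1]).
        """
        return np.angle(iq[1:] * np.conj(iq[:-1]))

    # Synthetic test: frequency-modulate a 1 kHz tone, then demodulate it.
    fs = 250_000.0                               # sample rate in Hz (arbitrary)
    t = np.arange(0, 0.01, 1 / fs)
    tone = np.sin(2 * np.pi * 1000 * t)          # the "audio" we transmit
    deviation = 5000.0                           # peak deviation in Hz
    phase = 2 * np.pi * deviation * np.cumsum(tone) / fs
    iq = np.exp(1j * phase)                      # the FM signal at baseband
    audio = fm_discriminator(iq)                 # tracks `tone` closely
    print(np.corrcoef(audio, tone[1:])[0, 1])    # correlation ~ 1.0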
-- back to computing and HN --
In the Beginning was ENIAC, a computer that took a team with some experience days to set up and get working on a given problem. Then John von Neumann came along and added the idea of stored programs, which involved sacrificing the inherently parallel nature of the machine, losing 70% of its performance, but making it possible to set it up for a task simply by loading a "program" onto its bank of instruction switches.
Then came cards and paper tape storage, further increasing the speed at which data and programs could be handled.
It seems to me that compilers were like one of the above tools: they made it possible for humans to do things that, in the beginning of programming, only Alan Turing or others similarly skilled could do.
Interactive programming increased the availability of compute and made machines that were much faster than any programmer more easily distributed among teams of programmers.
IDEs were another. Turbo Pascal allowed compiling, linking, and execution to happen almost instantly. It threw the space for experimentation wide open by reducing the time required to get feedback from minutes to almost zero.
I've done programming on and off through 4 decades of work. Most of my contemplation is as an enthusiast rather than a professional. As for compilers and the broader areas of Computer Science I haven't formally studied, it seems to me that LLMs, especially the latest "agentic" versions, will let me explore things far more easily than I might have otherwise. LLMs have helped me match my own thoughts across a much wider cognitive impedance landscape. (There's that analogy/hammer in use...)
Compilers are an impedance matching mechanism. Allowing a higher level of abstraction gives flexibility. One of the ideas I've had in the past for improving the interaction between people and compilers is to allow compilers that also work backwards.[1] I'm beginning to suspect that with LLMs I might actually be able to attempt to build this system; it has always seemed out of reach because of the levels of complexity involved.
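To show what I mean by "backwards", here's a deliberately tiny toy in Python, nothing like a real bidirectional compiler, just the round-trip idea in miniature: a little expression language compiled to stack code, and the stack code symbolically executed back into the expression.

    # Toy illustration of a compiler that also works backwards.  A real
    # system would have to preserve far more (names, comments, structure);
    # this only shows the round trip.

    def compile_expr(expr):
        """Compile a nested tuple like ('+', 'a', ('*', 'b', 'c')) to stack code."""
        if isinstance(expr, str):                 # a variable reference
            return [("push", expr)]
        op, lhs, rhs = expr
        return compile_expr(lhs) + compile_expr(rhs) + [("apply", op)]

    def decompile(code):
        """Rebuild the expression tree by symbolically executing the stack code."""
        stack = []
        for kind, arg in code:
            if kind == "push":
                stack.append(arg)
            else:                                 # "apply": pop two operands
                rhs, lhs = stack.pop(), stack.pop()
                stack.append((arg, lhs, rhs))
        return stack[0]

    src = ("+", "a", ("*", "b", "c"))
    obj = compile_expr(src)                       # [('push','a'), ('push','b'), ...]
    assert decompile(obj) == src                  # the backwards direction
    print(obj)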
I have several other ideas that might warrant a new attempt, now that I'm out of the job market, and have the required free time and attention.
{{Sorry this turned out to be an essay... I'm not sure how to condense it back down right now}}
[1] https://wiki.c2.com/?BidirectionalCompiler
[2] https://en.wikipedia.org/wiki/Impedance_matching
[3] https://garden.joehallenbeck.com/container/mappers-and-packe...