Even though I don’t buy that LLMs are going to replace developers and quite agree with what is said, this is more of a critique of LLMs as English-to-code translators. LLMs are very useful for many other things.

Researching concepts, for one, has become so much easier, especially for things where you don’t know anything yet and would have a hard time even formulating a search engine query.

I’ve found that ChatGPT and Perplexity are great tools for finding “that article I skimmed a year ago that talked about…”.

I agree. I think a better analogy than a compiler is a search engine that has an excellent grasp of semantics but is also drunk and schizophrenic.

LLMs are really valuable for finding information that you aren't able to formulate a proper search query for.

To get the most out of them, ask them to point you to reliable sources instead of explaining directly. Even then, it pays to be very critical of where they're leading you. Making an LLM the primary window through which you seek new information is epistemologically precarious. Personally, I'd use it as a last resort.