> Sure, we may see many lacking fundamentals, but I suppose that isn't so different from the criticism I heard when I wrote most of my first web software in PHP.

It's not just the fundamentals, though you're right that those are an easy casualty. I also agree that LLMs can greatly help with some forms of learning -- previously, you kind of had to follow the incremental path, where you couldn't really do anything complex without having the skills it built on, because 90% of your time and brain would be spent on getting the syntax right or whatever, and so you'd lose track of the higher-level thing you were exploring. With an LLM, it's nice to be able to (temporarily) skip that learning and explore different areas at will. Especially when that motivates the desire to then go back and learn the basics.

But my real fear is about the skill acquisition, or simply the thinking. We are human, we don't want to have to go through the learning stage before we start doing, and we won't if we don't have to. It's difficult, it takes effort, it requires making mistakes and being unhappy about them, unhappy enough to be motivated to learn how to not make them in the future. If we don't have to do it, we won't, even if we logically know that we'd be better off.

Especially if the expectations are raised to the point where the pressure to be "productive" makes it feel like you're wasting other people's time and your paycheck to learn anything that the LLM can do for you. We're reaching the point where it feels irresponsible to learn.

(Sometimes this is ok. I'm fairly bad at long division now, but I don't think it's holding me back. But juniors can't know what they need to know before they know it!)

> But my real fear is about the skill acquisition, or simply the thinking. We are human, we don't want to have to go through the learning stage before we start doing, and we won't if we don't have to. It's difficult, it takes effort, it requires making mistakes and being unhappy about them, unhappy enough to be motivated to learn how to not make them in the future. If we don't have to do it, we won't, even if we logically know that we'd be better off.

I've noticed the effects of this firsthand from intense LLM engagement.

I relate it more to the effects of portable calculators, navigation systems, and tools like Wikipedia. I'm under the impression this is valid criticism, but we may be overly concerned because it's a new and powerful tool. There are even surveys/studies showing differences in how LLMs are perceived WRT productivity between different generations.

I'm concerned with the potential loss of critical thinking skills more than anything else. And on a related note, there were concerns about critical thinking skills before this mass adoption of LLMs. I'm also concerned with the impact of LLMs on the quality of information. We're seeing a huge jump in quantity while quality often lags. It bothers me when I see an LLM confidently presenting incorrect information that's seemingly trivial to validate. I've had web searches give me incorrect information from LLM tooling at a much greater frequency than I've ever experienced before. It's even more unsettling when the LLM gives the wrong answer and the correct answer is in the description of the top result.

"I'm also concerned with the impact of LLMs on the quality of information."

You have finally made an astute observation...

As part of my startup thesis, I have already made the assumption that use of LLMs is going to add new mounds of BS atop the mass of crap that already exists on the internet.

These things are not obvious in the here and now, but I try to take the long view: how would the present day look from 50 years out in the future, looking backwards?