I feel it’s a function of abstraction.

You learned when analogue circuitry was the norm. I learned when digital circuitry was simple enough that you could readily take something apart and understand it.

Now, EE courses often start with CAD, simulations, and digital electronics, and you end up with people building ziggurats atop an ocean of incomprehension.

It’s exactly the same thing with software.

I don’t scorn people for this; rather, I see myself as fortunate for having learned in a time when the more fundamental knowledge was still worth learning. And that’s the rub: for the vast majority, it simply isn’t worth the time or energy to explore the full stack when there’s so much to learn atop it.

"You learned when analogue circuitry was the norm. I learned when digital circuitry..."

What's not taught properly these days is that ALL electronics is analog at the physical/circuit level.

For you digital types, that's OSI Model Layer 1 — the Physical layer (look it up on Wiki). Nothing in electronics works unless that layer is working properly — ICs, tunnel diodes, transistors, inductors, resistors, capacitors, cables, and antennas are all analog devices at that level. That includes the heart of the most advanced digital ICs. For example, the upper clock speeds in processors are limited by transit times/electron mobility, inter-electrode and stray capacitances, unwanted inductance, etc. — all of which are analog effects that must be accounted for.
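As a rough back-of-the-envelope illustration of one of those effects: a digital node driven through some output resistance into a parasitic capacitance charges exponentially, and that RC time constant caps how fast the node can toggle. The numbers below are made-up illustrative values, not any real process node:

```python
import math

# Hypothetical illustrative values -- not from any real process node.
R_driver = 1e3        # driver output resistance, ohms
C_parasitic = 10e-15  # parasitic/load capacitance, farads (10 fF)

tau = R_driver * C_parasitic  # RC time constant, seconds

# A node charging through R toward Vdd follows V(t)/Vdd = 1 - exp(-t/tau);
# solving 0.9 = 1 - exp(-t/tau) gives the time to reach 90% of Vdd.
t_settle = -tau * math.log(1 - 0.9)

# A full clock period needs at least one rise and one fall: ~2 * t_settle.
f_max = 1 / (2 * t_settle)

print(f"tau = {tau * 1e12:.1f} ps, "
      f"90% settle = {t_settle * 1e12:.1f} ps, "
      f"f_max ~ {f_max / 1e9:.1f} GHz")
```

Even this toy model shows the point: the "digital" speed limit falls straight out of an analog first-order circuit, and shaving parasitics (smaller C, lower R) is what actually buys clock headroom.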

Like it or not, the physical analog world is alive and well! The Noughts & Ones Brigade unfortunately seems to have forgotten that fact.

> you end up with people building ziggurats atop an ocean of incomprehension.

Everyone does. There's probably a layer below for everyone but the most theoretical physicists. I don't know where the leaks in electronics engineering's abstractions are, but I'm pretty sure they exist.