I briefly taught a beginner CS course over a decade ago, and even then it was surprising and disappointing how many of my students would reach for a calculator to do single-digit arithmetic, something we were required to commit to memory when I was in school. Not surprisingly, teaching them binary and hex was extremely frustrating.

I tell people that when I tip, I "round off to the nearest dollar, move the decimal point (10%), and multiply by 2" (generating a tip in the ballpark of 20%), and I'm always told "that's too complicated".

I would tell others to "shift right once, then divide by 2 and add" for 15%, and get the same response.
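Spelled out on a made-up bill, the two tricks are just this (a quick sketch, numbers invented):

  # Mental-math tipping, spelled out on a made-up bill.
  bill = 47.30
  rounded = round(bill)                     # 47   -- round off to the nearest dollar
  ten_percent = rounded / 10                # 4.7  -- move the decimal point
  tip_20 = ten_percent * 2                  # 9.4  -- double it, roughly 20%
  tip_15 = ten_percent + ten_percent / 2    # 7.05 -- 10% plus half again, roughly 15%
  print(tip_20, tip_15)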

However, I'm not sure what you mean by there being a problem with thinking that abstraction is bad. Yes, abstraction is bad, because it is a way to hide and obscure the actual details, and one could argue that such dependence on opaque things, just like a calculator or AI, is the actual problem.

> shift right once, then divide by 2

So, shift right twice? ;)

I think asking people to convert to binary might be a bit too much lol

  > Yes, abstraction is bad
Code (and math) is abstraction.

No ifs, ands, or buts about it.

I'm sorry, but I think you are teaching people the wrong thing if you make the blanket statement that "abstraction is bad". You are throwing the baby out with the bathwater. You can "over abstract", and that certainly is not good, but it's not easy to define because it is extremely problem dependent. With these absurd blanket statements you just push code quality and performance down.

Over-abstraction is bad when it makes code too difficult to read, or when it de-optimizes programs. "Too difficult to read or maintain" is ultimately a skill issue: we don't let the juniors decide that, but neither should we have abstractions that only wizards can maintain. Both are errors.

But abstraction can also greatly increase readability and help maintain code. It's the reason we use functions. It's the reason we use OOP. It helps optimize code, it can help reduce writing, it can and does do many beneficial things.
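A trivial sketch of what I mean (the cart and tax rate are made up):

  from collections import namedtuple

  Item = namedtuple("Item", ["price", "qty"])
  cart = [Item(3.50, 2), Item(12.00, 1)]

  # Without the abstraction: the intent is buried in the arithmetic.
  total = 0
  for item in cart:
      total += item.price * item.qty * 1.08

  # With it: the name carries the meaning, and the tax rule lives in one place.
  TAX_RATE = 0.08
  def line_total(item):
      return item.price * item.qty * (1 + TAX_RATE)

  total = sum(line_total(item) for item in cart)
  print(total)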

Lumping everything together is just harmful.

Saying abstraction is bad is no different than saying "python is bad", or any duck typing language (including C++'s auto), because you're using an abstract data type. The "higher level" the language, the more abstract it is.

Saying abstraction is bad is no different than saying templates are bad.

Saying abstraction is bad is no different than saying object oriented programming is bad.

Saying abstraction is bad is saying coding is bad.
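On the duck-typing point, this is all that's meant: you write against "whatever has the right method", not a concrete type. A toy sketch:

  # Toy duck typing: total_area works on anything with an .area() method,
  # no shared base class required.
  class Square:
      def __init__(self, side):
          self.side = side
      def area(self):
          return self.side ** 2

  class Circle:
      def __init__(self, r):
          self.r = r
      def area(self):
          return 3.14159 * self.r ** 2

  def total_area(shapes):
      return sum(s.area() for s in shapes)

  print(total_area([Square(2), Circle(1)]))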

I'm sorry, literally everything we do is abstraction. Conflating "over abstraction" with "abstraction" is just as grave an error as the misrepresentation of Knuth's "premature optimization is the root of all evil." Dude said "grab a fucking profiler" and everyone heard "don't waste time making things work better".
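And "grab a profiler" is about this much work (a minimal sketch using Python's built-in cProfile; the function is made up):

  # Measure first, then decide what's worth optimizing.
  import cProfile

  def slow_sum(n):
      return sum(i * i for i in range(n))

  cProfile.run("slow_sum(1_000_000)")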

If you want to minimize abstraction, then go write machine code. Anything short of that has abstracted away many actions and operations. I'll admire your skill, but that's a path I will never follow nor recommend. Abstraction is necessary, and our ability to abstract is foundational to making code work at all.

*I will die on this hill*

  > because it is a way to hide and obscure the actual details
That's not abstraction, that's obfuscation. Do not conflate these things.

  > one could argue that such dependence on opaque things, just like a calculator or AI, is the actual problem.
I'll let Dijkstra answer this: https://www.cs.utexas.edu/~EWD/transcriptions/EWD06xx/EWD667...