A related idea is sub-linear cost growth: the unit cost of operating software falls the more it's used. This should be common, right? But it's oddly rare in practice.
I suspect the same will hold for programming: a chasm between the perceived cost and the actual cost.