> one thing that I think was revolutionary about Jobs...

No. Absolutely, unequivocally, no. You're talking about the difference between then and now in the way software was/is built, not the difference between Jobs and everyone else! The deification of Jobs is bad enough without the constant historical revisionism.

Back then, tuning and maximising performance and quality was generally a top priority for most people in the industry, software engineers and senior staff alike. "Faster hardware" just didn't affordably exist for them back then. Many who were there in those days now bemoan the way their modern equivalents no longer prioritise efficiency, which leads to the awful, slow UX you're referring to that really shouldn't be seen as acceptable, but somehow is.

Even if we see Jobs as being at the extreme - more focused on these things than most top-level execs of his day - treating the entire rest of the industry as though they were at the opposite extreme (i.e. at today's level of not caring), and calling him "revolutionary" in his day on that basis, would very much be fallacious.

> "Faster hardware" just didn't affordably exist for them back then.

Not what I said. And I think you're the one doing historical revisionism now.

Even this email from 1983 starts off with

> since its 68000 microprocessor was effectively 10 times faster than an Apple II

From the 80s through the 00s (which I was alive through and very aware of), computer hardware was frequently doubling in performance. The common wisdom then was to make things fast enough. Anything more was a waste of time because in a year or two hardware would be twice as fast.

The wastefulness of today came directly from that past wisdom. I can guarantee you that for as long as I've been conscious of discussions about software, there have been people bemoaning how sloppy and wasteful software has become. People complained about how bloated Windows XP was vs 98.

Ruby, Python, Perl, Java: all these bloated and slow programming languages got their starts in the 80s and 90s, exactly because of the wisdom that "it's slow today but hardware tomorrow will make it fast". Heck, even C and Lisp are manifestations of this. Consider that people weren't writing all software in assembly during the time period in question, even though there were clear performance benefits to doing so, as compilers at the time were particularly bad.

I've worked with a lot of older devs, and they all hold the attitude that performance optimization is a complete waste of time. They've been the hardest ones to break of that notion. Younger devs tend to know more intuitively that performance optimizations are important. That's because over the last decade, hardware performance improvements have stagnated.

So yes, absolutely yes. In the past if you could make writing software more ergonomic by sacrificing some memory or performance, that's a tradeoff most of the industry would gladly take. They wrote for today's hardware and sometimes tomorrow's.

> The common wisdom then was to make things fast enough. Anything more was a waste of time because in a year or two hardware would be twice as fast.

...and people would be using hardware spanning several generations that differed considerably in performance, as nobody but the nerds was buying a new PC every year.