Try running software from 1995 on a brand new system and you'll find all sorts of fun reasons why it's more complicated than that.
I don’t think I can take that claim by itself as necessarily implying the cause is hardware. Consumer OSes were only just getting protected memory at that time, as one example of where things stood, so when I imagine “take an old application and try to run it,” I immediately imagine software problems, and software bit rot is a well-known thing.

If the claim is “try to run Windows 95 on bare metal,” then… well, actually, I installed Win98 on a new PC about 10 years ago and it worked. When I try to imagine hardware changes since then that a kernel would have to worry about, I mostly come up with PCI Express and some brave OEMs finally removing BIOS compatibility and leaving only UEFI.

I’m not counting the lack of drivers for modern hardware as “hardware still changes,” because that feels like a natural consequence of having multiple vendors in the market. Maybe I could be convinced that it’s a fundamental change in and of itself… but even then, that state of things was thoroughly normalized by the 2000s.
Drivers make up a tiny portion of the software on our computers by any measure (memory or compute time), and they're far longer-lived than your average GUI app.
On the other hand, the main reason Y2K happened was that a lot of major orgs would rather emulate software from the '60s forever than rewrite it. I'm talking about ancient IBM mainframe stuff, running on potentially multiple layers of emulation and virtualization.
We rewrite stuff for lots of reasons, but virtualization makes it easy enough to take our platforms with us even as hardware changes.
Pretty sure if I downloaded and compiled Tcl/Tk 7.6.x source code on a modern Linux box, it would run my Tcl/Tk 7.6.x "system monitor" code from 1995 or 1996 just fine.
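(To be clear, I mean the kind of script that only leans on core Tcl/Tk commands. The original is long gone, but a hypothetical sketch of that sort of thing would look roughly like this; the /proc/loadavg path is a Linux-specific assumption, and every command here already existed in 7.6:)

    # show the 1-minute load average in a Tk label, refreshed every 2 seconds
    label .load -text "load: ?"
    pack .load

    proc refresh {} {
        # read the first field of /proc/loadavg (Linux-specific)
        set f [open /proc/loadavg r]
        set load [lindex [read $f] 0]
        close $f
        .load configure -text "load: $load"
        after 2000 refresh
    }

    refresh

Run it under wish and, as far as I can tell, it behaves the same on a modern Tcl/Tk 8.x as it would have back then.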
Do you have any examples that aren't because of the OS (as in, not trying to run a '90s game on Windows 11) or specialized hardware (like an old Voodoo GPU or something)?
The whole point is that everything changes around software. Drivers, CPUs, GPUs, web browsers, OSes, common libraries, etc. Everything changes.
It doesn't matter if x86 is backwards compatible if everything else has changed.
No code can last 100 years in any environment with change. That's the point.
If you restrict yourself to programs that don't need an OS or hardware, you're going to be looking at a pretty small set of programs.
I don't, but I do require that you run it on the same OS it was designed for.
Backwards-compatibility in OSes is the exception, not the rule. IBM does pretty well here. Microsoft does okay. Linux is either fine or a disaster depending on who you ask. MacOS, iOS, and Android laugh at the idea. And even the OSes most dedicated to compatibility devote a ton of effort to ensuring it on new hardware.