Just false nostalgia.
20 years ago things weren't any better. Software didn't consume gigabytes of ram because there were no gigabytes of ram to consume.
There's almost no product or piece of software that I use today that doesn't have at least 2 bugs that I run into on a daily basis. Every website, web app, mobile app, console app, etc., they all have clearly user-affecting bugs. And nearly every one of them makes it hard for me to diagnose or report those bugs. I spend at least 15 to 30 minutes every day working around bugs just so I can live my life.
We have a vastly different software culture today. Constant, churning change is prized above all else. I can't go two weeks without a mobile app forcing me to upgrade it so that it will keep operating. My Kubuntu 24.04 LTS box somehow has a constant stream of updates even though I've double-checked that I'm on the LTS apt repos. Rolling-release distros are an actual thing people use intentionally (we used to call that the unstable branch).
I could speculate on specifics, but I'm not a software developer, so I don't see exactly what's going on with these teams. But software didn't use to be made or used this way. It felt like there were more adults in the room who would avoid making decisions that would clearly lead to problems. I think the values have changed to accept or ignore those problems. (I don't want to jump to the conclusion that "they're too ignorant to even know what potential problems exist", but it's a real possibility.)
I was a software developer, and have some idea where this comes from. Keeping track of multiple versions of software to separate bug fixes from new features is hard, but calling whatever is in version control on the first Friday of every month "version N+1" is easy. Back when users had to pay for a stack of floppies or CDs to get a new version, you had to give them a compelling reason to do so. When it's nearly impossible for them to prevent the new version from being auto-installed, you don't.
Nah, I don't think so -- it really was a big deal to have a bug back then, and software quality used to be a lot higher. We could go back and run some VMs to try to objectively quantify this (and it would be interesting to do so), but I'm personally confident my memory isn't being tinted by nostalgia.
The main reason is the ability to do constant updates now -- it changes the competitive calculus. Ship fast and fix bugs constantly wins out vs. going slower and having fewer bugs (both in the market & w/in a company "who ships faster?").
When you were shipping software on physical media having a critical bug was a very big deal. Not so anymore.
The problem with constant updates is that usually developers will make it so that the app stops working unless you update.
Anybody that was around for the Windows 95-ME era remembers. Things would just randomly crash. BSODs, "This program has performed an illegal operation", "A device attached to the system is not functioning", "Windows will restart and repair your registry", "Explorer has caused an error"... Ctrl+S was the first keyboard shortcut every schoolchild learned so that Word wouldn't munge their homework.
Let's not even think about the absolute mess that the web was with competing browser box models and DHTML and weird shared hosting CGI setups. We have it easy.
20 years ago, you could consistently pick up a phone, get a dial tone, and then call a human to solve a problem.
Sure, plenty of stuff didn't work. The issue is we're not bothering to make anything that does. It's a clear cultural shift and all of this "nothing ever worked so why try" talk here is not what I remember.
We're in a stochastic era of scale where individual experiences do not matter. AI turning computers from predictable to unpredictable pushes in the same direction, only with more velocity.
I think you've got your time ranges wrong. Almost exactly 20 years ago I worked for a company that did dashboards for analyzing and managing call centre traffic. The most important metric wasn't "how happy customers are" or "how many calls served" it was: "how many calls did we avoid having a human agent get involved for?" The metric of success was the degree to which they could avoid using expensive human labour and get you through the automated call tree.
Companies offered such (expensive) services because they had no choice. They made every effort to divert and divest from such activities. Google and companies like them made filthy profits because they figured out the secret sauce to scaling a business without the involvement of humans, but people were trying it for literally decades with mixed results (usually enraged customers).
Stupid red tape, paperwork, and call centre frustrations were the order of the day 20-30 years ago.
What part of that is inconsistent with what I said? Over a long period of time, including 20 years ago, the trajectory of cultural values has been moving from consistency toward "usually acceptable, but at least it's cheap." It wasn't started by AI, but slop is the next stop.
I'm saying the experience sucked back then, too. I don't see it as worse, frankly.
There have always been things that sucked. The difference, in my mind, is that we now dismiss the idea that quality ever existed, as if it were an impossible hypothetical.
See "A plea for lean software" by Wirth.
It's from 1995 and laments that computers need megabytes of memory for what used to work in kilobytes.
The transition from kilobytes to megabytes is not comparable to the transition from megabytes to gigabytes at all. Back in the kilobyte days, when engineers still had to manage bits and resort to all kinds of tricks just to get something working, a lot of software (and software engineering) left much to be desired. Far too much effort went not into building the business logic but into working around the shortage of memory (and other computing resources). Legitimate requirements had to be butchered like Procrustes' victims just so the software could exist at all. The megabyte era accommodated everything but high-end media software without forcing compromises in its internal design. That was the time when things could be done properly, no excuses.
Today's disregard for computing resource consumption is simply the result of those resources becoming too cheap to be properly valued, combined with a tendency to take their continued growth for granted. There's very little in today's added software functionality that couldn't be delivered without gigabytes of memory consumption.
Wirth's law is eating Moore's law for lunch.
> 20 years ago things werent any better.
Yes they were. I was there. Most software was of a much higher quality than what we see today.
Back in the Mac OS < 10 or MS-DOS days, all programs shared the same address space. Bugs that would crash a single program nowadays would bring down the entire system. But you could somehow run a bunch of sketchy shareware games, and your computer usually wouldn't crash or run out of memory, because programmers cared about bugs.
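A minimal C sketch of that difference, purely illustrative (not taken from any particular program mentioned in the thread):

    /* A stray write through a bad pointer. On a modern OS with
       per-process virtual memory, the MMU traps it and only this
       process dies with a segfault. Under real-mode MS-DOS, where
       every program shared one physical address space, address 0
       holds the interrupt vector table, so the same write could
       scribble over it and wedge the whole machine. */
    #include <stdio.h>

    int main(void) {
        int *wild = NULL;   /* memory this program does not own */
        printf("writing through a wild pointer...\n");
        *wild = 42;         /* modern OS: SIGSEGV, contained to this process */
        printf("not reached on a protected-memory system\n");
        return 0;
    }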
One could easily imagine the analogy, e.g. the amount of commercial software (in critical infrastructure) that "leaked kBytes and MBytes of memory", or the embarrassing "Computer Science 101 error handling" fuck-ups of the day with billion-dollar consequences, but I believe you've got it surrounded...
Ariane 5?
Therac-25?
Anyone?
Bueller? Bueller?
If you Ferris-wheel some good hard info into an amusing article, I'll read it. Promise.
Yeah, I wonder if the people saying this are just too young to have experienced it. Computers and software have always been janky as fuck. I used to have games crash my entire system instead of just the program. That still happens today, but not as frequently.
But now there are gigabytes, and I would love to fill those up with countless virtual machines and applications, but sadly that is not possible since bloat has grown at least as fast as RAM size.
How is that not better? If you could do it without consuming gigabytes back in the day, and now you can't, something must have gotten worse. The cost of consuming gigabytes has gone down, but being able to tolerate a worse situation doesn't mean the situation isn't worse.
You couldn't consume gigabytes because that amount of ram didn't exist. You still had apps with the same issues that would eat all your ram.
Computers crashed all the fucking time for dumb bugs. I remember being shocked when I upgraded to XP and could go a full day without a BSOD. Then I upgraded to Intel OS X and was shocked that a system could run without ever crashing.
Edit: this isn't to say that these issues today are acceptable, just that broken software is nothing new.
> You couldn't consume gigabytes because that amount of ram didn't exist.
No, they didn't consume gigabytes because they were written in such a way that they didn't need to. Run one of those programs on a modern computer with gigabytes of ram and it still won't. It was as easy then as ever to write software that demanded more resources than were available; the scarcity at the time was just the reason programmers cared enough to fix their bugs.
> You still had apps with the same issues that would eat all your ram.
The worst offenders back then had objectively smaller issues than what would be considered good now.
> Computers crashed all the fucking time for dumb bugs. I remember being shocked when I upgraded to XP and could go a full day without a BSOD.
Because XP could handle more faults, not because the programs running on XP were better written.
Exactly, and if you used machines 20-30 years ago you got used to all sorts of periodic terrible faults that could require rebooting the machine ("blue screens" or sad Mac, guru meditation, etc) or at least restarting the program.
On top of that many things were simply hard to use for non-specialists, even after the introduction of the GUI.
They were also riddled with security holes that mostly went unnoticed because there was simply a smaller and less aggressive audience.
Anyways most people's interaction with "software" these days is through their phones, and the experience is a highly focused and reduced set of interactions, and most "productive" things take a SaaS form.
I do think as a software developer things are in some ways worse. But I actually don't think it's on a technical basis but organizational. There are so many own goals against productivity in this industry now, frankly a result of management and team practices ... I haven't worked on a truly productive fully engaged team in years. 20-25 years ago I saw teams writing a lot more code and getting a lot more done, but I won't use this as my soapbox to get into why. But it's not technology (it's never been better to write code!) it's humans.
Anyone remember the good old 90s and Win95 or Win98? Yeah, those were the days: the quality was absolutely perfect, nothing ever crashed /s
This is exactly the right way to think about it.