Operating systems of that era were designed based on UX research to help people use the unfamiliar operating system.
Subsequent ones were designed by UI designers and opinionated senior managers who already knew how to use them, and who took out usability features to make them "look nicer". This sort of worked when the opinionated manager was Steve Jobs. Most managers are not Steve Jobs.
> in some applications they seem to have taken extra steps to make it difficult to find the line to grab
Pet peeve of mine in Windows, where the grabbable line is at most one pixel wide now. They also took away the colour distinction between active and inactive title bars, so you don't know where keystrokes are going to go.
> Operating systems of that era were designed based on UX research
Too many developers nowadays don't know this. On any HN discussion of UIs, I've been noticing a growing number of younger devs insisting that usability is entirely subjective (their words, not mine). It's not just that they don't know about cleverly thought-out things such as safe triangles in nested menus or all the affordances/signifiers espoused by Don Norman et al. The bigger problem is that they don't know what they don't know, and they come across as being unwilling to learn.
It does make UX discussions frustrating and meaningless when they could, and should, be interesting and a learning experience for us all.
> safe triangles in nested menus
I did not know about this, but I did notice my own menu-rage every time a submenu disappears!
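For anyone else who hadn't heard of it: the idea is that when the pointer leaves a parent menu item heading toward its submenu, the submenu stays open as long as the pointer remains inside the triangle formed by the point where it left the item and the submenu's two nearest corners. A minimal sketch of that hit test (function names are hypothetical, not any toolkit's actual API):

```python
# Hypothetical sketch of the "safe triangle" heuristic: keep a submenu
# open while the pointer stays inside the triangle formed by where the
# pointer left the parent item and the submenu's two nearest corners.

def _cross(o, a, b):
    """2D cross product of vectors OA and OB; its sign gives orientation."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_safe_triangle(pointer, origin, submenu_top, submenu_bottom):
    """True if `pointer` lies inside (or on the edge of) the triangle."""
    d1 = _cross(origin, submenu_top, pointer)
    d2 = _cross(submenu_top, submenu_bottom, pointer)
    d3 = _cross(submenu_bottom, origin, pointer)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all same sign => inside

# Pointer left the parent item at (100, 50); the submenu's left edge
# runs from (200, 0) down to (200, 100).
assert in_safe_triangle((150, 50), (100, 50), (200, 0), (200, 100))       # heading toward submenu: keep open
assert not in_safe_triangle((120, 120), (100, 50), (200, 0), (200, 100))  # wandered away: ok to close
```

Real implementations usually also add a short timeout so the submenu eventually closes if the pointer stalls inside the triangle.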
> Operating systems of that era were designed based on UX research to help people use the unfamiliar operating system.
I have a lot of thoughts on things like PC usability today. You're right that UX research would have heavily contributed to the design of these older systems. As computers moved from the warehouse to the living room they had to be easier to use and understand for people without CS degrees. I think it is fair to assume *some* things about what people these days are familiar with when it comes to the desktop GUI, but usability should receive more focus now even if it slightly hinders aesthetics. A friend of mine has been teaching a college program for video editing and she has students who needed her to explain what files and folders are. This is not the first time I've heard of things like this.
Smartphones and tablets have obfuscated so many basic functions and features that it is actively harming people's understanding of how to use a computer: things like window sizing, executables, how apps know where things are, and how programs are installed. Android does allow users to peek behind the curtain more than iOS, but Google has been going down the path of locking down Android. I haven't been in an elementary school classroom for like 17 years, but I remember having computer lab time where we would learn how to use Windows 95/98. I think what has benefited my friends and others my age (~30) is that we grew up when computers were in the home and were usable enough for us to log in and intuit our way around, but there was enough friction that we had to figure things out on our own.
Chesterton's fence! Don't delete something unless you know why it's there in the first place.
‘Took out usability features to make them "look nicer"’ is exactly how Steve Jobs gave us the double-click, undiscoverable and timing-sensitive.
Double-click came out of Xerox PARC. Apple might have been the first to put it into a popular desktop PC, but it wasn't their design any more than the rest of the system they copied. There are arguments that a second button was a much better idea, but that would still not be immediately discoverable, and even with many buttons in modern solutions we _still_ have double-clicking.
For the brief time I used Windows 11, the number of times I placed a window over another and then clicked on the wrong window, because I couldn't tell at first glance where one started and the other ended, was absolutely ridiculous.
I'm afraid that the core of the problem is something far more simple and fundamental.
The people designing desktop apps today simply never learned the conventions that make desktop applications good. They grew up with smartphone apps, web apps, electron apps, games, etc.
In fact, you can observe from things like JavaFX, Flutter, WPF, etc., that the trend has long been toward making it easy to create custom widgets, like you could with JavaScript (or Flash), rather than toward the convenience of having a library of widgets that look and feel exactly the same as every other widget in the system.
> couldn't tell at first glance where one started and the other ended
This was even worse in an RDP session. No drop shadows. I'm not sure who thought "everything should be flat and white" was a good idea.
> I'm not sure who thought "everything should be flat and white" was a good idea.
It's just the old Windows 2.0 look.
Windows 2 had plainly visible borders, with decent contrast depending on your colour settings, so you could see what ended where.
My pet peeve is spacing. My usual resolution is 1920x1080 (scaled or not), and it feels like I could cram more information into an old 1024x768 desktop. You have to maximize most windows to get them to show enough information.
This drives me crazy. Even looking at these old screenshots, you just know that these systems were outputting a display resolution lower than 1024x768.
When I was checking out the MacBook Neo a while back, I was disappointed that the resolution is not natively x2 scaled. It uses fractional scaling, even though macOS handles fractional scaling quite poorly. I've set the resolution on my M1 MBP to 1280x800 so it was x2 scaled, and clarity improved significantly. But I also sacrificed usable space, because apps don't adjust; everything is just made larger.