In 2020, I started working on a (C++) game engine. Since the only decent open-source UI option was Dear ImGui (which was obviously a bad choice for consumer-facing UIs), I ended up rolling my own retained-mode UI library on top of SDL. Now, it's fully-featured enough that I rarely have to touch it. There's even a major company using it for embedded products.
I don't get why every language's community doesn't just do the same thing: roll an idiomatic UI lib on top of SDL. It was tough, but I was able to do it as a single person (who was also building an entire game engine at the same time) over the course of a couple years.
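To give a rough idea of what "on top of SDL" means in practice, here's a stripped-down sketch of the kind of loop a retained widget tree hangs off of. This is not code from my library, just SDL2 boilerplate plus a hypothetical Widget type for illustration:

    #include <SDL.h>
    #include <memory>
    #include <vector>

    // Hypothetical widget base class; a real library's tree is much richer.
    struct Widget {
        SDL_Rect bounds{};
        virtual ~Widget() = default;
        virtual void handleEvent(const SDL_Event&) {}
        virtual void render(SDL_Renderer* r) {
            SDL_SetRenderDrawColor(r, 200, 200, 200, 255);
            SDL_RenderFillRect(r, &bounds);
        }
    };

    int main() {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window* win = SDL_CreateWindow("ui", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 800, 600,
                                           SDL_WINDOW_RESIZABLE);
        SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

        std::vector<std::unique_ptr<Widget>> widgets;   // the retained tree
        widgets.push_back(std::make_unique<Widget>());
        widgets.back()->bounds = {20, 20, 200, 40};

        bool running = true;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e)) {
                if (e.type == SDL_QUIT) running = false;
                // Hit-testing, focus handling, etc. would go here.
                for (auto& w : widgets) w->handleEvent(e);
            }
            SDL_SetRenderDrawColor(ren, 30, 30, 30, 255);
            SDL_RenderClear(ren);
            for (auto& w : widgets) w->render(ren);
            SDL_RenderPresent(ren);
        }
        SDL_DestroyRenderer(ren);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }

The hard part isn't this loop; it's everything that grows on top of it (layout, text, focus, styling), but SDL gives you a portable base for windows, input, and rendering.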
SDL is not perfect; e.g., you can't get pinch/zoom events on macOS. IMO, using the OS APIs yourself is better.
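To be concrete about what's missing: SDL2's cross-platform gesture event is SDL_MULTIGESTURE, but as far as I know it's only synthesized from touch devices, and the macOS trackpad isn't exposed as one, so a trackpad pinch never produces it. A minimal sketch of that path:

    #include <SDL.h>

    // SDL_MULTIGESTURE is generated from touch devices only; SDL2 does not
    // (to my knowledge) expose the macOS trackpad as a touch device, so a
    // trackpad pinch never reaches this code -- you'd need NSEvent's
    // magnification from the OS side instead.
    int main() {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window* win = SDL_CreateWindow("gestures", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        float zoom = 1.0f;
        for (bool running = true; running;) {
            SDL_Event e;
            while (SDL_PollEvent(&e)) {
                if (e.type == SDL_QUIT) running = false;
                if (e.type == SDL_MULTIGESTURE && e.mgesture.numFingers == 2) {
                    zoom *= 1.0f + e.mgesture.dDist;   // dDist: normalized pinch delta
                    SDL_Log("pinch: zoom=%f", zoom);
                }
            }
            SDL_Delay(16);
        }
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }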
How's the accessibility?
I'm not trying to troll here, asking seriously: are there any immediate-mode GUI APIs with good a11y?
I haven't worked on screen reader support yet. Support for alternative text input is built into SDL. UI size scaling is a feature I plan to add eventually.
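For anyone unfamiliar with what SDL gives you there: the SDL2 text-input path covers IME composition, committed UTF-8 text, and candidate-window placement. A minimal standalone sketch (not code from my library):

    #include <SDL.h>
    #include <string>

    // SDL_StartTextInput enables the platform IME / on-screen keyboard,
    // SDL_TEXTINPUT delivers committed UTF-8 text, SDL_TEXTEDITING carries
    // in-progress IME composition, and SDL_SetTextInputRect tells the OS
    // where to place the candidate window.
    int main() {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window* win = SDL_CreateWindow("text", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        SDL_Rect field = {20, 20, 300, 32};   // the focused text field's bounds
        SDL_SetTextInputRect(&field);
        SDL_StartTextInput();

        std::string value;
        for (bool running = true; running;) {
            SDL_Event e;
            while (SDL_PollEvent(&e)) {
                switch (e.type) {
                    case SDL_QUIT: running = false; break;
                    case SDL_TEXTINPUT:            // committed text, UTF-8
                        value += e.text.text;
                        SDL_Log("value: %s", value.c_str());
                        break;
                    case SDL_TEXTEDITING:          // IME composition in progress
                        SDL_Log("composing: %s", e.edit.text);
                        break;
                }
            }
            SDL_Delay(16);
        }
        SDL_StopTextInput();
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }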
For most serious applications, accessibility isn't an afterthought; it's a requirement, and it's very hard to implement correctly.
So the solution is to build applications on less of a common base? I don't follow the logic with respect to Zed. I get what you mean if there's a first-party UI solution in your language (e.g. Swift), but in that case you don't need an open-source UI library.
The solution, if you want a production-ready GUI, is to use a GUI toolkit that already has decent accessibility support.
There aren't that many of those: .NET, AppKit/UIKit, SwiftUI, Qt, GTK, the web, wxWidgets (which is really just a wrapper over GTK/AppKit/Win32), and probably a couple of others. So you either use the native language of one of those toolkits, or you use bindings to one of them from your language.