The simplest examples have over a thousand (literally) dependencies. Among them are GTK, GDK, Pango, etc. It literally depends on another toolkit, which is the weirdest thing IMHO.

Because of GNOME's insistence on not implementing server-side decorations, you can't avoid depending on libadwaita. That, I imagine, is what pulls in all of the GTK dependencies.

I think this is pretty common on Linux. You would want GTK (or Qt), I would think, to draw the top-level window and perhaps system menus, etc., even though the UI itself is drawn using a GPU canvas.

> You would want GTK (or Qt), I would think, to draw the top-level window and perhaps system menus, etc., even though the UI itself is drawn using a GPU canvas.

No, you would want to draw for Wayland or X. GTK and Qt themselves don't burden themselves with importing each other to work, for example.

My guess is that they import GTK only to get a title bar on GNOME, as GNOME forces applications to render their own. They could go custom and cut the dependency but it never looks quite right when apps do that.

No. On Wayland all of that should be in the compositor. Window sizing and positioning cannot be done by the apps, so it makes sense that the controls for that are drawn and handled by the WM. But GNOME's gotta gnome...

Are GTK/Qt memory safe now?

No. What is the likelihood of an attack on a desktop program via memory unsafety?

Really high?

What do you run your browser as?

A browser runs complex untrusted code from the internet. Most desktop programs don't do anything like that. The Servo programmers were riding a motorbike. Using Rust for a desktop program would be more like wearing a crash helmet in a car.

>What is the likelihood of an attack on a desktop program via memory unsafety?

Low.

What's the likelihood of someone entering your house if it's unlocked?

Also low, and yet you lock it.

A desktop program is already in a locked house - your desktop - which I can't log in to.


Do all desktop programs only ever run in "your house"?

Such is sadly increasingly the way with Rust projects.

Would you rather have 1000 small, composable, auditable dependencies or the same amount of code in a monolithic dump of .hpp files?

How about a few large dependencies and no little ones?

No advantage to it. Worse-quality code, to gain what? A smaller number hiding ultimately the same amount of code? Also, since the unit of compilation in Rust is the crate, fewer crates mean fewer opportunities for concurrent compilation.

A multitude of tiny dependencies has a multitude of solo maintainers, who eventually walk away, or sometimes get compromised.

A few big dependencies each have a team and a reputation that has earned trust and established release process and standards. If there's a serious problem in a small part of a big dependency, there are a few trusted maintainers of the big dependency who can be reached and can resolve it.

The theory of small dependencies was a good theory 10 years ago, when JS devs using NPM started the trend of making them "as small as possible". But the emergent pattern really seems to be the opposite of what was claimed. These JS and Rust projects end up taking longer to build and producing bigger outputs. Instead of a couple of "huge" 200KB dependencies, you get _thousands_ of 1KB dependencies, including different versions and alternative implementations, and you end up with megabytes of "accidental" code you don't really need.

And we can reason about why. In an ecosystem where something has 1 to 3 large deps, sometimes a dependency pulls in a sub-dependency with code you don't need. But in an ecosystem where something has 10 to 100 deps, this still happens, only 50x more overall. It's an exponential trend: you have 3 big deps that each have 2 big deps that each have 1 big dep, versus 20 small deps that each have 15 small deps that each have 10 small deps.
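A quick tally of those two hypothetical trees makes the gap concrete. This is purely illustrative arithmetic using the branching factors above; it ignores deduplication of shared dependencies, which real resolvers like Cargo do perform:

```python
# "Few big deps" tree: 3 direct deps, each with 2 sub-deps,
# each of those with 1 sub-sub-dep.
big = 3 + 3 * 2 + 3 * 2 * 1          # 3 + 6 + 6 = 15 nodes

# "Many small deps" tree: 20 direct deps, each with 15 sub-deps,
# each of those with 10 sub-sub-deps.
small = 20 + 20 * 15 + 20 * 15 * 10  # 20 + 300 + 3000 = 3320 nodes

print(big, small)  # 15 vs 3320 -- roughly two orders of magnitude
```

Even if each small dependency is 200x smaller than a big one, the sheer node count (and the duplicate versions it drags along) eats the savings.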