D gets no respect. It's a solid language with a lot of great features and conveniences compared to C++ but it barely gets a passing mention (if that) when language discussions pop up. I'd argue a lot of the problems people have with C++ are addressed with D but they have no idea.

Ecosystem isn't that great, and much of it relies on the GC. If you're going to move out of C++, you might as well go all in on a GC language (Java, C#, Go) or use Rust. D's value proposition isn't enough to compete with those languages.

D has a GC and it’s optional, which in theory should be the best of both worlds.

Also D is older than Go and Rust and only a few months younger than C#. So the question then becomes “why weren’t people using D when your recommended alternatives weren’t an option?” Or “why use the alternatives (when they were new) when D already exists?”

> D has a GC and it’s optional.

This is only true in the most technical sense: you can easily opt out of the GC, but you will struggle with the standard library, and probably most third-party libraries too. The GC is the baseline assumption, after all, which is why it's opt-out rather than opt-in. There was a DConf talk about the future of Phobos which indicated increased support for @nogc, but that is still a ways away, and even then it will only cover so much. If you opt out of the GC, you give up a lot. And honestly, if you really don't want a GC, you may be better off with Zig.
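To make the trade-off concrete, here is a minimal sketch of what opting out looks like in D. `@nogc` is enforced at compile time: code that performs no GC allocation compiles fine, but anything that allocates through the GC (such as `new` or array concatenation) is rejected inside an `@nogc` function.

```d
// Compiles: no GC allocation happens here.
@nogc int sum(const int[] xs)
{
    int total = 0;
    foreach (x; xs)
        total += x;
    return total;
}

@nogc void notAllowed()
{
    // Uncommenting either line is a compile error under @nogc,
    // because both allocate with the GC:
    // auto arr = new int[](10);
    // int[] ys = [1, 2] ~ [3];
}
```

The catch the comment describes is that much of Phobos and the third-party ecosystem is not annotated `@nogc`, so an `@nogc` caller cannot use those functions at all; opting out fences you off from large parts of the library surface.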

Garbage collection has never been a major issue for most use cases. However, the Phobos vs. Tango and D1 vs. D2 splits severely slowed D’s adoption, causing it to miss the golden window before C++11, Go, and Rust emerged.

Could say the same for Nim.

But popularity/awareness/ecosystem matter.

That's the great thing about LLMs.

Especially with Nim it's so easy to make quality libraries with Codex or Claude Code and a couple of hours as a hobby.

Especially when they run fast. Just yesterday I made Metal bindings and had 120 FPS demos with SDF bitmaps running while eating Saturday brunch.

I don't really get the idea that LLMs lower the level of familiarity one needs to have with a language.

A standup comedian from Australia should not assume that the audience in the Himalayas is laughing because the LLM the comedian used 20 minutes before was really good at translating the comedian's routine.

But I suppose it is normal for developers to assume that a compiler translated their Haskell into x86_64 instructions perfectly, then turned around and did the same for three different flavors of Arm instructions. So why shouldn't an LLM turn piles of oral descriptions into perfectly architected Nim?

For some reason I don't feel the same urgency to double-check the details of the Arm instructions as I feel about inspecting the Nim or Haskell or whatever the LLM generated.

I don’t trust them. I run tests and I review the code generated by the LLMs. About 1 in 5 times I’ll just git reset the changes and try again.

You have to push for them to add tests. It also helps if you can have the LLM just translate from C++ to Nim.

We’re certainly not at the age of LLMs generating code on the fly each time.

If the difference in performance between the target language and C++ is huge, it's probably not the language that's great, but some quirk of implementation.

Tiny community, even tinier than when Andrei Alexandrescu published the D book (he is now back to C++ at NVIDIA); lack of direction (it is always chasing the next big thing that might attract users, leaving earlier efforts behind not fully done); and since 2010 other alternatives with big-corp sponsoring have come up, while languages like Java and C# gained AOT compilation and improved their low-level programming capabilities.

Thus, it makes very little sense to adopt D versus other managed compiled languages.

The language and community are cool, sadly that is not enough.
