There are so many reasons why C/C++ build systems struggle, but imo power is the last of them. "Powerful" and "scriptable" build systems are what got us into the swamp in the first place!

* Standards committee is allergic to standardizing anything outside of the language itself: build tools, dependency management, even the concept of a "file" is controversial!

* The existing poor state of build systems is viral: any new build system ends up 10x as complex as a clean-room design because it has to accommodate all the legacy "power" of previous tooling. Build system flaws propagate, too: the moment you need hacks in your build, you start imposing those hacks on downstream users of your library as well.

Even CMake should be a much better experience than it is, but in the real world major projects don't maintain their CMake builds well enough that you can cleanly depend on them: exporting raw MY_LIB_DIR variables instead of proper targets, hacky or broken feature-detection flags, etc. Microsoft tried to solve this problem with vcpkg, ended up having to patch the builds of 90% of the packages to get it working, and it's still a poor experience where half the builds are broken.
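For contrast, here is roughly what the two styles look like side by side (a sketch only; `MyLib` and `MY_LIB_DIR` are placeholder names, not from any specific project):

```cmake
# Target-based style: the consumer links one imported target and inherits
# include directories, compile flags, and transitive deps automatically.
find_package(MyLib CONFIG REQUIRED)
target_link_libraries(app PRIVATE MyLib::MyLib)

# Legacy raw-variable style: the consumer wires up paths by hand,
# and nothing propagates to its own downstream users.
# include_directories(${MY_LIB_DIR}/include)
# link_directories(${MY_LIB_DIR}/lib)
# target_link_libraries(app PRIVATE mylib)
```

The difference matters downstream: a target carries its usage requirements with it, while raw variables force every consumer to rediscover them.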

My opinion is that a new C/C++ build/package system is actually a solvable problem now with AI: you can point Opus 4.6 or whoever at the massive pile of open-source dependencies and tell it, for each one, "write a build config for this package using my new build system", which cuts the Gordian knot of the ecosystem problem.