Yes, this is a great point. The great irony of the tech sector is that although tech creates efficiencies, the process by which tech is created is itself comically inefficient.

Almost nobody, especially those working for the government, actually looks at a complex, expensive solution and says, "We should simplify this and make it cheaper." The government is paying for a LOT of unnecessary complexity; I'd say that accounts for most of the cost of essentially every tech project it funds.

Reminds me of that three-panel meme about SpaceX's Raptor engines showing how they simplified the design over time. This is the exception that proves the rule.

A lot of what appears to have been removed was just test sensors. The same thing happens in every engineering program, but nobody else pretends it's somehow innovation.

It's like removing test code when you ship a binary.

I don't agree that it's not innovation. With hindsight, removing unnecessary complexity always looks stupidly simple, and yet it's extremely rare to see a team that actually gets it right on the first go.

I'm an experienced software engineer who has worked on a range of big, complex projects over almost two decades, and in my experience, every single project (for which I wasn't the team lead) was way, way, way over-engineered. At least 95% of the time was spent fixing unnecessary intermediate technical issues that the team had created for itself.

Even the sensor argument... Do you need so many sensors and fallback mechanisms if every part of the system was designed to work within the simplest necessary constraints to begin with? In my experience, the answer is almost always no. Once you accept that your design is flawed, any patch you add on top to correct the flaws provides tiny diminishing returns, if any. Often, the additional complexity actually makes it more likely that your core mechanisms will fail.