And I strongly recommend that you stop recommending a text whose practical usefulness is limited by what it leaves unsaid:

  – It identifies problems (complexity, latent failures, hindsight bias, etc.) more than it offers solutions; readers must look elsewhere for methods to act on these insights.

  – It feels abstract: it states general truths that apply to many domains but require translation into domain-specific practice (be it software, aviation, medicine, etc.).

  – It leaves out discussion of managing complexity (principles of simplification, modular design, quantitative risk assessment) that would help prevent some of the failures it warns about; a minimal sketch of such risk assessment follows this list.

  – It assumes well-intentioned actors and does not grapple with scenarios where business or political pressures undermine safety – an increasingly pertinent issue in modern industries.

  – It does not explicitly warn against misusing its principles (e.g. becoming fatalistic or overconfident in defenses). The nuance that «failures are inevitable but we still must diligently work to minimize them» must come from the reader’s interpretation.
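
To make the point about quantitative risk assessment concrete, here is a minimal sketch in Python, with entirely hypothetical layer names and probabilities: the simplest such assessment multiplies independent per-layer failure probabilities to estimate how likely it is that all defenses fail at once. (A real assessment would also have to model the correlated failures Cook warns about, which the independence assumption below ignores.)

    # All layer names and probabilities below are hypothetical illustrations.

    def combined_failure_probability(layer_failure_probs):
        """Probability that every independent defensive layer fails at once,
        i.e. the holes in the «Swiss cheese» line up."""
        p = 1.0
        for layer_p in layer_failure_probs:
            p *= layer_p
        return p

    # Hypothetical per-incident failure probabilities for three defensive layers.
    layers = {
        "automated health checks": 0.05,
        "on-call operator review": 0.10,
        "failover to standby":     0.02,
    }

    p_catastrophe = combined_failure_probability(layers.values())
    print(f"P(all defenses fail together) = {p_catastrophe:.5f}")  # 0.00010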

«How Complex Systems Fail» is highly valuable for its conceptual clarity and timeless truths about complex system behavior. Its direction is one of realism – accepting that no complex system is ever 100% safe – and of placing trust in human skill and systemic defenses over simplistic fixes. The rational critique is that this direction, whilst insightful, needs to be paired with concrete strategies and a proactive mindset to be practically useful.

The treatise by itself won’t tell you how to design the next aircraft or run a data center more safely, but it will shape your thinking so you avoid common pitfalls (such as chasing singular root causes or blaming operators). To truly «preclude» failures or mitigate them, one must extend Cook’s ideas with detailed engineering and organizational practices. In other words, Cook teaches us why things fail in complex ways; it is up to us – engineers, managers, regulators, and front-line practitioners – to apply those lessons in how we build and operate the systems under our care.

To be fair, at the time of writing (late 1990s), Cook’s treatise broke ground by succinctly articulating these concepts for a broad audience. Its objective was likely to provoke thought and shift paradigms rather than to serve as a handbook.

Today, we have the benefit of two more decades of research and practice in resilience engineering, which build on Cook’s points. Practitioners now emphasise building systems that are resilient, not merely systems that never fail, and they use Cook’s insights as the rationale for practices such as chaos engineering, better incident response, and continuous learning cultures.
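
As one illustration of turning that rationale into practice, below is a minimal, hypothetical sketch of chaos-engineering-style fault injection in Python: a decorator makes a dependency fail some fraction of the time so the team can observe whether the surrounding defenses actually hold. Production chaos experiments are far more controlled (limited blast radius, explicit abort conditions); every name and rate here is illustrative.

    # A toy fault injector; the service, decorator name, and failure rate are
    # hypothetical, chosen only to illustrate the idea.
    import random

    def flaky(failure_rate=0.2):
        """Make a fraction of calls to the wrapped function fail on purpose,
        so we can watch whether callers degrade gracefully."""
        def wrap(fn):
            def inner(*args, **kwargs):
                if random.random() < failure_rate:
                    raise TimeoutError("injected fault: simulated dependency timeout")
                return fn(*args, **kwargs)
            return inner
        return wrap

    @flaky(failure_rate=0.2)
    def fetch_user_profile(user_id):
        return {"id": user_id, "name": "example"}

    # The experiment: does the caller survive when the dependency misbehaves?
    for _ in range(5):
        try:
            print(fetch_user_profile(42))
        except TimeoutError as exc:
            print(f"fallback path exercised: {exc}")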