This stood out to me:

> Indeed, the computer, by virtue of its brittle nature, seems to require that it come first. Brittleness is the inability of a system to cope with surprises, and, as we apply computers to situations that are ever more interconnected and layered, our systems are confounded by ever more surprises. By contrast, the systems theorist David Woods notes, human beings are designed to handle surprises. We’re resilient; we evolved to handle the shifting variety of a world where events routinely fall outside the boundaries of expectation. As a result, it’s the people inside organizations, not the machines, who must improvise in the face of unanticipated events.

In this new age of AI, maybe we can start to reverse this trend: build systems that can adapt and handle surprises, instead of pushing all that brittleness onto the humans using them.