A manager has responsibility for the team. How can an AI be responsible?

A developer has responsibility for their code. How can an AI be responsible?

A writer has responsibility for their writing. How can an AI be responsible?

AI doesn't need to be responsible. It just needs to provide value, like writers, developers, and managers do.

That’s why no one has replaced developers with an AI that can push its own code to prod.

What does responsibility mean? How does it translate to actual work and skills?

One of the most useful functions of a good manager is to act as a shield for their team against upper-management firestorms. That's a role I think AI is particularly unsuited for, given its tendency to be obsequious.

More generally, responsibility means the manager's job is to make sure that individual employees' tasks are suited both to their competence and abilities and to the corporation's deliverables and ultimate bottom line. This requires making arguments in both directions: pulling employees toward work that is more useful to the company, and changing the deliverables to capitalize on employees' abilities.

As I understand it (and I’m not a smart or well-educated person, so please correct me if I’m wrong, misinformed, or talking nonsense), responsibility is a status that arises from the recognition of one’s moral obligations. Companies are at least partially formed around shared moral obligations, or, put simply, goals. Without those, they become meaningless economic machines, and our culture tends to favor things “having” a meaning.

With multiple agents (of any nature), feedback is essential for the work to be done well. That’s my understanding of how responsibility translates to the actual work getting done.