How does an agent keep my website from being crushed by traffic load? And how is this proposal any different from the existing gatekeeping problem on the open web, except even less transparent and accountable, since access is now gated by logic buried in an impenetrable set of NN weights?
This seems like slogan-based planning with no actual thought put into it.
Whatever works against the AI doesn’t have to be an AI agent itself.
So proof-of-work checks everywhere?
Sure, as long as it doesn't discriminate against user agents.
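For concreteness, a proof-of-work check in the Hashcash style works like this: the server hands out a random challenge, the client brute-forces a nonce until the hash of challenge+nonce clears a difficulty threshold, and the server verifies with a single hash. Below is a minimal sketch in Python; the function names and the 20-bit difficulty are illustrative assumptions, not any particular deployment.

```python
import hashlib
import secrets

DIFFICULTY_BITS = 20  # illustrative; ~2^20 hashes of client work on average

def make_challenge() -> str:
    """Server side: issue a fresh random challenge to the requester."""
    return secrets.token_hex(16)

def solve(challenge: str, bits: int = DIFFICULTY_BITS) -> int:
    """Client side: brute-force a nonce so that SHA-256(challenge || nonce)
    has `bits` leading zero bits (Hashcash-style)."""
    target = 1 << (256 - bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, bits: int = DIFFICULTY_BITS) -> bool:
    """Server side: one hash to check work the client spent many hashes on."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - bits))

if __name__ == "__main__":
    ch = make_challenge()
    n = solve(ch)          # expensive for the requester
    assert verify(ch, n)   # cheap for the site
```

The asymmetry is the point: verification costs the server one hash while solving costs the client millions, so it rate-limits by imposing a uniform cost per request rather than by inspecting which user agent is asking.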