I recently tested the page load reliability of Browserbase against a standard set of websites (the top 100 US websites by traffic, per SimilarWeb) and was shocked by how unreliable it was: 29% of page load requests failed. Without an open standard for agent identification, it will always be a cat-and-mouse game between websites trying to trap agents and agents trying to evade detection, and many agents will predictably fail at simple tasks.

https://anchorbrowser.io/blog/page-load-reliability-on-the-t...

Here's to working together to develop a new protocol that works for agents and website owners alike.
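For what it's worth, here's a minimal sketch of what declared identification could look like on the website side, assuming a hypothetical `Agent-Identity` request header and a site-maintained registry of known agents (both invented for illustration; no such standard exists yet):

```python
# Hypothetical sketch only: the header name, registry, and policy tiers
# below are invented for illustration, not an existing standard.

TRUSTED_AGENTS = {"example-agent/1.0"}  # hypothetical registry of known agents

def classify_request(headers: dict) -> str:
    """Decide how a site might treat a request under a declared-identity scheme."""
    identity = headers.get("Agent-Identity")  # hypothetical header
    if identity is None:
        return "assume-human"   # nothing declared: serve as a normal browser
    if identity in TRUSTED_AGENTS:
        return "allow-agent"    # declared and recognized: serve normally
    return "rate-limit"         # declared but unknown: degrade, don't trap

print(classify_request({}))                                        # assume-human
print(classify_request({"Agent-Identity": "example-agent/1.0"}))   # allow-agent
print(classify_request({"Agent-Identity": "unknown-bot/0.1"}))     # rate-limit
```

The point of the sketch is the incentive structure: an agent that declares itself honestly gets a predictable policy instead of a silent trap, which is what fingerprint-based blocking can never offer.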