Because that's the definition of collaboration? Prior to the invention of LLMs, one could generally assume requests for collaboration were somewhat sincere due to the time investment involved. Now we need a new paradigm for collaboration.

> Because that's the definition of collaboration?

I don't think the definition of collaboration includes making close to zero effort and expecting someone else to expend considerable effort in return.

The problem is that the sheer volume of low-quality AI PRs is overwhelming. Just the time it takes to determine whether a PR deserves attention adds up when a project receives many plausible-looking, but actually low-quality and untested, pull requests.

But if you stop looking at PRs entirely, you eliminate the ability for new contributors to join a project or make changes that improve the project. This is where the conflict comes from.

Since the bar to opening a PR has gotten lower, there's an argument that the bar for closing one might need to be lowered as well. Right now, review effort is asymmetric in part because it's natural to give PR authors the benefit of the doubt rather than make a snap judgement from a brief look; the current system places a higher value on not accidentally closing a potentially useful but poorly presented PR than on not wasting time on one that superficially appears good but isn't. I have to wonder if the best we can do is to be more willing to close PRs when reviewers aren't sufficiently convinced of the quality after a short inspection, even without being 100% certain that judgment is right. If "false positive" PRs that seem reasonable but turn out not to be are getting better at appearing superficially good, the best option may be to accept more "false negatives": PRs that would have been useful but couldn't distinguish themselves from the ones that aren't.

After a minute (or whatever length of time makes sense for the project), ask whether you're confident the PR is worth your time to continue reviewing, with the default answer being "no" if you're on the fence. Unless it's a clear yes, you got a bad vibe; close it and move on. Getting a PR merged then requires more effort from the author to make the case that it's worth keeping open, which restores some of the balance lost when the effort was pushed onto the review side.

PR authors will now need to spend energy and effort to make their PRs appear worthwhile for consideration. This shifts the burden of effort back to the PR authors (the real ones).

No more drive-by PRs.