A lot of READMEs are generated with AI. Doesn't really mean anything.

You're right. A lot of words that don't really mean anything, and that's exactly why you shouldn't do it if you want actual humans to read it.

Whenever I see a README or, worse, a PR description that was obviously generated by an LLM, my immediate response is "if you couldn't be bothered to write this, why should I bother reading this?"

Because it provides useful information and is easier to read than the code itself.

Except, no, it doesn’t.

In the case of a pull request, I am not about to trust some LLM that has no business context and can only pretend to guess at the “why” of a change.

To understand the “what” of a change, you have to actually read the code. This doesn’t belong in the pull request description most of the time.

You’re implying that if someone uses AI to write something, the person doesn’t then read it/iterate on it to ensure correctness. Serious “get off my lawn” vibes here.

"the person doesn’t then read it/iterate on it to ensure correctness."

As someone who has had to deal with drive-by PRs on open-source projects, which were a problem before but have gotten much worse in volume now that most of them are AI-generated: yes, that is exactly what happens.
