It already is and has been for a long time. The kinds of content you have to moderate are not things anyone wants to look at. The worst of it is far worse than you can imagine, and the average of it is nudes of people you don't want to be looking at all day.

This is actually today's controversy on Bluesky, because the #1 attribute of its power users is that they're terrified of "AI" and of the idea that "companies will steal their posts to generate AI slop", which means they think the ML moderation is stealing their posts.

Oh, and the ML moderation can't be decentralized, because the training datasets are illegal to possess or distribute.