One could easily test the author's conviction about "rejecting content as they please" by spamming them with horrible stuff for a few months; the author would soon learn why 100% of content moderation should not be pushed onto the individual user.

I think that moderation should be pushed to the individual user to avoid censorship, but not in the form currently implemented by all these platforms.

To give an example of how I think moderation should work: suppose I follow you and you follow me on some hypothetical platform Y. You see the content I upvote, and I see the content you upvote. So we'd start with block-all by default, plus transparency about why something appears in one's feed.

I pitched a P2P platform like this to NLnet years ago (taking heavy inspiration from I2P's Syndie app, minus the funky UX), though I didn't manage to get any funding, since I lacked the clout as a public developer to lead such an effort.

Yes, trust and content moderation need to be central for the social network to be successful. The P2P aspect is a technical detail, but to get people to invest in yet another thing, it needs a certain attractiveness.

Why are HN people moving to lobste.rs? Because it is an exclusive community.

It's not easy to spam someone on Nostr. The clients people use have multiple options that make it a non-problem. There is a lot of spam and offensive content, but you just don't get to see it unless you look at the "global" feed, and that's now quite hidden in most apps. Essentially, your client restricts what you see to content from people you follow, and then limits the visibility of random replies to posts.
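The follow-restricted feed can be sketched with a NIP-01 subscription filter: a client asks relays only for notes whose `authors` field matches keys it follows, so spam from unknown keys is never even fetched. The pubkeys below are placeholders, not real keys:

```python
import json

def follow_only_filter(followed_pubkeys, limit=50):
    """Build a NIP-01 filter requesting short text notes (kind 1)
    only from the keys you follow."""
    return {
        "kinds": [1],                        # kind 1 = short text note
        "authors": sorted(followed_pubkeys), # hex pubkeys of follows
        "limit": limit,
    }

follows = {"aa11...placeholder", "bb22...placeholder"}  # hypothetical keys
# A REQ message as a client would send it over the relay websocket:
req = ["REQ", "feed-sub", follow_only_filter(follows)]
print(json.dumps(req))
```

Anything not matching the filter simply never reaches the client, which is why the "global" firehose has to be opted into explicitly.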

Now, Nostr is actually much bigger than a "Twitter-like" app; it also powers app stores, chat apps, collaboration tools, podcasts, music players, etc.

I think you cannot spam someone's screen on Nostr. They just unsubscribe from your key, if they were ever subscribed at all.

DoS on the infra is a different question, though.

I mean from multiple accounts. The idea is that they will get tired of constantly having to block content they don't want to see, and will come to understand why other people try to enforce stronger moderation defaults.