> I know this because a part of my work as a moderator of a Mastodon instance is to respond to user reports, and occasionally those reports are for CSAM, and I am legally obligated to review and submit that content to the NCMEC.

Oh ** that.

I have moderated all sorts of crap, and I am grateful that my worst has only been murders, hate speech, NCII, assaults, gore, and other forms of violence.

> I sometimes wish that the engineers working at OpenAI etc. had to see these images too. Perhaps it would make them reflect on the technology they are ushering into the world, and how “alignment” is working out in practice

This is a great idea. I've seen new leaders dropped in, certain they have a better handle on safety than the T&S teams do.

Only after they engaged with the issues, and had their assumptions challenged by uncaring reality, did they listen to the T&S teams.

There are a lot of assumptions about online speech that do not translate into operational reality.

On HN and Reddit, everyone complains about moderation and "janitors," but I highly recommend that coders treat moderation as civic service and volunteer.

How can you meaningfully fix a mess if you do not actually know what the mess is?