The why is because we can, but damn am I finding the tools being built with, or having AI tacked on, depressing. Is this a small glimpse of the future we're building for ourselves? Communication is valuable because thought and effort went into it; lowering the bar on producing content doesn't mean more choice, it means lower quality. Already I see a reaction against this amongst some peers when they find out something they were asked to review was AI generated: why should they put effort in if the other person didn't?

If we find the "thought and effort" part of communication valuable, we'll keep it. If not, we won't.

That would be a fine posture to take, very naturally-selective, but I find it discomforting because I've seen so many ways that humans act against their own benefit (individually or on the whole). It isn't always out of self-destruction or a lack of self-preservation. More often than not, the choice was based on what's easiest -- a tendency towards the path of least resistance. This technology looks a lot like trading off intention, and attention, for quick and good-enough(?) results. Enough so that I can understand GP's concern for our communication skills as a society.

I think we could find ourselves losing the "thought and effort" in spite of it being more valuable, because many people find it easier to do without. And even those who keep putting in the effort, despite it not being easier, may be broadly labeled as bots because their writing failed the vibe check.

I'm confident there will always be small communities that continue to hold this as valuable, but maintaining it in a community setting will become more strained as the general zeitgeist drifts towards valuing output over effort.