Having an issue with users uploading CSAM (a problem for every platform) is very different from giving them a tool to quickly and easily generate CSAM, with apparently little-to-no effort to prevent this from happening.

If the tool generates it automatically or spuriously, then yes. But if it's the users asking it to generate that content, I'm not sure there's a big difference.

Well, it's worth noting that with the nonconsensual porn, child and otherwise, that it was generating, X would often rapidly punish the user who posted the prompt but leave the Grok-generated content up. It wasn't an issue of not having control; it was an issue of how that control was used.