> But I am having trouble justifying in an consistent manner why Grok / X should be liable here instead of the user.

This seems to rest on two false assumptions: (1) that legal liability is exclusive, and (2) that investigating X is unimportant both to X’s own liability and to pursuing the users, to the extent that they would also be liable.

X/xAI may be liable for any or all of the following reasons:

* xAI generated virtual child pornography with the likenesses of actual children, which is generally illegal even when the generation was procured by a third party.

* X and xAI distributed virtual child pornography with the likenesses of actual children, which is generally illegal irrespective of who generated and supplied it.

* To the extent that liability under either of the first two bullets would be eliminated or mitigated by a lack of knowledge of the prohibited content and by prompt action once the actor became aware, X often punished users for the prompts producing the virtual child pornography without promptly removing the xAI-generated material those prompts produced, demonstrating knowledge and intent.

* When the epidemic of Grok-generated nonconsensual pornography, including child pornography, drew attention, X and xAI responded by attempting to monetize the capability, restricting the tool to paid X subscribers. That shows an attempt to profit commercially from it, which is, again, generally illegal.