This would probably still be a big improvement over the status quo: it would raise the barrier to entry, add friction to the slop delivery pipeline, and give moderation and spam detection a chance to catch up. It's much easier to detect and delete 5,000 laundered slop images than 500,000.
Seems like something that would be misused against people in horrible ways.
I think it might be enough to be able to verify that a photo was taken by a particular manufacturer's camera, not necessarily a specific camera/owner.
TIL this exists... C2PA can digitally sign the complete lifecycle of an image, starting from capture on the camera:
https://petapixel.com/2026/02/21/a-look-at-an-image-verifica...
(I submitted this article to HN too as I thought it was pretty interesting)
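The core idea of C2PA is a chain of signed manifests: each step in an image's lifecycle (capture, edit, export) is signed over the image bytes plus the previous manifest, so any tampering breaks the chain. Here's a toy sketch of that structure; real C2PA uses X.509 certificates and COSE signatures embedded in the file, whereas this stand-in uses HMAC with a made-up key so it stays stdlib-only:

```python
# Toy sketch of C2PA-style provenance chaining (NOT the real format).
# Each manifest commits to the image hash and the prior manifest's hash,
# then is signed; verification fails if anything downstream is altered.
import hashlib
import hmac
import json

KEY = b"camera-secret-key"  # hypothetical; a real camera holds a vendor-issued cert


def sign_step(image_bytes: bytes, action: str, prior_manifest: bytes) -> bytes:
    payload = json.dumps({
        "action": action,
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "prior_sha256": hashlib.sha256(prior_manifest).hexdigest(),
    }).encode()
    sig = hmac.new(KEY, payload, hashlib.sha256).hexdigest().encode()
    return payload + b"|" + sig


def verify_step(manifest: bytes) -> bool:
    payload, _, sig = manifest.rpartition(b"|")
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(expected, sig)


image = b"\xff\xd8raw sensor data"
m1 = sign_step(image, "c2pa.created", b"")              # capture on the camera
m2 = sign_step(image, "c2pa.color_adjustments", m1)     # later edit, chained to m1

print(verify_step(m1), verify_step(m2))                 # True True
print(verify_step(m1.replace(b"created", b"edited")))   # False: payload tampered
```

The interesting property for the slop-laundering discussion is the first link: only the capture step can be signed by a camera's key, so a signature chain that starts anywhere else (e.g. an AI generator's export) is distinguishable.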
Then I can take a photo of an AI generated image.
I think it would be hard to make that look like a photo you'd taken naturally, rather than a photo of a screen.
That might impose enough effort to stop most slop from being spammed everywhere.
Someone could offer a service that automates it: a DSLR pointed at a screen, laundering images one after another.