It's obvious there is no way OpenAI can keep videos generated by this within their ecosystem. Everything will be fake, nothing real. We are going to have to change the way we interact with video. While it's obviously possible to fake videos today, it takes work and skill from the creator. Now it will take no skill, so the obvious consequence is that we can't believe anything we see.

The worst part is we are already seeing bad actors saying 'I didn't say that' or 'I didn't do that, it was a deepfake'. Now you will be able to say anything in real life and use AI for plausible deniability.

I think that's the point... Then Worldcoin comes to the rescue

Worldcoin is so delightfully dystopian. You could drop it wholesale into a superhero movie and it would be believable as the supervillain's plot.

It's not that obvious. iOS is pretty secure; if they keep the social network and cameo feature limited to it, there might not be good ways to export videos off the platform onto others beyond pointing a camera at the tablet screen. And beyond there being lots of ways to watermark content so it's detectable, nothing stops the device from using its own camera to try to spot if it's being recorded. The bar can be raised quite high as long as you're willing to exclude any device that isn't an iPhone/iPad.

Record things with 2 cameras.

Today's Sora can produce something that resembles reality from a distance, but if you look closely, especially if there's another perspective or the scene is atypical, the flaws are obvious.

Perhaps tomorrow's Sora will overcome the "final 10%" and maintain undetectable consistency of objects across two perspectives. But that would require a spatial awareness and consistency that models still have a lot of trouble with.

It's also possible we remain stuck in the uncanny valley forever, or at least for the rest of our lives.

It's possible to produce some video or image that looks real when cherry-picked for a demo, but not to produce any arbitrary one you want and have it end up passable.

>Everything will be fake, nothing real. We are going to have to change the way we interact with video.

I'm optimistic here.

Look at 1900s tech like Social Security numbers/cards and paper birth certificates. Our world is changing and new systems of verification will be needed.

I see this as either terribly dystopian, or a possibility for the mass expansion of cryptography and encrypted/signed communication. Ideally in privacy-preserving ways, because nothing else will make as much sense for the verification that countries will need to give each other, even if they want backdoor registry BS for the common man.
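To make the signed-communication idea concrete, here's a minimal sketch of capture-time provenance: a device signs the hash of the video bytes with a private key, and anyone holding the matching public key can later check that the file hasn't been altered since capture. This is just an illustration using the Python `cryptography` library with Ed25519; the function names are my own, and a real system (hardware-backed keys, certificate chains, privacy-preserving attestation) would be far more involved.

```python
# Hypothetical sketch: sign a video's hash at capture time, verify later.
# Uses the `cryptography` library (pip install cryptography) with Ed25519 keys.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def sign_video(video_bytes: bytes, device_key: Ed25519PrivateKey) -> bytes:
    """Hash the raw video bytes and sign the digest with the device's key."""
    digest = hashlib.sha256(video_bytes).digest()
    return device_key.sign(digest)


def verify_video(video_bytes: bytes, signature: bytes,
                 device_pub: Ed25519PublicKey) -> bool:
    """Return True only if the signature matches this exact video content."""
    digest = hashlib.sha256(video_bytes).digest()
    try:
        device_pub.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


# Toy usage: in reality the private key would live in secure hardware
# and the public key would be vouched for by the device maker.
key = Ed25519PrivateKey.generate()
video = b"...raw video bytes..."
sig = sign_video(video, key)
assert verify_video(video, sig, key.public_key())
assert not verify_video(video + b"tampered", sig, key.public_key())
```

The design point is that verification only needs the public key and the file, so it can be done by anyone without revealing anything about the person who shot the video beyond what the key attests to.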

Breaking changes get fixes.

>We are going to have to change the way we interact with video.

I doubt it will be for the better. The ubiquity of AI deepfakes just reinforces entrenchment around "If the message reinforces my preconceived notion, I believe it and think anyone who calls it fake is stupid/my enemy/pushing an agenda. If the message contradicts my preconceived notion, it's obviously fake and anyone who believes it is stupid/my enemy/pushing an agenda." People don't even take the time to ask "is this even plausible", much less do the intellectual work to verify.