Video streams are not known for their low bandwidth needs, let alone the round-trip input latency added on top.

That's true; I'm not saying it comes without trade-offs. But in return you get a perfectly consistent, physically accurate simulation. Mostly it would just be expensive, I think, but it's technically feasible (services like Shadow or GeForce Now already demonstrate that).

Which of your friends can host a multiplayer, physics-heavy game while also serving several low-latency, high-resolution video streams? I would estimate the average answer to be zero.

Perhaps the solution could be to have all players stream the game from a centralized instance, rather than all clients streaming from the host’s instance.

That would have a number of advantages, come to think of it. For starters, install size could be much lower, piracy would be a non-issue, and cross-platform development would be a non-concern.

We don't have to theorize about this. We've had cloud gaming for years; companies have immense motivation to turn us all into renters in the cloud, so they've poured a lot of effort into it, and we can see half a dozen highly resourced results now. We can just look at them and see that it... almost... works. If you don't care much about latency, it definitely works.

However, Teardown is in the set of games where it just barely works, and only if all the stars and the moon align. I'd characterize it like this: cloud gaming spends 100% of the margin, so if anything, anything at all, goes wrong, it doesn't work very well.

(Plus, as excited as the companies are about locking us into subscriptions rather than purchases that we own, when it comes time to actually pay for the service they're delivering, they sure do like to skimp, because it turns out it's expensive to dedicate the equivalent of a high-end gaming console to each user. For most workloads that live in the cloud, a single user averages a vanishing fraction of a machine over time, rather than consuming an entire server at once. That doesn't pair well with "you spent 100% of the margin just getting cloud gaming to work at all.")

For the record, my comment was a joke. I was quoting from Stadia’s marketing. :-)

Running several raytracers on a single video card isn't free either. Syncing world changes, as games already do, is the least intensive option for the server and uses the least bandwidth. It's probably optimal in every respect.
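The bandwidth claim is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, where every number (stream bitrate, tick rate, delta size) is an illustrative assumption rather than a measurement:

```python
# Rough comparison: host uploads one full video stream per player,
# versus broadcasting only the changed world state each tick.
# All constants below are assumptions for illustration.

VIDEO_STREAM_MBPS = 15.0        # assumed bitrate of one 1080p60 stream
TICK_RATE_HZ = 60               # assumed simulation tick rate
CHANGED_VOXELS_PER_TICK = 500   # assumed average during heavy destruction
BYTES_PER_VOXEL_DELTA = 8       # assumed packed position + material ID

def video_mbps(players: int) -> float:
    """Host encodes and uploads a full video stream per player."""
    return players * VIDEO_STREAM_MBPS

def delta_sync_mbps(players: int) -> float:
    """Host broadcasts only the set of changed voxels each tick."""
    bytes_per_sec = TICK_RATE_HZ * CHANGED_VOXELS_PER_TICK * BYTES_PER_VOXEL_DELTA
    return players * bytes_per_sec * 8 / 1_000_000  # bytes/s -> Mbps

for n in (2, 4, 8):
    print(f"{n} players: video {video_mbps(n):.0f} Mbps, "
          f"delta sync {delta_sync_mbps(n):.2f} Mbps")
```

Even with generous assumptions about how much of the world changes per tick, the delta-sync upload is a small fraction of the video-stream upload, and unlike video it costs the host no GPU encoding work.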

[deleted]

Most consumer GPUs have a limit on the number of video streams their hardware encoder can handle at once, and in some cases the limit is as low as 2.

Okay, I didn't know that.