That sounds like something I don’t actually want to be possible.
Why? Think about a shooter in a frame-by-frame, 16-millisecond sort of way. This frame you’re running forward; you’re very likely to continue running forward the next frame too. This sequence of frames you’re aiming at the alien’s head; you’re likely to shoot at it.
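That frame-to-frame coherence is the whole bet: “repeat the previous frame’s input” is already a decent predictor. A toy illustration with made-up input data (nothing here is from any real service):

```python
# Toy illustration: player inputs are highly frame-coherent, so predicting
# "same input as last frame" is right for most transitions (data invented).

frames = ["fwd"] * 10 + ["fwd+fire"] * 3 + ["fwd"] * 7  # 20 frames of input

# Count how often the previous frame's input correctly "predicts" the next.
hits = sum(1 for prev, cur in zip(frames, frames[1:]) if prev == cur)

print(hits, "/", len(frames) - 1)  # only the two input *changes* miss
```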
The service isn’t playing the game for you: if you don’t click the mouse button to shoot, you won’t see that head explode in a delightful piñata of green alien blood. But if you do, it’ll feel super responsive. That’s a good thing.
Also, John Carmack invented “rollback networking” with clientside prediction in QuakeWorld back in 1996.
It shouldn’t be too hard for people like me who get by on old hardware. I still think it’s a waste of money, but I also thought overpaying for a phone on a subscription was dumb and Apple won, so…
I can see it working with a single binary input. But with half a dozen inputs, some of which are analog, the likelihood of a correct prediction has to be really low. To have any chance of getting a useful hit rate, they’ll have to start treating incorrect but close enough inputs as having been correct.
So maybe they predicted a click at frame X and it arrived at X+4; close enough, just pretend they clicked at X. Or even worse, they predicted the mouse would move 1.5cm in the next 50ms but it moved 1.7cm instead.
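A “close enough” policy like that could be as simple as a pair of tolerances. A hypothetical sketch; the function, thresholds, and units are all invented for illustration, not from any real service:

```python
# Hypothetical "close enough" check: accept a predicted input if the real
# one lands within small timing/magnitude tolerances (values invented).

FRAME_TOLERANCE = 4      # frames of timing slack for a button press
MOUSE_TOLERANCE = 0.3    # cm of slack on predicted mouse movement

def prediction_holds(pred_frame, real_frame, pred_move_cm, real_move_cm):
    """Return True if the real input is close enough to the prediction."""
    return (abs(real_frame - pred_frame) <= FRAME_TOLERANCE
            and abs(real_move_cm - pred_move_cm) <= MOUSE_TOLERANCE)

# Click predicted at frame 100 arrives at 104, mouse moved 1.7cm vs 1.5cm
# predicted: both within tolerance, so pretend the prediction was correct.
prediction_holds(100, 104, 1.5, 1.7)
```

The worry in the comment above is exactly this: once you accept near-misses as hits, the service is quietly rewriting what you actually did.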
What about the way good platformers already detect if you’re a bit too late pressing jump, and pretend you actually timed it perfectly? Somehow it feels different, but I don’t know why.
Exactly. I play games on a Shadow instance hosted in a datacenter 500 miles away. I can’t easily tell that the PC isn’t local (except that the graphics rendering is far better than anything my ancient PC could possibly muster).
I was also a Project Stream beta tester. It worked similarly well. I streamed AC:Odyssey to my Chromebook and to my mobile phone, and it worked.
Right, I played a couple hours of AC:Odyssey on the Project Stream test and Tomb Raider 2017 on Nvidia GeForce Now, and both worked perfectly well. I do think third-person action games like that are particularly well-suited to streaming, though.
It’s worth noting that Google probably picked Assassin’s Creed specifically BECAUSE it’s a game that is extremely forgiving of input lag, due to its inherently mushy, third-person controls. Makes a good impression without seriously taxing latency.
When they can do something twitchy, like Doom Eternal, at scale, across all of the pipes they’ve said the service will work with (meaning not just people on fiber), without people being able to tell the difference, I will believe streaming has truly arrived.
And then you’re still left with the ultimate question I never got answered: why is any of this better value than xCloud, aside from letting people who hate Microsoft avoid them (in favor of the lovable little indie scamps at Google)?
LOL they’ve been using DOOM Eternal at all their demos since they announced at GDC.
“… at scale, across all of the pipes they’ve said the service will work with (meaning not just people on fiber), without people being able to tell the difference…”
Hey, totally fair to say that Google needs to prove that their product actually works. But kinda obvious, too.
With rollback networking, there’s an assumption that someone is “local”. Who or what is local when you’re playing on the cloud? As far as I know, there’s no gaming logic going on when I play street fighter 87 on my toaster oven - I don’t have the CPU for local computations, and don’t need to.
But rollback networking looks like you’re doing the computations locally and then correcting them… so I can’t see how you’d correct the lag from your machine (which is streaming video and sending inputs) to the cloud…
In rollback networking, game logic is allowed to proceed with just the inputs from the local player. If the remote inputs have not yet arrived when it’s time to execute a frame, the networking code will predict what it expects the remote players to do based on previously seen inputs. Since there’s no waiting, the game feels just as responsive as it does offline. When those inputs finally arrive over the network, they can be compared to the ones that were predicted earlier. If they differ, the game can be re-simulated from the point of divergence to the current visible frame.
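The predict/confirm/re-simulate loop described above can be sketched in a few lines. This is a toy sketch, not any real engine’s code: the `step` function and `Rollback` class are invented, and “game state” is just a number so the arithmetic is easy to follow:

```python
# Minimal rollback sketch: advance every frame using a *predicted* remote
# input, then re-simulate from the point of divergence when the real
# remote input finally arrives over the network.

def step(state, local_inp, remote_inp):
    # Deterministic game logic (illustrative: just accumulate inputs).
    return state + local_inp + remote_inp

class Rollback:
    def __init__(self, state=0):
        self.states = [state]    # saved state snapshot before each frame
        self.local = []          # confirmed local inputs, per frame
        self.predicted = []      # remote inputs we guessed, per frame
        self.last_remote = 0     # last remote input actually seen

    def advance(self, local_inp):
        # Don't wait for the network: predict the remote input by
        # repeating the last confirmed one, and simulate immediately.
        self.local.append(local_inp)
        self.predicted.append(self.last_remote)
        self.states.append(step(self.states[-1], local_inp, self.last_remote))

    def confirm_remote(self, frame, remote_inp):
        self.last_remote = remote_inp
        if self.predicted[frame] == remote_inp:
            return  # prediction was right, nothing to redo
        # Misprediction: rewind to `frame` and re-simulate up to now.
        self.predicted[frame] = remote_inp
        for f in range(frame, len(self.local)):
            self.states[f + 1] = step(self.states[f],
                                      self.local[f], self.predicted[f])

rb = Rollback()
rb.advance(1)            # frame 0: remote input predicted as 0
rb.advance(1)            # frame 1: remote input predicted as 0
rb.confirm_remote(0, 5)  # real remote input for frame 0 arrives late
```

A real implementation snapshots full game state each frame and caps how far back it will rewind, but the shape of the loop is the same.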
Maybe it’s not? This is a thread about Stadia? Seems like asking why Hulu is better than Netflix.
As far as I can tell it is literally the QuakeWorld clientside prediction stuff, which is not exactly new tech.
I know it’s a thread about Stadia. My question was with regards to the perceived value of Stadia next to its only real competitor.
Shadow? PS Now? GeForce Now?
Hard to compare Stadia and xCloud since neither of them are available to the public as of yet, unlike those three above.
Heck, pretty soon you won’t have to press any buttons at all, it’ll just hit all the ones you meant to hit!
This is true, but the trend in tech over the last couple of decades is to make the new thing “good enough”: even though it’s worse than what people have now, they’ll flock to it in the name of convenience. Convenience is the only thing that seems to matter to the bulk of consumers these days, and once they latch on to something “convenient”, it’s not long before the old, better thing is out of business and no longer available to those who can tell the difference.
PS Now, as currently constituted (specifically in its streaming form), isn’t competing with anything, haha. Latency is terrible (especially during peak hours), and the game selection, while large, doesn’t include new games at or around release, which is what Stadia and xCloud are aiming for.
Shadow and GeForce Now I’m not familiar with, as one is in beta, and the other isn’t available in my country.
You have a client in QuakeWorld that you run locally. Stadia doesn’t run locally, it runs in the cloud. So you can’t reduce lag from Stadia to your display/input device with this. You can only reduce your lag to another user in multiplayer games - at least as described. Maybe they are doing something more interesting…
No one knows what business model xCloud is aiming for at this point.
I was talking about rollback networking, which does seem very similar to clientside prediction.