7 hours ago, DeathByVisualStudio wrote
Modern controllers have more than 4 buttons.
You don't have to predict all of the buttons - but you can if you want. It scales pretty linearly with the number of input states you speculate on.
If your bandwidth (and Datacenter CPU) can cope with it, you can predict tons of buttons and button combinations. In real life, though, you'll only predict the lag-sensitive buttons.
Opening fire and jumping are probably worth predicting. But if the forward button is down, predicting backward is probably a wasted guess. As is pressing START or SELECT, where users care much less about a 100ms response time.
Depending on how smart you want to be, you can even change which buttons you're predicting as you go. Maybe in one part of the game it's worth predicting what happens if you jump, but when you're crouched with a sniper rifle out, it's probably a better guess to predict zoom than jump.
Note that you will probably predict button deltas - not button pushes. That means if the user is running forwards, you'll predict forwards+left, forwards+right, and stop, rather than left, right, forwards, backwards.
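To make the delta idea concrete, here's a minimal sketch (my own illustration, not the paper's algorithm - the button names and the budget parameter are made up) of generating candidate input states as small deltas from what the player is doing right now:

```python
# Hypothetical button sets, purely for illustration.
LAG_SENSITIVE = ["fire", "jump", "forwards", "left", "right"]  # worth speculating on
LAG_TOLERANT = ["start", "select"]                             # 100ms extra is fine here

def speculate_next_inputs(current: set[str], budget: int) -> list[set[str]]:
    """Return up to `budget` candidate input states, each a small delta from
    the player's current input (toggle one lag-sensitive button), plus the
    'no change' guess. Cost scales with how many candidates you keep."""
    candidates = [set(current)]                 # guess 1: keep doing what you're doing
    for button in LAG_SENSITIVE:
        delta = set(current) ^ {button}         # toggle: press if up, release if down
        if delta not in candidates:
            candidates.append(delta)
    return candidates[:budget]

# Running forwards? The first guesses are forwards, forwards+fire,
# forwards+jump and stop - never START or SELECT.
print(speculate_next_inputs({"forwards"}, budget=4))
```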
So whilst you can predict more than 4 buttons, you don't need to.
But the results speak for themselves: people playing Doom3 and Fable were unable to tell the difference between Delorean+250ms artificial lag and playing the same game locally.
Increases bandwidth? Really? From the article:
4.5x bandwidth for 720p is trivial and, hell, even my phone over cellular can manage that already (and has for a while).
Going by Netflix's guide, a 1.5x-4.5x bandwidth overhead looks like:
Low quality: 1Mbps-3Mbps
Med quality (SD): 2.4Mbps-7Mbps
High quality (HD): 10Mbps-30Mbps
High quality (HD+3D): 16Mbps-48Mbps
Super high quality (4K Max): 23Mbps-71Mbps.
Now maybe you're hiding out in Africa dialing into C9 over a 56kbps modem and cursing at the big pictures on the page, but those numbers aren't from some crazy, impossible world of giganto-broadband. This is well within what normal people have in their homes - especially when you consider most games on the Xbox 360 were SD in the first place.
Comcast's $80/mo 50Mbps package is more than even the worst case: 4.5x on HD+3D. Alternatively, you, your wife and your three children can each play at Delorean's worst-case 4.5x overhead at SD on separate devices around the house, all on that same 50Mbps package.
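Just to sanity-check that household maths (a trivial sketch using the worst-case 4.5x figures from the list above):

```python
# Worst-case 4.5x figures quoted above, in Mbps.
hd_3d_worst = 48   # HD+3D at 4.5x
sd_worst = 7       # SD (med quality) at 4.5x
comcast = 50       # Comcast's $80/mo package

print(hd_3d_worst <= comcast)                  # True: one worst-case HD+3D stream fits
print(5 * sd_worst, 5 * sd_worst <= comcast)   # 35 True: five simultaneous SD streams fit too
```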
Better still: Verizon's 4G package will put 12Mbps through to your phone over cellular. That means you could play a game in 720p, with Delorean's worst-case 4.5x overhead, on your phone on the bus. And when it's only incurring 1.5x rather than 4.5x, it can upgrade your resolution from a mere 720p to a full 1080p. So when Apple eventually release a phone capable of that resolution, you'll be able to stream non-laggy games to it on the bus, with cellular bandwidth that's already available today.
But to me, noting that these speeds are already available is just a "well, that's nice" - it misses the bigger point about why Delorean is so ground-breaking:
Until I read this paper, I was under the impression lag was an intractable problem that was always going to be painful for cloud gaming. The speed of light fundamentally stops my packets getting between London and Seattle any faster than physics allows - and the round trip makes my games laggy. That's a big deal, and no amount of dollars, new infrastructure or waiting for new protocols and versions of things is going to make my packets break the light-speed barrier and drop the lag enough to make playing games in a Datacenter playable.
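A rough back-of-envelope (my own numbers: roughly the London-Seattle great-circle distance, and light in fibre at about 2/3 of c) shows how hard that floor is:

```python
distance_km = 7_700            # approx. great-circle distance, London to Seattle
light_in_fibre_km_s = 200_000  # light in glass travels at roughly 2/3 of c

one_way_ms = distance_km / light_in_fibre_km_s * 1000
print(f"best-case round trip: {2 * one_way_ms:.0f} ms")  # ~77ms before any routing or processing
```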
But Delorean shatters that illusion - because by trading lag for bandwidth (a trade worth making even at 1000x the cost, never mind merely 1.5x-4.5x), Delorean turns an intractable problem that physics fundamentally prevents us from defeating into a tractable one we know is getting better all the time.
Bandwidth can keep going up. The speed of packets really can't.
So by demonstrating that you can turn a lag problem into a bandwidth and CPU problem, Microsoft have just made cloud-gaming not only feasible - but feasible now. For most people the 1.5x-4.5x bandwidth overhead even on HD isn't really a big problem, but even if it were: if your home network can't handle the strain, you'll just trade quality for bandwidth, and you'll be shooting slightly fuzzy zombies - but you'll be doing it with no lag.
One other thing to consider: the user's input can also be disrupted in getting to the backend. That's two too many points of pain for any immersive gaming.
That's in the paper. Yes: the server needs to reconcile gamer inputs, but again - this is a solved problem. Multiplayer games have been doing this for well over a decade. Not a big deal.
For single-player games, the story is even better: the server predicts 4 possible next states, renders them, and sends you a stream for each. You pick the one matching what you actually pressed and send your choice back to the server. The server then takes the state corresponding to your selection and predicts the next four states to render and send back to you. If your client and server ever get out of sync, the server just reverts to your state - Delorean is predicated on quickly shifting states - and recomputes what it would have done as if your input had reached it at the moment you pressed the button.
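Here's roughly how I read that loop (a sketch of the idea only - the world/speculate/render/send/recv pieces are placeholders of my own, not the paper's API):

```python
def server_tick(world, speculate, render, send_streams, recv_choice, n=4):
    """One server-side round of speculate -> stream -> roll back to reality."""
    # Speculate n possible next inputs and render a stream for each.
    predicted_inputs = speculate(world, n)
    predicted_states = [world.advance(i) for i in predicted_inputs]
    send_streams([render(s) for s in predicted_states])

    # The client shows whichever stream matched what the player really pressed,
    # and tells the server which one it was.
    chosen = recv_choice()

    # Revert to that state and carry on from there, exactly as if the input
    # had arrived with zero lag.
    return predicted_states[chosen]
```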
Again - this isn't just something you can do, this is something Delorean has demonstrated - this is fundamental to what it's setting out to achieve, and they achieved it.