
evildictaitor Devil's advocate
  • DeLorean

    , magicalclick wrote

    You keep missing the point. I am talking about YOU lagging in front of OTHER PLAYERS. I am not talking about you playing the game smoothly. And that case #1.

    That's only a problem in multiplayer games.

    Delorean isn't really about multiplayer games so much as about defeating the video lag of streaming. It's much more about eliminating YouTube-style "buffering" and the problem of pressing a button and the visual feedback of that press not appearing on your screen until the button press has been relayed all the way up to the server, rendered, sent back, and finally displayed on your screen.

    The really cool thing about Delorean is how big the lag can get before people even notice. Beyond 100ms of lag between button and screen, the user notices the disconnect. But with Delorean, they raised the lag all the way out to 250ms, and players couldn't even tell they were playing over the Internet.

    DBVS said: 

    I was referring to your "No Compromise", "Don't like it, don't use it" attitude - the same attitude both ex-Microsofties exhibited in statements like yours. Extra points for the hyperbole.

    Math doesn't care if you like it or not. Proof that the intractable problem of lag can be solved using the scalable resource of bandwidth is cool research, whether or not you want to derail the discussion with imaginary implementation problems that clearly didn't affect their prototype in the lab.

    Either way, this is cool research. You can ignore it or you can read it. But saying Microsoft is stupid for even trying to make the world a better place by coming up with cool research that might one day solve problems we currently think of as intractable makes you sound like a boring troll.

    It's cool research any which way you look at it, and I, for one, look forward to it becoming a commercial product.

  • boom baby

    Windows isn't just the shell.

    If you want a custom shell, write your own, or pay for or join a community that's doing just that. Windows is perfectly extensible if that's what you're into, and there are entire communities and commercial products dedicated to exactly this.

  • DeLorean

    Good thing you have never been to the Gamespot System Wars forum. You won't survive very long, because you'll be so depressed by all the 720p jokes. Hell, even 928p is a joke to them. Anything hitting less than what the PS4 can do is automatically a fail.

    My point was that even low-speed broadband is already fast enough to cope with the maximum-resolution output of many premium XboxOne titles - and bandwidth is climbing faster than screen resolutions. Middle-speed broadband (10Mbps) is already fast enough for streaming 1080p content with Delorean - and that's top-end graphics at maximum resolution across the industry right now.

    By the time Microsoft is ready to ship a console with 4K output - let's be honest, two major versions from now (so a decade, probably) - your bandwidth isn't going to have a problem at just 47Mbps. Hell, even DBVS claims his internet connection could cope with that already, given his claimed 50Mbps download speed from Comcast.

    , DeathByVisualStudio wrote

    You are assuming a perfect world where the connectivity between endpoints is never disrupted (or disrupted for <= 250ms). I'm not talking about bandwidth. I'm talking about zero bytes going down the line in one direction or the other. It happens all the time. I have Comcast's 50Mbps service, as do plenty of my friends, and we all * about the flaky internet service. Zero bytes means zero streams and zero button pushes.

    Lol. So you're saying Delorean is crap because you have high (50Mbps) bandwidth, but lots of lag.

    If only someone could invent some way of, I don't know, trading some of the excessive bandwidth you have into, I don't know, maybe reducing some of that lag?

    Thanks for resurrecting the Sinofsky and Mattrick attitudes

    Lol, right. Boo Sinofsky for mathematically demonstrating that an otherwise intractable problem can be made tractable. Boo Mattrick and his stupid research papers for hypothetical future products that haven't even begun moving to a feature team yet.

    Quick! Let's devise a strawman argument as to why, despite the fact that you have 50Mbps of bandwidth, the 4.6Mbps download for maximum-resolution XboxOne titles in order to get rid of the lag is a terrible and stupid idea...

    I'm a little disappointed, but in retrospect not entirely surprised, that you managed to take a discussion about cool research and derail it with your tedious "Boo Microsoft! Everything they do is suxxors" attitude. But you know what? I don't care. If you're too cheap to afford a 4Mbps internet connection with <250ms lag, well, that's your problem, not mine.

    In the meantime, I still think proving that lag can be traded for bandwidth is probably going to be the most important research paper to come out of MSR this year, and I look forward to it being on the shelves ASAP.

  • DeLorean

    , magicalclick wrote
    I for one will game on a new 8K console, or a tablet at 1080p on the go, with multiplayer using LTE.

    It doesn't matter what resolution your screen is, it matters what resolution the content is.

    Whilst some games on XboxOne are 1080p, most are still 720p (including Call of Duty: Advanced Warfare, Call of Duty: Ghosts, Dead Rising 3, Killer Instinct, Metal Gear Solid V, The Witcher 3, Titanfall (actually 792p, but w/e), and Tomb Raider).

    Or to put it another way, Delorean means you can stream Xbox One's Call of Duty: Advanced Warfare at maximum resolution (i.e. the same resolution a local Xbox One would output to your TV) in less bandwidth than watching an HD movie from Netflix.

  • DeLorean

    , DeathByVisualStudio wrote


    And I for one am interested in a reliable Internet that doesn't disrupt my streaming regardless of how much bandwidth I have. Of course that's not Microsoft's concern since they only consider the "optimum" environment when designing their products. :S


    Let's put it this way. Right now, in 2014, streaming Netflix in HD takes about 6.6Mbps.

    Assuming an even distribution between a 1.5x and a 4.5x bandwidth slowdown, Delorean streaming full-resolution Xbox360 SD games to your TV will take about 4.7Mbps.

    Assuming, because you're DBVS and always like worst cases, we take the worst possible case of a constant 4.5x slowdown, we're still only talking 7Mbps for streaming full-resolution Xbox360 content - a tiny bit more bandwidth than streaming Netflix or YouTube in HD, which is already so common that most people do it every day without thinking about it.
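    As a sanity check on those figures, here's the arithmetic in a few lines of Python. The base bitrate is my assumption, back-solved so the numbers match the post, not a figure from the paper:

```python
# Back-of-envelope check of the figures above. The ~1.57Mbps base
# bitrate is an assumption, chosen so the results line up with the
# post's 4.7Mbps average and ~7Mbps worst-case numbers.

base_mbps = 1.57                 # assumed per-stream bitrate before overhead
avg_overhead = (1.5 + 4.5) / 2   # even distribution over 1.5x-4.5x => 3.0x
worst_overhead = 4.5

print(round(base_mbps * avg_overhead, 1))    # 4.7 (average case)
print(round(base_mbps * worst_overhead, 1))  # 7.1 (~the 7Mbps worst case)
```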

    If you don't like the idea of streaming from your mud-hut in Africa over 56Kbps, fine. Be my guest. Don't use cloud gaming, don't use Netflix, and go out and buy $40 movies on a disk and $60 games on a BluRay that require you to wait a day or go to the store. Nobody's going to force you to use this service. Don't like it? Don't use it.

    But don't try and pretend that an average of 4.7Mbps to stream content to your TV is some magical fantasy-land of bandwidth that only billionaires in their mansions can dream of - most people already stream more than that when they stream Netflix HD over the same internet connection, to the same TV.

    But at the end of the day, nobody really cares what the nay-sayers are gonna say. If customers want to point-and-click to start playing an entirely new game with no download-time, and rent game titles like they stream movies from Netflix, it'll happen.

    And thanks to Delorean, the games everyone else will be streaming will be games with no lag.

  • DeLorean

    , bondsbw wrote

    The same goes for DeLorean.  DeLorean is research; it is preparing for the future (hence the name), a future where connections under 50 Mbps may be difficult to fathom and 1 Gbps may be the average.

    I, for one, look forward wistfully to the futuristic utopian world you describe, where ordinary folk may be able to purchase speeds as high as 2.4-7Mbps on the open market, so they can stream games with 1.5-4.5x bandwidth overhead at native Xbox 360 resolution (720p).

  • DeLorean

    , DeathByVisualStudio wrote

    Modern controllers have more than 4 buttons.

    You don't have to predict all of the buttons - but you can if you want. It scales pretty linearly.

    If your bandwidth (and Datacenter CPU) can cope with it, you can predict tons of buttons and button combinations. In real life, though, you'll only predict the lag-sensitive buttons.

    Opening fire and jumping are probably worth predicting. But if the forward button is down, predicting backward is probably a wasted guess. So is predicting START or SELECT, where users care much less about a 100ms response time.

    Depending on how smart you want to be, you can even change which buttons you're predicting as you go. Maybe one part of the game it's worth predicting what happens if you jump, but when you're crouched with a sniper rifle out, it's probably a better guess to predict zoom than jump.

    Note that you will probably predict button deltas - not button pushes. That means if the user is running forwards, you'll predict forwards+left, forwards+right, and stop, rather than left, right, forwards, backwards.

    So whilst you can predict more than 4 buttons, you don't need to.
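    A hypothetical sketch of that selection logic - predicting lag-sensitive input *deltas* from the current input state, and skipping implausible or lag-insensitive buttons. All the names and rules here are illustrative, not the paper's actual implementation:

```python
# Hypothetical sketch: choose which input deltas to render speculatively.
# Rules and button names are illustrative, not from the DeLorean paper.

def candidate_deltas(held):
    """Return the set of input states worth rendering speculatively,
    given the set of currently held buttons/states."""
    candidates = set()

    # Movement: predict small deltas from what's already held.
    # Running forward? Guess forward+strafe and stopping, not a
    # sudden reversal.
    if "forward" in held:
        candidates |= {frozenset(held | {"left"}),
                       frozenset(held | {"right"}),
                       frozenset(held - {"forward"})}

    # Context-dependent guesses: scoped in with a sniper rifle,
    # "zoom" is a better bet than "jump".
    if "scoped" in held:
        candidates.add(frozenset(held | {"zoom"}))
    else:
        candidates.add(frozenset(held | {"jump"}))

    # Opening fire is always lag-sensitive, so always speculate it.
    candidates.add(frozenset(held | {"fire"}))

    # START/SELECT are never speculated: a 100ms menu delay is fine.
    return candidates

print(len(candidate_deltas({"forward"})))  # 5 speculative streams to render
```

    The point of using deltas is that the candidate set stays small and relevant: it grows with plausible next moves, not with the total number of buttons on the controller.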

    But the results speak for themselves: people playing Doom3 and Fable were unable to tell the difference between Delorean+250ms artificial lag and playing the same game locally.

    Increases bandwidth? Really? From the article:

    4.5x bandwidth for 720p is trivial and hell, even my phone over cellular can manage that already (and has for a while).

    Going by Netflix's guide, a 1.5x-4.5x slowdown looks like:

    Low quality: 1Mbps-3Mbps
    Med quality (SD): 2.4Mbps-7Mbps
    High quality (HD): 10Mbps-30Mbps
    High quality (HD+3D): 16Mbps-48Mbps
    Super high quality (4K Max): 23Mbps-71Mbps.
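    Those ranges are just per-quality base bitrates multiplied through by the 1.5x-4.5x overhead. A quick sketch - the base bitrates below are my approximations, reverse-engineered from the ranges above, not official Netflix figures:

```python
# Reconstructing the ranges above: each is roughly a base per-quality
# bitrate times DeLorean's 1.5x-4.5x bandwidth overhead. Base bitrates
# are my approximations, not official Netflix numbers.

BASE_MBPS = {
    "Low quality": 0.7,
    "Med quality (SD)": 1.6,
    "High quality (HD)": 6.7,
    "High quality (HD+3D)": 10.7,
    "Super high quality (4K Max)": 15.7,
}

for quality, base in BASE_MBPS.items():
    # Print the speculative-streaming range for this quality tier.
    print(f"{quality}: {base * 1.5:.1f}-{base * 4.5:.1f}Mbps")
```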

    Now maybe you're hiding out in Africa, dialing into C9 over a 56Kbps modem and cursing at the big pictures on the page, but those numbers don't look like they're from a crazy impossible world of giganto-broadband. This is well within what normal people have in their homes - especially when we consider that most games on Xbox360 were SD in the first place.

    Comcast's $80/mo 50Mbps package is more than the worst-case 4.5x on HD+3D. Alternatively, you, your wife, and each of your three children can all separately play Delorean's worst-case 4.5x slowdown at SD on different devices around your house, on Comcast's 50Mbps package.

    Better still: Verizon's 4G package will put 12Mbps through to your phone over cellular. That means you could play a game in 720p, with Delorean's worst-case 4.5x slowdown, on your phone on the bus. And when it's only incurring a 1.5x slowdown rather than 4.5x, it can upgrade your resolution from a mere 720p to a full 1080p. So when Apple eventually releases a phone capable of that resolution, you'll be able to stream non-laggy games to it whilst on the bus, with cellular bandwidth that's already available today.

    But to me, noting that these speeds are already available is a kind of "well, that's nice" - it misses the bigger point about why Delorean is so ground-breaking:

    Until I read this paper, I was under the impression that lag was an intractable problem that was always going to be painful for cloud gaming. The speed of light prevents my packets from travelling between London and Seattle any faster than physics allows - and the round trip makes my games laggy. That's a big deal, and no amount of dollars, new infrastructure, or waiting for new protocols and versions of things is going to make my packets break the light-speed barrier and suddenly drop the lag enough to make playing games in a datacenter playable.

    But Delorean shatters that illusion - because by trading lag for bandwidth (even at 1000x cost, never mind a mere 1.5x), Delorean turns an intractable problem that physics fundamentally prevents us from defeating into a tractable one that we know is getting better all the time.

    Bandwidth can keep going up. The speed of packets really can't.

    So by demonstrating that you can turn a lag problem into a bandwidth and CPU problem, Microsoft have just made cloud gaming not only feasible - but feasible now. For most people the 1.5x-4.5x bandwidth overhead, even on HD, isn't really a big problem; but even if it were: if your home network can't handle the strain, you'll just trade quality for bandwidth, and you'll be shooting slightly fuzzy zombies - but you'll be doing it with no lag.

    One other thing to consider: the user's input can also be disrupted in getting to the backend. That's two too many points of pain for any immersive gaming.

    That's in the paper. Yes, the server needs to reconcile gamer movements, but again - this is a solved problem. Multiplayer games have been doing it for well over a decade. Not a big deal.

    For single-player games, the story is even better: the server predicts 4 states and sends you 4 streams. You pick one and send your choice back to the server. The server then takes the state corresponding to your selection and predicts the next four states to render and send back to you. And if your client gets out of sync with the server, the server just reverts to your state - since Delorean is predicated on quickly shifting states - and recomputes what it would have done, as if your input had reached it at the moment you pressed the button.
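    A minimal sketch of that reconciliation loop - assumed structure, not the paper's actual implementation. The server renders one speculative branch per predicted input, the client reports which input actually happened, and the server continues (or recomputes) from the matching state:

```python
# Assumed sketch of the single-player speculation loop described above.
# Class and function names are illustrative, not from the paper.

class SpeculativeServer:
    def __init__(self, state, game_step, predict_inputs, n=4):
        self.state = state
        self.game_step = game_step            # (state, input) -> next state
        self.predict_inputs = predict_inputs  # state -> likely next inputs
        self.n = n

    def tick(self):
        """Render one speculative branch per predicted input. In the
        real system each branch is rendered and encoded as video; here
        a 'stream' is just the successor game state."""
        inputs = self.predict_inputs(self.state)[: self.n]
        return {i: self.game_step(self.state, i) for i in inputs}

    def commit(self, chosen_input, streams):
        """Client reports which input actually happened; adopt that
        branch, or recompute on a misprediction as if the input had
        arrived in time."""
        if chosen_input in streams:
            self.state = streams[chosen_input]
        else:
            self.state = self.game_step(self.state, chosen_input)

# Toy game: state is a position, input is a velocity delta.
server = SpeculativeServer(
    state=0,
    game_step=lambda s, i: s + i,
    predict_inputs=lambda s: [+1, -1, 0, +2],
)
streams = server.tick()        # 4 speculative branches
server.commit(-1, streams)     # client actually pressed "back"
print(server.state)            # -1
```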

    Again - this isn't just something you could do in theory; it's something Delorean has demonstrated. It's fundamental to what the project set out to achieve, and they achieved it.

  • DeLorean

    , spivonious wrote

    I don't even think any ISP in the US offers 1Gbps. Comcast (the only ISP in my area) maxes out at 150Mbps, and it's a ridiculous $115/month.

    My home internet is 1Gbps - and it's only $80/mo: http://www.condointernet.net/

    Whatever you guys might think normal internet speeds are - Netflix is already streaming at 25Mbps to customers, and 25Mbps is vastly more bandwidth than Delorean needs to stream 720p games.

  • DeLorean

    , magicalclick wrote

    Nah... after hearing the drawbacks, I will keep it as minor research. Because while bandwidth grows over time, so does resolution. Yes, people keep saying current resolution is enough, but all the fanboys will focus on resolution for at least a few decades.

    It's only 25Mbps to get a 4K TV show from Netflix to your living room - and my bandwidth is 1Gbps. Admittedly that's towards the top end of residential bandwidth, but within a few years it'll be standard.

    And part of the point of Delorean is that because of compression, predicting 4 button inputs doesn't mean sending 4x the data.

    Or to put it another way: Delorean increases bandwidth, but for 720p streaming, this kind of bandwidth is already trivially available pretty much everywhere.

  • DeLorean

    , DeathByVisualStudio wrote

    Sounds like a way to get out of the console manufacturing business. At least when this whole thing caves in on itself Microsoft will have some good compression technology they can use elsewhere. Demand for higher-resolution, more immersive gaming will continue to go up, and neither this technology nor the internet will be able to keep up with the bandwidth requirements. It also assumes that the internet connection between provider and home will be pretty stable. Sounds like they'll be trying to sell the "less is more" idea once again.

    Just to be clear, I'm not suggesting they stop trying to innovate. I am saying that the time and money would be better spent elsewhere.

    No, it's the opposite.

    Not long ago, Netflix was a specialist product - nobody had the bandwidth for it. Not long before that, YouTube sounded insane. And not long before that, people were "optimising" their websites to avoid big images on their home pages.

    Bandwidth is going up all the time - and we already live in a world where I can reliably stream hours of 720p content from Netflix over a standard middle-of-the-road internet connection - and Netflix is already moving not just to 1080p, but to streaming 4K data to customers. And my home connection already has enough bandwidth to stream 40 Netflix 4K movies to my laptop at once.

    The problem for cloud gaming has never really been a bandwidth problem. It's been a lag problem. On a choppy line, Netflix can buffer several seconds ahead of where you are, so that it can smooth out any bumps in the bandwidth. Games can't do that in general. How can you buffer a boss fight when it's not a passive video but an active battle? You can't buffer video of a battle when you don't know whether the person is going to dodge left, shoot, or throw a grenade.

    Or can you?

    And that's what Delorean is all about. By predicting your inputs, the server constructs multiple plausible streams, each of which is buffered and sent to you. The server says "I don't know if you're going to go left, or right, or shoot, so here's video of what happens if you were to go left, or go right, or shoot". When you pull the trigger, your local Xbox says "Aha. I'll tell the server I shot, and select the 'I decided to shoot' stream to send to your TV".

    That way your on-screen character doesn't just sit there blankly staring at the zombie trying to eat your face whilst you wait for your button push to travel from London to New York, get rendered on some GPU farm, and get sent back to your TV in London over a laggy connection.

    No: Before the server even knows you started shooting, your TV is showing what happens when you do.
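    A hypothetical sketch of that client-side selection - the server has already sent one buffered stream per predicted input, and the client just picks the one matching what the player actually did. Names and the fallback behaviour are my assumptions, not the paper's implementation:

```python
# Assumed sketch of client-side stream selection. The server has sent
# one speculative video stream per predicted input; the client shows
# the stream matching the real input, with zero perceived lag.

def select_frame(streams, actual_input, report):
    """Pick the speculative stream matching the real input, and tell
    the server which branch came true so it can continue from there."""
    report(actual_input)  # send the real input upstream (async in practice)
    if actual_input in streams:
        return streams[actual_input]  # prediction hit: no perceived lag
    # Prediction miss: fall back to the idle stream until the server's
    # corrected stream arrives.
    return streams.get("idle")

# Toy example: streams keyed by the input they speculate on.
streams = {"shoot": "frame:muzzle-flash",
           "left": "frame:strafe-left",
           "idle": "frame:standing"}
sent = []
frame = select_frame(streams, "shoot", sent.append)
print(frame)  # frame:muzzle-flash
print(sent)   # ['shoot']
```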

    This is amazing research, because lag was widely thought to be an intractable problem for cloud gaming, and this proves you can turn lag into CPU/GPU cost on the server, and bandwidth cost for the client - both of which we already know are basically solved problems.

    And what this means is that Microsoft can stop selling you $600 consoles with a state-of-the-art CPU and GPU, a huge hard drive, optical disks and so on, and instead sell you a $100 box with just a video decoder and an HDMI port - and for a $20/mo or so subscription you can stream games directly with no lag.

    Want to play GTAV? Click it. Bam. Done. You're shooting now. No stupid 4hr download to wait for.

    Wondering if Fable4 is right for you? Click. Bam. Done. Don't bother with the trailer - just dive straight in and see if it's right for you with a 2hr demo of the actual game. No downloads - click "Go" and it's there.

    Choppy line? No problem. Delorean means that when you pulled the trigger, the zombie died right away on your screen, even though the network hit a bump and didn't receive any packets for ~200ms. No lag, and no throwing the controller against the wall because your character just stood there and got his face eaten. The zombie is dead and you didn't even notice the lag - just like you don't notice the lag on Netflix.

    And why is it so cheap? Because Microsoft has massive economies of scale. Rendering 100,000 customers' games in a Datacenter is much cheaper than making 100,000 Xboxes capable of rendering the same games - and the cost amortizes over the day. When you put the controller down and go to bed, that hardware is being used by other customers in China or Europe.

    And it's not blue-sky "I wonder if..." research either.

    They built it, they tried it out, and showed it worked.

    Customers playing Doom3 and Fable3 in their lab, with the games being rendered in the cloud, didn't notice the artificial 250ms delay added to the network.

    Delorean isn't about getting out of the console market. It's about knocking it out of the park.