Coffeehouse Thread

43 posts

Microsoft admits Direct3D and GPU's are not designed for gaming

  • androidi

    http://xboxforums.create.msdn.com/forums/p/89293/538811.aspx

    Shawn Hargreaves > XNA (in fact all of D3D and GPU hardware) is focused on 60hz animation, and not designed for either inputs or outputs with millisecond precision.

    (note the italics)

    Unfortunately game and tech journalism is in on this scam/conspiracy to ruin the arcade and twitch PC gaming experience for good.

    I don't recall a single tech journalist using a benchmark for true keyboard-to-display latency. (Input to output, that's pretty critical, right?)

    One could probably build such a benchmark tool on a breadboard with $5 of components from eBay, yet no one in "tech" journalism has bothered. Maybe they aren't that much into tech, or are paid to shut up about this by Nvidia, Intel and Microsoft.

    I think it would be most interesting to see a lineup of all sorts of PC hardware, from the early 80s to right now, tested for end-to-end input-to-output latency and jitter.

    I would not be entirely surprised if the best-performing hardware was made, as usual, in the early 90s, as usual. Smiley

     

     

  • androidi

    I have to say though, I'm not entirely sure how I would go about measuring the input from a keyboard such that the keyboard controller is included in the end-to-end measurement. Would key travel time be included? And how would such a device be implemented cheaply (so a wide range of configurations would get submitted into a results database) and without needing to disassemble existing keyboards or mice?

    I think the solution could be a latency-calibrated touch sensor pad plus an audio sensor: you calibrate the audio input latency using the sensor pad, then subtract the key travel time by analyzing the audio. But this seems overly complicated.

  • figuerres

    So you get an inch and you take a mile?

    how many humans can react in the millisecond time range?

    how many games *NEED* that level of high-detail timing?

    now this is just a wild guess, but I bet that most humans playing most games are just fine with n-times-a-second timing.

    as far as I have seen, DX does a good job for almost all games I have seen.

    Heck, to get solid millisecond timing on I/O you would need to design a whole system to make sure that, for example, a disk I/O did not lock a thread that in turn delayed delivering a key click to some app.  That might not be a great way of stating it, but just look at real-time systems and know that Windows is not *real-time*, never has been.  And with XNA you add .NET and GC and all that on top....

  • Sven Groot

    The only game that I can ever recall having played that had a problem with control lag was Need for Speed 3: Hot Pursuit. That game's slow reaction to the controls and impossibly dark night tracks meant that you could only feasibly drive at night if you'd memorized the entire track. That's the reason I preferred to play those same tracks in Need for Speed 4: High Stakes instead (it had all the NFS3 tracks in addition to some new ones).

    Other than that, I have never, ever experienced the kind of latency you are describing. I've never had a problem with a noticeable delay between my actions and the result on screen unless that delay was caused by network lag in a multiplayer game.

  • Proton2

    The average human reaction time appears to be around 200 to 250 milliseconds:

     

    http://hypertextbook.com/facts/2006/reactiontime.shtml

  • warren

    How in the world did you derive a headline of "Microsoft admits Direct3D and GPU's are not designed for gaming" from the actual comment (on a forum) by Shawn Hargreaves, a software developer on the Windows Phone team: "XNA (in fact all of D3D and GPU hardware) is focused on 60hz animation, and not designed for either inputs or outputs with millisecond precision."...

    Are you intentionally trying to be controversial in order to draw attention to yourself, or do you simply not understand the subject?

    Let's assume the latter, because the former suggests deep psychological and self-confidence problems which are beyond the scope of Channel9.

     

    Game loops generally run at either 16.7 or 33.3ms precision -- this is true of pretty much all platforms, well beyond Microsoft's.  You'll find it on PlayStation, you'll find it on Wii, you'll find it on Android, you'll find it everywhere, going back for many years.  Why?  It's because many monitors made before the last couple of years can't draw faster than 60hz.  There are some 120hz and 240hz displays out there, but they are often considered to be at a disadvantage relative to 60hz when displaying fast-moving 60hz (or 30hz) video content, for a variety of complicated content-production reasons.  It's only become possible in the last two years or so to even get 120hz-capable desktop monitors and graphics cards, and of course, your CPU etc. need to be capable of rendering a frame of your game every 8.3ms if you expect it to be smooth.

    What the person from Microsoft is talking about here is millisecond precision, which means something closer to 1ms from the time you hit a key on a keyboard to the time the display is updated.  This isn't really possible using any current PC technology, considering everything from USB latency, to memory protections afforded by modern operating systems, to the fact that it's very, very hard to write an input loop that is even capable of consistently taking 1,000 input polls a second and issuing commands that have visual results in 1ms.  That kind of perf is still multiples beyond what mainstream hardware is capable of today.
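
    To illustrate just the polling half of that, here is a rough C# sketch of a loop trying to take 1,000 input samples a second; ReadInputState() is a hypothetical placeholder, not a real API:

        using System;
        using System.Diagnostics;

        class InputPoller
        {
            // Hypothetical placeholder -- stands in for whatever raw input read you'd really use.
            static bool ReadInputState() { return false; }

            static void Main()
            {
                long ticksPerMs = Stopwatch.Frequency / 1000;
                var sw = Stopwatch.StartNew();
                long nextPoll = 0;
                long polls = 0, missedSlots = 0;

                // Try to take one input sample every millisecond for one second.
                while (sw.ElapsedMilliseconds < 1000)
                {
                    if (sw.ElapsedTicks >= nextPoll)
                    {
                        ReadInputState();
                        polls++;
                        nextPoll += ticksPerMs;

                        // If the thread got preempted and we fell more than one
                        // slot behind, count the 1 ms slots we missed.
                        while (sw.ElapsedTicks >= nextPoll)
                        {
                            missedSlots++;
                            nextPoll += ticksPerMs;
                        }
                    }
                }

                Console.WriteLine("polls: {0}, missed 1 ms slots: {1}", polls, missedSlots);
                // Even a busy-wait loop like this misses slots whenever the OS schedules
                // something else, which is why consistent millisecond precision is hard
                // on a general-purpose OS.
            }
        }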

    One millisecond is a hell of a lot shorter period of time than most programmers realise.  Sure, anyone can knock together a for loop in C# that multiplies 100,000 integers together in 1ms, but that's toddler-class compared to the computations needed to render even a single frame of a game that uses dozens of effects with thousands of textures, never mind handling collisions and other physics concerns, which is pretty much Ph.D. level math.
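
    For scale, a minimal sketch of the kind of trivial loop meant here, timed with Stopwatch (the exact timing will obviously vary by machine):

        using System;
        using System.Diagnostics;

        class LoopTiming
        {
            static void Main()
            {
                var sw = Stopwatch.StartNew();

                // Multiply 100,000 integers together. The values don't matter
                // (the product overflows long almost immediately); we only care
                // how long the loop itself takes.
                long product = 1;
                for (int i = 1; i <= 100000; i++)
                {
                    product *= i;
                }

                sw.Stop();
                Console.WriteLine("loop took {0:F3} ms (result: {1})",
                                  sw.Elapsed.TotalMilliseconds, product);
            }
        }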

    It all comes down to multiples.  If a game can't run its logic + render loop faster than every 33ms, then 60hz doesn't matter.  If a game can achieve everything consistently in 16ms, then yes, 60hz is possible.  But if it takes 17ms, then you won't be able to consistently draw at the display's refresh rate, and the game will not feel smooth.
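
    The "multiples" argument is really just a budget check; here is a rough C# sketch, with a made-up 17 ms frame time:

        using System;

        class FrameBudget
        {
            static void Main()
            {
                double refreshHz = 60.0;
                double vsyncIntervalMs = 1000.0 / refreshHz;   // ~16.7 ms per refresh

                // Hypothetical measured cost of one logic + render pass.
                double frameTimeMs = 17.0;

                // With vsync, a frame that misses the window waits for the next refresh,
                // so the effective rate drops to the next whole divisor of the refresh rate.
                int refreshesPerFrame = (int)Math.Ceiling(frameTimeMs / vsyncIntervalMs);
                double effectiveHz = refreshHz / refreshesPerFrame;

                Console.WriteLine("{0} ms per frame on a {1} Hz display -> effective {2:F0} Hz",
                                  frameTimeMs, refreshHz, effectiveHz);
                // 16 ms fits the 16.7 ms window (60 Hz); 17 ms just misses it and drops to 30 Hz.
            }
        }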

    None of this is related to the design of Direct3D or XNA.  They'll go as fast as the hardware allows.  Once we have USB 3.0 keyboards, monitors that run at 960hz, and graphics cards several times faster than are on the market today, then you can have your millisecond precision in games.  Until then, don't get worked up over the fact that it doesn't exist.

  • Charles

    ???
    C

  • magicalclick

    I suppose if you use some kind of ultra 1-million-DPI fiber optic mouse, you might consider that millisecond an advantage in an extremely competitive Counter-Strike match. Assuming the other parts of the computer add zero bottleneck, of course.

    Leaving WM on 5/2018 if no apps, no dedicated billboards where I drive, no Store name.
  • Proton2

    There are also the limitations of the human eye:

    "The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually,[1] but the threshold of perception is more complex, with different stimuli having different thresholds: the average shortest noticeable dark period, such as the flicker of a cathode ray tube monitor or fluorescent lamp, is 16 milliseconds,[2] while single-millisecond visual stimulus may have a perceived duration between 100ms and 400ms due to persistence of vision in the visual cortex."

    http://en.wikipedia.org/wiki/Frame_rate#Background

  • JoshRoss

    In other news, area man admits that Windows wasn't designed to run programs.

  • Bas

    So how have we been playing all these games then?

  • ZippyV

    https://twitter.com/id_aa_carmack/status/193480622533120001

    I can send an IP packet to Europe faster than I can send a pixel to the screen. How f'd up is that?

  • Dr Herbie

    Bas wrote:

    So how have we been playing all these games then?

    I have decided to stop playing games, because it's obviously not possible to play them. It's a shame really, I've nearly finished Skyrim.

    Herbie

  • androidi

    Without a way to measure it and some hard data, it's pointless to try to argue this.

    > average human reaction time appears to be around 200 to 250 milliseconds

    Reaction time tests are really contrived, and it's always a laughing point when they're quoted in an irrelevant context: we usually get cues that allow us to anticipate events from a distance, especially in gaming and musical contexts. I've seen how the 200-500 ms reaction time figures often quoted have been devised, and what they measure is the scenario where you are blind and deaf, your blindness suddenly goes away, and you react to that singular and sudden event.

     

    I already use a CRT and I can set it to 200 hz should I wish to, but that doesn't really tell you anything about latencies. Reacting to game events is conceptually similar to a "feedback loop" in electronics, except that added latency increases the loop length while the ability to anticipate (look-ahead) decreases it. In a game like TrackMania, or many arcade-style games, the intensity and satisfaction derived from the game are strongly correlated with the length of this feedback loop.

    http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11

    "To cushion jitter, Nvidia is increasing the amount of lag in the graphics subsystem".

    Let's say the loop length is increased to reduce jitter (which may have something to do with an architecture that was designed with office/multitasking first in mind), at the expense of games that work better with a shorter feedback loop.

    In this case, a theoretical fix is to keep the loop length the same (e.g. target 120 hz vsync for an 8.3 ms interval) to reduce jitter and tearing, but when user input arrives, the OS or game doesn't have to use a "polling" method to react to it. The keyboard/game controller and bus can be designed for low latency in a way that does not require any polling or electricity (schematics for soft switches that don't consume electricity are out there; eevblog detailed one). This key event can be made to use similar technology to 10 Gigabit Ethernet or FireWire, so the keyboard makes its buffer directly available to the game without any OS intervention. The game can then look at how long the typical game loop takes on the current system and decide whether there's time to re-run the loop based on the new input before that 8.3 ms timing window closes, as in the sketch below. This shortens the feedback loop, as otherwise you'd possibly get that additional 8.3 ms of delay.
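
    A rough sketch of that decision in C# (all the function names here are hypothetical placeholders, not a real API):

        using System;
        using System.Diagnostics;

        class LateInputFrame
        {
            const double VsyncIntervalMs = 1000.0 / 120.0;   // ~8.3 ms target window

            // Hypothetical placeholders for the real game loop pieces.
            static void Simulate() { }
            static void Render() { }
            static bool NewInputArrived() { return false; }

            static void RunOneFrame(double typicalSimCostMs)
            {
                var frame = Stopwatch.StartNew();

                Simulate();

                // If fresh input showed up after the first pass, and a typical
                // simulation pass still fits into what's left of the 8.3 ms window,
                // re-run the simulation so this frame reflects the new input instead
                // of deferring it to the next refresh.
                double remainingMs = VsyncIntervalMs - frame.Elapsed.TotalMilliseconds;
                if (NewInputArrived() && typicalSimCostMs < remainingMs)
                {
                    Simulate();
                }

                Render();   // then present at the vsync boundary
            }

            static void Main()
            {
                RunOneFrame(2.0);   // e.g. the simulation usually takes ~2 ms on this system
                Console.WriteLine("ran one frame");
            }
        }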

    Then there's the matter of 2D rendering. I have seen stats which suggest that for 2D rendering, XP with the appropriate card and drivers is 10-20 times faster than Windows 7 with the latest generation of hardware as of when those stats were made. 20 times faster and no jitter cushioning could potentially mean that XP plus old drivers and an old graphics card has lower end-to-end latency. I saw 20 ms quoted for modern 3D stack driver latency. But to measure this, some measurement hardware is needed.

    Based on my observations and feel of things, I believe that 50 ms end-to-end latency is likely typical for a gaming PC today. That is WAY TOO MUCH! Just by switching from LCD to CRT you can cut maybe 20-30 ms, so that's better, but from experience I know that you really want <20 ms to "feel in sync" with the game or musical instrument for a consumer-level experience. At a professional/competitive level, <10 ms is necessary.

    No matter how many hz your display is updating at, that figure doesn't tell you how long it takes from a key press for something to happen on the screen.

    It's much, much easier to play when you don't need to input ahead of time (lag compensation) and rely on memorization (the increased end-to-end lag reduces the "anticipation horizon"). When I try playing things from memory, it's like playing piano from memory: instead of playing to the music, you're playing to some memorized thing, timing imprecision accumulates quickly, and the outcome isn't as musical as when you're feeling the music already sounding and playing to the patterns and timings established live. And in the case of TrackMania, I've noticed consistently over years of playing that my first race is often my best, because memory isn't messing things up and it's much more about anticipation based on the visual input, very similar to live jamming in music, where you're anticipating what the other players do. A great groove involves millisecond precision (swing/shuffle) and anticipation; musical context and "feeling it" make that possible.

     

  • BitFlipper

    Proton2 wrote:

    The average human reaction time appears to be around 200 to 250 milliseconds: 

    That response time is a different thing. What you are talking about is the time between seeing something and responding physically. What this thread is talking about is the time between performing an action and the display/sound reacting to it. Those two things are essentially serialized.

    When playing online FPS games, anything more than about 50 ms of round-trip network lag starts to get annoying, and 200 ms is unplayable. This was true in the old days at least, but I think many games these days can mask the latency relatively well. They do this by performing the actions locally rather than waiting for the round trip from the server before performing the action.
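
    A bare-bones sketch of that local-prediction idea (the types and numbers are made up for illustration):

        using System;

        class PredictionSketch
        {
            static double shownX = 0;    // position drawn to the local player immediately
            static double serverX = 0;   // last authoritative position from the server

            // Apply the player's move locally right away, without waiting for the round trip.
            static void OnLocalInput(double move)
            {
                shownX += move;
            }

            // When the server's (older) authoritative state arrives ~50+ ms later,
            // blend toward it instead of snapping, so corrections are less visible.
            static void OnServerUpdate(double authoritativeX)
            {
                serverX = authoritativeX;
                shownX += (serverX - shownX) * 0.1;
            }

            static void Main()
            {
                OnLocalInput(1.0);     // the player sees the move instantly
                OnServerUpdate(0.9);   // later, the server disagrees slightly
                Console.WriteLine("shown position: {0:F2}", shownX);
            }
        }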

  • cbae

    By definition, any game that DEPENDS on millisecond precision suffers from really shitty gameplay.

  • BitFlipper

    BTW, there are relatively easy ways to test input-to-output latency. I did this once when I wanted to test the response time on a Windows Phone from touching the display to producing a sound. It isn't enough just to know how large the input buffer is (100 ms is the smallest input buffer on Windows Phone - quite annoying). What you also need to test is how long the OS takes to send the touch event to the app, and how long it takes for the buffer of audio to actually be played out via the DAC.

    What I did was create a simple app with a button; pressing the button would trigger a click sound. Then, using an external microphone placed very close to the phone and a desktop audio application, I recorded myself tapping on the screen and the resulting click from the phone. I then zoomed in on the waveform and measured the time difference between the first audio event (my finger hitting the screen) and the resulting audio click from the phone.
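
    That last waveform step could also be automated; here is a rough C# sketch that assumes the recording has already been loaded into an array of mono samples (the threshold and the 30 ms gap are arbitrary guesses):

        using System;

        class TapToClickLatency
        {
            // Find the first sample at or after 'start' that exceeds the threshold.
            static int FindOnset(float[] samples, int start, float threshold)
            {
                for (int i = Math.Max(start, 0); i < samples.Length; i++)
                    if (Math.Abs(samples[i]) > threshold)
                        return i;
                return -1;
            }

            // Placeholder so the sketch compiles; replace with real WAV loading.
            static float[] LoadRecording() { return new float[44100]; }

            static void Main()
            {
                int sampleRate = 44100;
                float[] samples = LoadRecording();   // hypothetical: the recorded mono track

                // First onset: the finger hitting the screen. Second onset: the phone's
                // click, searched for after skipping ~30 ms of the tap transient.
                int tap = FindOnset(samples, 0, 0.1f);
                int click = FindOnset(samples, tap + sampleRate * 30 / 1000, 0.1f);

                if (tap >= 0 && click >= 0)
                    Console.WriteLine("tap-to-click latency: {0:F1} ms",
                                      (click - tap) * 1000.0 / sampleRate);
                else
                    Console.WriteLine("couldn't find both onsets in the recording");
            }
        }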

    You should be able to rig something similar for keyboard-to-screen response time. Place a video camera such that it can capture both the keyboard and the screen at the same time. Then load a game or something that visually responds to key presses. The resulting video can be examined frame by frame in a video editor to see how many frames it takes for the key press to register on the screen. Recording at 60 fps gives you 17 ms resolution.

  • figuerres

    BitFlipper wrote:

    Recording at 60 fps gives you 17 ms resolution.

    or if you need to get 1ms then you need to record at 1020 fps ?  LOL

     
