How in the world did you derive a headline of "Microsoft admits Direct3D and GPU's are not designed for gaming" from the actual comment (on a forum) from a software developer on the Windows Phone team named Shawn Hargreaves of, "XNA (in fact all of D3D and GPU hardware) is focused on 60hz animation, and not designed for either inputs or outputs with millisecond precision."...

Are you intentionally trying to be controversial in order to draw attention to yourself, or do you simply not understand the subject?

Let's assume the latter, because the former suggests deep psychological and self-confidence problems, which are beyond the scope of Channel9.


Game loops generally run at either 16.7 or 33.3ms precision -- this is true of pretty much all platforms, well beyond Microsoft's.  You'll find it on PlayStation, you'll find it on Wii, you'll find it on Android, you'll find it everywhere, going back for many years.  Why?  Because most monitors made before the last couple of years can't refresh faster than 60Hz.  There are some 120Hz and 240Hz displays out there, but they are often considered at a disadvantage relative to 60Hz displays when showing fast-moving 60Hz (or 30Hz) video content, for a variety of complicated content-production reasons.  It's only become possible in the last two years or so to even get 120Hz-capable desktop monitors and graphics cards, and of course, your CPU etc. need to be capable of rendering a frame of your game every 8.3ms if you expect it to be smooth.

What the person from Microsoft is talking about here is millisecond precision, which means something closer to 1ms from the time you hit a key on the keyboard to the time the display is updated.  This isn't really possible using any current PC technology, considering everything from USB latency, to memory protections afforded by modern operating systems, to the fact that it's very, very hard to write an input loop that can consistently take 1,000 input polls a second and issue commands that have visual results within 1ms.  That kind of perf is still multiples beyond what mainstream hardware is capable of today.

One millisecond is a hell of a lot shorter period of time than most programmers realise.  Sure, anyone can knock together a for loop in C# that multiplies 100,000 integers together in 1ms, but that's toddler-class compared to the computations needed to render even a single frame of a game that uses dozens of effects with thousands of textures, never mind handling collisions and other physics concerns, which is pretty much Ph.D. level math.

It all comes down to multiples.  If a game can't run its logic + render loop faster than once every 33ms, then 60Hz doesn't matter.  If a game can get everything done consistently in 16ms, then yes, 60Hz is possible.  But if it takes 17ms, then you won't be able to consistently draw at the display's refresh rate, and the game will not feel smooth.

None of this is related to the design of Direct3D or XNA.  They'll go as fast as the hardware allows.  Once we have USB 3.0 keyboards, monitors that run at 960Hz, and graphics cards several times faster than those on the market today, then you can have your millisecond precision in games.  Until then, don't get worked up over the fact that it doesn't exist.