Craig Mundie shows off the future of NUI


The Discussion

  •

    Cool stuff!

    Some of it felt a little like CSI UIs, though: looks cool but not necessarily practical.

    The underlying tech is very cool, though. The stereoscopic camera software, for example: I know there is an MSR group that recently released a stereoscopic API, but I haven't had time to play with it yet.

    One thing about NUI that bothers me is the assumption that ink is somehow "more" natural than keyboards. In my opinion it is not; writing with a pen is far slower for me than typing on a keyboard. Buxton said it best: what is "natural" is very subjective.

    I'd love to see more NUI material, though, perhaps more in depth.

  •

    Actually, Bill Buxton said in his last C9 interview that the keyboard and mouse were not going away; it's about the right tool for the job rather than forcing one input method into all contexts. It's incredibly exciting to have this extended palette of input methods. UX is often all about pixels, but of course output is only half the story.

  •

    Very cool, but to be honest I don't completely buy how close this is to practical use.


    How much of this demo is scripted? Detecting hand movements is very cool, but the problem seems much easier if the program already knows what's going to happen. In a real-world program there could be 30 possible gestures at any given moment, and the software would have to decide which one you are performing; if there are only two (tilt up/tilt down), that is much easier. The same goes for voice recognition: if there are only two possible phrases (e.g. "zoom in"/"zoom out") and the program has been trained on your voice, recognition isn't very hard. With thousands of possible phrases and no per-speaker training, the problem becomes much harder.
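    The vocabulary-size point can be sketched with a toy nearest-template classifier. Everything here is made up for illustration (the gesture names, the 2-D feature vectors, the noisy sample); the point is only that the same ambiguous input is misread against a full gesture set but resolved once context narrows the candidates to two:

```python
import math

# Made-up 2-D feature vectors for a handful of gesture templates
# (illustrative only; real recognizers use far richer features).
TEMPLATES = {
    "tilt_up":     (0.0, 1.0),
    "tilt_down":   (0.0, -1.0),
    "swipe_left":  (-1.0, 0.1),
    "swipe_right": (1.0, 0.1),
    "zoom_in":     (0.2, 0.9),   # deliberately close to tilt_up
}

def classify(sample, allowed):
    """Nearest-template match, restricted to the gestures the current
    context allows. A smaller candidate set means fewer near-misses."""
    return min(allowed, key=lambda g: math.dist(sample, TEMPLATES[g]))

noisy_tilt_up = (0.15, 0.95)  # a slightly imprecise "tilt up"

# Against the full vocabulary the sample sits closest to zoom_in:
print(classify(noisy_tilt_up, TEMPLATES))                 # → zoom_in
# With only two context-allowed gestures, the ambiguity disappears:
print(classify(noisy_tilt_up, ["tilt_up", "tilt_down"]))  # → tilt_up
```

    A demo that knows the next step in advance is effectively always running in the second, easy mode.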


    Also, would a person with less experience be able to drive the demo, or do you need very precise gestures, or a program trained to follow your eyes and voice?


    Still very cool stuff.

  •

    No, Laura, this does not prove global warming. It has yet to be proven that our emissions cause global warming. CORRELATION DOES NOT PROVE CAUSATION! THIS IS THE FUNDAMENTAL PRINCIPLE OF SCIENCE! THIS IS AN ISSUE OF SCIENCE, NOT POLITICS!

  •

    Thanks for the passionate reply; no need to yell, though.

  •

    Exactly! And I can tell you, my heart warmed when he said that. He's pretty much the only one I've heard from Microsoft put it that way; hopefully he's not the only one who realizes it.

    Case in point: Natal. To me, the most exciting Natal scenarios come from combining the regular Xbox controller with Natal, adding layers of control instead of replacing one with the other. Head tracking, dodging, sneaking, pointing teammates around: all possible with Natal while still using a controller for the high-fidelity controls.

    It gets even more interesting on the PC. Natal is never going to outperform the high-end mice available at the same time, but it doesn't have to: you could still use a mouse where that makes sense and use Natal for the actions where it makes sense. In the demo, the propeller adjustment would probably need the precision and stability of a mouse to be practical, but for camera controls, hand tracking makes a lot more sense.

    Yet in almost all the material published about Natal, the aim is essentially casual games where you jump around. Very narrow-minded, in my opinion. There will of course be apps like that, and they will be awesome, but they are not the only apps possible. The mouse didn't replace the keyboard; Natal won't replace anything either. It will add another dimension of control.

    That's what I hope for, anyway.

  •

    I was thinking the same thing. Interesting timing on that first app, just as global-warming data and its so-called scientists have proved themselves wrong and run away. That said, it is some cool stuff. I like the grabbing-the-wing bit. Letting the engineer manipulate things in real time like this will greatly reduce wasted compute cycles and turnaround time: you get the design close and bound the compute domain, then let the computer run all the variations in that domain to fine-tune the best one. You can also do quick what-if testing, like what happens if I put a winglet in the middle or on the end, and know right away whether it's worth exploring.
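    The bound-then-sweep workflow could be sketched roughly like this. The `drag` function below is a made-up stand-in for the real solver, and the parameter names and ranges are purely illustrative:

```python
import itertools

# Made-up stand-in for the real flow solver: "drag" as a function of
# winglet position along the wing (0..1) and cant angle in degrees.
def drag(position, angle):
    return (position - 0.8) ** 2 + 0.005 * (angle - 12.0) ** 2

def sweep(positions, angles):
    """Evaluate every variation inside the bounded domain and return
    the one with the lowest drag."""
    return min(itertools.product(positions, angles), key=lambda v: drag(*v))

# The engineer's manipulation bounds the domain (outer half of the wing,
# plausible cant angles); the computer then fine-tunes inside it.
positions = [i / 20 for i in range(10, 21)]  # 0.50 .. 1.00
angles = [8.0 + j for j in range(9)]         # 8 .. 16 degrees

print(sweep(positions, angles))  # → (0.8, 12.0)
```

    Grabbing the wing is what makes the bounded domain small enough that an exhaustive sweep like this is affordable.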

  •

    @zowens: Yes. And the fact that the results change pretty drastically by using a more sophisticated model is a very bad sign. How do we know that the second model *is* accurate? Maybe we need an even more sophisticated model and the predictions will change drastically again...

  •

    Very cool stuff!
