Kinect to Fingers, Hands, Pen and a Mouse...


Today Mike Taulty shares some magic of how he's using the Windows 8 pointer APIs to turn the Kinect for Windows v2 into a touch/mouse-like pointer...

Kinect for Windows V2 SDK: Mouse, Pen, Fingers, Hands


One of the things that Windows 8 did when it came along with its new app development model was to unify the way in which “a device that can point” is represented.

In the early days, I found myself often looking for a “Mouse Down” style event only to have to keep reminding myself that it might not be a mouse that the user is using and so I needed to think in terms of “Pointers” rather than mice.

If I extend this out to code then I can quickly knock up a blank app in .NET which has a Canvas to draw on;
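The code block didn't survive here, but a blank app of that kind can be sketched roughly as below. This is my own minimal reconstruction, not Mike's original listing; names such as drawCanvas are assumptions (the XAML would declare `<Canvas x:Name="drawCanvas" Background="Transparent" />`). The point is that one set of Pointer events covers mouse, pen and touch alike;

```csharp
using Windows.UI;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Input;
using Windows.UI.Xaml.Media;
using Windows.UI.Xaml.Shapes;

public sealed partial class MainPage : Page
{
  Polyline currentLine;

  public MainPage()
  {
    this.InitializeComponent();

    // No MouseDown/MouseMove here - the unified Pointer events fire whether
    // the input comes from a mouse, a pen, a finger or something synthesising
    // pointer input (which is where the Kinect comes in later).
    this.drawCanvas.PointerPressed += OnPointerPressed;
    this.drawCanvas.PointerMoved += OnPointerMoved;
    this.drawCanvas.PointerReleased += (s, e) => this.currentLine = null;
  }

  void OnPointerPressed(object sender, PointerRoutedEventArgs e)
  {
    // start a new stroke where the pointer went "down"
    this.currentLine = new Polyline()
    {
      Stroke = new SolidColorBrush(Colors.Black),
      StrokeThickness = 2
    };
    this.drawCanvas.Children.Add(this.currentLine);
  }

  void OnPointerMoved(object sender, PointerRoutedEventArgs e)
  {
    // extend the current stroke as the pointer moves over the Canvas
    if (this.currentLine != null)
    {
      this.currentLine.Points.Add(e.GetCurrentPoint(this.drawCanvas).Position);
    }
  }
}
```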


Bringing in the Kinect

One possible role for a Kinect is to use it as a fancy kind of “pointer” device.

You could use the low-level skeletal tracking bits to treat some part(s) of a body as a pointer and so turn the Kinect into a form of pointer device. You’d have quite a lot of work to do, though, in deciding which bits of the body to track and you’d have to choose something to track them against (e.g. track the right hand relative to the hip or the head or somesuch) and so on.
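To make that "lot of work" concrete, the core of such a do-it-yourself scheme might look something like the sketch below: take the hand's camera-space position relative to the shoulder and map a small "reach box" in front of the user onto the screen. Everything here is my own assumption (the helper name, the 0.5m × 0.4m reach box); only CameraSpacePoint comes from the Kinect v2 SDK;

```csharp
using System;
using Windows.Foundation;
using Microsoft.Kinect;

static class HandPointer
{
  // Map the hand's offset from the shoulder (metres, camera space) to a
  // screen position. The reach-box sizes are guessed, not SDK values.
  public static Point HandToScreen(CameraSpacePoint hand,
    CameraSpacePoint shoulder, double screenWidth, double screenHeight)
  {
    const double reachX = 0.5; // assumed comfortable horizontal reach
    const double reachY = 0.4; // assumed comfortable vertical reach

    // normalise the offset into 0..1, with the shoulder at the centre
    double nx = ((hand.X - shoulder.X) / reachX) + 0.5;
    double ny = 0.5 - ((hand.Y - shoulder.Y) / reachY);

    // clamp so the pointer stays on screen
    nx = Math.Max(0.0, Math.Min(1.0, nx));
    ny = Math.Max(0.0, Math.Min(1.0, ny));

    return new Point(nx * screenWidth, ny * screenHeight);
  }
}
```

And that's before you've dealt with smoothing, left/right handedness, engagement, "clicks" and so on.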

While that’s do-able, it might not be the best plan unless you’ve got specific reasons for doing it because;

  1. your new mechanism might not be overly discoverable for the user unless you convince other app developers to follow the same scheme
  2. you’d have to write all that code yourself

Both of those can be avoided by using the bits that are talked about in the Channel 9 video below;


and that’s pretty much it – I’m not sure about using the IManipulatableModel.CapturedPointerId in the controller code – that might be dubious. More generally, none of this code does the right thing around capturing/releasing pointers, but it gave me what I was basically looking for: I can now draw with the hand gesture, as in the video below;

[Video hosted on Vimeo]
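For what it's worth, the pointer capture that the post admits to skipping would, in a plain XAML element, look something like this (my own sketch, not code from the original – capture on press so that Moved/Released keep routing to the Canvas even when the pointer strays off it, then release);

```csharp
using Windows.UI.Xaml.Input;

// Assumed to live in the same page class as the drawCanvas handlers above.
partial class MainPage
{
  void OnPointerPressedWithCapture(object sender, PointerRoutedEventArgs e)
  {
    // route all further events for this pointer to the canvas
    this.drawCanvas.CapturePointer(e.Pointer);
  }

  void OnPointerReleasedWithCapture(object sender, PointerRoutedEventArgs e)
  {
    // stop capturing once the stroke is finished
    this.drawCanvas.ReleasePointerCapture(e.Pointer);
  }
}
```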


[Click through for all the source, snaps, details and tips]

Project Information URL:

Project Source URL:

Contact Information:

The Discussion


    I am probably one of the biggest Xbox One fans on the planet, and I love Kinect on Xbox One.

    What really puzzles me is how gestures on Xbox One, at this point, are completely useless. I know this post is about Kinect for Windows, but still, it's not that you can't do gestures on Xbox One, it's that the implementation involves pushing in and out to do things, even though that level of engagement is not even close to being a finished product.

    What they need to do on XOne is redesign the "gestures" to react to the large sweeping gestures that the device can currently react to accurately. How about reach forward with an open hand, close your hand, and then move your arm up and down to control volume? Please do something with Gestures on XOne, you guys are messing this one up, big time.

    Yes I know this comment does not belong on this forum, but sometimes the right people are listening here, when a meaningful comment like mine usually gets lost in the sea of "feedback" land.
