Today's post is a rollup of a number of recent Kinect for Windows v2 posts done by El Bruno (aka Bruno Capuano). It looks like he's really enjoying it...
First off, remember this is BETA hardware and software. It WILL change in the future...
“This is preliminary software and/or hardware and APIs are preliminary and subject to change.”
[Note: All the text below is machine translated...]
Great day: the #KinectOne has arrived at my house! For this post, only photos of the great device!
Opening the box is always an exciting moment!
Today we'll walk through the step-by-step configuration of the new KinectOne sensor. The first thing to keep in mind is that you need a reasonably powerful machine to use the SDK; in my case, a Surface Pro.
The first step is the classic one: install the SDK. Important: do not connect the #KinectOne until the SDK installation has finished.
A good thing to keep in mind: the Kinect sensor arrives “empty”, so you have to update the firmware before it will work. For example, when you launch the app that shows the status of the sensor, you'll see something similar to the following picture: the firmware version is 0.0.0.0.
Today’s post is simple: let’s see how to make a Hello World app with the new SDK v2. In Kinect apps, Hello World is basically an app that detects a person in front of the Kinect sensor.
So let’s go for it:
1. Create a console app
2. Add the reference to Microsoft.Kinect v2.0
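After those two steps, a minimal sketch of such a Hello World console app could look like the following. This is an illustrative example built on the Kinect v2 SDK's `KinectSensor` and `BodyFrameReader` types, not the exact code from the original post:

```csharp
using System;
using Microsoft.Kinect;

class Program
{
    static Body[] bodies;

    static void Main()
    {
        // Get the default Kinect sensor and start it
        KinectSensor sensor = KinectSensor.GetDefault();
        sensor.Open();

        // Open a reader for body frames and subscribe to new frames
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        reader.FrameArrived += OnBodyFrameArrived;

        Console.WriteLine("Stand in front of the sensor. Press Enter to exit.");
        Console.ReadLine();

        reader.Dispose();
        sensor.Close();
    }

    static void OnBodyFrameArrived(object sender, BodyFrameArrivedEventArgs e)
    {
        using (BodyFrame frame = e.FrameReference.AcquireFrame())
        {
            if (frame == null) return;

            // Allocate the body array once, sized to what the sensor supports
            if (bodies == null)
                bodies = new Body[frame.BodyCount];

            frame.GetAndRefreshBodyData(bodies);

            foreach (Body body in bodies)
            {
                // A tracked body means a person is in front of the sensor
                if (body != null && body.IsTracked)
                    Console.WriteLine("Hello World: body detected!");
            }
        }
    }
}
```

Note that the sketch requires the Kinect v2 sensor and runtime to actually produce frames; without hardware attached, the handler simply never fires.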
I already wrote about the new Kinect One camera; it is much better than the Kinect v1. Its new features give us access to things like a person's heartbeat, photos in HD, etc. The SDK is still progressing and gives us some interesting options, such as the ability to detect open or closed hands. Update: Victor reminded me that SDK 1.7 and higher already had this, though not as explicitly as here.
What I will do is start from the recent console example (post) and change the routine that processes the recognized Bodies:
As you can see in the example above, once we have checked that the body is tracked, we can work with the HandRightState property. It can take values such as Open, Closed, and Unknown, among others.
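The body-processing routine described above might be sketched like this; it is an illustrative fragment using the Kinect v2 `HandState` enum, meant to run inside the body-frame handler after `GetAndRefreshBodyData`:

```csharp
// Inside the body-frame handler, after frame.GetAndRefreshBodyData(bodies)
foreach (Body body in bodies)
{
    // Only inspect bodies the sensor is actually tracking
    if (body == null || !body.IsTracked) continue;

    // HandRightState is a HandState enum: Open, Closed, Lasso, NotTracked, Unknown
    switch (body.HandRightState)
    {
        case HandState.Open:
            Console.WriteLine("Right hand open");
            break;
        case HandState.Closed:
            Console.WriteLine("Right hand closed");
            break;
        default:
            Console.WriteLine("Right hand state: " + body.HandRightState);
            break;
    }
}
```

The same pattern works for HandLeftState; the SDK also exposes a confidence value for each hand.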
The video below shows an example of this app running:
Kinect V2 is great, and in the future we’ll get the chance to identify people's emotions.
If you take a detailed look at the Body class, you’ll find these collections:
If we inspect the contents of these collections, we will probably find values similar to the following:
One more time: KinectOne and KinectSdk rules!