Two for Kinect 4 Windows v2 and Unity 3D

Peter Daukintis, Microsoft Technical Evangelist, whose work we've recently highlighted in Kinect for Windows v2 Face Tracking Managed and Native and Saluting the Visual Gesture Builder - Details and example code, returns today with two Unity 3D related posts, a subject that has generated a great deal of interest and discussion.

Kinect 4 Windows V2 – Unity 3D

After installing the Kinect v2 SDK from here http://www.microsoft.com/en-us/download/details.aspx?id=44561 you can also download the supporting Unity 3D plugins here http://go.microsoft.com/fwlink/?LinkID=513177. Note that the plugins require Unity 3D Pro and they expose APIs for Kinect for Windows core functionality, Visual Gesture Builder and face tracking to Unity apps. The zip file containing the Unity packages also contains two sample scenes, Green Screen and KinectView. Let's take a look at KinectView first:

So, I stumbled around a bit on the next step, as initially I tried opening the KinectView scene from its existing location and this didn't seem to work very well: when I examined the game objects there was an error on each of the scripts. After some head-scratching I watched the video here https://channel9.msdn.com/Series/Programming-Kinect-for-Windows-v2/04 and at about 10:54 the presenter shows copying the scene locally into the current project. That fixed the issue for me, so I was back up and running. You can plug in your Kinect sensor, run the game, and then explore the scripts to see how the data is retrieved from the sensor and applied to GameObjects in your scene.
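The scripts in the sample are worth reading in full, but the core pattern is simple enough to sketch. Below is a minimal, hedged approximation of a body-source manager in the style of the KinectView scripts (this is my own illustration, not the sample code; it assumes the Windows.Kinect namespace that the Unity plugin provides):

    using UnityEngine;
    using Windows.Kinect; // namespace provided by the Kinect v2 Unity plugin

    // Opens the sensor, polls the body reader each frame and caches the
    // results so that other scripts can read joint positions from it.
    public class BodySourceManager : MonoBehaviour
    {
        private KinectSensor sensor;
        private BodyFrameReader reader;
        private Body[] bodies;

        // Other scripts call this to fetch the latest body data (may be null).
        public Body[] GetData() { return bodies; }

        void Start()
        {
            sensor = KinectSensor.GetDefault();
            if (sensor != null)
            {
                reader = sensor.BodyFrameSource.OpenReader();
                if (!sensor.IsOpen) sensor.Open();
            }
        }

        void Update()
        {
            if (reader == null) return;
            using (var frame = reader.AcquireLatestFrame())
            {
                if (frame == null) return;
                if (bodies == null) bodies = new Body[sensor.BodyFrameSource.BodyCount];
                frame.GetAndRefreshBodyData(bodies);
            }
        }

        void OnApplicationQuit()
        {
            if (reader != null) { reader.Dispose(); reader = null; }
            if (sensor != null && sensor.IsOpen) { sensor.Close(); }
        }
    }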


...

My idea was to create a very simple Kinect sample which moves particle systems around, following the positions of the user's hands. So let's step through what I did.

First, I chose File > New Project to bring up the Unity project wizard:

...

The next step was to add two new GameObjects, one for each hand, set their joint types to HandLeft and HandRight, and then add a particle system to each as a child (a sketch of the kind of script involved is below). Then the particle systems had their parameters tweaked until they looked how I wanted. It seems to me that the combination of Unity and Kinect is a pretty powerful one and I'm looking forward to seeing the results in the Windows Store.
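To make that concrete, here is a hedged sketch of the kind of hand-follow script involved (hypothetical names; it assumes a BodySourceManager-style component like the one sketched earlier, assigned in the inspector):

    using UnityEngine;
    using Windows.Kinect;

    // Each frame, find the first tracked body and move this GameObject
    // (which parents a particle system) to the chosen hand joint.
    public class HandFollower : MonoBehaviour
    {
        public BodySourceManager bodyManager;            // assign in the inspector
        public JointType jointType = JointType.HandLeft; // or HandRight
        public float scale = 10f; // camera-space metres to scene units

        void Update()
        {
            var bodies = bodyManager == null ? null : bodyManager.GetData();
            if (bodies == null) return;

            foreach (var body in bodies)
            {
                if (body == null || !body.IsTracked) continue;

                // CameraSpacePoint is in metres; map it into the scene.
                var joint = body.Joints[jointType];
                transform.position = new Vector3(
                    joint.Position.X * scale,
                    joint.Position.Y * scale,
                    joint.Position.Z * scale);
                break; // first tracked body only
            }
        }
    }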

Project Information URL: http://peted.azurewebsites.net/kinect-4-windows-v2-unity-3d/

Project Source URL: http://1drv.ms/1tyKlIT

Kinecting XAML + Unity


Following on from my previous post, where I showed how easy it is to interact with Kinect data inside the Unity 3D environment, I wondered how I might manipulate 3D game objects using Kinect gestures within a Windows Store app. (Before I begin I would like to point out that the Kinect integration is only available with the Unity Pro version.) Also, if you want to follow along you will need to get the Kinect Unity packages and also the Visual Studio Tools for Unity. These packages can be added from the Unity project wizard or can be added later using the Assets > Import Package menu.

I wanted to start by leveraging the built-in gestures that the Kinect SDK uses in its Controls Basics samples. There are samples in the Kinect v2 SDK for each of WPF, XAML and DirectX; the gesture support has been integrated into the UI frameworks for WPF and XAML, but the DirectX sample is more low-level and shows how to use gesture recognizers directly.

I decided to leverage the Unity-to-XAML communication which is outlined in the Unity sample here http://docs.unity3d.com/Manual/windowsstore-examples.html and use the higher-level gesture support in XAML. In case you were wondering, when you create a Windows Store app from Unity it is created as either a C# XAML + DirectX app or a C++ XAML + DirectX app. What this means is a standard XAML app which has a SwapChainPanel hosting the Unity content. The SwapChainPanel is a control which can sit anywhere in the visual tree and can render DirectX graphics.
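The thread model is the main thing to be aware of here: XAML code runs on the UI thread while Unity scripts run on the app thread, so calls between them have to be marshalled. Here is a hedged sketch of the pattern from that Unity docs example, using the AppCallbacks class from the generated project (SceneCommands and UnityBridge are hypothetical names of mine):

    // Unity side: a static bridge that a MonoBehaviour polls in Update().
    public static class SceneCommands
    {
        public static float PendingYaw, PendingPitch;
        public static void Rotate(float dx, float dy)
        {
            PendingYaw += dx;
            PendingPitch += dy;
        }
    }

    // XAML side (e.g. in the generated project's App/MainPage code-behind):
    // marshal the call onto Unity's app thread; 'false' means don't block
    // the UI thread while it runs.
    public static class UnityBridge
    {
        public static void SendGestureDelta(float dx, float dy)
        {
            UnityPlayer.AppCallbacks.Instance.InvokeOnAppThread(
                () => SceneCommands.Rotate(dx, dy), false);
        }
    }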

Setting the Scene

I started by looking through the Unity Asset Store for a suitable 3D model and while browsing through the catalogue I found this free 3D earth model:

...

So, I imported this and added it to my scene in Unity. Now, I'm a beginner with Unity, so at this point I wanted to get a better feel for how to interact with GameObjects in a scene, how this all translates back down to a Windows Store or Windows Phone app, and also how to debug using Visual Studio.

Mouse/Camera

I thought that a basic scenario would be adding the ability to orbit the camera around the 3D model; later I could think about how to achieve the same thing using Kinect gestures. To that end I added the standard Unity MouseOrbit script to the MainCamera in my scene and selected my earth model as the 'look-at' target (an approximate version is sketched below). So far I've just pressed a few buttons and I have a camera orbiting a 3D earth; let's see if it runs as a Windows Store app.
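The MouseOrbit script ships with Unity's standard assets, but the core idea is simple enough to show; here's an approximate C# equivalent (a sketch, not the shipped script):

    using UnityEngine;

    // Accumulates mouse movement into yaw/pitch angles and keeps the
    // camera at a fixed distance from the look-at target.
    public class SimpleOrbit : MonoBehaviour
    {
        public Transform target; // the earth model, assigned in the inspector
        public float distance = 10f;
        public float xSpeed = 120f;
        public float ySpeed = 120f;

        private float yaw, pitch;

        void LateUpdate()
        {
            if (target == null) return;

            yaw   += Input.GetAxis("Mouse X") * xSpeed * Time.deltaTime;
            pitch -= Input.GetAxis("Mouse Y") * ySpeed * Time.deltaTime;
            pitch  = Mathf.Clamp(pitch, -80f, 80f);

            var rotation = Quaternion.Euler(pitch, yaw, 0f);
            transform.rotation = rotation;
            transform.position = target.position + rotation * new Vector3(0f, 0f, -distance);
        }
    }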

Windows Store

To run/debug a Windows Store app from this Unity scene you can select File > Build Settings and you will see the dialog below:

...

I set some of the settings, added the scene and chose 'Build'. I chose a folder to store the project files and Unity created a VS solution containing my store app code. I opened this up in VS; one thing you will need to do before running the app is to choose a processor architecture, as 'Any CPU' is not sufficient. I usually choose x86 while developing as the VS designer won't work with x64. With that selected you can hit F5 and the app will run. So at this stage I had an earth which I could orbit the camera around. I also tweaked the mouse orbit script a little to provide a zoom in/out, as I would want this functionality later when using Kinect (the tweak is sketched below).
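The zoom tweak amounts to a couple of lines inside the orbit script's LateUpdate, along these lines (assuming the SimpleOrbit fields sketched earlier):

    // Map the scroll wheel onto the orbit distance, clamped to a sane range.
    distance -= Input.GetAxis("Mouse ScrollWheel") * 5f;
    distance  = Mathf.Clamp(distance, 2f, 30f);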

From Mouse to Kinect

So for the XAML side of things what I needed was a way to recognise gestures and pass that information through to the Unity scene. The way to detect built-in gestures using XAML is to add a KinectRegion to the visual tree, which detects IKinectControl-derived controls within it; with a little more plumbing you can then write a class which receives manipulation events corresponding to gestures. Here are the steps and the basic code I used to get to this stage:

Create an IKinectControl-derived UserControl that I could use inside a KinectRegion which would cover the whole screen (a rough sketch follows below).

...
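To give a flavour of that plumbing, here is a loose sketch modelled on the Controls Basics XAML sample. The interface and event names come from the Microsoft.Kinect.Toolkit.Input and Microsoft.Kinect.Xaml.Controls namespaces, but treat the exact signatures as approximate and verify them against the SDK:

    using System;
    using Microsoft.Kinect.Toolkit.Input;
    using Microsoft.Kinect.Xaml.Controls;
    using Windows.UI.Xaml;
    using Windows.UI.Xaml.Controls;

    // A control that fills the KinectRegion and opts in to manipulation.
    public sealed class GestureSurface : UserControl, IKinectControl
    {
        public bool IsManipulatable { get { return true; } }
        public bool IsPressable { get { return false; } }

        public IKinectController CreateController(IInputModel inputModel, KinectRegion kinectRegion)
        {
            // The KinectRegion calls this when the control needs a controller.
            return new GestureSurfaceController(inputModel as ManipulatableModel);
        }
    }

    // Receives manipulation events for the control and forwards the deltas
    // on to the Unity scene (e.g. via the thread-marshalling bridge above).
    public class GestureSurfaceController : IKinectManipulatableController, IDisposable
    {
        private ManipulatableModel model;

        public GestureSurfaceController(ManipulatableModel model)
        {
            this.model = model;
            this.model.ManipulationUpdated += OnManipulationUpdated;
        }

        public ManipulatableModel ManipulatableInputModel { get { return model; } }
        public FrameworkElement Element { get { return model.Element as FrameworkElement; } }

        private void OnManipulationUpdated(ManipulatableModel sender, KinectManipulationUpdatedEventArgs args)
        {
            // args.Delta carries the movement since the last update; pass it
            // across to the Unity app thread to drive the camera.
        }

        public void Dispose()
        {
            model.ManipulationUpdated -= OnManipulationUpdated;
        }
    }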

An alternative to this would be to implement the gesture handling directly in the Unity environment using the KinectGestureRecognizer, which I suspect would be better from a performance perspective. The sample code for this project can be downloaded here (please note that some paths appear to be hard-coded in the generated Unity script projects, so this will prevent the solution from building without a bit of tweaking).

Project Information URL: http://peted.azurewebsites.net/kinecting-xaml-unity/
