Kinecting is not hard to do... Another "first steps" story

Description

The important point of today's Gallery entry is that developing with the Kinect for Windows SDK is not hard to do. All you need to do is take it one step at a time...

Kinect – First steps

Next, I wanted to do some experimenting in code to get a feel for the APIs. My intention was simply to have an image follow the movements of the right hand. There are two APIs for controlling the Kinect: the NUI (Natural User Interface) API and the Audio API. You can use C++ or C# to build applications for the Kinect, and though I have a lot of love for C++, I chose C# since, well, it's faster to get up and running with.

To use the API you simply reference Microsoft.Research.Kinect.dll on the .NET tab of the Add Reference dialog and then add using Microsoft.Research.Kinect.Nui; to your source file. The main NUI API class is called Runtime; it gives you access to the video stream, the depth image stream and three events, namely DepthFrameReady, SkeletonFrameReady and VideoFrameReady. Using the stream getters or the events is just a matter of choosing whether to retrieve image data with a polling model or an event model.
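To make the two retrieval models concrete, here is a minimal sketch of my own (not from the original post) against the beta SDK's Runtime class; the stream parameters and the 100 ms wait are illustrative choices:

// Minimal sketch, not from the original post: the two ways to retrieve frames.
var nui = new Runtime();
nui.Initialize(RuntimeOptions.UseColor | RuntimeOptions.UseSkeletalTracking);
nui.VideoStream.Open(ImageStreamType.Video, 2, ImageResolution.Resolution640x480, ImageType.Color);

// Event model: the runtime pushes frames to your handler as they become ready.
nui.VideoFrameReady += (s, e) =>
{
    PlanarImage frame = e.ImageFrame.Image; // pixel data lives in frame.Bits
};

// Polling model: ask the stream for the next frame yourself, blocking up to 100 ms.
ImageFrame polledFrame = nui.VideoStream.GetNextFrame(100);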

For inspiration and helper methods I used the SkeletalViewer application, but instead of creating the UI in XAML I just launched a WPF GUI from a console application. And instead of .NET events I chose to use Reactive Extensions' observables, which give you much greater flexibility and power for reacting to different user behavior. The main method code looks like this:


// Fields (_nui, _image, _video) and helpers (InitializeNui, CreateGUI,
// getDisplayPosition) are defined elsewhere in the post; getDisplayPosition
// is adapted from the SDK's SkeletalViewer sample.
[STAThread]
static void Main()
{
    _nui = new Runtime();
    var app = new Application();
    var window = new Window();
    InitializeNui();   // Initialize the Runtime object and open the video streams
    CreateGUI(window); // Set up a canvas holding the RGB video and the image attached to the captured person's hand

    // Turn the SkeletonFrameReady event into an observable and keep only tracked skeletons.
    var skeletonFrameReadyObservable =
        Observable.FromEventPattern<SkeletonFrameReadyEventArgs>(_nui, "SkeletonFrameReady");
    var trackedSkeletons = from ev in skeletonFrameReadyObservable
                           from skel in ev.EventArgs.SkeletonFrame.Skeletons
                           where skel.TrackingState == SkeletonTrackingState.Tracked
                           select skel;

    // Map the right-hand joint to display coordinates and move the image on the dispatcher thread.
    var rightHandPos = trackedSkeletons
        .ObserveOnDispatcher()
        .Select(s => getDisplayPosition(s.Joints[JointID.HandRight]));
    rightHandPos.Subscribe(pos =>
    {
        _image.SetValue(Canvas.TopProperty, pos.Y);
        _image.SetValue(Canvas.LeftProperty, pos.X);
        Console.WriteLine(pos.X + ", " + pos.Y);
    });

    // Render each RGB frame from the video stream into the background image.
    var videoFrameReadyObservable =
        Observable.FromEventPattern<ImageFrameReadyEventArgs>(_nui, "VideoFrameReady");
    videoFrameReadyObservable.ObserveOnDispatcher().Subscribe(evPattern =>
    {
        PlanarImage planarImage = evPattern.EventArgs.ImageFrame.Image;
        _video.Source = BitmapSource.Create(planarImage.Width, planarImage.Height, 96, 96,
            PixelFormats.Bgr32, null, planarImage.Bits, planarImage.Width * planarImage.BytesPerPixel);
    });

    app.Run(window);
}
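The payoff of the observable approach is composition: the same rightHandPos stream can be reshaped with standard Rx operators. Purely as a hypothetical sketch (not from the original post; the operator choices and the 50-pixel threshold are mine), one could throttle updates or detect a crude swipe like this:

// Hypothetical sketch, not from the original post: composing over rightHandPos.
// Sample the stream so positions are handled at most ten times per second.
rightHandPos
    .Sample(TimeSpan.FromMilliseconds(100))
    .Subscribe(pos => Console.WriteLine("Sampled: " + pos.X + ", " + pos.Y));

// Pair consecutive positions and flag a large rightward jump as a crude swipe.
rightHandPos
    .Buffer(2, 1)
    .Where(pair => pair.Count == 2 && pair[1].X - pair[0].X > 50)
    .Subscribe(_ => Console.WriteLine("Right swipe detected"));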

Project Information URL: http://blog.jayway.com/2011/09/22/kinect-first-steps/