KinectKit, a commercial library to help speed your Kinect for Windows SDK application development

Today's commercial product, available as a trial, helps you build Kinect for Windows applications and incorporate NUI into your application without worrying about the deep, down-in-the-details Kinect plumbing.

KinectKit

Develop more humanlike application interfaces by tracking movement with Microsoft Kinect for Windows and the Natural User Interface API (NAPI).

Chant KinectKit comprises software components that handle the complexities of tracking movement with Microsoft Kinect sensors. The components minimize the programming effort necessary to construct software that maps movement from image and positional data captured by cameras.

With KinectKit you can:

  • capture and map color, depth, and skeleton data,
  • record and play back audio,
  • integrate movement tracking with speech technology,
  • enumerate and control Microsoft Kinect sensors, and
  • develop applications with your favorite programming language: C++Builder, C++, Delphi, Java, and .NET Framework.


Viewers Sample: Capture color, depth, and skeleton data to map movement.


Audio Source Sample: Detect the audio signal source, and record and play back audio.

Within Chant Developer Workbench, you can:

  • Enumerate Microsoft Kinect sensors,
  • Render color, depth, and skeleton data,
  • Trace sensor events, and
  • Test application sensor management functions.


Project Information URL: http://visualstudiogallery.msdn.microsoft.com/51395a1c-5c9e-4ce1-8ec9-e1276cb60fce

Project Download URL: http://visualstudiogallery.msdn.microsoft.com/51395a1c-5c9e-4ce1-8ec9-e1276cb60fce

KinectKit

Movement Management Component Architecture

The KinectKit component library includes a movement management class that provides a simple way to track and map movement with Microsoft Kinect sensors.

The movement management class, ChantKM, enables you to start and stop color, depth, skeleton, and audio data collection with Microsoft Kinect sensors. Your application can also use the KinectSensor and adjunct classes to manage low-level functions if desired.

With the ChantKM class, you can detect movement, process color, depth, and skeleton data, and record audio to a file. The ChantKM class manages the activities of interacting with the Microsoft Kinect sensor on behalf of your application; it manages the resources and interacts directly with the Natural User Interface API (NAPI) runtime.

Your application receives status notifications through event callbacks.
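
To make that callback-driven pattern concrete, here is a minimal sketch in Java (one of the supported languages). Only the ChantKM class name comes from the description above; the package, method, and listener names are assumptions made for illustration and are not the documented KinectKit API.

// Hypothetical sketch of the ChantKM usage pattern described above.
// Only the ChantKM class name comes from the product description;
// the package, method, and listener names are assumed for illustration.
import net.chant.kinectkit.ChantKM;          // assumed package
import net.chant.kinectkit.ChantKMListener;  // assumed listener interface

public class MovementTrackingSketch {
    public static void main(String[] args) throws Exception {
        ChantKM kinect = new ChantKM();

        // Status and data notifications arrive through event callbacks.
        kinect.addListener(new ChantKMListener() {
            @Override public void skeletonFrameReady(Object frame) {
                // Map joint positions to application gestures here.
            }
            @Override public void sensorStatusChanged(String status) {
                System.out.println("Sensor status: " + status);
            }
        });

        // Start collecting color, depth, and skeleton data from the sensor.
        kinect.startColorData();
        kinect.startDepthData();
        kinect.startSkeletonData();

        Thread.sleep(10_000);   // let the application react to callbacks

        // Stop collection before releasing the sensor.
        kinect.stopAll();
    }
}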

Speech recognition and synthesis are supported with the SpeechKit ChantSR and ChantTTS classes. See SpeechKit for more information about integrating speech technology.


The ChantKM class encapsulates the NAPI functions to make the process of tracking movement with Microsoft Kinect sensors simple and efficient for your application.

The ChantKM class simplifies the process of managing Microsoft Kinect sensors by handling the low-level activities directly with the sensor.

You instantiate a ChantKM class object before you want to start tracking movement within your application. You destroy the ChantKM class object and release its resources when you no longer want to track movement within your application.
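
One simple way to honor that lifecycle is to pair construction with cleanup in a try/finally block, so the sensor is released even if tracking fails partway through. The snippet below is a hedged sketch of that idea in Java; startSkeletonData, stopAll, and dispose are assumed method names, not documented KinectKit calls, and runApplication stands in for your own code.

// Hypothetical lifecycle sketch: create the ChantKM object just before
// tracking starts and release it when tracking is no longer needed.
// startSkeletonData(), stopAll(), and dispose() are assumed names.
ChantKM kinect = new ChantKM();
try {
    kinect.startSkeletonData();     // begin collecting skeleton data
    runApplication(kinect);         // hypothetical application code
} finally {
    kinect.stopAll();               // stop data collection
    kinect.dispose();               // release sensor resources
}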

Project Information URL: http://www.chant.net/Products/KinectKit/architecture.aspx
