Coffeehouse Thread

4 posts

Forum Read Only

This forum has been made read only by the site admins. No new threads or comments can be added.

Microsoft Kinect SDK vs OpenNI and KNITE

  • AliEtoom

Hi,

I am currently working on a Kinect project, and I've read a couple of articles saying that the Microsoft Kinect SDK does not support complex gestures, but that KNITE provides algorithms for complex gestures. Is that right?


  • JoshRoss

    @AliEtoom: At the risk of looking ignorant, I have to ask: "what is KNITE?"

  • AliEtoom

    KNITE is a middleware layer that provides algorithms for manipulating the data streams retrieved from the Kinect sensor, and with those algorithms you can recognize complex gestures.


  • Dan

    @AliEtoom: There are advantages and disadvantages to both. While the Kinect SDK doesn't include built-in support for gestures, it ships with a gesture sample ("Slideshows Gestures WPF"), and several open-source tools exist for building gestures, like this one: http://kinecttoolbox.codeplex.com/

    The 1.5 Kinect SDK includes several features not available in OpenNI/NITE, like seated skeletal tracking, face tracking, and support for Kinect for Windows near mode. The SDK also includes lots of samples and helper classes to get you started, like UI controls for camera, depth, skeletal data, audio, and Kinect sensor management. With OpenNI/NITE, you have to do all of these things yourself, such as wiring up the raw camera data or the skeletal joints.
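
    To make the "building gestures yourself" point concrete: whichever stack you pick, a gesture is ultimately just a rule applied to a stream of joint positions over time. Here is a minimal, hypothetical sketch of a right-swipe detector over tracked hand positions (all names and thresholds are my own, not from any SDK; real toolboxes like the Kinect Toolbox linked above do this with smoothing and timing constraints on skeletal frames):

    ```python
    from typing import List, Tuple

    def is_right_swipe(points: List[Tuple[float, float]],
                       min_travel: float = 0.4,
                       max_drift: float = 0.1) -> bool:
        """Hypothetical swipe check over (x, y) hand positions in metres.

        The hand must move left-to-right without reversing, cover at least
        `min_travel` horizontally, and stay within `max_drift` vertically.
        """
        if len(points) < 2:
            return False
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        # Reject any frame where the hand moves back to the left.
        if any(b < a for a, b in zip(xs, xs[1:])):
            return False
        # Enough horizontal travel, little vertical wobble -> it's a swipe.
        return (xs[-1] - xs[0]) >= min_travel and (max(ys) - min(ys)) <= max_drift
    ```

    With the Kinect SDK you would feed this kind of function the hand joint from each skeletal frame; with OpenNI/NITE you would first have to wire up the joint stream yourself, which is the trade-off described above.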

