Coffeehouse Thread

4 posts

Microsoft Kinect SDK vs OpenNI and KNITE

  • AliEtoom

    Hi,

    I am currently working on a Kinect project, and I have read a couple of articles saying that the Microsoft Kinect SDK does not support complex gestures, while KNITE provides algorithms for complex gestures. Is that right?
  • JoshRoss

    @AliEtoom: At the risk of looking ignorant, I ask: "what is KNITE?"

  • AliEtoom

    KNITE (Interface) is middleware that provides algorithms for manipulating the data streams retrieved from the Kinect sensor; using its algorithms, you can implement complex gestures.
  • Dan

    @AliEtoom: There are advantages and disadvantages to both. While the Kinect SDK doesn't include built-in support for gestures, the SDK ships with a gesture sample ("Slideshows Gestures WPF"), and several open-source tools exist for building gestures, like this one: http://kinecttoolbox.codeplex.com/

    The 1.5 Kinect SDK includes several features not available in OpenNI/NITE, like seated skeletal tracking, face tracking, and support for Kinect for Windows near mode. The SDK also includes lots of samples and helper classes to get you started, like UI controls for camera, depth, skeletal data, audio, and Kinect sensor management. With OpenNI/NITE, you have to do these things yourself, such as wiring up the raw camera data or the skeletal joints.
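    To make the "building gestures yourself" point concrete, here is a minimal sketch in plain Python (not Kinect SDK code; all names and thresholds are illustrative) of the kind of per-frame logic you end up writing when an SDK gives you only joint positions and no gesture recognizer:

    ```python
    # Hypothetical sketch: detect a right-hand "swipe right" gesture from a
    # stream of skeletal frames, by checking for sustained horizontal motion
    # with little vertical drift over a sliding window of frames.
    from collections import deque

    class SwipeRightDetector:
        """Fires when the right hand moves right by at least `min_distance`
        metres within `window` consecutive frames, with vertical drift
        no larger than `max_drift` metres."""

        def __init__(self, window=10, min_distance=0.4, max_drift=0.1):
            self.history = deque(maxlen=window)
            self.min_distance = min_distance
            self.max_drift = max_drift

        def update(self, hand_x, hand_y):
            """Feed one frame's right-hand position; return True on a swipe."""
            self.history.append((hand_x, hand_y))
            if len(self.history) < self.history.maxlen:
                return False  # not enough frames buffered yet
            (x0, y0) = self.history[0]
            (x1, y1) = self.history[-1]
            if x1 - x0 >= self.min_distance and abs(y1 - y0) <= self.max_drift:
                self.history.clear()  # avoid re-firing on the same motion
                return True
            return False

    detector = SwipeRightDetector()
    # Simulate 10 frames of the hand moving steadily to the right.
    fired = [detector.update(0.05 * i, 1.0) for i in range(10)]
    print(fired[-1])  # True: the hand moved 0.45 m right across the window
    ```

    Middleware like NITE, or toolkits like the one linked above, package this sort of windowed joint-trajectory analysis behind a higher-level gesture API so you don't hand-tune thresholds per gesture.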