NuiLib - Adding a Plus (as in C++) to creating NUIs with the Kinect

Today's project by John McCaffery is a C++ library (and demos) that looks like it would be a great help to those building Kinect for Windows applications in C++.

NuiLib

NuiLib is a utility library intended to ease integration of NUI (Natural User Input) devices (such as the Microsoft Kinect) into applications. It provides an abstraction layer that hides the specific input device and offers easy support for common operations. Applications built using NuiLib benefit from the ability to make the logic of how the device is used clear and easy to understand. They also gain the ability to switch to different driver sets, or even different devices, with no code changes.

NuiLib is available for download or contribution on GitHub.

Target Users
The library is focused on three user groups. Firstly, developers creating or modifying applications to provide NUI support. Secondly, developers interested in optimising NUI device use. Thirdly, developers experimenting with new algorithms for processing NUI data. Each group can work on or with the library in different ways, and their innovations automatically feed back into the other groups. Any optimisations made to the core library speed up every application or algorithm built around it. Any new algorithm developed through the library becomes another tool in the application developer's arsenal. Lastly, any feedback given by application developers can focus the efforts of the other two groups.

Ease of Use
Making it easy to integrate NUI input is a primary concern for NuiLib, and this is what should make it attractive to application developers. The following code snippet (reproduced in full at the end of this post) is an example of the simplicity of accessing NUI input through NuiLib. It demonstrates initialising the device, gaining access to the location information for two skeleton joints and then computing the vector between the two joints. Finally, it outputs this vector whenever the value changes.

...

Extensibility
As well as ease of use, the library focuses on extensibility, which makes it an attractive platform for experimenting with new algorithms. Developers can easily see the results of their work and make them available to other users. The library includes an extension mechanism, so any physical device capable of providing Cartesian coordinates for skeleton joints can be integrated. This support can be added to the main trunk or built as a separate linkable unit. The core functionality of the system can also be extended: new algorithms can be written and linked into the system.
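
The library's actual extension points aren't reproduced here, but at its simplest a new derived input can be packaged as a helper that composes existing Components, in the spirit of the factory functions NuiLib itself provides. The sketch below only illustrates that composition, using the same calls as the listing at the end of this post; HAND_LEFT is assumed to exist alongside HAND_RIGHT.

#include <NuiLib-API.h>
#include <iostream>

//A reusable derived input: the vector between the two hands, built purely by
//composing existing Components. (Illustrative only - the library's real
//extension mechanism for new devices and algorithms is not shown here.)
NuiLib::Vector handSpan() {
    return NuiLib::joint(NuiLib::HAND_RIGHT) - NuiLib::joint(NuiLib::HAND_LEFT);
}

int main() {
    NuiLib::NuiFactory()->Init();

    NuiLib::Vector span = handSpan();
    span.AddListener([&span](NuiLib::IObservable *) {
        std::cout << "Hand span: " << span.X() << ',' << span.Y() << ',' << span.Z() << '\n';
    });

    NuiLib::NuiFactory()->SetAutoPoll(true);
    std::cin.get(); //Wait for Enter to stop.
    return 0;
}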

To aid in developing computer-vision-based algorithms, NuiLib is built on top of OpenCV. This means depth and colour frames are available as OpenCV matrices for easy processing.
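
As a rough sketch of what that enables: once a frame is held in a cv::Mat, standard OpenCV routines apply to it directly. The accessor for grabbing the frame is not shown in this post, so the placeholder below only stands in for whichever NuiLib call exposes the depth matrix; everything else is plain OpenCV.

#include <opencv2/core/core.hpp>
#include <iostream>

int main() {
    //Placeholder frame. In a real application this would come from NuiLib,
    //via whatever accessor exposes the current depth frame as a cv::Mat.
    cv::Mat depth(480, 640, CV_16UC1, cv::Scalar(0));

    //Standard OpenCV from here on: build a mask of everything nearer than
    //roughly 2m, assuming raw depth values in millimetres.
    cv::Mat nearMask = depth < 2000;
    std::cout << "Near-field pixels: " << cv::countNonZero(nearMask) << '\n';
    return 0;
}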

Structure
The library is accessed through three concepts: Components, Component Factory Functions and the NuiFactory.

Project Information URL: http://blogs.cs.st-andrews.ac.uk/mccaffery/nuilib/

Project Download URL: http://blogs.cs.st-andrews.ac.uk/mccaffery/nuilib/download/

Project Source URL: https://github.com/JohnMcCaffery/NuiLib


#include <NuiLib-API.h>
//Needed for waitKey
#include <opencv2/highgui/highgui.hpp>
//Needed for cout, cin and string
#include <iostream>
#include <string>

using namespace std;

int main (int argc, char **args) {
    //Initialise the factory.
    NuiLib::NuiFactory()->Init();

    //Create the arm vector as the difference between the hand and the shoulder.
    NuiLib::Vector arm = NuiLib::joint(NuiLib::HAND_RIGHT) - NuiLib::joint(NuiLib::SHOULDER_RIGHT);
    //Add a listener so whenever the arm vector changes its new values are output.
    arm.AddListener([&arm](NuiLib::IObservable *s) { cout << "Right Arm: " << arm.X() << ',' << arm.Y() << ',' << arm.Z() << '\n'; });
    //Start the factory polling.
    NuiLib::NuiFactory()->SetAutoPoll(true);

    //Wait for user input to stop the program.
#ifdef VISUAL
    cv::waitKey();
#else
    string waitstr = " ";
    cin >> waitstr;
#endif
}
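
As a follow-up, a small variation on the snippet above (assuming the same API) shows how application logic stays readable: because the listener fires on every change, a simple check on the arm vector's Y component is enough to report when the right hand is raised above the shoulder.

#include <NuiLib-API.h>
#include <iostream>

int main() {
    NuiLib::NuiFactory()->Init();

    NuiLib::Vector arm = NuiLib::joint(NuiLib::HAND_RIGHT) - NuiLib::joint(NuiLib::SHOULDER_RIGHT);
    //Report whenever the hand is above the shoulder (positive Y difference).
    arm.AddListener([&arm](NuiLib::IObservable *) {
        if (arm.Y() > 0)
            std::cout << "Right hand raised\n";
    });

    NuiLib::NuiFactory()->SetAutoPoll(true);
    std::cin.get(); //Wait for Enter to stop.
    return 0;
}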
