Project-Infrared, Kinect and the HoloLens


Today’s post highlights an article written by Jason Odom that takes one of the most interesting Kinect topics, motion tracking, and brings it into the future with the HoloLens…

Set Up Project-Infrared & Add Full Body Motion Tracking to the HoloLens Using a Kinect

Thanks to Project-Infrared, there's now a pretty straightforward way to add motion tracking to the HoloLens: Connect it to a Kinect.

Wavelength LLC created a way to get the Microsoft Kinect working as a motion-tracking input device for the HoloLens, which my colleague Adam Dachis wrote about last week. A few days later, Wavelength CEO Kyle "G" and his team released Project-Infrared to the community as an open-source GitHub repository.

Project-Infrared is a motion-tracking system that pairs Microsoft's Kinect with the HoloLens. The Kinect captures the user's movement, and the system transmits that data to the HoloLens, which displays an avatar mimicking the user's motion.
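Under the hood, a setup like this boils down to capturing per-frame joint positions on the PC and streaming them to the headset, which then poses the avatar. The sketch below is a minimal illustration of that idea in Python, not Project-Infrared's actual code; the joint names, JSON wire format, and UDP port are all assumptions made for demonstration.

```python
import json
import socket

def pack_frame(joint_positions):
    """Serialize one frame of joint positions (name -> (x, y, z) in
    meters, camera space) into a compact JSON payload."""
    return json.dumps({
        "joints": {name: list(pos) for name, pos in joint_positions.items()}
    }).encode("utf-8")

def unpack_frame(payload):
    """Decode a payload back into a name -> (x, y, z) mapping, as the
    receiving side would before driving the avatar's skeleton."""
    data = json.loads(payload.decode("utf-8"))
    return {name: tuple(pos) for name, pos in data["joints"].items()}

def send_frame(sock, payload, address):
    """Send one frame as a UDP datagram. Tracking data is time-sensitive,
    so dropping a stale frame is better than a reliable stream that
    backs up and adds latency."""
    sock.sendto(payload, address)

if __name__ == "__main__":
    # Hypothetical frame -- the real Kinect SDK tracks 25 joints per body.
    frame = {"Head": (0.0, 1.7, 2.0), "HandRight": (0.3, 1.2, 1.8)}
    payload = pack_frame(frame)
    assert unpack_frame(payload) == frame  # lossless round trip
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_frame(sock, payload, ("127.0.0.1", 9999))
    sock.close()
```

The one-datagram-per-frame choice mirrors how real-time tracking systems generally favor freshness over reliability: a lost frame is simply replaced by the next one a few milliseconds later.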

Why Motion Tracking?

As developers, designers, and creators, we are always looking for new and exciting ways to interact with our various chosen platforms. With more input options comes new and unique solutions to the problems we are trying to solve. Motion capture has been used by the film and game industries for years, due to its highly expressive detail—everything from hand movements to body language and facial features are picked up and mimicked on screen.

What Is the Solution for Us?

Enter motion tracking, which follows a user's entire body so they can control their device without a specialized suit, and in a minimal amount of space. Compared to motion capture, motion tracking is a relatively young technology and far less sophisticated. For it to evolve, we need people developing ideas and new use cases.

The only way to do that is by building on each other's successes. Wavelength's Kyle G. says he released the Project-Infrared code as open source in response to the needs of an educator in the community: "I can't deny students learning. It's part of my 'no school left behind' belief."

With that in mind, let's walk through how to set this project up, and what we can do with it.

First, let us gather what we will need:

I'm assuming a working knowledge of Unity and HoloLens development in general, including the ability to build, compile, and deploy code to the HoloLens or the HoloLens Emulator.

Set Up Your Hardware and Install Project-Infrared

Now Let's Try It Out: Testing the Application


After a few seconds of the Unity logo and a short wait for the data to reach the HoloLens, you should see something similar to the image below.


The Doctor is In... Virtual space!

Congrats! You now have working motion tracking. Dance around and watch Mortimer dance as well. In its current state there aren't a ton of options to dig into, but there are three avatars you can try. In the Avatar Source View, you can change the Avatar Asset Name from Mortimer to Jill or ParasiteLStarkie by typing the name into the input field.

Project Information URL:

Contact Information:

The Discussion

  • Thanks for the article! Please note we just updated the license to MIT so everyone can freely use the software.

  • TonyVT Skarredghost

    Great job!
    We at Immotionar (if you're curious) do something similar, since we put the full body of the user inside VR (so Oculus/Vive, not yet HoloLens)... full-body tracking with an external sensor is really useful in lots of applications. Kinect is great since the user doesn't have to wear anything; he or she is completely free. Unfortunately, it hasn't been updated in two years, and that's a huge problem for all of us working with it...
