Adding extreme gesture support to your apps with help from 3 Gear Systems

This hardware project and API lets you leverage the Kinect for Windows (actually two of them) to add very precise gesture support to your applications. We're talking finger-precise gesture support...

3 Gear Systems

Finger-precise hand-tracking

Our technology enables the Kinect to reconstruct a finger-precise representation of what the hands are doing. This allows us to build simple and intuitive interactions that leverage small, comfortable gestures: pinching and small wrist movements instead of sweeping arm motions, for example.
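
To make the idea concrete, here's a minimal sketch of how an application might turn fingertip positions into a pinch "click". The HandState fields, the isPinching helper, and the 20 mm threshold are assumptions invented for illustration; they are not part of the 3Gear API.

```java
// Hypothetical pinch check: the trigger is a few millimeters of
// thumb-to-index movement rather than a sweeping arm gesture.
public class PinchDemo {
    /** Hypothetical snapshot of one hand as reported by the tracker. */
    static class HandState {
        double[] thumbTip;   // x, y, z in millimeters
        double[] indexTip;   // x, y, z in millimeters
    }

    /** True when thumb and index fingertips are close enough to count as a pinch. */
    static boolean isPinching(HandState hand, double thresholdMm) {
        double dx = hand.thumbTip[0] - hand.indexTip[0];
        double dy = hand.thumbTip[1] - hand.indexTip[1];
        double dz = hand.thumbTip[2] - hand.indexTip[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz) < thresholdMm;
    }

    public static void main(String[] args) {
        // Fake a single frame of data to show the check in isolation.
        HandState hand = new HandState();
        hand.thumbTip = new double[] { 10.0, 5.0, 0.0 };
        hand.indexTip = new double[] { 14.0, 8.0, 2.0 };
        System.out.println("Pinching: " + isPinching(hand, 20.0));
    }
}
```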

Our system currently uses two Kinects to avoid problems with occlusions (where the fingers are blocked by part of the hand), although we expect future versions to relax this requirement. The cameras are mounted a little over a meter above the desk.

When properly calibrated to the user, our system tracks the user's hands with millimeter-level accuracy.

Ergonomics

Because the cameras are mounted above the desk, our system works well even if the hands are just a couple centimeters above the keyboard or desk surface, avoiding the so-called “gorilla arm” problem. In most cases, you will find you can rest your forearms comfortably on the desk surface while making small motions with your wrist and fingers. We even (experimentally) allow writing on the desk surface itself using your index finger (see our video).

Robust tracking at your desk

Our system is designed to fit on top of your desk and work well alongside the mouse and keyboard. It's also designed to run in the background all day, recovering gracefully from tracking failures, such as when you step away from the desk.
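
Below is a rough sketch of what that kind of always-on behavior might look like from the application side. TrackerConnection and its methods are placeholders invented for illustration, not the SDK's actual interface.

```java
// A hypothetical recovery loop for an all-day gesture app: keep polling,
// and when tracking drops (e.g. the user steps away), back off and resume
// instead of crashing or blocking the UI.
public class BackgroundTrackingLoop {
    /** Hypothetical connection to the hand-tracking service (not the real SDK). */
    interface TrackerConnection {
        boolean poll();     // false when tracking is lost, e.g. the user stepped away
        void reconnect();   // try to resume tracking when the user returns
    }

    static void run(TrackerConnection tracker) throws InterruptedException {
        while (true) {
            if (tracker.poll()) {
                // Hands are tracked: dispatch gesture events to the application here.
            } else {
                Thread.sleep(500);   // brief back-off before attempting recovery
                tracker.reconnect();
            }
        }
    }
}
```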

Application Programming Interface (API)

Our API is open, meaning that you can write software in any language that takes advantage of it. It's free during the beta period and afterwards for small commercial entities; contact us for a license if you are interested in integrating our API into your product. We have provided a number of example applications, and will be actively improving the tracking and the API in response to user feedback.

It's easy to get started.

To try out our technology or write your own gestural user interfaces, you'll need to get some off-the-shelf hardware and install our software development kit (SDK). Our software is free for both non-commercial and commercial use during the public beta period (up until November 30th, 2012). Why not give it a shot?

Step 1: Get the hardware

Our system uses commodity off-the-shelf hardware totaling about US$330. Here are the components you'll need and suggestions on where to buy them.

[Image: list of the hardware components and suggested retailers]

Step 2: Get the Software Development Kit (SDK)

Our software takes the raw 3D data from the Kinects and turns it into usable data on the state of the hands. We provide a simple API based on pointing and clicking as well as a lower-level API that provides approximate joint angles. Find more details in the API documentation. APIs are available in C++ and Java, with C# and Python on the way.
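
As a rough illustration of consuming those two layers from Java, here's a sketch. HandTrackingClient, PointerListener, and getJointAngles are assumed names used for illustration only; the SDK's actual identifiers are in the API documentation.

```java
// Hypothetical wiring of the two API layers: high-level point-and-click
// events, plus a lower-level query for approximate joint angles.
public class GestureApiSketch {
    /** Hypothetical high-level event callback: pointing and clicking. */
    interface PointerListener {
        void onPointerMoved(int hand, double x, double y);
        void onPinchClick(int hand);
    }

    /** Hypothetical client exposing both layers. */
    interface HandTrackingClient {
        void addPointerListener(PointerListener listener);
        double[] getJointAngles(int hand);   // radians, one entry per joint
    }

    static void wireUp(HandTrackingClient client) {
        client.addPointerListener(new PointerListener() {
            @Override public void onPointerMoved(int hand, double x, double y) {
                System.out.printf("hand %d pointing at (%.1f, %.1f)%n", hand, x, y);
            }
            @Override public void onPinchClick(int hand) {
                System.out.println("hand " + hand + " clicked (pinch)");
            }
        });

        // Drop down to the lower-level API when events aren't enough,
        // e.g. to drive a skeletal hand model in a 3D scene.
        double[] angles = client.getJointAngles(0);
        System.out.println("first joint angle: " + angles[0]);
    }
}
```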

...

Project Information URL: http://www.threegear.com/technology.html

Project Download URL: http://www.threegear.com/download.html
