Today's project is a call out for your help. Dwight Goins is new to the blog, but is someone I'm sure we're going to see more of in the future. Today he needs your help beta testing a library he's building...
I'm currently working on a library to detect Head Nods (nodding in agreement) and Head Shakes (shaking in disagreement), and I would like to know who would be interested in beta testing the Head Gesture Library for Windows 8.1 store applications. If this sounds like something you're interested in, please like or upvote this post and send an email to DNGoins at Hotmail.
I will provide you with the details on how to get the library, usage and functionality. A quick write-up can be found ...
Currently, I am working on a medical project which requires detection of Head Nods (in agreement), Head Shakes (in disagreement), and Head Rolls (Asian/East Indian head gesture for agreement) within a computer application.
Since I work with the Kinect for Windows device, I figured it is the perfect device for this type of application.
This post explains how I built the library, the algorithm it uses, and how I used the Kinect device and the Kinect for Windows SDK to implement it.
Before we get into the guts of how this all works, let's talk about why the Kinect is the perfect device for this type of application.
The Kinect v2.0 device has many capabilities. One of them allows the device to capture a person's face in 3-D, that is, three dimensions:
Envision the Z-axis arrow pointing straight out towards you in one direction, and out towards the back of the monitor/screen in the other direction.
In Kinect terminology, this feature is called HD Face. With HD Face, the Kinect can track the eyes, mouth, nose, eyebrows, and other specific features of the face when a person looks toward the Kinect camera.
We can measure the height, width, and depth of a face. Not only can we measure 3-D values and coordinates on the various axes; with a little math and engineering, we can also measure movements and rotations over time.
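As one illustration of how rotation over time can be measured: Kinect HD Face reports the head's orientation as a quaternion, which can be converted to pitch, yaw, and roll angles per frame. The sketch below is a standard quaternion-to-Euler conversion written in Python for illustration only; it is not taken from the author's library, and the axis conventions are my assumption.

```python
import math

def quaternion_to_euler(x, y, z, w):
    """Convert an orientation quaternion (x, y, z, w) to
    (pitch, yaw, roll) in degrees.

    Standard conversion, shown for illustration; the axis mapping
    (pitch about X, yaw about Y, roll about Z) is an assumption.
    """
    # Pitch: rotation about the X axis (nodding the head up/down)
    pitch = math.degrees(math.atan2(2 * (w * x + y * z),
                                    1 - 2 * (x * x + y * y)))
    # Yaw: rotation about the Y axis (shaking the head left/right)
    # Clamp the argument to [-1, 1] to guard against rounding error.
    yaw = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    # Roll: rotation about the Z axis (tilting the head toward a shoulder)
    roll = math.degrees(math.atan2(2 * (w * z + x * y),
                                   1 - 2 * (y * y + z * z)))
    return pitch, yaw, roll
```

Tracking these angles frame by frame turns raw 3-D face data into a time series of head rotations, which is the raw material for gesture detection.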
Think about normal head movements for a second. We as humans twist and turn our heads for various reasons. One such reason is proper driving technique: we twist and turn our heads when driving, looking for other cars on the road. We look up at the sky on beautiful days. We look down at the floor when we drop things. We even slightly nod our heads in agreement, and shake our heads in disgust.
Question: So from a technical perspective, what does this movement look like?
Answer: When a person moves their head, the head rotates around a particular axis: X, Y, Z, or some combination of the three. This rotation is perceived from a point on the head; for our purposes, let's use the nose as the point of perspective.
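To make the rotation idea concrete: a nod is mostly movement in pitch (rotation about the X axis), while a shake is mostly movement in yaw (rotation about the Y axis). Below is a minimal, hypothetical heuristic in Python that classifies a window of angle samples by comparing how far pitch and yaw swing; the function name and the 10-degree threshold are my own assumptions for illustration, not the library's actual algorithm.

```python
def classify_window(pitches, yaws, threshold=10.0):
    """Classify a window of head-orientation samples (in degrees)
    as a "nod", a "shake", or None.

    Hypothetical heuristic: whichever axis swings farther, and by
    at least `threshold` degrees, decides the gesture.
    """
    pitch_swing = max(pitches) - min(pitches)  # nod: rotation about X
    yaw_swing = max(yaws) - min(yaws)          # shake: rotation about Y
    if pitch_swing >= threshold and pitch_swing > yaw_swing:
        return "nod"
    if yaw_swing >= threshold and yaw_swing > pitch_swing:
        return "shake"
    return None  # too little movement to call either gesture
```

In practice a real detector would also require the angle to oscillate (down-up-down for a nod) within a short time window, so that a single glance at the floor is not counted as a nod.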
If you’re interested in testing out this library, please contact me here through this blog.
Here’s the library and a sample Windows 8.1 store application using the library in action. In the picture below, on the right, a Windows 8.1 store application displays a 3-D cube that represents the person’s tracked face. It moves as the head moves, and when the person shakes or nods, the gesture is counted. On the left is tracing data from Visual Studio .NET 2013 and a Kinect Studio recorded clip of me testing the application.
Project Information URL: https://dgoins.wordpress.com/2015/01/10/using-kinect-hd-face-to-make-the-headgesture-library/