Seated posture and gesture recognition with Kinect
- Posted: Oct 07, 2011 at 6:00AM
- 10,493 views
- 2 comments
Most Kinect examples are based on someone standing in front of the Kinect. But what if you can't stand? Today's project is unusual in that it focuses on using the Kinect for posture and gesture recognition while you are seated...
My article is devoted to research on sitting posture recognition. The sitting posture recognition is based on human skeleton tracking. There are three software packages that can produce human skeleton tracking with the Kinect sensor: the OpenNI/PrimeSense NITE library, the Microsoft Kinect Research SDK, and the libfreenect library. I used the first two of them. On their basis I developed research C# WPF applications where I combined the color video stream and the skeleton image.
The sitting posture recognition algorithm is based on tracking the human skeleton and obtaining the 3D coordinates (xs, ys, zs), (xh, yh, zh), and (xk, yk, zk) of the positions of the Shoulder (denoted S), Hip (denoted H), and Knee (denoted K).
A sitting posture is related to the angle a between the line HK (from hip to knee) and the line HS (from hip to shoulder).
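The angle a can be computed from the three joint positions with a standard dot-product formula. The following is a minimal sketch in Python rather than the article's C# code; the joint coordinates passed in are illustrative values, not real Kinect output:

```python
import math

def angle_at_hip(shoulder, hip, knee):
    """Angle a (degrees) at the hip between the hip->knee vector HK
    and the hip->shoulder vector HS; each joint is an (x, y, z) tuple."""
    hs = [s - h for s, h in zip(shoulder, hip)]   # vector from H to S
    hk = [k - h for k, h in zip(knee, hip)]       # vector from H to K
    dot = sum(a * b for a, b in zip(hs, hk))
    norms = math.dist(shoulder, hip) * math.dist(knee, hip)
    return math.degrees(math.acos(dot / norms))

# Upright sitting: torso vertical, thigh horizontal -> about 90 degrees
print(angle_at_hip((0.0, 0.5, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.4)))
```

An upright torso over a horizontal thigh yields an angle near 90 degrees, while slumping forward or leaning back moves the angle away from 90 in either direction.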
We distinguish a left body part angle a (the angle between the "center hip to left knee" vector and the "center hip to center shoulder" vector) and a right body part angle a (the angle between the "center hip to right knee" vector and the "center hip to center shoulder" vector).
From the angle a and the hand's position, the human's sitting posture can be classified into one of 4 specified types: sleeping, concentrating, raising hand, and non-focusing, as given in the table below.
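The decision logic can be sketched as a simple rule cascade over the hip angle and the hand position. Note that the threshold values below are illustrative guesses for the sketch, not the actual numbers from the article's table:

```python
def classify_posture(angle_deg, hand_above_shoulder):
    """Four-way posture classification sketch; thresholds are
    hypothetical placeholders, not the article's calibrated values."""
    if hand_above_shoulder:
        return "raising hand"
    if angle_deg < 60:        # torso slumped far toward the knees
        return "sleeping"
    if angle_deg <= 110:      # torso roughly upright over the hips
        return "concentrating"
    return "non-focusing"     # leaning far back from the desk

print(classify_posture(90, False))
```

In practice the left and right body part angles would be checked together, and the thresholds tuned against recorded skeleton data.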
Project Information URL: http://www.codeproject.com/KB/game/Recognition.aspx