This academic project from Cornell University shows how researchers are using Kinect to figure out what you're doing. How can a computer tell whether you're taking a drink, brushing your teeth, etc.? That's the question this research focuses on.
While there is some source available, it's not directly Kinect related. This is one of those posts that gets you thinking by showing off some of the research being done with Kinect (even if all that science is hurting my brain)...
This post comes to us via "Kinect-Based AI System Watches What You're Up To".
Human Activity Detection from RGBD Images
Being able to detect and recognize human activities is important for making personal assistant robots useful in performing assistive tasks. The challenge is to develop a system that is low-cost, reliable in unstructured home settings, and also straightforward to use. In this paper, we use an RGBD sensor (Microsoft Kinect) as the input sensor, and present learning algorithms to infer the activities. Our algorithm is based on a hierarchical maximum entropy Markov model (MEMM). It considers a person's activity as composed of a set of sub-activities, and infers the two-layered graph structure using a dynamic programming approach. We test our algorithm on detecting and recognizing twelve different activities performed by four people in different environments, such as a kitchen, a living room, an office, etc., and achieve an average performance of 84.3% when the person was seen before in the training set (and 64.2% when the person was not seen before).
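To get a feel for the dynamic programming inference the abstract mentions, here is a minimal, hypothetical sketch of Viterbi-style decoding over a sequence of sub-activity scores. This is not the authors' code: the state names, the toy log-scores, and the flat transition model are all made up for illustration; the real system learns its scores from RGBD skeletal features and uses a two-layered hierarchical MEMM rather than this single-layer chain.

```python
# Hypothetical sub-activity labels; the paper's actual label set differs.
STATES = ["reach", "drink", "place"]

def viterbi(obs_scores, trans_scores):
    """Find the most likely state sequence via dynamic programming.

    obs_scores:   list of dicts, obs_scores[t][s] = log-score of state s
                  given the observation at time t
    trans_scores: dict, trans_scores[(prev, cur)] = log-score of moving
                  from state prev to state cur
    """
    T = len(obs_scores)
    # best[t][s]: score of the best path ending in state s at time t
    best = [{s: obs_scores[0][s] for s in STATES}]
    back = [{}]  # back[t][s]: predecessor of s on that best path
    for t in range(1, T):
        best.append({})
        back.append({})
        for s in STATES:
            prev_score, prev_state = max(
                (best[t - 1][p] + trans_scores[(p, s)], p) for p in STATES
            )
            best[t][s] = prev_score + obs_scores[t][s]
            back[t][s] = prev_state
    # Backtrack from the highest-scoring final state.
    last = max(STATES, key=lambda s: best[T - 1][s])
    path = [last]
    for t in range(T - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Toy example: three frames whose observation scores each favor one
# sub-activity, with a flat (uninformative) transition model.
obs = [
    {"reach": 0.0, "drink": -2.0, "place": -2.0},
    {"reach": -2.0, "drink": 0.0, "place": -2.0},
    {"reach": -2.0, "drink": -2.0, "place": 0.0},
]
trans = {(p, s): 0.0 for p in STATES for s in STATES}
print(viterbi(obs, trans))  # ['reach', 'drink', 'place']
```

In an MEMM the transition scores would additionally be conditioned on the observation at each step (log P(cur | prev, observation)), but the dynamic program itself has the same shape as this sketch.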
Project Information URL: http://pr.cs.cornell.edu/humanactivities/index.php, http://arxiv.org/abs/1107.0169v1
Project Download URL: http://pr.cs.cornell.edu/humanactivities/data.php