According to the U.S. Department of Veterans Affairs, PTSD affects 11 to 20 percent of veterans who have served in the most recent conflicts in Afghanistan and Iraq. It’s no wonder, then, that DARPA (the Defense Advanced Research Projects Agency, a part of the U.S. Department of Defense) wants to detect signs of PTSD in soldiers, in order to provide treatment as soon as possible.
One promising DARPA-funded PTSD project that has garnered substantial attention is SimSensei, a system that can detect the symptoms of PTSD while soldiers speak with a computer-generated “virtual human.” SimSensei is based on the premise that a person’s nonverbal communications—things like facial expressions, posture, gestures and speech patterns (as opposed to speech content)—are as important as his or her words in revealing signs of anxiety, stress and depression.
The Kinect sensor plays a prominent role in SimSensei by tracking the soldier’s body and posture. So, when the on-screen virtual human (researchers have named her Ellie, by the way) asks the soldier how he is feeling, the Kinect sensor tracks his overall movement and changes in posture during his reply. These nonverbal signs can reveal stress and anxiety, even if the soldier’s verbal response is “I feel fine.”
SimSensei interviews take place in a small, private room, with the subject sitting opposite the computer monitor. The Kinect sensor and other tracking devices are carefully arranged to capture all the nonverbal input. Ellie, who has been programmed with a friendly, nonjudgmental persona, asks questions in a quiet, even-tempered voice. The interview begins with fairly routine, nonthreatening queries, such as “Where are you from?” and then proceeds to more existential questions, like “When was the last time you were really happy?” Replies yield a host of verbal and nonverbal data, all of which is processed algorithmically to determine if the subject is showing the anxiety, stress and flat affect that can be signs of PTSD. If the system picks up such signals, Ellie has been programmed to ask follow-up questions that help determine if the subject needs to be seen by a human therapist.
Giota Stratou, one of the key SimSensei programmers at ICT (the University of Southern California’s Institute for Creative Technologies), provided details on the role of the Kinect sensor. “We used the original Kinect sensor and SDKs 1.6 and 1.7, particularly to track the points and angles of rotation of skeletal joints, from which we constructed skeleton-based features for nonverbal behavior. We included in our analysis features encoded from the skeleton focusing on head movement, hand movement and position, and we studied overall value by integrating in our distress predictor models.”
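To make the idea of “skeleton-based features” concrete, here is a minimal sketch of how per-frame joint positions from a skeletal-tracking stream might be condensed into simple movement statistics. This is an illustration only, not ICT’s actual code: the frame format, joint names and the two toy features (total head travel and mean head-to-hand distance) are all assumptions standing in for the richer head- and hand-movement features Stratou describes.

```python
# Hypothetical sketch (not ICT's actual code): turning a stream of
# per-frame 3-D joint positions, like those a Kinect skeleton provides,
# into simple nonverbal-behavior features.
from math import sqrt

def distance(p, q):
    """Euclidean distance between two 3-D points."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def movement_features(frames):
    """frames: list of dicts mapping joint name -> (x, y, z) in meters.

    Returns two toy stand-ins for skeleton-based features:
    - head_travel: total distance the head joint moved across the clip,
      a crude proxy for overall head movement.
    - mean_head_hand_dist: average head-to-right-hand distance,
      a crude proxy for hand position relative to the body.
    """
    head_travel = sum(
        distance(prev["head"], cur["head"])
        for prev, cur in zip(frames, frames[1:])
    )
    head_hand = [distance(f["head"], f["hand_right"]) for f in frames]
    return {
        "head_travel": head_travel,
        "mean_head_hand_dist": sum(head_hand) / len(head_hand),
    }

# Toy three-frame clip: the head drifts 0.1 m per frame; the hand is still.
frames = [
    {"head": (0.0, 1.6, 2.0), "hand_right": (0.3, 1.0, 2.0)},
    {"head": (0.1, 1.6, 2.0), "hand_right": (0.3, 1.0, 2.0)},
    {"head": (0.2, 1.6, 2.0), "hand_right": (0.3, 1.0, 2.0)},
]
print(movement_features(frames))
```

In a real system, features like these would be computed over an interview segment and fed, alongside facial and vocal features, into the distress-predictor models mentioned above.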