Connect to space with NASA JPL, Oculus Rift and the Kinect v2
- Posted: Jan 08, 2014 at 6:00AM
- 10,999 views
- 1 comment
Today's project from one of my favorite space agencies provides an interesting glimpse into what the future of space exploration may look like...
NASA's Jet Propulsion Laboratory has been hunting for more natural ways to maneuver robots in space for some time now, producing cool experiments like using a Leap Motion controller to remotely drive a Mars rover and pairing an Oculus Rift with a Virtuix Omni to take a virtual walk across the Red Planet. It therefore made sense for the folks at JPL to sign up for the latest Kinect for Windows developer program and get their hands on the newer, more precise Kinect v2 (which, incidentally, is not available as a standalone unit separate from the Xbox One) to see whether it would offer yet another robotics solution.
They received their dev kit in late November, and after a few days of tinkering they were able to hook the Oculus Rift up to the Kinect v2 to manipulate an off-the-shelf robotic arm. According to our interview with a group of JPL engineers, the combination of the Oculus's head-mounted display and the Kinect's motion sensors has produced "the most immersive interface" JPL has built to date. Join us after the break to see a video of this in action and find out why one of them has called this build nothing short of revolutionary.
JPL took part in the first Kinect developer program as well, so it was already intimately familiar with how Kinect's motion sensor technology worked. It built a series of applications and eventually worked with Microsoft to release a game where you were tasked with landing Curiosity safely on Mars. The second Kinect, however, offers a lot more precision and accuracy than the first. "It allowed us to track open and closed states, and the rotation of the wrist," says Human Interfaces Engineer Victor Luo. "With all of these new tracking points and rotational degrees of freedom, we were able to better manipulate the arm."
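The article doesn't show JPL's code, but the idea Luo describes — reading a hand's open/closed state and wrist rotation and turning them into arm commands — can be sketched in a few lines. Everything below is hypothetical: the `HandState` values loosely mirror the Kinect v2 SDK's hand-state enum, while `WristSample`, `to_arm_command`, and the joint limits are illustrative inventions, not JPL's actual interface.

```python
# Hypothetical sketch of mapping Kinect v2-style hand tracking to robot arm
# commands. HandState loosely mirrors the SDK's open/closed/unknown states;
# the command format and joint limits are made up for illustration.
from dataclasses import dataclass
from enum import Enum
import math

class HandState(Enum):
    OPEN = "open"
    CLOSED = "closed"
    UNKNOWN = "unknown"

@dataclass
class WristSample:
    hand_state: HandState      # open/closed state of the tracked hand
    roll_radians: float        # rotation about the forearm axis

def to_arm_command(sample: WristSample) -> dict:
    """Translate one tracked-wrist sample into a simple arm command."""
    # Open hand releases the gripper, closed hand grips, unknown holds position.
    gripper = {"open": "release", "closed": "grip"}.get(
        sample.hand_state.value, "hold")
    # Clamp the wrist roll to the arm's (assumed) joint limits of +/- 90 degrees.
    limit = math.pi / 2
    roll = max(-limit, min(limit, sample.roll_radians))
    return {"gripper": gripper, "wrist_roll_deg": round(math.degrees(roll), 1)}

print(to_arm_command(WristSample(HandState.CLOSED, 2.0)))
```

The key point Luo makes is in the extra degrees of freedom: with the first Kinect, neither the hand state nor the wrist roll in this sketch would have been trackable at all.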
Project Information URL: http://www.engadget.com/2013/12/23/nasa-jpl-control-robotic-arm-kinect-2/