Oculus Rift, Parrot AR.Drone, a Kinect and you...

Description

Today's project by Alessandro Colla is just all sorts of cool... VR, Quads and the Kinect? Oh yeah, you know I have to highlight this!

Oculus, AR.Drone and Kinect completed

Recently I completed my experiment using the Oculus Rift and a Kinect to drive a Parrot AR.Drone.

Using the Kinect, I implemented both voice commands and gestures to drive the drone. Voice commands issue the takeoff, land, emergency, and change-camera commands, while gestures steer the drone.

The nice thing about the Kinect's voice recognition is that it works quite well despite all the noise the drone makes. As you will see in the video, I raised my voice a bit, but I still find it awesome.
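As a rough illustration (in Python rather than the project's C#), here is how a small fixed command grammar plus a confidence threshold can reject misfires caused by rotor noise. The phrases, command names, and threshold value are my own invention for the sketch, not taken from the project's code.

```python
# Illustrative sketch: map recognized phrases to drone commands,
# discarding low-confidence recognitions (e.g. rotor noise).
COMMANDS = {
    "take off": "TAKEOFF",
    "land": "LAND",
    "emergency": "EMERGENCY",
    "change camera": "CHANGE_CAMERA",
}

CONFIDENCE_THRESHOLD = 0.7  # hypothetical cutoff; tune against real noise

def handle_recognition(phrase, confidence):
    """Return the drone command for a recognized phrase, or None
    if the phrase is unknown or the recognition confidence is too low."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return COMMANDS.get(phrase.lower())
```

In the actual project the grammar lives in VoiceCommands.cs and the recognition comes from the Kinect speech API; this only shows the filtering idea.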

The gestures implemented are the following:

  • Both arms forward -> move the drone forward
  • Both hands near the shoulders -> move the drone backward
  • Left arm extended to the left and right arm along the body -> move the drone to the left
  • Left arm extended to the left and right arm forward -> move the drone forward and to the left
  • Left arm extended to the left and right hand near the right shoulder -> move the drone backward and to the left

The right movements are simply the left ones mirrored.
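The rules above can be sketched as a simple joint-position classifier. This Python sketch is purely illustrative: the project is written in C# against the Kinect SDK, and the coordinate conventions, distance thresholds, and command names here are assumptions of mine, not the project's code.

```python
from math import dist

def arm_pose(shoulder, hand):
    """Classify one arm from (x, y, z) joint positions in meters
    (assumed Kinect-style: x sideways, y up, smaller z = closer to sensor)."""
    dx = hand[0] - shoulder[0]
    dz = shoulder[2] - hand[2]          # positive when the hand is in front
    if dz > 0.35:
        return "forward"
    if dist(shoulder, hand) < 0.25:
        return "near_shoulder"
    if abs(dx) > 0.45:
        return "extended_side"
    return "down"                        # arm along the body

def drone_command(left, right):
    """Combine the two arm poses into a movement, per the list above
    (right-hand gestures mirror the left-hand ones)."""
    if left == right == "forward":
        return "forward"
    if left == right == "near_shoulder":
        return "backward"
    if left == "extended_side":
        return {"down": "left",
                "forward": "forward-left",
                "near_shoulder": "backward-left"}.get(right, "hover")
    if right == "extended_side":
        return {"down": "right",
                "forward": "forward-right",
                "near_shoulder": "backward-right"}.get(left, "hover")
    return "hover"                       # default: no matching gesture
```

Falling back to "hover" when no rule matches keeps the drone stable between gestures.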

The Oculus Rift, meanwhile, displays the live feed from both of the drone's cameras (with the stereoscopic filter applied, of course), while its head tracking moves the drone up/down and turns it left/right.
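A minimal sketch of that kind of head-to-drone mapping, assuming Euler angles in degrees from the headset's tracker: yaw turns the drone, pitch moves it up/down. The dead zone and gain values are invented for illustration and are not from the project's code.

```python
def head_to_drone(yaw_deg, pitch_deg, deadzone=10.0, gain=0.02):
    """Map head yaw/pitch (degrees) to a drone yaw-rate and vertical
    speed in [-1, 1], with a dead zone so small head movements hover."""
    def axis(angle):
        if abs(angle) < deadzone:
            return 0.0                           # ignore small movements
        magnitude = min(1.0, (abs(angle) - deadzone) * gain)
        return magnitude if angle > 0 else -magnitude
    return axis(yaw_deg), axis(pitch_deg)
```

Turning the head 60 degrees saturates the axis at full rate, while anything under 10 degrees is treated as looking around rather than a command.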

Here is a video of it working. In the first part you can see the head movements, while in the second part the gestures.

It was fun to put all these gadgets together and try them out (I mean, having the drone crash into everything :D)

Here you can find the full source code if you want to try it out.
As a disclaimer, I did this to learn something new and just for fun in my spare time, so the code is nothing to be proud of :D

This is a simple summary, but if you're interested, I will write some posts about the code for each part involved (gestures, voice commands, head tracking).

Project Information URL: http://bettercoderwannabe.blogspot.it/2013/08/oculus-ar-drone-and-kinect-completed.html

Project Source URL: https://github.com/Iridio/OculusArDroneKinect

OculusArDroneKinect

An XNA project to control an AR.Drone with Kinect gestures and voice commands while the video feed from the drone's cameras is streamed to the Oculus Rift (this can be disabled by setting drawOculus=false in the class OculusParrotKinect.cs).

Voice commands

Voice commands are specified in the class VoiceCommands.cs

Supported commands: take off, land, emergency, change camera, and activate/deactivate face recognition.

Gesture

  • Both arms forward -> move the drone forward
  • Both hands near the shoulders -> move the drone backward
  • Left arm extended to the left and right arm along the body -> move the drone to the left
  • Left arm extended to the left and right arm forward -> move the drone forward and to the left
  • Left arm extended to the left and right hand near the right shoulder -> move the drone backward and to the left

The right movements are simply the left ones mirrored.

Credits

For the AR.Drone library, I used Ruslan Balanukhin's AR.Drone project and its .NET FFmpeg wrapper.
For the Oculus implementation, I used the Sunburn StereoscopicRenderer plugin as a starting point; it's an implementation by the developer behind Holophone3D.
I learned a lot from these projects.

