I recently completed my experiment using an Oculus Rift and a Kinect to drive a Parrot AR.Drone.
With the Kinect I implemented both voice commands and gestures: voice commands issue takeoff, land, emergency, and camera-switch actions, while gestures steer the drone.
The nice thing about the Kinect's voice recognition is that it works quite well despite all the noise the drone makes. As you will see in the video, I had to raise my voice a bit, but I still find it awesome.
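Roughly, the voice part boils down to mapping a recognized phrase (plus its confidence score) to a drone action. Here is a minimal Python sketch of that idea; it is not the actual project code, and the phrases, threshold, and drone method names are just placeholders:

```python
# Illustrative sketch of the voice-command dispatch. The real project uses
# the Kinect speech recognition APIs; everything named here is made up.

CONFIDENCE_THRESHOLD = 0.7  # ignore low-confidence matches (drone noise!)

def handle_voice_command(phrase, confidence, drone):
    """Map a recognized phrase to a drone action if we are confident enough."""
    if confidence < CONFIDENCE_THRESHOLD:
        return  # too uncertain, do nothing

    commands = {
        "take off": drone.takeoff,
        "land": drone.land,
        "emergency": drone.emergency,
        "change camera": drone.switch_camera,
    }
    action = commands.get(phrase.lower())
    if action:
        action()
```

The confidence threshold is the important bit: with the rotors running, only matches the recognizer is reasonably sure about should reach the drone.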
The gestures implemented are the following:
- Both arms forward -> move the drone forward
- Both hands near the shoulders -> move the drone backward
- Left arm extended to the left and right arm along the body -> move the drone to the left
- Left arm extended to the left and right arm forward -> move the drone forward and to the left
- Left arm extended to the left and right hand near the right shoulder -> move the drone backward and to the left
The right-hand movements are simply the left ones mirrored; a rough sketch of how these poses can be classified is below.
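To give an idea of the logic, here is a small Python sketch that classifies each arm from the skeleton joints and combines the two states into forward/backward and left/right commands. It assumes joints come as (x, y, z) tuples in the Kinect camera space (x to the right, y up, z away from the sensor); the thresholds and names are illustrative, not the project's actual values:

```python
# Rough sketch of the pose classification from Kinect skeleton joints.
# All thresholds (in metres) and joint conventions are assumptions.

FORWARD_REACH = 0.35   # hand this far in front of the shoulder -> "forward"
SIDE_REACH = 0.35      # hand this far beside the shoulder -> "side"
NEAR_SHOULDER = 0.20   # hand this close to the shoulder -> "near"

def arm_state(hand, shoulder):
    """Classify one arm as 'forward', 'near', 'side' or 'rest'."""
    dx = hand[0] - shoulder[0]
    dz = shoulder[2] - hand[2]  # positive when the hand is toward the sensor
    dist = sum((h - s) ** 2 for h, s in zip(hand, shoulder)) ** 0.5
    if dist < NEAR_SHOULDER:
        return "near"
    if dz > FORWARD_REACH:
        return "forward"
    if abs(dx) > SIDE_REACH:
        return "side"
    return "rest"

def classify_gesture(left_hand, left_shoulder, right_hand, right_shoulder):
    """Return (pitch, roll) in [-1, 1] from the two arm states."""
    left = arm_state(left_hand, left_shoulder)
    right = arm_state(right_hand, right_shoulder)

    pitch = 0.0  # forward (+) / backward (-)
    roll = 0.0   # right (+) / left (-)

    if left == "forward" and right == "forward":
        pitch = 1.0                      # both arms forward -> forward
    elif left == "near" and right == "near":
        pitch = -1.0                     # both hands near shoulders -> backward
    elif left == "side":
        roll = -1.0                      # left arm out -> move left
        if right == "forward":
            pitch = 1.0                  # ...and forward
        elif right == "near":
            pitch = -1.0                 # ...and backward
    elif right == "side":                # right side: mirror of the above
        roll = 1.0
        if left == "forward":
            pitch = 1.0
        elif left == "near":
            pitch = -1.0

    return pitch, roll
```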
The Oculus Rift, instead, was used to view the live feed from both of the drone's cameras (with the stereoscopic filter applied, of course), while its head tracking moves the drone up/down and turns it left/right.
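The head-tracking part is essentially a mapping from the Rift's yaw and pitch to a yaw rate and a vertical speed, with a dead zone so that small head movements don't send the drone flying off. A minimal Python sketch of that mapping, with made-up thresholds and without the actual Oculus SDK calls (the sign of the pitch depends on the SDK's convention):

```python
# Sketch of the head-tracking mapping: head angles (radians) -> drone commands.
# Dead zone and saturation angle are illustrative values, not the project's.

import math

DEAD_ZONE = math.radians(10)  # ignore small head movements
MAX_ANGLE = math.radians(45)  # angle at which the command saturates

def head_to_command(yaw, pitch):
    """Map head yaw/pitch to (yaw_rate, vertical_speed) in [-1, 1]."""
    def scale(angle):
        if abs(angle) < DEAD_ZONE:
            return 0.0
        value = (abs(angle) - DEAD_ZONE) / (MAX_ANGLE - DEAD_ZONE)
        return math.copysign(min(value, 1.0), angle)

    yaw_rate = scale(yaw)           # turn the head left/right -> turn the drone
    vertical_speed = scale(pitch)   # look up/down -> go up/down (sign per SDK)
    return yaw_rate, vertical_speed
```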
Here is a video of it in action: the first part shows the head movements, the second part the gestures.
It was fun to put all these gadgets together and try them out (I mean, having the drone crash everywhere :D).
Here you can find the full source code if you want to try it out.
As a disclaimer, I did this to learn something new and just for fun in my spare time, so the code is nothing to be proud of 😀
This is just a quick summary, but if you are interested, I will write some posts about the code for each part involved (gestures, voice commands, head tracking).