This project came out of another collaboration with artist Antoine Catala. He came to me in January with a concept involving autonomously flying quadcopters. (The all-time record, as Antoine can attest, was twelve passes back and forth along the line…though I did have to use a much fatter line to achieve it.)
I spent a couple of weeks doing internet research and found a wealth of information. The two main sources were the DIY Drones community and the Labview Hacker community. The Labview ‘toolkit’ for controlling quadcopters (the technology I ultimately settled on) derived in part from the work of Mike Mogenson, who wrote a master’s thesis on building a controller to make the Parrot AR Drone 1.0 autonomous. He was an invaluable source of help throughout the project, and one of his videos is shown below:
Because the concept involved indefinite, repeatable motion, I needed some way to orient the drone within a room. As one option, we investigated the very sophisticated camera mapping done through the Robot Operating System (ROS). We loaded Linux on a laptop, and my dad helped install ROS using the instructions found here. We could not reproduce the results seen in the YouTube video, and ultimately gave up on this option.
Another means of orientation, used by the DIY Drones community, is GPS. The difficulty was that we were hoping to fly the drones indoors. I had hoped that even a limited, fragmented signal could provide a useful reference point. As a test, I purchased a GPS receiver and used an Arduino microcontroller to interpret the coordinates. It turned out that GPS indoors was not accurate to better than 3-5 meters, so I gave up on this approach.
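For anyone curious what “interpreting the coordinates” involves: consumer GPS modules stream NMEA text sentences, and the position lives in the $GPGGA sentence. My parsing ran on the Arduino, but here is a rough Python sketch of the same logic. The threshold for “no fix” (field 6 equal to 0) is part of the NMEA format; the function names are mine.

```python
def parse_gpgga(sentence):
    """Parse an NMEA $GPGGA sentence into decimal-degree latitude/longitude.

    Returns None when the sentence reports no GPS fix (field 6 == '0'),
    which is exactly what happens most of the time indoors.
    """
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[6] == "0":
        return None  # no fix -- fragmented indoor signal

    def to_decimal(value, hemisphere):
        # NMEA encodes coordinates as ddmm.mmmm (degrees + decimal minutes)
        dot = value.find(".")
        degrees = float(value[:dot - 2])
        minutes = float(value[dot - 2:])
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    lat = to_decimal(fields[2], fields[3])
    lon = to_decimal(fields[4], fields[5])
    return lat, lon
```

Even with a valid fix, the reported position wandered by several meters, which is why this was a dead end for indoor flight.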
The last option, the one I settled on, was to use the Labview hacker toolkit and do image processing on the AR Drone’s camera feed. One hobbyist whose work I found online didn’t use Labview, but successfully tracked a red line on the ground using the drone’s downward-facing camera. I decided to do something similar.
I was able to sample the camera’s feed every 20 ms (which was also the cycle time of my entire program) and parse the images into RGB arrays. With each element valued 0 to 255 in red, green, and blue, I could apply a filter to find the edges of the red line in the drone’s field of vision. I sampled for the red line at both the top and the bottom of the frame, which let me issue a yaw command if the drone was not oriented parallel to the line.
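The filtering itself is simple: a pixel belongs to the red line when its red channel is high and its green/blue channels are low, and the line’s center in a row of pixels is the midpoint of the detected band. The real version lived in a Labview VI; this Python sketch shows the idea, with placeholder threshold values that are guesses rather than the ones I actually tuned.

```python
def find_line_center(row, red_min=150, gb_max=100):
    """Return the horizontal center of the red line in one pixel row,
    or None if no red pixels were found.

    `row` is a list of (r, g, b) tuples, each channel 0-255. The
    red_min/gb_max thresholds here are illustrative, not the tuned
    values from the original Labview filter.
    """
    red_columns = [x for x, (r, g, b) in enumerate(row)
                   if r >= red_min and g <= gb_max and b <= gb_max]
    if not red_columns:
        return None
    # Midpoint between the left and right edges of the red band
    return (red_columns[0] + red_columns[-1]) / 2.0

def yaw_error(top_row, bottom_row):
    """Compare the line's position at the top and bottom of the frame.
    A nonzero difference means the drone is not parallel to the line."""
    top = find_line_center(top_row)
    bottom = find_line_center(bottom_row)
    if top is None or bottom is None:
        return None  # line not visible at one end of the frame
    return top - bottom
```

Sampling only two rows (top and bottom of the frame) was enough: their offsets gave the roll error, and the difference between them gave the yaw error.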
Like the yaw control, the roll commands had a threshold. For instance, if the red line was detected 15 pixels right of center, that would trigger the lower end of a proportional roll command. All commands were sent to the drone’s controller as values between -1 and 1. After completing the start-up routine (i.e., reaching a designated altitude of 2 m), the drone was given a constant 0.07 forward pitch command (the smallest command that would affect the drone), while roll and yaw commands were issued in proportion to how far off-center the red line had drifted. When the top half of the red line sampler registered the end of the red line, the opposite pitch command (-0.07) was given and the drone would reverse course back along the line.
Unfortunately, WordPress would not allow me to upload my Labview VIs. They won’t really be intelligible to anyone but me, but I have posted several screenshots of my Labview work.
Error Checking the Red Pixels
Sending Proportional Commands to Drone Controller
Main Operating Loop for Drone