
Robots and Spring Term

The video above shows Upol Ehsan, a 2013 graduate of Washington and Lee, controlling a drone with hand gestures — the result of a project that he and classmates Dia Bisharat, Gabi Tremo and Fred Gisa completed during the four-week Spring Term course on robotics taught by Simon Levy, professor of computer science.

There’s a lot to admire about the project, aside from the fact that the video is just plain cool.

From the professor’s point of view, it’s pretty remarkable that the students could complete such a project in such a short time span. That was Simon’s goal at the outset. As he explained: “A couple of years ago, I developed a software tool to enable non-experts to control the AR.Drone using the Python programming language. It has proven very popular at W&L and other institutions.”
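The story doesn’t name or show the tool itself, but the flavor of a beginner-friendly control layer like the one Simon describes can be sketched in a few lines of Python. Everything below is a hypothetical stand-in for illustration; the Drone class and its methods are not the actual W&L software, and this sketch only logs the commands it would send rather than talking to a real AR.Drone.

    # Hypothetical sketch of a beginner-friendly drone interface (not the actual W&L tool).
    # A real AR.Drone is driven over Wi-Fi; here the "send" step just prints the command.

    class Drone:
        """Stand-in for a high-level AR.Drone controller."""

        def _send(self, command):
            print("would send:", command)   # a real tool would transmit this over the network

        def takeoff(self):
            self._send("takeoff")

        def land(self):
            self._send("land")

        def move(self, roll=0.0, pitch=0.0, yaw=0.0, climb=0.0):
            # Values in [-1, 1]; a real controller would repeat this roughly 30 times a second.
            self._send(f"move roll={roll:+.2f} pitch={pitch:+.2f} yaw={yaw:+.2f} climb={climb:+.2f}")

    if __name__ == "__main__":
        drone = Drone()
        drone.takeoff()
        drone.move(pitch=0.2)   # nudge forward
        drone.land()

The point of a layer like this is that a student can think in terms of "take off, drift forward, land" instead of the network protocol underneath, which is what makes a four-week project feasible.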

Simon went on to explain how software like his lets students complete ambitious projects in a short amount of time. “If you look for similar-looking projects on YouTube or Google, you’ll likely find, as did my other students and I, that such projects are typically ‘one-off’ demonstrations that are difficult to replicate or modify. In contrast, part of our mission in the Computer Science Department is to train students to write software that can actually be used and improved by others, through good design principles.”

So with that software available, one four-member team in the class determined to see if they couldn’t turn a person into a control stick by using an Xbox Kinect, which allows controller-free, full-body game playing.

“The person becomes the controller,” said Upol, a double major in physics-engineering and philosophy with a minor in mathematics, who is from Bangladesh. “The natural thing people have is their touch. Touching and swaying your hands is intuitive. In creating the software to control the drone, we wanted to pick up on that and bypass any sort of sensors or joystick.”
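The team’s code isn’t shown in the story, but the idea of making the person the controller boils down to a small mapping: read the tracked hand and shoulder positions from the Kinect’s skeleton data, measure how far the hands have moved from a neutral pose, and convert those offsets into drone commands, ignoring small movements so the drone doesn’t twitch. Here is a minimal Python sketch of that mapping; the coordinate conventions, the Point values and the function names are hypothetical stand-ins, not the students’ actual software.

    from collections import namedtuple

    # Hypothetical sketch: turn Kinect hand positions into drone motion commands.
    Point = namedtuple("Point", "x y z")   # metres, in the Kinect's coordinate frame

    DEAD_ZONE = 0.10   # ignore hand offsets smaller than this so jitter doesn't move the drone

    def clamp(value, limit=1.0):
        return max(-limit, min(limit, value))

    def gesture_to_command(right_hand, left_hand, shoulder_center):
        """Map hand offsets from the shoulders to (roll, pitch, climb) values in [-1, 1]."""
        roll = clamp(right_hand.x - shoulder_center.x)     # right hand left/right -> roll
        pitch = clamp(shoulder_center.z - right_hand.z)    # right hand forward/back -> pitch
        climb = clamp(left_hand.y - shoulder_center.y)     # left hand up/down -> climb
        return tuple(0.0 if abs(v) < DEAD_ZONE else v for v in (roll, pitch, climb))

    if __name__ == "__main__":
        # One made-up skeleton frame: right hand out to the side and forward, left hand raised.
        shoulders = Point(0.0, 1.4, 2.0)
        right = Point(0.35, 1.2, 1.8)
        left = Point(-0.30, 1.6, 2.0)
        print(gesture_to_command(right, left, shoulders))   # roughly (0.35, 0.2, 0.2)

In a real system a loop would run this conversion on every skeleton frame and pass the result to the drone’s move command, which is presumably the “tricky problem” Simon mentions below.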

Upol noted that lots of potential applications exist for a drone that is operated by gestures, and that the next step would be to add artificial intelligence that would allow the drone to go off and do the job on its own after a little training.

“The real advantage that we saw was the ability to get the drone out of precarious situations with gesture control,” he said. “You can get it out of a very tight spot.”

Because of his background in philosophy, Upol said that he pays very close attention to how humans interact with machines, so the project blended his interests. And Simon said that students solved a pretty tricky problem by figuring out how to get the output of the Kinect to control the movements of the drone.

The gesture-controlled drone wasn’t the only cool robot that came out of the Spring Term class. Have a look at this video of a Neato XV-11 vacuuming robot programmed to avoid obstacles using lidar. Kinsey Schell, a sophomore computer science major from New Orleans, had charge of that project.
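The story doesn’t describe how the Neato program works, but a common pattern for lidar-based obstacle avoidance is straightforward: take a 360-degree scan, look at the readings in a cone directly ahead, and turn away whenever anything in that cone is closer than a safety threshold. The Python sketch below illustrates that pattern under those assumptions; the scan format, thresholds and motion commands are hypothetical, not the code used in the class.

    # Hypothetical sketch of lidar-based obstacle avoidance (not the actual class project code).
    # A scan is a list of 360 distances in millimetres, one per degree, with 0 = straight ahead.

    SAFE_DISTANCE_MM = 400     # turn away if anything ahead is closer than this
    FORWARD_CONE_DEG = 30      # how far to either side of "straight ahead" we check

    def obstacle_ahead(scan):
        """Return True if any reading inside the forward cone is too close."""
        ahead = [scan[angle % 360] for angle in range(-FORWARD_CONE_DEG, FORWARD_CONE_DEG + 1)]
        # Readings of 0 usually mean "no return," so ignore them.
        return any(0 < distance < SAFE_DISTANCE_MM for distance in ahead)

    def avoidance_step(scan):
        """Decide one motion command from one lidar scan."""
        if obstacle_ahead(scan):
            return "turn"      # a real robot would rotate in place until the cone is clear
        return "forward"

    if __name__ == "__main__":
        clear_scan = [2000] * 360                 # nothing nearby
        blocked_scan = [2000] * 360
        blocked_scan[5] = 300                     # something 30 cm away, slightly to one side
        print(avoidance_step(clear_scan))         # forward
        print(avoidance_step(blocked_scan))       # turn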