MIT News - March 14, 2012
Aircraft-carrier crew use a set of standard hand gestures to guide planes on the carrier deck. But as robot planes are increasingly used for routine air missions, researchers at MIT are working on a system that would enable them to follow the same types of gestures.
The problem of interpreting hand signals has two distinct parts. The first is simply inferring the body pose of the signaler from a digital image: Are the hands up or down, the elbows in or out? The second is determining which specific gesture is depicted in a series of images. The MIT researchers are chiefly concerned with the second problem; they present their solution in the March issue of the journal ACM Transactions on Interactive Intelligent Systems. But to test their approach, they also had to address the first problem, which they did in work presented at last year's IEEE International Conference on Automatic Face and Gesture Recognition.
Yale Song, a PhD student in MIT's Department of Electrical Engineering and Computer Science, his advisor, computer science professor Randall Davis, and David Demirdjian, a research scientist at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), recorded a series of videos in which several different people performed a set of 24 gestures commonly used by aircraft-carrier deck personnel. In order to test their gesture-identification system, they first had to determine the body pose of each subject in each frame of video. "These days you can just easily use off-the-shelf Kinect or many other drivers," Song says, referring to the popular Microsoft Xbox device that allows players to control video games using gestures. But that wasn't true when the MIT researchers began their project; to make things even more complicated, their algorithms had to infer not only body position but also the shapes of the subjects' hands.
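The two-stage pipeline the article describes (per-frame body-pose estimation, then classification of the resulting pose sequence as a gesture) can be sketched in a few lines of Python. Everything below is an illustrative assumption rather than the researchers' actual method: the toy pose labels, the joint-height heuristic, and the template-matching classifier merely stand in for the real pose-inference and sequence-classification algorithms.

```python
from typing import Dict, List

Pose = str  # toy per-frame pose label, e.g. "arms_up" (an assumed representation)

def estimate_pose(frame: Dict[str, float]) -> Pose:
    """Stage 1 (toy): infer a coarse pose from assumed joint heights."""
    return "arms_up" if frame["hand_y"] > frame["shoulder_y"] else "arms_down"

def classify_gesture(poses: List[Pose], templates: Dict[str, List[Pose]]) -> str:
    """Stage 2 (toy): match the pose sequence against gesture templates
    by counting per-frame agreements and returning the best match."""
    def score(template: List[Pose]) -> int:
        return sum(p == t for p, t in zip(poses, template))
    return max(templates, key=lambda name: score(templates[name]))

# Usage: a three-frame clip in which the hands rise above the shoulders.
frames = [{"hand_y": 0.2, "shoulder_y": 0.5},
          {"hand_y": 0.6, "shoulder_y": 0.5},
          {"hand_y": 0.9, "shoulder_y": 0.5}]
poses = [estimate_pose(f) for f in frames]  # ["arms_down", "arms_up", "arms_up"]

templates = {
    "wave_off": ["arms_down", "arms_up", "arms_up"],
    "stop":     ["arms_down", "arms_down", "arms_down"],
}
print(classify_gesture(poses, templates))  # -> wave_off
```

In the researchers' setting, stage 2 would be a statistical sequence classifier trained on the 24 recorded deck gestures; the simple template matching here only illustrates the division of labor between the two stages.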
Read more at MIT News: http://web.mit.edu/newsoffice/2012/robots-hand-gestures-0314.html
Video: Melanie Gonick
Simulations courtesy of Yale Song