Monday 03 May 2010 09:54
Talk at Microsoft Research Symposium: Multi-Touch Human-Robot Interaction for Disaster Response
On April 6, 2010, Holly Yanco, Director of the Robotics Lab, and doctoral candidate Mark Micire presented at the 2010 Microsoft Research Symposium. The video of their talk can be found at http://research.microsoft.com/en-us/um/redmond/events/ersymposium2010/18700/player.htm.
In 2005, the response to Hurricane Katrina exposed several technological gaps. In an age when satellite imagery is becoming ubiquitous in our digital lives, it was surprising to find that many response groups were still using hand-drawn paper maps and had no way to transmit video from cameras, sensors, or robots in the field. These gaps stemmed largely from the lack of a common computing platform to bring all of this information to the command staff. Our goal is to improve disaster response by coalescing available information from robots, humans, satellite photography, and GIS data into a consistent command and control system. Such an interface would facilitate informed group discussion, risk assessment, plan development, and resource allocation.
To improve the learnability and usability of these interfaces, we conducted an experiment to determine the gestures people would naturally use, rather than the gestures they would be instructed to use in a pre-designed system. In this presentation, we detail these findings, propose a taxonomy of the gesture set, and offer guidelines for designing gesture sets for robot control. We also discuss our current work developing an interface between the Microsoft Surface and Microsoft Robotics Developer Studio that will allow us to create a multi-robot interface for command staff to monitor and interact with all of the robots deployed at a disaster response. In the future, this work will be expanded to allow the tracking and tasking of search teams as well as a means for receiving data from the field.
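As a rough illustration of the kind of gesture-to-command mapping such a multi-robot interface involves, here is a minimal sketch in Python. All names here (`Gesture`, `dispatch`, the gesture kinds and command vocabulary) are hypothetical assumptions for exposition, not the actual Surface or Robotics Developer Studio API:

```python
# Hypothetical sketch: translating recognized multi-touch gestures into
# tasking commands for individual robots. Names and command vocabulary
# are illustrative assumptions, not the actual system's interface.
from dataclasses import dataclass

@dataclass(frozen=True)
class Gesture:
    kind: str               # e.g. "tap", "drag", "lasso"
    robot_id: str           # which robot the gesture targets
    target: tuple           # map coordinate (x, y) where the gesture ends

def dispatch(gesture: Gesture) -> dict:
    """Map one recognized gesture to a command for the targeted robot."""
    if gesture.kind == "tap":
        # Tapping a robot icon selects it for subsequent commands.
        return {"robot": gesture.robot_id, "cmd": "select"}
    if gesture.kind == "drag":
        # Dragging from a robot to a map location sends it there.
        return {"robot": gesture.robot_id, "cmd": "goto",
                "waypoint": gesture.target}
    if gesture.kind == "lasso":
        # Circling a region tasks the robot to search that area.
        return {"robot": gesture.robot_id, "cmd": "search_area",
                "region": gesture.target}
    # Unrecognized gestures are ignored rather than guessed at.
    return {"robot": gesture.robot_id, "cmd": "ignore"}
```

For example, `dispatch(Gesture("drag", "ugv1", (12.5, 7.0)))` would yield a `goto` command with waypoint `(12.5, 7.0)` for robot `ugv1`. The design point the talk's findings bear on is which gesture kinds belong in this mapping: a user-derived gesture set replaces designer-chosen gestures in the dispatch table.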