SUBTLE Multi-University Research Initiative

The Situation Understanding Bot Through Language And Environment (SUBTLE) project is a collaboration among multiple universities (University of Pennsylvania, UMass Amherst, UMass Lowell, Stanford, Cornell, George Mason) to design a prototype system that allows a field commander, in any number of contexts, to control a robot by issuing the same kinds of spoken commands and directives that he or she would give a human soldier or search and rescue worker. Our lab is designing and implementing the interface between high-level commands and the robotics platform.

The iRobot ATRV-JR used in this project.

This project comprises three primary components: the command interface, the command logic, and the robotics platform control. These components pass messages among one another using the Robotics Service Oriented Architecture, a framework written by a doctoral student in this laboratory.
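To illustrate how the three components communicate, the sketch below models message passing with a minimal publish/subscribe bus. All class, topic, and handler names here are hypothetical stand-ins; they are not the actual Robotics Service Oriented Architecture API.

```python
# Illustrative sketch of the three components exchanging messages over a
# shared bus. The names are hypothetical, not the real architecture's API.
from collections import defaultdict

class MessageBus:
    """Minimal publish/subscribe bus standing in for the real framework."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

bus = MessageBus()
log = []

# The command logic listens for top-level orders from the command interface.
bus.subscribe("orders", lambda m: log.append(("pragmatics", m)))
# The robotics platform control listens for direct commands.
bus.subscribe("commands", lambda m: log.append(("robot", m)))

bus.publish("orders", "find rifle and laptop")   # from the interface
bus.publish("commands", "begin search")          # from the command logic
```

After the two `publish` calls, `log` holds one delivery per subscriber, showing how each layer receives only the topics it subscribed to.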

The command interface is currently a graphical user interface, but the long-term goal is to replace it with a language-based interface, work being carried out at the University of Pennsylvania and the University of Massachusetts Amherst. As designed, the graphical interface will display photographs and maps and will allow orders to be issued via dropdown menus and a visual task stack.

The command logic, named "Pragmatics," performs logical resolution and provides "common-sense" substitution abilities for managing order constraints. It is being developed by Dr. Christopher Potts at the University of Massachusetts Amherst. It breaks larger top-level orders down into more specific commands to the robot and receives back any relevant information about what the robot has detected. It uses this information to determine when to issue another direct command to the robot. It will be able to prompt the commander via the interface to accept or reject a similar or related match as a valid resolution to the issued order, and to instruct the robot to act accordingly.
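One way to picture the "common-sense" substitution described above: when the robot reports an object that is only a near match for an order item, the command logic can prompt the commander to accept or reject it. The sketch below is a hypothetical illustration; the similarity table, function names, and prompting mechanism are assumptions, not the actual Pragmatics implementation.

```python
# Hypothetical sketch of near-match resolution. The similarity table and
# the ask_commander callback are illustrative assumptions.
SIMILAR = {"assault rifle": {"hunting rifle", "carbine"}}  # hypothetical data

def resolve(order_item, detected, ask_commander):
    """Return True if `detected` satisfies `order_item`.

    Exact matches resolve immediately; near matches are referred to the
    commander via the interface; anything else is rejected.
    """
    if detected == order_item:
        return True
    if detected in SIMILAR.get(order_item, set()):
        return ask_commander(f"Accept '{detected}' for '{order_item}'?")
    return False

# An exact match needs no prompt; a near match defers to the commander.
assert resolve("assault rifle", "assault rifle", lambda q: False)
assert resolve("assault rifle", "carbine", lambda q: True)
assert not resolve("assault rifle", "bomb", lambda q: True)
```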

The robotics platform control performs the complex tasks necessary for successfully executing the commands issued by Pragmatics. It integrates a number of pre-existing techniques and software components with the rest of the system in order to perform searches and plan paths while avoiding collisions. It runs on an iRobot ATRV-JR, using a SICK LMS200 laser rangefinder for navigation.
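As a rough illustration of collision-free path planning, the sketch below runs a breadth-first search over an occupancy grid. This is a generic textbook technique shown under assumed data structures; the actual platform control integrates pre-existing planning components not shown here.

```python
# Illustrative grid-based path planner (breadth-first search). Cells
# marked 1 are obstacles; movement is 4-connected. This is a generic
# example, not the project's actual planner.
from collections import deque

def plan_path(grid, start, goal):
    """Return a shortest obstacle-free path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    visited = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in visited):
                visited.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no collision-free path exists

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall forces a detour to the right
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
```

In a real system the grid would be built from laser rangefinder scans rather than hand-written.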

To provide a concrete example of how this system works, consider the following scenario: the commander issues an order to find an assault rifle, a laptop, and a bomb, using the boolean meaning of the word "and." Pragmatics then tells the robot to perform a search. As the robot finds objects, it reports them back in turn; some are not relevant to the currently issued order. As soon as the collection of found objects satisfies the constraints set forth in the order, Pragmatics orders the robot to stop and informs the commander that the order has been completed successfully.
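The scenario above can be sketched in a few lines: the order is satisfied only when every requested object has been found (the boolean "and"), and irrelevant detections are ignored along the way. The object names and detection sequence are illustrative.

```python
# Sketch of the scenario: stop only once ALL order items are found.
order = {"assault rifle", "laptop", "bomb"}
found = set()
commands = []

# Simulated stream of detections; "chair" and "desk" are irrelevant.
for detection in ["chair", "laptop", "desk", "bomb", "assault rifle"]:
    if detection in order:
        found.add(detection)
    if order <= found:                  # all constraints met
        commands.append("stop")
        break
    commands.append("continue search")
```

Until the subset test `order <= found` succeeds, each detection (relevant or not) just yields another "continue search" command.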

Further information about the overall goals of this project, and about the other participants, is available at the SUBTLE website.

Related Papers

Daniel J. Brooks, Constantine Lignos, Cameron Finucane, Mikhail S. Medvedev, Ian Perera, Vasumathi Raman, Hadas Kress-Gazit, Mitch Marcus, and Holly A. Yanco. Make it so: Continuous, Flexible Natural Language Interaction with an Autonomous Robot. Proceedings of the AAAI 2012 Workshop on Grounding Language for Physical Systems, Toronto, Ontario, Canada, July 2012.

Daniel Brooks, Abraham Shultz, Munjal Desai, Philip Kovac, and Holly A. Yanco. Towards State Summarization for Autonomous Robots. Proceedings of the AAAI Fall Symposium on Dialog with Robots, November 2010.