Visual Control Interface for Manus ARM
Accessibility is more than physical access to a device; it is a person's ability to understand and manipulate that device. Accessibility depends on a person's motor dexterity, cognitive capabilities, sensory impairments, and behavioral and social skills. This research focuses on providing methods for independent manipulation of unstructured environments using a wheelchair-mounted robotic arm. The target audience is power wheelchair users who may additionally have cognitive impairments. We hypothesized that a vision-based interface would be easier to use than a menu-based system. With greater levels of autonomy, less user input is necessary for control. Thus, by having the user explicitly designate only the end goal of a "pick-and-place" activity of daily living, the end-user population can be expanded to include persons with cognitive impairments.
- The user “zooms in” on the doorknob using progressive quartering. The red box indicates the selected region, which contains the desired object.
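For concreteness, here is a minimal Python sketch of the progressive-quartering idea, assuming the view is repeatedly split into four quadrants and the user picks one each round. The Region type, the pick_quadrant callback, and the stopping size are illustrative assumptions, not the project's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned sub-rectangle of the camera image, in pixels."""
    x: int
    y: int
    w: int
    h: int

    def quarter(self):
        """Split this region into its four quadrants:
        top-left, top-right, bottom-left, bottom-right."""
        hw, hh = self.w // 2, self.h // 2
        return [
            Region(self.x,      self.y,      hw,          hh),
            Region(self.x + hw, self.y,      self.w - hw, hh),
            Region(self.x,      self.y + hh, hw,          self.h - hh),
            Region(self.x + hw, self.y + hh, self.w - hw, self.h - hh),
        ]

def progressive_quartering(image_w, image_h, pick_quadrant, min_size=64):
    """Zoom in on a target by repeatedly quartering the view.

    pick_quadrant stands in for the user's selection (switch,
    touch screen, or joystick); it returns an index 0-3 into the
    current region's quadrants. Stops once the region is small
    enough to hand off to the reaching algorithm."""
    region = Region(0, 0, image_w, image_h)
    while region.w > min_size and region.h > min_size:
        region = region.quarter()[pick_quadrant(region)]
    return region  # final region containing the desired object
```

Under these assumptions, three selections on a 640x480 view narrow the target down to an 80x60 region.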
Towards this end, we designed and implemented human-robot interfaces compatible with indirect (e.g., single-switch scanning) and direct (e.g., touch screen and joystick) selection. We implemented an autonomous system for the Manus robot arm to reach toward the desired object. We evaluated the interfaces and system with able-bodied participants to provide a baseline, and then with end users. We developed interface design guidelines and experimental design guidelines for human-robot interaction with assistive technology.
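To illustrate the indirect mode, the following sketch shows one common form of single-switch scanning, where each candidate is highlighted in turn and a switch press selects the currently highlighted one. The highlight and switch_pressed callbacks and the dwell time are assumptions for illustration, not the project's code.

```python
import time

def single_switch_scan(options, highlight, switch_pressed, dwell_s=1.5):
    """Indirect selection via single-switch scanning.

    Each option (e.g., a quadrant of the current camera view) is
    highlighted in turn for dwell_s seconds; pressing the switch
    while an option is highlighted selects it.

    highlight      -- UI callback that draws the scan cursor
    switch_pressed -- polling callback for the user's switch"""
    while True:  # keep cycling until the user commits
        for index, option in enumerate(options):
            highlight(option)
            deadline = time.monotonic() + dwell_s
            while time.monotonic() < deadline:
                if switch_pressed():
                    return index
                time.sleep(0.01)  # poll gently instead of busy-waiting
```

A longer dwell time trades selection speed for fewer accidental selections, a key tuning decision for users with limited motor control.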
- The user selects the desired object, indicating "I want that," with either a touch screen or a mouse-emulating joystick. The fixed camera view selection is shown on the left. The moving camera view is shown center and right, using the touch screen and joystick, respectively.
In our current work, we have integrated our visual interface and reaching algorithms with our University of Central Florida collaborators' object recognition and grasping algorithms. A video of the end-to-end system can be seen below.
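The integrated system can be summarized as a single select-recognize-reach-grasp cycle. The sketch below is a hypothetical outline under that assumption; every component and method name (camera, interface, recognizer, arm) is an illustrative stand-in, not the actual API of either lab's software.

```python
def pick_and_place_cycle(camera, interface, recognizer, arm):
    """One hypothetical 'I want that' cycle; all names here are
    illustrative stand-ins, not the real component interfaces."""
    frame = camera.capture()                 # current camera view
    region = interface.select(frame)         # user designates the goal object
    pose = recognizer.locate(frame, region)  # object recognition (UCF)
    arm.reach(pose)                          # autonomous reaching
    arm.grasp()                              # grasping (UCF)
```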
Related Papers
Katherine M. Tsui, Dae-Jin Kim, Aman Behal, David Kontak, and Holly A. Yanco. "I Want That": Human-in-the-Loop Control of a Wheelchair-Mounted Robotic Arm. To appear in the Journal of Applied Bionics and Biomechanics, Special Issue on Assistive and Rehabilitation Robotics. IOS Press, 2011.
Katherine M. Tsui and Holly A. Yanco. Towards Establishing Clinical Credibility for Rehabilitation and Assistive Robots Through Experimental Design. Proceedings of the Robotics: Science and Systems Workshop on Good Experimental Methodology in Robotics, June 28, 2009.
Katherine M. Tsui, Holly A. Yanco, David Feil-Seifer, and Maja Mataric. Methods for Evaluating Assistive Robotic Technology. In Performance Evaluation and Benchmarking of Intelligent Systems, edited by Raj Madhavan, Edward Tunstel, and Elena Messina. Springer, 2009.
Katherine M. Tsui. Design and Evaluation of a Visual Control Interface of a Wheelchair Robotic Arm for Users with Cognitive Impairments. MS thesis, University of Massachusetts Lowell, Lowell, MA, May 2008.
Katherine M. Tsui, Holly A. Yanco, David Kontak, and Linda Beliveau. Development and Evaluation of a Flexible Interface for a Wheelchair Mounted Robotic Arm. Proceedings of the ACM SIGCHI/SIGART Human-Robot Interaction Conference, March 2008.
Katherine M. Tsui, Holly A. Yanco, David Kontak, and Linda Beliveau. Experimental Design for Human-Robot Interaction with Assistive Technology. Human-Robot Interaction Conference Workshop on Robotic Helpers: User Interaction, Interfaces and Companions in Assistive and Therapy Robotics, March 12, 2008.
Katherine M. Tsui and Holly A. Yanco. Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface. Proceedings of the AAAI Spring Symposium on Multidisciplinary Collaboration for Socially Assistive Robots, March 2007.
Katherine M. Tsui and Holly A. Yanco. Human-in-the-Loop Control of an Assistive Robot Arm. Proceedings of the Workshop on Manipulation for Human Environments, Robotics: Science and Systems Conference, August 2006.