Wheelesley

The Wheelesley robot - circa 1997
The Wheelesley robot - 2005

This research project started in January 1995 at Wellesley College, where Holly Yanco was an Instructor in the Computer Science Department. The project later moved to the MIT Artificial Intelligence Laboratory.

Our system, Wheelesley, consisted of an electric wheelchair outfitted with a computer and sensors, along with a Macintosh PowerBook that ran the user interface. The electric wheelchair kit was purchased from the KISS Institute for Practical Robotics in April 1995.

The robot can travel semi-autonomously in an indoor environment. This allows the user to issue general directional commands and to rely on the robot to carry out low-level routines such as obstacle avoidance and wall following.
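
To make the division of labor concrete, here is a minimal sketch of such a low-level guarding layer in Python. The names, the sonar layout, and the threshold are illustrative assumptions, not details of the actual Wheelesley software.

    # Illustrative only: the sensor layout and threshold are assumptions,
    # not details of the actual Wheelesley software.
    STOP_DISTANCE_M = 0.4  # assumed minimum clearance before overriding the user

    def safe_velocity(command, sonar_m):
        """Turn a general directional command into a guarded motor command.

        command : (forward, turn), each in [-1, 1], the user's requested direction
        sonar_m : range readings in meters, e.g. {"front": 1.2, "left": 0.9, "right": 2.0}
        """
        forward, turn = command
        if forward > 0 and sonar_m["front"] < STOP_DISTANCE_M:
            # Obstacle ahead: stop forward motion and steer toward the freer side.
            forward = 0.0
            turn = 0.5 if sonar_m["left"] > sonar_m["right"] else -0.5
        return forward, turn

    # The user asks to go straight, but something sits 0.3 m ahead:
    print(safe_velocity((1.0, 0.0), {"front": 0.3, "left": 1.5, "right": 0.6}))
    # -> (0.0, 0.5): the chair stops and veers toward the more open left side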

The user interface developed for the project allows the user to operate in three modes: manual, joystick, and user interface. In manual mode, the wheelchair functions as a normal electric wheelchair. In joystick mode, the user issues directional commands through the joystick while the robot avoids obstacles in the requested path. In user interface mode, the user interacts with the robot solely through the user interface. See below for work on customizing the user interface mode using an eye tracking system and a single switch.
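
Continuing the hypothetical safe_velocity sketch above, the following shows how such a three-way mode switch might be wired up; the structure is an assumption, not taken from the project's code.

    from enum import Enum

    class Mode(Enum):
        MANUAL = 1          # joystick passes straight through to the motors
        JOYSTICK = 2        # joystick input, filtered by obstacle avoidance
        USER_INTERFACE = 3  # commands come from the on-screen interface

    def next_motor_command(mode, joystick, ui_command, sonar_m):
        if mode is Mode.MANUAL:
            return joystick  # behaves like a normal electric wheelchair
        if mode is Mode.JOYSTICK:
            return safe_velocity(joystick, sonar_m)
        return safe_velocity(ui_command, sonar_m)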

Work on the project investigated the following issues:

  • Outdoor navigation using a vision system
  • Seamless movement between indoor and outdoor environments
  • Customizing user interfaces for people of varying abilities, using the same underlying navigation system
  • Shared control between a person and the robot

Customizing the interface

The wheelchair system must be customizable for users of varying abilities. Research with Jim Gips of Boston College is investigating the issues involved. To date, the chair has been driven using the EagleEyes system and a single switch.

Wheelesley being driven using EagleEyes.

EagleEyes allows a person to control the computer through five electrodes placed around the eyes. The electrodes measure the EOG (electro-oculographic potential), which corresponds to the angle of the eyes in the head. The user controls the cursor by moving the eyes and head. The system works as a replacement for a mouse on a Macintosh.
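
As a rough illustration of how an EOG signal can become a cursor position, the sketch below maps two differential channel voltages (horizontal and vertical) through calibration gains. The channel model and every constant are assumptions for illustration, not measurements from EagleEyes.

    SCREEN_W, SCREEN_H = 640, 480

    def eog_to_cursor(h_volts, v_volts, h_gain=200.0, v_gain=150.0,
                      h_offset=0.0, v_offset=0.0):
        """Map two EOG channel voltages to a clamped screen position.

        Assumes each channel is roughly proportional to gaze angle; the
        gains and offsets stand in for a per-user calibration step.
        """
        x = SCREEN_W / 2 + (h_volts - h_offset) * h_gain
        y = SCREEN_H / 2 - (v_volts - v_offset) * v_gain  # looking up moves the cursor up
        return (int(max(0, min(SCREEN_W - 1, x))),
                int(max(0, min(SCREEN_H - 1, y))))

    # Looking slightly right and up from center:
    print(eog_to_cursor(0.5, 0.3))  # -> (420, 195)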

With a single switch, the user's only input to the system is a button click; this single bit of information is enough to control the chair.
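
One common way to drive from a single bit is scanning: the interface highlights each command in turn, and a click selects whichever command is highlighted at that moment. The sketch below illustrates that general technique; it is not the project's actual single-switch interface, and the command set and timing are invented.

    import itertools
    import time

    COMMANDS = ["forward", "left", "right", "back", "stop"]

    def scan_for_selection(switch_pressed, dwell_s=1.0):
        """Highlight each command in turn; return the one lit when the switch fires."""
        for command in itertools.cycle(COMMANDS):
            print("highlighting:", command)
            deadline = time.monotonic() + dwell_s
            while time.monotonic() < deadline:
                if switch_pressed():
                    return command
                time.sleep(0.01)

    # Demo with a fake switch that fires about 2.5 seconds in,
    # while the third command ("right") is highlighted:
    start = time.monotonic()
    print(scan_for_selection(lambda: time.monotonic() - start > 2.5))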

Public demonstrations of the wheelchair

We took the robot to the International Joint Conference on Artificial Intelligence (IJCAI-95) in August 1995, where we participated in the robotic wheelchair exhibition. In a competition to see which system could pass through the narrowest doorway, our system tied for first place and was the only entrant able to go through the doorway without being guided.

System demonstration at IJCAI-95:
Moving around a corner without requiring the user to steer.

The system will be demonstrated with EagleEyes and with single switch control at the robot exhibition at AAAI-97, the Fourteenth National Conference on Artificial Intelligence, to be held in Providence, RI, from July 27 to July 31.

This project was supported in part by the National Science Foundation under Grant Number CDA-9505200, in part by a Faculty Research Grant from Wellesley College, and in part by the MURI project at the MIT AI Lab.