Robotic Telepresence

Margo, one of our augmented VGo telepresence robots.

Commercial telepresence robots can be thought of as embodied video conferencing on wheels. Several companies now produce and sell telepresence robots that provide interactive two-way audio and video. Additionally, these telepresence robots' mobility allows the operator to explore the remote environment as he/she desires. The companies envision telepresence robots being used for a variety of applications, such as having ad-hoc office conversations, conducting patient rounds in hospitals, and touring manufacturing facilities.

During Summer 2010, we conducted a series of user studies of two telepresence robots (Anybots' QB and VGo Communications' VGo) in an office environment at Google in Mountain View, CA. One study focused on virtual teams, in which remote teammates (n=6) used a telepresence robot to attend their regularly scheduled team meetings. We found that people who had previously worked in the same building as their teammates, and had since moved to a different location, had the best experiences recreating this closeness with their teams using telepresence robots.

We hypothesize that people with disabilities who may live full time in a medical institution can gain similar benefits. For example, medically fragile students could use a robot to attend their regular classes at school (see CNN and Texomas). Elders in nursing homes or family members residing in medical institutions could "hang out" as if they were physically together. To this end, we are beginning a new study in conjunction with Crotched Mountain Rehabilitation Center investigating whether telepresence robots can recreate the closeness of daily interactions and mitigate isolation.


We are also currently researching human-robot cooperative navigation in conjunction with teams at Tufts University, University of Michigan, and Crotched Mountain Rehabilitation Center. A robotic wheelchair and two VGo Communications telepresence robots, Hugo and Margo, have been augmented to aid in this research. Margo's design has been replicated twice to make Tufts' Tobego and UMichigan's Largo.

We are investigating how people would like to interact with and instruct these robots, as well as what kinds of autonomous behaviors are needed. We have run focus groups and found that users are interested in different access methods, personalization, and higher-level commands to the robot (e.g., go to a location, follow a person).
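As an illustrative sketch only (not the project's actual software), the higher-level commands described above could be modeled as a small set of command types that a user interface dispatches to the robot; all class and function names here are hypothetical:

```python
from dataclasses import dataclass
from typing import Union

# Hypothetical high-level commands of the kind focus-group
# participants asked for, instead of low-level joystick driving.

@dataclass
class GoToLocation:
    """Drive autonomously to a named location (e.g., 'classroom')."""
    location: str

@dataclass
class FollowPerson:
    """Keep pace alongside a named person."""
    person: str

Command = Union[GoToLocation, FollowPerson]

def describe(cmd: Command) -> str:
    """Render a command as a human-readable status string,
    e.g., for display on the robot's screen."""
    if isinstance(cmd, GoToLocation):
        return f"Navigating to {cmd.location}"
    if isinstance(cmd, FollowPerson):
        return f"Following {cmd.person}"
    raise ValueError(f"Unknown command: {cmd!r}")

print(describe(GoToLocation("cafeteria")))  # Navigating to cafeteria
print(describe(FollowPerson("Adam")))       # Following Adam
```

In a real system each command type would map onto an autonomous behavior (path planning, person tracking), with the access method (switch, touch screen, etc.) determining how the command is selected.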


Acknowledgements:

This research has been funded in part by NSF (IIS-0905228, IIS-0546309). We would like to thank Dr. Chris Uhlik of Google, and Anybots and VGo Communications for loaning us prototype robots.


Related Papers

Katherine M. Tsui, Munjal Desai, and Holly A. Yanco. Towards Measuring the Quality of Interaction: Communication through Telepresence Robots. Proceedings of the Performance Metrics for Intelligent Systems Workshop (PerMIS), College Park, Maryland, March 20-22, 2012.

Katherine M. Tsui, Adam Norton, Daniel Brooks, Holly A. Yanco, and David Kontak. Designing Telepresence Robot Systems for Use by People with Special Needs. Proceedings of the International Symposium on Quality of Life Technologies 2011: Intelligent Systems for Better Living, held in conjunction with RESNA 2011 as part of FICCDAT, Toronto, Canada, June 6-7, 2011.

Munjal Desai, Katherine M. Tsui, Holly A. Yanco, and Chris Uhlik. Essential Features of Telepresence Robots. Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, Woburn, MA, April 2011.

Katherine M. Tsui, Munjal Desai, Holly A. Yanco, and Chris Uhlik. Telepresence Robots Roam the Halls of My Office Building. Human-Robot Interaction (HRI) Workshop on Social Robotic Telepresence, Lausanne, Switzerland, March 6-9, 2011.

Katherine M. Tsui, Munjal Desai, Holly A. Yanco, and Chris Uhlik. Exploring Use Cases for Telepresence Robots. Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction, Lausanne, Switzerland, March 6-9, 2011.


Press Images

Click the thumbnail for a high-resolution photo.

"Hugo (an augmented VGo Communications' VGo telepresence robot) is being driven remotely to travel alongside a colleague, actively participating in a mobile conversation. The driver can be seen on Hugo's screen." Kate Tsui is the robot driver, and next to her is Adam Norton. Adam is an educator and designer working in the UMass Lowell Robotics Lab. Photo credit: John Fertitta.

"The top half of Hugo (an augmented VGo Communications' VGo telepresence robot) has been augmented with a light-up LED tie used to indicate the robot's current status. The driver can be seen on Hugo's screen." Photo credit: Adam Norton.