Robotic Telepresence

[Photo: Margo, an augmented VGo Communications telepresence robot]

Commercial telepresence robots can be thought of as embodied video conferencing on wheels. Several companies now produce and sell telepresence robots that provide interactive two-way audio and video. Because these robots are mobile, the operator can also move about and explore as he/she desires. Companies have envisioned telepresence robots being used for a variety of applications, such as having ad-hoc office conversations, conducting patient rounds in hospitals, and touring manufacturing facilities.

During Summer 2010, we conducted a series of user studies of two telepresence robots (Anybots' QB and VGo Communications' VGo) in an office environment at Google in Mountain View, CA. One study focused on virtual teams, in which remote teammates (n=6) used a telepresence robot to attend their regularly scheduled team meetings. We found that people who had previously worked in the same building as their teammates and then moved to a different location had the best experiences recreating that closeness with their teams using the telepresence robots.

We hypothesize that people with disabilities who may live full time in a medical institution can gain similar benefits. For example, medically fragile students could use a robot to attend their regular classes at school (see CNN and Texomas). Elders in nursing homes and family members residing in medical institutions could "hang out" with loved ones as if they were physically together. To this end, we are beginning a new study, in conjunction with Crotched Mountain Rehabilitation Center, investigating whether telepresence robots can recreate the closeness of daily interactions and mitigate isolation.

 

We are also currently researching human-robot cooperative navigation in conjunction with teams at Tufts University, the University of Michigan, and Crotched Mountain Rehabilitation Center. A robotic wheelchair and two VGo Communications telepresence robots, Hugo and Margo, have been augmented to support this research. Margo's design has been replicated twice to create Tufts' Tobego and the University of Michigan's Largo.

We are investigating how people would like to interact with and instruct these robots, as well as what kinds of autonomous behaviors are needed. Our focus groups found that users are interested in a variety of access methods, personalization, and higher-level commands to the robot (e.g., go to a location, follow a person).
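To make this concrete, below is a minimal, purely hypothetical Python sketch of how such higher-level commands could be layered on top of a robot's lower-level navigation: named locations map to goal poses, and a "follow" command becomes a goal pose kept at a standoff distance behind the tracked person. This is not the software running on Hugo, Margo, Tobego, or Largo; all class names, locations, and parameters are illustrative assumptions.

from dataclasses import dataclass
import math

@dataclass
class Pose:
    x: float        # meters
    y: float        # meters
    heading: float  # radians

class HighLevelCommander:
    """Hypothetical example: translates user-facing commands into motion goals."""

    def __init__(self, known_locations):
        # Named places the user can ask for, e.g. {"meeting room": Pose(7.5, 4.1, 1.57)}.
        self.known_locations = known_locations
        self.current_goal = None

    def go_to(self, location_name):
        # "Go to <location>": look up a named place and hand it to the planner.
        if location_name not in self.known_locations:
            raise ValueError("Unknown location: " + location_name)
        self.current_goal = self.known_locations[location_name]
        return self.current_goal

    def follow(self, person_pose, standoff=1.0):
        # "Follow <person>": keep a fixed standoff distance behind the tracked person.
        self.current_goal = Pose(
            person_pose.x - standoff * math.cos(person_pose.heading),
            person_pose.y - standoff * math.sin(person_pose.heading),
            person_pose.heading,
        )
        return self.current_goal

if __name__ == "__main__":
    commander = HighLevelCommander({"meeting room": Pose(7.5, 4.1, 1.57)})
    print(commander.go_to("meeting room"))
    print(commander.follow(Pose(2.0, 2.0, 0.0)))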

 

Acknowledgements:

This research has been funded in part by NSF (IIS-0905228, IIS-0546309, IIS-1111125). We would like to thank Dr. Chris Uhlik of Google, as well as Anybots and VGo Communications for loaning us prototype robots.

 

Related Papers

Katherine M. Tsui, James M. Dalphond, Daniel J. Brooks, Mikhail S. Medvedev, Eric McCann, Jordan Allspaw, David Kontak, and Holly A. Yanco. Accessible Human-Robot Interaction for Telepresence Robots: A Case Study. Paladyn, Journal on Behavioral Robotics, Special Issue on Assistive Robotics. Volume 6, Issue 1, pp. 1-29, January 2015.

Katherine M. Tsui. PhD Thesis: The Development of Telepresence Robots for People with Disabilities. University of Massachusetts Lowell, Lowell, MA. April 2014. [84 MB high res | 14 MB low res]

Katherine M. Tsui, Adam Norton, Daniel J. Brooks, Eric McCann, Mikhail Medvedev, Jordan Allspaw, Sompop Sukawat, James M. Dalphond, Michael Lunderville, and Holly A. Yanco. Iterative Design of a Semi-Autonomous Social Telepresence Robot Research Platform: A Chronology. Intelligent Service Robotics Journal, Volume 7, Issue 2, pp. 103-119, 2014.

Katherine M. Tsui, Eric McCann, Amelia McHugh, Mikhail Medvedev, Holly A. Yanco, David Kontak, and Jill L. Drury. Towards Designing Telepresence Robot Navigation for People with Disabilities. International Journal of Intelligent Computing and Cybernetics, Special Issue on Robotic Rehabilitation and Assistive Technologies. Volume 7, Issue 3, pp. 307-344. 2014.

Katherine M. Tsui and Holly A. Yanco. Design Challenges and Guidelines for Social Interaction Using Mobile Telepresence Robots. Reviews of Human Factors and Ergonomics, Volume 9, Issue 1, November 2013, pp. 228-302.

Katherine M. Tsui, Kelsey Flynn, Amelia McHugh, Holly Yanco, and David Kontak. Designing Speech-Based Interfaces for Telepresence Robots for People with Disabilities. Proceedings of the IEEE International Conference on Rehabilitation Robotics (ICORR). Seattle, Washington, June 24-26, 2013. Selected for podium presentation.

Katherine M. Tsui, Adam Norton, Daniel J. Brooks, Eric McCann, Mikhail S. Medvedev, and Holly A. Yanco. Design and Development of Two Generations of Semi-Autonomous Social Telepresence Robots. Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications (TePRA), Woburn, Massachusetts, April 2013.

Katherine M. Tsui, Munjal Desai, and Holly A. Yanco. Towards Measuring the Quality of Interaction: Communication through Telepresence Robots. Proceedings of the Performance Metrics for Intelligent Systems Workshop (PerMIS), College Park, Maryland, March 20-22, 2012.

Katherine M. Tsui, Adam Norton, Daniel Brooks, Holly A. Yanco, and David Kontak. Designing Telepresence Robot Systems for Use by People with Special Needs. Proceedings of the International Symposium on Quality of Life Technologies 2011: Intelligent Systems for Better Living, held in conjunction with RESNA 2011 as part of FICCDAT, Toronto, Canada, June 6-7, 2011.

Munjal Desai, Katherine M. Tsui, Holly A. Yanco, and Chris Uhlik. Essential Features of Telepresence Robots. Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, Woburn, MA, April 2011.

Katherine M. Tsui, Munjal Desai, Holly A. Yanco, and Chris Uhlik. Telepresence Robots Roam the Halls of My Office Building. Human-Robot Interaction (HRI) Workshop on Social Robotic Telepresence, Lausanne, Switzerland, March 6-9, 2011.

Katherine M. Tsui, Munjal Desai, Holly A. Yanco, and Chris Uhlik. Exploring Use Cases for Telepresence Robots. Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Lausanne, Switzerland, March 6-9, 2011.

 

Press Images

Click the thumbnail for a high-resolution photo.

"Hugo (an augmented VGo Communication's VGo telepresence robot) is being driven remotely and being used to walk alongside a colleague, actively participating in a mobile conversation. The driver can be seen on Hugo's screen." Kate Tsui is the robot driver and next to her is Adam Norton. Adam is an educator and designer working in the UMass Lowell Robotics Lab. Photo credit goes to John Fertitta.

"The top half of Hugo (an augmented VGo Communication's VGo telepresence robot) has been augmented with a light-up LED tie used to indicate the robot's current status. The driver can be seen on Hugo's screen." Photo credit goes to Adam Norton.