Welcome to the Project Tango for Robotics page! Here you will find documentation, videos, and links for all publicly available work that uses Project Tango devices with robots. This page is brought to you by the UMass Lowell Robotics Lab and the NERVE Center.

If you have a project that you'd like to see featured here, please e-mail anorton[at]cs.uml.edu.

Join the Project Tango Developer's Community on Google+, and check out the Robotics flag!


What is Project Tango?

The goal of Project Tango, developed by Google ATAP, is to give mobile devices a human-scale understanding of space and motion. Project Tango devices contain customized hardware and software designed to track the full 3D motion of the device while simultaneously creating a map of the environment. Their sensors allow a device to make over a quarter million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the space around it.

The devices run Android and include development APIs that provide position, orientation, and depth data to standard Android applications written in Java or C/C++, as well as to the Unity game engine.
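
As a rough sketch of what the C API looks like in practice, the snippet below registers a pose callback. It is based on the early tango_client_api.h header; exact function names and signatures varied between SDK releases, so treat it as illustrative rather than definitive.

    // Sketch of receiving pose updates through the Tango C API.
    // NOTE: based on the early tango_client_api.h; names and signatures
    // changed across SDK releases, so verify against your SDK version.
    #include <cstdio>
    #include <tango_client_api.h>

    // Invoked by the Tango service whenever a new pose estimate is ready.
    static void onPoseAvailable(void* /*context*/, const TangoPoseData* pose) {
      if (pose->status_code == TANGO_POSE_VALID) {
        std::printf("t=%.3f  p=(%.3f, %.3f, %.3f)  q=(%.3f, %.3f, %.3f, %.3f)\n",
                    pose->timestamp,
                    pose->translation[0], pose->translation[1], pose->translation[2],
                    pose->orientation[0], pose->orientation[1],
                    pose->orientation[2], pose->orientation[3]);
      }
    }

    void connectPoseCallback() {
      // Ask for the device pose relative to the start-of-service frame.
      TangoCoordinateFramePair pair;
      pair.base   = TANGO_COORDINATE_FRAME_START_OF_SERVICE;
      pair.target = TANGO_COORDINATE_FRAME_DEVICE;
      TangoService_connectOnPoseAvailable(1, &pair, onPoseAvailable);
    }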

More information on Project Tango can be found here: https://www.google.com/atap/projecttango/


Bosch Research and Technology Center

The robotics team at the Bosch Research and Technology Center in Palo Alto, CA has been working with the recently released Project Tango mobile device.

The first package they developed is a set of applications that parse Tango Mapper log files and convert them to ROS bagfiles. To show the results of their work, Bosch collected data while walking through their robotics lab.
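
As a hedged illustration of the bag-writing half of that pipeline, the sketch below uses the rosbag C++ API to turn pose log lines into odometry messages. The "timestamp tx ty tz qx qy qz qw" line format and the file/topic names here are placeholders, not the actual Tango Mapper format (see the repo below for that).

    // Minimal sketch: convert a pose log into a ROS bagfile.
    // The log format and names below are hypothetical placeholders.
    #include <fstream>
    #include <sstream>
    #include <string>

    #include <nav_msgs/Odometry.h>
    #include <rosbag/bag.h>

    int main() {
      std::ifstream log("tango_mapper_poses.txt");  // hypothetical input file
      rosbag::Bag bag("tango.bag", rosbag::bagmode::Write);

      std::string line;
      while (std::getline(log, line)) {
        std::istringstream ss(line);
        double t, tx, ty, tz, qx, qy, qz, qw;
        if (!(ss >> t >> tx >> ty >> tz >> qx >> qy >> qz >> qw)) continue;

        nav_msgs::Odometry odom;
        odom.header.stamp = ros::Time(t);
        odom.header.frame_id = "start_of_service";  // made-up frame name
        odom.pose.pose.position.x = tx;
        odom.pose.pose.position.y = ty;
        odom.pose.pose.position.z = tz;
        odom.pose.pose.orientation.x = qx;
        odom.pose.pose.orientation.y = qy;
        odom.pose.pose.orientation.z = qz;
        odom.pose.pose.orientation.w = qw;

        // Writing with the log's own timestamps lets `rosbag play`
        // reproduce the original timing.
        bag.write("/tango/odom", odom.header.stamp, odom);
      }
      bag.close();
      return 0;
    }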

GitHub repo link: https://github.com/Project-Tango-for-Robotics/tango-to-bagfiles


Kitware

Kitware has been working with the Google Project Tango team to help create software dashboards with CDash and to track algorithm performance over time with MIDAS.

Kitware has also used a Project Tango device with ParaView, its open-source data analysis and visualization application.

Kitware blog link with step-by-step instructions: http://www.kitware.com/blog/home/post/650


OLogic

Visual Inertial Odometry (VIO) using ROS and RViz
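
As a generic illustration (not OLogic's actual code), a VIO node typically publishes its estimate as nav_msgs/Odometry so RViz can render it. Here is a minimal roscpp publisher sketch; the topic and frame names are made up.

    // Generic sketch: publish a VIO pose estimate as nav_msgs/Odometry
    // so it can be visualized in RViz. Topic/frame names are made up.
    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "vio_publisher");
      ros::NodeHandle nh;
      ros::Publisher pub = nh.advertise<nav_msgs::Odometry>("vio/odom", 10);

      ros::Rate rate(30);                      // typical camera-rate output
      while (ros::ok()) {
        nav_msgs::Odometry odom;
        odom.header.stamp = ros::Time::now();
        odom.header.frame_id = "odom";         // fixed frame for RViz
        odom.child_frame_id = "device";
        // ... fill odom.pose / odom.twist from the VIO estimator here ...
        pub.publish(odom);
        rate.sleep();
      }
      return 0;
    }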

GitHub repo link: https://github.com/Project-Tango-for-Robotics/Tango


University of Massachusetts Lowell Robotics Lab

The UMass Lowell Robotics Lab is currently developing ways to link Project Tango with ROS. Depth and odometry information from the C API on the Tango device is streamed and published into ROS. Concurrent color image streaming is also a work in progress.
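
As a rough sketch of the ROS side of that stream (the lab's actual node is in the repo below), depth points can be packed into a sensor_msgs/PointCloud2 and published. The frame name and the flat float-triple input format here are assumptions.

    // Sketch: publish streamed depth points as a PointCloud2.
    // The frame name and flat x,y,z,x,y,z,... input are assumptions.
    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>
    #include <sensor_msgs/point_cloud2_iterator.h>
    #include <vector>

    void publishCloud(ros::Publisher& pub, const std::vector<float>& xyz) {
      sensor_msgs::PointCloud2 cloud;
      cloud.header.stamp = ros::Time::now();
      cloud.header.frame_id = "tango_device";  // made-up frame name

      // Lay out an unorganized cloud with float32 x/y/z fields.
      sensor_msgs::PointCloud2Modifier mod(cloud);
      mod.setPointCloud2FieldsByString(1, "xyz");
      mod.resize(xyz.size() / 3);

      sensor_msgs::PointCloud2Iterator<float> ix(cloud, "x"), iy(cloud, "y"),
                                              iz(cloud, "z");
      for (size_t i = 0; i + 2 < xyz.size(); i += 3, ++ix, ++iy, ++iz) {
        *ix = xyz[i];
        *iy = xyz[i + 1];
        *iz = xyz[i + 2];
      }
      pub.publish(cloud);
    }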

A large facet of research out of the Robotics Lab is focused on robots used in urban search and rescue (USAR) scenarios. The work currently under development will enable technology for the USAR domain, in which both Tango and Google Glass would play a role. Below is a video showing the Tango C API being used to visualize the phone moving with a point cloud in RViz, along with a GitHub repository so you can try the code yourself.

GitHub repo link: https://github.com/uml-robotics/ros_tango_native_stream


University of Pennsylvania GRASP Laboratory

The team at the GRASP Lab of the University of Pennsylvania has been working to provide autonomy to flying robots using the Tango device. The current vehicle is the first autonomous commercial off-the-shelf flying platform that relies on a camera phone. Control, estimation, and mapping are all performed onboard the vehicle.

Autonomous Quadrotor Flight

The Tango device, together with an onboard IMU, is used to estimate the pose of the quadrotor. All necessary computing is performed onboard the robot; the laptop in this video is only used to send trajectories to the robot and for monitoring and visualization.
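
Purely as a generic illustration of combining a low-rate visual pose with a high-rate IMU (not the GRASP Lab's actual estimator, which is described in the paper below), here is a complementary-filter attitude sketch using Eigen.

    // Generic complementary-filter sketch (not the GRASP Lab's estimator):
    // integrate the gyro at high rate, then pull the attitude toward the
    // low-rate visual estimate each time a device pose arrives.
    #include <Eigen/Geometry>

    struct AttitudeFilter {
      Eigen::Quaterniond q = Eigen::Quaterniond::Identity();

      // IMU update (e.g. ~200 Hz): integrate angular velocity [rad/s] over dt.
      void imuUpdate(const Eigen::Vector3d& gyro, double dt) {
        const double angle = gyro.norm() * dt;
        if (angle > 1e-12) {
          q = q * Eigen::Quaterniond(Eigen::AngleAxisd(angle, gyro.normalized()));
          q.normalize();
        }
      }

      // Visual update (e.g. ~30 Hz): blend a small fraction alpha toward the
      // measured attitude, correcting gyro drift without adding jitter.
      void visionUpdate(const Eigen::Quaterniond& q_meas, double alpha = 0.05) {
        q = q.slerp(alpha, q_meas);
      }
    };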

Vijay Kumar's Lab: http://www.kumarrobotics.org/

GRASP Lab's YouTube Channel

Article on the IEEE Spectrum website

G. Loianno, G. Cross, C. Qu, Y. Mulgaonkar, J. A. Hesch, and V. Kumar, "Smart Phones Power Flying Robots," IEEE Robotics and Automation Magazine, submitted.