There were some good talks and workshops centered around Google’s Project Tango Tablet. The current Project Tango hardware has a Tegra K1 processor (the same processor as the Jetson Development Kit), which makes it a capable computer vision platform. On the hardware side, the device is a standard Android tablet with a wide-angle camera used for image processing and motion tracking. In addition, there is a depth camera which provides scene depth information, along with a sensor hub which supplies things such as very accurate time stamps. This hardware gives the device three major capabilities: motion tracking, area learning, and depth perception. The area learning is impressive. The device optically ‘remembers’ areas that have been traversed previously, all without using GPS. In one of the video demos, a user navigates to a particular office in a building while the Project Tango tablet overlays an augmented reality “arrow path” giving the user directions to get there.
If you were wondering what a development kit like the Jetson is used for, the Project Tango Tablet is an example of a device for which the Jetson would have served as the original platform for hardware development.
In the talk Project Tango Tablet: Application Rapid Fire Presentations, the first wave of application developers shared some of their experiences developing on the device. I especially liked one of the advantages the Tegra K1 brings: it “allows you to get more Zombies on the screen”. A good, informative talk.
Wil Braithwaite of NVIDIA gave a talk, Augmented Reality with Google’s Project Tango and NVIDIA Technology, that is worth taking the time to watch. Even though the Tegra K1 is a speedy little puppy, it’s still not a big dog, and there are a lot of design considerations and tradeoff decisions that need to be made to get the best user experience.