# Jetson Nano – RealSense Tracking Camera

The Intel RealSense T265 Tracking Camera solves a fundamental problem in interfacing with the real world by helpfully answering “Where am I?” Looky here:

## Background

One of the most important tasks in interfacing with the real world from a computer is to calculate your position in relationship to a map of the surrounding environment. When you do this dynamically, this is known as Simultaneous Localization And Mapping, or SLAM.

If you’ve been around the mobile robotics world at all (rovers, drones, cars), you probably have heard of this term. There are other applications too, such as Augmented Reality (AR) where a computing system must place the user precisely in the surrounding environment. Suffice it to say, it’s a foundational problem.

SLAM is a computational problem. How does a device construct or update a map of an unknown environment while simultaneously keeping track of its own location within that environment? People do this naturally in small places such as a house. At a larger scale, people have been clever enough to use visual navigational aids, such as the stars, to help build their maps.

The T265's V-SLAM solution does something very similar. Two fisheye cameras combine with information from an Inertial Measurement Unit (IMU) to track visual features and navigate accurately, even in unknown environments.

Let’s just say that this is a non-trivial problem. If you have tried to implement this yourself, you know that it can be expensive and time consuming. The Intel RealSense T265 Tracking Camera provides precise and robust tracking that has been extensively tested in a variety of conditions and environments.

The T265 is a self-contained tracking system that plugs into a USB port. Install the librealsense SDK, and you can start streaming pose data right away.
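To make "streaming pose data" concrete, here is a minimal sketch using librealsense's Python wrapper (pyrealsense2), modeled on the SDK's pose example. The `yaw_from_quaternion` helper is a hypothetical name for plain quaternion math (it works without the SDK); note that axis conventions for the T265 pose frame are worth double-checking against the SDK documentation.

```python
import math

def yaw_from_quaternion(w, x, y, z):
    """Extract heading (yaw, in radians) from a unit quaternion (ZYX convention)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def stream_pose(num_frames=100):
    # pyrealsense2 is imported here so the math helper above stays usable
    # even without the SDK installed.
    import pyrealsense2 as rs

    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)   # the T265's 6-DoF pose stream
    pipe.start(cfg)
    try:
        for _ in range(num_frames):
            frames = pipe.wait_for_frames()
            pose = frames.get_pose_frame()
            if pose:
                d = pose.get_pose_data()
                yaw = yaw_from_quaternion(d.rotation.w, d.rotation.x,
                                          d.rotation.y, d.rotation.z)
                print(f"xyz=({d.translation.x:.3f}, {d.translation.y:.3f}, "
                      f"{d.translation.z:.3f})  yaw={math.degrees(yaw):.1f} deg")
    finally:
        pipe.stop()
```

With the camera plugged in, calling `stream_pose()` prints translation and heading at up to 200 Hz, the rate of the pose stream.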

## Tech Stuffs

Here are some tech specs:

### Cameras

• OV9282
• Global Shutter, Fisheye Field of View = 163 degrees
• Fixed Focus, Infrared Cut Filter
• 848 x 800 resolution
• 30 frames per second

### Inertial Measurement Unit (IMU)

• 6 Degrees of Freedom (6 DoF)
• Accelerometer
• Gyroscope

### Visual Processing Unit (VPU)

• Movidius MA215x ASIC (Application Specific Integrated Circuit)

The power requirement is 300 mA at 5 V (!!!). The package is 108 mm wide x 24.5 mm high x 12.5 mm deep, and the camera weighs 60 grams.

## Installation

To interface with the camera, Intel provides the open-source library librealsense. On the JetsonHacksNano account on GitHub, there is a repository named installLibrealsense, which contains convenience scripts to install librealsense.

In order to use the install script, you will need either to create a swapfile to ease an out-of-memory issue, or to modify the install script to run fewer jobs during the make process. In the video, we chose the swapfile route. To install the swapfile:

```shell
$ git clone https://github.com/jetsonhacksnano/installSwapfile
$ cd installSwapfile
$ ./installSwapfile.sh
$ cd ..
```
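For the curious, the sketch below shows the steps a swapfile installer typically performs; the actual installSwapfile.sh may differ, so read the script itself. The file path and size here are demo values only — a real Nano swapfile lives on the SD card (e.g. /mnt) and is usually several gigabytes.

```shell
SWAPFILE=/tmp/demo.swap
SWAPSIZE=16   # megabytes, for demonstration only; a real swapfile is ~4 GB

dd if=/dev/zero of="$SWAPFILE" bs=1M count="$SWAPSIZE"  # reserve the space
chmod 600 "$SWAPFILE"    # swap files must not be readable by other users
# Write the swap signature (mkswap is part of util-linux):
command -v mkswap >/dev/null && mkswap "$SWAPFILE"
# Activating the swap and making it persistent require root:
#   sudo swapon "$SWAPFILE"
#   echo "$SWAPFILE none swap sw 0 0" | sudo tee -a /etc/fstab
ls -lh "$SWAPFILE"
```

The extra swap space only needs to survive the librealsense build; after installation you can `swapoff` and delete the file if you prefer.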

You’re now ready to install librealsense.

```shell
$ git clone https://github.com/jetsonhacksnano/installLibrealsense
$ cd installLibrealsense
$ ./installLibrealsense.sh
```

While the installLibrealsense.sh script has the option to compile librealsense with CUDA support, we do not select that option. If you are using the T265 alone, there is no advantage to CUDA: the librealsense CUDA routines only convert images from the RealSense depth cameras (D415, D435, and so on).

The location of librealsense SDK products:

• The library is installed in /usr/local/lib
• The header files are in /usr/local/include
• The demos and tools are located in /usr/local/bin

Go to the demos and tools directory and check out the realsense-viewer application and all of the different demonstrations!

## Conclusion

The Intel RealSense T265 is a powerful tool for use in robotics and augmented/virtual reality. Well worth checking out!

### Notes

• Tested on Jetson Nano L4T 32.1.0
• If you have a mobile robot, you can send wheel odometry to the RealSense T265 through the librealsense SDK for better accuracy. The details are still being worked out.

1. Thanks, this looks great. Would it be possible to update the Xavier repo for JP 4.2? I’d imagine it would be very similar to the Jetson Nano

2. Thank you! I can always find great information from your blog.

By the way, just curious, do you have any insights of using ZED Mini camera in this case?

• You are welcome. I do not have any experience with the ZED Mini and the Nano yet. Thanks for reading!

3. These two look like a dandy pair of devices for small robots! Will be interesting to see the machines made with them.

4. FYI: The software for the ZED cameras has been going through a big update this past year.

5. Thank you for ALL the information! It’s super helpful! I just got my T265 up and running on my Jetson Nano and have a D435i that I’ll be testing later. After setting everything up and getting the camera working through the RealSense Viewer, I’m having trouble exporting a .PLY file using the GUI. Have you had any luck with this? Thanks again! I’m fairly new to this world and watching your videos and reading your site has been a huge help!

• Thank you for the kind words. I have not tried exporting .PLY files. Thanks for reading!

6. Great tutorial, really easy to follow – however, I used the additional patches and such from the GitHub page in order to get the D435i working on the Nano as well, but the device does not get recognized – any ideas?

• It is difficult to tell from your description what the issue may be. Did you build and install everything on a fresh SD card? Or did you use a USB drive?

• I used a fresh 64GB SD card – followed the exact instructions from GitHub, watched the other video you posted about using the D435i, and it still won’t work. I saw a note in the GitHub comments that the patches don’t work for the “i” camera, and that they only work for the D435. Is this true?

• It is difficult to provide much help without knowing what “don’t work” might mean.
The patches work for both the D435i and the D435.

• My apologies, I should clarify, whenever the camera is plugged in the nano does not recognize it. When I run rs-enumerate-devices it says “No device detected. Is it plugged in?” and I receive the “DS5 group_devices is empty” warning when trying to run realsense-viewer.

• I added what I am hoping is a fix for your issue. v0.8 (the current master) addresses the problem. Please try it out and let us know how it goes!

7. Hi Kangalow,
Thank you for the post. I set everything up and it all worked until I opened realsense-viewer. The system powers off when I try to open realsense-viewer. Do I have to use 5V-4A instead of 5V-2A if I power mouse/keyboard/T265 at the same time?

• It depends on how you intend to use it. If you are running the Jetson Nano in 10W mode, the Nano module alone can draw 2A. If you want to run peripherals in 10W mode as well, you will need more current. Thanks for reading!
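A quick back-of-the-envelope budget makes this concrete, using the 300 mA figure from the specs above and the approximate 2A module draw from this reply (both figures are worth verifying for your exact setup and power mode):

```python
# Rough Jetson Nano power budget with a T265 attached (approximate figures).
supply_v = 5.0
supply_a = 2.0        # a 5V-2A supply, i.e. 10 W total
nano_module_a = 2.0   # Nano module alone in 10 W mode (approx.)
t265_a = 0.3          # T265 spec: 300 mA at 5 V

t265_w = supply_v * t265_a             # power drawn by the camera
headroom_a = supply_a - nano_module_a  # current left over for peripherals

print(f"T265 draw: {t265_w:.1f} W")
print(f"Current left for peripherals on a 2A supply: {headroom_a:.1f} A")
# With a 5V-2A supply there is no headroom for the T265 plus keyboard and
# mouse, consistent with the power-off described above; a 5V-4A supply
# leaves roughly 2 A of headroom.
```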

8. Is there any solution to get the camera working with the realsense2_camera ros package on the Jetson Nano?