In this article, we cover ROS installation of the Intel RealSense Camera on the NVIDIA Jetson TK1. Note: This article was updated 9-9-16. Looky here:
Note: This article is intended for intermediate to advanced users who are familiar with ROS.
One of the intended uses of Intel RealSense Cameras is robotics. The premiere operating system for robots is Robot Operating System (ROS). A great platform for running ROS and RealSense Cameras? Jetson TK1! Let’s get the shotgun out and have a wedding!
Using the information in this article, you will:
- Build a new kernel for the Jetson TK1
- Build the RealSense Camera driver library
- Install ROS on the Jetson TK1
- Install the RealSense Camera ROS Node
- Be fulfilled as a human being
Let’s get started. The first step is to read through this article to get familiar with the steps involved. Then start installation.
In order to get started, there are three prerequisites. These instructions are based on a freshly flashed Jetson TK1; the one in the video was flashed using JetPack 2.2, which installed L4T 21.4. You should also have Git installed:
$ sudo apt-get install git
Note: The installation in the video was done on a fresh Linux for Tegra (L4T) JetPack installation with the standard JetsonHacks postFlash done afterwards. This sets up the USB ports appropriately, and eliminates the need to run the setupTK1 script mentioned later.
First, librealsense needs to be installed on the Jetson TK1. This involves building a new kernel to support the video modes of the RealSense camera in the UVC module, and building the librealsense library. Here is an article on how to do that, librealsense installation on Jetson TK1. Make sure that you remember to run the setupTK1 script to assign the USB port to USB 3.0 speeds and turn USB auto-suspend off. At the end of the RealSense camera installation, you should run some of the examples provided to make sure that the camera is installed and works properly.
A good practice is to make a clone of the image after building a new kernel, so that if something goes wrong later (that could never happen, could it?) you don’t have to start again from scratch.
Second, we need to have ROS installed on the Jetson. Once the RealSense camera is working, from a new Terminal install ROS:
$ git clone https://github.com/jetsonhacks/installROS.git
$ cd installROS
$ ./installROS.sh
$ cd ..
This will install ROS Indigo ros-base, rosdep, and rosinstall.
Note: In an earlier article on installing ROS on the Jetson TK1, we used the Grinch kernel. The Grinch kernel provides access to a large number of peripherals on the TK1. Because we modified the stock kernel for the RealSense camera in our first step, the Grinch kernel is not used here. If you need the Grinch kernel, you will have to recompile it with the RealSense camera changes. That is an exercise left to the reader.
Third, we download the realsense_camera package installer:
Open a new Terminal, which will source the new environment setup by ROS, and:
$ git clone https://github.com/jetsonhacks/installRealSenseCameraROS.git
$ cd installRealSenseCameraROS
We need a Catkin Workspace for our base of operations. There is a convenience script that creates a new Catkin Workspace.
$ ./setupCatkinWorkspace [workspace name]
In the video above, jetsonros is the workspace name. This script creates an initialized Catkin Workspace in the ~/ directory.
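If you are curious what the script does, its effect is roughly the following (a sketch only; the actual script may differ, and the workspace name jetsonros is the one used in the video):

```shell
# Rough manual equivalent of setupCatkinWorkspace.
WS="$HOME/jetsonros"
mkdir -p "$WS/src"
# The remaining steps need a sourced ROS environment, so they are
# shown commented here:
# source /opt/ros/indigo/setup.bash
# cd "$WS/src" && catkin_init_workspace
# cd "$WS" && catkin_make
echo "Workspace skeleton created at $WS"
```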
With the prerequisites installed, we’re ready to install the realsense_camera package:
$ ./installRealSense.sh [workspace name]
where [workspace name] is the name of the Catkin Workspace where you want the realsense_camera package installed. In the video, the workspace name used is jetsonros.
If you do not have a swap file enabled on your Jetson, the package may fail to compile because the TK1 does not have enough memory to build it in one pass. The installation script has been changed since the video was filmed to compile using only one core, which relieves memory pressure, i.e.
$ catkin_make -j1
If this doesn’t fix the problem, refer to the video for a workaround.
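If you do want to add swap, the general recipe looks like the following. This is a sketch: the path and size are illustrative, and on a TK1 the swap file is usually placed on an attached USB or SATA drive rather than the eMMC.

```shell
# Create and format a swap file (path and size are illustrative;
# use something like 8G on a real drive).
SWAPFILE=/tmp/swapfile
fallocate -l 16M "$SWAPFILE"   # small size for illustration only
chmod 600 "$SWAPFILE"          # swap files should not be world-readable
mkswap "$SWAPFILE"             # write the swap signature
# Enabling it requires root:
# sudo swapon "$SWAPFILE"
# free -m   # the Swap row should then show the new space
```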
Note: As of this writing, the ROS package in the debian repository cv_bridge is hard linked against an OpenCV package which is not installed on the Jetson (2.4.8). There are several ways to get around this, discussed on the ROS Answers forum. For this installation, installing cv_bridge from source is chosen.
At this point, you are ready to launch the node.
Launch RealSense Camera Node
There are several launch files included in the realsense_camera package. They are covered in the README.md file in the realsense_camera directory. In order to launch the camera on the Jetson:
$ cd [catkin workspace], e.g. $ cd ~/jetsonros
$ source devel/setup.bash
$ roslaunch realsense_camera realsense_r200_nodelet_standalone_preset.launch
On your visualization workstation, you can view and adjust the camera configuration:
$ rosrun rqt_reconfigure rqt_reconfigure
If you intend to view a point cloud, you must set up a frame of reference, i.e.
$ rosrun tf static_transform_publisher 0.0 0.0 0.0 0.0 0.0 0.0 map camera_depth_optical_frame 100
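Rather than retyping the static_transform_publisher command each session, the same transform can live in a small launch file. This is a hypothetical file, not one shipped with the package; the args mirror the command above exactly:

```xml
<launch>
  <!-- Same transform as the command above: map -> camera_depth_optical_frame -->
  <node pkg="tf" type="static_transform_publisher" name="map_to_camera"
        args="0.0 0.0 0.0 0.0 0.0 0.0 map camera_depth_optical_frame 100" />
</launch>
```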
You can also open RViz and load the provided RViz configuration file, realsenseRvizConfiguration1.rviz:
$ roscd realsense_camera
$ rosrun rviz rviz -d rviz/realsenseRvizConfiguration1.rviz
Please read the README.md file for more information.
Note: You can install RViz on the Jetson TK1 using these instructions.
RealSense camera support under ROS is still relatively new. Since this article was initially published, there have been many changes to the ROS node package. The above scripts pin the software to the versions available at the publish date, but if you wish to use this package it will be worth investing some time in updating to later releases. Things are shaping up quite nicely for this new entry in the RGBD camera space.