JetsonHacks

Developing on NVIDIA® Jetson™ for AI on the Edge

Intel RealSense D400 librealsense2 – NVIDIA Jetson TX Dev Kits

Intel has recently begun shipping the RealSense D435 and D415 depth cameras. Let’s start working on running them on the NVIDIA Jetson TX kits. Looky here:

Background

As you may recall, we were using librealsense with the previous generation of RealSense cameras, such as the R200. With the advent of the new D400 series RealSense cameras, Intel has upgraded librealsense to version 2.0 to support the new camera family and its features.

The new hardware introduces a couple of different video modes, as well as support for the camera’s on-board accelerometer and gyroscope. While the D435 in the video does not have the additional hardware, other cameras in the range do. As a result, librealsense requires modifications to the Jetson kernel Image and additional modules to support the new features.

The Jetson TX kits are embedded systems, so they don’t quite line up with the way that most developers think about the Linux desktop. The regular librealsense installers make several assumptions about how devices attach to the system, as well as assumptions about the kernel configuration, that do not match the Jetson.

The bottom line is that we need to build a new kernel to support the RealSense cameras. We’ll break this installation into two parts. The first part is installing librealsense itself. The second part will build a kernel that supports the cameras.

Librealsense2 Installation

On the JetsonHacks Github account, there is a repository named buildLibrealsense2TX. To download the repository:

$ cd $HOME
$ git clone https://github.com/jetsonhacks/buildLibrealsense2TX
$ cd buildLibrealsense2TX

Next, make sure that the RealSense camera is not attached to the system. Then install the library:

$ ./installLibrealsense.sh

Looking inside the script file, you will see that there are a couple of patches for the Jetson. The first patch is a workaround for some code related to an Intel-specific instruction set; the other is a workaround for the Industrial I/O (IIO) device detection.

The stock librealsense code appears to assume that an IIO device reports a device and bus number. On the Jetson, the ina3221x power monitors do not follow this protocol. The result is that a series of warnings is issued continuously as the library scans for HID devices that have been added (plugged in) to the system.

The library is looking for IIO HID devices (the accelerometer and gyroscope on a RealSense camera). The ina3221x is not a HID device, but appears during the IIO scan. Because the library does not find a device or bus number for the power monitor, it issues a warning to stderr (the console). The result is that the console gets spammed, which in turn incurs a performance penalty.

The workaround patch checks to see if the device detected is the power monitor before issuing a warning.

Similarly, the built-in camera module is controlled via CSI/I2C, not USB as librealsense expects. Again, a warning is sent to stderr by librealsense. There may be a clever way to determine whether the warning is about the on-board Jetson camera, but in this case the patch just comments out the warning.

After applying the patches, the script compiles the library, examples and tools:

  • The library is installed in /usr/local/lib
  • The header files are in /usr/local/include
  • The examples and tools are located in /usr/local/bin

The script also sets up a udev rule so that the RealSense camera is available in user space.
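For reference, librealsense udev rules are roughly of the following form. This is an illustrative excerpt, not necessarily the exact file the script installs; 8086 is Intel’s USB vendor ID, and 0b07 is the product ID a D435 reports:

```
# /etc/udev/rules.d/99-realsense-libusb.rules (illustrative excerpt)
# Match the Intel (vendor 8086) RealSense device and open up its permissions
# so that non-root applications can talk to it over USB
SUBSYSTEMS=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b07", MODE:="0666", GROUP:="plugdev"
```

If you ever edit the rules by hand, you can reload them without rebooting: `$ sudo udevadm control --reload-rules && sudo udevadm trigger`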

Once the library is installed, plug the camera into the Jetson, or into the Jetson through a powered USB 3.0 hub. You can then go and execute the tools and examples. For example:

$ cd /usr/local/bin
$ ./realsense-viewer

As shown in the video, you will be able to use the camera. However, if you examine the console from where the app is launched, you will notice that there are a couple of issues. First, some of the video modes are not recognized. Second, some of the frame meta-data is absent.

The video modes are identified in the Linux kernel, as is the frame meta-data. In order for this information to become available to librealsense, patches must be applied to the kernel and the kernel rebuilt.

You will need to determine whether the added information is important for your application.

Kernel and Modules

The changes that librealsense needs are spread across several files. Some of the changes relate to the video formats and frame meta-data. These changes are in the UVC video and V4L2 modules.

In previous versions of librealsense, we would build the UVC module as an external module. This was relatively simple. However, things have changed a little internally in the way that L4T 28.2 is configured. The V4L2 module is now built into the kernel Image file (it is an ‘internal’ module). The UVC driver can still be compiled as an external module.

The other new HID modules that the library uses are part of the IIO device tree. These modules rely on the internal module for IIO, as well as a couple of other support modules which must be enabled as internal modules.

As a result, this is a little tricky to support in a general-purpose way for development. First, there are some modules which need to be enabled, and they are a little picky in that some need to be internal modules. There are also patches that need to be applied to the stock kernel sources.

There are two ways to go about this. The first is the recommended way, where you work it all into your development process. The second way is to use a provided script which will attempt to build a stock kernel with the addition of the kernel support needed for librealsense. If something goes wrong during this second method, most likely you will be forced to reflash your Jetson because it is in a bad state.

In either case, you should be building your kernel on a freshly flashed Jetson. You will need ~3GB of free space for building the kernel.

Build steps

We’ve talked about building the kernel with modules before. Basically the steps are:

  • Get the kernel sources
  • Configure the kernel
  • Apply any needed patches
  • Make the kernel and modules
  • Install
  • Cross your fingers and hope

Typically you’re working on a development kit, where you have your own kernel modifications and configuration in place. You will need to do some configuration of the kernel for librealsense. Located in:

buildLibrealsense2TX/config/ (e.g. buildLibrealsense2TX/config/TX2)

there is a stock .config file with the changes needed for the librealsense library. You should diff this against a stock kernel configuration, and then add the changes to your development .config file.
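The diff workflow can be sketched with throwaway files. The file names and the CONFIG_HID_SENSOR_HUB symbol below are illustrative stand-ins, not the actual TX2 configuration shipped in the repository:

```shell
# Work in a scratch directory; nothing here touches a real kernel tree
mkdir -p /tmp/configdemo && cd /tmp/configdemo

# Stand-in for your stock development .config
printf 'CONFIG_ARM64=y\n# CONFIG_HID_SENSOR_HUB is not set\n' > stock.config

# Stand-in for buildLibrealsense2TX/config/TX2
printf 'CONFIG_ARM64=y\nCONFIG_HID_SENSOR_HUB=y\n' > realsense.config

# Lines prefixed with '>' are the options to fold into your own .config
# (diff exits nonzero when the files differ, hence the || true)
diff stock.config realsense.config || true
```

The same two-file diff works against your real configs; only the paths change.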

You can then apply the kernel patches:

$ ./applyKernelPatches.sh

These patches will fix up the camera formats, add the meta-data and so on.

At this point, you are ready to build the kernel and install it. As usual, you should make a backup of the stock image and modify /boot/extlinux/extlinux.conf to add an option to boot to either the stock image or the new image. Also, remember to set the local version in the .config file!
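Setting the local version is a one-line change to the .config. Here is a minimal sketch, run against a scratch copy so it cannot touch a real kernel tree; the “-realsense” suffix is just an example tag (stock L4T kernels typically use “-tegra”):

```shell
# Scratch copy only; point these paths at your real kernel tree when ready
mkdir -p /tmp/kerneldemo && cd /tmp/kerneldemo
printf 'CONFIG_LOCALVERSION=""\n' > .config

# Tag the kernel so `uname -r` distinguishes it from the stock image
sed -i 's/^CONFIG_LOCALVERSION=""/CONFIG_LOCALVERSION="-realsense"/' .config
grep CONFIG_LOCALVERSION .config
```

In extlinux.conf, the backup boot option is simply a second LABEL stanza whose LINUX line points at the saved stock Image.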

Just go for it!

You’re the type of person who lives on the edge. Doesn’t care about what others think. You just want it, and you want it now. Have I got the script for you!

If you are not concerned about kernel development and just want the camera up and running properly, you can run a script which will rebuild the kernel with all of the changes needed and install it. Just to be clear, this will install a stock kernel with the librealsense changes in place of whatever is currently there. If you have kernel modifications already installed, they will disappear.

Be forewarned though, sometimes when you live on the edge, you can fall over the edge. If something untoward happens during the build, it can render your Jetson brickly; you will need to reflash it.

For the install on a Jetson TX2:

$ ./buildPatchedKernelTX2.sh

After the installation, reboot, and you should be ready for goodness.

Note: We’ll provide a TX1 script soon.

If something does go wrong during the build, you may want to try to debug it. As part of its cleanup process, the buildPatchedKernel script erases all of the source files and build files that it has downloaded and built. You can pass nocleanup as a command line flag so it keeps those files around. Hopefully you can fix everything.

$ ./buildPatchedKernelTX2.sh --nocleanup

Actually, this script is more useful as a template for rebuilding your kernel with the librealsense changes.

Performance

As we note in the video, there appear to be some performance issues with the realsense-viewer application. It seems rather suspicious that when the program starts, 100% of one of the CPU cores is being used. This usually indicates that the GUI is not yielding at the bottom of its event loop.

Another issue is that most of librealsense is optimized for x86. This is to be expected; after all, it is an Intel product. Because the Jetson is ARM based, the code falls back to generic, unoptimized paths. If only there were a way to exploit parallel processing on a Jetson, which has 256 CUDA cores …

Making progress! Off to the next installment of the series.

Notes

  • In the video, installation was performed on a Jetson TX2 running L4T 28.2 (JetPack 3.2)
  • Librealsense 2.10.2
  • Intel RealSense D435 camera


50 Responses

    1. My understanding is that the cameras have been popular. I know I ordered the one I have several months ago; I believe it was in the third or fourth production batch. I’ve read that the backlog may take a month or two to clear up.

      1. My D435 camera came this week. Don’t know why one must pop off a door to plug in the USB cable. It’s not like it will work without the connection! But that little tripod is cute. Maybe it will work without the legs atop my tank robot. I’m hoping to get it to work with Rtabmap and Kinetic ROS and Jetson TX1.

  1. Love the NVIDIA ruler. Brought back a bunch of them from GTC. They are hard to get and NVIDIA considers them to be their most popular item at the employee store.

    1. I had to run a covert op into NVIDIA world headquarters, bribe about 20 people, sneak past three guards, and walk with a limp out of the building to smuggle mine out. It took about three months to plan the op, but now I have one!

  2. I’ve had a D435 on order for weeks and weeks. Does it still use a laser or did they switch over to LED’s for ranging? I can’t make sense out of their propaganda.

  3. A bit of looking over the software SDK indicates that the laser system is not very accurate at any distance. Have you guys found that to be true? Being a laser, it could be accurate to microns, easily, with the proper return circuitry. What actual accuracies have you guys found?

    1. It depends greatly on lighting conditions; indoors and outdoors have different accuracy. I’m not quite sure why you think a stereo depth camera would have micron accuracy at a price point of $200. It’s probably on the order of < 1%. For example, at 1 meter that would be 2.5mm to 5mm.

  4. There are quite a few laser sensors available now, standalone units, in the sub-$50 range with better accuracy than that. For a (bad) example, the Garmin Lidar-Lite V3 is $130 (but Garmin is extremely overpriced in ALL their products), going down to the Chinese knock-offs that are under $50. And they all have full LIDAR capability, whereas the D400s don’t do multi-sampling of the return waveform that gives the LIDAR capabilities. Not saying that it should, but it should have way better resolution than it appears to have. That’s what I was trying to find out. I bet that’s a contributor to why the point clouds are less than wonderfully accurate. Within its margins, some good work can be performed, but it could be lots better, is my guess.

    1. You’re comparing a single point LIDAR to a 1280×720 depth camera? Not quite sure what the point of that is. The two are not the same category of device.

  5. One person used the D435 to sample a flat surface, for example. The resultant point cloud was anything BUT flat. And that is disappointing.

    1. That certainly sounds like good news for you. You should be able to cancel your order and put the money towards a solution that better meets your needs.

  6. Thanks for the article!
    I had issues while installing 2.10.2 on a TX1 running L4T 28.2 (JetPack 3.2). I paused when I came across this article. It has been a while now waiting for an update; have you tested installation on the TX1 yet?
    Meanwhile, librealsense 2.10.4 is available, so I will start trying again. How different can the problems be between the TX1 and the TX2?

      1. I had started following your old article on realsense and the TX1 and also referenced the Intel librealsense instructions. I discovered problems along the way, like low memory on the TX1, a missing repository, amd64-related sources, etc. They had solutions online already. I forgot most of the stuff other than some bookmarks I found useful 😛

        Last issue where I was stuck: the patches were getting stuck at “File to patch:” and I wasn’t sure which path to give, or why others haven’t faced a similar problem. Keeping the patch files in the folder “Kernel 4.4” made the patches run successfully.

        When I installed librealsense on my laptop for the SR300 two years ago, I remember having a uvcvideo.ko file, and the module was working before I started the patches. The patches updated the module for the kernel, and the .ko file was copied to the right place. But I do not have any uvc module on the freshly flashed TX1. Do I need to compile this module separately before patching?

        Tonight, I will start from fresh flash on TX1, moving to SSD and then install everything and also have a reference: https://github.com/IntelRealSense/librealsense/issues/1424

        If you have any suggestions already, please let me know.

        1. These instructions are for the new D400 series of cameras. Older cameras, like the SR300 and R200, use the legacy RealSense library as described in the previous article. In this version (0.5 release) of these scripts, the Jetson TX1 is not supported; the kernel sources for the TX1 have not yet been updated in the buildJetsonTX1Kernel repository.
          You must match the kernel sources with the kernel you are building.

          1. Yes, bad idea to mix new cameras with old patches. I can see the commits for the TX1; I hope it will be available soon.
            I just tried the TX2 scripts on a TX1 (wanted to see where they fail). Other than the camera stuff, I noticed that the wifi module stopped.
            Are you also considering wifi drivers while building the new kernel?
            I checked that you are not using wifi in the above video. I would like to use wifi.
            Just in case it helps: if you go for librealsense 2.10.4, you need to comment out the -mavx flag in the CMakeLists.

        1. The error message says that you are missing the diagnostic_updater package’s CMake configuration. Did you try to install it? i.e.

          $ sudo apt-get install ros-kinetic-diagnostic-updater

  7. Hello,

    I am following your tutorial but I am having errors when I run “./installLibrealsense.sh”. These are the errors I got:
    CMake Error at /usr/share/cmake-3.5/Modules/FindCUDA.cmake:617 (message):
    Specify CUDA_TOOLKIT_ROOT_DIR
    Call Stack (most recent call first):
    CMakeLists.txt:61 (find_package)

    -- Configuring incomplete, errors occurred!
    See also “/home/nvidia/librealsense/build/CMakeFiles/CMakeOutput.log”.
    Building librealsense, headers, tools and demos
    make: *** No targets specified and no makefile found. Stop.
    Installing librealsense, headers, tools and demos
    make: *** No rule to make target ‘install’. Stop.
    Library Installed

    —————————————–
    The library is installed in /usr/local/lib
    The header files are in /usr/local/include
    The demos and tools are located in /usr/local/bin

    And the libraries are not installed nor the demos.

    Any idea about what could be the problem?

      1. I am using version 3.2 and I haven’t installed any CUDA libraries yet. I just got the Jetson. So what should I install before following this tutorial?

      2. I followed your tutorial for CUDA installation
        https://jetsonhacks.com/2018/04/25/now-cuda-intel-realsense-d400-cameras-nvidia-jetson-tx/
        but I still got the same error:

        CMake Error: The source directory “/home/nvidia” does not appear to contain CMakeLists.txt.
        Specify --help for usage, or press the help button on the CMake GUI.
        Building librealsense, headers, tools and demos
        make: *** No targets specified and no makefile found. Stop.
        Installing librealsense, headers, tools and demos
        make: *** No rule to make target ‘install’. Stop.
        Library Installed

        —————————————–
        The library is installed in /usr/local/lib
        The header files are in /usr/local/include
        The demos and tools are located in /usr/local/bin

        —————————————–

        and there is nothing in these folders (/usr/local/bin , /usr/local/include , /usr/local/lib)

        1. Is CUDA installed on the Jetson? What commands did you execute when you get the error? Did you try to execute the commands in the script separately?

          1. I solved the problem an hour ago. Thank you. I flashed the Jetson and started everything again. After installing CUDA and fixing a couple of things in the build folder in the librealsense repo, it worked.

    1. No, I don’t think there is enough bandwidth for 3 USB 3.0 cameras through the one USB port. The hub allows you to plug in 3 cameras, but it’s still going into the one port. You would have to add something like a PCIe USB card for more bandwidth. Thanks for reading!

    1. These scripts have not been tested under JetPack 3.3. You will probably need to make sure that the version of the Linux kernel is the same between the two releases for the scripts to be successful. Thanks for reading!

  8. Can you please update this article for Xavier? I really want to be able to eliminate some of the UVC errors but I’m worried about bricking it since it’s a completely different device and OS version.

  9. Tried running the install script but it fails with “can’t find libusb-1.0”. There is a file called “libusb-1.0-0” in the filesystem. Is this the same library but with a new name? I’ve searched all the install scripts and can find no direct reference to either file name. Any suggestions on where to go from here, please?
    Thanks.

  10. Is it possible to install a newer version (at least 2.17) of librealsense using this install script? It took me a while to get the camera working as described in your article, so I am hesitant to apply any changes to my current configuration.

    Thanks in Advance!

  11. Hey there;
    I am using a RealSense D435i with a Jetson TX2. While running ./installLibrealsense.sh it asks “File to patch:”. How do I solve this?
