JetsonHacks

Developing on NVIDIA® Jetson™ for AI on the Edge

Install Kinect V2 – Part II

Update

On April 2nd, Microsoft announced that they are consolidating the Kinect product line around a single sensor and will discontinue the “Kinect for Windows V2” described below. The replacement is a regular Xbox One Kinect sensor bar along with a Kinect Adapter for Windows, which together provide a functionally equivalent solution. You will need both a Kinect for Xbox One sensor and the adapter for use with the Jetson. The Kinect Adapter for Windows converts the output from the Kinect to USB 3.0. The advantage of this setup is that you can use the Kinect sensor from your Xbox One, or at least have an excuse to get an Xbox One + Kinect for “research” purposes.

Introduction

Open source drivers (OpenKinect libfreenect2) are available for the Kinect for Windows V2. Most installations target PC-based, x86 machines. The Jetson TK1, being an ARM v7 machine, poses some special challenges for porting the source over. Most of those issues have been addressed, and an installation script is now available. Looky here:
http://youtu.be/URhu-fAUWWQ

While most of the porting issues are straightforward, some special code needed to be written to get good performance out of the Kinect sensor on the Jetson. There are also some hardware issues that need to be addressed.

Fortunately, Lingzhu Xiang added Tegra support for the Jetson to libfreenect2, which addressed the performance issues. The code is available on Github at: https://github.com/xlz/libfreenect2. Here is the “how to”: Jetson TK1 HOWTO. The Tegra code allows frame processing at 60Hz. Way excellent!

You might remember from an earlier post (Kinect V2 with libfreenect2) that there were issues using the built-in USB controller, so a Syba 19-Pin USB 3.0 Header Mini PCI-Express Card with Female USB 3.0 Cable SD-MPE20142 was acquired. Note: It is a full size Mini PCI-Express card, so it does hang over the edge of the Jetson when installed. Also, the card requires a floppy drive power connector; I found one at a local Fry’s Electronics store that hooks into the Jetson’s onboard Molex power connector.

Note: Please read the Built In USB section below for notes on how to get this to work with the full size USB 3.0 connector on the Jetson.

Also, remember that in order to use the Kinect for Windows V2 with the Jetson, the Kinect needs to be initialized with a Windows 8.1 machine. This should be done before plugging the Kinect into the Jetson for the first time.

This installation was run on a Jetson with L4T 21.2, with OpenCV and CUDA 6.5 installed
(see JetPack installation). After flashing, the Post Flash Setup procedure was performed, which prepares the Jetson for desktop operation. With JetPack and the post flash setup done, the Jetson is ready as the starting point for the libfreenect2 installation.

Another note: USB autosuspend on the Jetson needs to be disabled for the Kinect V2 to work correctly. This is taken care of by the post flash setup, which installs a startup script. WARNING: Replugging the Kinect may enable autosuspend again. To disable it, execute:

$ sudo sh -c 'for dev in /sys/bus/usb/devices/*/power/autosuspend; do echo -1 >$dev; done'

You can check the result:

$ grep . /sys/bus/usb/devices/*/power/autosuspend

which should show all entries as -1.
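
For reference, here is a minimal sketch of what such a startup script can look like; it simply wraps the command above so it runs at boot with root privileges (the actual script installed by the post flash setup may differ).

#!/bin/sh
# Disable USB autosuspend for every USB device so the Kinect V2 stream is not interrupted.
for dev in /sys/bus/usb/devices/*/power/autosuspend; do
    echo -1 > "$dev"
done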

Updates

Since the first article, a couple of issues have been addressed in xlz’s repository.

First, xlz reverted a previous libfreenect2 commit to fix MAX_ISO_BUFFER_LENGTH, which was causing libusb issues.

Second, libfreenect2 normally installs libopencv-dev, which overwrites the CUDA-accelerated opencv4tegra library. The install script on the How-To removes libopencv-dev and adds libjpeg-turbo8-dev, which provides the ‘turbojpeg.h’ header file.
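
In other words, the script does something roughly equivalent to the following (a sketch, not the exact script contents):

$ sudo apt-get remove libopencv-dev
$ sudo apt-get install libjpeg-turbo8-dev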

Installation

The demo that I originally showed included a colormapped depth image, which required changes to Protonect. I also wrote an installation script so that the installation process could be automated. I still have an updated fork of xlz’s repository, but there aren’t that many changes in it.
The installation script is available on Github:

Download it to the Jetson. In the video above, the script is placed into the Home (~/) directory.

$ sh installLibfreenect2.sh

This will install the code dependencies, download the libfreenect2 source and dependent libraries, and compile them.
Once the installation is complete, you can reboot the Jetson. This enables the udev rules for the Kinect, which allow user access to the device. If you do not reboot the Jetson, you will have to use the ‘sudo’ command before running the demo.
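
For the curious, the script roughly automates the steps below. This is only a sketch based on the description above; the actual script also takes care of the dependent libraries in the ‘depends’ directory, the udev rules, and the Protonect changes.

$ sudo apt-get install build-essential cmake libjpeg-turbo8-dev   # plus the other build dependencies
$ git clone -b jetsontk1 https://github.com/xlz/libfreenect2.git   # the script uses the updated fork mentioned above
$ cd libfreenect2/examples/protonect
$ cmake CMakeLists.txt
$ make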

Demo

Switch to the demo directory:

$ cd libfreenect2/examples/protonect/bin

(This assumes that libfreenect2 was installed in your home directory). To run the demo:

$ ./Protonect

If all went well, you should see the depth and RGB images.

Built In USB

First, the mPCIe card mentioned above does not appear to have any issues and seems to work reliably. If the Kinect V2 is plugged into the card’s port when the system boots, it works properly. The Kinect can also be hot plugged into the card’s port and it works properly.

The Jetson’s native USB 3.0 port works, but seems less reliable and requires a few extra steps.

Notes: The term ‘hub’ refers to an external, powered, 7-port USB 3.0 hub connected to the full size USB connector on the Jetson. I am using the hub with a keyboard and mouse plugged into it, along with the Kinect. Before running Protonect each time, I checked the autosuspend state to make sure every entry was -1. Replug means to unplug the device, wait a few seconds, and then plug it back in. Protonect is an example program which displays the RGB, infrared and depth camera streams.

FAILURE:
If the Jetson is booted with the hub connected and the Kinect plugged into the hub, Protonect fails when executed. Protonect gets to the point where it prints out ‘device firmware: 4.3.3912.0.7’ and then deadlocks in waitForNewFrame. No windows appear, but the [TegraJpegRgbPacketProcessor] heartbeat message appears. The [CudaDepthPacketProcessor] heartbeat does NOT appear.

Subsequent attempts to run Protonect also fail. Protonect fails after it prints out the 92 bytes of raw data and then:

[code][CommandTransaction::receive] bulk transfer failed! libusb error -1: LIBUSB_ERROR_IO
Segmentation fault[/code]

This happens in the case where the Kinect remains in place, the case where the Kinect is replugged, and the case where the hub is replugged.

SUCCESS:
A) Boot the Jetson with the USB hub connected. Do not run Protonect.
B) Once the Jetson boots, replug the hub.
C) Turn off autosuspend.
D) Run Protonect.

This brings up the program as expected.

The Kinect can be plugged into the hub at boot time, or after the replug. I did have issues in one case: the Jetson booted without the Kinect in the hub, then I unplugged the hub, plugged the Kinect into the hub, and plugged the hub back into the Jetson. In that case, the keyboard and mouse failed to be recognized. That could be a hub issue.

I played with the different combinations for several hours. The boot-then-replug-the-hub solution worked consistently, though there were times when the Kinect did not show up in “$ lsusb -t” after the replug and the hub had to be replugged once again.

Since the L4T kernel lags behind the mainstream Linux one, it’s possible that the above issue has been addressed. At this point, my curiosity has been more than satisfied, and I’m going to rate Kinect V2 support as working if still experimental. With the addition of xlz’s Tegra code, it is also at the point of being usable.

Here’s the unboxing and demo video for what it looks like when it’s up and running:

84 Responses

  1. I’m trying to get something similar working, but my syba pcie card doesn’t seem to be detected. Did you have to do anything special, or was it just plug-and-play? e.g. lspci doesn’t show anything beyond the TK1’s builtins, and there is an error message about link 0 being down during probing.

    1. I did not have to do anything special. Even if it doesn’t show up as a USB hub, it should show up in lspci. I’m assuming that you have reseated the card a couple of times. I am running L4T 21.2, if that’s of any help.

      1. Ok, maybe I have a bad card or board. Yes, I did try different combinations of replugging and power cycling, to no avail. I tried both 21.2 and the new 21.3. Thanks all the same =)

  2. >the Kinect needs to be initialized with a Windows 8.1 machine.

    Please tell me how to do it. Just install the drivers and plug in the Kinect?

    1. This was several months ago for me; I don’t recall what the exact procedure was. Please note that Microsoft has discontinued this particular product. The replacement is a Kinect for Xbox One with an add-on adapter.

      As I remember, you plug the Kinect for Windows into the Windows 8.1 box. It is a plug and play device. You can then download the SDK samples from Microsoft (the links are available from the packaging material included with the Kinect), and run sample programs. The camera is then ready to be used on the Jetson.

      Hope this helps.

      1. kangalow, Thanks for your response

        >Please note that Microsoft has discontinued this particular product. The replacement is a Kinect for Xbox one with an add-on adapter.
        I am trying to bring “Kinect for Xbox one with an add-on adapter” up right now.
        And now I see why the first one was out of stock everywhere.

        Now I am running L4T 21.3 and have this:

        ubuntu@tegra-ubuntu:~/tmp/libfreenect2/examples/protonect/bin$ ./Protonect
        terminate called after throwing an instance of 'std::runtime_error'
        what(): JPEG parameter struct mismatch: library thinks size is 560, caller expects 536
        Aborted

        It seems there are lots of troubles with USB 3.0 controllers and hubs, even for ordinary Windows PCs:
        https://social.msdn.microsoft.com/Forums/en-US/bb379e8b-4258-40d6-92e4-56dd95d7b0bb/confirmed-list-of-usb-30-pcie-cardslaptopsconfigurations-which-work-for-kinect-v2-during?forum=kinectv2sdk.

        Because of that I haven’t had any chance yet to “initialize” my brand-new Kinect. And I thought that could be the reason.

        1. Hi laborer,
          Certainly the USB card has been an issue for a while; from what I’ve seen, some of those issues have been resolved with later updates. USB 3.0 in isochronous mode is still relatively untested out in the real world. Later versions of the Linux kernel (3.16+ I believe) address these issues specifically, but I don’t know what the Windows boxes have to do to get it to work.

          The JPEG library mismatch is probably an issue with paths and the two different libjpegs that are being used. On the Jetson, if you run ‘ldd’ on Protonect you should see that two libjpeg libraries are linked: one is the Tegra version, the other is the non-accelerated version. Lingzhu Xiang, who I forked on Github, also has instructions: https://github.com/xlz/libfreenect2/tree/jetsontk1.

          Good luck!
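
          For example, a quick way to see which libjpeg libraries Protonect actually pulls in (a generic check, not specific to this setup):
          $ ldd ./Protonect | grep -i jpeg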

          1. kangalow,

            I haven’t yet found a supported USB 3.0 controller for the Windows machine.
            Therefore I cannot assert that I “initialized” my Kinect (but I tried).
            Nevertheless, the Kinect is working for me. And I’m on L4T 21.3.

            As for JPEG, I turned off ENABLE_TEGRA_JPEG in the CMakeLists.txt.
            That’s not a real solution; the CPU seems to be choking. But I saw that my Kinect is good.

            1. That’s great to hear! The Tegra JPEG issue is a conflict between the different libjpeg libraries that are used. There’s the regular one, and then there’s the Tegra one. Make sure that you have the headers from the Tegra bundle in the ‘depends’ directory. I know that when I have a fresh install and run the install script I put up for libfreenect2 on Github, it works. However, that doesn’t automatically mean that it works under all conditions (or on other people’s machines). That’s the challenge. It is tricky to get set up.

  3. Hi

    Thanks for the post. I am getting the following error. I am using the onboard USB:
    [Freenect2Impl] enumerating devices…
    [Freenect2Impl] 4 usb devices connected
    [Freenect2Impl] found valid Kinect v2 @2:37 with serial 501189441942
    [Freenect2Impl] found 1 devices
    [Freenect2DeviceImpl] opening…
    [UsbControl::claimInterfaces(IrInterfaceId)] failed! libusb error -6: LIBUSB_ERROR_BUSY
    [Freenect2DeviceImpl] closing…
    [Freenect2DeviceImpl] deallocating usb transfer pools…
    [Freenect2DeviceImpl] closing usb device…
    [Freenect2DeviceImpl] closed
    [Freenect2DeviceImpl] failed to open Kinect v2 @2:37!
    no device connected or failure opening the default one!

    Can you please advise what I am doing wrong? Thanks in advance.

    1. Which version of L4T are you using? The LIBUSB_ERROR_BUSY probably means that the device was being used at one time, but did not shut down correctly when it was opened again. For me, the only way I could get it to work after that was to reboot. I’ve had issues with the onboard USB: if the Kinect2 is plugged in when the system boots up, it might not work despite correct autosuspend settings. Reconnecting or plugging in the Kinect2 after the system boots seems to solve this problem. You should also run the autosuspend script after reconnecting it.

      1. Hi Kangalow,

        Thank you very much for your prompt reply. Your advice was very helpful for getting it to work. How can I avoid re-connecting the Kinect each time I power the Jetson off and on? In a real world application that won’t be good. Further, I have permanently set autosuspend to -1 from boot by adding it to extlinux.conf.
        Do you think that if I use a Mini PCIe card with USB 3.0 I can overcome this problem?
        Thanks in advance.
        Best regards
        Jo

        1. Glad to hear you got it working. The internal Jetson USB 3.0 re-connect is a known issue; the only workaround that I know of is to replug after startup. The Syba USB 3.0 mPCIe card mentioned above does not have that issue. If that’s an important issue for you, consider adding the extra card.

            1. Hi Jo,
              This script does not work with ODROID-XU3 because it specifically uses the Jetson TK1 JPEG decoder. You could try mainstream OpenKinect libfreenect2 if you want the ODROID to work. Good luck!

        2. Hi Jo

          >How can I avoid re-connecting the Kinect each time I power the Jetson off and on? In a real world application that won’t be good.

          That question is also very interesting to me. I think we have to wait until a fresher kernel is released by NVIDIA (not from the 3.10 branch) or until we are able to run a vanilla kernel on the Jetson TK1.

          Meanwhile, perhaps a “software reconnect” will help. I mean “modprobe -r” and then “modprobe” for the USB modules.

  4. Hi Kangalow,

    I wanted to be a bit adventurous and tried the following, since I am not familiar with cmake. I copied the relevant header files to /usr/include/libfreenect2 and the lib files to /usr/lib, then I copied protonect.cpp to my home directory, renamed it p.cpp, and pointed it at /usr/include for the header files. I ran the following command:
    sudo g++ -o p `pkg-config opencv --cflags` p.cpp `pkg-config opencv --libs` -L /home/ubuntu/libfreenect2/examples/protonect/lib

    It wouldn’t compile and the output was as follows:
    .cpp:(.text+0x2b4): undefined reference to `libfreenect2::Freenect2::Freenect2(void*)’
    p.cpp:(.text+0x2be): undefined reference to `libfreenect2::Freenect2::openDefaultDevice()’
    p.cpp:(.text+0x35c): undefined reference to `libfreenect2::SyncMultiFrameListener::SyncMultiFrameListener(unsigned int)’
    p.cpp:(.text+0x6e2): undefined reference to `libfreenect2::SyncMultiFrameListener::waitForNewFrame(std::map<libfreenect2::Frame::Type, libfreenect2::Frame*, std::less, std::allocator<std::pair > >&)’
    p.cpp:(.text+0xa04): undefined reference to `libfreenect2::SyncMultiFrameListener::release(std::map<libfreenect2::Frame::Type, libfreenect2::Frame*, std::less, std::allocator<std::pair > >&)’
    p.cpp:(.text+0xa76): undefined reference to `libfreenect2::SyncMultiFrameListener::~SyncMultiFrameListener()’
    p.cpp:(.text+0xa80): undefined reference to `libfreenect2::Freenect2::~Freenect2()’
    p.cpp:(.text+0xc9e): undefined reference to `libfreenect2::SyncMultiFrameListener::~SyncMultiFrameListener()’
    p.cpp:(.text+0xcaa): undefined reference to `libfreenect2::Freenect2::~Freenect2()’
    collect2: error: ld returned 1 exit status

    Can you please advise me how I can compile it from the command line instead of cmake? I want to make some changes and explore. For your info, I am new to Linux. Thanks in advance.

    1. Hi jo,
      Good to hear that you’re ready to work on it.
      Some comments:
      1) On Linux, most projects are built using CMake, which generates Makefiles; the Makefiles are then executed. This is because any project that has more than a few files or library dependencies is almost impossible to build manually on the command line. You should use those tools.
      2) Community projects on Github allow you to use Git for source control. This means that you can create your own branch locally and not have to worry about making a mess.
      3) Not quite sure what you were trying to do with your command line. There aren’t any libraries defined to link against (that’s why it can’t find the symbols in libfreenect2).

      1. Hi kangalow,

        Thank you for the prompt reply. Can you please point me to where the lib files are, and do I also have to point to the files in the src folder?

        Thank you once again.

        1. Hi Jo,
          If you have a working copy of an executable file (like Protonect) you can use the command ‘ldd’ to list all of the shared libraries that the executable links against. If you are just looking for a given file, you can use the ‘find’ command. You can get help for most Linux commands in a Terminal by adding ‘--help’ after the command, i.e.
          $ ldd --help
          The location of the libraries is dependent on your machine and the way it is set up, which is beyond the scope of what can be discussed in a blog post.
          Good luck!

          1. Hi Kangalow,

            Thank you for your prompt reply. The information you provided was very valuable. I learned a lot by playing with the ldd command. I managed to compile the code from the command line and run it without any problem.
            I compiled the code as shown below:
            sudo g++ -o p `pkg-config opencv --cflags` p.cpp `pkg-config opencv --libs` -L/home/ubuntu/libfreenect2/examples/protonect/../../depends/libusb/lib -rdynamic /usr/lib/libfreenect2.so -lusb-1.0

            Thank you once again for your help.
            Best regards

  5. Hi Kangalow,

    I am working with the IR and depth frames. How can I get the Z value at a particular x and y from depth->data?

    Thanks in advance.
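
    (A minimal sketch of one way to do this, assuming the Frame layout that Protonect uses: the depth frame is 512x424 with one float per pixel, in millimeters, and x and y are your pixel coordinates.)

    // Sketch only: after listener.waitForNewFrame(frames) in Protonect.cpp
    libfreenect2::Frame *depth = frames[libfreenect2::Frame::Depth];
    const float *depth_data = reinterpret_cast<const float *>(depth->data);
    float z_mm = depth_data[y * depth->width + x];   // 0.0f means no valid reading at that pixel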

  6. Hi Kangalow,

    Thank you for the link; I will look into it. Anyway, I did manage to do my own calibration and converted the depth frame pixel values to depth values in meters. It works fairly well and is simple as well. Further, what is the reason that when Protonect is running, the terminal shows the message “Depth frame skipped because processor not ready”?

    Thanks once again for your post, it greatly helped me with my project. I am planning to do it on an ODROID-XU3 and compare the performance.

    1. It means that a depth frame was received but could not be processed before more depth buffers became available. This is common when first starting up, and also when other tasks are taking too much time, such that the depth frame is not processed in time.

      1. Hi kangalow,

        Thank you. I wish I could find a way to overcome that.

        So far, playing with it, I could measure a depth of 4 meters without any problem. Further, I am considering multiple Kinects on one Jetson board via a mini PCIe to USB 3.0 card.
        I am not sure I can do that, but I will give it a go and let you know.

  7. Dear kangalow,

    Thanks to the tutorial I got the Kinect v2 working on the Jetson via the Protonect example.

    However, I want to start a Kinect project in the Qt editor.
    I tried starting a Qt app and included the necessary libs, and did the same for a plain C++ project.
    After getting everything compiling, I get the JPEG parameter struct mismatch at runtime.

    I tried to figure out what the problem was using ldd, to no avail.
    Can you point me in the right direction to get the Protonect example working in Qt? (It works fine with your script and setup.)

    1. From the ldd output:
      The working Protonect:
      libopenjpeg.so.2 => /usr/lib/arm-linux-gnueabihf/libopenjpeg.so.2 (0xb308b000)
      libjpeg.so => /usr/lib/arm-linux-gnueabihf/libjpeg.so (0xb6616000)
      libturbojpeg.so.0 => /usr/lib/arm-linux-gnueabihf/libturbojpeg.so.0 (0xb6671000)

      The not-working Protonect:
      libopenjpeg.so.2 => /usr/lib/arm-linux-gnueabihf/libopenjpeg.so.2 (0xb2efb000)
      libjpeg.so.8 => /usr/lib/arm-linux-gnueabihf/libjpeg.so.8 (0xb6171000)
      libjpeg.so => /usr/lib/arm-linux-gnueabihf/tegra/libjpeg.so (0xb654d000)
      libturbojpeg.so.0 => /usr/lib/arm-linux-gnueabihf/libturbojpeg.so.0 (0xb65bc000)

      So it seems I need to unlink libjpeg.so.8?
      Or am I on the wrong track here?

      1. The source code is Protonect.cpp copied into a main.cpp,
        and this is my current .pro file, which results in the runtime JPEG struct mismatch:

        TEMPLATE = app
        CONFIG += console
        CONFIG -= app_bundle
        CONFIG -= qt

        SOURCES += main.cpp

        LIBS += -L$$PWD/../../../../../usr/local/cuda-6.5/lib/ -lcudart
        LIBS += -lopencv_contrib
        LIBS += -lopencv_highgui
        LIBS += -lopencv_imgproc
        LIBS += -lopencv_core

        LIBS += -L$$PWD/../../../libfreenect2/examples/protonect/lib/ -lfreenect2

        LIBS += -L$$PWD/../../../libfreenect2/examples/protonect/lib/ -lglfw

        INCLUDEPATH += $$PWD/../../../libfreenect2/examples/protonect/include
        DEPENDPATH += $$PWD/../../../libfreenect2/examples/protonect/include

        INCLUDEPATH += $$PWD/../../../../../usr/local/cuda-6.5/include
        DEPENDPATH += $$PWD/../../../../../usr/local/cuda-6.5/include

    2. Hi Hylke,
      I’ve been thinking about it, but could not figure out what the issue might be. I’m assuming that Qt links against its own libjpeg, which is in the executable. Perhaps it might help to compile a very simple program to see which libjpeg Qt uses. My guess is that Qt and the linker are upset about using tegra/libjpeg.so. Another guess is that libjpeg.so is a symbolic link to whatever the latest version of libjpeg is (libjpeg.so.8 may be correct).
      Sorry I can’t be of more help.

  8. I used your instructions, and when I run ./Protonect I get the images, but very slowly.
    And I have this output:

    ubuntu@tegra-ubuntu:~/libfreenect2/examples/protonect/bin$ ./Protonect
    [Freenect2Impl] enumerating devices…
    [Freenect2Impl] 11 usb devices connected
    [Freenect2Impl] found valid Kinect v2 @2:3 with serial 034327244547
    [Freenect2Impl] found 1 devices
    [Freenect2DeviceImpl] opening…
    [Freenect2DeviceImpl] opened
    [Freenect2DeviceImpl] starting…
    [Freenect2DeviceImpl] ReadData0x14 response
    92 bytes of raw data
    0x0000: 00 00 12 00 00 00 00 00 01 00 00 00 43 c1 1f 41 2e2e2e2e2e2e2e2e2e2e2e2e432e2e41
    0x0010: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 2e2e2e2e2e2e2e2e2e2e2e2e2e2e2e2e
    0x0020: 0a 21 33 55 c2 00 17 20 00 08 00 00 10 00 00 00 2e2133552e2e2e202e2e2e2e2e2e2e2e
    0x0030: 00 01 00 00 00 10 00 00 00 00 80 00 01 00 00 00 2e2e2e2e2e2e2e2e2e2e802e2e2e2e2e
    0x0040: 31 33 00 00 00 01 0e 01 47 4b 53 36 35 30 2e 31 31332e2e2e2e2e2e474b533635302e31
    0x0050: 58 00 00 00 00 00 00 00 07 00 00 00 582e2e2e2e2e2e2e2e2e2e2e

    [Freenect2DeviceImpl] ReadStatus0x090000 response
    4 bytes of raw data
    0x0000: 01 26 00 00 2e262e2e

    [Freenect2DeviceImpl] ReadStatus0x090000 response
    4 bytes of raw data
    0x0000: 03 26 00 00 2e262e2e

    [Freenect2DeviceImpl] enabling usb transfer submission…
    [Freenect2DeviceImpl] submitting usb transfers…
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::onTransferComplete] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [DepthPacketStreamParser::onDataReceived] not all subsequences received 0
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::onTransferComplete] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::onTransferComplete] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO

    ………………………………………

    [DepthPacketStreamParser::onDataReceived] not all subsequences received 512
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [TransferPool::submit] failed to submit transfer: LIBUSB_ERROR_IO
    [Freenect2DeviceImpl] started
    device serial: 034327244547
    device firmware: 2.3.3913.0.7
    [TransferPool::onTransferComplete] failed to submit transfer: LIBUSB_ERROR_IO
    [DepthPacketStreamParser::onDataReceived] not all subsequences received 31
    [DepthPacketStreamParser::onDataReceived] not all subsequences received 830
    [DepthPacketStreamParser::onDataReceived] not all subsequences received 63
    [DepthPacketStreamParser::onDataReceived] not all subsequences received 447
    [DepthPacketStreamParser::onDataReceived] skipping depth packet
    [DepthPacketStreamParser::onDataReceived] skipping depth packet
    [RgbPacketStreamParser::onDataReceived] skipping rgb packet!
    [RgbPacketStreamParser::onDataReceived] skipping rgb packet!

    What did I forget?
    Thanks.

    1. How is your Kinect V2 connected to the Jetson? Is it through a PCIe card, straight to the USB 3.0 connector, or through a hub? Do you have USB autosuspend turned off? I am assuming that USB 3.0 is enabled, or it probably would not work at all.

      1. I am using a hub.
        Yes, I have USB autosuspend turned off.
        ubuntu@tegra-ubuntu:~$ grep . /sys/bus/usb/devices/*/power/autosuspend
        /sys/bus/usb/devices/1-2/power/autosuspend:-1
        /sys/bus/usb/devices/1-3.1.1/power/autosuspend:-1
        /sys/bus/usb/devices/1-3.1.2.1/power/autosuspend:-1
        /sys/bus/usb/devices/1-3.1.2/power/autosuspend:-1
        /sys/bus/usb/devices/1-3.1/power/autosuspend:-1
        /sys/bus/usb/devices/1-3/power/autosuspend:-1
        /sys/bus/usb/devices/2-1/power/autosuspend:-1
        /sys/bus/usb/devices/usb1/power/autosuspend:-1
        /sys/bus/usb/devices/usb2/power/autosuspend:-1
        /sys/bus/usb/devices/usb3/power/autosuspend:-1

        My lsusb:
        Bus 002 Device 026: ID 045e:02c4 Microsoft Corp.
        Bus 002 Device 002: ID 2109:0812
        Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
        Bus 001 Device 007: ID 09da:c10a A4 Tech Co., Ltd
        Bus 001 Device 006: ID 1a40:0101 Terminus Technology Inc. 4-Port HUB
        Bus 001 Device 005: ID 0b38:0003 Gear Head Keyboard
        Bus 001 Device 004: ID 1a40:0101 Terminus Technology Inc. 4-Port HUB
        Bus 001 Device 003: ID 2109:2812
        Bus 001 Device 002: ID 8087:07dc Intel Corp.
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

        My udev rules:
        ubuntu@tegra-ubuntu:/etc/udev/rules.d$ cat 90-kinect2.rules
        # ATTR{product}=="Kinect2"
        SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02c4", MODE="0666"
        SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02d8", MODE="0666"
        SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02d9", MODE="0666"

        And one more. When I run ./Protonect cl, I have this output:
        [Freenect2Impl] enumerating devices…
        [Freenect2Impl] 11 usb devices connected
        [Freenect2Impl] found valid Kinect v2 @2:26 with serial 034327244547
        [Freenect2Impl] found 1 devices
        OpenCL pipeline is not supported!
        [Freenect2DeviceImpl] opening…
        [Freenect2DeviceImpl] opened
        [Freenect2DeviceImpl] starting…

        Maybe the problem is this: “OpenCL pipeline is not supported!”

        1. In the example, I did not use the ‘cl’ switch with ./Protonect; that might be an issue.
          I also noticed a ‘Terminus Technology Inc. 4-Port HUB’ in your lsusb output; this might also be an issue. From a quick look on the Internet, this appears to be a USB 2.0 hub, and a USB 3.0 hub is needed. Worth checking.
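          One quick way to check is “$ lsusb -t”, which shows the negotiated speed of each device; a hub and Kinect running at USB 3.0 speed should show 5000M.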

          1. I tried connecting the Kinect without the hub and have this output:

            ubuntu@tegra-ubuntu:~$ lsusb
            Bus 002 Device 005: ID 045e:02c4 Microsoft Corp.
            Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
            Bus 001 Device 004: ID 8087:07dc Intel Corp.
            Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
            Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
            ubuntu@tegra-ubuntu:~$

            Problem still exists.

  9. When there are no error messages, my output looks like this:

    [OpenGLDepthPacketProcessor] avg. time: 25.4012ms -> ~39.3683Hz
    [OpenGLDepthPacketProcessor] avg. time: 25.3268ms -> ~39.4839Hz
    [TurboJpegRgbPacketProcessor] avg. time: 58.6912ms -> ~17.0383Hz
    [OpenGLDepthPacketProcessor] avg. time: 25.312ms -> ~39.5069Hz
    [OpenGLDepthPacketProcessor] avg. time: 24.9397ms -> ~40.0967Hz
    [OpenGLDepthPacketProcessor] avg. time: 24.9134ms -> ~40.1391Hz
    [OpenGLDepthPacketProcessor] avg. time: 25.2323ms -> ~39.6317Hz
    [TurboJpegRgbPacketProcessor] avg. time: 60.9061ms -> ~16.4187Hz
    [OpenGLDepthPacketProcessor] avg. time: 25.2316ms -> ~39.6329Hz
    [OpenGLDepthPacketProcessor] avg. time: 25.707ms -> ~38.8999Hz

    and

    $ cat /etc/nv_tegra_release
    # R21 (release), REVISION: 4.0, GCID: 5650832, BOARD: ardbeg, EABI: hard, DATE: Thu Jun 25 22:38:59 UTC 2015

    If I understand correctly, my L4T version is 21.4.

    1. The only difference that I can think of is that the demo was done on an L4T 21.2 setup, while you’re running 21.4. There could have been a change in the TurboJPEG libraries since then that is holding things back. I did notice that there are requests in the libfreenect2 upstream to pull in CUDA changes to speed things back up, but I don’t know when they will be incorporated. Sorry I couldn’t be of more help.

  10. Hello
    I completed the install.
    I got USB to run well with a USB 3.0 hub.
    When I start Protonect, one screen opens with no error messages; I see the firmware message and the frame rate messages (a great 60), with an occasional RGB packet dropped.
    What I don’t see are the three additional windows; I never see them start or even flash.
    All I see is the one window with frame rate messages.
    The sensor has three red lights.
    What can I be doing wrong that the other three screens do not at least pop up?

    Thanks
    Troy

  11. Hello
    I have found out a great deal more.
    Even though I see TegraJpegRgbPacketProcessor messages, I am not getting frames sent back to the listener.
    One time I started the application and all the windows came up and I saw everything.
    I rebooted the machine, started the application again, and got no cv windows, only the processor messages.
    If I set the application to start at login, I then see three cv windows but no frames are displayed.
    It is almost as if the application tries to read the USB stream too soon or gets out of sync.
    I have tried multiple USB 3.0 hubs and they all do the same thing.

  12. I can see frames coming into SyncMultiFrameListener, but the lock taken in waitForNewFrame is never released. It looks to me like a deadlock situation occurs and there is no way out.

  13. I have drilled further down.

    In the code, it expects Color, IR and Depth to be returned.
    The device gets itself into a state where only color is returned.
    If I modify the code to only look for color frames, it works like a charm.

    If you get this in time, where is it that the device is told to return all three streams?
    What determines that all three streams will come back in a specific sequence?

    Otherwise I will continue to dig deeper.

    Thanks for the support.

    In my application, I have a platform hexagonal in shape. Every 60 degrees I have a Kinect device, so the room is completely covered. I am monitoring for specifics as multiple people enter the room.

  14. I am assuming that it is OK for two devices to have the same ID

    Bus 002 Device 008: ID 2109:0812
    Bus 002 Device 009: ID 045e:02c4 Microsoft Corp.
    Bus 002 Device 007: ID 045e:02d9 Microsoft Corp.
    Bus 002 Device 006: ID 2109:0812
    Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
    Bus 001 Device 010: ID 093a:2521 Pixart Imaging, Inc.
    Bus 001 Device 009: ID 2109:2812
    Bus 001 Device 008: ID 0c45:7603 Microdia
    Bus 001 Device 007: ID 045e:02d9 Microsoft Corp.
    Bus 001 Device 006: ID 2109:2812
    Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

  15. Ok

    I got it to work. First, an explanation. I initially thought I was having issues with OpenCl4Tegra. I rebuilt OpenCl for four processors and had the same issues. It was then that I realized that I was only getting the color frames, and nothing would be drawn until I was receiving all of the frames.

    I started looking more at the locks; I still don’t entirely understand the model as implemented.

    What I ended up doing was to make sure that I had inbound frames of all three types before I started looking for frames to draw. Success.

    I will end up rewriting some of the code so that I better understand the lock model. I also need to be able to use the depth data to draw skeletons.

    Bottom line, I am off and running and want to thank you for your support.

    Thank you!

    1. Hi Troy,

      I believe I may be running into the same issue you had, and was wondering if you could share or explain how you changed the example code to get it working.

      Any help appreciated,

      Thanks!

  16. Hi Troy,
    It’s great to hear that you got it working to your satisfaction! I’ve been traveling; sorry I didn’t get a chance to help you with this. Your project sounds like fun, and I hope you can share some of it with us.

  17. Hello

    The issue that I had was a locking problem. Simply add a sleep between the start and the loop that listens in Protonect.cpp. This allows all three streams to start collecting. This is a hack and not an enterprise solution, but I have been running for more than 72 hours and there is no difference between the data collected by the Jetson boards and the data collected by the Microsoft Surface laptops.
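
    For anyone looking for the same workaround, here is a minimal sketch of what Troy describes. The placement, the sleep length, and the loop variable are illustrative, not code from this thread.

    dev->start();
    // Hack: give the color, ir and depth streams a moment to start producing
    // packets before entering the listener loop (sleep() is from <unistd.h>).
    sleep(3);
    while (running)   // keep Protonect's existing loop condition here
    {
        listener.waitForNewFrame(frames);
        // ... existing frame handling ...
    }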

  18. Hello

    I want to be able to stream the depth and RGB data. I want MJPEG data. Is GStreamer the way to go? If so, are there any code samples? If not, suggestions are appreciated.
    Thanks

    1. Hi Troy,
      Sounds like an ambitious little project.
      GStreamer is really the only officially supported multimedia interface on the Jetson.
      I haven’t really thought about it, but you probably have to implement a V4L driver to get the color and depth streams into GStreamer. Something similar to:
      https://github.com/yoshimoto/gspca-kinect2
      Once you can get the V4L video, you can implement the GStreamer code.
      Using the Jetson hardware: http://elinux.org/Jetson/H264_Codec may help you achieve better frame rates.
      An outline of the code for streaming using GStreamer is at: https://jetsonhacks.com/2014/10/28/gstreamer-network-video-stream-save-file/
      Hope this helps,
      Jim

  19. Hi, do you know if this can be ported to Android 5.1 specifically? I have an NVIDIA Shield TV and a Kinect v2; I connected them and the microphone works.

    1. A port seems complicated; the software being used is a C++ library (libfreenect2). Unfortunately I don’t have much experience with Android ports, but developing on the Shield seems challenging.

  20. Hello kangalow
    I just got a Jetson TX1, and I want to use it to run the Kinect V2, but I get some errors. (I have installed JetPack 2.0.) I think the problem is the version of JetPack, but the Jetson TX1 cannot install JetPack 1.0. I have no idea what to do. Did you make a video tutorial about the Jetson TX1 with the Kinect V2? If not, can you tell me how to do it? Thank you very much! Here are the errors I get when I run the Kinect V2 on the Jetson TX1.

    using tinythread as threading library
    -- Could NOT find OpenCL (missing: OPENCL_LIBRARIES OPENCL_INCLUDE_DIRS)
    CUDA_TOOLKIT_ROOT_DIR not found or specified
    -- Could NOT find CUDA (missing: CUDA_TOOLKIT_ROOT_DIR CUDA_NVCC_EXECUTABLE CUDA_INCLUDE_DIRS CUDA_CUDART_LIBRARY)
    CMake Error at /usr/share/cmake-2.8/Modules/FindCUDA.cmake:548 (message):
    Specify CUDA_TOOLKIT_ROOT_DIR
    Call Stack (most recent call first):
    /usr/share/OpenCV/OpenCVConfig.cmake:45 (find_package)
    /usr/share/OpenCV/OpenCVConfig.cmake:242 (find_host_package)
    CMakeLists.txt:47 (FIND_PACKAGE)

    -- Configuring incomplete, errors occurred!
    See also "/home/ubuntu/libfreenect2/examples/protonect/CMakeFiles/CMakeOutput.log".
    See also "/home/ubuntu/libfreenect2/examples/protonect/CMakeFiles/CMakeError.log".
    ubuntu@tegra-ubuntu:~/libfreenect2/examples/protonect$ cmake CMakeLists.txt
    -- using tinythread as threading library
    -- Could NOT find OpenCL (missing: OPENCL_LIBRARIES OPENCL_INCLUDE_DIRS)
    CUDA_TOOLKIT_ROOT_DIR not found or specified
    -- Could NOT find CUDA (missing: CUDA_TOOLKIT_ROOT_DIR CUDA_NVCC_EXECUTABLE CUDA_INCLUDE_DIRS CUDA_CUDART_LIBRARY)
    CMake Error at /usr/share/cmake-2.8/Modules/FindCUDA.cmake:548 (message):
    Specify CUDA_TOOLKIT_ROOT_DIR
    Call Stack (most recent call first):
    /usr/share/OpenCV/OpenCVConfig.cmake:45 (find_package)
    /usr/share/OpenCV/OpenCVConfig.cmake:242 (find_host_package)
    CMakeLists.txt:63 (FIND_PACKAGE)

    -- Configuring incomplete, errors occurred!
    See also "/home/ubuntu/libfreenect2/examples/protonect/CMakeFiles/CMakeOutput.log".
    See also "/home/ubuntu/libfreenect2/examples/protonect/CMakeFiles/CMakeError.log".
    ubuntu@tegra-ubuntu:~/libfreenect2/examples/protonect$

  21. Hello kangalow,
    My writing in the above post was a bit ambiguous. To make it clear, I will just rewrite my question as follows.
    You provided a video tutorial for connecting the Jetson TK1 with the Kinect v2 using JetPack 1.0.
    But I want to connect the Kinect V2 to a Jetson TX1. To this end, I installed JetPack 2.0, but find it hard to get through the task. Lots of errors, as listed in the above thread. Is there a smooth way for us to run the Kinect v2 on a Jetson TX1? Thanks a lot!

    1. For installing JetPack 2.0, see: https://jetsonhacks.com/2015/11/18/jetpack-2-0-nvack-jetson-tx1/
      Some of the above error messages indicate that CUDA is not installed. Please check your settings. The Jetson TX1 only runs L4T 23.1 and above; it is not compatible with the older 21.4 on the TK1.

      Currently the Kinect v2 does not work with the TX1, there appear to be issues with USB 3.0 that prevent proper frame grabbing. See the thread on the Jetson forum: https://devtalk.nvidia.com/default/topic/919354/jetson-tx1/usb-3-transfer-failures/

  22. Hi

    I have bought a Jetson TK1, but I get some errors when I install JetPack 1.2 on it, and I don’t know the reason. Thanks a lot!

    Error running /home/ubuntu/JetPackTK1-1.2/_installer/DownloadHelper http://developer.download.nvidia.com/devzone/devcenter/mobile/jetpack_tk1/007/common/docs.zip /home/ubuntu/JetPackTK1-1.2/jetpack_download/docs.zip -t “Downloading Documents” -r 10 -c a4ba028423b4920a18e6954b7de7d8e6 –use_md5 1:
    (DownloadHelper:4857): GLib-WARNING **: unknown option bit(s) set

    (DownloadHelper:4857): GLib-WARNING **: unknown option bit(s) set

    (DownloadHelper:4857): GLib-WARNING **: unknown option bit(s) set

    (DownloadHelper:4857): GLib-WARNING **: unknown option bit(s) set

  23. Hi

    Although I didn’t find the real reason, I found a solution: manually download the packages that could not be downloaded, and install them. It works!

    Thanks a lot!

  24. Hi

    I get some errors when I compile the protonect example (shown below).
    I have installed JetPack 1.0 (manually downloading the packages that could not be downloaded) and the smoke example also runs well, but according to the errors below it cannot find CUDA, OpenCV, or OpenCL (which is not installed). What’s more, I can’t find the OpenCVConfig.cmake or opencv-config.cmake files:
    ***********************************************************************************
    ubuntu@tegra-ubuntu:~/libfreenect2/examples/protonect$ cmake CMakeLists.txt
    -- using tinythread as threading library
    -- Could NOT find OpenCL (missing: OPENCL_LIBRARIES OPENCL_INCLUDE_DIRS)
    CUDA_TOOLKIT_ROOT_DIR not found or specified
    -- Could NOT find CUDA (missing: CUDA_TOOLKIT_ROOT_DIR CUDA_NVCC_EXECUTABLE CUDA_INCLUDE_DIRS CUDA_CUDART_LIBRARY)
    CMake Error at CMakeLists.txt:47 (FIND_PACKAGE):
    By not providing “FindOpenCV.cmake” in CMAKE_MODULE_PATH this project has
    asked CMake to find a package configuration file provided by “OpenCV”, but
    CMake did not find one.

    Could not find a package configuration file provided by “OpenCV” with any
    of the following names:

    OpenCVConfig.cmake
    opencv-config.cmake

    Add the installation prefix of “OpenCV” to CMAKE_PREFIX_PATH or set
    “OpenCV_DIR” to a directory containing one of the above files. If “OpenCV”
    provides a separate development package or SDK, be sure it has been
    installed.

    -- Configuring incomplete, errors occurred!
    See also "/home/ubuntu/libfreenect2/examples/protonect/CMakeFiles/CMakeOutput.log".
    See also "/home/ubuntu/libfreenect2/examples/protonect/CMakeFiles/CMakeError.log".
    ***********************************************************************************

    ****************************************************************
    ubuntu@tegra-ubuntu:~$ find OpenCVConfig.cmake
    find: `OpenCVConfig.cmake’: No such file or directory
    ubuntu@tegra-ubuntu:~$ find opencv-config.cmake
    find: `opencv-config.cmake’: No such file or directory
    ******************************************************************

    Maybe there is something wrong, but I don’t know where I went wrong. Thanks a lot!

  25. Thank you for your help. I got it to work. I did not use the Mini PCI-Express card; I just used a USB 3.0 hub to connect the Kinect, and it works!

    Thanks again.

  26. Hello, I want to work with the Kinect v2 but without the NVIDIA JETSON TK1. Can that be done?
    I have an ASUS U56E laptop with an Intel i5.
    Regards.

  27. Hello.
    Firstly, I’m Japanese, so if there are some grammatical errors, I’m sorry about that.
    I use a Jetson TK1 and Kinect v2, and I can get the depth, color and IR images, but it is too slow.
    There are many messages like,
    [DepthPacketStreamParser::handleNewData] skipping depth packet because processor is not ready
    [RgbPacketStreamParser::handleNewData] skipping rgb packet!.
    I think the processor isn’t ready, but I don’t know how to resolve this problem.
    I have also maximized the CPU performance and set the GPU clock (852000 kHz) following http://elinux.org/Jetson/Performance.
    But the images still update more slowly than the animations on this page.
    Is there any solution to this problem?

      1. Thank you for replying. In my case, when I wave my hand about 100 mm in front of the Kinect v2, I can see a striped pattern on my hand in the color image.
        Is there any reason this occurs?

          1. Sorry for replying late, kangalow. When I wave my hand fast, that issue occurs, so I don’t think it is caused by a cable issue.
            I tried to take a picture of the issue. I could do so in PNG format, but I couldn’t in JPG format because of a segmentation fault.
            I don’t know why that occurs. This is the URL of the picture of the stripe issue (http://www.fastpic.jp/images.php?file=4550342358.png).
            This picture will be deleted in a year.

          2. Hi Hirokichi,
            My guess would be that it’s a vertical sync issue with the display, and probably requires something like graphics double buffering to fix. However, I have not experienced the issue, so I’m not a good resource for fixing this.

  28. Hi, kangalow.
    I couldn’t solve the issue, but it’s OK, because I don’t look at fast-moving things.
    Many thanks.

  29. If you still get the error “failed! libusb error -6: LIBUSB_ERROR_BUSY”, you might have your TK1 USB port set to 2.0.
    Check the usb_port_owner_info value in /boot/extlinux/extlinux.conf; it should be set to 2, which means USB 3.0.
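
    For reference, usb_port_owner_info lives on the APPEND line of the active boot entry in /boot/extlinux/extlinux.conf, and the boot-time autosuspend setting mentioned earlier in the comments can sit next to it. A sketch, with the existing arguments elided:
    APPEND <existing arguments> usb_port_owner_info=2 usbcore.autosuspend=-1
    Here usb_port_owner_info=2 assigns the full size port to the USB 3.0 controller, and usbcore.autosuspend=-1 disables USB autosuspend by default at boot.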

  30. Hello, I am doing some research for work. Any information on the following questions will be helpful:

    1) How can I find out what firmware our Kinect v2 sensors have?

    2) What is the latest firmware version for the Kinect v2 sensors?

    3) How can I update a Kinect v2 sensor that has an older firmware version to the latest version using a laptop with Windows 8?

  31. Hi kangalow!
    A really nice and helpful read. I am just starting with libfreenect2 and am done with the installation. There seem to be many cool things out there that can be done with this piece of equipment. Could you please give me some direction on where to start and how to go about playing with the code? I am on Ubuntu 16.04.
    Thanks !

  32. Thank you very much for all this work and the post, much appreciated. I was able to get a previous build to work with the command ./Protonect within the libfreenect2/build directory. When I run the command, I do get all 4 images (infrared, color, depth, depth greyscale). I would like to split the screen into the 4 windows as you have done. How can I go about doing that? I was also hoping I could run the executable with certain parameters so I could view only one or more of the images at a time. Any guidance or pointers would be very much appreciated. Thank you!


Disclaimer

Some links here are affiliate links. If you purchase through these links I will receive a small commission at no additional cost to you. As an Amazon Associate, I earn from qualifying purchases.
