Jetson Nano + Raspberry Pi Camera

The NVIDIA Jetson Nano Developer Kit plugs and plays with the Raspberry Pi V2 Camera! Looky here:


Since the introduction of the first Jetson in 2014, one of the most requested features has been Raspberry Pi camera support. The Jetson Nano has built-in support, no finagling required.

The Jetson family has always supported MIPI-CSI cameras. MIPI stands for Mobile Industry Processor Interface; CSI stands for Camera Serial Interface. This protocol is for high-speed transmission between cameras and host devices. Basically, it’s a hose straight to the processor; there isn’t a lot of overhead like there is with, say, a USB stack.

However, for those folks who are not professional hardware/software developers, getting access to inexpensive imaging devices through that interface has been, let’s say, challenging.

This is for a couple of reasons. First, the camera connection and wiring is through a connector to which most hobbyists don’t have good access. In addition, there’s a lot of jiggering with the drivers for the camera in the Linux kernel along with manipulation of the device tree that needs to happen before imaging magic occurs. Like I said, pro stuff. Most people take the path of least resistance, and simply use a USB camera.

Raspberry Pi Camera Module V2

At the same time, one of the most popular CSI-2 cameras is the Raspberry Pi Camera Module V2. The camera has a ribbon cable which connects to the board using a simple connector. At the core, the RPi camera consists of a Sony IMX219 imager, and it is available in different versions, with and without an infrared filter. Leaving out the infrared filter in the Pi NoIR camera (NoIR = No Infrared) allows people to build ‘night vision’ cameras when paired with infrared lighting. And they cost ~$25, lots of bang for the buck!

Are they the end all of end all cameras? Nope, but you can get in the game for not a whole lot of cash.

Note: The V1 Raspberry Pi Camera Module is not compatible with the default Jetson Nano install. The driver for the imaging element is not included in the base kernel modules.

Jetson Nano

Here’s the thing. The Jetson Nano Developer Kit has an RPi camera compatible connector! Device drivers for the IMX219 are already installed, and the camera is preconfigured. Just plug it in, and you’re good to go.


Installation is simple. On the Jetson Nano J13 Camera Connector, lift up the piece of plastic which will hold the ribbon cable in place. Be gentle, you should be able to pry it up with a finger/fingernail. Once loose, you insert the camera ribbon cable, with the contacts on the cable facing inwards towards the Nano module. Then press down on the plastic tab to capture the ribbon cable. Some pics (natch):

Make sure that the camera cable is held firmly in place after closing the tab. Here’s a pro tip: Remove the protective plastic film which covers the camera lens on a new camera before use. You’ll get better images (don’t ask me how I know).

Testing and some Codez

The CSI-Camera repository on GitHub contains some sample code to interface with the camera. Once installed, the camera should show up as /dev/video0. On the Jetson Nano, GStreamer is used to interface with cameras. Here is a simple command line to test the camera (Ctrl-C to exit):

$ gst-launch-1.0 nvarguscamerasrc ! nvoverlaysink
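Before launching a pipeline, it is worth confirming that the kernel sees the camera at all. A minimal sketch (the second command assumes the v4l-utils package is installed, e.g. via apt):

```shell
# Confirm the camera device node exists
ls /dev/video0
# Optionally, list all video devices and their driver names (requires v4l-utils)
v4l2-ctl --list-devices
```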

On newer Jetson Nano Developer Kits, there are two CSI camera slots. You can use the sensor-id property of nvarguscamerasrc to specify the camera (the similarly named sensor_mode property selects the sensor’s resolution/framerate mode, not which camera). Valid values are 0 or 1 (the default is 0 if not specified), e.g.

nvarguscamerasrc sensor-id=0
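As a minimal sketch (assuming a second camera is seated in the second slot of a B01 carrier board), the simple camera test above can be pointed at the other slot by setting the sensor-id property of nvarguscamerasrc:

```shell
# Preview the camera in the second CSI slot; Ctrl-C to exit.
# sensor-id selects the slot (0 or 1); nvoverlaysink renders straight to the display.
gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! nvoverlaysink
```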

A more specific example, which takes into account the actual modes of the particular sensor:

$ gst-launch-1.0 nvarguscamerasrc sensor_mode=0 ! 'video/x-raw(memory:NVMM), width=3280, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw, width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e

This requests GStreamer to open a camera stream 3280 pixels wide by 2464 high @ 21 frames per second and display it in a window that is 960 pixels wide by 616 pixels high. The ‘flip-method’ is useful when you need to change the orientation of the camera, because it flips the picture around for you. You can get some more tips in the repository.

There are also a couple of simple ‘read the camera and show it in a window’ code samples, one written in Python, the other in C++.
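As a rough sketch of how those samples talk to the camera (the helper name and defaults here are illustrative, not necessarily the repository’s exact code): a small function assembles the GStreamer pipeline string, and OpenCV opens it through cv2.VideoCapture with the CAP_GSTREAMER backend.

```python
def gstreamer_pipeline(sensor_id=0, capture_width=3280, capture_height=2464,
                       display_width=960, display_height=616,
                       framerate=21, flip_method=0):
    """Build the GStreamer pipeline string that OpenCV's appsink reads from."""
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={capture_width}, height={capture_height}, "
        f"format=NV12, framerate={framerate}/1 ! "
        f"nvvidconv flip-method={flip_method} ! "
        f"video/x-raw, width={display_width}, height={display_height}, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    )

if __name__ == "__main__":
    import cv2  # imported here so the pipeline helper is usable without OpenCV

    # Open the CSI camera through the GStreamer backend and show frames
    cap = cv2.VideoCapture(gstreamer_pipeline(flip_method=0), cv2.CAP_GSTREAMER)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("CSI Camera", frame)
        if cv2.waitKey(30) & 0xFF == 27:  # Esc to exit
            break
    cap.release()
    cv2.destroyAllWindows()
```

Passing sensor_id=1 to the helper selects the second camera slot on a B01 board.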

Note: Starting with JetPack 4.3 / L4T 32.3.1, the Jetson runs OpenCV 4. This means that if you are using an earlier version of JetPack, you will need to check out an earlier release of the CSI-Camera repository before running the samples:

$ git clone
$ cd CSI-Camera
$ git checkout v2.0

The third demo is more interesting: the script runs a Haar Cascade classifier to detect faces on the camera stream. You can read the article Face Detection using Haar Cascades to learn the nitty gritty. The example in the CSI-Camera repository is a straightforward implementation from that article. This is one of the earlier examples of mainstream machine learning.


Updated 2-29-2020: Simpler camera test, and information on how to access the different CSI cameras on the Jetson Nano B01 carrier board.

A Logitech C920 webcam is used in the video through the Cheese application.

Demonstration environment:

  • Jetson Nano Developer Kit
  • L4T 32.1.0
  • Raspberry Pi Camera Module V2


  1. Wondering if the OV5647 cams (like Raspberry Pi Camera v1) will also be compatible with the #JetsonNano Dev Kit? Would you happen to know?
    Thanks in advance

        • Did you manage to get it to work, the night vision one?
          I have tried the “original” Raspberry V2.1 cam; this one is working.
          But a second one, called “Raspberry Pi Full HD Kamera Madul” with night vision, just won’t work at all.

          • For the default Jetson Nano image, you must use an IMX219-based sensor, such as the Raspberry Pi V2.1 NoIR camera. The camera you mention is an OV5647 sensor. Thanks for reading!

            • Thank you for clearing this up, kangalow!
              I have the Raspberry Pi NoIR Kamera-Modul V2 connected and working; unfortunately, the pictures are blue in the middle and red outside.
              Maybe it is just broken…
              Funny thing is, the NVIDIA AI says “jellyfish” all the time

  2. Awesome article! Thanks. Would you happen to know the gstreamer command line to operate a second CSI camera on the nano?

  3. Hi there!

    I tried to do it like you! And thanks for your time and work to make this report. But I got some issues and don’t know how to solve them because I am fairly new to Linux.
    I think this is what causes my trouble:

    (Argus) Error FileOperationFailed: Connecting to nvargus-daemon failed: Die Struktur muss bereinigt werden (in src/rpc/socket/client/SocketClientDispatch.cpp, function openSocketConnection(), line 201)
    (Argus) Error FileOperationFailed: Cannot create camera provider (in src/rpc/socket/client/SocketClientDispatch.cpp, function createCameraProvider(), line 102)
    Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:515 Failed to create CameraProvider
    FEHLER: Von Element /GstPipeline:pipeline0/GstEglGlesSink:eglglessink0: Output window was closed

    It would be great if someone could tell me what I need to take care of.

    Greetings Nico

  4. Thanks a lot, I could fix it by using another SD card; something went wrong with it… now everything works as described…

      • Thanks for the great work!
        I would like to connect my custom camera to the Jetson Nano. My question is about connecting a different camera to the Nano. Can you tell me which kernel file must be edited for a different camera? I have all the data for my custom-made camera. I think it is not impossible to do that.

  5. You have great tutorials, I really appreciate them!

    When I run the gst-launch command in the terminal it works fine, no problem. But when I tried your Python example, it could not open the camera. I found out that the pre-installed opencv-3.2.0 was built without GStreamer support:

    Python 3.6.7 (default, Oct 22 2018, 11:32:17)
    [GCC 8.2.0] on linux
    Type “help”, “copyright”, “credits” or “license” for more information.
    >>> import cv2
    >>> print(cv2.getBuildInformation())

    GStreamer: NO

    I used a newly flashed SD card with Jetpack. Do you have any idea on how to proceed?

    • Thank you for the kind words. Did you try with Python 2.7? For Python 3.6, there are a lot of steps needed to install OpenCV; it is possible that the original OpenCV was overwritten.

  6. In this article you use the camera utility gst-launch-1.0. In several of the NVIDIA demos they use another utility, nvgstcapture-1.0.

    Can you explain the differences? I am very new to Linux, so this may be obvious to more experienced users.

  7. hello,
    thank you for all your tutorials, they really help me a lot!
    I’m a TX2 user and wonder if the RPi V2 cam can work on my TX2 with the latest JetPack 4.2 version?
    I have an Auvidea J20 board, a 6-CSI-cam add-on module. When I plug everything into my TX2 dev kit and run some “sudo” commands (5 i2cset commands) as the Auvidea reference manual says, the GPIO light switches on but there is still no “/dev/video”.
    I found someone having the same problem with the J20 and RPi V2 cam, and some questions on the NVIDIA forums (mainly from 2017 and 2018). Still, I don’t get much useful help.
    I know that RidgeRun has done these things and has written a nice wiki on how to make it work. But it costs $2,499 to buy their drivers. That’s far more than I can afford. I don’t know what I should do now.
    I would appreciate any advice or guidance.

  8. In case this helps anyone trying to stream over X11 to a remote desktop:
    ssh -X $USER_NAME@$NANO_IP -C "gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=960, height=616' ! nvvidconv ! ximagesink"

  9. Any thoughts on why a Raspberry Pi cam 2.1 is very red and grainy? simple_camera and face-detect work.

    Regards, Geordy

  10. Hi. I tried this code but it does not work. Can you help me?

    rancer91@rancer91-desktop:~/CSI-Camera$ python
    File “”, line 27
    print gstreamer_pipeline(flip_method=0)
    SyntaxError: invalid syntax

  11. I have downloaded the GitHub files that you instructed us to download; however, when I run the GStreamer code, it pops up a blank screen and says no camera detected. May I ask if you know why this is happening?

    • Your issue could be the result of multiple causes. The first thing to check is which camera you are using. The second is to make sure that the camera is installed correctly. Does the following work:

      $ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw, width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e

  12. Please, I need help concerning a CSI camera I bought on Amazon. It has out-of-the-box NVIDIA Jetson Nano support, so I connected it as shown. It even shows up when running `ls /dev/video0`, but when running the example shown, it merely screenshots my screen as opposed to streaming a video.

    The command specified is

    gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw, width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e

  13. kangalow, can you please tell me how I can record video using my Raspberry Pi camera version V2? I spent almost a day but can’t find a way to record a video through the V2 camera.

  14. shubh@shubh-desktop:~$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=0 ! 'video/x-raw, width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e
    Setting pipeline to PAUSED …

    Using winsys: x11
    Pipeline is live and does not need PREROLL …
    Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
    Setting pipeline to PLAYING …
    New clock: GstSystemClock
    Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:532 Failed to create CaptureSession
    Got EOS from element “pipeline0”.
    Execution ended after 0:00:00.083059046
    Setting pipeline to PAUSED …
    Setting pipeline to READY …
    Setting pipeline to NULL …
    Freeing pipeline …

    • I don’t know what you mean. Which camera are you talking about, and what type of image quality are you expecting? Do you actually have a camera that you are comparing it against, or have one of the cameras on your Jetson? Are you comparing this to a Raspberry Pi running the same camera?

  15. I just received another new nano and it is a little different than the others. TWO CAMERA CONNECTORS for raspi v2. Stereo vision!

  16. Hello! Thanks for putting this tutorial together, it’s very easy to follow! I did have a question for you though.

    I bought an enclosure for the jetson nano and, the way it’s designed, I have the option of having my camera (rpi V2) face backwards or flip it upside down. Looking through your documentation (your readme file), it appears it should be easy enough to change flip_method to 2 and it should flip it 180 degrees but, unfortunately, the whole thing hangs when I attempt this and I have to kill the process through process manager. If I change it back to 0 though, everything works again.

    Below is the code I’m using:

    gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3820, height=2464, framerate=21/1, format=NV12' ! nvvidconv flip-method=2 ! 'video/x-raw, width=960, height=616' ! nvvidconv ! nvegltransform ! nveglglessink -e

    Am I missing something really obvious? Thanks for all your help!

  17. Hi! First of all, I want to thank you for your awesome content. I was directed here from a Medium article I was reading on the Jetson Nano, and good Lord, it was nice to finally find a step-by-step instruction guide. So far I’ve managed to get everything you’ve offered going, and I only wish I had gotten here earlier.

    Now I only have one question, which I’m asking here because it seems the most relevant place, and others might benefit. I’ve tried to search for answers on the NVIDIA forum, but clearly I need to work on my search terms.

    So I don’t know if you have noticed this, but OpenCV doesn’t exit gracefully when running the face detection sample. A hint of this is that GST_ARGUS is stuck at PowerServiceHWVic::cleanResources, probably suggesting that GStreamer is still somehow holding on to resources. I’ve narrowed this down to the classifier.detectMultiScale(img, 1.3, 5) call, which I’m guessing is not freeing up memory, or something.

    I’m wondering if you could tell me how to workaround this, or at least give me some hints on how to figure out what’s going wrong

    • Thank you for the kind words.
      The sample is overly simple, and doesn’t have all the error checking/termination code that is needed for “real” situations.

      How are you trying to terminate the program? Hitting the escape key is the path that calls cap.release()

  18. Hi! I found a memory leak issue when I open and close the pipeline again and again, watching the System Monitor on Ubuntu.
    When the program finishes, memory is returned to its original state.
    Did you know about this issue, and do you have a solution?

    #include "mainwindow.h"
    #include "ui_mainwindow.h"

    MainWindow::MainWindow(QWidget *parent) :
        ui(new Ui::MainWindow)
    {
        timer = new QTimer(this);
    }

    MainWindow::~MainWindow()
    {
        delete ui;
    }

    void MainWindow::on_pushButton_open_webcam_clicked()
    {
        QString strPipline = QString("nvarguscamerasrc ! video/x-raw(memory:NVMM), width=(int)%1, height=(int)%2, format=(string)NV12, framerate=(fraction)%3/1 ! nvvidconv flip-method=%4 ! video/x-raw, width=(int)%5, height=(int)%6, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink sync=true")
            .arg(ui->label->size().height());
        cap.open(strPipline.toStdString(), CAP_GSTREAMER);

        if (!cap.isOpened()) // Check if we succeeded
            cout << "camera is not open" << endl;
        else {
            cout << "camera is open" << endl;
            connect(timer, SIGNAL(timeout()), this, SLOT(update_window()));
            timer->start(20);
        }
    }

    void MainWindow::on_pushButton_close_webcam_clicked()
    {
        disconnect(timer, SIGNAL(timeout()), this, SLOT(update_window()));
        Mat image = Mat::zeros(frame.size(), CV_8UC3);
        qt_image = QImage((const unsigned char*) image.data, image.cols, image.rows, QImage::Format_RGB888);
        cout << "camera is closed" << endl;
    }

    void MainWindow::update_window()
    {
        cap >> frame;
        cvtColor(frame, frame, CV_BGR2RGB);
        qt_image = QImage((const unsigned char*) frame.data, frame.cols, frame.rows, QImage::Format_RGB888);
    }

  19. Thank you for your series on Jetson Nano.

    I am using the IMX219-77 IR camera with my Jetson Nano. I followed the instructions, but the video has a purple-like color.

    What is the reason for that, and how can I obtain normal color video?

    Many thanks,

    • I do not understand your question. You are using a camera without an infrared filter. If you do not use an infrared filter, the images can appear purple. What are your expectations here?

  20. Thank you for your article!
    I tried to set a lower framerate (for example, framerate=10/1), but it still runs with framerate=59.
    Could you help me?
    Thank you very much.

    • These cameras do not have v4l2 drivers by default. In order to work around this, you would have to create a v4l2 loopback device and then use GStreamer to sink to that device. Thanks for watching!

  21. Would you please help me out? After I accidentally pulled out the Pi camera while the Jetson Nano was running, the screen fonts and layout are enlarged. What damage might have been inflicted? And how can I check the source of the problems and fix them?

    • I do not have anything to share on your issue. Please ask this question on the official NVIDIA Jetson Nano forum where a large group of developers and NVIDIA engineers share their experience.

  22. Hi @kangalow,

    First of all: many thanks for sharing your knowledge and experiences!

    I’m working with the first release of the Nano. After setting up a fresh image and connecting the Picam V2, the Python sample works very well. But compiling the C++ sample failed because of missing OpenCV headers. What needs to be done to make the headers available or to set up the compiler flags correctly?

  23. Hi, nice video. I just put together my new Jetson Nano, and when I put the cable into the Jetson Nano J13 Camera Connector, the plastic on top broke off. Is there a way to get a replacement part? I did not finish loading the software, so I don’t know if the loose connection will work.
