RACECAR/J Software Install

In the previous article in our RACECAR/J Build Series, we finished the initial assembly of the base robot. Now it is time for the Software Install. Looky here:

Background

In the video, we install the software for the “MIT Hardware Configuration – Version 2.5” of RACECAR/J. This includes support for the following:

  • VESC 4.12 hardware compatible electronic speed controller
  • Sparkfun SEN-14001 IMU
  • Stereolabs ZED Camera
  • Hokuyo UST-10LX Scanning Laser Range Finder

The VESC and SEN-14001 are part of the RACECAR/J Base hardware configuration; the ZED and UST-10LX are additional.

The base RACECAR/J uses some new code for the Sparkfun IMU. Note that this is transition code in the RACECAR/J racecar repository (the RacecarJTransitory branch). A different code base for the IMU is under submission to the OSRF to become an official ROS package, so we’ll publish updates as new code becomes available.

Installation Requirements

The software install is for the Jetson on the RACECAR/J. In the video a Jetson TX2 is used, connected to a monitor, keyboard, mouse and Ethernet. The only difference between the code bases is the version of the ZED camera driver: the version for the Jetson TX2 is in the folder ‘JetsonTX2’ and the version for the Jetson TX1 is in the folder ‘JetsonTX1’.

Note that version 2.2.1 of the ZED Camera driver installs to match the CUDA 8.0 package.

The current software stack runs on the Jetson TX1 and Jetson TX2 running L4T 28.1. Use JetPack 3.1 to install L4T 28.1, and at a minimum:

  • CUDA 8.0
  • cuDNN 6.0
  • OpenCV4Tegra 2.4.13
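Before running the install scripts, it can be worth confirming what is actually on the Jetson. Here is a minimal sketch; the paths are the standard locations for L4T/JetPack installs, and the script simply reports “not found” for anything missing:

```shell
#!/bin/bash
# Report the L4T, CUDA, and cuDNN versions installed on the Jetson.
# Paths are the usual locations for an L4T 28.1 / JetPack 3.1 install.

if [ -f /etc/nv_tegra_release ]; then
    head -n 1 /etc/nv_tegra_release              # L4T release line
else
    echo "L4T release file not found"
fi

if command -v nvcc > /dev/null 2>&1; then
    nvcc --version | grep release                # CUDA toolkit version
else
    echo "CUDA (nvcc) not found"
fi

if [ -f /usr/include/cudnn.h ]; then
    grep -m 3 CUDNN_MAJOR /usr/include/cudnn.h   # cuDNN version defines
else
    echo "cuDNN header not found"
fi
```

If CUDA reports 9.0 rather than 8.0, the ZED 2.2.1 driver below will complain (see the comments at the end of this article).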

Note that the next version of L4T is due to be released within a few weeks of this writing, at which time we’ll start updating the code.

Installation

The installRACECARJ repository on the RACECARJ GitHub account contains the scripts necessary to install the software stack. First, clone the repository and switch to its directory:

$ git clone https://github.com/RacecarJ/installRACECARJ.git
$ cd installRACECARJ

Next, install the appropriate ZED camera driver for the Jetson in use. If possible, plug the ZED camera into the RACECAR/J USB hub (in one of the FAST ports, which are USB 3.0). If the ZED is present during installation, the ZED driver installer will download the camera’s calibration file. To install the Jetson TX2 driver, for example:

$ cd JetsonTX2
$ ./installZED-SDK-TX2.sh

Then, return to the installRACECARJ directory:

$ cd ..

We’re now ready to install the Robot Operating System (ROS) software and the rest of the RACECAR/J software stack. The installation script does the following:

  • L4T 28.1 does not ship a cdc-acm driver, so the script installs a pre-built one. The driver expects a stock kernel (4.4.38-tegra).
  • Because the electronic speed controller and the IMU both report as ttyACM devices, a udev rule is installed which names them vesc and imu respectively.
  • ROS is configured and ros-base is installed.
  • One of the dependencies is missing from the package specifications, so ros-kinetic-opencv3 is installed explicitly.
  • The MIT RACECAR packages are installed, including the ZED v2.2.x ROS wrapper.
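For reference, the udev rule the script installs follows the general pattern below. The vendor IDs shown are illustrative, not necessarily the shipped values; check /etc/udev/rules.d/ after installation for the real rule.

```
# Example shape of udev rules mapping ttyACM devices to stable names.
# The actual matching attributes used by the install script may differ.
KERNEL=="ttyACM*", ATTRS{idVendor}=="0483", SYMLINK+="vesc"
KERNEL=="ttyACM*", ATTRS{idVendor}=="1b4f", SYMLINK+="imu"
```

After installation, `ls -l /dev/vesc /dev/imu` should show the two symlinks once both devices are plugged in.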

To start the installation:

$ ./installMITRACECAR.sh

The directory ‘~/racecar-ws’ is the default workspace directory name; a different name can be specified on the command line after the script name. Because the installation produces such a large volume of messages, you may want to log everything to a file:

$ ./installMITRACECAR.sh |& tee softinstall.log

The log will be placed in the file ‘softinstall.log’ for review. This is useful in case there are installation issues.
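One caveat with piping through tee: in bash, `$?` afterwards reflects tee’s exit status, not the installer’s. A small demonstration, using `false` as a stand-in for a failing installer:

```shell
#!/bin/bash
# tee succeeds even when the command feeding it fails, so $? alone
# can hide an installation failure. PIPESTATUS holds each stage's status.
false |& tee /dev/null
status=${PIPESTATUS[0]}   # exit code of the first pipeline stage (false)
echo "installer exit code: $status"
```

Applied here: after running `./installMITRACECAR.sh |& tee softinstall.log`, check `${PIPESTATUS[0]}` rather than `$?` to see whether the script itself succeeded.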

ROS Environment Variables

Worth noting is that the scripts also set up two environment variables in the .bashrc, namely ROS_MASTER_URI and ROS_IP. These are placeholders; you should replace them with values appropriate to your network layout. Also, while the normal ROS setup.bash is sourced, you may want to source the devel/setup.bash of your workspace instead.
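As a sketch, the relevant .bashrc lines end up looking something like this. The IP addresses are examples only (here, a robot acting as its own ROS master), and the workspace path assumes the default name from the install script:

```shell
# ROS networking: ROS_MASTER_URI points at the machine running roscore,
# ROS_IP at this machine's own address on that network.
export ROS_MASTER_URI=http://192.168.1.100:11311   # example address only
export ROS_IP=192.168.1.100                        # example address only

# Source the workspace overlay (not just the system-wide setup.bash)
# so the RACECAR packages are on the ROS package path.
if [ -f "$HOME/racecar-ws/devel/setup.bash" ]; then
    source "$HOME/racecar-ws/devel/setup.bash"
fi
```

If the Jetson and a ground station are on the same network, ROS_MASTER_URI on the ground station should point at the Jetson’s address.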

After installation, you should be able to run the teleoperation launch file if the VESC is programmed. If you bought the RACECAR/J kit, the VESC is already programmed and you are ready to tele-operate the robot: RACECAR/J – ROS Teleoperation

If you bought the VESC separately, program the VESC next: RACECAR/J – Programming the Electronic Speed Controller

Conclusion

The installation of the entire ROS software stack for RACECAR/J can be a complicated affair. However, these installation scripts should make things fairly simple.

54 Comments

  1. Hi! I just went through a clean install of everything on my Jetson TX2 following your instructions. It was straightforward, with a couple of exceptions: the location of the SDK for the ZED has changed, and amcl and map_server needed manual installation. Thank you for your effort. I’m looking forward to the next step: driving.

  2. Hello,
    First of all I would like to thank you for all the hard work you put on this website to share your knowledge. It’s very helpful.

    I have problem that I could not solve for a while.

    I’m trying to build an RC car similar to the racecar. I have a problem with capturing image frames from the ZED camera: it causes a huge delay in the system (~6 seconds). Keep in mind that I’m using the ZED Python API, which is still in beta. Can you guide me to the right API to capture ZED camera frames smoothly?

    Thanks.

  3. Thanks for this article, and your website in general. Super helpful and informative! Can I ask: Why use BOTH a depth-sensing stereo camera, AND a Lidar unit? Are they each effective at different ranges or something? Thanks again!

    • You’re welcome!
      The RACECAR is used in educational and research settings. One of the interesting problems is sensor fusion: how do you combine the information streams from different sensors into a comprehensive overview of the surrounding environment? A subproblem is determining which sensor to believe in a particular situation.
      The Hokuyo is a very high quality 2D LIDAR, but basically it’s just the line of objects in the field of view at a particular height. The 3D depth camera provides a wider field of view, but isn’t as accurate. The advantage of using a camera to identify something like a stop sign or so on is readily apparent. Typically autonomous vehicles use multiple cameras, radar, ultrasonics, and most use LIDAR. The “real” cars use 3D LIDAR, but it is considerably more expensive than the one used here. A “real” car also combines in GPS information.
      The thing to remember is that sensors like LIDAR may encounter issues depending on things like lighting conditions and reflectivity of objects to the laser being used. Cameras have issues with lighting also, so typically they have RGB cameras and infrared cameras too. You also have to plan for the case where a sensor gets obscured by dirt or something.

      • Would love to see a shootout between the ZED camera, RealSense D435 and Structure.io cameras, specifically in the context of RACECAR/J, and which one(s) you ended up going with.

  4. When I install the ZED driver,

    ./installZED-SDK-TX2.sh

    I got the following error messages:
    Installation path: /usr/local/zed
    Checking CUDA version…
    ERROR : Found CUDA 9.0 but the ZED SDK requires 8.0
    The ZED will NOT work with this CUDA version (9.0), Continue (Y/N) ?
    Installing…

    Could you please let me know how to fix it? I think I flashed JetPack 3.1.

    Thanks,

    George

    • I believe that there is a new ZED driver available. Be aware that the entire software stack has not been tested thoroughly on L4T 28.2. You may also have to change the .rosinstall scripts for a new ZED wrapper.

  5. I flashed my TX2 with JetPack 3.1. But when I tried to install the ZED driver, it said:

    ERROR: cannot verify http://www.stereolabs.com's certificate, issued by 'CN=RapidSSL RSA CA 2018,OU=www.digicert.com,O=DigiCert Inc,C=US':
    Unable to locally verify the issuer's authority.
    To connect to http://www.stereolabs.com insecurely, use `--no-check-certificate'.
    chmod: cannot access 'ZED_SDK_Linux_JTX2_v2.2.1.run': No such file or directory
    ./installZED-SDK-TX2.sh: line 19: ./ZED_SDK_Linux_JTX2_v2.2.1.run: No such file or directory

    Could you let me know how to fix it?

    Thanks,

  6. I plugged in the ZED camera. But when I installed the SDK by running:

    ./installZED-SDK-TX2.sh

    It complained

    *** Download Factory Calibration ***
    No ZED detected : ZED calibration file will not be downloaded. Go directly to http://calib.stereolabs.com to manually download your calibration file.
    ZED SDK installation complete !

    Is this normal?

  7. “If the ZED is present during installation, the ZED driver installer will download the cameras’ calibration file.”
    I don’t see anything in the ZED documentation about calibrating the camera, and no indication of whether or not this succeeded. Can you tell me where to look for the calibration file?

    • For anyone finding this question in the future and having the same problem: my main clue that solved this for me was the comment from George above. The ZED camera installation script does not actually automagically download the calibration file. You need to run wget http://calib.stereolabs.com/?SN= , and replace with the serial number for your camera, which is printed on the back of the box it came in.

      Alternatively (or possibly in addition to?) the above, you can run /usr/local/zed/tools/ZED\ Calibration , which I’m guessing creates a custom calibration file.

      Hoping this helps someone out in the future, because it isn’t documented anywhere on the ZED developer site.

      • The ZED shipped nowadays has new firmware that does not work with the RacecarJ install script. RacecarJ uses an old version of the ZED API, including the tools such as Explorer. What I did (recommended by ZED) was install the new ZED API on a different machine, link the ZED to that machine, then use the tool on that machine to downgrade the ZED firmware to an older version. Then, link the ZED to the TX2 and the RacecarJ script will work perfectly.

        • I *think* you’re talking only about the RacecarJ install script, is that correct? I only ask because, as far as I can tell, RacecarJ (and mit-racecar as well) doesn’t contain any actual logic, just a bunch of ROS sensor nodes chattering away at each other. I want to make sure that my manual installation of the ZED driver, software and ROS node didn’t screw something up (it seems to be working fine on the command line, with no errors, but who knows for sure?)

          • I noticed that teleop.launch did not start the ZED ROS node. What is the correct way to launch the car ROS node together with the ZED ROS node?

  8. When I run “roslaunch zed_wrapper zed.launch”

    I got the error below. Do you know why, and how do I fix it?

    [ERROR] [1531400428.350306233]: Failed to load nodelet [/zed/zed_wrapper_node] of type [zed_wrapper/ZEDWrapperNodelet] even after refreshing the cache: MultiLibraryClassLoader: Could not create object of class type zed_wrapper::ZEDWrapperNodelet as no factory exists for it. Make sure that the library exists and was explicitly loaded through MultiLibraryClassLoader::loadLibrary()
    [ERROR] [1531400428.350438805]: The error before refreshing the cache was: MultiLibraryClassLoader: Could not create object of class type zed_wrapper::ZEDWrapperNodelet as no factory exists for it. Make sure that the library exists and was explicitly loaded through MultiLibraryClassLoader::loadLibrary()

  9. Can you tell me the detailed steps required to run the RACECAR/J in autonomous mode? I can run it using teleop.launch. I also have the lidar and IMU interfaced with the Jetson, so I just need to understand how to feed the map and what commands to run.

  10. Hi,

    I am trying to understand how the car runs in autonomous mode. Can you tell me the commands to run it in autonomous mode? Also, I am trying to understand how you are getting the odometry information and passing it to the navigation stack. Can anyone help me understand this?

  11. Hi,
    I downloaded and followed the installation steps given above; now I want to run it in autonomous mode and I don’t know which command to run. Also, I am not able to understand where the message is published on the “ackermann_cmd_input” topic and the “ackermann_cmd” topic. Can you help me understand what information needs to be given to the navigation stack and how you are doing it in the program?

    • I’m not sure what you mean by “run it in autonomous mode”. The “known_map_localization.launch” file will run using the map of the tunnels underneath MIT. But unless you have that environment you will need to create your own maps.

      Did you try using rqt to examine the node graph and examine topics? Did you look at the ROS messages? /joy_teleop is a node that translates the /joy input to ackermann commands. /joy_teleop publishes on /ackermann_cmd_mux/input/teleop

      /joy_teleop is implemented by src/racecar/racecar/scripts/joy_teleop.py

      • By autonomous mode I mean using the ROS navigation stack.
        If I am right, there are three modes of operation one,where we can use the teleop mode (which I tried. works fine) and two modes that work on ROS navigation stack and ackermann control. I do not own a hokuyo lidar and I have a camera that produces pointcloud images (I have all the other components). From the tutorial I have understood that we can provide either a laser scan data or a point cloud data. I would like to use the ROS navigation stack by giving it the point cloud messages and I am having trouble finding the right place to provide this message as well as to launch the RACECAR in this autonomous mode where the navigation stack is used. Thank you.

        • I’m not quite sure how you are thinking about this. My understanding is that there is one “out of the box” launch file, which is for teleoperation. The other launch files are for two purposes. The first is called “mux.launch”; this is for a graph which includes high level and low level Ackermann control. The other launch file is for a known map, which is the map for the tunnels underneath MIT.

          In the introduction to robotics class, there are 6 assignments for the 12 week class. The MIT students modify the “mux.launch” file to include their algorithms for autonomy. These include a parking controller, a line follower, a parking cone detector, a wall follower, visual servoing, path planning and so on. The output of the mux ties into the vesc/ackermann topic.

          The last assignment is to race through the tunnels underneath MIT in time trials using the supplied map.

          In order to build your own autonomy algorithm, you would create a node (or nodes) which subscribes (or uses an action/service) to the desired topics, do whatever processing you need, and then publish to the desired topic. You will also need to think about a safety controller, so that you have some control over the robot after launch. Hope this helps.


  12. Thank you for the information and sorry for the trouble. I am not clearly understanding whether any default autonomy algorithm is included in this package, or whether we should supply our own?

    • It’s not trouble, there just seems to be a nomenclature issue. Autonomy is too general a term to really define what you are trying to accomplish. An autonomous vehicle just means that it does something on its own; the complexity of the goal is not specified. One could have a robot drive for 10 seconds in a straight line. Or one could have a robot navigate through an environment avoiding obstacles to a stated location. Both are “autonomous”, that is, they work without intervention. But they are hardly the same thing.

      You should build your own “autonomy algorithm” depending on your goals. There are several tools in the mit-racecar Github repository, such as particle_filter ( https://github.com/mit-racecar/particle_filter ) , which you may find useful for scaffolding your solution.

      Typically you would publish your “go command” to the high level Ackermann command in the mux, and also provide an arbiter in the safety control node so that you avoid running into anything. You would also publish the low level Ackermann commands to the VESC.

  13. Thank you for the information and I agree that there is a bit of nomenclature issue. Is there any place you can point me to where I can find the signal flow of this RACECAR?

        • What did you try?
          rqt_graph will show the graph of a running ROS system.
          rosnode, rostopic, rosmsg and rossrv and other command line tools can be used for examining a running system.

          For example, start teleop. Then:

          $ rosnode info joy_teleop

          joy_teleop is the node which converts joystick commands into ackermann_msgs, which eventually go to the VESC.

          If you have read and understand this book, you should be able to find your way around:
          Programming Robots with ROS: A Practical Introduction to the Robot Operating System (Quigley et al)
          https://amzn.to/2PgOVJ8

  14. Hi Jim,

    I have had some trouble with the first step for the ZED camera. I installed the latest version of JetPack, so I have CUDA 9.0. Is there a way I can continue with CUDA 9.0, or do I have to use CUDA 8.0? If the latter, how?

    Thanks!

  15. If you want to use CUDA 9.0, you will need to install the appropriate driver for the ZED camera from the Stereolabs developer website, along with modifying the version of the ROS ZED wrapper package. The only currently supported version of the software stack is as described above.

  16. Hi there,

    Currently, I’m exploring how the ackermann_cmd_mux package interfaces with the VESC while the car is operating. My end goal here is to understand how exactly the joystick manages to publish to ackermann_cmd_mux, which then prescribes inputs to the VESC (apologies if I’ve understood this wrong). My end goal is to allow keyboard operation, or to use another interface to program values to the VESC, but I haven’t really been able to figure out how the joystick sends commands. Is there a configuration file I’m missing, or am I just thinking about this the wrong way? Could you please point me in the right direction?

    • Hello,

      I recently downloaded the racecar package and ran it through the teleop launch file. I then went inside the vesc folder; inside vesc/src there is vesc_to_odom.cpp, which I think basically gives out odometry information from the VESC. In order to interpret the data I changed “published_tf:false” to true, thinking it would show me the odom topic when I run rostopic list, but I am unable to see the topic. Is there any other file I need to edit to get the odometry info? Am I in the right direction or completely wrong? My end goal is to take the odometry info from the VESC and feed it to gmapping. Can anyone help me out?

      Thanks.

      • Hi there,

        I think the odometry information is provided by the IMU and not the VESC. The IMU sends the odometry information to the VESC. Someone please tell me if my understanding is wrong. According to me, since the IMU has an accelerometer and gyroscope, maybe that is giving out the odometry information. I am trying to figure out how the map is built; currently I am looking at map_server and gmapping to create a map, and I will update you if I figure it out.

        • The IMU is not connected to the VESC. The VESC node calculates odometry. The VESC tracks the motor speed. You can calculate the wheel speed from the motor speed and gearing. If you know the size of the wheels, you can then calculate positional changes. The VESC also controls the steering servo. Using information from the steering servo, you can calculate the angular velocity. The wheel base of the robot is also needed in the calculations.

          You can see this code here: https://github.com/mit-racecar/vesc/blob/master/vesc_ackermann/src/vesc_to_odom.cpp

          Note that brushless motors have some issues on startup because they do not “know” where the rotor is, so they do not know which stator to fire first. This can cause the motor to “cog” while it hunts for the right stator sequence as it starts up, which can throw the odometry off a little. Adding a sensored brushless motor can help with this. Sensored brushless motors use something like a Hall effect sensor to determine where the rotor is, which helps calculate the stator firing sequence for a smoother startup. The VESC can handle sensored brushless motors.

          Note that this is not as precise as having encoders on the wheels. The IMU can be used to help in these calculations, along with the information from the lidar. Typically that’s one of the tasks on the robot, fusing sensor data to get a ground truth of what is taking place.

          • Hello,

            I am trying to recreate this project. I have few questions when I went through the workspace.
            1. After reading the comments I understood the odometry info is coming from the VESC; then what is the use of the IMU?
            2. How were the maps generated? For this I read through the book and understood it requires gmapping or hector_slam to create the map, but I don’t know how to send the “tf” and update it while creating the rosbag file, which are the subscribed topics for gmapping.
            3. How was the path decided once the map was created?

            I know these are a lot of questions, and most of them may not make sense, but I just want to have a proper understanding of the project.

            Thanks.
