RACECAR/J Software Install

In the previous article in our RACECAR/J Build Series, we finished the initial assembly of the base robot. Now it is time for the Software Install.

FOCBOX Replaced by VESC 6 Plus

Important Note: Currently the RACECAR/J Kits are shipping with the VESC 6 Plus.

There are two installation scripts in the GitHub repository. In the video, we use installMITRACECAR.sh, which installs the VESC driver software for version 4.12 hardware (the FOCBOX and earlier versions of the VESC). The script installMITRACECARVESC6.sh installs the VESC driver software for version 6.X hardware, such as the VESC 6 Plus.

The format of some of the data structures has changed for the 6.X hardware; make sure that you match the installer to the version of VESC that you are using.

Looky here:

Background

In the video, we install the software for the “MIT Hardware Configuration – Version 2.5” of RACECAR/J. This includes support for the following:

  • VESC 4.12 hardware-compatible electronic speed controller
  • SparkFun SEN-14001 IMU
  • Stereolabs ZED Camera
  • Hokuyo UST-10LX Scanning Laser Range Finder

The VESC and SEN-14001 are part of the RACECAR/J Base hardware configuration; the ZED and UST-10LX are additional.

The base RACECAR/J uses some new code for the SparkFun IMU. Note that this is transition code in the RACECAR/J racecar repository (the RacecarJTransitory branch). A different code base for the IMU has been submitted to the OSRF to become an official package; we'll publish updates as the new code becomes available.
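If you want to examine that transition code, here is a sketch of checking it out (the repository URL is assumed from the RacecarJ GitHub account; adjust it if yours differs):

$ git clone -b RacecarJTransitory https://github.com/RacecarJ/racecar.git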

Installation Requirements

The software install is for the Jetson on the RACECAR/J. In the video, a Jetson TX2 is used, connected to a monitor, keyboard, mouse, and Ethernet. The only difference between the code bases is the version of the ZED Camera driver: the version for the Jetson TX2 is in the folder ‘JetsonTX2’ and the version for the Jetson TX1 is in the folder ‘JetsonTX1’.

Note that version 2.2.1 of the ZED Camera driver is installed to match the CUDA 8.0 package.

The current software stack runs on the Jetson TX1 and Jetson TX2 running L4T 28.1. Use JetPack 3.1 to install L4T 28.1 and, at a minimum, the following components (see the version check after the list):

  • CUDA 8.0
  • cuDNN 6.0
  • OpenCV4Tegra 2.4.13
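As a sanity check, you can verify these versions from a terminal. This is a sketch assuming the JetPack 3.1 default install paths; they may vary on other releases:

$ cat /usr/local/cuda/version.txt
$ grep -m1 -A2 CUDNN_MAJOR /usr/include/cudnn.h
$ dpkg -l | grep opencv4tegra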

Note that the next version of L4T is expected to be released within a few weeks of this writing, at which time we'll start to update the code.

Installation

The installRACECARJ repository on the RacecarJ GitHub account contains the scripts necessary to install the software stack. First, clone the repository and switch to the repository directory:

$ git clone https://github.com/RacecarJ/installRACECARJ.git
$ cd installRACECARJ

Next, install the appropriate ZED camera driver for the Jetson in use. If possible, plug the ZED camera into the RACECAR/J USB hub, in one of the FAST ports (these are the SuperSpeed USB 3.0 ports). If the ZED is present during installation, the ZED driver installer will download the camera's calibration file. To install the Jetson TX2 driver, for example:

$ cd JetsonTX2
$ ./installZED-SDK-TX2.sh
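If the ZED was not detected during installation, you can check how it enumerated and download the calibration file manually, as several readers note in the comments below. A sketch (the settings path and SN00000.conf naming follow the 2.x SDK convention; substitute your camera's serial number for 00000):

$ lsusb -t
$ sudo wget -O /usr/local/zed/settings/SN00000.conf "http://calib.stereolabs.com/?SN=00000"

The ZED should enumerate at 5000M in the lsusb output, which indicates a SuperSpeed (USB 3.0) connection.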

Then, return to the installRACECARJ directory:

$ cd ..

We’re now ready to install the Robot Operating System (ROS) software and the rest of the RACECAR/J software stack. The installation script does the following:

  • L4T 28.1 does not ship with a cdc-acm driver, so the script installs a pre-built cdc-acm module. The driver expects a stock kernel (4.4.38-tegra).
  • Because the electronic speed controller and the IMU both report as ttyACM devices, a udev rule is installed which names them vesc and imu respectively (see the check after this list).
  • ROS is configured and ros-base is installed.
  • One of the dependencies is missing from the package specifications, so ros-kinetic-opencv3 is installed explicitly.
  • The MIT RACECAR packages are installed, including the ZED v2.2.x ROS wrapper.
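Once the installation below completes, a quick way to confirm that the cdc-acm module loaded and the udev rules took effect (this assumes the rules create /dev/vesc and /dev/imu entries, per the list above):

$ lsmod | grep cdc_acm
$ ls -l /dev/ttyACM* /dev/vesc /dev/imu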

To start the installation (use installMITRACECAR.sh instead for earlier 4.X version VESC hardware):

$ ./installMITRACECARVESC6.sh

The directory ‘~/racecar-ws’ is the default workspace directory name; a different name can be specified on the command line after the script name. Because the installation produces a large number of messages, you may want to log everything to a file:

$ ./installMITRACECARVESC6.sh |& tee softinstall.log

The log will be placed in the file ‘softinstall.log’ for review, which is useful in case there are installation issues.
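The two options combine; for example, to install into a custom workspace and keep a log:

$ ./installMITRACECARVESC6.sh ~/my-racecar-ws |& tee softinstall.log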

ROS Environment Variables

Worth noting is that the scripts also set up two environment variables in the .bashrc, namely ROS_MASTER_URI and ROS_IP. These are placeholders; replace them with values appropriate to your network layout. Also, while the standard ROS setup.bash is sourced, you may want to source the devel/setup.bash of your workspace instead.
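For example, for a robot whose Jetson is at 192.168.1.42 and runs its own ROS master, the .bashrc entries would look like this (the addresses are illustrative; substitute your own):

export ROS_MASTER_URI=http://192.168.1.42:11311
export ROS_IP=192.168.1.42
source ~/racecar-ws/devel/setup.bash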

After installation, you should be able to run the teleoperation launch file, provided the VESC is programmed. If you bought the RACECAR/J kit, the VESC comes already programmed, and you are ready to tele-operate the robot: RACECAR/J – ROS Teleoperation
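For reference, the teleoperation launch is:

$ roslaunch racecar teleop.launch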

If you bought the VESC separately, program the VESC next: RACECAR/J – Programming the Electronic Speed Controller

Conclusion

The installation of the entire ROS software stack for RACECAR/J can be a complicated affair. However, these installation scripts should make things fairly simple.


73 Responses

  1. Hi! I just went through a clean install of everything on my Jetson TX2 following your instructions. It was straightforward, with a few exceptions: the location of the SDK for the ZED has changed, and amcl and map_server needed a manual install. Thank you for your effort. I hope to see the next step: driving.

  2. Hello,
    First of all, I would like to thank you for all the hard work you put into this website to share your knowledge. It's very helpful.

    I have a problem that I could not solve for a while.

    I'm trying to build an RC car similar to the RACECAR. I have a problem with capturing image frames from the ZED camera: it causes a huge delay in the system (~6 seconds). Keep in mind that I'm using the ZED Python API, which is still in beta. Can you guide me to the right API to capture ZED camera frames smoothly?

    Thanks.

    1. Thank you for the kind words. RACECAR/J uses the ROS ZED package. More than likely you would want to use the C/C++ interfaces. Thanks for reading!

  3. Thanks for this article, and your website in general. Super helpful and informative! Can I ask: Why use BOTH a depth-sensing stereo camera, AND a Lidar unit? Are they each effective at different ranges or something? Thanks again!

    1. You’re welcome!
      The RACECAR is used in educational and research settings. One of the interesting problems is sensor fusion: how do you combine the information streams from different sensors to get a comprehensive picture of the surrounding environment? A subproblem is how you determine which sensor to believe in a particular situation.
      The Hokuyo is a very high quality 2D LIDAR, but it basically gives you just the line of objects in the field of view at a particular height. The 3D depth camera provides a wider field of view, but isn't as accurate. The advantage of using a camera to identify something like a stop sign is readily apparent. Typically autonomous vehicles use multiple cameras, radar, and ultrasonics, and most use LIDAR. The "real" cars use 3D LIDAR, but it is considerably more expensive than the one used here. A "real" car also incorporates GPS information.
      The thing to remember is that sensors like LIDAR may encounter issues depending on things like lighting conditions and the reflectivity of objects to the laser being used. Cameras have issues with lighting too, which is why vehicles typically carry both RGB and infrared cameras. You also have to plan for the case where a sensor gets obscured by dirt or something.

      1. Would love to see a shootout between the ZED camera, RealSense D435 and Structure.io cameras, specifically in the context of RACECAR/J, and which one(s) you ended up going with.

  4. When I install the ZED driver,

    ./installZED-SDK-TX2.sh

    I got the following error messages:
    Installation path: /usr/local/zed
    Checking CUDA version…
    ERROR : Found CUDA 9.0 but the ZED SDK requires 8.0
    The ZED will NOT work with this CUDA version (9.0), Continue (Y/N) ?
    Installing…

    Could you please let me know how to fix it? I think I flashed JetPack 3.1.

    THanks,

    George

    1. I believe that there is a new ZED driver available. Be aware that the entire software stack has not been tested thoroughly on L4T 28.2. You may also have to change the .rosinstall scripts for a new ZED wrapper.

  5. I flash my Tx2 with JetPack3.1. But when I tried to install the ZED, it said:

    ERROR: cannot verify http://www.stereolabs.com's certificate, issued by 'CN=RapidSSL RSA CA 2018,OU=www.digicert.com,O=DigiCert Inc,C=US':
    Unable to locally verify the issuer's authority.
    To connect to http://www.stereolabs.com insecurely, use `--no-check-certificate'.
    chmod: cannot access 'ZED_SDK_Linux_JTX2_v2.2.1.run': No such file or directory
    ./installZED-SDK-TX2.sh: line 19: ./ZED_SDK_Linux_JTX2_v2.2.1.run: No such file or directory

    Could you let me know how to fix it?

    Thanks,

  6. I plugged in the ZED camera. But when I installed the SDK by running:

    ./installZED-SDK-TX2.sh

    It complained

    *** Download Factory Calibration ***
    No ZED detected : ZED calibration file will not be downloaded. Go directly to http://calib.stereolabs.com to manually download your calibration file.
    ZED SDK installation complete !

    Is this normal?

    1. I manually downloaded the calibration file from http://calib.stereolabs.com/?SN=mySN and put it under /usr/local/zed/settings.

      When I ran ZED Explorer, the GUI said "Standby". The ID and SN fields were empty and there was no image. It looks like the TX2 did not recognize the ZED. Do I need to do an additional install?

        1. The description for the USB hub: https://amzn.to/2I7DQa2
          reads: AmazonBasics 7 Port USB 3.0 Hub with 12V/3A Power Adapter
          That hub has two Super speed (5 Gbps) ports. The ports are labeled FAST. Typically one plugs a ZED camera into one of those ports.
          I do not know why you think that the hub is USB 2.0.

          1. I could not find any “FAST” label on the USB hub.

            I connected the ZED directly to the TX2 USB port. The TX2 still cannot recognize it…

  7. “If the ZED is present during installation, the ZED driver installer will download the cameras’ calibration file.”
    I don't see anything in the ZED documentation about calibrating the camera, and no indication of whether or not this succeeded. Can you tell me where to look for the calibration file?

    1. For anyone finding this question in the future and having the same problem: my main clue, which solved this for me, was the comment from George above. The ZED camera installation script does not actually automagically download the calibration file. You need to run wget http://calib.stereolabs.com/?SN= , and replace with the serial number for your camera, which is printed on the back of the box it came in.

      Alternatively (or possibly in addition to the above), you can run /usr/local/zed/tools/ZED\ Calibration , which I'm guessing creates a custom calibration file.

      Hoping this helps someone out in the future, because it isn’t documented anywhere on the ZED developer site.

      1. The ZED shipped nowadays has new firmware that does not work with the RacecarJ install script. RacecarJ uses an old version of the ZED API, including the tools, such as Explorer. What I did (recommended by ZED) was install the new ZED API on a different machine, connect the ZED to that machine, then use the tool on that machine to downgrade the ZED firmware to an older version. Then connect the ZED to the TX2, and the RacecarJ script will work perfectly.

        1. I *think* you're talking only about the RacecarJ install script, is that correct? I only ask because, as far as I can tell, RacecarJ (and mit-racecar as well) doesn't contain any actual logic, just a bunch of ROS sensor nodes chattering away at each other. I want to make sure that my manual installation of the ZED driver, software, and ROS node didn't screw something up (it seems to be working fine on the command line, with no errors, but who knows for sure?)

          1. I noticed that teleop.launch did not start the ZED ROS node. What is the correct way to launch the car ROS nodes together with the ZED ROS node?

  8. When I run “roslaunch zed_wrapper zed.launch”

    I got the error below. Do you know why, and how do I fix it?

    [ERROR] [1531400428.350306233]: Failed to load nodelet [/zed/zed_wrapper_node] of type [zed_wrapper/ZEDWrapperNodelet] even after refreshing the cache: MultiLibraryClassLoader: Could not create object of class type zed_wrapper::ZEDWrapperNodelet as no factory exists for it. Make sure that the library exists and was explicitly loaded through MultiLibraryClassLoader::loadLibrary()
    [ERROR] [1531400428.350438805]: The error before refreshing the cache was: MultiLibraryClassLoader: Could not create object of class type zed_wrapper::ZEDWrapperNodelet as no factory exists for it. Make sure that the library exists and was explicitly loaded through MultiLibraryClassLoader::loadLibrary()

  9. Can you tell me the detailed steps required to run the RACECAR/J in autonomous mode? I can run it using teleop.launch. I also have the lidar and IMU interfaced with the Jetson, so I just need to understand how to feed it a map and what commands to run.

  10. Hi,

    I am trying to understand how the car runs in autonomous mode. Can you tell me the commands to run it in autonomous mode? Also, I am trying to understand how you are getting the odometry information and passing it to the navigation stack. Can anyone help me understand this?

  11. Hi,
    I downloaded and followed the installation steps given above; now I want to run it in autonomous mode, and I don't know which command to run. Also, I am not able to understand where the messages on the "ackermann_cmd_input" topic and "ackermann_cmd" topic are published. Can you help me understand what information needs to be given to the navigation stack and how you are doing it in the program?

    1. I'm not sure what you mean by "run it in autonomous mode". The "known_map_localization.launch" file will run using the map of the tunnels underneath MIT, but unless you have that environment you will need to create your own maps.

      Did you try using rqt to examine the node graph and its topics? Did you look at the ROS messages? /joy_teleop is a node that translates the /joy input to Ackermann commands. /joy_teleop publishes on /ackermann_cmd_mux/input/teleop

      /joy_teleop is implemented by src/racecar/racecar/scripts/joy_teleop.py
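      For example, with teleop running you can watch the commands flow:

      $ rostopic echo /ackermann_cmd_mux/input/teleop
      $ rostopic info /ackermann_cmd_mux/input/teleop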

      1. By autonomous mode I mean using the ROS navigation stack.
        If I am right, there are three modes of operation: one where we can use the teleop mode (which I tried; it works fine), and two modes that work on the ROS navigation stack and Ackermann control. I do not own a Hokuyo lidar, and I have a camera that produces point-cloud images (I have all the other components). From the tutorial I have understood that we can provide either laser scan data or point cloud data. I would like to use the ROS navigation stack by giving it the point cloud messages, and I am having trouble finding the right place to provide this message, as well as how to launch the RACECAR in this autonomous mode where the navigation stack is used. Thank you.

          1. I'm not quite sure how you are thinking about this. My understanding is that there is one "out of the box" launch file, which is for teleoperation. The other launch files serve two purposes. The first is called "mux.launch"; this is for a graph which includes a high-level and a low-level Ackermann control. The other launch file is for a known map, which is the map of the tunnels underneath MIT.

          In the introduction to robotics class, there are 6 assignments in the 12-week class. The MIT students modify the "mux.launch" file to include their algorithms for autonomy. These include a parking controller, a line follower, a parking cone detector, a wall follower, visual servoing, path planning, and so on. The output of the mux ties into the vesc/ackermann topic.

          The last assignment is to race through the tunnels underneath MIT in time trials using the supplied map.

          In order to build your own autonomy algorithm, you would create a node (or nodes) which subscribes (or uses an action/service) to the desired topics, does whatever processing you need, and then publishes to the desired topic. You will also need to think about a safety controller, so that you have some control over the robot after launch. Hope this helps.

  12. Thank you for the information, and sorry for the trouble. I am not clear on whether any default autonomy algorithm is included in this package, or whether we should include our own.

    1. It's not trouble; there just seems to be a nomenclature issue. "Autonomy" is too general a term to really define what you are trying to accomplish. An autonomous vehicle just means that it does something on its own; the complexity of the goal is not specified. One could have a robot drive for 10 seconds in a straight line, or one could have a robot navigate through an environment, avoiding obstacles, to a stated location. Both are "autonomous", that is, they work without intervention, but they are hardly the same thing.

      You should build your own “autonomy algorithm” depending on your goals. There are several tools in the mit-racecar Github repository, such as particle_filter ( https://github.com/mit-racecar/particle_filter ) , which you may find useful for scaffolding your solution.

      Typically you would publish your "go command" to the high-level Ackermann command in the mux, and also provide an arbiter in the safety control node so that you avoid running into anything. The mux in turn publishes the low-level Ackermann commands to the VESC. See the sketch below.
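      As a sketch (the navigation input topic name here is my assumption from the mux configuration; check yours with rostopic list), you can hand-publish a single command to watch the chain work:

      $ rostopic pub -1 /ackermann_cmd_mux/input/navigation ackermann_msgs/AckermannDriveStamped "{drive: {speed: 0.5, steering_angle: 0.0}}"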

  13. Thank you for the information, and I agree that there is a bit of a nomenclature issue. Is there any place you can point me to where I can find the signal flow of this RACECAR?

        1. What did you try?
          rqt_graph will show the graph of a running ROS system.
          rosnode, rostopic, rosmsg and rossrv and other command line tools can be used for examining a running system.

          For example, start teleop. Then:

          $ rosnode info joy_teleop

          joy_teleop is the node which converts joystick commands into ackermann_msgs, which eventually make their way to the VESC.

          If you have read and understand this book, you should be able to find your way around:
          Programming Robots with ROS: A Practical Introduction to the Robot Operating System (Quigley et al)
          https://amzn.to/2PgOVJ8

  14. Hi Jim,

    I have had some trouble with the first step for the ZED camera. I installed the latest version of JetPack, so I have CUDA 9.0. Is there a way I can continue with CUDA 9.0, or do I have to use CUDA 8.0? If so, how?

    Thanks!

  15. If you want to use CUDA 9.0, you will need to install the appropriate driver for the ZED camera from the Stereolabs developer website, along with modifying the version of the ROS ZED wrapper package. The only currently supported version of the software stack is as described above.

  16. Hi there,

    Currently, I'm exploring how the ackermann_cmd_mux package interfaces with the VESC while the car is operating. My goal is to understand how exactly the joystick manages to publish to ackermann_cmd_mux, which then prescribes inputs to the VESC (apologies if I've understood this wrong). My end goal is to allow keyboard operation, or to use another interface to program values to the VESC, but I haven't really been able to figure out how the joystick sends commands. Is there a configuration file I'm missing, or am I just thinking about this the wrong way? Could you please point me in the right direction?

    1. Hello,

      I recently downloaded the racecar package and got it running through the teleop launch file. I then went inside the vesc folder, and inside vesc/src there is vesc_to_odom.cpp, which I think basically gives out odometry information from the VESC. In order to interpret the data, I changed "published_tf:false" to true, expecting an odom topic to show up when I run rostopic list, but I am unable to see the topic. Is there any other file I need to edit to get the odometry info? Am I headed in the right direction, or completely wrong? My end goal is to take the odometry info from the VESC and feed it to gmapping. Can anyone help me out?

      Thanks.

      1. Hi there,

        I think the odometry information is provided by the IMU and not the VESC; the IMU sends the odometry information to the VESC. Someone please tell me if my understanding is wrong. According to me, since the IMU has an accelerometer and a gyroscope, maybe that is what gives out the odometry information. I am trying to figure out how the map is built; currently I am looking at map_server and gmapping to create a map, and I will update you if I figure it out.

        1. The IMU is not connected to the VESC. The VESC node calculates odometry. The VESC tracks the motor speed; you can calculate the wheel speed from the motor speed and gearing. If you know the size of the wheels, you can then calculate positional changes. The VESC also controls the steering servo; using information from the steering servo, you can calculate the angular velocity. The wheelbase of the robot is also needed for these calculations.

          You can see this code here: https://github.com/mit-racecar/vesc/blob/master/vesc_ackermann/src/vesc_to_odom.cpp

          Note that brushless motors have some issues at startup because they do not "know" where the rotor is, so they do not know which stator to fire first. This can cause the motor to "cog" while it hunts for the right stator sequence as it starts up, which can throw the odometry off a little. Adding a sensored brushless motor can help with this: sensored brushless motors use something like a Hall-effect sensor to determine where the rotor is, which helps calculate the stator firing sequence for a smoother startup. The VESC can handle sensored brushless motors.

          Note that this is not as precise as having encoders on the wheels. The IMU can be used to help in these calculations, along with the information from the lidar. Typically that's one of the tasks on the robot: fusing sensor data to get a ground truth of what is taking place.
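          As a sketch of the arithmetic (the parameter names are illustrative; see the linked source for the real ones):

          v = (erpm - erpm_offset) / speed_to_erpm_gain   (wheel speed from motor ERPM)
          delta = (servo - servo_offset) / steering_to_servo_gain   (steering angle from servo position)
          omega = v * tan(delta) / wheelbase   (angular velocity, bicycle model)
          x += v*cos(theta)*dt,  y += v*sin(theta)*dt,  theta += omega*dt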

          1. Hello,

            I am trying to recreate this project, and I have a few questions after going through the workspace:
            1. After reading the comments, I understood the odometry info is coming from the VESC; then what is the use of the IMU?
            2. How were the maps generated? For this I read through the book and understood it requires gmapping or hector_slam to create the map, but I don't know how to send the "tf" and update it while creating the rosbag file, which are the subscribed topics for gmapping.
            3. How was the path decided once the map was created?

            I know these are a lot of questions, and most of them may not make sense, but I just want to get a proper understanding of the project.

            Thanks.

  17. Hi Jim,
    First, I would like to thank you for this awesome blog.
    Right now I'm planning to build an RC car similar to Daniel Tobias's.
    I think I will choose the ZED camera, a Logitech C920, and a Velodyne VLP-16 lidar. Do I need to prepare a special script for this lidar model? I have an older version of the IMU from SparkFun; it was described by you and tested on the TK1 platform. The ROS software stack will run on the TX2 platform. I'm planning to run a DNN using a TensorFlow backend. For controlling the motors I will go with the VESC 4.12 driver. I'm curious whether the TX2 will be enough to run one DNN based on the C920 camera and one for the VLP-16 & ZED. On the host side (laptop) I also need to run object detection based on YOLOv3. Is the Wi-Fi built into the TX2 devkit module enough for this?

    1. Hi Michal,
      If you are using stereo vision on the ZED, that takes up quite a bit of the GPU. If not, then there's a chance you can get everything to work together. That said, this depends a lot on the application(s) you are planning to run onboard. The VESC and IMU don't take up much processing, but processing the VLP-16 data stream along with the ZED and C920 is expecting an awful lot. Note that MIT uses a Jetson Xavier with this selection of hardware. Thanks for reading!

  18. Thanks for the installation guide.
    I am new to Linux and ROS. I would like to know if there is a RACECAR/J package available for JetPack 3.2.1. I am not using the same sensors, other than an RPLidar 2. Where do I start if I want to create a customized RACECAR/J package using other cameras, like a BFLY camera or the Jetson TX2 onboard camera? Thanks.

    1. There is currently no JetPack 3.2.1 version of RACECAR/J available. If you build a customized package, you will need to find the ROS packages for any of the sensors that you intend to add and substitute them for the existing ones. Good luck on your project!

  19. Hi everyone, I have successfully built my car. However, I cannot find the FOCBOX VESC anymore; the manufacturer no longer makes them. So if you have a spare or want to sell your used one, please let me know…

    1. @Andrew, thanks. Is the software the same as the FOCBOX? Can you use the BLDC tool with this? Could you post some pictures?

  20. Hi, I have some strange results (the odom moving on its own, forward & backward inverted), but only in RViz, using the VESC 6 Plus. Do you have an idea why? Does it work well for you in RViz? (I used the VESC6 branch and did not change the RacecarJ code.)

    I used this repo for the URDF (racecar_description): https://github.com/mit-racecar/racecar_gazebo
    My launch file for RViz: roslaunch racecar_description display.launch

      1. Yes, "exactly" (I have a "cog" effect, but I don't think that is the issue for RViz).
        In RViz: the odom moves on its own, and forward & backward are inverted :/

        1. The odom is moving on its own, I think, because when I do a `roslaunch racecar teleop.launch` and then a `rostopic echo /odom`, I can see in the message that pose/pose/position/x is always incrementing (twist/twist/linear/x is also moving) even though I am not moving the robot. It's strange, no? Do you have the same result?

  21. Hi Jim, I seem to be having the opposite problem of most people. First, my setup (everything installed in the past few months, for software version reference): TX2, Traxxas car + brushless 3500 motor + battery, ZED camera, VESC from mboards (VESC 4.12 config), Logitech 710 controller. I had to dissect all of the automated installers, as I have the most current version of JetPack (4.3?) running ROS Melodic. I am almost positive I got all of the install right. When running the basic teleop.launch, forward/backward works in the sense that the wheels spin in the correct direction, but definitely not well: it is a very inconsistent, choppy rotation. Left/right does not work at all; right makes a clicking sound, left does nothing. A note: my controller (same as yours) only works in X mode, and when I flip to 'D' it crashes the Jetson and reboots. Weird. When trying to turn, I get ROS errors coming from the MIT stack ('unknown payload type', 'out-of-sync with VESC', unknown data, etc.) for the duration of time that I am moving the joystick. I figured since my goal is autonomous operation/SLAM, I could just forget the joystick as long as the car will drive itself. Fast-forward through the software development, and the hardware is behaving the same way: the car wants to turn and drive, but wheel rotation is still very choppy and confused, and the wheels still won't turn. Any ideas? I've exhausted every other resource I can find and am really trying to avoid the full wipe/reinstall troubleshooting method. Thanks in advance!

    1. Hi Chris,
      It's not clear to me how you programmed your VESC, changed the firmware, or tuned it to match the motor you are using. The standard RACECAR stack runs on an Ubuntu 16.04 system; JetPack 4.3 is 18.04, so I would expect some issues with the given drivers.
      The VESC issues sound like a tuning issue or mismatched firmware, or it could be that the VESC is defective.
      I don't have any experience with the mboards version.

  22. Hi Jim,

    Thanks for your response. I programmed the VESC using the newest version of the VESC Tool, the servo-out firmware that it had loaded, and the tuning config from the MIT VESC repo. Tuning may have been overlooked in this case. I've started fresh in a new workspace on the same machine to see if I can track down the issue. So far, the first error I've run into is that jstest shows no messages when the controller is in D mode. Switch to X mode (cue crash/reboot) and jstest shows messages on controller input. Not sure if that is of any significance, but we'll see what comes next.

    1. If you are using the VESC Tool, you may have an incompatibility problem with the configuration from the MIT VESC parameters. The VESC Tool has many more parameters than the earlier BLDC tool, and while I do not know exactly how the two sets of parameters match up, I would guess there would be issues. Also, you should tune the parameters to the particular motor you are using.

  23. Hi, I'm running ROS Melodic. When I try to run installMITRACECAR.sh I get this error:

    zed-ros-wrapper/CMakeFiles/ZEDWrapper.dir/build.make:62: recipe for target 'zed-ros-wrapper/CMakeFiles/ZEDWrapper.dir/src/zed_wrapper_nodelet.cpp.o' failed
    make[2]: *** [zed-ros-wrapper/CMakeFiles/ZEDWrapper.dir/src/zed_wrapper_nodelet.cpp.o] Error 1
    CMakeFiles/Makefile2:4634: recipe for target 'zed-ros-wrapper/CMakeFiles/ZEDWrapper.dir/all' failed
    make[1]: *** [zed-ros-wrapper/CMakeFiles/ZEDWrapper.dir/all] Error 2
    Makefile:140: recipe for target 'all' failed
    make: *** [all] Error 2
    Invoking "make -j6 -l6" failed

    Any suggestions? Thanks.

  24. Hello there,
    I am trying to run the racecar packages under Ubuntu 18.04, L4T 32.3.1 (kernel 4.9.140-tegra). Unfortunately, the VESC is not found under ttyACM, which in my opinion points to the cdc-acm driver not being properly installed. Has anyone faced a similar issue? Thank you.


Disclaimer

Some links here are affiliate links. If you purchase through these links I will receive a small commission at no additional cost to you. As an Amazon Associate, I earn from qualifying purchases.
