Channel: JetsonHacks

Stereolabs ZED Camera – NVIDIA Jetson TX2


Stereolabs is introducing ZED SDK 2.0, which includes support for the Jetson TX1 and Jetson TX2. Looky here:

Background

The ZED Camera is a stereo depth sensor which contains two 4 megapixel imagers. For each frame, a composite of the left and right images (side by side) is sent over USB 3.0 to a host computer. The host computer then constructs a depth image using a GPU.
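The depth construction rests on the standard stereo relation between disparity and distance. A minimal sketch of the idea; the focal length here is an illustrative assumption, and the 0.12 m baseline is only approximately the ZED's (use your unit's calibration values in practice):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation: Z = f * B / d.
    A pixel that shifts more between the left and right images is closer."""
    if disparity_px <= 0:
        return float("inf")  # no match, or a point at infinity
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: 700 px focal length, 0.12 m baseline.
z = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=42.0)
print(f"{z:.2f} m")  # prints "2.00 m"
```

This is why the GPU matters: the host must find a disparity for every pixel of every frame, which is an embarrassingly parallel matching problem.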

The small package and relatively light weight make the ZED an interesting choice for robotics and other mobile applications.

There are a variety of other APIs in the ZED SDK which do further processing. These functions include such interesting tasks as 6-axis positional tracking (visual odometry, no markers needed) and large-scale 3D environment mapping.

Stereolabs has a Github repository with a lot of code for project support, which is worth checking out.

The Jetson acts as the host computer for the ZED in this article.

Installation

Stereolabs recently updated their website to facilitate better interaction. Currently, you can get the SDK here.

Installing the SDK is straightforward. Download it from the website, set the permissions, and then run the installer from a terminal. The installer gives the option of also installing sample code, useful for development purposes.

Installation Issues

While downloading and installing the SDK was simple, actually getting the ZED to work was quite challenging in the development environment here.

First, we started using the ZED camera here over a year ago. Before most of the tools could work, a firmware upgrade was needed for the camera.

The firmware must be flashed from a PC. The PC here is an older, slower machine and doesn’t appear to care much for all that is new and shiny. It runs Windows 8. The Stereolabs package was downloaded from their website and installed on the PC. The ZED was connected to the PC, and then the dreaded ‘MSVCP120.dll is missing‘ message popped up.

This is odd, as the installer made it a point to ask for confirmation when installing the related Microsoft Visual C++ Redistributable Package. Having had to support this very issue over the last decade or so, I knew how to fix it. As a developer, I understood. As a consumer, I was annoyed.

Note: In the reorganization of the website, Stereolabs now has a page for troubleshooting this very issue. Unfortunately it wasn’t around when I first encountered the issue.

The second installation issue is ZED camera calibration. The procedure itself is simple enough. Unfortunately, the calibration is affected by the lighting conditions under which the test is run, so much so that about the only time a good calibration could be made was when the room was dark. In my house, the only time it’s dark is at night, so calibrating at night was the only option.

Worse, there are times after running what appears to be a successful calibration that the tools cannot ‘find’ the configuration files. My guess is that it actually found the calibration file, but the file contained a bad range of values and is rejected. In either case, as a developer I understand. As a consumer and user, not so much.

And to be clear, I did a fresh install of the OS, downloaded the ZED SDK, and tried to calibrate the camera a dozen times. I tried restoring calibrations that I knew worked previously, and they were not recognized either.

Performance

Once the SDK is installed and the ZED is calibrated, there are several tools and examples to explore. The ZED Depth Viewer provides a depth map and point cloud viewer, and a status on the number of frames that are being processed per second (Hz).

The frame rates for the Jetson TX2 and Jetson TX1:

Jetson TX1 vs Jetson TX2 – ZED Depth Viewer
720p @ 60 fps input – higher is better

Machine                        Average Hz
Jetson TX1 (Maximum Clocks)    30
Jetson TX2 (Max-Q mode)        33
Jetson TX2 (Max-P mode)        43

As the table shows, in its low-power Max-Q mode the Jetson TX2 is about 10% faster than the Jetson TX1 running at maximum clocks. In maximum performance (Max-P) mode, the TX2 is over 40% faster (43 Hz vs. 30 Hz). Remember that the input stream itself is capped at 60 fps. Interesting.
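As a quick sanity check, the relative speedups can be computed straight from the averages in the table above:

```python
# Average Hz from the ZED Depth Viewer table
rates = {
    "Jetson TX1 (Maximum Clocks)": 30,
    "Jetson TX2 (Max-Q mode)": 33,
    "Jetson TX2 (Max-P mode)": 43,
}
baseline = rates["Jetson TX1 (Maximum Clocks)"]
for machine, hz in rates.items():
    speedup = (hz / baseline - 1) * 100
    print(f"{machine}: {hz} Hz ({speedup:+.0f}% vs. TX1)")
```

Max-Q comes out at +10% and Max-P at +43% over the TX1 baseline.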

Conclusion

So there you have it, our first look at the TX2 with ZED SDK 2.0. We will be using this camera on the current iteration of the Jetson RACECAR, so we should be able to see how it works in an actual application.

The post Stereolabs ZED Camera – NVIDIA Jetson TX2 appeared first on JetsonHacks.


Jetson RACECAR Build – Lower Platform


Time to put the lower platform together and do a test fit. Looky here:

Lower Platform Build

The lower platform consists of four major components:

Note: This is the second of three prototype builds for the Jetson RACECAR. A full bill of materials will be published for the third prototype.

The electronic components are attached to a laser cut, 1/4″ ABS platform using 1/4″ aluminum standoffs. The USB Hub and the VESC are attached using 3M Dual Lock Reclosable Fastener.

3M Dual Lock

Most people associate Velcro, the original hook and loop fastener, with these types of applications. While Velcro is a good alternative, the 3M Dual Lock solution provides some advantages for our application. First, Velcro has a tendency to allow “give”; the attached objects can still move around a little when fastened together. Second, sometimes it’s hard to tell when the fastener is fully engaged. Third, the adhesive on the back of the Velcro tends to lose grip when hot. The industrial versions of Velcro are better, of course.

Dual Lock works differently than Velcro. The fastener consists of continuous polyolefin stems having a mushroom shape. The mushroom heads allow the fasteners to easily slide over each other, allowing positioning of parts before they are “snapped” together, creating a firm fastening attachment. Where Velcro consists of two different tapes, a hook tape and a loop tape, Dual Lock consists of one. Dual Lock attaches to itself.

By varying the density of “mushrooms” on the tape, different levels of strength can be achieved. This allows users to mix and match tapes to achieve different strength levels of bonding. Also, Dual Lock offers both rubber and acrylic based adhesives on the back of the tape. The acrylic adhesive offers bonding up to 200 degrees Fahrenheit.

The Dual Lock should easily hold the USB hub; we’ll have to experiment a little with the VESC to make sure it holds under heavy use and high heat. However, if the VESC is running close to 200 degrees something else is probably wrong.

There appears to be about 3/16″ clearance between the top platform and the top of the Jetson TX1 heatsink/fan. It’s a pretty tight fit. Have a look:

Top platform heat sink clearance

Check out the video for the test fit; everything appears to line up.

Next Steps

There’s some wiring to do to hook everything up, and we’ll have to add batteries. On the MIT RACECAR, they use an extra router for better communications with a base station. We’ll have to think about that, and figure out whether we think autonomous means round tripping with other stationary computers. It might be good for testing though …

Are you building a Jetson RACECAR type of vehicle, or other interesting type of Jetson based project? Let us know!


Jetson RACECAR Build – Upper Platform


After putting together the lower platform of the Jetson RACECAR, the next step is to assemble the upper platform. Looky here:

Stiffening the Springs

With the addition of the extra weight from the platforms, computer and sensors, the RACECAR is getting a case of “droopy drawers”. Time to address that issue.

Before continuing the build, we add spacers to the front and rear spring assemblies. The TRAXXAS Rally comes with additional spacers. On the front springs, add two small spacers and a medium spacer. On the rear springs add one large spacer. This has the effect of raising the ride height and stiffening the suspension.

Depending on the total weight being added, it may be prudent to switch out to stiffer springs. Fortunately there are several different spring sets available for the car.

Lower Platform

In the video, the IMU is added to the lower platform using 1/4″ standoffs and 4-40 screws with nylon washers.

Upper Platform

The upper platform has three major components:

The MIT RACECAR uses the router to get a better signal and boost connectivity with the car; this may be optional depending on your circumstances. The Energizer battery is interesting: it is rated at 18,000 mAh with three output voltages. The 16-20V output drives the Jetson. The 12V output drives both the USB hub and the router. There is also a 5V USB output which is currently not used.

Using two batteries on the vehicle, one for the motor and the other for the sensors and electronics, keeps the sources “clean”. Motors tend to “dirty up” power supplies, and under load can cause voltage drops and spikes. Having two batteries eliminates that issue. There are circuits which can keep the power supply clean of course, but for now the two battery solution seems like a good choice.

There is a mounting hole for the ZED camera on the top platform. The ZED provides stereo vision capabilities to the vehicle.

With the judicious application of 3M Dual Lock fastener, both the battery and the router are mounted on the platform, as shown in the video. Both the battery and the router use the same mounting pattern, so either both can be on the vehicle, or the battery alone.

The ZED is mounted using a 1/4″-20 x 3/8″ screw.

The platform is mounted on 1-3/4″ long, 3/16″ hex, 4-40 F/F standoffs with 4-40 x 3/8″ screws.

Wiring

Preliminary wiring adds an extension cable for the steering servo to the VESC-X. The USB hub has a cable which may be used on the car, but a shorter 6″ USB 3.0 B Right Angle to A cable is wired up in the video. The cable connects the USB hub to the Jetson USB 3.0 type A jack.

The IMU is then wired to the USB hub using a short mini-USB to A cable.

The next step routes the USB 3.0 cable from the ZED camera through the platform to the USB hub. The ZED has a generous amount of cable, so it needs a little tidying up with zip ties.

As the last step, the power from the Energizer battery is wired. A cable supplied with the battery connects the 16-20V outlet to the Jetson power input. The 12V cable needs to be modified to connect to both the USB hub and the router. The cable then connects the 12V output from the battery to the USB hub and router.

Note: If the router is not being used, then the cable connects directly from the Energizer battery to the USB hub.

In general, cable management is an issue. We’ll have to remember to use caution and plenty of fasteners to keep wires away from moving parts!

Next Steps

At this point, the vehicle hardware is just about complete. The next step is to start installing the software.


Discount Code for GTC (Annual NVIDIA GPU Technology Conference)


Jim at GTC

If you haven’t been to the annual NVIDIA GPU Technology Conference (GTC) you’re missing out on something special. As usual, I’ll be there to see all the new bright and shiny. The show is in San Jose, California May 8-11, 2017.

As a JetsonHacks reader, you can get a 25% discount by following this link with discount code NVBWILLIA

There are a limited number of these spots available, get them while the getting is good!

Visit with Jim @ GTC


Jetson RACECAR Build – Software Install


The software install for the Jetson RACECAR is a multi-step process. Looky here:

Background

There are a couple of different ways to install the software for the Jetson RACECAR. The MIT RACECAR repository on Github has instructions on how to install their software directly from a conveniently provided disk image. For the purposes of the Jetson RACECAR, we will build the entire software stack, which includes flashing the Jetson RACECAR with JetPack and loading ROS. This process takes a couple of hours depending on your Internet connection speed.

The procedures to prepare the Jetson RACECAR for software development have all been covered on JetsonHacks before; the difference is that here we string them all together.

For the purposes of this development project, we added a 120GB SSD with a cable to the RACECAR. Looky here:

Installation

In this article, the Host PC refers to a PC that is running Ubuntu. The Jetson RACECAR is the Jetson onboard the RACECAR itself. In the current RACECAR, this is a Jetson TX1. The PC is used to run JetPack, an installer which flashes the Jetson RACECAR with Linux 4 Tegra (L4T) and various useful libraries. Article references are provided for more detailed instructions.

Note: We’re working on the Jetson RACECAR side of the install, there is also a PC Host side which will come later in the series.

Before we start installing the software, we partition and format the SSD. We start with:

$ sudo parted /dev/sda mklabel gpt

and then create a partition and format using the Disks app.

Once the SSD is prepared, we set the SSD as the root directory, as described here. We’re then ready to start building our software development environment on the RACECAR.

Here’s the software installation sequence:

On the Host PC:

  • Use JetPack from a Host PC to flash the Jetson with L4T. We’re using a Jetson TX1, so we select that option in the installer.
  • We use JetPack to install all of the specialized libraries listed, but do not choose to compile the samples.

On the Jetson RACECAR:

If you are familiar with Jetson software development, there are not any surprises here. If you’re new to all this, read through the linked articles and watch the associated videos to become familiar with the subject matter.

Conclusion

With the development environment prepared, we can now start building the actual Jetson RACECAR packages for ROS so that we can start to command the car to do our bidding!


Jetson RACECAR Build – Software Install 2


In part one of the RACECAR software install, we set the stage. In this article, we install the RACECAR packages. Looky here:

Background

One of the packages installed is for Ackermann steering. This is the type of steering geometry used in most automobiles. Angled steering arms allow the two front wheel angles to change at different rates. In the old horse buggy days, parallel steering arms were used: when the wheels turned, both turned at the same angle. As you know, the wheel closest to a turn follows a smaller radius circle than the outside wheel. In other words, the outside wheel travels further than the inner wheel, so it is intuitive that the angle of the outer wheel should be different from that of the inner one. Angled steering arms provide a mechanical solution to the problem. The R/C car used in the Jetson RACECAR has this type of steering mechanism connected to the steering servo.
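The geometry above can be sketched in a few lines. For an ideal Ackermann linkage, each front wheel points at the common turn center, so the inner wheel angle is always larger than the outer. The dimensions below are illustrative guesses for a 1/10 scale car, not measured RACECAR values:

```python
import math

def ackermann_angles(wheelbase, track, turn_radius):
    """Ideal Ackermann geometry: both front wheels aim at the same
    turn center behind the rear axle, so the inner wheel turns sharper."""
    inner = math.degrees(math.atan2(wheelbase, turn_radius - track / 2))
    outer = math.degrees(math.atan2(wheelbase, turn_radius + track / 2))
    return inner, outer

# Illustrative dimensions in meters: 0.33 m wheelbase, 0.25 m track.
inner, outer = ackermann_angles(wheelbase=0.33, track=0.25, turn_radius=1.0)
```

For a tight 1 m turn the angles differ by several degrees; as the turn radius grows, the two angles converge, which is why parallel arms were tolerable on slow horse buggies.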

Installation

This is a straightforward installation which installs the ROS racecar package, along with packages for nodes controlling the VESC (Vedder Electronic Speed Controller) and an Ackermann steering node. There is also a Stereolabs ZED camera package installed, and support for some other sensors.

All of these packages come straight from the MIT RACECAR repository.

$ git clone https://github.com/jetsonhacks/installRACECAR.git
$ cd installRACECAR
$ ./installRACECAR.sh

Conclusion

Most of the needed software is now installed on the Jetson RACECAR. The one remaining issue before we can get the vehicle rolling is the VESC, which needs firmware installation and configuration. This is done using a software tool called ‘bldc-tool’, and is the subject of the next article. Stay tuned!

Setting up the VESC


Get Your Motor Running – VESC – Jetson RACECAR Build


After finishing the installation of the ROS packages on the Jetson RACECAR, we need to program the VESC, an electronic speed controller, for use in the Jetson RACECAR. Looky here:

Background

As discussed in Part 3 – ESC Motor Controller, the TRAXXAS steering servo and the drive motor Electronic Speed Controller (ESC) are controlled by PWM signals sent from an on board radio receiver.

For the Jetson RACECAR, we replace the stock TRAXXAS ESC with a Vedder Electronic Speed Controller (VESC). The major reason for the change is to gain full control at low speeds. The stock ESC puts the minimum vehicle speed at around 6 mph. Another reason is that the VESC is open source, which allows the curious to explore the motor controller implementation.

Architecturally, the VESC has an STM32 ARM Cortex processor. The STM32 runs ChibiOS, a real-time operating system. The default firmware flashed on the VESC-X is ‘Servo-in’, which allows a remote controller to set the motor speed. For the Jetson RACECAR application, the VESC servo port needs to be programmed as ‘Servo-out’, which allows commands to be sent to the robot steering servo.

Fortunately there is a compiled binary of the version of the VESC firmware that includes the Servo-out setting. We can flash the STM32 directly from the Jetson using a program called ‘bldc-tool’. BLDC is an acronym for BrushLess DC motor.

Note: There are also compiled versions of the bldc-tool for x86 machines, if you prefer that route.

Once the bldc-tool loads the Servo-out firmware onto the VESC, we then load a configuration file which configures the VESC to control a TRAXXAS Velineon 3500 motor.

Note: The actual VESC firmware is available in binary form in the bldc-tool firmwares directory. If you are interested in building the VESC firmware from source, you can compile it from the bldc firmware source tree on Github.

Installation

The VESC is wired to the Jetson using a micro USB to USB-A cable. The USB cable normally communicates motor speed and steering angles between the Jetson and the VESC. The TRAXXAS steering servo is wired to the VESC servo header. In this application, a 90 degree, 3 pin female header helps with the wiring.

VESC Wiring
USB input from Jetson on left, steering servo center. Front of robot is to the left.

The JetsonHacks account on Github contains a repository named installBLDC. To build the bldc-tool:

$ git clone https://github.com/jetsonhacks/installBLDC
$ cd installBLDC
$ ./installBLDC.sh

This will build the bldc-tool and download the RACECAR motor configuration files. The RACECAR motor configuration files are downloaded from the Github mit-racecar repository, and are stored in ~/hardware/vesc.

Before starting the bldc-tool, connect the VESC to the vehicle battery. Then:

$ cd ~/bldc-tool
$ ./BLDC_Tool

This will bring up the GUI to interact with the VESC. Before flashing the firmware, hit the ‘Connect’ button to communicate with the VESC. The VESC should be at ttyACM0 (more formally, /dev/ttyACM0). The current firmware revision will display in the lower right hand corner on connection.

Use the ‘Firmware’ tab to flash the firmware binary; the configuration can be loaded with the ‘Load XML’ button. Important Note: You must select the correct version of firmware to match the VESC that you are using, otherwise damage and other bad things can happen. In the video, we flashed the firmware ‘VESC_servout.bin’ for version 4.12 of the hardware.

The configuration file is located in ~/hardware/vesc. The configuration file shown in the video is ‘6.141_bldc_VESC_X_hw_30k_erpm.xml’, which is for the VESC-X. If you are using a regular VESC, you will probably want to use the configuration file ‘6.141_bldc_old_hw_30k_erpm.xml’.

Note: You should use the updated bldc configurations; they lower the minimum and maximum ERPM values to avoid damaging the VESC when connected to the TRAXXAS motor.
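The ‘30k erpm’ in the configuration file names bounds the motor’s electrical RPM. A back-of-the-envelope sketch of what such a limit means for top speed; every parameter here (pole pairs, overall gear ratio, tire diameter) is an assumption for illustration, not a measured RACECAR value:

```python
import math

def max_speed_mps(erpm_limit, pole_pairs, gear_ratio, wheel_diameter_m):
    """Translate a VESC ERPM limit into a rough top speed.
    ERPM is mechanical motor RPM multiplied by the motor's pole pairs."""
    motor_rpm = erpm_limit / pole_pairs
    wheel_rpm = motor_rpm / gear_ratio
    return wheel_rpm / 60.0 * math.pi * wheel_diameter_m

# Assumed values: 2 pole pairs, ~7.8:1 overall reduction, ~110 mm tires.
v = max_speed_mps(30000, pole_pairs=2, gear_ratio=7.8, wheel_diameter_m=0.11)
```

With those assumptions the 30k ERPM cap works out to roughly 11 m/s, which is plenty for an indoor research platform.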

Tele-operation

Once the firmware and configuration files are in place, the Jetson RACECAR is ready for teleoperation. You can test a game controller to make sure it is working with the Jetson:

$ sudo jstest /dev/input/js0

The Logitech Gamepad F710 has two modes, controlled by the button labeled ‘mode’. The correct mode for the Jetson RACECAR is the one in which the left joystick controls axes 0 and 1, and the right joystick controls axes 2 and 3.

Once the game pad works, you can teleoperate the robot. Make sure that the wheels are clear of any obstructions.

$ cd racecar-ws
$ source devel/setup.bash
$ roslaunch racecar teleop.launch

The robot has a deadman switch, the upper button on the left horn of the game pad. While holding the deadman button, you can control the throttle with the left joystick and the steering with the right joystick.

Conclusion

At this point, we have a working robot platform. We’ll have a couple more articles on the Jetson RACECAR. As we go forward, we will build the final prototype hardware platform, which we are calling RACECAR/J.


What is the difference between RACECAR projects?


Reader Fikri asked:

Now, I’ve been a while trying to understand RACECAR/J and other similar project out there. Please correct me if my understanding is way off. So, there are 2 different approaches to build this project, first nvidia’s way (end to end learning) like what Tobias did and second one is MIT’s or UPENN’s way (RACECAR and F1/10). Is that correct Jim?

While it’s difficult to speak to the other projects, we can talk about RACECAR/J. As you know, RACECAR/J is a robotic platform built using a 1/10 scale RC car chassis. The computation is done by an NVIDIA Jetson Development Kit. There are sensors attached to the Jetson, as well as a controller that allows the Jetson to adjust the speed of the chassis motor and the steering angle of the wheels.

RACECAR/J

Alan Kay once said, “People who are really serious about software should make their own hardware.” And so it is with robots. RACECAR/J is a very simple robot, but it provides a platform with which to explore self-driving vehicles. There are many challenges to getting to the point where a vehicle can drive itself; some can be thought of as very granular, while others are combinations of taking different inputs and calculating a course of action.

Let’s examine the intent of RACECAR/J. For people just starting out in ‘robocars’, there are several areas to explore. You’ve probably seen inexpensive line following robots. Once you’re more serious, there are other routes to explore. For example, DIY Robocars is an excellent resource and community for people who want to learn about building their own autonomous vehicles. Many people build their own ‘Donkey Car’ or equivalent with a Raspberry Pi for computing and a camera for a sensor. You can build one of these cars for a couple of hundred dollars. To be clear, this is just one route you can take in that community.

Once the car is constructed, there are official race tracks where people gather monthly to race against each other. People are exploring different software stacks; some use machine learning, others use computer vision. The introduction of inexpensive LIDARs adds another sensor type to those stacks. Typically the cars communicate information to a base station, and the base station does additional computation in a feedback loop with the car.

The first question that you have to ask yourself is, “What is the interesting part of this problem?” Typically building one of these robots is fun, but it’s not the interesting part. The interesting part is creating the algorithms that the vehicles use to navigate. This is a new field; there isn’t a “correct” way to do this just yet.

The second question is “How does this prepare me for actually building a self driving car?”. You have probably seen the many articles about different self driving cars and read about all of the different sensors they use. Cameras, radar, lidar, ultrasonic, stereo depth cameras, GPS, IMUs and so on. You also know that the cars have a large amount of specialized computing power on board such as NVIDIA Drive PX (typically 4 Tegra chips on one board, a Jetson has one), FPGAs, ASICs, and other computing components. The full size, autonomous vehicles do all of their computing on board. As you might guess, at Mercedes Benz or any of the other auto manufacturers, they don’t hook up a Raspberry Pi to a camera and call it done.

Which brings us around to building a more complex robot platform. To understand how all this works, you should be familiar with what the issues are. A major area is sensor fusion, taking all of the sensor information and combining it to come up with the next action to take. This includes figuring out when one of the sensors is lying, for example what happens when one of the cameras gets blinded by dirt on the lens?

Motion planning and control is another area of intense research. The vehicle is here, what’s the most efficient way to get to the next point? This reaches all the way down the hardware stack to PID control of the steering and throttle, and all the way up to integrating an autopilot with a GPS route planning system.
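To give a flavor of the low-level end of that stack, here is a minimal PID loop driving a toy first-order plant toward a heading setpoint. The gains and the plant model are illustrative only, not tuned for any real vehicle:

```python
class PID:
    """Minimal PID controller of the kind used for low-level
    steering and throttle loops (illustrative, untuned gains)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy plant (heading rate equals the control output)
# toward a heading of 10 degrees.
pid = PID(kp=0.8, ki=0.1, kd=0.05)
heading = 0.0
for _ in range(1000):
    heading += pid.step(10.0, heading, dt=0.05) * 0.05
```

A real vehicle adds actuator limits, sensor noise, and latency, which is exactly why this area remains an active research problem.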

There are a variety of ways to think about this. One is a very deterministic approach, similar to that taken in computing over the last 40 years. Using computer vision, calculate the next desired destination point, go there, rinse, repeat until you arrive at the final destination.

Another approach is that of machine learning. Drive the car around, record everything the sensors ‘see’ and the controls to the steering, throttle and brakes. Play this back to a machine learning training session, and save the result as an inference model. Place the model on the car and watch the car drive itself.

There are many variations thereof. The thing to note, however, is that you’ll eventually need some type of hardware to test the software on. Simulators are great, but they ain’t real. Full size cars require a very healthy wallet. That’s the purpose of RACECAR/J. The idea is that widespread adoption of the hardware platform will mean that people will be able to share what they learn about these problems at a price point decidedly less than a full size vehicle. The hardware platform isn’t the interesting part of this; it’s just a few hours of wrenching to put together something to hold the software.

As it turns out, you can implement end-to-end deep learning (which is a relatively new idea), a vision based system, a LIDAR based system, or hybrids. At MIT, their first class was mostly built around the IMU and LIDAR on the RACECAR. They are now teaching machine learning for autonomous vehicles, software the RACECAR is certainly capable of running. They are also working on a deeper focus on computer vision solutions.

Conceptually, the underlying hardware on the cars mentioned in the original question is the same; some of the implementation details are a little different. The actual software stacks on the cars are different: the Tobias version uses end-to-end machine learning versus the more traditional LIDAR based solution. In the press, you’ll hear about the (Google) Waymo approach (which is a LIDAR variation) versus the Tesla vision approach. The fact is, it’s still early. As sensors become more mainstream, with higher resolution and lower cost, you can expect one or two approaches that fit the problem well to be adopted. The real question is how you access it if you don’t work at one of the big automotive companies.



Scanse Sweep LIDAR Software Install


The newly introduced Scanse Sweep Scanner is an inexpensive 2D LIDAR that is easy to interface with a NVIDIA Jetson Development Kit. Looky here:

Background

LIDAR is much discussed in robot navigation, especially in the autonomous driving arena. It comes in three flavors. The first is usually called 1D: a single laser and receiver measuring the distance to a single point in front of the sensor.

The second is called 2D, and conceptually consists of a single laser/receiver sensor spinning in a circle. Mechanically this may mean that the sensor is mounted on an encoded motor, or a spinning optical device such as a mirror spins above a stationary sensor.

The third is 3D, which provides a surrounding view from all angles. This is usually what people think about when talking about LIDAR on autonomous vehicles. Such devices usually are implemented with a sensor placed underneath a spinning mirror. The mirror not only spins, but also moves in the Y-plane simultaneously. ‘Inexpensive’ professional versions start around $8000 USD and quickly escalate to over $100,000.

There have been many companies working towards bringing higher resolution LIDARs of different types, including fully solid state 3D versions, to market at a consumer price point over the last few years. As of this date, none are shipping.

Scanse Sweep

The Scanse Sweep is a 2D LIDAR. A laser sensor is mounted on an encoded motor, which spins around providing a 2D view. The data is provided over a serial port. You can read the specs on the Scanse page.

The Sweep Scanner is currently around $350 USD, which makes it affordable in the world of LIDARs. At the same time, you will need to manage your expectations: the device’s maximum rotation speed of 10 Hz and sample rate of 1000 samples per second are well below those of even the next competitor up the LIDAR rung, the RP-LIDAR A2. For a comparison of the two, see: Comparing low-cost 2D scanning Lidars on DIY Robocars.
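Those two specs together determine the angular resolution of each scan; spinning slower buys a finer scan at the cost of update rate (the 2 Hz figure below is just an illustrative lower spin rate):

```python
def angular_resolution_deg(sample_rate_hz, rotation_hz):
    """Points per rotation set how finely a spinning 2D LIDAR
    resolves the scene at a given spin rate."""
    samples_per_rotation = sample_rate_hz / rotation_hz
    return 360.0 / samples_per_rotation

# At maximum settings the Sweep yields 100 points per turn:
res_fast = angular_resolution_deg(1000, 10)  # 3.6 degrees per point
res_slow = angular_resolution_deg(1000, 2)   # 0.72 degrees per point
```

At 3.6 degrees per point, a 10 cm wide obstacle is only guaranteed a single return beyond about 1.6 m, which is the kind of trade-off to weigh for a given project.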

At the same time, realize that just a couple of years ago there were no real entries at this price point. Many DIYers were taking apart Neato robot vacuum cleaners to cannibalize a less capable LIDAR unit, but that was about it.

The actual Sweep Scanner is well thought out. Mounting the device is simple using M2.5 screws, and there are two different ports at different orientations which may be used to interface the serial port to USB using an FTDI serial-to-USB converter. On RACECAR/J, the Scanse plugs into a USB hub which is in turn connected to the onboard Jetson.

Installation

The instructions for installing the Sweep SDK are straightforward and well documented on the Sweep Github account. Same with the Sweep ROS wrapper.

As usual, JetsonHacks provides a repository on Github called installSweep with convenience scripts for installing both the SDK and the ROS wrapper. The SDK must be installed before the ROS wrapper.

$ git clone https://github.com/jetsonhacks/installSweep
$ cd installSweep
$ ./installSweepSDK.sh

The SDK will be installed in directory ~/sweep-sdk.

If you want the ROS wrapper installed, you need to have an initialized Catkin Workspace. Then:

$ cd ~/installSweep
$ ./installSweepROS.sh [catkin workspace name]

This will clone the repository, run rosdep to install missing dependencies, and then catkin_make the package. The name of the package is ‘sweep_ros’.

Notes:

  • The Scanse Sweep Scanner is connected to an AmazonBasics 7-Port, USB 3.0 Hub. The USB Hub is connected to a Jetson Development Kit.
  • USB Autosuspend is turned off. USB Autosuspend usually powers down USB ports to conserve power.

Conclusion

For a freshman effort, the Sweep Scanner is surprisingly well done. The device itself is well thought out, and the software support, with basic libraries easily available via Github, allows for easy integration into applications. The ROS wrapper is an added bonus. Whether the Sweep meets the performance goals of a given project needs to be thought about, but it is nice to have a low-cost entry point into the LIDAR world.

The post Scanse Sweep LIDAR Software Install appeared first on JetsonHacks.

RACECAR/J Build – Chassis


This article marks the start of the prototype RACECAR/J build. RACECAR/J is a 1/10 scale autonomous vehicle. First up, preparing the chassis. Looky here:

Background

This is the third hardware prototype of the RACECAR on JetsonHacks. The prototype base is the MIT RACECAR, an “open-source powerful platform for robotics research and education”.

The platform houses state-of-the-art sensors and computing hardware, placed on top of a powerful 1/10-scale mini race car.

Over time, a couple of different parts have become obsolete since we built our first prototype. This includes the Sparkfun IMU and the TRAXXAS car that we originally built upon. This is not unusual, but it still requires some changes to the deck/platform for attachment.

We can break the hardware down into different sections:

  • Chassis – This version is based on a TRAXXAS Platinum Slash Truck.
  • Computing – NVIDIA Jetson Development Kit.
  • Sensors – The robot can use different types of sensors including LIDAR, stereo camera, RGBD cameras, and IMUs.
  • Electrical – Batteries, Wiring and Interfaces
  • Mechanical – The “nuts and bolts” that form the backbone of the mechanical structure of the car. This includes the decks.

TRAXXAS Platinum Slash Truck

In this prototype, we build on the TRAXXAS Platinum Slash Truck. In earlier prototypes we used the TRAXXAS Rally which has since been discontinued. However, both the Rally and Slash are similar vehicles built on the same chassis platform. In fact, the Slash meets the demands of our application even better because many of the various suspension bits have been upgraded from plastic to aluminum. This includes the C-hubs, steering blocks, rear hub carriers and axle nuts.

In addition, the Slash does not include a transmitter or receiver. Since these are not used in our project, this provides a little bit of savings.

Chassis Preparation

As shown in the accompanying video, there are several steps in preparing the TRAXXAS Slash. Most of the preparation involves removing parts of the RC car which we do not use. Here are the major steps:

  • Remove the 4 body clips which hold the clear plastic body on the car
  • Remove the plastic body
  • Remove the body mounting brackets. There is one in the front, and one in the rear. Each mounting bracket is held in place by two screws.
  • Remove the stock Electronic Speed Controller (ESC), which is held in place by two screws.
  • Remove the receiver case. 4 screws hold the cover down, 2 more screws accessible from inside the box hold it to the chassis.
  • Remove the stock front bumper.
  • Upgrade the front and rear springs.
  • Install a new front bumper, the Scalpel Bumper from JConcepts.
  • Remove the antenna holder

The video gives detailed instructions on the modifications.

Notes

In a previous article, What is the difference between RACECAR project?, we discussed the reasoning behind building our own robot hardware. It’s simple: we want a general purpose self-driving platform to better understand the different bits and pieces of autonomous vehicles and the associated software.

There are many ways to modify this build to suit any given application. In this prototype, we replace the stock ESC with a VESC. The VESC is an open source brushless DC motor controller. This provides better control at slow speeds than the stock ESC, as well as the ability to monitor engine speed. The engine speed can be used to calculate crude odometry, since there are no encoders built into the car drivetrain.
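The ERPM-to-distance conversion can be sketched in a few lines. Note that the pole count, gear ratio, and wheel diameter below are illustrative placeholders, not measured values for the RACECAR/J drivetrain:

```python
import math

# Placeholder drivetrain constants -- substitute your own measurements.
MOTOR_POLES = 4        # assumed brushless motor pole count
GEAR_RATIO = 8.0       # assumed motor-to-wheel reduction
WHEEL_DIAMETER = 0.1   # meters, assumed

def distance_traveled(erpm, dt):
    """Integrate electrical RPM over a timestep dt (seconds) into meters."""
    mech_rpm = erpm / (MOTOR_POLES / 2.0)      # electrical -> mechanical RPM
    wheel_rps = mech_rpm / 60.0 / GEAR_RATIO   # wheel revolutions per second
    return wheel_rps * math.pi * WHEEL_DIAMETER * dt

# e.g. 10,000 ERPM held for one second
d = distance_traveled(10000, 1.0)
```

Summing these increments over time gives the crude odometry mentioned above; without wheel encoders, slippage and ESC reporting latency limit its accuracy.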

The stock plastic body weighs about 6 ounces. The build will be adding 3-5 pounds of batteries, computers and sensors. Therefore a spring upgrade is necessary. The springs shown in the video are the first attempt, but still need to be fine tuned for this application.

Many people have asked for a full bill of materials (BOM) for the build. Here’s the deal: once we’re happy that everything works and fits the bill, we’ll publish the BOM. In addition, we’re setting up a storefront where you can buy the hard-to-find, custom, and long lead time items. Some of the parts can take up to 10 weeks to get, so we’ll keep some in inventory at the store. We’re still a few weeks from opening the store, but things look quite promising.

The post RACECAR/J Build – Chassis appeared first on JetsonHacks.

JetPack 3.1 Release


Today NVIDIA released JetPack 3.1 which introduces L4T 28.1 with production support for both the NVIDIA Jetson TX1 and Jetson TX2 Development Kits. Also new is TensorRT 2.1, cuDNN 6.0 and expanded multimedia API functionality and samples. Components for Jetson TK1 remain unchanged.

JetPack 3.1 is available on the NVIDIA Embedded Developer Website.

JetPack 3.1

From the JetPack 3.1 Release Notes

Release Highlights

  • New L4T Production Release 28.1
    • This 64-bit BSP (Board Support Package) has been designed to work on both Jetson TX2 and Jetson TX1
  • TensorRT 2.1
    • New Custom Layer API enables integration of novel, user-defined layers
    • Doubled Deep Learning inference performance for batch size of one
  • cuDNN v6.0
    • New fused convolution provides better performance due to faster compute in the fused kernels
    • New dilated convolution reduces the number of parameters, which results in speed up of computation for certain applications like object detection and image segmentation that require convolution followed by upscaling
  • Multimedia API v28.1
    • New functionality
      • TNRv2 (Temporal Noise Reduction)
        • High quality spatio-temporal noise reduction using GPU. Recommended for applications where low light video quality is important and GPU requirement is acceptable. Typical GPU utilization is <8.5% for 1080p30fps operation on Jetson TX1.
      • Piecewise linear WDR Support
        • ISP now supports cameras with “built-in WDR” that combine multiple exposures on-sensor and transmit the result with a piecewise linear encoding. Functionality verified using Sony IMX-185 (and reference driver is included). This feature does not include support for other WDR technologies such as DOL or spatial interleaving.
    • New samples
      • How to share a CUDA buffer with a V4L2 camera and then process color conversion (YUYV to RGB) with a CUDA algorithm
      • How to render video stream (YUV) or UI (RGB) with Tegra DRM (Direct Rendering Manager), i.e. rendering support for non-X11 and lightweight display system. Tegra DRM is implemented in user-space and is compatible with standard DRM 2.0
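To make the first sample’s color conversion concrete, here is a CPU reference of YUYV-to-RGB in NumPy. This is not the sample’s CUDA code, just the same per-pixel arithmetic (BT.601 coefficients assumed) done serially:

```python
import numpy as np

def yuyv_to_rgb(frame, width, height):
    """Convert a packed YUYV (YUV 4:2:2) byte buffer to an RGB image (BT.601)."""
    # Each 4-byte group encodes two pixels: Y0 U Y1 V (U/V shared).
    data = np.frombuffer(frame, dtype=np.uint8).reshape(height, width // 2, 4)
    y = data[:, :, [0, 2]].reshape(height, width).astype(np.float32)
    u = np.repeat(data[:, :, 1], 2, axis=1).astype(np.float32)
    v = np.repeat(data[:, :, 3], 2, axis=1).astype(np.float32)
    c = 1.164 * (y - 16.0)
    r = c + 1.596 * (v - 128.0)
    g = c - 0.392 * (u - 128.0) - 0.813 * (v - 128.0)
    b = c + 2.017 * (u - 128.0)
    return np.clip(np.dstack([r, g, b]), 0, 255).astype(np.uint8)

# A 2x2 mid-gray test frame: Y=128, U=V=128 for every pixel.
gray = bytes([128, 128, 128, 128] * 2)
rgb = yuyv_to_rgb(gray, 2, 2)
```

A CUDA kernel performs the same computation, one thread per output pixel, which is why offloading the conversion from the CPU pays off at camera frame rates.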

JetPack 3.1 is available on the NVIDIA Embedded Developer Website. Go grab some new JetPack goodness!

The post JetPack 3.1 Release appeared first on JetsonHacks.

Build Kernel and ttyACM Module – NVIDIA Jetson TX2


In this article, we cover building the kernel and modules for the NVIDIA Jetson TX2. We also build the ACM module, which allows the Jetson to communicate with devices that report through ttyACM. Looky here:

Background

Note: This article is for intermediate users. You should be familiar with the purpose of the kernel. You should be able to read shell scripts to understand the steps described.

With the advent of the production version of L4T 28.1 for the NVIDIA Jetson TX2, NVIDIA recommends using a host PC when building a system from source. See the Linux for Tegra R28.1 web page where you can get the required GCC 4.8.5 Tool Chain for 64-bit BSP.

If you are building systems which require a large amount of kernel development, that is a good option. For a person like me, it’s a little overkill. Most of the time I just want to compile an extra driver or three as modules to support some extra hardware with the TX2.

For example, one of the modules that I need is used to support USB ACM devices. Some USB devices report as USB, others report as ACM. Here’s an article explaining the good bits about what that means. Many devices, such as a Hokuyo LIDAR and some Arduinos, report as ACM devices.

Presented here are some scripts which download the kernel source onto the Jetson TX2 itself, modify the Makefiles so that the source compiles onboard the Jetson, and then copy the kernel Image into the boot directory. The video above shows how to select the ACM module and add it to the Image. The options are:

USB Modem (CDC ACM) support
CONFIG_USB_ACM

Installation

The script files to build the kernel on the Jetson TX2 are available on the JetsonHacks Github account in the buildJetsonTX2Kernel repository.

$ git clone https://github.com/jetsonhacks/buildJetsonTX2Kernel.git
$ cd buildJetsonTX2Kernel

There are three main scripts. The first script, getKernelSources.sh gets the kernel sources from the NVIDIA developer website, then unpacks the sources into /usr/src/kernel.

$ ./getKernelSources.sh

After the sources are installed, the script opens an editor on the kernel configuration file. In the video, the local version of the kernel is set. The stock kernel uses -tegra as its local version identifier. Make sure to save the configuration file when done editing. Note that if you want to just compile a module or two for use with a stock kernel, you should set the local version identifier to match.
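The suffix matters because a module’s version magic must match the running kernel exactly. A small sketch to read the running kernel release and pull out the local version suffix you would set in the configuration:

```python
import platform

# e.g. '4.4.38-tegra' on a stock L4T 28.1 kernel; the '-tegra' part is
# what CONFIG_LOCALVERSION must reproduce for modules to load cleanly.
release = platform.uname().release
suffix = release[release.index('-'):] if '-' in release else '(no local version)'
print('running kernel:', release)
print('CONFIG_LOCALVERSION should be:', suffix)
```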

The second script, makeKernel.sh, fixes up the makefiles so that the source can be compiled on the Jetson, and then builds the kernel and modules specified.

$ ./makeKernel.sh

The modules are then installed in /lib/modules/

The third script, copyImage.sh, copies over the newly built Image and zImage files into the /boot directory.

$ ./copyImage.sh

Once the images have been copied over to the /boot directory, the machine must be restarted for the new kernel to take effect.
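After the restart, it is easy to sanity-check that the ACM support made it into the new kernel. A minimal sketch (the paths are the standard module locations; device nodes only appear when an ACM device such as an Arduino is actually attached):

```python
import glob
from pathlib import Path

# Look for the cdc-acm module under the installed module trees. If the
# driver was built directly into the Image (CONFIG_USB_ACM=y) there is
# no .ko file, so also check for attached device nodes.
have_module = any(Path('/lib/modules').glob('*/kernel/drivers/usb/class/cdc-acm.ko'))
devices = glob.glob('/dev/ttyACM*')
print('cdc-acm module installed:', have_module)
print('attached ACM devices:', devices or 'none')
```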

Note: The copyImage.sh script copies the Image file to the /boot directory of the current device. If you are using an external device such as a SSD as your root directory and still using the eMMC to boot from, you will need to copy the Image file to the /boot directory of the eMMC.

Spaces!

The kernel and module sources, along with the compressed versions of the source, are located in /usr/src

After building the kernel, you may want to save the sources off-board to save some space (they take up about 3GB). You can also save the boot images and modules for later use, and to flash other Jetsons from the PC host.

Conclusion

For a lot of use cases, it makes sense to be able to compile the kernel and add modules from the device itself.

Note

The video above was made directly after flashing the Jetson TX2 with L4T 28.1 using JetPack 3.1.

The post Build Kernel and ttyACM Module – NVIDIA Jetson TX2 appeared first on JetsonHacks.

NVIDIA Jetson TX1 Now on L4T 28.1 (JetPack 3.1)


One of the big changes in the new JetPack 3.1 release is that the NVIDIA Jetson TX1 now runs L4T 28.1, an Ubuntu 16.04 variant. The reason this is important to know is that 28.1 runs kernel version 4.4.38. In turn, that means there are a lot of changes to the kernel itself, which may require a little bit of rework on existing codebases.

The JetsonHacks repositories on Github are no exception. There are many repositories which work with/around the older L4T kernels. A prime example is the librealsense installation library.

Because librealsense adds several different video modes to the uvcvideo driver, we made up a patch which adds those modes to the uvcvideo module source. This requires that the kernel and uvcvideo module be rebuilt and installed. Fortunately, those changes have migrated upstream and are now part of the 4.4 kernel which means that nothing needs to be changed in the kernel in order to recognize a RealSense camera on a Jetson TX1 or Jetson TX2.

Of course, there’s still an install script for librealsense on the Jetson, it’s just been updated for L4T 28.1. The older version is still available after being tagged and stored in the repository.

There are many scripts that may become a little flaky because of the kernel change, and you shouldn’t expect the older articles on this site to reflect the newer kernel changes. Be on the lookout!

The post NVIDIA Jetson TX1 Now on L4T 28.1 (JetPack 3.1) appeared first on JetsonHacks.

Develop on SSD – NVIDIA Jetson TX1 and Jetson TX2


Using a SSD as the root directory for development on a Jetson Development Kit provides many advantages, including faster disk times and much more storage. It’s easy to do! Looky here:

Background

With the advent of L4T 28.1, the Jetson TX1 and the Jetson TX2 both run the same 64-bit 4.4.38 version kernel. Hopefully there will be few differences (if any) between the base development environments. One thing that does help development is more disk space, and faster disks. For those of us who develop onboard the Jetson, adding a SSD makes the development environment much more enjoyable.

Installation

The process is the same for both the Jetson TX1 and the Jetson TX2. Refer to the video for specifics.

The process is simple. Install a SSD on a Jetson (make sure the Jetson is powered down). Note: the physical installation is not shown in the video; the first part of this video is an example. Flash the Jetson with JetPack. Format the new SSD and set up a partition (it should show up as /dev/sda1). Then simply copy the contents of the eMMC over to the disk, and modify the /boot/extlinux/extlinux.conf file so that the root directory points at /dev/sda1. After rebooting the machine, the SSD is the root directory. There should be plenty of room for developing applications.

In case you need some extra modules or make changes to the kernel, the video above also shows how to build the kernel. Make sure to select the correct repository, i.e. buildJetsonTX1Kernel for the Jetson TX1 and buildJetsonTX2Kernel for the Jetson TX2. There are scripts to download the kernel sources, help configure the kernel, and build and copy the Image file. Note that the Image is copied to /boot/Image on the current device, which is the SSD. The Jetson does not boot from this location, so we need to copy it to the place where the Jetson looks to boot.

We’ve never really discussed this, so here goes. When U-Boot boots the Jetson, in stock configuration it boots from the internal eMMC using the file indicated in the /boot/extlinux/extlinux.conf file located on the eMMC. Typically this is /boot/Image. You can also set the root directory in this file, that’s how we are using the SSD as the root directory. Note that /dev/sda1 is not mounted this early in the boot cycle, so the Image file on the SSD isn’t much help here.

The issue here is that the real boot directory is located on the eMMC, so the simplest way is to copy the Image that we just created on the SSD to the boot directory of eMMC as shown in the video. We’re clever, so of course we give it a different name than ‘Image’ (like ImageSSD) so that we can use both the stock eMMC Image and the new SSD Image. By having multiple entries in the extlinux.conf file, we can then select between booting different configurations using the serial console. This makes it easier to debug changes later on.
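For illustration, here is a sketch of what the two entries in /boot/extlinux/extlinux.conf might look like. The label names and the exact APPEND contents vary by L4T release; treat the fields below as placeholders in the style of a stock configuration, not a verbatim file:

```
TIMEOUT 30
DEFAULT primary

MENU TITLE Boot Options

LABEL primary
      MENU LABEL stock kernel, eMMC root
      LINUX /boot/Image
      APPEND ${cbootargs} root=/dev/mmcblk0p1 rw rootwait

LABEL ssdroot
      MENU LABEL SSD kernel, SSD root
      LINUX /boot/ImageSSD
      APPEND ${cbootargs} root=/dev/sda1 rw rootwait
```

At the serial console boot prompt, entering the label (or its menu number) selects which kernel and root directory to use.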

This is the most flexible way of running systems side by side. It is possible to boot directly from SSD and so on, but the flexibility of having the stock kernel available on the eMMC for development is invaluable.

Conclusion

We’ve relied on the video more than most articles here. Mostly this is because we use a GUI for configuration. It should be a straightforward process to gain more speed and space in your development environment.

Note: Installation in the video was shown directly after flashing using JetPack 3.1.

The post Develop on SSD – NVIDIA Jetson TX1 and Jetson TX2 appeared first on JetsonHacks.

Build Kernel and ttyACM Module – NVIDIA Jetson TX1


In this article, we cover building the kernel and modules for the NVIDIA Jetson TX1 Development Kit onboard the Jetson itself. We also build the ACM module, which allows the Jetson to communicate with devices that report through ttyACM. Looky here:

Background

Note: This article is for intermediate users. You should be familiar with the purpose of the kernel. You should be able to read shell scripts to understand the steps described.

With the advent of the production version of L4T 28.1 for the NVIDIA Jetson TX1, NVIDIA recommends using a host PC when building a system from source. See the Linux for Tegra R28.1 web page where you can get the required GCC 4.8.5 Tool Chain for 64-bit BSP.

If you are building systems which require a large amount of kernel development, that is a good option. For a person like me, it’s a little overkill. Most of the time I just want to compile an extra driver or three as modules to support some extra hardware with the TX1.

For example, one of the modules that I need is used to support USB ACM devices. Some USB devices report as USB, others report as ACM. Here’s an article explaining the good bits about what that means. Many devices, such as a Hokuyo LIDAR and some Arduinos, report as ACM devices.

Presented here are some scripts which download the kernel source onto the Jetson TX1 itself, modify the Makefiles so that the source compiles onboard the Jetson, and then copy the kernel Image into the boot directory. The video above shows how to select the ACM module and add it to the Image. The options are:

USB Modem (CDC ACM) support
CONFIG_USB_ACM

Installation

Note: A major issue with building the kernel and modules onboard the Jetson TX1 is the amount of space on the eMMC. If you are building the kernel on the eMMC, listen closely. If you flashed with JetPack 3.1 and installed all of the options, then you probably have ~3.9GB free. This is just barely enough to build the system. Moreover, if you attempt to reboot the system after installation and there is not enough free space left on the eMMC, the system will not boot.

There are several options, such as clearing off space before you start building the kernel. Another option is to build the kernel and then offload the kernel sources and the compressed versions to another device, and then remove them from the eMMC. You have been warned.

The script files to build the kernel on the Jetson TX1 are available on the JetsonHacks Github account in the buildJetsonTX1Kernel repository.

$ git clone https://github.com/jetsonhacks/buildJetsonTX1Kernel.git
$ cd buildJetsonTX1Kernel

There are three main scripts. The first script, getKernelSources.sh gets the kernel sources from the NVIDIA developer website, then unpacks the sources into /usr/src/kernel.

$ ./getKernelSources.sh

After the sources are installed, the script opens an editor on the kernel configuration file. In the video, the local version of the kernel is set. The stock kernel uses -tegra as its local version identifier. Make sure to save the configuration file when done editing. Note that if you want to just compile a module or two for use with a stock kernel, you should set the local version identifier to match.

The second script, makeKernel.sh, fixes up the makefiles so that the source can be compiled on the Jetson, and then builds the kernel and modules specified.

$ ./makeKernel.sh

The modules are then installed in /lib/modules/

The third script, copyImage.sh, copies over the newly built Image and zImage files into the /boot directory.

$ ./copyImage.sh

Once the images have been copied over to the /boot directory, the machine must be restarted for the new kernel to take effect.

Note: The copyImage.sh script copies the Image file to the /boot directory of the current device. If you are using an external device such as a SSD as your root directory and still using the eMMC to boot from, you will need to copy the Image file to the /boot directory of the eMMC, and modify /boot/extlinux/extlinux.conf accordingly.

Spaces!

The kernel and module sources, along with the compressed versions of the source, are located in /usr/src

After building the kernel, you may want to save the sources off-board to save some space (they take up about 3GB). You can also save the boot images and modules for later use, and to flash other Jetsons from the PC host. The kernel sources are in the directory named kernel. The file source_release.tbz2 is the compressed version of the kernel and some other system packages. After saving them, you can remove the directory and file. Remember to use ‘sudo’ as this is part of the system area.

Conclusion

For a lot of use cases, it makes sense to be able to compile the kernel and add modules from the device itself.

Note

The video above was made directly after flashing the Jetson TX1 with L4T 28.1 using JetPack 3.1.

The post Build Kernel and ttyACM Module – NVIDIA Jetson TX1 appeared first on JetsonHacks.


Intel RealSense Camera librealsense – NVIDIA Jetson TX Dev Kits


Intel RealSense cameras can use an open source library called librealsense as a driver for the Jetson TX1 and TX2 development kits. Looky here:

Background

With the release of L4T 28.1, both the Jetson TX1 and Jetson TX2 run on a Linux 4.4 version kernel. The 4.4 kernel has built-in support of the RealSense camera formats in the UVC video module. Earlier versions of L4T running an earlier kernel version required that the kernel be rebuilt before using a RealSense camera. Here’s an earlier article for reference.

That’s great news, as it simplifies installation of the RealSense driver library, librealsense. Without having to rebuild the kernel, life is good.

Install librealsense

A convenience script has been created to help with this task in the installLibrealsense repository on the JetsonHacks Github account.

For the Jetson TX2:

$ cd $HOME
$ git clone https://github.com/jetsonhacks/installLibrealsenseTX2.git
$ cd installLibrealsenseTX2
$ ./installLibrealsense.sh

For the Jetson TX1:

$ cd $HOME
$ git clone https://github.com/jetsonhacks/installLibrealsenseTX1.git
$ cd installLibrealsenseTX1
$ ./installLibrealsense.sh

This will build the librealsense library and install it on the system. This will also setup udev rules for the RealSense device so that the permissions will be set correctly and the camera can be accessed from user space. There is a patch applied during the installation which fixes an issue with the UVC video module not being recognized. This issue has been addressed upstream in the librealsense repository, but is not in a release just yet.

Note: At this point, both of the Jetson scripts are identical because the same kernel is being used. Prior to the release of L4T 28.1, the Jetson TX2 and Jetson TX1 ran different kernel versions.

Notes

Here are some notes:

  • In the video above, the installation was done on a Jetson TX2 running L4T 28.1 immediately after being flashed by JetPack 3.1
  • Librealsense now uses CMake as its build system.
  • QtCreator and Qt 5 are installed as dependencies in the librealsense part of the install. There are QtCreator project files located in the librealsense.qt directory. The project files build the library and example files. If you do not use QtCreator, consider modifying the installer script to remove QtCreator.
  • These scripts install librealsense version v1.12.1
  • The RealSense R200 is the only camera tested at this time.
  • Examples using librealsense are located in ~/librealsense/build/examples
  • Intel RealSense Stereoscopic Depth Cameras is a comprehensive overview of the stereoscopic Intel RealSense RGBD imaging systems

The post Intel RealSense Camera librealsense – NVIDIA Jetson TX Dev Kits appeared first on JetsonHacks.

Stanford Lecture Collection | Convolutional Neural Networks for Visual Recognition


Ok, maybe you were busy flossing the cat, or assembling a dog from a kit and didn’t have time to check out the lectures for CS231n from Stanford, Spring 2017. Looky here:

Why You Should Watch This Series

One of the really interesting things that has happened over the last 10 years is that top universities are sharing their classes online. In the technical/computer world, both MIT and Stanford are leading the charge to share knowledge, with the only admission requirement being a connection to the Internet and a viewing device. There is a wealth of knowledge available at both the graduate and undergraduate level.

But you already know all that. While other people are watching inane YouTube videos, you are using that time to actually learn.

So you want to know what the next wave of Stanford entrepreneurs is going to be building on? You should watch this 16 lecture class. Here’s the blurb from the YouTube channel:

Computer Vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification, localization and detection. Recent developments in neural network (aka “deep learning”) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems. This lecture collection is a deep dive into details of the deep learning architectures with a focus on learning end-to-end models for these tasks, particularly image classification. From this lecture collection, students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in computer vision.

If you’re new to computer vision combined with machine learning, here’s all the background and current theory in one place. You should remember that this is university level. So while none of this requires a PhD in math, it does require more work than watching the latest cute kitten video.

And yes, this is what everyone is so excited about in the NVIDIA Jetson world.

The post Stanford Lecture Collection | Convolutional Neural Networks for Visual Recognition appeared first on JetsonHacks.

Quick Tip: Which Version of L4T is Running? – NVIDIA Jetson Development Kit


Ever wonder which L4T version is running on your NVIDIA Jetson Development Kit? There’s a script for that! Looky here:

Background

If you are a newcomer to the Jetson, some of the references to the operating system that runs on the device can be a little confusing. The operating system is Linux for Tegra, abbreviated as L4T, a Linux Ubuntu variant. Ubuntu runs on top of a Linux kernel, which is the lower-level part of the operating system that interacts with the computer and devices. That seems simple enough, but it tends to get confusing because of the different development environments that people use.

The NVIDIA recommendation is to install the OS and supporting packages using the JetPack installer from a host Ubuntu computer. JetPack installs L4T on the device, along with other support libraries selected by the user. The user may optionally select to install many of these tools on the host. This is useful for developers who wish to use the host PC as a cross development environment for the Jetson.

Each version of JetPack installs a specific version of L4T, and JetPack will typically note which version will be installed during the installation process. As a result, many people refer to the OS version by the JetPack version that installed it. As different JetPack releases and Jetson Dev Kit versions become more numerous, it becomes harder to correlate which version of L4T is actually installed on the device. It’s an obvious disconnect. Some of the more experienced Jetson developers can keep the map in their head, but for more normal folks it’s easier to look it up. Worse, if you use multiple Jetsons or you did not flash them yourself using JetPack, then sometimes it’s difficult to know what version is actually running.

Once you know the distinction between the OS version and JetPack, it’s easy enough to figure out. Linux developers will simply look in /etc/nv_tegra_release which describes the release in the first line (sample, GCID hidden):

# R28 (release), REVISION: 1.0, GCID: [an ID], BOARD: t186ref, EABI: aarch64, DATE: Thu Jul 20 07:59:31 UTC 2017

There’s the information. The release is R28 and the revision is 1.0, which gives L4T 28.1.0. The BOARD parameter, t186ref, indicates a Jetson TX2 Development Kit.

If you’re a bit like me, I think the polite term now is ‘simple’, you realize that it would be nice to have a little script that looks this up and prints it out.

Installation and Execution

There is a repository on the JetsonHacks account on Github which contains a Python script to help us along. To install:

$ git clone https://github.com/jetsonhacks/jetsonUtilities
$ cd jetsonUtilities

To run the script:

$ python jetsonInfo.py

Because the script is marked as executable, an alternative is to:

$ ./jetsonInfo.py

The result will be similar to:

Hardware Model Name not available
L4T 28.1.0
Board: t186ref
Ubuntu 16.04 LTS
Kernel Version: 4.4.38-tegra

In addition to the L4T version, the script prints out the Ubuntu version and kernel version which may be useful to know.

On some versions of the release, the Hardware Model of the board may be present, e.g. jetson_tx1. On the later versions of L4T this does not seem to be the case. Note that it is not an error.

Out in the real world, most people talk about the Ubuntu version. Therefore it’s useful at times to know which Ubuntu version that L4T is built from. For example, if you are using ROS there is a corresponding version for each Ubuntu release. It’s a slight distinction, but worth noting. For the most part, the L4T revision denotes the changes to get Ubuntu running on the Jetson hardware at the firmware, kernel and device tree level.

Conclusion

Hopefully this article gives a little bit of insight as to what other developers mean when they talk about which version of L4T is running on the Jetson. While an experienced developer can infer which version of L4T is installed on the device, the L4T designator is really the only way to know the current machine configuration.

Notes

As shown in the video, this script works on the Jetson TX2, Jetson TX1, and Jetson TK1.

The post Quick Tip: Which Version of L4T is Running? – NVIDIA Jetson Development Kit appeared first on JetsonHacks.

Mobile Robot Navigation – Localization – Mike Boulet


At a recent MIT/Lincoln Labs 2017 Beaver Works Summer Institute Seminar, Mike Boulet gave a lecture on localization with regards to mobile robot navigation. Looky here:

Background

There are many parts to mobile robot navigation. Since this information is used directly in programming the MIT RACECAR, an NVIDIA Jetson based robot, we cover it here.

Mobile robot navigation can be broken into these main categories:

  • Perception
  • Localization
  • Task and Route Planning
  • Motion Planning and Execution

This lecture covers the localization aspect of the task.

There are many more lectures available in this summer series, with a wide range of subject matter. We will be providing pointers to the lectures that directly address the RACECAR, but it’s worth going through the playlist to find other topics which may interest you.

Note that these lectures are given to high school senior students.

Note: In case there are browser issues, the YouTube address is: www.youtube.com/watch?v=QxqUwOGN7Uw

Note: Some people find it helpful to set the playback speed for these types of videos to 1.25X on YouTube, the setting is available in the settings menu. This saves a little time while watching, but the fidelity is still good enough to understand the lecture. You can always put it back to normal speed for the tricky bits.

The post Mobile Robot Navigation – Localization – Mike Boulet appeared first on JetsonHacks.

A Good Article: CSI Cameras on the TX2 (The Easy Way)


Recently Peter Moran wrote an article titled CSI Cameras on the TX2 (The Easy Way)

Money Quote:

We’re going to look at utilizing the Jetson’s image processing powers and capturing video from the TX2’s own special CSI camera port. Specifically, I’ll show you:

  • Why you’d even want a CSI camera.
  • Where to get a good CSI camera.
  • How to get high resolution, high framerate video off your CSI cameras using gstreamer and the Nvidia multimedia pipeline.
  • How to use that video in OpenCV and ROS.

If you have an interest in CSI cameras on the Jetson Development Kits, the article is well worth the read. It gathers up a lot of the information that is scattered about into one place, and gives an overview and code on how to integrate CSI cameras into your project.

Recommended.

The post A Good Article: CSI Cameras on the TX2 (The Easy Way) appeared first on JetsonHacks.
