Channel: JetsonHacks

NVIDIA Jetson Xavier Announcement


There is a lot of news in the Jetson world today. I know that a lot of it may be obscured by the big news that the JetsonHacks YouTube channel just passed one million views. Thanks for watching! We will cover the other events here just so they are not drowned out.

Jetson Xavier

At Computex 2018, NVIDIA announced the Jetson Xavier, the latest addition to the Jetson platform family. Money quote:

Jetson Xavier is designed for robots, drones and other autonomous machines that need maximum compute at the edge to run modern AI workloads and solve problems in manufacturing, logistics, retail, service, agriculture and more.

Here’s a beauty shot:

Jetson Xavier (courtesy NVIDIA)

Specs

Here are some of the technical specs:

System

  • GPU — 512-core Volta GPU with Tensor Cores
  • DL/ML Accelerator — (2x) NVDLA Engines (nvdla.org)
  • CPU — 8-core ARMv8.2 64-bit CPU, 8MB L2 + 4MB L3
  • Memory — 16GB 256-bit LPDDR4x | 137 GB/s
  • Storage — 32GB eMMC 5.1
  • Vision Accelerator — 7-way VLIW processor
  • Video Encode — (2x) 4Kp60 | HEVC
  • Video Decode — (2x) 4Kp60 | 12-bit support
  • Mechanical — 100mm x 87mm with 16mm Z-height
    (699-pin board-to-board connector)
  • Multiple operating modes at 10W, 15W, and 30W

I/O

  • Display — 3x eDP/DP/HDMI at 4Kp60 | HDMI 2.0, DP HBR3
  • Camera — 16x CSI-2 lanes (40 Gbps in D-PHY v1.2 or 109 Gbps in C-PHY v1.1)
    • 8x SLVS-EC lanes (up to 18.4 Gbps)
    • Up to 16 simultaneous cameras
  • PCIe — 5x 16GT/s gen4 controllers | 1×8, 1×4, 1×2, 2×1
    • (3x) Root Port + Endpoint
    • (2x) Root Port
  • USB
    • (3x) USB 3.1 (10GT/s)
    • (4x) USB 2.0 Ports
  • Ethernet — Gigabit Ethernet-AVB over RGMII
  • Other I/Os — UFS, I2S, I2C, SPI, CAN, GPIO, UART, SD

Jetson Xavier Developer Kit:

Jetson Xavier Development Kit (courtesy NVIDIA)

Discussion

Looking over the specs, we can see a couple of interesting items. First, there are 512 Volta-based CUDA cores and 8 ARM CPU cores. NVIDIA is quoting around 10 TFLOPS of FP32 performance! That equates to up to 20x the performance of the original Jetson TX2! Now, the Xavier can use up to 30W in max mode (2x the Jetson TX2), which accounts for some of the speed increase. The Jetson Xavier has 15W and 10W modes too.

A 256-bit wide memory bus also helps (the Jetson TX2's is 128-bit). The memory and eMMC are both the same size as the TX2's. I/O grows quite a bit: there are now 3 USB 3.1 ports on the module, along with 4 USB 2.0 ports. You can use up to 16 simultaneous cameras.
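The quoted 137 GB/s follows directly from the bus width and transfer rate. Here is a quick back-of-the-envelope check; note the 4266 MT/s rate is an assumption (a typical LPDDR4x speed), not a figure from the spec sheet:

```python
# Peak memory bandwidth = (bus width in bytes) x (transfers per second).
bus_width_bits = 256
transfer_rate_mts = 4266  # MT/s, assumed typical LPDDR4x transfer rate

bytes_per_transfer = bus_width_bits // 8          # 32 bytes per transfer
bandwidth_gb_s = bytes_per_transfer * transfer_rate_mts / 1000.0

print(f"{bandwidth_gb_s:.1f} GB/s")  # ~136.5 GB/s, matching the quoted 137 GB/s
```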

PCIe is Gen 4! This is faster than any currently shipping desktop system.

The Jetson Xavier Module is 100mm x 87mm with 16mm height. The Jetson Xavier DevKit Reference Carrier is only slightly larger and adds a large variety of I/O connectors such as USB and HDMI. The DevKit has a CSI camera slot, along with a PCIe connector and access to the GPIO connector. This is quite a change from previous generations with their much larger carrier board.

The Jetson Xavier does not have onboard Wi-Fi or Bluetooth, a change from the earlier Jetson TX modules. However, the Xavier DevKit has an M.2 Key E slot which allows developers to easily add Wi-Fi/Bluetooth/cellular connectivity.

There is an M.2 Key M slot for SSD storage located on the top of the carrier board (right beneath the module). The SD card is micro-SD format. There is also an eSATAp port.

Pricing: at the announcement, the MSRP of the Jetson Xavier Developer Kit is $1,299 (USD). The Jetson Xavier Developer Kit will be available for early access in August, with general availability starting in October.

Isaac

Also announced at Computex 2018 is NVIDIA® Isaac™ availability. Money quote:

The NVIDIA Isaac Software Development Kit (SDK) gives you a comprehensive set of frameworks, tools, APIs, and libraries to accelerate development of robotics algorithms and software.

Isaac Robotics Software

NVIDIA provides a toolbox for the simulation, training, verification, and deployment of software for the Jetson Xavier. This robotics software consists of:

  • Isaac SDK – a collection of APIs and tools to develop robotics algorithm software and runtime framework with fully accelerated libraries.
  • Isaac IMX – Isaac Intelligent Machine Acceleration applications, a collection of NVIDIA-developed robotics algorithm software.
  • Isaac Sim – a highly realistic virtual simulation environment for developers to train autonomous machines and perform hardware-in-the-loop testing with Jetson Xavier.

Conclusion

New stuff in the pipe! Mo' better, mo' faster. It promises to be a fun fall when we get our hands on these little beasties.

Updates

Updates since the original article:

  • Confirmation of Gen 4 PCIe
  • Clarification of Jetson Xavier Development Kit (It’s pictured above)
  • Development Kit has an M.2 Key E slot for wireless cards
  • Carrier board has an M.2 Key M slot for SSD storage on the top of the carrier
  • SD Card is micro format
  • eSATAp port
  • Development Kit has a CSI slot, similar to the Jetson TX1/TX2. There is no camera included in the Dev Kit.

The post NVIDIA Jetson Xavier Announcement appeared first on JetsonHacks.


Build OpenCV 3.4 with CUDA on NVIDIA Jetson TX1


For OpenCV to use CUDA acceleration on the NVIDIA Jetson TX1 running L4T 28.2 (JetPack 3.2), you will need to build OpenCV from source. Looky here:

Background

With the latest release of L4T, 28.2, OpenCV version 3.3 may be installed through the JetPack installer. At the time of the L4T release, OpenCV did not provide support for CUDA 9.0, with which L4T 28.2 ships. Over the following couple of months, OpenCV 3.4 added CUDA 9.0 support.

So what does that mean? Well, if you want OpenCV CUDA support under L4T 28.2 you need to compile it from source. Fortunately we have some convenience scripts to help with that task in the JetsonHacks repository buildOpenCVTX1 on Github.

Installation

You should note that OpenCV is a rich environment, and can be custom tailored to your needs. As such, some of the more common options are in the build command, but are not comprehensive. Modify the options to suit your needs.

If you saw the previous article about building OpenCV on the Jetson TX2, the process is similar. However, there is one difference which is very important.

The Jetson TX1 has a 16GB internal eMMC flash drive. Along with the operating system and the usual programs/libraries, this does not leave enough room on the eMMC to build the OpenCV library. As a result, you will need to build OpenCV on external media, such as an SD card, USB flash drive, or SATA disk.

Note: In the video we used this USB Flash Drive. We also have been using the Samsung T5 Portable SSD here lately, and really liking it.

The external drive must be formatted as Ext4, otherwise the operating system will get all confused when it tries to do things such as make symbolic links.

Library location

With this script release, the script now installs OpenCV in /usr/local. Earlier versions of this script installed in /usr. You may have to set your include and libraries and/or PYTHONPATH to point to the new version. See the Examples folder. Alternatively, you may want to change the script to install into the /usr directory.

All of this may lead to a conflict. You may consider removing OpenCV installed by JetPack before performing this script installation:

$ sudo apt-get purge libopencv*

Options

The buildOpenCV script has two optional command line parameters:

  • -s | --sourcedir Directory in which to place the OpenCV sources (default $HOME)
  • -i | --installdir Directory in which to install the OpenCV libraries (default /usr/local)

For example, to run the build script:

$ ./buildOpenCV.sh -s <file directory>

This will build OpenCV in the given file directory and install OpenCV in the /usr/local directory.

Make sure to read through the install script. In the script, here are some of the options that were included:

  • CUDA
  • Fast Math (cuBLAS)
  • OpenGL
  • GStreamer 1.0
  • Video 4 Linux (V4L)
  • Python 2.7 and Python 3.5 support

Build and Install

To download the source, build and install OpenCV:

$ git clone https://github.com/jetsonhacks/buildOpenCVTX1.git
$ cd buildOpenCVTX1
$ git checkout v2.0OpenCV3.4
$ ./buildOpenCV.sh -s <file directory>

You can remove the sources and build files after you are done:

$ ./removeOpenCVSources.sh -s <file directory>

where the <file directory> is the same as in the buildOpenCV command. This will remove the OpenCV source, as well as the opencv_extras directories.

Examples

There are two example programs in the Examples folder. Both require OpenCV to be installed with GStreamer support enabled, and both were last tested with L4T 28.2 and OpenCV 3.4.1.

The first is a simple C++ program to view the onboard camera feed from the Jetson Dev Kit.

To compile gstreamer_view.cpp:

$ gcc -std=c++11 `pkg-config --cflags opencv` `pkg-config --libs opencv` gstreamer_view.cpp -o gstreamer_view -lstdc++ -lopencv_core -lopencv_highgui -lopencv_videoio

To run the program:

$ ./gstreamer_view

The second is a Python program that reads the onboard camera feed from the Jetson Dev Kit and does Canny Edge Detection.

To run the Canny detection demo (Python 2.7):

$ python cannyDetection.py

With Python 3:

$ python3 cannyDetection.py

With the Canny detection demo, use the less than (<) and greater than (>) keys to adjust the edge detection parameters. You can pass the command line flag --video_device=<videoDeviceNumber> to use a USB camera instead of the built-in camera.
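Both demos open the onboard camera by handing OpenCV a GStreamer pipeline string. A minimal sketch of building such a string (element names assumed for the TX1/TX2 onboard CSI camera on L4T 28.2, not copied verbatim from the repository):

```python
def onboard_camera_pipeline(width=1280, height=720, fps=30):
    """Build a GStreamer pipeline string for the Jetson onboard CSI camera.

    nvcamerasrc captures from the CSI camera, nvvidconv converts out of
    NVMM memory, and videoconvert produces the BGR frames OpenCV expects.
    """
    return (
        "nvcamerasrc ! "
        f"video/x-raw(memory:NVMM), width=(int){width}, height=(int){height}, "
        f"format=(string)I420, framerate=(fraction){fps}/1 ! "
        "nvvidconv ! video/x-raw, format=(string)BGRx ! "
        "videoconvert ! video/x-raw, format=(string)BGR ! appsink"
    )
```

With OpenCV built with GStreamer support, the string can be passed to `cv2.VideoCapture(onboard_camera_pipeline(), cv2.CAP_GSTREAMER)`.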


New L4T Releases – NVIDIA Jetson TK1 and Jetson TX2


NVIDIA announces the availability of version 21.7 of L4T for the Jetson TK1 Development Kit, and version L4T 28.2.1 for the Jetson TX2/TX2i Development Kits.

Jetson TK1

The L4T 21.7 software maintenance release for the Jetson TK1 is available at: https://developer.nvidia.com/linux-tegra-r217. There are a couple of useful bug fixes.

Here are the release notes: https://developer.download.nvidia.com/embedded/L4T/r21_Release_v7.0/Tegra_Linux_Driver_Package_Release_Notes_R21.7.0.pdf

Jetson TX2/TX2i

For the Jetson TX2/TX2i Development Kits, there is a new maintenance release of L4T, 28.2.1. There are several bug fixes, plus the source code for Cboot is made publicly available. Here are the L4T 28.2.1 release notes: https://developer.download.nvidia.com/embedded/L4T/r28_Release_v2.1/Tegra_Linux_Driver_Package_Release_Notes_R28.2.1.pdf

The Jetson TX1 remains at L4T version 28.2.

In order to install the new release of L4T 28.2.1, use the new version of JetPack, version 3.2.1.

JetPack Download and Install

Here are the instructions to download and install JetPack: https://docs.nvidia.com/jetpack-l4t/index.html#jetpack/3.2.1/install.htm

JetPack 3.2.1 release notes: https://developer.nvidia.com/embedded/jetpack-notes

Good luck on your new installs!

The post New L4T Releases – NVIDIA Jetson TK1 and Jetson TX2 appeared first on JetsonHacks.

How To Package OpenCV 3.4 with CUDA – NVIDIA Jetson TX Dev Kits


Wouldn’t it be nice to be able to package OpenCV into an installer after a build, such as in our previous article, Build OpenCV 3.4 with CUDA on Jetson TX2? Well, you can! In fact, the OpenCV build system makes this pretty simple.

Note: There’s also an article for the Jetson TX1: Build OpenCV 3.4 with CUDA – Jetson TX1. Both of the repositories are basically the same, but set the architecture for the particular processor in use.

The advantage of this approach is that you will only need to build OpenCV to the configuration you desire once, and can reinstall it easily when you need to recreate or reflash your Jetson environment.

Building the Package

Here’s a walk through video for building the OpenCV package. This is for a Jetson TX2, but the instructions for a Jetson TX1 are similar. Looky here:

Note: In the video we used this USB Flash Drive to build the package. We also have been using the Samsung T5 Portable SSD here lately, and really liking it. The Samsung T5 is currently at a great price/performance point, and doesn’t tend to get lost as easily as the thumb drives. The USB drive is connected via an Amazon 7 Port USB 3.0 Hub, and the webcam is a Logitech c920.

In the buildOpenCVTX2 repository on the JetsonHacks account on Github (for the Jetson TX1 buildOpenCVTX1 repository) there is a script called ‘buildAndPackageOpenCV.sh’. The script includes the commands in the buildOpenCV.sh script, along with the commands to package the build.

The flag CPACK_BINARY_DEB tells CMake to package the build. If you read the script, you will see that after OpenCV is built and installed, there is a command:

$ sudo make package

This tells the system to build the binary package files. These files come in three flavors: .deb files, a .tar file, and a .sh shell script which is a self-contained installer.

The buildAndPackageOpenCV script has two optional command line parameters:

-s | --sourcedir Directory in which to place the OpenCV sources (default $HOME)
-i | --installdir Directory in which to install the OpenCV libraries (default /usr/local)

For example, to run the build script:

$ ./buildAndPackageOpenCV.sh -s <file directory>

This example will build OpenCV in the given file directory and install OpenCV in the /usr/local directory.

The corresponding .deb files will be in the <file directory>/opencv/build directory in .deb file and compressed forms.

As shown in the video, the source directory can be set to external media such as a USB drive or SATA drive.

Note: On L4T 28.2, the default installation location for the OpenCV libraries is /usr. The default location for the OpenCV 3.4 build presented here is /usr/local.

Installing the Package

While OpenCV 3.4 is installed in the preceding procedure, there will be times that you want to generate a new system and install OpenCV 3.4 without having to compile it from source. With the OpenCV package files from the above build process, you can simply use the package files as installers. For the simplest way, looky here (first few minutes):

You will probably want to copy the package files to another directory. In the <file directory>/opencv/build directory, the OpenCV package file names start with ‘OpenCV’.

There will be some .deb files, a .tar file, and a .sh file. These files may differ from the video depending on the build options you select for OpenCV.

If you are familiar with Linux, you probably already know what to do with the .deb file or .tar files. If not, you can use the .sh file.

First, you will want to remove the current OpenCV installation:

$ sudo apt-get remove libopencv*

Note: You may want to ‘purge’ instead of ‘remove’. The purge command will remove the corresponding .deb files of the older versions from the system if installed.

Next, switch over to the place where you have your OpenCV package shell script and execute:

$ ./OpenCV-3.4.1-<version>.sh --prefix=/usr

The ‘prefix’ flag tells the installer where to place OpenCV. In a standard L4T installation, OpenCV is installed in /usr.

This is just one way to install OpenCV; you may want to install it using different methods or locations.

Testing with YOLO

In the installation video, we install and test with YOLO. Looky here:

The instructions are taken from JK Jung’s excellent blog: https://jkjung-avt.github.io/yolov3/. If you haven’t run across that blog before, take the time to stop and look around. There is a lot of great Jetson related content!

YOLO itself means “You Only Look Once”. YOLO is a state-of-the-art, real-time object detection system. Checkout the YOLO website: https://pjreddie.com/darknet/yolo/

There’s quite a lot of information there, take your time and absorb it.

Conclusion

I think this is a pretty useful addition to the build toolbox. I use it quite a bit. Have fun!

The post How To Package OpenCV 3.4 with CUDA – NVIDIA Jetson TX Dev Kits appeared first on JetsonHacks.

Jetson TX2 Build Kernel for L4T 28.2.1 updated


A quick note, the repository for building the Linux kernel onboard the NVIDIA Jetson TX2 development kit has been updated. The repository is on the JetsonHacks account on Github, buildJetsonTX2Kernel.

I want to thank Shreeyak (https://github.com/Shreeyak) for pointing out a version identification issue, as well as for building an alternative that pulls the kernel sources directly from the NVIDIA Git repositories. While I ultimately chose not to use this approach in the JetsonHacks scripts, there is a pull request available which implements it. If you are developing on the Jetson professionally, you and your team should seriously consider this approach.

The post Jetson TX2 Build Kernel for L4T 28.2.1 updated appeared first on JetsonHacks.

Librealsense Update – NVIDIA Jetson TX Dev Kits


Recently the Intel Librealsense Development Team added CUDA support to the Librealsense SDK. Here’s how to install it on the Jetson TX Dev kits. Looky here:

Background

As you may recall, we added CUDA support to librealsense in a previous article, Now with CUDA! Intel RealSense D400 cameras – NVIDIA Jetson TX. Recently I received a nice note from the Librealsense Development Team stating that they had used the code as a basis for adding CUDA support to librealsense (starting in version 2.13.0).

Intel took a look at the performance (from https://github.com/IntelRealSense/librealsense/pull/1866):

Initial CUDA support for conversions and pointcloud generation.

Performance improvement:

As tested on the Jetson TX2:

Description                    Resolution   fps   No CUDA, avg. (ms)   With CUDA, avg. (ms)
YUY2 into RGB8                 1920×1080    30    30 - 32              5 - 7
YUY2 into RGBA8                1920×1080    30    34                   7.5
YUY2 into BGR8                 1920×1080    30    35                   5 - 7
YUY2 into BGRA8                1920×1080    30    37                   7 - 10
YUY2 into Y16                  1920×1080    30    10                   5 - 8
Splitting Y8I to Y8, Y8        1280×800     30    9 - 10               3 - 5
Splitting Y12I to Y16, Y16     1280×800     25    22.9                 5 - 7
Pointcloud (deproject depth)   640×480      30    8 - 9                3 - 5

In order to activate GPU full power run: sudo ./jetson_clocks.sh

You can see that the GPU code gives a 4-7x speedup over the CPU-only code on the Jetson. The term fps means frames per second. Remember that when you are running at 30 fps, each frame arrives about every 33 ms.
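The arithmetic behind that remark: at 30 fps the per-frame budget is about 33 ms, so a ~31 ms CPU-only YUY2-to-RGB8 conversion consumes nearly the whole frame time, while the CUDA version leaves most of it free for the rest of the pipeline. A sketch (the midpoint values are taken from the table above):

```python
def frame_budget_ms(fps):
    """Time available to process each frame, in milliseconds."""
    return 1000.0 / fps

budget = frame_budget_ms(30)     # ~33.3 ms per frame
cpu_ms, cuda_ms = 31.0, 6.0      # midpoints of the quoted YUY2->RGB8 ranges

print(f"CPU path uses {cpu_ms / budget:.0%} of the frame budget")   # 93%
print(f"CUDA path uses {cuda_ms / budget:.0%} of the frame budget")  # 18%
```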

Note: Because space is tight on the Jetson TX1, you should first flash the Jetson TX1 with L4T 28.2 using JetPack 3.2.1, download the buildLibrealsense2TX repository, and then build the patched kernel. Once the kernel is in place, remove the kernel sources to give you enough space to operate.

Installation

On the JetsonHacks Github account, there is a repository buildLibrealsense2TX. To download the repository:

$ cd $HOME
$ git clone https://github.com/jetsonhacks/buildLibrealsense2TX
$ cd buildLibrealsense2TX
$ git checkout v0.9

The instructions are much the same as in the previous article. More details are available on the README.md file in the buildLibrealsense2TX directory.

Explain please

First, a word about what we’re doing. There are several RealSense camera formats that the standard UVC (video) module in the kernel does not support. librealsense provides patches to add those video modes to the kernel appropriately. The patches also add support for properly attaching timestamps to the incoming video stream. In addition, there are several patches that modify some of the Industrial I/O (IIO) tree. These patches provide support for supplemental hardware on a RealSense camera, such as a 3D gyroscope or 3D accelerometer.

Note: Some of the patches apply to modules that are built into the kernel itself. Because these modules are required to be in the Image, and not built as external, you will need to compile the kernel itself along with the modules.

If you do not modify the Linux kernel, librealsense will mostly appear to work. You will probably experience issues a little further along when you are trying to get better precision and accuracy from the video and depth streams. As the RealSense D435 camera does not have supplemental hardware, the IIO patches don’t make any difference.

If you’re just looking to play around with the camera, you may be able to get away with not compiling the kernel and skip over it. If you’re more serious, you’ll have to start patchin’ !

Building the Kernel

Note: If you built your kernel as in the previous article, you must rebuild it again!

Building the kernel can be taxing; there are many little things that can go wrong. Plus, your Jetson can become a brick if something goes wrong at an untoward moment. If you’re a go-for-it kind of person:

$ ./buildPatchedKernelTX.sh

If something does go wrong during the build, you may want to try to debug it. As part of its cleanup process, the buildPatchedKernel script erases all of the source files and build files that it has downloaded and built. You can pass nocleanup as a command line flag so it keeps those files around. Hopefully you can fix everything.

$ ./buildPatchedKernelTX.sh --nocleanup

Actually, this script is more useful as a template for rebuilding your own kernel with the librealsense changes. There are two scripts in the ‘scripts’ directory for helping in the kernel build:

patchKernel.sh applies the librealsense kernel patches.

configureKernel.sh Configures the kernel to add the appropriate modules needed by librealsense.

Since you’re a developer, you should be able to figure out what the scripts are doing and modify them to match your needs.

librealsense Installation

Make sure that the RealSense camera is not attached to the system. Then install the library:

$ ./installLibrealsense.sh

The script does several things:

  • Installs dependencies
  • Builds CMake
  • Builds librealsense and associated apps

CMake 3.8+

The librealsense build system requires CMake version 3.8 or above. The standard Jetson TX version of CMake is 3.5.1 for L4T 28.2.x. Therefore the script builds CMake (in the home folder). However, the script does not install CMake into the system area; it just uses the new version directly from the build folder. Of course, you can install the new version of CMake on your system if so desired.

After the script compiles the library, the new files are placed in the following directories:

  • The library is installed in /usr/local/lib
  • The header files are in /usr/local/include
  • The examples and tools are located in /usr/local/bin

The script also sets up a udev rule so that the RealSense camera is available in user space.

Once the library is installed, plug the camera into the Jetson, or into the Jetson through a powered USB 3.0 hub. You can then go and execute the tools and examples. For example:

$ cd /usr/local/bin
$ ./realsense-viewer

Conclusion

It is nice to have the CUDA upgrade for librealsense in the official repository. Because the RealSense SDK is under heavy development, we will have to keep our eye out for improvements in the weeks ahead!

Notes

  • In the video, a Jetson TX2 is shown
  • L4T 28.2.1 installed by JetPack 3.2.1
  • buildLibrealsense2TX is version v0.9


The post Librealsense Update – NVIDIA Jetson TX Dev Kits appeared first on JetsonHacks.

I2C – NVIDIA Jetson TX2 Development Kit


It is straightforward to connect an I2C device to a Jetson TX2. Looky here:

Background

I2C is a straightforward serial protocol. There are usually two wires: one for transferring data (SDA), the other a clock (SCL) which marks the beginning and end of data packets. Most devices will also require power (VCC) and ground (GND). There are several I2C busses on the NVIDIA Jetson TX2 Development Kit. You can access I2C bus 0 and I2C bus 1 on the J21 GPIO header.
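To make the SDA/SCL description concrete, here is a sketch of the bit-level frame a master drives for a single-byte I2C write (simplified: 7-bit addressing, no clock stretching, and the device's ACK slots shown as markers):

```python
def i2c_write_bits(address, data):
    """Sequence the master drives on SDA for a one-byte write, MSB first.

    Frame layout: START condition, 7-bit address, R/W bit (0 = write),
    device ACK slot, 8 data bits, device ACK slot, STOP condition.
    """
    addr_bits = [(address >> i) & 1 for i in range(6, -1, -1)]  # 7 address bits
    data_bits = [(data >> i) & 1 for i in range(7, -1, -1)]     # 8 data bits
    return ["START"] + addr_bits + [0, "ACK"] + data_bits + ["ACK", "STOP"]

# 0x70 is the LED backpack's default address used later in this article;
# 0x21 is an arbitrary example data byte.
frame = i2c_write_bits(0x70, 0x21)
```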

Hardware

Note: A Jetson TX2 with 64-bit L4T 28.2.1 (JetPack 3.2.1) is shown in the demo.

First, before powering up the Jetson, let’s wire up the LED Segment Display. Here’s the pinout of the J21 GPIO Header. In our example, we power the display from the Jetson GPIO header at 5V.

For this example project, an Adafruit 0.56″ 4-digit 7-segment Display w/ I2C Backpack – Green is wired to the Jetson. The display is assembled per the Adafruit instructions.

On a Jetson TX2, here’s a wiring combination for I2C Bus 1:

GND J21-6 -> LED Backpack (GND)
VCC J21-2 -> LED Backpack (VCC – 5V)
SDA J21-3 -> LED Backpack (SDA)
SCL J21-5 -> LED Backpack (SCL)

Note that the TX2 also has an I2C Bus 0 interface. See the J21 Pinout Diagram.

If you wish to interface with I2C Bus 0:

GND J21-6 -> LED Backpack (GND)
VCC J21-2 -> LED Backpack (VCC – 5V)
SDA J21-27 -> LED Backpack (SDA)
SCL J21-28 -> LED Backpack (SCL)

Note: To use Bus 0 with the example, you will need to modify the example source code.

Software Installation

Once the board is wired up, turn the Jetson on.
Install the libi2c-dev library. In order to be able to inspect the LED display, you may find it useful to also install the i2c tools:

$ sudo apt-get install libi2c-dev i2c-tools

After installation, in a Terminal execute (1 is the I2C bus in this case):

$ sudo i2cdetect -y -r 1

ubuntu@tegra-ubuntu:~$ sudo i2cdetect -y -r 1
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: 70 -- -- -- -- -- -- --

You should see an entry of 0x70, which is the default address of the LED Segment Display. If you have soldered the address pins on the Display to change the address, you should see the appropriate address.

Next, install the library and example code which is available in the JHLEDBackpack repository on the JetsonHacks Github account. To install:

$ git clone https://github.com/jetsonhacks/JHLEDBackpack.git
$ cd JHLEDBackpack
$ cd example

You are then ready to compile the example and run it.

$ make
$ sudo ./displayExample

The display will go through a couple of examples: a blinking set of dashes, a hexadecimal display, a floating point number display, a countdown timer, and a clock example. Hit the ‘Esc’ key during the clock example to end the demo.
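Under the hood, the HT16K33 backpack lights each digit from a per-digit bitmask. A sketch of the conventional 7-segment encodings (these are the standard segment patterns, not copied from the JHLEDBackpack library):

```python
# Standard 7-segment patterns: bit 0 = segment A (top), proceeding
# clockwise, with bit 6 = segment G (the middle bar).
SEGMENTS = {
    0: 0x3F, 1: 0x06, 2: 0x5B, 3: 0x4F, 4: 0x66,
    5: 0x6D, 6: 0x7D, 7: 0x07, 8: 0x7F, 9: 0x6F,
}

def encode_number(value):
    """Bitmasks for the digits of a non-negative integer, left to right."""
    return [SEGMENTS[int(d)] for d in str(value)]

masks = encode_number(1234)  # [0x06, 0x5B, 0x4F, 0x66]
```

A driver like the one in the example writes each mask to the backpack's display RAM over I2C, one digit position at a time.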

The library defaults to I2C Bus 1. If you want to use Bus 0, modify the example file displayExample.cpp:

HT16K33 *displayMatrix = new HT16K33() ;
// Add the following line
displayMatrix->kI2CBus = 0; // Use I2C bus 0
int err = displayMatrix->openHT16K33();

Make sure you save the file, and run make on it.

Notes

Equipment and Supplies

The segmented LED display is a kit. You will need some elementary soldering skills for assembly. We tend to use:

New to electronics? This is a pretty easy project, looky here: Electronics Tutorials for some introductory material on how to start becoming a master.

Conclusion

Accessing the I2C bus on the Jetson TX2 J21 GPIO header is straightforward, and makes for easy prototyping!

The post I2C – NVIDIA Jetson TX2 Development Kit appeared first on JetsonHacks.

RACECAR/J – ROS Teleoperation


After completion of the RACECAR/J robot assembly, ROS software installation and VESC programming, it is time to test teleoperation using a game controller. Looky here:

Background

In the software installation article, we installed a custom ROS software stack for the MIT RACECAR. There is a special node, joy_teleop, which is lightly modified to support the RACECAR for use with a game controller. The standard controller in the package is a Logitech Gamepad F710. It is possible to use other game controllers by modifying the scripts.

Teleoperation

Once the robot is assembled, software loaded and VESC programmed, the Jetson RACECAR is ready for teleoperation. The game controller used in the video is a Logitech Gamepad F710.

Note: If you purchased a full RACECAR/J base kit (RACECAR/J Robot Base Kit or RACECAR/J “FlatNose” Robot Base Kit), the FOCBOX VESC will arrive programmed. If you purchased a VESC separately, you will need to program it to operate the robot. See: Programming the Electronic Speed Controller.

Power on the robot. Plug the battery in the chassis into the VESC. Plug the Jetson and peripherals into a power source, either a battery for the electronics on the car or using the Jetson power brick if the robot is on a bench.

Note: Use caution if the RACECAR is on a bench. You should place the robot on a pit stand or perhaps the Jetson cardboard box to make sure that the wheels do not come in contact with the bench. Also, make sure that none of the cables can come in contact with any moving parts. Let’s just say bad things can happen if the wires wrap around the spinning parts, or the robot starts moving unexpectedly.

In order for the Jetson to recognize the F710, the small switch on the back of the game controller must be in the ‘D’ position. If the game controller is in the ‘X’ position, it may not be detected.

Logitech F710 (Back)

The Logitech F710 connects to the Jetson over a wireless USB dongle. The dongle should be plugged into the USB Hub which is connected to the Jetson. In order to connect to the Jetson, power the F710 on (using the center button labeled ‘Logitech’). You can press the ‘VIBRATION’ button to tell if the F710 is powered on. The ‘VIBRATION’ button should cause the game controller to rumble in your hand.

Logitech F710

Once connected, the game controller should show up as /dev/input/js0

The F710 has two modes, selected by the button labeled ‘MODE’. The correct mode for the Jetson RACECAR is the one in which the left joystick controls axes 0 and 1, and the right joystick controls axes 2 and 3. There is an LED next to the MODE button. If the LED is green, the controller is in the wrong mode; the light should be off.

You can test a game controller to make sure it is working with the Jetson using the jstest program.

$ sudo jstest /dev/input/js0

Note: jstest is located in the joystick debian package. If it is missing you can install it:

$ sudo apt-get install joystick

Once the game pad is sending information, you can teleoperate the robot. Make sure that the wheels are clear of any obstructions.

$ cd racecar-ws
$ source devel/setup.bash
$ roslaunch racecar teleop.launch

The robot has a deadman switch: the upper button (labeled LB) on the left horn of the game pad. While holding the deadman button, you can control the throttle with the left joystick and the steering with the right joystick.

Switch the joystick axis

As configured in the repository on the MIT RACECAR account on Github, the right joystick steers the car using its vertical axis. Most computer games use the horizontal axis to steer. To switch the axis, edit the YAML file located at:

~/racecar-ws/src/racecar/racecar/config/racecar-v2/joy_teleop.yaml

In the section:

# Enable Human control by holding Left Bumper
human_control:
  type: topic
  message_type: ackermann_msgs/AckermannDriveStamped
  topic_name: low_level/ackermann_cmd_mux/input/teleop
  deadman_buttons: [4]
  axis_mappings:
    - axis: 1
      target: drive.speed
      scale: 2.0    # joystick will command plus or minus 2 meters / second
      offset: 0.0
    - axis: 3
      target: drive.steering_angle
      scale: 0.34   # joystick will command plus or minus ~20 degrees steering angle
      offset: 0.0

Change the line

axis: 3

to:

axis: 2

This should change from the vertical to the horizontal axis on the game controller.
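Each axis_mappings entry is a simple linear map from the joystick axis (reported in [-1, 1]) to a command: command = axis × scale + offset. A sketch of what joy_teleop effectively computes:

```python
def map_axis(axis_value, scale, offset=0.0):
    """joy_teleop-style linear mapping from a joystick axis to a command."""
    return axis_value * scale + offset

# Full deflection on the speed axis (scale 2.0) commands 2 m/s;
# full deflection on the steering axis (scale 0.34) commands
# about 0.34 radians (~20 degrees) of steering angle.
speed = map_axis(1.0, 2.0)       # 2.0 m/s
steering = map_axis(-1.0, 0.34)  # -0.34 rad
```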

Conclusion

Setting up the robot for teleoperation is straightforward. Note that this is different from how a radio controlled car normally works. In an unmodified vehicle, a radio receiver receives information from a handheld controller. The receiver converts this into PWM pulses which the ESC interprets as throttle and steering information.

In the case of RACECAR/J, the game controller is sending information to the Jetson computer. The Jetson then interprets the game controller commands and sends that information as commands to the ESC. This subtle, but important, difference means that additional autonomous algorithms can be integrated using the computer as a mediator.
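A hobby ESC expects servo-style PWM pulses, conventionally 1000-2000 µs wide with 1500 µs as neutral (that range is the common RC convention, an assumption here, not a RACECAR/J specification). The Jetson's role as mediator amounts to converting a normalized command into that pulse width:

```python
def command_to_pulse_us(command, neutral=1500, span=500):
    """Map a normalized command in [-1, 1] to an RC PWM pulse width in microseconds."""
    command = max(-1.0, min(1.0, command))  # clamp to the valid range
    return neutral + command * span

full_forward = command_to_pulse_us(1.0)   # 2000 us
stopped = command_to_pulse_us(0.0)        # 1500 us
full_reverse = command_to_pulse_us(-1.0)  # 1000 us
```

Because the command originates in software rather than a radio receiver, an autonomous node can publish the same commands the game controller does.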

The post RACECAR/J – ROS Teleoperation appeared first on JetsonHacks.


Magic Leap One – NVIDIA Jetson TX2


We have known for some time (February, 2017) that Magic Leap has been using the Jetson TX2 for research and development:

Image: Business Insider

This shot shows a rig in development. While the purpose of that particular hardware is a little muddy, the recent announcement of the Magic Leap One Developer Edition is crystal clear.

Teardown!

We have little interest in hype, but we certainly have a lot of interest in the inner workings of a state of the art augmented reality device!

iFixit has a wonderful teardown of the Magic Leap One hardware. Looky here:

This is quite an amazing video, extremely well written and presented. There’s a bunch of good stuff in there! There are enough technical bits so that you can understand how it works, but it’s not overwhelming. It’s geeky enough for geeks!

You should read the full article: Magic Leap One Teardown

Walk Through

Finding a review with balance of the hype versus reality versus what is currently technically possible is difficult. Here’s one of the best of the lot from Adam Savage’s Tested: Magic Leap One Augmented Reality Review!

Tested has a show/web series “Perceptions” which covers the VR/AR space. This means that they have experience with the current VR systems as well as the current AR systems, such as the Microsoft Hololens. Their review is much more balanced than some of the others out there because they know the current market space.

Their description of the focal plane implementation is spot on, and is also one of the key aspects as to why this is such a difficult problem area. Waveguide displays (Magic Leap calls theirs a “photonics lightfield chip”) have been around for a while, but this is certainly a push forward. Multiple waveguide displays have the potential of making AR look more lifelike.

Jetson in the Middle!

As Jetson developers, this is very interesting. There are a wide range of connected peripherals such as cameras, speakers, and multiple displays that require near real-time processing.

Some people from the peanut gallery have commented on the price ( ~ $2295 USD ), the practicality of the device, and all sorts of things. It’s almost as if they don’t even read the first line of the first paragraph. This is a developer kit. We all know developers are special. If the comments come from people who are not developers actually using the device, that’s a good time to hit the mute or ignore button. Developers are very much capable of blustering on their own, thank you very much.

This is the developer hardware. It still has to go through all the production engineering to get ready for the consumer market. That process tends to make the product much better. It will be interesting to see what is on the horizon.

Conclusion

We get a lot of general questions comparing the Jetson Development Kits to other single board computers (SBC). “What is this for?” “What is it used for?” “Why should I buy this?” This is mostly from hobbyists comparing the Jetsons to a Raspberry Pi or some such. It’s not a fair comparison for a variety of reasons. The products are for different markets. The Jetson Development Kits are ostensibly for developers who are building state of the art devices like, well, Magic Leap One. Obviously the form factor of the Jetson TX2 product in this case is different from the development kit’s full size mini-ITX carrier board. However, having the dev kit means that you can start developing immediately without having to bring up a new carrier board.

At the very least, this is a device which shows the true capabilities of the Jetson given enough development time and resources. Impressive.

The post Magic Leap One – NVIDIA Jetson TX2 appeared first on JetsonHacks.

TensorFlow for NVIDIA Jetson TX2


NVIDIA now has an official release for TensorFlow on the NVIDIA Jetson TX2 Development Kit!

This makes installing TensorFlow on the Jetson much less challenging. Here’s the shortcut version:

For Python 2.7

$ pip install --extra-index-url=https://developer.download.nvidia.com/compute/redist/jp33 tensorflow-gpu

For Python 3.5

$ pip3 install --extra-index-url=https://developer.download.nvidia.com/compute/redist/jp33 tensorflow-gpu

Here is the original announcement and the full installation document.

Here are some other useful links

NVIDIA DL frameworks guides

Jetson Downloads

Enjoy!

The post TensorFlow for NVIDIA Jetson TX2 appeared first on JetsonHacks.

Links to Jetson Xavier Resources


Exploring ROS – RACECAR/J


Exploring ROS is easy using both the built-in tools and additional GUI based tools. Looky here:

This is a little different from most of the content on JetsonHacks. Basically we bring up a running ROS system using the MIT RACECAR stack and take a look around. This gives some background on how to use the ROS introspection tools. There are several ways to gather the same information, using both graphics based tools and command line tools. In the video, we mix these up a bit so that you can get a feel for how to explore on your own.

To be clear, PLAY! A lot of people ask “How do I do this?” or “I want to do this, what should I do?”. These types of projects are not about some rote method of “Do this, do that”. Get in there, take things apart, put things together! Look around, play and see what happens. You will need to understand how all this works and ties together, and the only way to do that is to get your hands dirty.

Many people ask us questions such as “How do I do this?” You will always receive the reply, “What did you try?” If you haven’t tried anything, you don’t have a foundation from which to ask a question. Therefore we can’t help you. On the other hand, if you say “I tried this, that and the other thing and it didn’t work”, there’s a starting point on which to build. Plus, you’re smart enough to figure most of this out on your own, you don’t need help.

Same thing with general questions. If a simple one sentence question requires a PhD dissertation as an answer, you should consider breaking the question into something just a little more specific.

Background

Because of the format of the video which is an interactive exploration of the RACECAR stack, we’ll just have some notes here. You should definitely have the book “Programming Robots with ROS, a practical introduction to the Robot Operating System” by Morgan Quigley, Brian Gerkey and William D. Smart. The book is written by the people who wrote ROS originally, and has a very good overview of ROS and practical application.

Notes

rqt is a Qt-based framework for GUI development for ROS. It consists of three parts/metapackages: rqt itself, rqt_common_plugins, and rqt_robot_plugins.

In the video, we load rqt:

$ sudo apt-get install ros-kinetic-rqt -y
$ sudo apt-get install ros-kinetic-rqt-common-plugins -y
$ sudo apt-get install ros-kinetic-rqt-robot-plugins -y

You should know that most of the command line tools shown in the video have equivalents in the rqt system. Again, Play! There is so much more than can be covered in a few minutes of video. We chose to use command line tools in addition to rqt so that you would be exposed to both ways of doing the same tasks. It is easier to see the full graph of the system in rqt, it’s easier to see the messages using the command line.

We use several command line tools:

These are specific tools, and are useful. There is a very rich environment of documentation and examples, both on the ROS Wiki and elsewhere. Please use those resources to become familiar with the concepts in use here.

One thing we didn’t cover in the video is the mechanism which ROS uses to communicate between nodes. We know that ROS topics provide access to ROS messages.

ROS is pretty simple conceptually, but as in most distributed systems it can turn into a tangle when you try to examine it. There is roscore, which is the minimum set of ROS nodes and the server which constitute the ROS “kernel”, if you will. The server is “ROS Master” (you may recall this from setting up ROS). ROS Master conceptually keeps track of all the nodes. When a node registers a publish or subscribe request, ROS Master looks up the corresponding nodes interested in that information based on the rostopic the node wants to use. ROS Master also communicates with a parameter server to inform nodes of how messages are constructed.

ROS Master has other responsibilities of course, but it’s useful to know at least that it brokers the lower level communication; once nodes have been introduced to each other, they exchange messages directly.
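The matchmaking role described above can be sketched with a toy registry. This is purely an illustration of the idea, not the real ROS Master implementation (which negotiates over XML-RPC); the class and method names are made up for the example.

```python
# A toy matchmaking registry in the spirit of ROS Master: the master
# only pairs publishers with subscribers by topic name. After the
# introduction, the nodes would talk to each other directly.

class ToyMaster:
    def __init__(self):
        self.publishers = {}   # topic -> list of publisher node names
        self.subscribers = {}  # topic -> list of subscriber node names

    def register_publisher(self, node, topic):
        self.publishers.setdefault(topic, []).append(node)
        # Tell the node which subscribers are already waiting on this topic.
        return list(self.subscribers.get(topic, []))

    def register_subscriber(self, node, topic):
        self.subscribers.setdefault(topic, []).append(node)
        # Tell the node which publishers currently offer this topic.
        return list(self.publishers.get(topic, []))

master = ToyMaster()
master.register_publisher("/joy_node", "/joy")
peers = master.register_subscriber("/joy_teleop", "/joy")
print(peers)   # ['/joy_node'] -- the subscriber now knows who to connect to
```

Note that the master never carries message traffic itself; it only hands out introductions.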

Conclusion

This is the first part of an occasional series of post of hands on with the RACECAR ROS stack. We are planning more in the future, stay tuned!

The post Exploring ROS – RACECAR/J appeared first on JetsonHacks.

JetPack 4.0 Developers Preview


NVIDIA has announced that a developer preview early access release for JetPack 4.0 is now available. This release is to support the new Jetson Xavier Developer Kit. We reproduce the original announcement, known as the good bits, here for your convenience. Here is the original link.

JetPack 4.0 Developer Preview EA components

  • L4T R31.0.1
  • Ubuntu 18.04 LTS aarch64
  • CUDA 10.0
  • cuDNN 7.3
  • TensorRT 5.0 RC
  • VisionWorks 1.6
  • OpenCV 3.3.1

Release Highlights

  • Initial Linux BSP support for Jetson Xavier with L4T R31.0.1
  • Support for Ubuntu 16.04 and Ubuntu 18.04 on the host
  • Support for Ubuntu 18.04 on the target (Unity desktop is kept as default for this release)
  • TensorRT 5.0 support for GPU INT8, and CUDA HMMA/IMMA Tensor Core operations
  • TensorRT 5.0 initial EA support for DLA FP16

Note: JetPack 4.0 Developer Preview EA is intended for developers to immediately get started prototyping their applications with Jetson Xavier, and is not a production release. NVIDIA will provide future updates to JetPack for Jetson Xavier with additional feature enhancements and performance improvements. Please refer to the JetPack Release Notes for more information.

Downloads

Download JetPack…https://developer.nvidia.com/embedded/jetpack
Release Notes……….https://developer.nvidia.com/embedded/jetpack-notes
L4T Release Notes…https://developer.nvidia.com/embedded/dlc/l4t-release-notes-31-0-1
L4T R31.0.1 Page…..https://developer.nvidia.com/embedded/linux-tegra

Enjoy!

The post JetPack 4.0 Developers Preview appeared first on JetsonHacks.

AlphaPilot – Lockheed Martin Innovation Challenge


Lockheed Martin is launching AlphaPilot, an open invitation challenge. Looky here:

The focus is on artificial intelligence for autonomous systems.

The challenge is to design an artificial intelligence and machine learning (AI/ML) framework capable of flying a drone without human intervention or navigational pre-programming.

What about prizes?

AlphaPilot will award more than $2,000,000 USD to the top performers. $250K will go to the first team that can beat human pilots in a head to head race. The grand prize winner takes away $1 million USD.

What do I gotta do for a megabuck?

Here’s where you have an advantage. The drones will be powered by the NVIDIA Jetson platform. All you have to do is navigate a fully autonomous drone through a complex, multi-dimensional racing course without any pre-programming or human intervention. Oh, and you will be racing against professional pilots from the Drone Racing League.

This is a competition of AI quality – all other racing variables, including the drone hardware (which will be provided), are controlled.

How to get started

First, go to the AlphaPilot application site and sign up to find out when registration opens. It should be sometime in November. Go through the materials and rules.

You’re ready to start studying and coding. It should be easy, both because you know the Jetson and you have seen this video:

Conclusion

Quit reading here, go study and write code! That megabuck isn’t going to find its way into your pocket all by itself!

The post AlphaPilot – Lockheed Martin Innovation Challenge appeared first on JetsonHacks.

Record and Playback Actions under ROS – RACECAR/J


One of the first things people like to do with their RACECAR is gather data. Using ROS we can record and playback actions. It’s easy! Looky here:

Background

The usual note here: You should definitely have the book “Programming Robots with ROS, a practical introduction to the Robot Operating System” by Morgan Quigley, Brian Gerkey and William D. Smart. The book is written by the people who wrote ROS originally, and has a very good overview of ROS and practical application.

rosbag is a command line tool that records messages and allows you to play them back later. You can think of a rosbag as similar to a digital video file that records not only video but also other types of sensor messages. Think of it as a multi-track recording, where you can have not only a video track but also a sound track, an IMU track, and so on. There is a master time code which then allows all of the messages to synchronize when they play back.
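The “master time code” idea above is just an ordered merge of per-topic message tracks. A minimal sketch with made-up, hypothetical data (real rosbags store serialized ROS messages, not strings):

```python
# Messages from two topics, each stamped with a time, replay as a
# single stream in timestamp order -- like multi-track playback.

import heapq

camera = [(0.00, "/zed/rgb/image_rect_color", "frame0"),
          (0.10, "/zed/rgb/image_rect_color", "frame1")]
drive  = [(0.05, "/ackermann_cmd", "speed=1.0"),
          (0.15, "/ackermann_cmd", "speed=0.5")]

# heapq.merge assumes each per-topic track is already time-sorted,
# which is how a recorder naturally writes them.
playback = list(heapq.merge(camera, drive))
for stamp, topic, payload in playback:
    print(f"{stamp:.2f} {topic} {payload}")
```

The camera frames and drive commands interleave exactly as they occurred, which is what makes recorded runs reproducible.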

Why is this useful? For debugging purposes, this mechanism allows you to present the same data in a controllable manner which allows you to isolate and fix bugs. It also allows you to develop algorithms independent of having a physical robot. You can record sensor data from the robot, and then use the recorded data for development.

Another use is to take the data within the rosbag to train Deep Learning networks. This has become increasingly more important over the last few years. For example, on RACECAR/J you may want to drive the robot while recording the camera image along with the throttle and steering values. You take that information and train a neural network on how to “drive” the robot. Transfer the trained network to the Jetson which can then infer from the network how to drive the robot.

Usage

rosbag is simple to use. It’s basically:

$ rosbag record <topics>

and

$ rosbag play <name of file>

There are all sorts of command line parameters you can pass which allow control of obvious things like the name of the file to record, as well as more interesting options like the speed of playback and where in the file playback starts. Check out the ROS wiki for more.

Example

In the video, we use rqt to visualize which rostopics to record and playback. From the racecar-ws directory in two separate terminals, we launched the RACECAR teleop launch file and the Stereolabs ZED camera wrapper:

$ roslaunch racecar teleop.launch

and

$ roslaunch zed_wrapper zed.launch

Record

Then we recorded the rosbag:

$ rosbag record /ackermann_cmd /zed/rgb/image_rect_color -O racecarj.bag

This recorded two topics:

  • /ackermann_cmd which contains the information that ultimately controls the steering and the throttle
  • /zed/rgb/image_rect_color which contains the rectified color image from the ZED camera

Playback

In the video, after some fumbling around, we played the rosbag. We also point out remapping topics so that the messages can be injected at different places in the node graph topology.

To look at the recorded image, we loaded the image_view package:

$ sudo apt-get install ros-kinetic-image-view -y

Next we ran an image view on the topic /video_stuff:

$ rosrun image_view image_view image:=/video_stuff

Then we are ready to playback the rosbag:

$ rosbag play racecarj.bag /ackermann_cmd:=/ackermann_cmd_mux/input/navigation /zed/rgb/image_rect_color:=/video_stuff

This command plays back the topics that were previously recorded. However, the topics were remapped, with /ackermann_cmd going to the Ackermann Command Mux (ackermann_cmd_mux/input/navigation) and the ZED image being remapped to /video_stuff. The video in the rosbag displays in the image_view.

Basically, remapping the topics lets the played-back messages be injected where the running nodes expect them.

Conclusion

Conceptually ROS is pretty simple, but can be tricky to think about because it is a concurrent, distributed system. rosbags are a very useful tool for help with debugging, simulation and training. rosbags be good!

The post Record and Playback Actions under ROS – RACECAR/J appeared first on JetsonHacks.


NVIDIA Jetson AGX Xavier Developer Kit


NVIDIA recently began shipping a new product, the Jetson AGX Xavier Developer Kit. Looky here:

Jetson AGX Xavier Overview

The Jetson Xavier is the next generation of the Jetson line of development kits. Xavier ups the game from the previous generation Jetson TX2.

The Jetson Xavier introduces a new module format in order to support the increase in bandwidth needs for the next generation of data I/O. This includes introduction of USB 3.1 and PCIe Gen 4.0.

Hardware

The Jetson AGX Xavier features a NVIDIA Volta 512 core GPU with 64 Tensor cores. Eight Carmel cores (NVIDIA’s own custom 64-bit ARM cores) make up the CPU complex, in 4 dual-core clusters. The cores implement ARMv8.2 with RAS support. All cores are cache coherent, which extends to the GPU and other onboard accelerators.

The Xavier incorporates dual Programmable Vision Accelerators (PVAs) for helping with common computer vision processing tasks.

Another accelerator on the Tegra chip is the Deep Learning Accelerator (DLA), a hardware implementation of the NVIDIA NVDLA architecture.

Some great information about the actual Tegra Xavier chip is available on WikiChip: Tegra Xavier.


The memory subsystem incorporates a 256-bit memory controller which provides high bandwidth LPDDR4 support. 16 GB LPDDR4 Main Memory and 32 GB eMMC Flash memory are integrated on the module.

The Module also supports hardware video encoders and decoders which support 4K ultra-high-definition video at 60 fps in several different formats. Also included is an Audio Processing Engine with full hardware support for multi-channel audio.

Gigabit Ethernet BASE-T is included.

The display controller subsystem allows for three multi-mode (eDP/DP/HDMI) Serial Output Resources. This includes HDMI 2.0a/b (up to 6Gbps), DP 1.2a, and eDP 1.4 (up to 8.1Gbps). There is a HDMI 2.0 connector on the carrier board, the other display ports are accessible via USB 3.1.

Speaking of USB 3.1, there are two USB 3.1 connectors, 40 pins for GPIO, a micro USB connector, and a PCIe Gen 4.0 slot. Along with the previously mentioned HDMI 2.0a connector and RJ45 Ethernet port, there is a barrel jack which supplies power to the developer kit. The developer kit can be powered by between 9 and 20V (the kit includes a 65 watt 19V power supply). The developer kit can also be powered through either one of the USB 3.1 connectors.

Sippy or Speedy

Like the Jetson TX2, there are several different selectable modes for configuring the power consumption and speed. These modes work by managing the number of CPU cores online and setting the frequency of the CPU and GPU cores. The Jetson Xavier has modes similar to the profile of the Jetson TX2 in both low and high performance mode (10W and 15W), along with another Xavier specific super mode (30W). 30W refers to 30 watts, and different variations of modes can tailor the performance in the specific power envelope that you specify.

Software

There are several changes to the Jetson AGX Xavier software stack. The Jetson Xavier runs a Developer Preview of an Ubuntu 18.04 variant named L4T 31.0. The Linux Kernel is 4.9, a newer version than the earlier Jetson TX2 version 4.4. There have been changes to the boot flow. The Jetson Xavier comes with a long list of software libraries, and a good selection of samples with source code.

The new JetPack 4.0 Developer Preview Early Access installer is available to flash and copy system software to the Jetson AGX Xavier.

Availability

If you are a developer, you can get the Jetson AGX Xavier at a discounted price on the NVIDIA site.

You can buy the Jetson AGX Xavier on Amazon at the regular price. You can also buy the Xavier on the NVIDIA site and other authorized resellers.

Peak Performance

Here is what NVIDIA is saying about the peak performance of the Xavier:

Jetson AGX Xavier is capable of more than 30 TOPS (trillion operations per second) for deep learning and computer vision tasks. The 512-core Volta GPU with support for Tensor Cores and mixed-precision compute is capable of up to 11 TFLOPS FP16 or 22 TOPS INT8 compute. Jetson AGX Xavier’s dual NVDLA engines are capable of 5 TOPS INT8 or 2.5 TFLOPS FP16 performance each. It also has high-performance eight-core ARM64 CPU, a dedicated image processor, a video processor and a vision processor for accelerating computer vision tasks.
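One way to read the quoted figures: the GPU's INT8 rate plus the two NVDLA engines already exceeds the 30 TOPS headline number. A quick arithmetic check:

```python
# Sanity arithmetic on the quoted peak figures (INT8).
gpu_int8_tops = 22          # 512-core Volta GPU, INT8
dla_int8_tops_each = 5      # each NVDLA engine, INT8
total = gpu_int8_tops + 2 * dla_int8_tops_each
print(total)   # 32 -- consistent with "more than 30 TOPS"
```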

Conclusion

Stay tuned as we begin working with the Xavier to better understand how to take advantage of the next level performance. The amount of documentation available now is impressive; you can find links on the Jetson AGX Xavier forum.

Pictures, Natch!

Jetson Xavier - Right Front
Jetson Xavier – Right Front
Jetson AGX Xavier - Front
Jetson AGX Xavier – Front
Jetson AGX Xavier - Left Back
Jetson AGX Xavier – Left Back
Jetson Xavier - Back
Jetson Xavier – Back

The post NVIDIA Jetson AGX Xavier Developer Kit appeared first on JetsonHacks.

JetPack 4.1 Developer Preview – NVIDIA Jetson AGX Xavier Developer Kit


JetPack 4.1 installs the operating system, libraries and SDKs on to the Jetson AGX Xavier Developer Kit. Looky here:

Background

With the early access shipments of the Jetson AGX Xavier, NVIDIA is providing the Developer Preview of the JetPack series. The current version is JetPack 4.1 Developer Preview, Early Access, available from the JetPack web page. You can download earlier versions of JetPack from that page too, along with versions for the other Jetson family members: the Jetson TX2, Jetson TX1, and, in the archives, the now discontinued Jetson TK1.

The idea behind the Developer Preview Early Access is to give developers as much of a head start on building their products and applications as possible. As such, and especially with a new product with as rich a computing environment as the Xavier, this release supports the most basic needs of developers. Some of the new hardware features (of which there are many) are still works in progress, and do not have much support just yet.

With that said, you can count on several iterations of JetPack as new features come online. Plan your development accordingly, there will be times that you will need to regenerate your system from scratch.

Installation

For the most part, installation is pretty easy. From a 64-bit Ubuntu 16.04 or Ubuntu 18.04 PC host computer, you simply download the JetPack software from the NVIDIA JetPack web page (you’ll have to sign in with your developer account to download JetPack) and follow the instructions in the setup guide. Watching the video above should cover most questions, should the need arise.

Note: NVIDIA supports running JetPack from a native Ubuntu installation. Many people have issues with using JetPack on VMs because of the way that USB enumerates during the flashing process.

There are a wide variety of tools which you can select to install on the PC side (called the Host) and the Jetson Xavier (called the Target). Use the Component Manager to select which libraries and SDKs you wish to install, on both the Host and the Target. In the video above, we select everything, because we want to play!

Installation from the demo host computer to the Jetson took about an hour and fifteen minutes all together, including all the downloads on a 30 Mbps Internet link, flashing the Jetson, cross compiling the samples and then loading them onto the Jetson.

The one tricky bit in all of this is setting the Jetson into recovery mode. Follow the on-screen instructions to set the Jetson into recovery mode, open a Terminal, and then type:

$ lsusb

In the output you should see the Jetson Xavier listed as 0955:7019 Nvidia. If you don’t see the Jetson using lsusb, then the device will not be flashed. The video shows how to set the Xavier into Force Recovery Mode with no expense spared, state of the art video and computer graphics technology. That may be a slight exaggeration, but there is a bit about it in there, including an absolutely riveting dramatic recreation in closeup. You’ll laugh, you’ll cry, but most of all you’ll care.

Tools Available

JetPack 4.1 flashes the L4T 31.0.2 (an Ubuntu 18.04 variant) to the Jetson Xavier. Here are some of the JetPack release highlights:

  • L4T R31.0.2
  • Ubuntu 18.04 LTS aarch64
  • CUDA 10.0
  • cuDNN 7.3
  • TensorRT 5.0 RC
  • VisionWorks 1.6
  • OpenCV 3.3.1
  • Multimedia API

Developer Tools

  • CUDA Tools
  • NVIDIA Nsight Systems
  • NVIDIA Nsight Graphics

Conclusion

The first time through, setting up the system and flashing the Jetson can take a little more than an hour, depending on your download speeds and the speed of your PC. In the video, a simple 30 Mbps cable modem link was used for downloading. Downloading all of the Host and Target components only happens the first time you do an installation. Subsequent installations check for updates and, if none are available, simply flash the Jetson. This saves a lot of time.

It’s time to start developing!

The post JetPack 4.1 Developer Preview – NVIDIA Jetson AGX Xavier Developer Kit appeared first on JetsonHacks.

NVPModel – NVIDIA Jetson AGX Xavier Developer Kit


You can use the command line tool nvpmodel to set the performance and energy usage characteristics of the NVIDIA Jetson AGX Developer Kit. Looky here:

Background

With the introduction of the Jetson TX2, the command line tool nvpmodel brought the ability to define a set of parameters to effectively define the performance for a given power envelope.

Jetson Tegra systems cover a wide range of performance and power requirements. Balancing the performance and power requirements is an important part of most product development cycles. Fortunately, NVIDIA has done the heavy lifting and done the calculations to figure out which processing components provide the best performance for a given energy budget in multiple configurations. At the very least, these configurations provide a great start to developing your own tuned configuration.

On the TX2, nvpmodel defines the number of CPUs on line and their clock frequencies, the GPU frequency, and the External Memory Controller (EMC) frequency. Remember that the EMC controls the speed of access to the external LPDDR4 memory.

The Jetson AGX Xavier is a much richer computing environment than the Jetson TX2. In addition to adding 4 more CPU cores, the Xavier adds Deep Learning Accelerators (DLA) and Visual Accelerators (VA). These new additions can also be configured with nvpmodel! nvpmodel defines 4 different power envelopes in 7 different modes. The power envelopes are 10 Watt, 15 Watt, 30 Watt, and “Forget power usage, speed is what I need!”.

Configuration

nvpmodel introduces seven different “modes” on the Jetson AGX Xavier. The following table breaks down the modes which describe which CPU, GPU, DLA and VA cores to use along with their clock frequency, and the memory controller frequency.

NVPMODEL CLOCK CONFIGURATION

Mode Name                        MAXN    MODE_10W  MODE_15W  MODE_30W_ALL  MODE_30W_6CORE  MODE_30W_4CORE  MODE_30W_2CORE
Power Budget                     n/a     10W       15W       30W           30W             30W             30W
Mode ID                          0       1         2         3             4               5               6
Number of Online CPUs            8       2         4         8             6               4               2
CPU Maximal Frequency (MHz)      2265.6  1200      1200      1200          1450            1780            2100
GPU TPC                          4       2         4         4             4               4               4
GPU Maximal Frequency (MHz)      1377    520       670       900           900             900             900
DLA Cores                        2       2         2         2             2               2             2
DLA Maximal Frequency (MHz)      1395.2  550       750       1050          1050            1050            1050
Vision Accelerator (VA) Cores    2       0         1         1             1               1               1
VA Maximal Frequency (MHz)       1088    0         550       760           760             760             760
Memory Maximal Frequency (MHz)   2133    1066      1333      1600          1600            1600            1600

The default mode is 15W (MODE_15W, ID: 2).

Table Abbreviation Notes:

  • GPU TPC – GPU Texture/Processor Cluster
  • DLA – Deep Learning Accelerator
  • VA – Vision Accelerator

Usage

To call nvpmodel:

$ sudo nvpmodel -m [mode]

where mode is the number of the mode that you want to use. For example:

$ sudo nvpmodel -m 0

places the Jetson into MAXN mode.

You can query which mode is currently being used:

$ sudo nvpmodel -q --verbose

The file /etc/nvpmodel.conf holds the different models. Developers can add their own models to add different modes suitable to their application.
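To illustrate the mode/ID structure that custom modes plug into, here is a toy parser over a *simplified, hypothetical* fragment in the spirit of /etc/nvpmodel.conf. The real file has more fields and a richer syntax (per-core clock entries and so on), so treat this purely as a sketch; MY_CUSTOM_MODE and the exact line format are assumptions for the example.

```python
# Toy parser: pull mode IDs and names out of a simplified,
# nvpmodel.conf-style fragment (NOT the real file format).

SAMPLE = """\
< POWER_MODEL ID=0 NAME=MAXN >
< POWER_MODEL ID=2 NAME=MODE_15W >
< POWER_MODEL ID=7 NAME=MY_CUSTOM_MODE >
"""

def parse_modes(text):
    modes = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("< POWER_MODEL"):
            # "< POWER_MODEL ID=0 NAME=MAXN >" -> {"ID": "0", "NAME": "MAXN"}
            fields = dict(tok.split("=") for tok in line.strip("<> ").split()[1:])
            modes[int(fields["ID"])] = fields["NAME"]
    return modes

print(parse_modes(SAMPLE))   # {0: 'MAXN', 2: 'MODE_15W', 7: 'MY_CUSTOM_MODE'}
```

The mode number you pass to `nvpmodel -m` is exactly this ID, which is why an added custom mode just needs a fresh ID.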

Note: nvpmodel settings are persistent across sessions. That is, if the Xavier reboots, the nvpmodel settings remain in effect.

jetson_clocks.sh

If you have been developing on previous Jetson models, you are probably familiar with the script jetson_clocks.sh. On the Jetson Xavier, jetson_clocks.sh provides the best performance for the current nvpmodel mode. The nvpmodel configuration defines maximum and minimum clock values for any given mode; jetson_clocks.sh sets the clocks to their maximum values. Oh, and it sometimes adjusts the fan when you decide to run flat out.

jetson_clocks.sh can also show the current settings for the CPU, GPU and EMC. This includes which cores are online, the minimum and maximum frequencies, and the current frequency.

$ sudo ${HOME}/jetson_clocks.sh --show

Note: If you are in the home directory, of course you can just use:

$ sudo ./jetson_clocks.sh --show

You can store the current clock settings for later use into a file using the store option. The restore option uses the file to set the clocks to the saved value.
To maximize the current mode performance for the Xavier:

$ sudo ${HOME}/jetson_clocks.sh

Note: The effects of jetson_clocks.sh are not persistent across sessions. In other words, if the machine reboots the previous jetson_clocks.sh settings are not in place.

Maximum Performance

To configure the Jetson Xavier and set the clocks for maximum performance:

$ sudo nvpmodel -m 0
$ sudo ${HOME}/jetson_clocks.sh

Conclusion

Using nvpmodel provides developers with a nice tool set to easily setup different energy usage and performance scenarios. Recommended.

Notes

Note: The mode names in the nvpmodel configuration file:

  • MAXN
  • MODE_10W
  • MODE_15W
  • MODE_30W_ALL
  • MODE_30W_6CORE
  • MODE_30W_4CORE
  • MODE_30W_2CORE

The table derives from a NVIDIA webinar given by Dustin Franklin. The notes for the webinar are available on Github.

The post NVPModel – NVIDIA Jetson AGX Xavier Developer Kit appeared first on JetsonHacks.

Install NVMe SSD on NVIDIA Jetson AGX Developer Kit


It is a simple task to add a sea of gigabyte goodness to the Jetson Xavier using a NVMe SSD. Looky here:

Background

One of the nice additions to the Jetson AGX Xavier is an M.2 Key M slot. If you use a Jetson TX1 or Jetson TX2, you know that there is an M.2 Key E slot. The Key E slot is useful for adding functions such as wireless cards. However, the M.2 Key M slot lets us add Solid State Disk (SSD) storage. The M.2 Key M slot uses the Non-Volatile Memory Express (NVMe) protocol that runs over PCIe. Note that the SSD card you install in the slot needs to be NVMe PCIe.

Materials and Tools

In the video, we install a Western Digital 500GB NVMe SSD. There are several different sizes and brands of these types of devices, people have reported good results with the Samsung variety. We also use our trusty iFixit Pro Tech Toolkit which contains a variety of useful tools for just this purpose.

The Southern California weather brought in Santa Ana winds during filming. The humidity is unusually low, around 15%. There seems to be a lot of static electricity hanging about, so it is time to break out the iFixit Anti-Static Mat. This helps keep the ESD dogs at bay. If in doubt when working on electronics (especially at the component level), the anti-static mat is your friend.

Hardware Installation

The M.2 Key M connector is on the top of the carrier board. You will need to detach the carrier board from the Jetson Xavier Module. This is a straightforward task. With a #2 phillips head screwdriver, remove the four screws that hold the standoffs to the Jetson module (the carrier board is sandwiched in-between). From the factory, the screws may have some blue thread locker on them, which may require a little elbow grease to start the removal process. Set the standoffs and screws aside.

Standoff Screws
Remove Standoff Screws

Next, carefully disconnect the carrier board from the module. Note that there is a wire connecting the module to the carrier board (this is for controlling the fan). The wire is relatively short and the connection is delicate. There is a 699-pin connector which joins the Jetson module to the carrier board. As shown in the video, lift gently on the carrier board. The PCIe connector on the end of the carrier board can provide a little purchase, but you should not have to apply excessive pressure to break the connection.

When you feel/hear the pop of the disconnection, you should stop. Lift gently and find the wire connecting the carrier board to the module. The wire is not long enough to allow the carrier board to lay flat on a table surface when the Jetson module is in certain positions. As shown in the video, you can use something like a book to create a resting place for the carrier board. An alternative is to simply lay the Jetson module on its side, with the cable exiting on the bottom side.

Disassembled Xavier
Disassembled Xavier

You may want to disconnect the cable from the carrier board; a pair of tweezers can help with the task. However, the connector is delicate, so personally I would avoid doing that if possible.

A retaining screw is provided at the end of the M.2 slot. Remove it, install the SSD card into the connector, and then use the retaining screw to hold the card in place.

Xavier SSD Installed
Xavier SSD Installed

In this case, assembly is the reverse of disassembly. If you are using the Jetson Xavier in something like a robotics environment, you may want to put a dab of blue thread locker on the end of the standoff screws to help keep them secure.

Disk Configuration

After the hardware install, it is time to configure the disk under Ubuntu. Hook the Xavier back up to a keyboard, mouse and monitor. In the video above, we mostly use GUI tools; please refer to the video for a walk-through.

The basic steps are to create a partition table with one or more partitions on the disk, and then format each partition with a filesystem. In the video, we just allocate the entire disk space to one big ol’ partition. However, you may want to be a little fancier. This being Linux, there are also tools for doing all of this from the command line, as the pros will tell you.
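For reference, a command-line version might look something like the sketch below. This is illustrative only: the device name (/dev/nvme0n1) and mount point are assumptions for a typical single-NVMe setup, and these commands will destroy any data on the target disk, so double-check the device name with lsblk before running anything.

```
$ lsblk                                            # identify the NVMe device, e.g. nvme0n1
$ sudo parted /dev/nvme0n1 mklabel gpt             # create a new GPT partition table
$ sudo parted /dev/nvme0n1 mkpart primary ext4 0% 100%
$ sudo mkfs.ext4 /dev/nvme0n1p1                    # format the new partition
$ sudo mkdir /XavierSSD500                         # create a mount point
$ sudo mount /dev/nvme0n1p1 /XavierSSD500          # mount it
```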

Mount Options

In the video, we didn’t cover much about the disk mount options. In the Disks application, you can access the mount options through the gear icon, “Edit Mount Options …”. This brings up a dialog. In the video, you’ll notice that the “Mount Point” is, to put it politely, gibberish looking. It basically consists of a directory prefix (/media) followed by the UUID of the device. Of course, the UUID is specific to the device.

If you’re not a big fan of accessing the device that way, you can rename it. For example, let’s say we want to refer to the disk from the root directory as “/XavierSSD500”. We edit the “Mount Options” to look like this:

Xavier Disk Mounting
Xavier Disk Mounting

and then click “Ok”. After rebooting, the disk will then be available through /XavierSSD500. For example:

$ cd /XavierSSD500

will switch you over to the SSD:

nvidia@jetson-0422818069391:/XavierSSD500$
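Behind the scenes, the Disks application records these settings as a line in /etc/fstab. A roughly equivalent entry looks like the sketch below; the UUID is a placeholder for your device’s actual UUID (you can look it up with sudo blkid), and the options shown are just one reasonable set.

```
# /etc/fstab entry for the NVMe SSD (the UUID below is a placeholder)
UUID=<your-device-uuid>  /XavierSSD500  ext4  nosuid,nodev,nofail,x-gvfs-show  0  0
```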

Another Article on the Same Subject!

Here’s an article on how another Jetson Xavier owner goes through the process. Very nicely done. That article also covers some disk mount options, which you may find useful. Regardless of which method you use, the new SSD should show up on your desktop, and you can enjoy the sea of GB goodness!

Conclusion

For a little bit of money and about 15 minutes worth of work, you too can be swimming in the sea of gigabytes. It’s nice to have the disk always available, and not have to have external drives dangling precariously about.

The post Install NVMe SSD on NVIDIA Jetson AGX Developer Kit appeared first on JetsonHacks.

I2C – NVIDIA Jetson AGX Xavier Developer Kit


It is straightforward to connect an I2C device to a Jetson AGX Xavier. Looky here:

Background

I2C is a straightforward serial protocol. There are usually two wires: one transfers data (SDA), and the other carries a clock (SCL) which synchronizes the data transfer. Most devices will also require power (VCC) and ground (GND). There are several I2C buses on the NVIDIA Jetson AGX Xavier kit. You can access I2C bus 8 and I2C bus 1 on the GPIO Expansion Header.

The Xavier GPIO Expansion Header has basically the same layout as the previous generation Jetson TX series. However, there is a slight difference in the software used to interface with I2C devices. The Jetson TX series uses a derivative of Ubuntu 16.04 (L4T 28.x), while the Xavier runs an Ubuntu 18.04 variant (L4T 31.x). Ubuntu 18.04 moves the libi2c-dev smbus API into a separate library, with a different header file than the earlier version.

Hardware

Note: A Jetson Xavier using L4T 31.0.2 (JetPack 4.1) is shown in the demo.

First, before powering up the Jetson, let’s wire up the LED Segment Display. Avoid wiring the Xavier when there is power connected. Here’s the pinout of the GPIO Expansion Header. In our example, we power the display from the Jetson GPIO header at 5V.

For this example project, an Adafruit 0.56″ 4-digit 7-segment Display W/i2c Backpack – Green is wired to the Jetson. The display is assembled per the Adafruit instructions.

On the Jetson AGX Xavier, Pin 1 of the GPIO Expansion Header is the pin closest to the power indicator light:

Jetson AGX Xavier Pin 1
Jetson AGX Xavier Pin 1

The odd-numbered pins (right to left, pins 1, 3, 5 and so on) are on the top row. The bottom row holds the even-numbered pins.

On a Jetson Xavier, here’s the wiring combination for I2C Bus 8:

GND Pin 6 -> LED Backpack (GND)
VCC Pin 2 -> LED Backpack (VCC – 5V)
SDA Pin 3 -> LED Backpack (SDA)
SCL Pin 5 -> LED Backpack (SCL)

Note that the Xavier also has an I2C Bus 1 interface. See the Xavier GPIO Pinout Diagram.

If you wish to interface with I2C Bus 1:

GND Pin 6 -> LED Backpack (GND)
VCC Pin 2 -> LED Backpack (VCC – 5V)
SDA Pin 27 -> LED Backpack (SDA)
SCL Pin 28 -> LED Backpack (SCL)

Note: To use Bus 1 with the example, you will need to modify the example source code.

Software Installation

Once the board is wired up, turn the Jetson on. Install the libi2c-dev library. To be able to inspect the LED display, you may find it useful to also install the I2C tools:

$ sudo apt-get install libi2c-dev i2c-tools

After installation, in a Terminal execute (8 is the I2C bus in this case):

$ sudo i2cdetect -y -r 8

ubuntu@tegra-ubuntu:~$ sudo i2cdetect -y -r 8
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: 70 -- -- -- 74 -- -- --

You should see an entry of 0x70, which is the default address of the LED Segment Display. Note that if you have soldered the address pins on the Display to change the address, you should see the appropriate address.

Next, install the library and example code which is available in the JHLEDBackpack repository on the JetsonHacks Github account. As of this writing, the Xavier version is in the L4T31 branch of the repository. There is a tagged release version for the Xavier in the v2.0 tag of the repository. To install:

$ git clone https://github.com/jetsonhacks/JHLEDBackpack.git
$ cd JHLEDBackpack
$ git checkout v2.0
$ cd example

You are then ready to compile the example and run it.

$ make
$ sudo ./displayExample

The display will go through a couple of examples, a blinking set of dashes, a hexadecimal display, a floating point number display, a count down timer and a clock example. Hit the ‘Esc’ key during the clock example to end the demo.

The source library defaults to I2C Bus 1. Since our example uses bus 8, displayExample.cpp reads:

HT16K33 *displayMatrix = new HT16K33();
// Default is I2C Bus 1
// Bus 8 is on Xavier Expansion Header pins 3, 5
displayMatrix->kI2CBus = 8;
int err = displayMatrix->openHT16K33();

Note: If you wish to use pins 27 and 28, change kI2CBus to 1. Make sure you save the file, and then run make again.

More Notes

Equipment and Supplies

The segmented LED display is a kit. You will need some elementary soldering skills for assembly. We tend to use:

New to electronics? This is a pretty easy project. Looky here: Electronics Tutorials for some introductory material on how to start becoming a master.

Conclusion

Accessing the I2C bus on the Jetson AGX Xavier GPIO Expansion Header is straightforward, and makes for easy prototyping!

The post I2C – NVIDIA Jetson AGX Xavier Developer Kit appeared first on JetsonHacks.
