Some of the more popular pages on JetsonHacks are the pinouts for the GPIO headers on each of the Jetson developer kits. Here’s a new one for the Jetson Nano 2GB!
The Jetson Nano 2GB GPIO J6 Header Pinout gathers information that's spread over several documents, such as the sysfs numbers for the GPIO pins, the I2C bus numbers, and the UART device name, into one place. Plus there's a bunch of fun colors!
Microsoft now supports ARM 64 machines like the NVIDIA Jetsons and Raspberry Pi within Visual Studio Code. Looky here:
Background
Last year we did an article on installing Visual Studio Code using a community build. While this is great, Microsoft now supports ARM 64 builds directly. In addition, in the newest release Microsoft also adds the C/C++ programming extensions for ARM/ARM64!
Visual Studio Code is probably the most popular programming tool on the planet, and for good reason. It’s free, and supports a very large ecosystem of programming languages and extensions. While aimed at web and cloud development, most people find that other development tasks are easier too.
Installation
One easy way to install Visual Studio Code is to go to the Visual Studio Code site, download the .deb for ARM 64, and install it. Just remember to find the 'Other platforms' link to take you to the place to download the ARM 64 version. Many of the buttons labelled 'Download' fetch the x86 version of Visual Studio Code (meant for a PC), so make sure you find the ARM 64 version (see the video if you need a better description).
There is great documentation for both installing Visual Studio Code and getting started, including video tutorials and a large number of how-to articles. Certainly a page worth checking out.
The ARM 64 version runs on the NVIDIA Jetson Nano 2GB, Jetson Nano, Jetson AGX Xavier, Jetson Xavier NX, Jetson TX1 and Jetson TX2. The Jetson TK1 requires the ‘ARM’ version (32 bit), though we have not tested the Jetson TK1 with Visual Studio Code.
If you are looking to script the installation, on the JetsonHacksNano account on Github, there is a repository installVSCode. There are two scripts. The first script, installVSCode.sh, will simply download Visual Studio Code and install it. The second script, installVSCodeWithPython.sh, will download Visual Studio Code, install it, and then download and install the Microsoft Python extension, along with some Python support libraries.
To clone the repository and install VSCode:
$ git clone https://github.com/JetsonHacksNano/installVSCode.git
$ cd installVSCode
$ ./installVSCode.sh
To clone the repository and install VSCode with Python support:
$ git clone https://github.com/JetsonHacksNano/installVSCode.git
$ cd installVSCode
$ ./installVSCodeWithPython.sh
Modify the scripts to suit your development needs. In particular, the Python install script gives an example of how to install an extension from the command line:
$ code --install-extension ms-python.python --force
The name of the extension can be found by looking up the extension in the extension manager of Code.
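If you are scripting your setup, you can also list the identifiers of the extensions already installed from the command line (this flag is part of the standard VS Code CLI):

$ code --list-extensions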
Conclusion
If you program a lot on your Jetson, Visual Studio Code is worth checking out. This is especially true if you use different languages, as you can hop back and forth between them and maintain the same programming development environment.
“May you live in interesting times.” Some people are going to look back on 2020 and try to make sense of it, but it will be difficult to get any type of useful, reasonable perspective for several years.
But I know you don't care about any of that. You want to know how it affects JetsonHacks. First, I want to thank you for being part of the JetsonHacks community. I hope you find everything here to your liking.
Normally in this end of year article I lay out the number of views and such for all of the JetsonHacks properties. Because of the plague, let's say up front that the numbers were atypical, and not that interesting.
For YouTube, there were 750K views (36.9K hours of watch time!) and we picked up 7.3K subscribers. We only published 10 videos this year, so those numbers are actually quite astounding. Last year there were 922K views, so the views are down. But we published 34 videos in 2019, so 2020 performed quite well considering.
We passed 25K YouTube subscribers this year, which I find quite incredible and well beyond any of my expectations.
Thank you everyone for subscribing, liking and sharing the videos. It really helps with the YouTube algorithms.
As you probably know, this year I did some videos on other YouTube channels which don’t show up in these totals.
On the JetsonHacks website, we had 834K views, down from 960K the year before. There were 20 posts in 2020 versus 44 in 2019, so this year's performance is quite good taking that into consideration.
Surprisingly, 2020 had 335K visitors while 2019 had 338K visitors! So the traffic is about the same. Without as much new content, it makes sense that the overall page views are down even with the same amount of traffic.
Going Forward
Going into 2021 we’re planning some new projects. The first one is a media project. We are working on a monthly live stream that I will be doing with NVIDIA which will be available on the NVIDIA Developers YouTube Channel starting in mid-January. The idea is that for each episode we will have some incredibly amusing banter, chat with an interesting member of the Jetson community and/or NVIDIA engineers, and have Q&A to discuss Jetson issues.
Of course, we will want to crank up more content on JetsonHacks than we had in 2020. I’ve been working on some interesting Jetson related projects this year, but have not been able to share them. However, in the last few months I have become unreasonably fascinated with motion control. So I think it might be fun to play with some stepper motors and build a motion control rig.
One of the issues that has been looming is technical debt. JetsonHacks has many articles, videos and Github repositories. Unfortunately many of these do not age particularly well. In part this is the nature of the beast. As the technology advances, there are a lot of changes to the underlying frameworks.
Inevitably something breaks in a new release, and any code written for the video or articles can become stale. Sometimes it’s simply a matter of changing release versions in the Github scripts, sometimes the script or technique simply becomes obsolete.
If it helps, the way that I view JetsonHacks is as a set of notes for projects that I work on. If it is a project that I work on a lot, you will see the code updated. If not, then the code more than likely needs help. To be clear, it is open source, which means that you have access to it and can change it to meet your needs. However, that does not mean that it will be maintained.
The JetsonHacks Github repositories need a good scrubbing. You will begin to see many of the repositories being archived. I haven't figured out a good way to do this yet; it feels like there should be a parallel archived account.
As the global supply chains come back up to speed, we should be able to start some interesting projects.
Conclusion
Yep. Conclude 2020, and get on with it. I shall see you on the other side. Happy New Year!
The big news? Boot from USB or NVMe! You should check it out. Here’s the official blurb from NVIDIA, note the Webinars on Feb. 9 and Feb. 11, 2021:
We are pleased to announce JetPack 4.5, a production release supporting Jetson AGX Xavier series, Jetson Xavier NX, Jetson TX2 series, Jetson TX1, and Jetson Nano.
JetPack 4.5 includes VPI 1.0, security features including enhanced secure boot and support for full disk encryption, enhanced bootloader functionality, and a new way of flashing Jetson devices using NFS.
We will be hosting a webinar on Feb 9th for an in-depth overview of JetPack 4.5 features where we will also demo select features and answer any questions you may have. Please register here.
Learn how to implement computer vision and image processing pipelines using VPI by registering for the webinar we will be hosting on Feb 11th.
Highlights of JetPack 4.5 are:
First Production release of Vision Programming Interface (VPI).
Support for loading kernel, device tree and initrd from the root file system on USB drive or NVMe. Refer to release notes for more details.
U-Boot version updated to v2020.04.
Boot firmware for all Jetson Nano developer kits updated to relocate boot firmware to integrated QSPI-NOR. Developer kits will use the microSD card solely for OS/app storage after this change.
All Jetson Nano Developer Kits will now show a warning screen if no microSD Card is inserted. It will attempt to boot from other supported media after showing this warning screen.
With the introduction of JetPack 4.5, it is now possible to boot your Jetson Nano from a USB drive! Looky here:
Background
The NVIDIA Jetson Nano Developer Kits (A02, B01, and 2GB) boot and run from a micro-SD card. It is now possible to set the Nano up to boot and run from a USB drive.
There are a couple of reasons to boot from USB. First, it is more reliable over the long term than using the micro SD card. SD cards aren’t really designed to handle the files of an operating system, and tend to be less reliable than USB drives in general.
The second reason is that USB is much faster than the SD card, usually 4 to 10 times depending on the application. This makes things much snappier!
USB drives are available in various flavors. You'll see them as SSDs (Solid State Drives), HDDs (Hard Disk Drives), and flash drives, which are sometimes referred to as thumb drives or USB sticks.
For this application, HDD is the most reliable long term. SSD is an excellent alternative and is faster, while HDD tends to use more power but offers more storage for the same price. Some HDDs can be powered externally. Flash drives aren't as suitable for this application for much the same reason as SD cards, but some people still use them.
People also like the extra storage that many USB drives afford. This makes it easy to store lots of programs and large amounts of data, such as video files and large data models.
Here are a few drives we’ve had success with (affiliate links):
These are all available in a variety of sizes, some careful hunting will find a drive which meets your needs.
Disk Formatting and Partitions
Here are some terms that you will see when we talk about drives on Linux. Partitions, GPT and Ext4. Sometimes the terms are jumbled together, making them hard to understand.
Physical devices consist of a collection of sectors. These are the physical bits where data are stored. Software creates something known as a Partition, which maps and groups sectors together. In that way, each region of the device can be managed independently.
Partition descriptions are stored in a Partition Table. A physical drive can have one or more partitions.
Because a Partition is treated as a separate logical volume, drives with multiple partitions appear as though they are multiple physical drives.
Partitions are handy for storing things like file systems, swap disks and so on.
You have heard the term ‘format the drive’ before. When we talk about formatting a drive for Linux, this is a two step process. First we format the drive to define the partition layout, and then we format partitions for a specific use.
Linux uses GPT (GUID Partition Table) for the layout of a partition table on a physical storage device, using globally unique identifiers, or GUIDs. It is the preferred Linux partitioning scheme for drives larger than 2TB. This results in a table at a known location which tells the system where partitions are physically located on the drive. In other words, a map!
Once the drive is partitioned, we are ready to format a partition for the Linux file system. Different partitions can have different format types.
This is how you might setup a dual boot Windows and Linux machine on a PC, for example. One partition for the Windows filesystem, and another for the Linux filesystem.
For our Jetson, we want to format a partition as Ext4, a journaling file system format for Linux. We will put the rootfs of the Jetson on this partition. Rootfs is simply the root file system; we usually think about this as all of the files on the system.
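As a concrete sketch of the two steps from the command line (assuming the USB drive enumerated as /dev/sda; check with lsblk first, and be warned that this erases the drive):

$ sudo parted /dev/sda mklabel gpt
$ sudo parted /dev/sda mkpart primary ext4 0% 100%
$ sudo mkfs.ext4 -L JetsonUSB /dev/sda1

The -L flag sets a volume label, which comes in handy later when identifying the drive.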
QSPI-NOR?
In previous versions of JetPack, the system boots from information stored on specific partitions on the SD card. For JetPack 4.5+, this changes.
On the Jetson module, there is flash memory. Typically you will see this referred to as QSPI-NOR. NOR is a type of flash memory. QSPI stands for Quad Serial Peripheral Interface, a highway for transferring data between the Jetson Tegra chip and the NOR flash memory.
When JetPack goes through its setup process, it transfers information to boot the system from the SD card to the QSPI-NOR. Consequently, the Jetson can use this information to boot the system without having to have a SD card present.
There are a couple more things to know. If there is no SD card, the Jetson will look to the USB drive for a rootfs. The Jetson can also boot from other devices if they are set up correctly, such as a NVMe drive. In our case, we will just concern ourselves with the USB drive.
There is a special file in the /boot directory which tells Linux where the root file system is located. We will need to modify this file, named extlinux.conf, to point to the new location of the root.
You can follow the instructions from NVIDIA in case you want to create your bootable USB drive on your host PC. Here we are going to create the drive from the Jetson itself.
Install
You will need to do an initial setup of the Jetson with JetPack 4.5+ in order to load updated firmware into the Jetson Module QSPI-NOR flash memory. Follow the ‘Getting Started’ instructions on the JetPack site: https://developer.nvidia.com/embedded/jetpack
During the initial setup of L4T 32.5+, the firmware for the Jetson Nano developer kits relocates the boot firmware from the micro SD card to the Jetson module integrated QSPI-NOR flash memory. This also changes the layout of the SD card. This layout is now analogous to the BIOS in a PC.
There is a repository on the JetsonHacksNano account on Github, bootFromUSB which contains convenience scripts to help with this task.
Clone the repository, and switch to that repository's directory.
$ git clone https://github.com/jetsonhacksnano/bootFromUSB
$ cd bootFromUSB
Step 1
Prepare a USB drive (preferably USB 3.0+, SSD, HDD) by formatting the disk with the GPT partitioning scheme. Be warned, this will erase the data that is on the drive. Then create a partition, and set the format type to Ext4. In the video, we use the 'Disks' application; see the video for a walk through. It is easier if you only plug in one USB drive during this procedure. When finished, the disk should show as /dev/sda1 or similar. Note: Make sure that the partition is Ext4, as NTFS will appear to copy correctly but cause issues later on. Typically it is easiest to set the volume label during this process for later use.
Step 2
Copy the application area of the micro SD card to the USB drive. The script copyRootToUSB.sh copies the entire contents of the system micro SD card to the USB drive. Naturally, the USB drive storage should be larger than the micro SD card. Note: Make sure that the USB drive is mounted before running the script. To run copyRootToUSB.sh:
usage: ./copyRootToUSB.sh [OPTIONS]
-d | --directory Directory path to parent of kernel
-v | --volume_label Label of Volume to lookup
-p | --path Device Path to USB drive (e.g. /dev/sda1)
-h | --help This message
In the video, we run:
$ ./copyRootToUSB.sh -p /dev/sda1
Step 3
Modify the /boot/extlinux/extlinux.conf file located on the USB drive so that it points to the new rootfs (typically this is /dev/sda1). The file is in a system protected area, so you will need privileges to change it, e.g. 'sudo gedit'. Make a copy of the 'PRIMARY' entry and rename it sdcard. Then, in the PRIMARY entry, change the location of the root to point to the USB drive, i.e. change 'root=/dev/mmcblk0p1' (the address of the SD card) to 'root=/dev/sda1'. A sample configuration file, sample-extlinux.conf, is provided in the repository as an example.
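For reference, here is a sketch of what the modified PRIMARY entry might look like (the APPEND arguments vary between releases; keep whatever your stock file already has and change only the root= value):

LABEL primary
      MENU LABEL primary kernel
      LINUX /boot/Image
      INITRD /boot/initrd
      APPEND ${cbootargs} quiet root=/dev/sda1 rw rootwait rootfstype=ext4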
While using root=/dev/sda1 in the extlinux.conf works, it can be a good idea to use the PARTUUID of the disk to identify the disk location. Because USB devices are not guaranteed to enumerate in the same order every time, it is possible that /dev/sda1 points to a different device. This may happen if an extra flash drive is plugged into the Jetson along with the USB boot drive, for example.
The UUID of the disk in the GPT partition table is called the PARTUUID. This is a low level descriptor. Note that there is another identifier, referred to as UUID, which is given by the Linux file system. Use the PARTUUID for this application, as UUID has been reported to cause issues at the present time in this use case.
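If you want to query it directly, the standard blkid tool can print just the PARTUUID for a device (assuming the USB drive is /dev/sda1):

$ sudo blkid -s PARTUUID -o value /dev/sda1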
There is a convenience file: partUUID.sh which will determine the PARTUUID of a given device. This is useful in determining the PARTUUID of the USB drive. Note: If the PARTUUID returned is not similar in length to the sample-extlinux.conf example (32a76e0a-9aa7-4744-9954-dfe6f353c6a7), then it is likely that the device is not formatted correctly.
$ ./partUUID.sh
While this defaults to sda1 (/dev/sda1), you can also determine the PARTUUID of other drives. The /dev/ prefix is assumed; use the -d flag to name a different device. For example:
$ ./partUUID.sh -d sdb1
After saving the extlinux.conf file, you are ready to test things out. Shutdown the Jetson, and remove the SD card. Boot the Jetson, and the Jetson will then boot from the USB drive. There will be a slight pause as it looks for which drive to boot from. Make sure that your USB drive is plugged in when you attempt to boot the machine, as this is not hot swappable.
If you encounter issues, you can always boot from the SD card again.
Then you are all set! Enjoy.
Notes
Once you flash your Jetson with JetPack 4.5+, the QSPI-NOR changes and may not properly boot a disk that was created using JetPack 4.4 and lower. In order to restore the QSPI-NOR you will have to run the SDK Manager from a pre-JetPack 4.5 release on a host PC. The SDK Manager will refresh the QSPI-NOR for the earlier version, and should be able to work with the earlier card.
There is an issue with JetPack 4.5 for which you will need a workaround. This was fixed in JetPack 4.5.1. The workaround is in the releases of the bootFromUSB repository, but you should use JetPack 4.5.1 if possible.
Here’s a new, useful website for Jetson related information and demonstrations: ai-triad.com The website is run by one of the newly minted Jetson Champions, Joev Valdivia. Money quote:
I believe the combination of NVIDIA Transfer Learning Toolkit, Deepstream and the Jetson Devices are going to open up new frontiers for the application of A.I.
The goal of this website is to aid others in that journey.
There is an associated YouTube channel with a good number of Jetson related videos about deep learning applications such as YOLO and DeepStream.
There are several different ways to power your NVIDIA Jetson Developer Kit using a battery. Let's go over some of the more popular ones. Looky here:
Background
After working on several projects recently that require Jetsons to use battery power, I want to share with you some of the lessons I learned.
Let's divide the Jetson family into two camps. The first will be the Jetson Nano and Jetson Nano 2GB. Both of these Jetsons require a 5 Volt (5V) power source. The second group is the rest of the Jetsons, including the Jetson Xaviers, Jetson TX1 and Jetson TX2. These Jetsons run on a wider range of input voltages, typically 9V to 19V.
As you know, you can think of a battery as an electrochemical device which can provide electrical power. Batteries come in different chemical makeups; you are probably familiar with some of the common ones, such as alkaline (zinc manganese oxide, carbon), lead acid, nickel metal hydride, and lithium-ion polymer.
Here, we will focus on lithium-ion/polymer types of batteries because of their high power density for their light weight. Also, they are rechargeable, which makes them convenient for long term use.
Note: The video is pretty extensive and covers the things listed here in more detail. Some things are more easily shown than written!
Packaging
We look at two different package styles for batteries here. A power bank is a popular style of battery people use for recharging their phones. Some power banks provide multiple voltages, but almost all provide 5V through a USB connection.
The NP-F style battery is in common use by photographers and videographers. The NP-F battery was originally invented by Sony to power their cameras. The battery uses 18650 lithium-ion cell batteries in pairs to provide a 7.4V nominal battery. The type of 18650s and the number of pairs of cells determine the capacity of the battery. Quality NP-F style batteries have built in battery management systems (BMS) to prevent over-discharge.
Some Project Parts you might need
Here’s a list of affiliate links to some items from the video, available on the JetsonHacks Amazon Storefront:
You can power a Jetson Nano through the designated USB port. For the Jetson Nano 2GB, this is the USB-C port. For the Jetson Nano, this is the micro-B USB connector. Simply connect a USB cable between the power bank and the Jetson, and turn on the power bank. Some power banks have multiple USB outlets, which allows you to run more than one device at a time.
Powering a Jetson using a NP-F Style Battery
As we note earlier, a NP-F style battery provides 7.4V nominal power. Nominal defines the class of battery. Batteries run in ranges during their use cycle: a fully charged NP-F battery is ~8.4V, a discharged battery is 6.6V. Note: You should not run an individual cell under 3V, as this may damage the cell so that it cannot recharge. A NP-F battery has two cells in series, so do not run it under 6V. The Jetsons other than the Nano run in a voltage range of 9-19V.
For our application, we use a SmallRig battery adapter plate which helps us in two areas. First, the plate has a built in boost converter which converts the 7.4V to 12V and outputs it to a jack. Note that the raw battery voltage is also available on a secondary jack.
Second, the plate provides an affordance for attaching the battery to our project. There are two 1/4″-20 threads (a standard in the camera world) which we can utilize. Also, the plate captures the battery and holds it securely. Here’s an example, the battery adapter plate and the battery are towards the rear of the vehicle, attached to the platform plate:
The 12V output on the adapter plate is available on a 5.5mm outer diameter (OD), 2.5mm inner diameter (ID), center positive jack. Remember that the Jetson Nano barrel jack is 5.5mm OD, 2.1mm ID, center positive. The other Jetsons are 2.5mm ID. The video covers a couple of ways of making cables and how to make a simple power distribution block.
To power non-Nano Jetsons, create a cable to connect the adapter plate from the 12V output to the Jetson barrel jack. Plug in the battery, and you should be ready to go!
Buck/Boost Voltage Converter
Another useful device is a voltage converter. The converter takes an incoming voltage and converts it to a different output voltage. A boost converter (step-up converter) takes a lower voltage and converts it to a higher voltage. A buck converter (step-down converter) takes a higher voltage and converts it to a lower voltage. A buck/boost converter does both. Note that for efficiency there should be at least a 1.5V difference between the input and the output voltages. Here we are using a buck converter to convert 7.4V to 5V.
This device takes the input from a battery, and converts it to a regulated voltage. This is useful when you have a voltage source outside of the range of acceptable voltages for the Jetson, or you need a more specific voltage like 5V for a Jetson Nano.
Powering a Jetson Nano from a Battery and Voltage Converter
As an example, we can take the 7.4V output from the NP-F battery, and convert it to 5V using the converter to power a Jetson Nano. Because the battery adapter plate that we are using has two outputs which we can use simultaneously, we can power both a Jetson Xavier NX and Jetson Nano:
Battery Run Time and Such
Battery manufacturers list the capacity of batteries in two ways: Amp Hours and Watt hours. You can use this information with the current draw of the Jetson and devices to determine the length of time the battery will last. This can be an extensive calculation, as the draw of the device(s) tends to be quite variable depending on the task.
In addition, there usually is some inefficiency in voltage conversion and such. You can do a quick back of the napkin calculation to get you in the ballpark of run time. However, to get more specific you will need to do testing. You might be surprised to find out that manufacturer claims may not match the observed results!
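As an illustrative back of the napkin example (the numbers here are assumptions, not measurements): a 47 Watt hour NP-F970 style battery feeding a Jetson that averages 10W, through a converter that is 85% efficient, gives roughly 47 × 0.85 / 10 ≈ 4 hours of run time. Your measured results will differ, which is why testing matters.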
Conclusion
The methods above are just a few ways to power the Jetson with a battery. There are all sorts of clever solutions out there. Hopefully you find these tips useful.
Many of the NVIDIA Jetson Development Kits need external antennas for their WiFi/Bluetooth cards. Many aftermarket enclosure manufacturers offer mounting points for external antennas. However, there are times you might need an alternate solution.
Here’s how to make your own antenna mount. We make a model from a mechanical drawing, and then use that to 3D print the part. Bits to Atoms, as it were. Looky here:
Da Files
The antenna mount bracket captures two RP-SMA antenna jacks on one side of an L-shaped bracket. The other side of the L-bracket offers mounting points for the bracket. In the video, we are using these antennas: https://amzn.to/3lCGRWq
You can either clone one of the repositories or grab the files that you want via a web browser interface.
Each file is a little bit different. The wider mounts take into account the antenna spacing for 2.4GHz, which is useful for Bluetooth reception. The MIT RACECAR version has a different mount spacing to better fit with the top platform.
Note: It is easier to first attach the WiFi card to the IPEX MHF4 connectors before attaching the RP-SMA jack to the bracket. I believe that the technical term for the connection to the WiFi card is ‘fiddly’.
Background
The video is a walkthrough of the modeling and 3D printing process. It is a better explanation than I can write here. Instead, let's talk about what we'll call 'personal manufacturing'.
We can think about making parts from several perspectives. From the perspective of personal manufacturing, when talking about Computer Aided Manufacturing (CAM) there are two categories, additive manufacturing and subtractive manufacturing.
Additive, you add material. Subtractive, you subtract material. We usually think about additive manufacturing as 3D printing. Subtractive manufacturing is typically CNC machining and laser cutting. There are of course a much wider range of processes, such as molding and stamping, but we’ll talk about desktop manufacturing here.
An industry standard for controlling these machines is a language called gcode. It's not intuitive that you control additive manufacturing machines in the same way that you control subtractive machines. However, the tasks are much the same: tell a manufacturing machine where to position a spindle, extruder, or laser cutting head in 3-space, and control the head.
gcode, the Language of Machine Romance
While there are many ways to create gcode, including hacking it straight into the machine by hand, most people use some type of Computer Aided Design (CAD) application to first design their parts. Back in the 1980s, Autodesk was the first company to offer a CAD program on personal computers that was widely adopted. Still going strong, Autodesk offers a wide variety of CAD programs for different levels of users, from beginners to beyond pro.
There are a large number of other CAD program publishers of course, and everyone has their favorite. If you’re just getting started, you should survey what meets your needs. The investment in learning these tools is usually steep. Choose wisely.
Some of the CAD programs can also directly generate gcode and allow visualization and adjustment of tool paths. There are also separate CAM applications which handle this chore, usually tied to a specific type of machine.
3D Printers
Between 10 and 15 years ago, there was an avalanche of publicity around 3D printers for the home. The 'promise' was that everyone would have a 3D printer in their home, and anyone would be able to design and make their own 3D parts. Very much Neal Stephenson's 'The Diamond Age'.
As with the introduction of most technology, 'today' really means a decade or so before something is usable by people who are not technicians. Surprisingly, it turns out that you need the skills of a designer, a mechanical engineer, a materials engineer and a manufacturing engineer to design and print a part of much complexity.
With that said, 3D printing (along with relatively inexpensive subtractive methods) has led a revolution as to what is actually possible for ordinary people with less than corporate budgets. For people willing to make the investment both in time and money, a whole world of 'atom instantiation' is now possible. A staggering number of different materials lead to almost unlimited possibilities for realization of imaginative designs.
Also, a cottage industry has grown up around both designing 'personal' parts and 3D printing them. Many designs are available on sharing sites such as Thingiverse, where people can find parts that either meet their needs or can be modified to suit them.
What About Us?
For many people, the investment to understand some of the simpler concepts and to be able to make basic parts in this manner is well worth it. Think about it as adding to your talent stack. Simple projects like the one we did above get you started. To be able to create parts for your own project is immensely satisfying, and gives you an understanding (and appreciation) for what it takes to build even some of the simplest things you see around you in everyday life.
JetPack 4.6 is the latest release for the NVIDIA Jetson Developer Kits and modules. There are lots of goodies. Some of the highlights include Over-The-Air (OTA) updates, Triton Inference Server, and a new 20 watt mode for the Jetson Xavier NX. Go check it out!
The official blurb
Here’s the official blurb from the official NVIDIA Jetson Developer Forum:
We are pleased to announce JetPack 4.6, a production release supporting Jetson AGX Xavier series, Jetson Xavier NX, Jetson TX2 series, Jetson TX1, and Jetson Nano.
JetPack 4.6 includes support for Triton Inference Server, new versions of CUDA, cuDNN and TensorRT, VPI 1.1 with support for new computer vision algorithms and Python bindings, L4T 32.6.1 with Over-The-Air update features, security features, and a new flashing tool to flash internal or external media connected to Jetson.
In addition to l4t-base container image, new CUDA runtime and TensorRT runtime container images are released on NVIDIA NGC, which include CUDA and TensorRT runtime components inside the container itself, as opposed to mounting those components from the host. These containers are built to containerize AI applications for deployment. Note that the L4T-base container continues to support existing containerized applications that expect it to mount CUDA and TensorRT components from the host.
Highlights of JetPack 4.6 are:
Support for Jetson AGX Xavier Industrial module.
Support for new 20W mode on Jetson Xavier NX enabling better video encode and video decode performance and higher memory bandwidth. The included 10W and 15W nvpmodel configurations will perform exactly as did the 10W and 15W modes with previous JetPack releases. Any custom nvpmodel created with a previous release will require regeneration for use with JetPack 4.6. Please read release notes for details.
Image based Over-The-Air update tools for developing end-to-end OTA solutions for Jetson products in the field. Supported on Jetson TX2 series, Jetson Xavier NX and Jetson AGX Xavier series. Download the OTA tools from the L4T page under the Tools section.
A/B Root File System redundancy to flash, maintain and update redundant root file systems. Enhances fault tolerance during OTA by falling back to the working root file system slot in case of a failure. Supported on Jetson TX2 series, Jetson Xavier NX and Jetson AGX Xavier series.
A new flashing tool to flash internal or external media connected to Jetson. Supports Jetson TX2 series, Jetson Xavier NX and Jetson AGX Xavier. The new tool uses an initial RAM disk for flashing and is up to 1.5x faster compared to the previous method.
Secure boot is enhanced for Jetson TX2 series to extend encryption support to kernel, kernel-dtb and initrd.
Disk encryption of external media supported to protect data at rest for Jetson AGX Xavier series, Jetson Xavier NX and Jetson TX2.
NVMe driver added to CBoot for Jetson Xavier NX and Jetson AGX Xavier series. Enables loading kernel, kernel-dtb and initrd from the root file system on NVMe.
Enhanced Jetson-IO tools to configure the camera header interface and dynamically add support for a camera using device tree overlays.
Support for configuring Raspberry Pi IMX219 or Raspberry Pi High Def IMX477 cameras at run time using the Jetson-IO tool on Jetson Nano 2GB, Jetson Nano and Jetson Xavier NX developer kits.
Support for Scalable Video Coding (SVC) H.264 encoding.
Support for YUV444 8, 10 bit encoding and decoding.
With the advent of JetPack 4.6 (L4T 32.6.1) it is now possible to boot the Jetson AGX Xavier and Jetson Xavier NX from external storage. Looky here:
Background
In the JetPack 4.6 release, CBoot now has NVMe driver support for the Jetson Xavier NX and Jetson AGX Xavier series. This enables loading the kernel, kernel-dtb and initrd from the root file system on NVMe.
There is also a new flashing tool which can flash internal or external media connected to the Jetson Xaviers. The new tool uses initial RAM disk for flashing and is up to 1.5x faster when flashing compared to the previous method!
If you have read the highlights from the press release, you already know all of that. Let’s talk about the new flashing tool in a little more detail.
For reference, the JetsonHacks scripts are in the JetsonHacks repository on Github: bootFromExternalStorage
You'll need a NVMe drive in your Jetson to follow this tutorial. These drives are available in different sizes of course, but 500GB seems like a good price/capacity point. Here are some suggestions (Amazon affiliate links):
This process does not require that the NVIDIA SDK Manager GUI application be present. These are all command line scripts that can be run in a Terminal. For many people, this better fits their workflow.
Flashing with initrd
The new flashing method uses initrd (initial RAM Disk) to flash both internal media and external media connected to a Jetson device. This method uses initrd and USB device mode. Initrd flash supports several workflows such as:
Flashing internal storage devices
Flashing external storage devices such as NVMe SSD and USB drives
Enabling A/B rootfs on external storage devices
Enabling disk encryptions on external storage devices
Flashing individual partitions
Flashing fused Jetson devices
Flashing a Massflash blob to normal and fused Jetson devices
Generating separate images for external and internal storage devices, then flashing the combined images
Note: Jetson AGX Xavier series devices use boot firmware that is stored only on internal eMMC memory. Consequently this type of device cannot boot from USB or NVMe SSD until its internal eMMC has been flashed.
As usual, tools this flexible have a little bit of a learning curve. There is a group of scripts in the Linux_for_Tegra/tools/kernel_flash directory which handle the above mentioned tasks. It is worth reading the README_initrd_flash.txt file to get a feel for what it takes to accomplish the task at hand.
The following initrd flashing process will reformat the external drive on the Jetson. Make sure that you backup any data you want to keep.
nvsdkmanager_flash.sh
There is a higher level script which can be useful if you are less concerned with fine control of all of the different parameters. For the simple case of flashing to a Jetson Xavier with an attached NVMe SSD, you prepare the Jetson disk image and:
$ ./nvsdkmanager_flash.sh --storage nvme0n1p1
For a Jetson Xavier NX, this will flash the internal QSPI-NOR memory and put the rootfs and other partitions on the NVMe SSD. The Jetson AGX Xavier does not have QSPI memory, so the equivalent on eMMC is flashed instead.
The Jetson Xavier will then boot from the SSD. For the Jetson Xavier NX, no SD card need be present.
Preparing the Host
You will need to install dependencies on the x86 host to support flashing the Jetson. NVIDIA officially supports Ubuntu 16.04 and 18.04 for the flashing scripts. Then you will minimally need three archives:
There are helper scripts in the JetsonHacks bootFromExternalStorage repository. The time for the entire process of downloading and flashing the Jetson will depend on the speed of your Internet connection and host computer. Here it was ~ 1 hour. To download the helper scripts and run them:
There are a couple of tricks here and there in the scripts to put everything in the right place.
Flashing the Jetson
Once the archives are expanded and put in the correct place, put the Jetson into Force Recovery mode. This is different on each Xavier; the video shows a Xavier NX. You can use the command lsusb to determine if the host can 'see' the Xavier. There should be an entry with the NVidia Corp tag when the Xavier is connected to the host in Force Recovery mode. The default is to flash to the NVMe SSD. You will find this works best if you actually have a NVMe SSD installed (64GB minimum; more is better) before attempting to flash.
You can use the convenience script in the repository to flash the Jetson:
$ ./flash_jetson_external_storage.sh
It takes ~ 20 minutes to install everything in this manner. After flashing, the Jetson will be in ‘oem-config’ mode, ready to be setup.
Setting up the Jetson
You can choose to setup the Jetson in headless mode or desktop mode. In the video, we chose desktop mode. In either case, the next step is to install the NVIDIA JetPack packages on the Jetson. Switch from the host to the Jetson. Finish the Jetson configuration, then:
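The install step looks something like this (a sketch; nvidia-jetpack is the meta-package discussed below, and the standard L4T apt repositories are assumed to be configured, as they are on a stock image):

$ sudo apt update
$ sudo apt install nvidia-jetpack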
The default packages come from the meta-package nvidia-jetpack, which includes all of our Jetson friends like:
CUDA
cuDNN
TensorRT
VisionWorks
VPI
OpenCV
plus a couple of other packages to replicate the same experience as the default Xavier NX SD card.
Conclusion
With the convenience scripts, setting up your Jetson Xavier to boot from external storage should be straightforward. This is the NVIDIA ‘approved’ way, so Over The Air (OTA) updates should work going forward too.
Note that NVIDIA is working on the SDK Manager GUI application to replicate this type of functionality. That’s certainly worth checking out!
Notes
The JetPack tools from NVIDIA support Ubuntu 16.04 and 18.04 on the host machine. There are workarounds in these scripts for Ubuntu 20.04. However, these are not well tested yet.
JetPack 4.6, L4T 32.6.1
Tested on Jetson Xavier NX, NVMe SSD
Host: Ubuntu 16.04 (should work with Ubuntu 18.04 also)
When recently setting up a Jetson Xavier NX Developer Kit to boot from a SSD, we needed to figure out what packages to install on top of the default Ubuntu distribution. The answer? Package Lists! Looky here:
Background
You will often hear talk of packages in the Ubuntu world. The Jetson operating system, L4T (Linux for Tegra), is a version of Ubuntu. What is the difference between L4T and stock Ubuntu? Packages.
What are Packages?
Under Ubuntu, packages contain all of the necessary files, meta-data and instructions to install functions or software applications. A package file is a compound file which contains three items. First is a format header, which tells the package manager information about the version of the package format. Second there is a Control tar file (Tape Archive format) which contains information about the actual package itself. Third, there is a Data tar file, which contains the actual data (binaries, sources, libraries and so on) of the package.
There are different types of packages. We mostly think about a package representing libraries and applications. However, there are also package types which contain source code, 'meta-packages', and other types. Meta-packages do not contain much information in and of themselves; however, they do contain a list of dependencies. When the package manager sees the dependencies, it will install them. Thus, meta-packages tell the system to install a group of packages, a package of packages if you will.
Ubuntu uses the Debian package system. A package file has the extension .deb. Side note: Debian was named after Debra and Ian Murdock; Ian Murdock was the founder of Debian. The package manager is a lower level part of the operating system which installs, removes and provides information about packages.
dpkg and apt
At the lowest level, dpkg provides the package manager capabilities. For any given package, the Control section provides information to dpkg about how to go about installing (and removing) the package Data on to the operating system. The Control section may contain scripts to help set things up, tear them down, and other system setup tasks. Also, the Control section may contain a list of dependencies, reverse dependencies and other useful hints from which the package manager can take advantage.
While dpkg is capable of managing the system, it wasn’t long before people built tools using dpkg as the base to provide more flexibility. The Advanced Package Tool, or apt, is a higher level tool which provides a friendly interface with dpkg.
One of the features which makes apt a useful user experience is the ability to pull dependencies from remote repositories. dpkg requires all package files to be local, while apt keeps a list of remote repositories where package files can be retrieved if need be. Consequently, apt will grab dependencies from places other than the local machine when needed, and take care of any circular dependencies if encountered.
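You can examine the Control and Data sections of a package file yourself with dpkg-deb, the low level companion to dpkg (the file name here is just a placeholder for any .deb you have on hand):

$ dpkg-deb --info some-package.deb
$ dpkg-deb --contents some-package.deb

The first command shows the Control information; the second lists the Data files the package would install.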
Package Lists
A Package List is a list of all of the packages installed on a given system. This is valuable for several reasons. Probably the most important is that the package list denotes all of the applications and libraries, along with the Control ‘installation instructions’, for a given system. For example:
$ sudo apt list --installed
Lists the installed packages, along with some other information such as the release number. Dpkg provides the same information in a more concise form:
$ sudo dpkg --get-selections
We can use the ‘>’ redirect command line character to save the result to a file.
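For example (the file name is arbitrary):

$ sudo dpkg --get-selections > my-packages.txt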
Once we have the package list, we can use that information to replicate this machine's configuration on another. In the video, we compare the configuration of a base Ubuntu distribution with that of a machine which has a JetPack install.
While we could simply apt install all of the packages from the JetPack machine's package list on the fresh Ubuntu, in the video we chose a different tack. Instead, we compared the package lists of the default Ubuntu install versus the JetPack install using the diff tool. Diff compares two files and shows the difference. Taking this information, we were able to determine that there is a meta-package with the name nvidia-jetpack which installs the associated Jetson specific libraries such as CUDA, cuDNN, TensorRT, and so forth. To examine the Control information about a package:
$ sudo apt show nvidia-jetpack
This will show the name of the package, the type of package, release number, dependencies, reverse dependencies, conflicts and a wide range of other information.
Other Apt tools
There are a wide variety of tools to support apt. We mentioned earlier that the apt command line tool is a simpler interface over other apt related tools, specifically apt-get and apt-cache. There are other apt based tools; one useful one is apt-clone.
apt-clone allows you to generate a package list and other associated data and place it into a file using the clone command. This file can then be brought to another similar machine (same release). Once on the new machine, the apt-clone restore command will configure that machine to match the original. You can also use this to restore the original machine to the saved state at the time of the clone.
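A minimal sketch of that round trip (the 'my-system' stem is arbitrary; apt-clone appends the .apt-clone.tar.gz extension itself):

$ sudo apt-clone clone my-system
... copy my-system.apt-clone.tar.gz to the new machine, then:
$ sudo apt-clone restore my-system.apt-clone.tar.gz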
To be clear, apt-clone does not make a clone of the entire machine; only the installed packages. Typically a developer will keep the code that they are developing in a versioning system or external backup, along with any other data files. You know my saying. “If you cannot generate your system from scratch, you do not have a system”. That’s just a pain train that is coming to get you at the worst possible time.
Some developers use Aptitude, an NCurses based interface to apt in the Terminal. This semi-GUI provides many of the functions we mention above. A more popular alternative among developers is the GUI called Synaptic. Synaptic provides access to a wide variety of apt tools via the ease of a graphical interface.
Conclusion
This is low level stuff, but it is important to understand if you are a system developer. This article lightly touches on a complex subject area. Like most operating system functions, at the lower levels a package manager is simple: keep track of configuration and file locations. This is simple in the small; these files go here, a little bit of configuration there, and we're done.
And like most operating system functions, this is complex. That has a lot to do with the sheer number of packages to track, and the dependencies of packages on each other. Circular dependencies and conflicts make things more difficult to wrap one's head around.
However, there are some mature tools to help. Check them out!
We’re starting fall cleaning here at JetsonHacks! That means updating some of our code repositories and articles. With the release of JetPack 4.6, there’s a lot of work and heavy lifting we need to do!
One of the longer term goals is to have a more common code/script base across all of the Jetsons. At this point, all of the 64-bit Jetsons are running Ubuntu 18.04. Previously some of the Jetsons were split across 16.04 and 18.04. Consequently, we will be archiving some repositories and pointing to the new, unified code.
Updating Repositories
Over the last few months, several repositories have had issues unrelated to the Jetson. We’re fixing those first. As usual, each of the repositories have release notes in them.
Install ROS
The installROS repositories on JetsonHacks, JetsonHacksNano, and RACECAR/J have all been updated. The major issue is that the GPG key had expired over at ROS, and needed fixing in the scripts. We took this chance to calculate the ROS_IP environment variable more intelligently when setting up the ROS core server.
Install Arduino IDE
The installArduinoIDE repository on JetsonHacksNano now installs Arduino IDE 1.8.15. This seems like a natural fit for other members of the Jetson lineup, so we brought it over to JetsonHacks too.
Install the RealSense SDK
One of the changes over the last year or so is the ability to load the Intel RealSense SDK from a .deb in a repository. RealSense SDK is another name for librealsense.
Loading from the repository is way cool: it cuts installation from many hours to just a few minutes. In both the JetsonHacks installRealSenseSDK repository and the JetsonHacksNano installLibrealsense repository, we changed to the new key server URL for the RealSense repository. While these repositories have different names, they both contain the same scripts.
We took this chance to rewrite the buildLibrealsense.sh script. buildLibrealsense.sh builds librealsense from source. First, there are more command line switches. The clever bit is that the script now determines the latest librealsense release, and builds that version. Previously this was a fixed variable in the script.
The new -v | --version flag provides an override to build a particular version. This is useful in case you need a specific version, as is the case when matching up against the RealSense ROS wrapper.
Another new switch, -j | --jobs, allows control of the number of concurrent build processes. If you have less than 4GB of memory, this defaults to 1; otherwise it is the number of cores minus 1. You can control this in case you have extra swap memory and want more (or fewer) concurrent build threads.
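For example, to build a specific release with four build jobs (the version tag here is illustrative; check the librealsense releases page for the one you need):

$ ./buildLibrealsense.sh -v v2.49.0 -j 4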
Conclusion
That's the "low hanging fruit", as it were. Hopefully some of your favorites have better flavor!
Fall cleaning continues here at JetsonHacks. The last week here we’ve been doing some deep cleaning, working on the operating system level repositories.
Updating Repositories
One area that has changed quite a bit over the last few iterations of the Jetson L4T releases is the way that a developer can interact with the Linux kernel and modules. Previously, it was possible to work with most of the lower level system directly on the device. These days, NVIDIA recommends working with these system level changes in a cross development environment hosted on a PC running Ubuntu.
As Linus Torvalds diplomatically points out: "Cross-development is pointless and stupid when the alternative is to just develop and deploy on the same platform. Yes, you can do it, but you generally would like to avoid it if at all possible." There are opposing views, of course.
I personally subscribe to that philosophy for simple development. However, note that when you are building new hardware (and you are serious about doing so) you need to have as much compute power as possible if you intend to advance the state of the art. You ain’t building next generation supercomputers with your desktop PC.
Also, it makes sense to train complex machine learning models on high-end GPUs, or in the cloud. Once you have a trained model, you can deploy it on a Jetson. You are only going to live so long, you don’t have time to wait days/weeks/months to train a model on a Jetson. However, things like device drivers, changing the device tree, and hacking the kernel? To me, that should happen on device.
One of the main reasons OS hacking on device has become difficult is the emphasis on security in the last several iterations of L4T on the Jetson. Several features have been introduced at the OS level to facilitate better security.
First, the kernel images may be signed. For example, the Jetson Xaviers have their kernel PKC signed with SBK encryption. Depending on the architecture of the specific Jetson, the kernel may be placed in QSPI-NOR flash memory onboard the Jetson module (Jetson Nano, Nano 2GB, and Xavier NX), or it may be placed in the APP partition of the rootfs. Second, there may be multiple rootfs partitions to help facilitate Over The Air (OTA) updates. The device tree binaries may also be signed.
jetson-linux-build
With all of that said, the JetsonHacks "Build Kernel and Modules" repositories were all in various states of disrepair. With each release of L4T came a little more cruft. There are a surprisingly large number of these repositories:
For the most part, the only difference between the code in each of these repositories is where we retrieve the source code for the kernel and modules from the NVIDIA L4T archive. These URLs are based on the L4T release version. In fact, there are two branches of code for each release. One is for what is known as code name T210 (Jetson TX1, Jetson Nano, Jetson Nano 2GB). The other branch is code name T186 (Jetson TX2, Jetson AGX Xavier, Jetson Xavier NX).
If only there was a way to consolidate how we get the kernel sources …
In the jetson-linux-build repository in the JetsonHacks Github account there is a file named getKernelSources.sh. If you look at the script, you will see a nearly magical dictionary that maps a L4T release number to the corresponding source URL. Some people call it an "associative array"; us computer sciency people call it a hash table. There's a lot of fancy code in there which grabs the correct source for the machine on which you are running.
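Here's a minimal sketch of the idea in bash. The release-to-URL mapping below uses placeholder URLs, not NVIDIA's actual download links, and the release parsing is simplified:

#!/bin/bash
# Map a L4T release to its kernel source URL (placeholder URLs shown)
declare -A KERNEL_SOURCE_URL=(
  ["32.5.1"]="https://example.com/l4t/32.5.1/public_sources.tbz2"
  ["32.6.1"]="https://example.com/l4t/32.6.1/public_sources.tbz2"
)
# Read the running L4T version, e.g. "# R32 (release), REVISION: 6.1, ..."
L4T_VERSION=$(head -n 1 /etc/nv_tegra_release | \
  sed 's/^# R\([0-9]*\) (release), REVISION: \([0-9.]*\),.*/\1.\2/')
URL=${KERNEL_SOURCE_URL[$L4T_VERSION]}
if [ -z "$URL" ]; then
  echo "No kernel source URL known for L4T $L4T_VERSION"
  exit 1
fi
wget -O public_sources.tbz2 "$URL"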
There’s also an example in the repository on how to build a kernel module. In this case, a module to connect a Logitech F710 game controller. It’s still possible to build modules onboard the device.
As for the kernel image, it’s much more nuanced. To be honest, I’m a little confused as to what can and can’t be done on the device right now as far as the kernel image goes.
The above mentioned repositories are still in place, but note that if they haven't been already, they will be switching over to the jetson-linux-build code base.
Logitech F710 module
The logitech-f710-module has been updated on JetsonHacks, JetsonHacksNano, and RACECARJ Github repositories. These modules cover L4T 32.4.3 through L4T 32.6.1. See the releases for the appropriate release for your needs.
Conclusion
It takes some time to work on these cleanup efforts. Cleanup this time took a couple of weeks. Hopefully you will find it useful, and you can use this work as is or modify it to suit your needs.
Setting up a Wi-Fi Hotspot using the Network Manager GUI on the NVIDIA Jetsons is simple, but there is a trick. Looky here:
Background
Being able to access your Jetson through a wireless hotspot is a useful feature. The Jetson can act as a wireless router to share a wired Internet connection with other devices such as phones and computers. You can also use the hotspot to access other machines and servers on the same network.
Of course, you’ll need a wireless Network Interface Controller (NIC). You will find that these types of NICs are also referred to as Wi-Fi cards. Some of the Jetson Developer Kits have built in wireless. Others, such as the Jetson Nanos, add them separately. Here’s how to install a wireless card in a Jetson Nano.
Configuration is simple through the Network Manager GUI. The video walks through a couple of ways to do this. The first is through the System Settings->Network dialog. It’s almost as simple as clicking the ‘Use as Hotspot…’ button.
Unfortunately, this does not work with many recent wireless cards. In the video we are using an Intel 8265NGW Wi-Fi card.
The 'Use as Hotspot…' button creates a wireless connection with a default mode of 'ad-hoc'. Typically ad-hoc networks are mesh networks where machines connect together without a central control point. Windows 10 quit supporting these types of networks, so the wireless NIC card manufacturers followed suit. The result is that even though you can set the mode to ad-hoc, a Jetson with one of these NIC cards will fail to bring up the network when you try to create it.
Instead, setting the connections mode to ‘Hotspot’ creates a network similar to that of a wireless router. Also, note that this method assigns a default password to the network using WEP encryption.
Create Hotspot from ‘Scratch’
The second method is to create a new connection using the ‘Edit Connections…’ menu item from the Network menu in the status bar.
There may be terms with which you are not familiar. One, SSID (Service Set Identifier), can be thought of as the name of the network that other devices will see. Follow along in the video to get a feel for how to set up the Hotspot.
This method allows a little more control over how you setup your Hotspot, including a wider variety of encryption.
Sharing an Internet Connection
One of the examples in the video is sharing an Internet connection with other devices. The Internet connection to the Jetson is through a wired Ethernet connection to a cable modem. The important point is that the Internet connection is through the wired Ethernet; it is not coming from a Wi-Fi connection. The Wi-Fi NIC is creating the wireless network, so it cannot connect to another Wi-Fi network at the same time.
Using the Network
The other example is simply using the wireless network to connect with other computers. In the example, we connect to the Jetson via SSH. Note however that we can do the regular computery stuff. For example, we can have a private web server running on the Jetson and serve pages on the network. In the example, we start up Robot Operating System (ROS) server on the Jetson and launch control nodes for a robot.
Command Line Tools
There are several command line tools that you can use to configure a Hotspot. For example, there is a command line version of the Network Manager named nmcli.
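For example, here is one way to bring up a hotspot from the command line with nmcli; the interface name, SSID and password are placeholders to change for your setup:
$ nmcli device wifi hotspot ifname wlan0 ssid JetsonHotspot password "CHANGEME1"
nmcli creates and activates a hotspot connection with WPA security in one step.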
The tool iw is the replacement for the earlier iwconfig tool. One very useful command of iw is the list command:
$ iw list
This lists the capabilities of attached wireless devices. These radios are complicated; being able to see all of their capabilities is useful.
There is also hostapd, which is a popular tool for configuring access points.
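As a sketch of what hostapd configuration looks like, a minimal hostapd.conf might contain something like the following; the interface name, SSID and passphrase are placeholders:
interface=wlan0
driver=nl80211
ssid=JetsonHotspot
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=CHANGEME1
You then point hostapd at the file:
$ sudo hostapd /etc/hostapd/hostapd.conf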
If you are configuring the wireless access point at this level, there are many things to take into consideration depending on usage. But if you need it, you need it.
GStreamer is one of the basic ways of handling media components on the NVIDIA Jetsons. One of the main uses is to handle video sources such as cameras and files.
GStreamer uses a Pipeline architecture, where different Elements link together to guide input from a media source to an output. The output is called a ‘sink’. These pipelines provide a “rich” interface, a quick translation of which means “these things can be complex”.
For many people, the number of options available for linking together Elements and configuring the Elements themselves is confusing. Elements may act as a media source, a sink, a filter, a router, or a combination of the above.
Elements have properties for configuring parameters. The number of properties are variable. Some Elements are simple and need little configuration. Other Elements are complex and provide a wide range of fine tuning.
GStreamer uses a Plugin architecture. One way to think about a Plugin in GStreamer is as a dynamic library. Plugins contain executable code for Elements and a couple of other supporting atomic GStreamer types.
Tools
There are GStreamer command line tools for working with the Pipeline and Elements. The first is gst-launch, which allows construction of a GStreamer pipeline for running tests. The second tool is gst-inspect which lists the Plugins, Elements, Types and some other bits of information. Here we are looking at gst-inspect. Since we are using GStreamer 1.0, the formal names of the tools are gst-launch-1.0 and gst-inspect-1.0.
The base GStreamer system on the Jetson has 260 Plugins and well over 1200 different GStreamer elements! Running gst-inspect typically means watching text scroll by for quite a bit of time. Eventually you will either capture everything to a file so you can look at it in a text editor, or use grep on the command line to find items of interest.
You may ask for just one Plugin or Element, of course. Like most command line interfaces, you can pass flags to get different views of the GStreamer system. Or you can ask for a more verbose listing, which will generate a megabyte or two of text. This might not be as useful as it sounds.
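As a quick sketch of how this looks in practice (nvvidconv is a standard element on the Jetson; adjust to taste):
$ gst-inspect-1.0 | grep -i h264
$ gst-inspect-1.0 nvvidconv
$ gst-launch-1.0 videotestsrc ! autovideosink
The first line narrows the full listing with grep, the second prints a detailed report on a single element, and the third builds a minimal two Element pipeline: a test pattern source linked to a video sink.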
So, here’s a little challenge. Given a day or three, write a GUI in prototype form for inspecting the GStreamer Plugins and Elements.
Features
The list of features is pretty small.
A selectable List containing GStreamer Plugins, Elements and Types
Text Search to help narrow down which items to view
A Filter to separate the Plugins from the Elements from the Types
An Inspector Pane which shows detailed information about a selected entry
The above application in the JetsonHacks Github repository is written in Python3 and uses Qt. The dependencies are already installed in the default Jetson images.
The code uses a model, view, controller architecture. How would you build this?
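As a rough sketch of the idea (this is not the repository code; the widget layout and names here are made up for illustration, and it assumes PyQt5, which is included in the default Jetson images), a minimal model/view split with a search filter might look like this:

import sys
from PyQt5.QtCore import QStringListModel
from PyQt5.QtWidgets import (QApplication, QLineEdit, QListView,
                             QTextEdit, QVBoxLayout, QWidget)

class InspectorWindow(QWidget):
    # Sketch: a searchable list (the view) over a string model,
    # plus a read-only inspector pane
    def __init__(self, entries):
        super().__init__()
        self.entries = entries
        self.model = QStringListModel(entries)   # the model
        self.search = QLineEdit()
        self.search.setPlaceholderText("Search...")
        self.listing = QListView()               # the view
        self.listing.setModel(self.model)
        self.detail = QTextEdit()                # the inspector pane
        self.detail.setReadOnly(True)
        layout = QVBoxLayout(self)
        for widget in (self.search, self.listing, self.detail):
            layout.addWidget(widget)
        # controller wiring: searching narrows the list, clicking shows detail
        self.search.textChanged.connect(self.apply_filter)
        self.listing.clicked.connect(self.show_detail)

    def apply_filter(self, text):
        matches = [e for e in self.entries if text.lower() in e.lower()]
        self.model.setStringList(matches)

    def show_detail(self, index):
        # the real tool shows gst-inspect style details for the selection
        self.detail.setPlainText("Details for: " + index.data())

app = QApplication(sys.argv)
window = InspectorWindow(["nvarguscamerasrc", "nvvidconv", "omxh264enc"])
window.show()
sys.exit(app.exec_())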
Prototypes vs Production Code
It is worth noting the difference between prototype and production code. This is a prototype. People approach production code in a much different manner. Prototypes are typically the work of one person or a very small group. Production code begins with the assumption that there will be a large group of people working on the code, and that a large number of people will be using the code.
Prototypes are meant to be thrown away! These are sketches of interesting ideas. That does not mean that they are not useful. Indeed, that’s the whole point! In the physical world, this is why people build models. Getting hands on experience with what started out as an idea can help form the idea into something more powerful.
Production code is very much different. The scaffold for building is a project in and of itself. You must know what you are building beforehand. For example, if you were to start building production code you have a set of requirements that you incorporate right at the beginning, such as:
Project Management
Localization/Internationalization for language
Security
Error Handling/Exceptions
Testing Harnesses
Unit Tests
Integration Tests
Acceptance Tests
Documentation
Specifications
Architecture Design
Engineering Specifications
Build System
Source Control
The list above is just for starters. Production systems are not something just one person builds. To be clear, when working on production code, the actual writing of the code itself is not the majority of the work!
Notes
Tested on NVIDIA Jetson Nano and Jetson Xavier NX Developer Kits
At the latest GTC, NVIDIA announced that they will begin shipping the Jetson AGX Orin in early 2022.
Introduction
The Jetson AGX Orin is the next generation AGX form factor Jetson. The AGX Orin module has the same footprint as the Jetson AGX Xavier, using the same 699 pin connector. Here are some quick specs for the AGX Orin Developer Kit:
We can see from the specs some of the reasons that NVIDIA claims a 6-7X performance improvement over the earlier AGX Xavier. The Orin uses the new NVIDIA Ampere architecture with 2048 CUDA cores and 64 Tensor Cores. The AGX Xavier has 512 Volta Architecture CUDA cores with the same number of Tensor Cores.
By comparison, the Orin CPU complex consists of 12 ARM Cortex-A78AE v8.2 64-bit CPUs, grouped in clusters of four. The Xavier has 8 of NVIDIA’s Carmel ARMv8.2 CPUs. The Orin’s memory is also faster: 32GB of 256-bit LPDDR5 main memory versus the older LPDDR4 of the Xavier.
The Orin has 15W, 30W and 50W power modes.
How Do We Keep Getting Even More Computing Power?
The CEO of NVIDIA, Jensen Huang, famously stated that “Moore’s law is dead”. Kinda, sorta. If you look at the past history of the Jetson, we know that the Tegra X1 chips (currently in the Jetson Nano) have ~2 billion transistors. The Xavier Tegra chips have ~7 billion. The new Orin Tegra chips have ~17 billion! Each generation of Jetson has at least doubled the transistor count of its predecessor.
How does that happen? The key is the manufacturing process. The Tegra X1 uses a 20 nanometer (nm) process, the Xavier a 12nm process, and the Orin an 8nm process. The process is the main factor in how many transistors you can fit in a given space on the chip die. The smaller the process, the smaller the transistor, and thus the more transistors in a given amount of space.
That explains how you get more transistors, but you can imagine that there’s a physical limit. 5nm is now becoming available. However, as you keep getting smaller you will eventually reach the point where the electrons won’t be able to be contained. After all, they can only put up with so much. They want to be free!
Another interesting aspect of having more transistors is how to take advantage of them in the compute stream. Modern computer hardware architecture is a really interesting area. If you haven’t studied it, or haven’t visited the area in a while, it’s well worth taking some time to listen to some talks and read up on the subject.
Software
The big news on the software front for the Jetson AGX Orin is that JetPack 5.0 will install Ubuntu 20.04 on the Xavier and Orin platforms. This, along with support for the rest of the major NVIDIA Jetson software packages (CUDA, DeepStream, ISAAC support and so on), promises to make the new Jetson quite formidable in the AI on the Edge and robotics communities.
JetPack 5.0 will include kernel 5.10, a reference file system based on Ubuntu 20.04, UEFI as boot loader, OP-TEE as the trusted execution environment, and also the latest compute stack. JetPack 5.0 Developer Preview release is targeted for Q1 2022 and the production release is targeted for 2H-2022.
Pricing?
NVIDIA is keeping this a secret for now. As we get closer to the release date, I’m sure they’ll let us know.
How This Rollout is Different
In the past, the Jetson announcements have been pretty much around the product launch date. For the Orin, that changes somewhat. This time, the Orin is announced a few months ahead of ship dates. In addition, NVIDIA provides several different documents to help with migration from the AGX Xavier to the AGX Orin, along with a good amount of technical specifications, pinmux sheets, and design considerations.
Jetson AGX Orin collateral is available at the Jetson Download Center, including the data sheet, design guide, pinmux sheets, and the Jetson AGX Xavier to Jetson AGX Orin migration app note.
Read through the technical brief first; it gives a good overview of the architecture of the system. There are many different compute complexes on the Tegra System on a Chip (SoC), and it takes some study to understand what all of them do.
Once we get our hands on one of these, we’ll be able to see what the new goodness is all about.
A while back, we wrote a Python software sketch that implements a GUI for inspecting GStreamer plugins and features. Let’s revisit the code, and clean it up! Looky here:
Introduction
There are many ways to develop software. One interesting thing that I notice is that once many people learn to program, they don’t “practice” programming any more. Not surprising, as many people code in their jobs and don’t feel the need to take on additional work.
I have found over the years that taking part in “recreational” programming helps me better understand different, sometimes new, concepts that are being introduced into computer science. From senior software engineer on up, you realize that you are not being paid to code, but to have meetings, manage people, answer emails and write design documents. Adding recreational programming to the mix helps you get a better handle on new technology as it is being introduced.
Recreational Programming
What is recreational programming? It is a small project or snippet(s) of code that you write, with a very specific purpose or investigation value. You can think about it as a study, or a sketch. The intent is to learn, throw it away, and take what you have learned and apply it to your next sketch or project. This frees you from being ‘married’ to the code, and encourages you to experiment.
This is the same way many artists work when they draw or play music. Athletes too. You tend to only see finished work, you rarely see the process that gets to that work. If you don’t believe me, watch the 8.5 hours of the Beatles ‘Let it Be’ film. Watch how many times they run through each song and change them.
When you’re on take 47 of a 3 minute piece, that’s work! The people who made the film? They had more than 55 hours of never before seen footage and 150 hours of audio to piece together. The original ‘Let It Be’ documentary cut down the footage to 80 minutes.
Typically people don’t sit down and create finished pieces. Instead they gather up bits and pieces, crafting and editing them together. You have to be a little ruthless at times, and throw away some very good parts to better serve the ‘story’. Editing can be difficult!
Typically I plan on 1-4 hour projects, though a day or three may be a useful range too. When you first start out, you may not have enough time-estimating skill to predict how long any given task may take. You can use this technique to help hone your time estimating skills.
Time Estimation
Block out an appropriate amount of time in your schedule, and turn off your phone: no email, no web browsing, and so forth. You are concentrating on one task, coding! You can web browse for documentation and so forth, but be cognizant to do only that.
Write down a task list on a pad of paper. The first task is defining the task list. Estimate how long you think that should take.
Next, write down the tasks that you will undertake. Whether that’s to refactor some code, write new routines, introduce new ideas or programming constructs. Write down how long you think each task will take you.
Once you are done writing the task list, compare the amount of time it took with the estimate you began the list with. Close or not? Cross task one off the list.
Now you are ready to start programming, or researching if that’s one of your tasks. Time each task, and as you complete them cross them off the list and compare the estimated versus actual time. As you get more experience, you should be able to estimate how long any given task may take and some of the variables to take into account. This is a valuable skill, as it is always one of the first questions asked about any project.
In general, you should always know how much work you can do in an hour. It’s important to be only doing the work at hand in a concentrated manner. Get in the zone!
Revisiting Sketches
In the video, we revisit a sketch that I wrote a month or so previously. Of course, you will want to revisit such things much sooner so you don’t have to ‘relearn’ the code that you wrote. I have found that this is good practice when working with programming languages or frameworks with which I am not all that familiar. Do a little bit of study, go back and clean up.
For most of the code studies I do, I tend to write ‘pseudo code’ which tends not to take into account specific language features. By revisiting the code, I can add some of the niceties that a given language can provide. Tools play a part in this; for example, you’ll notice in the video that I lean quite heavily on the Visual Studio Code refactoring facilities.
It Ain’t About Me
But this isn’t about me coding, but rather sharing a technique that you may find useful. Over the years, I have found it very helpful.
A Little Bit More
To be honest, this video is also testing a new technique in recording these tutorial videos. With the proceeds of the JetsonHacks videos, I acquired a BlackMagic Design ATEM Mini Pro recording to a Samsung SSD. Thank you for watching the videos! As time goes on, I can start adding in some more of the advanced features that this kind of workflow provides.
The video was recorded directly from the HDMI out of a NVIDIA Jetson Xavier NX Developer Kit, with an Aspen Mics Lavalier Microphone. The Jetson is running JetPack 4.6, L4T 32.6.1.
The end of another year. In a lot of ways, it felt like 2021 was the sad little brother of 2020. As we head into 2022, it feels like things are beginning to change, and we are getting ready to make up for lost time.
But I know you don’t care about any of that. You want to know how it affects JetsonHacks. First, I want to thank you for being part of the JetsonHacks community. I hope you find everything here to your liking.
This is the article where we talk about what transpired across the vast JetsonHacks media empire. Let’s delve into the numbers and see what we shall see.
JetsonHacks.com
As most of you know, there wasn’t as much content published during the course of 2021 compared to previous years. We posted 15 full articles in total.
As you can see, we had 300K visitors viewing 657K pages. This is down somewhat from the previous year, in part because there was less new content. In 2020, 20 articles were posted. This in turn was down from the 960K views in 2019. Does this mean that JetsonHacks is slowly withering away?
What’s the matter with you, why would you even think something like that?
In 2022, we will be releasing more content. People will be excited, and come to their senses and visit jetsonhacks.com. ‘Nuff said.
JetsonHacks Channel on YouTube
The decrease in traffic is similar on the JetsonHacks YouTube channel.
We published 10 videos this year on the channel. That’s about the same number published in 2020, but the views are down from 750K to 566K. Is the JetsonHacks YouTube channel dying on the vine?
Look, if you keep thinking like that you should seriously consider where your life is heading. Maybe the ‘rona has broken you. The YouTube channel is doing well. We narrow cast to the Jetson community, we don’t care about the rest of the unwashed masses.
As you know, I have also been doing a regular show on the NVIDIA Developers YouTube channel with some NVIDIA folks, Dusty Franklin and Dana Sheahen:
That’s good for another 8 videos in 2021. I’ve also been doing secret live streams about the Jetson, but they are a secret.
Viewers and Readers around the Globe
Here’s a quick breakdown of the origin of some of the traffic to the media empire:
I always find it amazing how far the reach is for this information.
Going Forward
As we go into 2022, one of the things I want to start practicing is live streaming. Thanks to the support of the channel, we’ve acquired an Atem Mini Pro which is a 4 channel HDMI digitizer. This should make it easier to stream video straight to YouTube. I’m not sure this is the best or easiest way, time will tell. Let me know if you want a review.
NVIDIA has announced the new Jetson AGX Orin (to start shipping in the March 2022 time frame), which promises to be more better than the Jetson AGX Xavier. When we get our hot little hands on one, we’ll find out for sure.
One of the areas that we will start looking at more closely is machine/deep learning on the Jetson. I think there are many areas to explore, but one that we should take a look at is DeepStream. There are all sorts of technologies tangled together that provide what NVIDIA calls Intelligent Video Analytics (IVA). I think walking through each part is beneficial. We’ll start with cameras and the Jetson in January. That leads us into GStreamer, and then DeepStream. We’ll probably wander into the more traditional computer vision world at the same time.
Let’s also look at some more peripherals, shall we? Robots too! There have been a lot of new products announced over the last couple of years. Let’s plan to take a look at them some time soon.
Conclusion
2021 be gone! Here’s hoping that 2022 is not quite as “interesting”. Happy New Year!
This is the start of a new set of articles which provide a deeper explanation of the concepts around the NVIDIA Jetson family of products. In many of the previous articles on JetsonHacks, we have been concentrating on “how to” do a particular task.
“In Depth” will focus more on how different subsystems work and can be combined together for specific tasks such as vision processing and machine learning. Articles in this series will cover the Jetson hardware itself, external devices, plus the software that binds everything together.
Here we focus on digital video cameras. As cameras provide the images for vision and machine learning analysis, understanding how a camera gathers and distributes those images is important.
The Intent
The intent here is to familiarize yourself with (or, if you’re like me, refresh your memory on) different terms and concepts associated with digital streaming video cameras. You can use this overview as a jumping off point to dive deeper into the subjects presented.
Introduction
Digital video cameras are ubiquitous. Billions of people have smartphones or tablets with built-in cameras, and hundreds of millions have webcams attached to their computer.
Digital video has a brief history. The first semiconductor image sensor, the CCD, was invented in 1969 at Bell Laboratories. A second type, called a CMOS sensor, was invented at the Jet Propulsion Laboratory (down the street here in Pasadena, California) in 1993. It was in the early 1990s that a convergence of technology allowed digital video to be streamed into consumer level computers. The first popular consumer webcam, the Connectix QuickCam, was introduced in 1994 for $100: 320×240 resolution, with 16 shades of gray. ’Twas amazing at the time.
CMOS technology is now in use on the vast majority of the sensors in consumer digital video products. Over time the resolution of the sensors has improved, while adding a myriad of capabilities.
Even with a short history, there is a forest of abbreviations and acronyms to navigate to understand what people are talking about in a given context. Hard to talk about something if you don’t know the right name.
Here we will concentrate on cameras that we attach to a Jetson, though these same cameras can be attached to lesser machines. Just one example, here’s a 4K camera:
You can think of a camera as several different parts. First is the image sensor, which gathers light and digitizes it. The second part is the optics, which helps focus light on the sensor and provides a shutter. Then there is the supporting electronic circuitry, which interfaces with the sensor, gathers the images and transmits them.
Image Sensor
There are two types of image sensors predominantly in use today. The first is CMOS; the other is CCD. CMOS is dominant in most lower cost applications. The raw sensors provide monochrome (greyscale) images.
Here’s an image of an image sensor, the Sony IMX477:
Color Images
There are different ways to get color images from these sensors. By far the most common way is to use a Bayer Filter mosaic, which is a color filter array. The mosaic arranges color filters on the pixel array of the image sensor. The filter pattern is half green, one quarter red, and one quarter blue. The human eye is most sensitive to green, which is why there’s an extra green in the filter pattern.
Each filter passes a particular band of light wavelengths through to the sensor pixel. For example, a blue filter makes the sensor pixel sensitive to blue light. The pixel emits a signal depending on how many photons it sees; in this case, how much blue light.
There are other variations of this color filter array approach. The Bayer method is patented, so some people try to work around that. Alternatives are CYGM (Cyan, Yellow, Green, Magenta) and RGBE (Red, Green, Blue, Emerald).
In the Bayer filter, the colors may be arranged in different patterns. You may see BGGR (Blue, Green, Green, Red), RGBG, GRBG and RGGB arrangements. The pattern is used to interpolate a color image using demosaicing algorithms.
The raw output of Bayer-filter cameras is called a Bayer pattern image. Remember that each pixel is filtered to record only one of three colors. The demosaicing algorithm examines each pixel and its surrounding neighbors to estimate a full Red Green Blue (RGB) color for that pixel. That’s why it’s important to know the arrangement of the colors in the filter.
These algorithms can be simple or complex, depending on computational elements onboard the camera. As you can imagine, this is quite the problem. The algorithms make tradeoffs and assumptions about the scene that they are capturing and take into account the time allowed to calculate the color values. There can be artifacts in the final color image depending on the scene and algorithms chosen.
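As a small sketch of this in practice, OpenCV (included with JetPack) can demosaic a raw frame in one call. The synthetic frame here stands in for real sensor output, and the Bayer code must match the sensor’s actual filter arrangement:

import cv2
import numpy as np

# a synthetic 8-bit single-channel frame standing in for raw Bayer data
raw = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# interpolate a full 3-channel color image from the mosaic
bgr = cv2.cvtColor(raw, cv2.COLOR_BayerRG2BGR)
print(bgr.shape)   # (1080, 1920, 3)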
Time is an important factor when you are trying to estimate the color of each pixel in real time. Let’s say you are streaming data at 30 frames per second. That means you have about 33 milliseconds between frames. Your image better be done and gone before the next one arrives! If you have a couple of million pixels to demosaic per frame, you have your work cut out for you! Accurate color estimation can be the enemy of speed, depending on which algorithm is in use.
Some cameras can transmit the raw Bayer pattern image, but many send the RGB image. We’ll cover transmitting the image later in the article.
Infrared Light
The Bayer filter is transparent to infrared light. Many image sensors can detect near infrared wavelengths. Most color cameras add an infrared filter on the lens to help with better color estimation.
However, sometimes it is useful to look at a scene that is illuminated by infrared light! Security “night vision” systems typically have an IR emitter combined with a camera image sensor without an infrared filter. This allows the camera to “see in the dark”.
Optics
The optics for a digital video camera consist of the lens and the shutter. Most inexpensive cameras use a plastic lens, and provide limited manual focus control. There are also fixed-focus lenses which have no provision for adjustment. Other cameras have glass lenses, and some have interchangeable lenses.
You will hear lenses classified in several ways. Typically a lens is specified by its focal length. A lens can have a fixed focal length; if the focal length is variable, it is called a zoom lens.
Another classification is the aperture, which is denoted by an f-number, e.g. f/2.8. A lens can have a fixed aperture, or a variable one. The size of the aperture determines how much light can hit the sensor. The larger the aperture, the more light is allowed through the lens, and the smaller the f-number.
The lens Field of View (FoV) is also important. Typically this is expressed in degrees, both in the horizontal and the vertical dimension, or diagonally, with the center of the lens being the midpoint of both of the angles.
The fourth classification is the mount type, for cameras that have interchangeable lenses. Interchangeable lenses allow for much more flexibility when capturing images. In the Jetson world, you may hear of an M12 mount, which uses a metric M12 thread with a 0.5mm pitch. This is also known as an S-mount, and may attach directly to the PCB of the sensor. Another common term is the C or CS lens mount; the Raspberry Pi High Quality Camera uses this type of mount.
The shutter for the camera may be mechanical or electronic. The shutter exposes the sensor for a predetermined amount of time. There are two main types of exposure methods that shutters use. The first is a rolling shutter, which scans across the sensor progressively, either horizontally or vertically. The second is a global shutter, which exposes the whole sensor at the same instant. The rolling shutter is most common, as it tends to be less expensive to implement on a CMOS device, though it may produce image artifacts, like smearing, for fast moving objects in a scene.
For scenes that do not have any fast moving objects, a rolling shutter can be a good choice. However, for other applications this may be unacceptable. For example, a mobile robot, which is an inherently shaky platform to begin with, may not be able to produce good enough images for visualization if the images are smeared. In that case, a global shutter is more appropriate.
Electronic Circuitry
The electronic circuitry of the digital video camera controls image acquisition, interpolation and routing of the images to the awaiting world. Some cameras have this circuitry on the sensor die (many phone cameras do this to save space), others have external circuitry to handle the task.
Data compression is an important task. Video data streams can be very large. Most inexpensive webcams have a built-in ASIC to do image interpolation and video compression.
Newer to the market ‘smart’ cameras may have additional circuitry to process the video data stream. This includes more complicated tasks such as computer vision or depth image processing. These specialty cameras may combine more than one sensor in the camera.
For example, a RGBD camera (Red, Green, Blue, Depth) may have two sensors for calculating depth, and another sensor for grabbing color images. Some of these cameras use infrared illuminators to help the depth sensors in low light situations.
The electronic circuitry transmits the video data from the camera to a host device. This can be through one of several physical paths. On the Jetson, this is the MIPI Camera Serial Interface (MIPI CSI) or the familiar USB. Third parties offer GMSL (Gigabit Multimedia Serial Link) connectors on Jetson carrier boards. GMSL allows longer transmission distances than the typical CSI ribbon cables by serializing/deserializing the video data stream with buffers. For example, you may see these types of connections in use in robots or automobiles.
GMSL camera connectors, image courtesy of Connect Tech
Data Compression and Transmission
Here’s where it starts to get interesting for us. Data is coming across the wire, how do we interpret it?
We talked about creating full color images. Typically we think about these as three channels of Red, Green and Blue (RGB). The number of bits in each of these channels determines how many “true” colors can be displayed. 8 bits per channel is common, though you may see 10 bits. In professional video, you will see higher numbers. The more bits, the more colors you can represent.
Let’s say it’s 8 bits per color channel, so that’s 24 bits (3 bytes) per pixel. If an image is 1920×1080 pixels, that’s 2,073,600 pixels × 3 bytes = 6,220,800 bytes per frame. At 30 frames per second, you get 186,624,000 bytes per second. Of course, if you are using 4K video then you get 4x that amount. Now, we love our pixel friends, but we don’t want to drown in them.
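The arithmetic is easy to play with yourself; here is the same back-of-the-envelope calculation in a few lines of Python:

width, height, bytes_per_pixel, fps = 1920, 1080, 3, 30
frame_bytes = width * height * bytes_per_pixel   # 6,220,800 bytes per frame
per_second = frame_bytes * fps                   # 186,624,000 bytes per second
print(f"{per_second / 1e6:.0f} MB/s of raw video")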
As I’m sure you have pointed out by now, we took a Bayer pattern image and expanded it. Certainly we could transmit the raw image itself along with an identifier indicating which pattern of colors is on the sensor! Of course we can! However, this forces the receiver to do the color conversion, which may not be an optimal solution.
Types of Data Compression
There are many ways to reduce the amount of image data being transmitted from a video stream. Generally this is done by:
Color space conversion
Lossless Compression
Lossy Compression
Temporal Compression
We won’t go too deeply into this subject here. Subsequent articles will cover the highlights as we get down the road. Entire industries are devoted to these subjects. However, if you have used cameras in the past you are probably already familiar with some of the names of the subjects here.
In color space conversion, YUV coding converts the RGB signal to an intensity component (Y) that ranges from black to white plus two other components (U and V) which code the color. This can be either a lossless or lossy approach. Lossless means that we can convert the image back to the original without any loss, lossy means that we will lose some of the data.
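As a sketch, here is the conversion for a single pixel using the common BT.601 coefficients (other standards use slightly different constants):

def rgb_to_yuv(r, g, b):
    # Y carries the intensity; U and V carry the color differences
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

print(rgb_to_yuv(255, 0, 0))   # pure red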
Then there is image compression. You are probably familiar with a PNG file, which uses lossless bitmap compression. A JPEG file is a lossy compression method based on a discrete cosine transform. In general, you can get up to a ~4x size reduction using lossless compression, whereas with lossy compression you can go much higher. The quality of the lossy compressed image may suffer, of course.
Temporal compression measures and encodes differences in the video stream images over time. Generally a frame is set as the key (keyframe), and differences are measured between subsequent frames from there. That way, you only need to send the one keyframe and then the differences. New keyframes are usually generated after a given interval, or generated on a scene change. For mostly static scenes, the size savings can be quite dramatic.
There are a wide variety of algorithms for this task, which is called encoding. The names of these encoders include H.264, H.265, VP8, VP9 and MJPEG. A matching decoder on the receiving end reconstructs the video.
fourcc
A four character identifier (fourcc) identifies how the video data stream is encoded. This is a throwback to the old Macintosh days where QuickTime built upon the Apple File Manager idea of defining containers with four characters. The four characters conveniently fit in a 32 bit word. Audio uses this method too.
Some of the fourcc codes are easy to guess, such as ‘H264’ and ‘H265’. ‘MJPG’ means that each image is JPEG encoded. Others are not so easy: ‘YUYV’ is fairly common, and is a packed format with ½ horizontal chroma resolution, also known as YUV 4:2:2. Part of this confusion is because manufacturers can register these format names. Also, over the years the same code may have acquired an alias on different platforms. For example, on the Windows platform ‘YUYV’ is known as ‘YUY2’.
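The packing itself is simple; this sketch produces the same value as OpenCV’s cv2.VideoWriter_fourcc helper:

def fourcc(code):
    # pack four ASCII characters into one little-endian 32-bit word
    a, b, c, d = (ord(ch) for ch in code)
    return a | (b << 8) | (c << 16) | (d << 24)

print(hex(fourcc("YUYV")))   # 0x56595559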
Conclusion
This has been an overview of cameras. There are multiple books and research articles on each of the subjects here. Hopefully this gives you a jumping off point for exploring when digging deeper into the subject.
In the next article, we will go over how to actually get the video stream into the Jetson!
Jetson Camera Coding is the associated code for the upcoming “In Practice” series on working with cameras. There are three repositories:
camera-caps
The JetsonHacks Github repository camera-caps provides a graphical user interface over the v4l2-ctl command line tool. You may find it handy for examining the capabilities of a V4L2 camera that is attached to your Jetson. This works for both CSI cameras and USB cameras. Here’s a beauty shot:
In the snapshot, there are three cameras running simultaneously from the camera-caps tool: a Logitech C920, a Stereolabs ZED camera and an Intel RealSense D435i. The RealSense is showing both an RGB stream and an infrared camera stream.
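If you want to see the raw information the GUI is built on, the underlying command looks like this (adjust the device node to match your camera):
$ v4l2-ctl --device=/dev/video0 --list-formats-ext
This prints every pixel format, frame size and frame rate the camera advertises.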
USB-Camera
USB-Camera is a Github repository with sample Python scripts for working with V4L2 cameras and the Jetson Development kits. The examples use OpenCV (included with JetPack) to capture the camera and display it on the screen. One example shows how to use the V4L2 camera front end to interface with the camera. Another example uses a GStreamer front end to interface with the camera. GStreamer is important in the Jetson ecosystem, as it provides the foundation for DeepStream for Intelligent Video Analysis (IVA).
The third example uses Haar cascades for face and eye detection. This is an example of how to grab video frames from a camera and process them.
The GStreamer pipeline has been streamlined for better frame rates. We also added exception handling to the Python code, along with some other cleanup, to make the code more robust.
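As a rough sketch of those two capture front ends (this is not the repository code; the device index and pipeline string are assumptions to adapt for your camera):

import cv2

# V4L2 front end: open the first camera directly
camera = cv2.VideoCapture(0, cv2.CAP_V4L2)

# GStreamer front end: the commented pipeline below does the same job,
# letting GStreamer handle format negotiation and conversion
# pipeline = ("v4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720 "
#             "! videoconvert ! video/x-raw,format=BGR ! appsink")
# camera = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)

while True:
    grabbed, frame = camera.read()
    if not grabbed:
        break
    cv2.imshow("USB Camera", frame)
    if cv2.waitKey(1) & 0xFF == 27:   # press ESC to quit
        break

camera.release()
cv2.destroyAllWindows()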