
GTC 2016 – Flying Robot Demo


GTC 2016 in San Jose last week was a great technical conference. I will be writing a few articles about some of the interesting things that I saw there. I want to thank everyone who stopped by and said hello while I was in the NVIDIA booth; it was great getting a chance to chat with some of the readers and viewers.

Most of the technical talks from the conference are online (or soon will be), which provides a wonderful opportunity to see what leaders in several fields are using GPUs for.

In this article, I will cover a demonstration that I showed. Looky here:

Background

If you’ve been following JetsonHacks for the past few weeks, you may have noticed that there were several articles about the DJI Matrice, which is a drone for developers. By attaching a product called Manifold, a Tegra K1 computer similar to the Jetson TK1, the developer can gather information from the drone sensors, such as the gimbaled camera, and use that information to control the drone through an onboard flight controller. A simple UART connects the Manifold to the flight controller, while the video feed from the camera comes through a special USB port which also allows pass-through to the flight controller. This arrangement allows the flight controller to broadcast the video back to a ground station. In this manner, the drone can act autonomously.

Demonstration

The actual demonstration is a simplistic interpretation of codes that the drone captures through the camera. The codes are known as AprilTags, or more formally as fiducial markers in computer science lingo. Each one of the tags represents a different code. For example, in the demonstration, when AprilTag code 12 was shown to the drone, the drone would take off; code 20 would move the drone to the right a little. In initial testing in the outdoor Drone Zone cage, apparently code 21 tells the drone to fly into the protective netting. Looky here:

In some sense this is to be expected. The drone was in a field for the trial runs, so it’s not surprising that it moved a little too far in a more enclosed space. Here’s the initial test for reference:

Development

Being a developer drone, the Matrice can be configured in many different ways. There are many mounting options for different components, and in some sense putting the Matrice together is like completing a jigsaw puzzle. The project started about three weeks before GTC.

About 14 days went by assembling the drone and waiting for the different bits and pieces to arrive (as well as making way too many trips to Fry’s Electronics to pick up various miscellaneous bits for the project).

DJI has several different software SDKs for the Matrice, including an Onboard SDK for the Manifold (there is also a version which supports ROS, the Robot Operating System). There are also SDKs for talking to the drone through the R/C controller from mobile devices, both iOS and Android. For the demonstration, the Onboard SDK was used.

After the Matrice was assembled and once the Manifold computer and camera were in order, work on the demonstration software started. The demo software was put together in about 5 days, with another couple of days spent debugging and testing.

The actual program itself is rather simple. Video is taken from the camera, which comes in as NV12, and is converted into a form which OpenCV can process. There is an open source AprilTags recognizer written in C++ on GitHub which is a port of the original Java program written by Professor Edwin Olson of the University of Michigan. Connecting the library code together and getting it to work was straightforward, probably the easiest part of the project.

The rest of the software development, not so much. As you might have guessed, getting the video from the camera and then changing the format was the first hurdle. Fortunately, some digging turned up example code to help with the process.
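That conversion boils down to a couple of OpenCV calls. Here is a minimal sketch, assuming a raw NV12 buffer from the camera and the C++ port’s TagDetector/TagDetection interface; it is not the demo code itself:

    // Sketch only: assumes the camera hands us a raw NV12 buffer of width x height
    // and that the C++ AprilTags port (TagDetector / TagDetection) is installed.
    #include <vector>
    #include <opencv2/opencv.hpp>
    #include <AprilTags/TagDetector.h>
    #include <AprilTags/Tag36h11.h>

    std::vector<AprilTags::TagDetection> detectTags(const unsigned char* nv12, int width, int height)
    {
        // NV12 is a full-resolution Y plane followed by interleaved half-resolution UV,
        // so the buffer is (height * 3 / 2) rows of 'width' bytes.
        cv::Mat yuv(height * 3 / 2, width, CV_8UC1, const_cast<unsigned char*>(nv12));
        cv::Mat bgr, gray;
        cv::cvtColor(yuv, bgr, cv::COLOR_YUV2BGR_NV12);   // a form OpenCV can display and process
        cv::cvtColor(bgr, gray, cv::COLOR_BGR2GRAY);      // the tag detector wants grayscale
        static AprilTags::TagDetector detector(AprilTags::tagCodes36h11);
        return detector.extractTags(gray);                // each detection carries a tag id and corners
    }

Each returned detection carries an id, which is the number the demo then maps to a flight command.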

The next hurdle was much more complex. To actually use the APIs for the flight controller, the developer needs to know something about how the quadcopter actually flies, things such as controlling the pitch, roll and yaw of the aircraft, controlling the throttle, little things like that. I had never flown a quadcopter before the start of the project, so that part was a little challenging. However, I understood from 3D graphics what the concepts were all about. Still, there are some nuances that were lost on me, such as when the Manifold needed to take flight control when executing commands. Some of the commands need the Manifold to have control, others do not. I’m sure a developer more experienced in the domain would have found it straightforward.
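To make that pattern concrete, here is a minimal sketch; obtainFlightControl(), releaseFlightControl() and sendAttitudeCommand() are hypothetical wrappers standing in for the corresponding Onboard SDK calls, not the actual API:

    // Hypothetical wrappers around the Onboard SDK calls, declared here only so the
    // sketch is self-contained; the real names and signatures live in the DJI SDK.
    bool obtainFlightControl();
    void releaseFlightControl();
    void sendAttitudeCommand(float roll, float pitch, float yaw, float throttle);

    // Movement commands only work while the Manifold holds flight control;
    // status and query style commands can be issued without it.
    bool runMovementCommand(float roll, float pitch, float yaw, float throttle)
    {
        if (!obtainFlightControl())
            return false;
        sendAttitudeCommand(roll, pitch, yaw, throttle);
        releaseFlightControl();
        return true;
    }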

There is also a concept of Modes which I found confusing. The R/C controller has a three-way switch on the left front labeled ‘P’, ‘A’, and ‘F’. For the Manifold to take control, the switch needs to be in the ‘F’ position. However, the actual Onboard SDK has the same modes, which it also controls, and thinking about that made my head hurt. Also, with DJI being a Chinese company, the documentation has some strange terms for which the meanings are not immediately obvious. There are some examples included with the SDK, but the examples seem oriented towards people who actually know what they are doing. I knew what I wanted the drone to do, I just didn’t have the mental map of how to tell the flight controller to do it.

Once the program was debugged for the most part and tested in the DJI Simulator (a very nice way to see what is going to happen in the field), there were a few more obstacles to overcome.

A quick note on the DJI Simulator. For the Matrice, the simulator only runs on a Windows box as of the time of this writing. I happen to have a Windows 8.1 box. When I tried to install the DJI Driver to talk to the Matrice over USB, installation always failed, with the helpful error message “Installation Failed”. No reason, no error code, just failed.

Having failed a lot in life, I found this easy to accept. But I also knew that if I persevered long enough, that square peg was going through that round hole come hell or high water. It turns out that the DJI driver was written as an ‘untrusted’ 32-bit driver. Starting in Windows 8, the kingdom banned any such behavior; only trusted 64-bit drivers can be installed on the Windows boxen. As in most things computers, special incantations can circumvent the rules of the kingdom, so the rogue DJI driver was installed and the DJI Simulator was able to come to life. You might guess that finding this information out and implementing the solution took about as long as reading this paragraph. Your guess is terribly wrong.
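For anyone staring at the same “Installation Failed” message, the incantation (at least on my Windows 8.1 box; the exact menu path varies a bit between Windows versions) is to reboot with driver signature enforcement turned off and then run the installer:

    Settings > Update and recovery > Recovery > Advanced startup > Restart now,
    then Troubleshoot > Advanced options > Startup Settings > Restart,
    then press 7 (or F7) for "Disable driver signature enforcement".

With enforcement off for that boot, the unsigned driver can be installed.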

The Manifold talks to the Matrice through both USB and UARTs, which require device permissions. This means that the program must have permissions set when the Manifold boots. Normally the command to execute the app would be placed in /etc/rc.local, but there was a rub.
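One way to handle the device-permission piece at boot is a udev rule rather than chmod commands in /etc/rc.local. A sketch, assuming the flight controller UART enumerates as one of the Tegra ttyTHS ports (check what it actually shows up as on your Manifold):

    # /etc/udev/rules.d/99-dji.rules -- illustrative; match the real device names on your system
    SUBSYSTEM=="tty", KERNEL=="ttyTHS*", MODE="0666"
    SUBSYSTEM=="tty", KERNEL=="ttyUSB*", MODE="0666"

Reload the rules with sudo udevadm control --reload-rules && sudo udevadm trigger (or just reboot), and the ports come up readable and writable without sudo.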

The application is built in Qt using OpenGL! Normally that’s not really a robot thing; one would run a much simpler user interface through a console. However, in this case a graphical user interface was used to expedite development and testing. It was easier to have a GUI to manipulate and try to replicate the actions in the application.

So what’s the problem with having it run as a GUI? First, in /etc/rc.local one is not assured that X11 (the base for the graphics system) has started. If X11 hasn’t started, the GUI-based program can’t run. A quick solution for this was to place the app in the ‘Startup Applications’ area.
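Under the hood, ‘Startup Applications’ just drops a .desktop file into the user’s autostart directory, so the equivalent by hand looks something like this (file name and paths are illustrative):

    # ~/.config/autostart/drone-demo.desktop -- file name and Exec path are illustrative
    [Desktop Entry]
    Type=Application
    Name=Drone Demo
    Exec=/home/ubuntu/drone-demo/launch.sh
    X-GNOME-Autostart-enabled=true

The entry only runs after the desktop session starts, which sidesteps the “is X11 up yet?” problem that /etc/rc.local has.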

The second issue is not as obvious. Once you unplug the HDMI monitor from a Manifold (or other X11 systems such as the Jetson), X11 sees at boot time that there is no monitor, decides there is no need for graphics, and does not load. That’s all fine and well most of the time, but in this case I needed graphics to run my app!

The workaround in my case was to buy an HDMI display emulator:

Headless Display Emulator @ Amazon

Not surprisingly, I had forgotten that the actual connector on the Manifold is mini-HDMI, so it was back in the car to Fry’s … They didn’t have quite what I was looking for, so I ordered this collection of goodness from Amazon:

The reason I’m telling you this is that you may find yourself in such a situation one day …

Note: There is a way to set the X11 config file to do this in software on the Tegra boards; I’ll write up a separate article on that tidbit. Still, the emulator can be a convenient solution.
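For the impatient, the software route on NVIDIA’s desktop X driver revolves around an xorg.conf option like the one below; whether the Tegra driver honors the same option is an assumption I’ll verify in that article:

    # /etc/X11/xorg.conf fragment -- sketch only; Tegra driver support is an assumption
    Section "Device"
        Identifier "NVIDIA Device"
        Driver     "nvidia"
        Option     "AllowEmptyInitialConfiguration" "true"
    EndSection

That option tells the driver to start X even when no display is detected.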

Anyway, after those niggles were behind me, it was time to go to the park for the first test. Things mostly worked. However, I will reveal that there was one nasty bug that took me a while to hunt down.

In my haste to build some tables, I neglected to put range checking in the lookup portion of the code. I know better, you know better, and yet still … I must say I admire that C++ does exactly what you tell it to do, even if it is a jump to a nonexistent memory location that causes the application to crash hard. Even better, when the app crashes and the app is connected to a robot, good times! Combine this with the fact that the app needs sudo permissions to actually run, making it very difficult to run in the traditional debugger/Qt IDE …
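In retrospect the guard is trivial; something along these lines (the tag ids and handler names are illustrative, not the demo code) is all the lookup needed:

    // Sketch only: a bounds-checked lookup from AprilTag id to a handler,
    // instead of indexing straight into a table with an unchecked id.
    #include <map>

    typedef void (*TagHandler)();
    void takeoffHandler();     // illustrative handlers, e.g. tag 12 = take off
    void moveRightHandler();   // tag 20 = drift a little to the right

    static const std::map<int, TagHandler> kTagHandlers = {
        { 12, takeoffHandler },
        { 20, moveRightHandler },
    };

    void handleTag(int tagId)
    {
        std::map<int, TagHandler>::const_iterator it = kTagHandlers.find(tagId);
        if (it == kTagHandlers.end())
            return;            // unknown tag: ignore it instead of jumping into the weeds
        it->second();
    }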

I will just say that I may have said a dirty word. Or maybe a couple. For like four hours straight, trying to find the issue. But at the end of the development period it mostly worked, as the video above showed. At the show, we spent most of our time in the NVIDIA booth hooked up to the simulator:

Flying Robot in NVIDIA booth

The lights at the show made everything look nice, and it was great talking to the readers that stopped by and said hello!
