Install and Setup Prerequisites for NVIDIA Boards

Target Requirements

Hardware

MATLAB® Coder™ Support Package for NVIDIA® Jetson® and NVIDIA DRIVE® Platforms supports the following development boards:

  • NVIDIA Jetson Xavier™ NX.

  • NVIDIA Jetson AGX Xavier.

  • NVIDIA Jetson Nano.

  • NVIDIA Jetson TX2.

  • NVIDIA Jetson TX1.

  • NVIDIA DRIVE PX2.

The support package uses an SSH connection over TCP/IP to execute commands while building and running the generated code on the DRIVE or Jetson platforms. Connect the target platform to the same network as the host computer. Alternatively, use an Ethernet crossover cable to connect the board directly to the host computer.

Note

On the Windows® platform, open port 18735 in the Windows Firewall settings. This port is needed to establish a connection to the MATLAB server running on the embedded platforms.
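
Once the board and the host are on the same network, you can confirm that the SSH connection works by creating a hardware connection object in MATLAB. This is a minimal sketch; the device address and login credentials shown are placeholders for your board, and the drive function can be used instead of jetson for DRIVE platforms:

hwobj = jetson('192.168.1.15','ubuntu','ubuntu');  % device address, username, password for your board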

Software

  • Use the JetPack (NVIDIA) or the DriveInstall (NVIDIA) software to install the operating system image, developer tools, and the libraries required for developing applications on the Jetson or DRIVE platforms. You can use the Component Manager in the JetPack or the DriveInstall software to select the components to be installed on the target hardware. For installation instructions, refer to the NVIDIA board documentation. At a minimum, you must install:

    • CUDA® toolkit.

    • cuDNN library.

    • TensorRT library.

    • OpenCV library.

    • GStreamer library (v1.0 or higher) for deployment of the videoReader function.

    The MATLAB Coder Support Package for NVIDIA Jetson and NVIDIA DRIVE Platforms has been tested with the following JetPack and DRIVE SDK versions:

    Hardware Platform                              Software Version
    Jetson Xavier NX, AGX Xavier, TX2/TX1, Nano    JetPack 4.5.1
    DRIVE                                          DRIVE SDK 5.0.10.3-12606092

  • Install the Simple DirectMedia Layer (SDL v1.2) library, V4L2 library, and V4L2 utilities for running the webcam examples. You must also install the development packages for these libraries.

  • For deploying the Audio File Read Simulink® block, install the Sound eXchange (SoX) utility and its development and format libraries.

    For example, on Ubuntu®, use the apt-get command to install these libraries.

    sudo apt-get install libsdl1.2-dev v4l-utils sox libsox-fmt-all libsox-dev

Environment Variable on the Target

The support package uses environment variables to locate the necessary tools, compilers, and libraries required for code generation. Ensure that the following environment variables are set.

Variable Name      Default Value            Description
PATH               /usr/local/cuda/bin      Path to the CUDA toolkit executable on the Jetson or DRIVE platform.
LD_LIBRARY_PATH    /usr/local/cuda/lib64    Path to the CUDA library folder on the Jetson or DRIVE platform.

Ensure that the required environment variables are accessible from non-interactive SSH logins. For example, you can use the export command at the beginning of the $HOME/.bashrc shell configuration file to add the environment variables.

Example .bashrc File
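
A minimal example using the default CUDA locations listed in the table above; adjust the paths if CUDA is installed elsewhere on your board:

export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH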

Alternatively, you can set system-wide environment variables in the /etc/environment file. You must have sudo privileges to edit this file.

Example /etc/environment File
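
A minimal example; the system paths listed ahead of the CUDA entry are typical Ubuntu defaults and may differ on your image, and note that /etc/environment does not expand variables such as $PATH:

PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/cuda/bin"
LD_LIBRARY_PATH="/usr/local/cuda/lib64"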

Input Devices

  • Camera connected to the USB or CSI port of the target hardware.

  • USB audio device for recording and playback of audio signals.

Development Host Requirements

MathWorks Products

  • MATLAB (required).

  • MATLAB Coder (required).

  • GPU Coder™ (required for GPU targeting).

  • Parallel Computing Toolbox™ (required for GPU targeting).

  • Simulink (required for generating code from Simulink models).

  • Computer Vision Toolbox™ (recommended).

  • Deep Learning Toolbox™ (required for deep learning).

  • Embedded Coder® (recommended).

  • Image Processing Toolbox™ (recommended).

  • Simulink Coder (required for generating code from Simulink models).

  • GPU Coder Interface for Deep Learning Libraries support package (required for deep learning).

Third-Party Products

  • NVIDIA GPU enabled for CUDA.

  • CUDA toolkit and driver.

  • C/C++ Compiler.

  • CUDA Deep Neural Network library (cuDNN).

  • NVIDIA TensorRT – high performance deep learning inference optimizer and run-time library.

For information on the version numbers for the compiler tools and libraries, see Installing Prerequisite Products (GPU Coder). For information on setting up the environment variables on the host development computer, see Setting Up the Prerequisite Products (GPU Coder).
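
After installing the MathWorks and third-party products, you can optionally verify the setup from MATLAB with the GPU Coder environment check. This is a sketch, assuming a Jetson connection object hwobj created as shown earlier; the options enabled here are illustrative:

envCfg = coder.gpuEnvConfig('jetson');   % use 'drive' for DRIVE platforms
envCfg.BasicCodegen = 1;                 % check basic CUDA code generation and execution
envCfg.HardwareObject = hwobj;           % connection object for the target board
coder.checkGpuInstall(envCfg);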

Related Topics