NVIDIA Open-Sources TensorRT Library Components

Written by Michael Larabel in NVIDIA on 2 July 2019 at 12:37 PM EDT.
NVIDIA announced via their newsletter today that they have open-sourced components of their TensorRT library along with its associated plug-ins.

TensorRT is NVIDIA's flagship platform for deep learning inference, focused on running that inference on NVIDIA GPU hardware. TensorRT is built atop CUDA and provides a wealth of optimizations and other features.

The sources published at NVIDIA/TensorRT on GitHub do cover this C++ library, though they are limited to the plug-ins, the Caffe and ONNX parsers, and sample code. Building the open-source TensorRT code still depends upon the proprietary CUDA toolkit as well as other common build dependencies. But it's nice at least seeing the TensorRT code more open now than previously.
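For context on how the open-sourced pieces fit into a TensorRT workflow, below is a minimal sketch that loads a model through the ONNX parser (one of the now-open components) and hands it to the TensorRT builder, which still runs in the closed-source core. This is illustrative only and not taken from NVIDIA's samples: it assumes a TensorRT 6/7-era API (createNetworkV2, IBuilderConfig) plus an installed CUDA toolkit, and "model.onnx" is a placeholder path.

#include <cstdint>
#include <iostream>

#include "NvInfer.h"       // core TensorRT inference API (proprietary runtime)
#include "NvOnnxParser.h"  // ONNX parser, part of the open-sourced components

// Minimal logger required by the TensorRT API.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) noexcept override
    {
        // Only surface warnings and errors.
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    // Create a builder and an empty network definition (explicit-batch mode).
    nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);
    const uint32_t explicitBatch =
        1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    nvinfer1::INetworkDefinition* network = builder->createNetworkV2(explicitBatch);

    // Use the open-source ONNX parser to populate the network from a model file.
    nvonnxparser::IParser* parser = nvonnxparser::createParser(*network, gLogger);
    if (!parser->parseFromFile("model.onnx",
                               static_cast<int>(nvinfer1::ILogger::Severity::kWARNING)))
    {
        std::cerr << "Failed to parse ONNX model" << std::endl;
        return 1;
    }

    // Build an optimized engine; the optimization passes run in the closed-source core.
    nvinfer1::IBuilderConfig* config = builder->createBuilderConfig();
    nvinfer1::ICudaEngine* engine = builder->buildEngineWithConfig(*network, *config);
    if (!engine)
    {
        std::cerr << "Engine build failed" << std::endl;
        return 1;
    }
    std::cout << "Engine built with " << engine->getNbBindings() << " bindings" << std::endl;

    // Cleanup (older TensorRT releases use destroy(); newer ones also allow delete).
    engine->destroy();
    config->destroy();
    parser->destroy();
    network->destroy();
    builder->destroy();
    return 0;
}

Linking such a program still requires the proprietary libnvinfer alongside the open-source parser and plug-in libraries, which is the split described above.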