

Nvidia has published the code for its parser and plug-ins for TensorRT on GitHub. The company hopes the developer community will enthusiastically contribute to the open-source project.

The code for both the parser and the plugins for TensorRT is now open source. Nvidia has put all the assets online in a GitHub repository. TensorRT is Nvidia's platform for AI inference. Since the introduction of the RTX series, the GPU manufacturer has been equipping its graphics cards with Tensor cores optimised for AI workloads. The published TensorRT code targets that inference step, not training.

Hardware first

The parser's job is to import pre-trained models into the accelerator for inference, while the plugins make it possible to link new operations or custom layers to the hardware; a brief sketch of how the parser is used in practice follows below. Nvidia naturally wants developers to embrace TensorRT and aims to make adoption as simple as possible. If they get on board and build their applications on TensorRT, those applications will ultimately run on Nvidia hardware. Hardware, not software, remains the company's core business.
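
To make the parser's role concrete, the following is a minimal sketch of importing a pre-trained ONNX model through TensorRT's Python API. The file name "model.onnx" is a placeholder and the exact calls vary between TensorRT versions, so treat this as an illustration rather than code taken from Nvidia's repository.

```python
# Illustrative sketch: parse a pre-trained ONNX model and build a TensorRT
# inference engine. Assumes the TensorRT Python bindings are installed and
# that "model.onnx" exists; both are placeholders for this example.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)

# The ONNX parser requires an explicit-batch network definition.
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # Report why the model could not be imported.
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

# Build a serialized engine from the parsed network (API of recent releases).
config = builder.create_builder_config()
engine = builder.build_serialized_network(network, config)
```

Custom operations that the parser does not recognise are where the open-sourced plugins come in: they let developers map such layers onto the hardware themselves.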

Incidentally, this is not the first time that Nvidia has made code open source. The company has more than 120 repositories online and actively contributes to open-source deep learning projects. The AI specialist invites the community to contribute to the code and plans to merge contributions with each new release.
