Inferencing a DetectNet neural network on DeepStream 4.0

NVIDIA’s DeepStream SDK delivers a complete streaming analytics toolkit for AI-based video and image understanding.

To download and install NVIDIA DeepStream 4.0, follow the guidelines from:

My inference machine runs Ubuntu 18.04 with an NVIDIA GeForce RTX 2080, NVIDIA driver 418.87, and CUDA 10.1.

Follow the NVIDIA DeepStream 4.0 installation guide to install the required packages, such as GStreamer 1.14.1 and TensorRT 5.1.

For the DeepStream 4.0 installation I suggest using the deb package, which takes care of installing the DeepStream 4.0 libraries and GStreamer plugins in the correct paths.
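As a minimal sketch, the deb route looks like the following; the package filename is a placeholder, since it depends on the exact build you download from NVIDIA:

```shell
# Install the DeepStream 4.0 deb package with apt so dependencies are
# resolved automatically. The filename below is a placeholder for
# whatever NVIDIA's download page gives you.
sudo apt-get install ./deepstream-4.0_4.0-1_amd64.deb
```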

Once installation is complete, the NVIDIA DeepStream package should be located in /opt/nvidia/deepstream/deepstream-4.0.

Navigate to:


Create a directory parser_detectnet:

mkdir parser_detectnet

Navigate to:


Download the DetectNet parser plugin source and Makefile from my GitHub here and here.

Copy both files to /opt/nvidia/deepstream/deepstream-4.0/sources/libs/parser_detectnet and type make to compile the plugin.

After a successful compilation, a library file should be present in the directory.
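The build steps above can be sketched as a short shell sequence; the install path comes from this guide, but the parser source filename is a placeholder for whatever the linked repo actually names it:

```shell
# Sketch of the build steps, assuming the guide's DeepStream 4.0 paths.
# "nvdsparsebbox_detectnet.cpp" is a placeholder source filename.
DS_LIBS=/opt/nvidia/deepstream/deepstream-4.0/sources/libs
mkdir -p "$DS_LIBS/parser_detectnet"
cp nvdsparsebbox_detectnet.cpp Makefile "$DS_LIBS/parser_detectnet/"
cd "$DS_LIBS/parser_detectnet"
make
ls ./*.so   # the compiled parser library should be listed here
```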

For a quick inference test with DetectNet, you can use a GStreamer pipeline together with a standard DetectNet configuration file for DeepStream, available here:

gst-launch-1.0 filesrc location=<videoFile> ! qtdemux ! h264parse ! nvv4l2decoder ! m.sink_0 nvstreammux name=m batch-size=1 width=<VideoWidthSize> height=<VideoHeightSize> ! nvinfer config-file-path=detnet_config.txt batch-size=1 unique-id=1 ! nvvideoconvert ! nvdsosd ! nveglglessink
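As a concrete (hypothetical) example, suppose the input is a 1920x1080 H.264 file named sample_1080p_h264.mp4; both the filename and resolution below are placeholders for your own video. Assembling the pipeline in a variable first makes it easy to inspect before launching:

```shell
# Build the pipeline description as a string so it can be inspected.
# VIDEO, WIDTH, and HEIGHT are placeholders for your own input file.
VIDEO=sample_1080p_h264.mp4
WIDTH=1920
HEIGHT=1080
PIPELINE="filesrc location=$VIDEO ! qtdemux ! h264parse ! nvv4l2decoder \
 ! m.sink_0 nvstreammux name=m batch-size=1 width=$WIDTH height=$HEIGHT \
 ! nvinfer config-file-path=detnet_config.txt batch-size=1 unique-id=1 \
 ! nvvideoconvert ! nvdsosd ! nveglglessink"
echo "gst-launch-1.0 $PIPELINE"
# On a machine with DeepStream installed, launch it with:
# gst-launch-1.0 $PIPELINE
```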

That’s it!
