Trouble decoding RTSP stream using NVIDIA's GStreamer hardware decoders


Drone: Anafi USA
GStreamer: 1.14.5
Board: Jetson Xavier NX
Jetpack 4.6.3

I have been fighting this problem for quite a while now. I am running on a Jetson Xavier NX, and my goal is to ingest the RTSP video stream, decode the H.264, and re-encode it to H.265 with GStreamer to rebroadcast on a separate network. The main reason is that H.265 provides higher quality at a lower bitrate, which is necessary for our application.

I have had no issue rebroadcasting the RTSP stream as-is using this pipeline:

gst-launch-1.0 rtspsrc location=rtsp:// ! rtpjitterbuffer latency=100 ! queue ! udpsink buffer-size=2000000000 host= multicast-iface=eth1 ttl=10 auto-multicast=true port=5600 sync=false

And decoding using avdec_h264 is not an option because it is too slow.

The pipeline that I want to use is as follows:

gst-launch-1.0 rtspsrc location=$RTSP_STREAM ! rtph264depay ! queue leaky=1 ! h264parse ! nvv4l2decoder ! queue2 ! nvv4l2h265enc ! h265parse ! rtph265pay ! queue ! udpsink buffer-size=2000000000 host= multicast-iface=eth1 ttl=10 auto-multicast=true port=5600 sync=false

I was able to verify that this works using a third-party test RTSP stream, but it does not work at all with the Parrot.
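One way I can think of to narrow this down (a sketch; dump.h264 is a placeholder filename) is to take the hardware decoder out of the loop and dump the depayloaded elementary stream to a file, then check whether that file decodes with a software decoder on another machine:

```shell
# Capture the raw H.264 elementary stream from the drone, no decoding involved
gst-launch-1.0 rtspsrc location=$RTSP_STREAM ! rtph264depay ! h264parse ! \
  'video/x-h264,stream-format=byte-stream' ! filesink location=dump.h264
```

If the dump plays back fine elsewhere, the RTSP transport and depayloading are healthy, and the problem lies in how nvv4l2decoder handles this particular bitstream.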

I saw earlier posts mentioning missing IDR frames and SPS/PPS issues that could be causing this.
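For example, those posts suggest having h264parse re-insert the parameter sets in-band via its config-interval property. A sketch of that variant, cut down to just the decode step so failures are easier to attribute (I have not confirmed this fixes the Anafi stream):

```shell
# config-interval=-1 re-sends SPS/PPS ahead of every IDR frame;
# fakesink isolates the decoder from the encode/broadcast stages
gst-launch-1.0 rtspsrc location=$RTSP_STREAM ! rtph264depay ! \
  h264parse config-interval=-1 ! nvv4l2decoder ! fakesink sync=false
```

Note that if the stream contains no IDR frames at all, re-inserting SPS/PPS alone would not help.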

Please let me know if there is any workaround here; I have tried many different versions and decoders, but all run into the same issue.


I see that you are running JetPack 4.6.3. There have been a lot of improvements in 5.0.2, which also supports the NX.

The Jetson Multimedia API package provides low level APIs for flexible application development.

Camera application API: libargus offers a low-level frame-synchronous API for camera applications, with per-frame camera parameter control, multiple (including synchronized) camera support, and EGL stream outputs. RAW-output CSI cameras needing ISP can be used with either libargus or the GStreamer plugin. In either case, the V4L2 media-controller sensor driver API is used.

Sensor driver API: V4L2 API enables video decode, encode, format conversion and scaling functionality. V4L2 for encode opens up many features like bit rate control, quality presets, low latency encode, temporal tradeoff, motion vector maps, and more.
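As an illustration of those encode controls (a sketch only; the exact property set on the Jetson nvv4l2h265enc element varies by JetPack release, and the resolution here is an arbitrary assumption), bitrate control can be exercised straight from gst-launch:

```shell
# Synthetic 720p test source, converted into NVMM memory for the HW encoder,
# with a 2 Mbit/s target bitrate set on the encoder
gst-launch-1.0 videotestsrc num-buffers=300 ! \
  'video/x-raw,width=1280,height=720,framerate=30/1' ! nvvidconv ! \
  'video/x-raw(memory:NVMM)' ! nvv4l2h265enc bitrate=2000000 ! \
  h265parse ! fakesink
```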

JetPack 5.0.2 Camera highlights include:

  • Argus support for YUV444 and Linear RGB888 output format for Jetson AGX Xavier and Jetson Xavier NX.
  • HDR support for Jetson AGX Orin. Digital Overlap (DOL) mode and Piecewise Linear (PWL) are supported.
  • Support for Error Resiliency on Jetson AGX Orin.
  • New Samples:
    • argus_demosaicOutput to demonstrate the CVOutput capability.
    • argus_rawBayerOutput to demonstrate raw capture using argus with options available to enable/disable 3A/ISP to converge sensor exposure settings.
    • argus_userAlternatingAutoExposure to demonstrate the captures using alternating exposure.
    • argus_yuvOneshot now supports YUV444 format along with YUV420 format.

I forgot to add to my post that I had originally tried everything on 5.0.2 before downgrading. The rest of our systems run on 4.6.3, so I assumed downgrading would solve my issues. Unfortunately, the JetPack version was not the problem.


My guess is that the issue is the absence of IDR frames in the stream: to improve the stability of the video bitrate and lower the latency, we use intra-refresh, which means I-frames are replaced by P-frames only, with a periodic refresh pattern of I-slices.

SPS and PPS are repeated in the stream at each start of refresh, which is signalled using a recovery point SEI.
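One way to confirm this on the receiving side (a sketch; in H.264, NAL type 7 = SPS, 8 = PPS, and 6 = SEI) is to raise h264parse's debug level and watch for those NAL units arriving around each refresh:

```shell
# h264parse logs "processing nal of type ..." at LOG level (6)
GST_DEBUG=h264parse:6 gst-launch-1.0 rtspsrc location=$RTSP_STREAM ! \
  rtph264depay ! h264parse ! fakesink 2>&1 | grep -i "nal of type"
```

If SPS/PPS and SEI NALs appear periodically but no IDR (NAL type 5) ever does, that matches the intra-refresh behaviour described above.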

One way to solve the issue would be to force the decoder to start on the recovery point. When there is no way to start the decoder on a frame that is not an IDR, what we do in our player implementations is generate a fake gray IDR frame for the decoder.

This topic was automatically closed after 30 days. New replies are no longer allowed.