Anafi video via WebRTC (GStreamer webrtcbin)


I’m pretty new here and also pretty new to Anafi. Didn’t even check the SDK yet, shame on me :slight_smile:

I’m going to integrate the Anafi into an existing WebRTC infrastructure. So far I have only found Olympe 7.1, which would be an alternative way to go. The difficulty with that approach is handling all the WebRTC stack in Python.

For Python, the better approach is to use GStreamer. So I replaced the input of my existing WebRTC pipeline (USB camera, Raspberry Pi CSI camera) with an RTSP input. This worked perfectly with an old Chinese surveillance camera (SRICAM), but not with the Anafi.

My pipeline basically just relays the H.264 from the drone to the WebRTC platform. I got 99% of it running; only the video didn’t appear. My suspicion is that the profile-level-id of 4d4028 describes a video stream that is incompatible with what today’s browsers understand when decoding with OpenH264.
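For reference, the relay I have in mind looks roughly like this. This is only a sketch: the RTSP URL and payload type are assumptions on my side, and webrtcbin still needs signalling (SDP/ICE exchange) wired up in application code, so it will not run as a bare gst-launch line:

```shell
# Schematic relay: depayload the drone's H.264 and re-payload it for
# webrtcbin, without transcoding. Signalling is NOT handled here.
gst-launch-1.0 \
  rtspsrc location=rtsp://192.168.42.1/live latency=100 ! \
  rtph264depay ! h264parse ! \
  rtph264pay pt=96 ! \
  'application/x-rtp,media=video,encoding-name=H264,payload=96' ! \
  webrtcbin name=sendrecv
```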

My question is: Is there any means via GStreamer rtspsrc to alter some internal parameter of the H.264 encoder of the Anafi?

Needless to say, a transcoding approach works, but it adds latency (the Anafi RTSP stream is already far from competing with what DJI does; I’m not even talking about sub-second latency, I’m happy with 2-3 seconds) and it cannot run on a Pi or some other edge computer, since the load is too high for those.
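For completeness, the transcoding variant looks roughly like this (a sketch; the element choices such as x264enc and the bitrate are assumptions, and the decode/encode stage in the middle is exactly what adds the latency and CPU load):

```shell
# Transcode instead of relaying: full decode + re-encode.
# The avdec_h264 -> x264enc stage is what overloads a Pi.
gst-launch-1.0 \
  rtspsrc location=rtsp://192.168.42.1/live ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! \
  x264enc tune=zerolatency speed-preset=ultrafast bitrate=2000 ! \
  rtph264pay config-interval=1 pt=96 ! \
  udpsink host=127.0.0.1 port=5000
```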

One step further: I no longer think it is related to the profile-level-id. It must have something to do with the encapsulation. Usually I expect to see RTP packets containing H.264 NAL units. This time there seems to be another encapsulation on top… At least a Wireshark trace trying to decode the payload as RTP/H.264 shows complete nonsense compared to a recording from the SRICAM.

To be continued

OK, maybe somebody can explain to me what I’m seeing here. I’m expecting RTP H.264 NALs, but they don’t look the way I’m used to. I’m sure the stream itself is OK, since a “full” GStreamer pipeline can decode and display it, but for a “simple” relay of the H.264 there must be something more going on, and I don’t know what yet.

Running this command from my PC vs. the Anafi and recording the traffic:

`gst-launch-1.0 rtspsrc location=rtsp:// ! fakesink`

This triggers the drone and, after some back and forth, it is really playing. Stopping after 10 seconds and looking into Wireshark, I don’t get it: I see the RTSP protocol packets, but the H.264 packets seem to be … multiplexed? Wrapped into a higher NAL unit? How do I decode this?
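One possible explanation for the “wrapped” look: per RFC 6184, NAL units larger than the MTU (the coded slices) are split into FU-A fragments (NAL type 28), while small units like SPS/PPS travel as plain single-NAL packets. A camera that keeps its slices small never needs FU-A, which would explain the difference to the SRICAM. A hedged sketch for checking the first payload bytes (the function name is mine):

```python
# Classify the NAL layout of an RTP/H.264 payload per RFC 6184.
# `payload` is the RTP payload, i.e. everything after the 12-byte RTP header.
NAL_NAMES = {5: "IDR slice", 7: "SPS", 8: "PPS", 24: "STAP-A", 28: "FU-A"}

def classify_rtp_h264(payload: bytes) -> str:
    nal_type = payload[0] & 0x1F          # low 5 bits of the first byte
    if nal_type == 28:                    # FU-A: real type is in the FU header
        start = bool(payload[1] & 0x80)   # S bit marks the first fragment
        inner = payload[1] & 0x1F
        return f"FU-A fragment of {NAL_NAMES.get(inner, inner)} (start={start})"
    return NAL_NAMES.get(nal_type, f"single NAL, type {nal_type}")

# An FU-A start fragment carrying an IDR slice:
print(classify_rtp_h264(bytes([0x7C, 0x85, 0x00])))  # FU-A fragment of IDR slice (start=True)
# A bare SPS packet, as the SRICAM sends it:
print(classify_rtp_h264(bytes([0x67, 0x42])))        # SPS
```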

Doing the same with the SRICAM on another network gives what I know: SPS and PPS as separate packets.

`gst-launch-1.0 rtspsrc location=rtsp://`

Unfortunately I’m not allowed to post more than one image per post yet, but it can be seen.

To avoid being misunderstood: I’m sure the drone’s transmissions are fine, but a pipeline that just does rtph264depay followed by rtph264pay again fails on the far end. The browser cannot cope with it.
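One thing worth trying on such a depay/pay relay (an assumption on my side, not verified against the Anafi): browsers usually need SPS/PPS repeated in-band, while an RTSP source may announce them only once in the SDP (sprop-parameter-sets). Both h264parse and rtph264pay have a config-interval property to re-insert them periodically:

```shell
# Re-send SPS/PPS in-band so a browser can start decoding mid-stream.
# config-interval=-1 means "with every IDR frame".
gst-launch-1.0 \
  rtspsrc location=rtsp://192.168.42.1/live ! \
  rtph264depay ! h264parse config-interval=-1 ! \
  rtph264pay config-interval=-1 pt=96 ! \
  udpsink host=127.0.0.1 port=5000
```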

Could anybody please enlighten me?

Even though I’m just talking to myself here, I have created a little video, which shows the difference in payload between SRICAM and ANAFI.

What concerns me most is the complete absence of any IDR NAL unit in the payload…
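To verify that suspicion, one can dump the depayloaded stream to a file (e.g. with a filesink after rtph264depay ! h264parse) and count the NAL unit types in the Annex-B bytestream. A small sketch (the helper function is mine):

```python
# Scan an Annex-B H.264 bytestream (e.g. dumped via
# "rtph264depay ! h264parse ! filesink") and count NAL unit types.
# If type 5 (IDR) never appears, a browser has no clean entry point.
from collections import Counter

def count_nal_types(stream: bytes) -> Counter:
    counts = Counter()
    i = 0
    while True:
        i = stream.find(b"\x00\x00\x01", i)
        if i == -1 or i + 3 >= len(stream):
            break
        counts[stream[i + 3] & 0x1F] += 1  # NAL type = low 5 bits after start code
        i += 3
    return counts

# Tiny synthetic example: SPS (7), PPS (8) and an IDR slice (5)
sample = (b"\x00\x00\x00\x01\x67"
          b"\x00\x00\x00\x01\x68"
          b"\x00\x00\x00\x01\x65")
print(count_nal_types(sample))  # types 7, 8 and 5, once each
```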

Solved in

This topic was automatically closed after 30 days. New replies are no longer allowed.