RTSP streaming and photo/video capture in parallel seem not to be possible

As of now I have only observed this in the simulator, but with everything I’ve seen so far I have no doubt I will see it with a real drone too:

Status:

  • I’m consuming the drone’s RTSP stream for transmission via WebRTC
  • In parallel I’m using the SDK to control the drone’s flight

Observation:

  • Whenever I issue some photo/video-related command in the remote control app for the drone, the RTSP connection breaks. I don’t even have to actually take a photo - even a mode change (photo->video, video->photo) does the trick.

I have some understanding for this behaviour, even though I would like it much better if the SDK-controlled attempts just returned an error or so instead of shooting down RTSP.

Please confirm/deny

Confirmed with a real drone.

Trying to take a photo while obtaining video via RTSP ends in an error for the photo call, but doesn’t cut off RTSP. Switching from photo mode to video mode seems to change nothing; the reverse switch cuts off RTSP.

What does this mean?

Basically you can’t have live video from the drone via the RTSP route and, at the same time, unproblematic access to the photo and video recording capabilities of the drone. This limitation is not fully understandable to me, tbh.

I have some small hope that taking photos and videos on the drone is possible if the video is obtained via Olympe using e.g. the YUV callback, but I have no evidence since I haven’t tested it yet. I just know that it seems to work, judging from the Anafi iOS app.

I was trying to pause the RTSP stream while taking photos, but the Anafi RTSP server seems not to handle the PAUSED->PLAYING transition properly.

0:00:28.183523582 20306 0xa7676030 WARN rtspsrc gstrtspsrc.c:6691:gst_rtspsrc_send:<rtspsrc0> got NOT IMPLEMENTED, disable method PLAY

I can visibly pause the RTSP video with self.pipeline.set_state(Gst.State.PAUSED). An attempt to recover from this with self.pipeline.set_state(Gst.State.PLAYING) fails, and the video that was paused remains frozen.
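For reference, a minimal sketch of that pause/resume attempt, assuming PyGObject and the same pipeline elements used later in this thread (the autovideosink at the end is just for illustration):

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(
        "rtspsrc location=rtsp://192.168.42.1/live ! "
        "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink"
    )
    pipeline.set_state(Gst.State.PLAYING)
    # ... around the photo shot:
    pipeline.set_state(Gst.State.PAUSED)   # works: the video visibly freezes
    # take the photo here ...
    pipeline.set_state(Gst.State.PLAYING)  # fails: rtspsrc gets "NOT IMPLEMENTED" for PLAY and the video stays frozen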

Hi,

In order to clarify this, here are a few explanations about the drone photo/video pipeline and its implications for the streaming:

  • The photo/video switch triggers a full reconfiguration of the sensor acquisition pipeline, inducing a change of aspect ratio (16:9 for video mode, 4:3 for photo mode). This is also the case for some (but not all) other capture parameters, as you might have seen.
  • These reconfigurations are seen by the streaming server as a complete “media” change: any active RTSP session now references an invalid media and is thus terminated. GroundSDK (and OpenFlight/FreeFlight) handles this by immediately restarting a new RTSP session on the “new” media. The “new” media will carry a different H.264 stream, with different parameter sets.
  • The list of parameters that trigger a pipeline reconfiguration differs between the simulator and an actual drone, due to the highly different code in those areas; that is why the behavior on the two platforms is not 100% identical.
  • If you only plan to capture one type of media, then once you have configured the drone for the required photo or video mode, you should not see any RTSP breakage (see the sketch right after this list).
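In Olympe terms that means selecting the capture mode once, right after connecting and before any streaming client attaches. A minimal sketch, assuming the drone IP and cam_id used elsewhere in this thread (“recording” is the video mode value of the camera feature):

    import olympe
    from olympe.messages import camera

    drone = olympe.Drone("192.168.42.1")
    drone.connect()
    # Select the mode for the whole session ("photo" or "recording") once,
    # BEFORE starting any RTSP client on rtsp://192.168.42.1/live
    assert drone(camera.set_camera_mode(cam_id=0, value="recording")).wait().success()
    # ... now start the streaming client; capture actions within this mode
    # should no longer invalidate the RTSP media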

Quick note on the photo mode: depending on the PhotoStreamingMode set in the camera config message, the streaming might be paused while a photo is acquired. This pause should be completely transparent to an RTSP client, and is unrelated to what I described previously.

Regards,
Nicolas.

Thanks for the clarification. That’s a pretty heavy re-configuration, indeed.

Could you be a bit more specific on the quote above?

Does it mean that, if I configure the drone into photo or video mode before opening RTSP, then I shouldn’t notice any break if I later actually take photos or record videos? I haven’t tried yet.

Quick note on the photo mode: depending on the PhotoStreamingMode set in the camera config message, the streaming might be paused while a photo is acquired. This pause should be completely transparent to an RTSP client, and is unrelated to what I described previously.

I noticed this pause already in the FreeFlight app. That wouldn’t be a problem at all. It MUST not break RTSP, that’s the thing :-)

Other question: do these reconfiguration issues also affect the YUVSink interface? That would be an alternative way to do WebRTC for me, but I wouldn’t want to start that effort if I knew beforehand that a similar kind of outage/break is to be expected.

Does it mean that, if I configure the drone into photo or video mode before opening RTSP, then I shouldn’t notice any break if I later actually take photos or record videos?

Exactly: the drone should never trigger spurious configuration changes, so once you’re in the right mode, all should be stable.

Do these reconfiguration issues also affect the YUVSink interface?

The YUVSink interface is still an RTSP client under the hood, just not plugged into a renderer, so it will have the same behavior as any other client.

It MUST not break RTSP, that’s the thing

The fact that the RTSP stream is restarted when the camera is reconfigured is something that cannot easily be changed inside the drone, and this was not a design choice but mainly due to technical restrictions. The technical reasons are part of the firmware and of how the different components communicate, so I cannot elaborate much more on this point.
As I said in my previous message, this is something we worked around by having the “controller” auto-restart its RTSP client when the session is torn down by the drone. The solution is not ideal, but it’s the best we have for the current situation. It also has the upside of properly handling RTSP timeouts and Wi-Fi connection losses that might occur in difficult situations.

OK, unfortunate, but what would that mean in practice? Say I have set up and have been using a YUVSink. In another part of my app I’m playing with the camera settings. What happens to the YUVSink now? Does it just stop delivering frames and recover after the switch, or would I have to re-initialize the entire thing?

If I check your sample code, there are init, start and stop methods: olympe/src/olympe/doc/examples/streaming.py at 324a4b2744f6f024a2f0a3dfab2ddddb58b4566c · Parrot-Developers/olympe · GitHub

What would be the recommended procedure to not kill the YUV stream in case I want to switch the mode?
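For context, the init/start part of that example boils down to roughly this (my own sketch; the callback names follow the example, and the exact API may differ between Olympe versions):

    import olympe

    drone = olympe.Drone("192.168.42.1")
    drone.connect()

    def yuv_frame_cb(yuv_frame):
        # process/copy the frame here; don't keep references past a flush
        pass

    def flush_cb(stream):
        # drop any retained frames so pdraw can rebuild its pipeline
        return True

    drone.set_streaming_callbacks(raw_cb=yuv_frame_cb, flush_raw_cb=flush_cb)
    drone.start_video_streaming()
    # ... fly, change settings, etc. ...
    drone.stop_video_streaming()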

The fact that the RTSP stream is restarted when the camera is reconfigured is something that cannot easily be changed inside the drone, and this was not a design choice but mainly due to technical restrictions. The technical reasons are part of the firmware and of how the different components communicate, so I cannot elaborate much more on this point.

Understood

As I said in my previous message, this is something we worked around by having the “controller” auto-restart its RTSP client when the session is torn down by the drone.

Yes, but maybe you understand: it is your workaround. People like me coming from DJI have never seen such a problem with their drones, so my use case is completely different: I’m establishing a WebRTC connection, and part of this is connecting to the drone using GStreamer on a companion computer. The main assumption here is: this connection will never break unless some communication component breaks. So a break has then (and only then) to be followed by an immediate re-establishment of the entire WebRTC connection (which, btw, carries not just video but the entire data/telemetry too).

Now it turns out that the drone itself causes this break unexpectedly when the wrong keys are touched… sigh…

I just didn’t expect this and need to see how we can adapt the use case now and tell the user: you must know what kind of video you want to shoot before you connect to the drone. And it needs a separate connection for each kind of requirement… (which still needs to be proven)

So far I’m unable to confirm that setting the mode beforehand prevents the RTSP connection from being shot down at the moment a photo is taken.

Here are some steps to reproduce:

  • Console prepared to run:
    gst-launch-1.0 rtspsrc location=rtsp://192.168.42.1/live is-live=true connection-speed=3000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! fakesink

  • A function which sets up photo mode a short moment after successfully connecting to the drone

  1. Set up photo mode
    def set_photo_mode(self):
        ''' Set photo mode '''
        self.logger.info("set photo mode")
        assert(self.drone(camera.set_camera_mode(cam_id=0, value='photo')).wait().success())
        assert(self.drone(camera.set_photo_mode(
            cam_id=0,
            mode='single',
            format='full_frame',
            file_format='jpeg',
            burst='burst_4_over_1s',
            bracketing='preset_1ev',
            capture_interval=0
        )).wait().success())

This function runs OK.

  2. Start the RTSP client

  3. Take a photo (function taken from one of your samples)


    def take_photo(self):
        ''' Take photo '''
        self.logger.info("taking photo")
        photo_saved = self.drone(camera.photo_progress(result="photo_saved", _policy="wait"))
        self.drone(camera.take_photo(cam_id=0)).wait()
        if not photo_saved.wait(_timeout=30).success():
            assert False, "take_photo timed out"
        photo_progress_info = photo_saved.received_events().last().args
        print(photo_saved, photo_progress_info)
        media_id = photo_progress_info["media_id"]
        photo_count = photo_progress_info["photo_count"]
        print("Media ID {}, Photo count {}".format(media_id, photo_count))

The output of this function looks OK so far, but it also says that no photo has been taken (photo_count is 0).

{'Camera.Photo_progress': OrderedDict([('cam_id', None),
                                       ('result',
                                        <photo_result.photo_saved: 2>),
                                       ('photo_count', None),
                                       ('media_id', None),
                                       ('list_flags', None)])} OrderedDict([('cam_id', 0), ('result', <photo_result.photo_saved: 2>), ('photo_count', 0), ('media_id', '10000029'), ('list_flags', <list_flags_Bitfield: []>)])
Media ID 10000029, Photo count 0

This is reproducible. The RTSP video, which was running in the separate console until the shot was taken, is instead shot down:

Got EOS from element "pipeline0".
Execution ended after 0:00:10.906530244
Setting pipeline to NULL ...
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Unhandled error
Additional debug info:
../subprojects/gst-plugins-good/gst/rtsp/gstrtspsrc.c(6697): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Bad Request (400)
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not write to resource.
Additional debug info:
../subprojects/gst-plugins-good/gst/rtsp/gstrtspsrc.c(8242): gst_rtspsrc_close (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Could not send message. (Generic error)
Freeing pipeline ...

The gst-launch call has ended and control has returned to the console prompt.

Either I have misunderstood you, or it is not possible at all.

The only thing which is not consistent here: the media ID increases, yet even when no RTSP stream is running, no photo is made :-( That in fact makes me think that either the setup or the shot itself is wrong.

A few suggestions from my own struggles with the RTSP feed:

  • avdec_h264 seems to be the only GStreamer element that can directly handle the ANAFI’s video out of the box, without modification. This is unfortunate for embedded users, e.g. on a Raspberry Pi, who want to use a hardware decoder like omxh264dec or nvh264dec, which cannot handle this encoding out of the box. (I think it is specifically the “intra-refresh” technique that most of these decoders are not amenable to.) This leaves a developer with three main options: (1) transcode the video using avdec_h264 or PDrAW (easiest method; see the sketch after this list), (2) integrate Parrot’s PDrAW libraries directly into your application, or stick to an approach explicitly supported by GroundSDK such as the YUV callback (medium), or (3) write your own decoder (hard).

  • I agree that information is pretty hard to come by. I remember that with the old ARSDK3 drones like the Parrot Bebop 2 there was a way to get the SPS/PPS packets, and then in your own source code you could just inject those frames at whatever interval you wanted at the NAL boundaries. Back then Parrot actually provided usable C++ code examples for understanding this process. This is very similar to DJI: in the DJI Mobile SDK, a long time ago, there was a video data callback, but that stream did not include appropriate SPS/PPS/IDR and you had to inject them from another source. That was not well documented by DJI, but at least they provided some sample code you could muck through to figure it out. In my opinion it would be awesome if there were a simple web page in the GroundSDK or PDrAW documentation that plainly explained the specification of the video stream and the requirements for decoding it, so developers didn’t have to be video experts or reverse-engineer it. Or at least say “the video stream format is proprietary, you MUST use PDrAW and we won’t support any other method”, so I don’t waste hours trying to turn water into wine.

  • I have also observed the RTSP feed EOS in GStreamer when capturing a still photo. At least in my case I am able to work around it by restarting the stream in my application, since I am using libgstreamer in C++. I recognize that it’s not ideal, but maybe putting the gst-launch-1.0 line into a looping shell script or a systemd service might be enough to keep it going?
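For illustration, here is a sketch of option (1), transcoding with software re-encoding, assuming PyGObject; the element names and properties are standard GStreamer, but the exact tuning values are only a guess:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    # Decode with avdec_h264, then re-encode with periodic keyframes so that
    # downstream hardware decoders don't have to cope with intra-refresh
    pipeline = Gst.parse_launch(
        "rtspsrc location=rtsp://192.168.42.1/live ! rtph264depay ! h264parse ! "
        "avdec_h264 ! videoconvert ! x264enc tune=zerolatency key-int-max=30 ! "
        "h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5600"
    )
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()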

Thanks. I have meanwhile completely dropped the RTSP approach and am using the YUV callback instead. This leaves the entire RTSP mess in the responsibility of the SDK.


Hello,

Here is what happens with the video stream on ANAFI 4K, Thermal and USA products when camera commands are issued:

The video stream is encoded in H.264 and payloaded in RTP/AVP over UDP. RTSP over TCP is used for session management.

See the capture below, where a stream is established, then a video-to-photo mode switch occurs, then two photos are taken. You can observe this with Wireshark, for example.

The RTSP session (“5251864a3d82f97e” in the example below) is never stopped once established (SETUP request with a 200 OK response) until a TEARDOWN request is received, unless there is a network disconnection or an RTSP timeout.

The RTP stream stays on the same ports during the whole session (5004->55004 for RTP and 5005<->55005 for RTCP in the example below). The SSRC (synchronization source identifier), however, changes every time the photo/video pipeline is reset on the drone. This happens on many camera configuration changes (video recording framerate or resolution, switching between photo and video modes, photo trigger…).

On the H.264 side, a new sequence (new SPS/PPS) is started when a camera configuration change requires a change in the stream parameters: the streaming framerate, for example, is linked to the video recording framerate (24, 25 or 30 fps).

As Nicolas said, this is not entirely by choice but due to technical constraints of the hardware and software on the drone.

RTSP requests and responses

OPTIONS * RTSP/1.0
CSeq: 1
User-Agent: 0.0.0

RTSP/1.0 200 OK
CSeq: 1
Date: Thu, 28 Jul 2022 13:40:05 GMT
Public: DESCRIBE,SETUP,PLAY,PAUSE,TEARDOWN,GET_PARAMETER
Server: Parrot Streaming Server
Content-Length: 0



DESCRIBE rtsp://192.168.43.1/live RTSP/1.0
CSeq: 2
User-Agent: 0.0.0
Accept: application/sdp

RTSP/1.0 200 OK
CSeq: 2
Date: Thu, 28 Jul 2022 13:40:05 GMT
Content-Type: application/sdp
Server: Parrot Streaming Server
Content-Length: 484
Content-Base: rtsp://192.168.43.1/live

v=0
o=- 17025693627984635348 1 IN IP4 192.168.43.1
s=live
i=AnafiThermal-C000101
c=IN IP4 0.0.0.0
t=0 0
a=tool:1.8.3
a=recvonly
a=type:broadcast
a=control:*
a=X-com-parrot-maker:Parrot
a=X-com-parrot-model:AnafiThermal
a=X-com-parrot-model-id:0919
a=X-com-parrot-serial:PI040445P29C000101
a=X-com-parrot-build-id:anafi-thermal-1.8.3
a=X-com-parrot-boot-id:693FE4DCABD4D4F3F958265498508C82
m=video 0 RTP/AVP 96
i=DefaultVideo
a=control:front
a=rtpmap:96 H264/90000



SETUP rtsp://192.168.43.1/live/front RTSP/1.0
CSeq: 3
Transport: RTP/AVP/UDP;unicast;client_port=55004-55005
User-Agent: 0.0.0

RTSP/1.0 200 OK
CSeq: 3
Date: Thu, 28 Jul 2022 13:40:05 GMT
Session: 5251864a3d82f97e;timeout=10
Transport: RTP/AVP/UDP;unicast;client_port=55004-55005;server_port=5004-5005;ssrc=F2752BF8;mode=PLAY
Server: Parrot Streaming Server
Content-Length: 0



PLAY rtsp://192.168.43.1/live RTSP/1.0
CSeq: 4
Session: 5251864a3d82f97e
Scale: 1.00
User-Agent: 0.0.0
Range: npt=now-

RTSP/1.0 200 OK
CSeq: 4
Date: Thu, 28 Jul 2022 13:40:05 GMT
Session: 5251864a3d82f97e;timeout=10
Scale: 1.00
RTP-Info: url=live/front;seq=0
Server: Parrot Streaming Server
Content-Length: 0



GET_PARAMETER rtsp://192.168.43.1/live RTSP/1.0
CSeq: 5
Session: 5251864a3d82f97e
User-Agent: 0.0.0

RTSP/1.0 200 OK
CSeq: 5
Date: Thu, 28 Jul 2022 13:40:09 GMT
Session: 5251864a3d82f97e;timeout=10
Server: Parrot Streaming Server
Content-Length: 0



[... GET_PARAMETER keepalives (CSeq 6 through 17) repeat every 4 seconds, each answered with an identical 200 OK ...]

TEARDOWN rtsp://192.168.43.1/live RTSP/1.0
CSeq: 18
Session: 5251864a3d82f97e
User-Agent: 0.0.0

RTSP/1.0 200 OK
CSeq: 18
Date: Thu, 28 Jul 2022 13:41:00 GMT
Session: 5251864a3d82f97e;timeout=10
Server: Parrot Streaming Server
Content-Length: 0

RTP SSRC

RTSP SETUP
 -> SSRC=0xF2752BF8
[...]
switch from video to photo mode
 -> SSRC=0xC85F112C
[...]
take picture
 -> SSRC=0x3C6F1C5C
[...]
take picture
 -> SSRC=0x3DF73EEA

I don’t want to argue with you, all the more since I have meanwhile abandoned the approach of using the video via RTSP from GStreamer, but do you mean the GStreamer stream should NOT break if a photo is taken? Then I have to disagree: it breaks.

I’m not saying the GStreamer stream should not break. I’m saying it most likely breaks because of the change in SSRC, which probably triggers stopping the GStreamer RTP element.

To make the GStreamer approach work, you would have to catch the EOS event somehow in the pipeline and restart the RTP element.
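A minimal sketch of that idea, assuming PyGObject and the gst-launch pipeline used earlier in this thread; for simplicity it rebuilds the whole pipeline instead of just the RTP element:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)

    PIPELINE_DESC = (
        "rtspsrc location=rtsp://192.168.42.1/live is-live=true connection-speed=3000 ! "
        "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! fakesink"
    )
    state = {"pipeline": None}  # keep a reference so the pipeline isn't garbage-collected

    def on_message(bus, message):
        # The drone resetting its photo/video pipeline surfaces here as EOS
        # (or an error); tear the client down and reconnect immediately
        if message.type in (Gst.MessageType.EOS, Gst.MessageType.ERROR):
            state["pipeline"].set_state(Gst.State.NULL)
            start_pipeline()

    def start_pipeline():
        pipeline = Gst.parse_launch(PIPELINE_DESC)
        bus = pipeline.get_bus()
        bus.add_signal_watch()
        bus.connect("message", on_message)
        pipeline.set_state(Gst.State.PLAYING)
        state["pipeline"] = pipeline

    start_pipeline()
    GLib.MainLoop().run()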

OK, then we are on the same page.

EDIT: My approach when I was using RTSP was to proactively tear down the RTSP connection and reconnect after the shot. But this was lame and unstable.

I’m now using the YUV callback, which is faster, and I don’t have to mess with RTSP on the outer line.

However, leaving streaming on while taking a shot mostly stops the video for just a few seconds with a visible re-sync, but in rare cases Parrot dumps tons of errors like this, and then it is a lottery whether the video will re-appear or not.

2022-08-02 11:44:00,023 [ERROR]         ulog - pdraw_source_coded_video - copyOutputFrame:527: getOutputMemory err=11(Resource temporarily unavailable)
2022-08-02 11:44:00,023 [ERROR]         ulog - pdraw_dmxstrm - processFrame:2821: StreamDemuxerNet#1#front: copyOutputFrame err=11(Resource temporarily unavailable)

In all these cases I have seen the “yuv flush” callback, so I can’t imagine I am holding ref’ed buffers, which might cause the memory shortage…
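For comparison, the buffer discipline in the official streaming example looks roughly like this (a sketch; ref/unref follow the Olympe example, and the queue stands in for whatever buffering the app does):

    import queue

    frame_queue = queue.Queue()

    def yuv_frame_cb(yuv_frame):
        yuv_frame.ref()  # keep the buffer alive outside the callback
        frame_queue.put_nowait(yuv_frame)

    def flush_cb(stream):
        # release every queued frame so pdraw can reclaim its buffer pool;
        # a missed unref here is a typical cause of getOutputMemory err=11
        while not frame_queue.empty():
            frame_queue.get_nowait().unref()
        return True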

But it is a Raspberry Pi with only 2 GB of total memory (even though htop is not alarming me)…

I have found no pattern so far. Right now I just pray each time I press the photo button…

Thank you for the detailed information; this matches what I see in my GStreamer pipeline when the photo/video pipeline is reset on the drone. My workaround is to simply catch the EOS event and restart the element, because I happen to be using GStreamer in C++.


Using the YUV sink API from olympe/pdraw is a good idea, because then all the RTP/RTSP handling and video decoding is done by libpdraw. As you observe, the YUV output is not affected by the drone pipeline restarts due to photo triggers, but there will still be a breakage when switching the video recording framerate or the photo/video mode, for example (anything that changes the stream resolution or framerate). The libpdraw implementation is optimized to reduce latency, which is why it is faster in your case.

The errors you see are likely a bug we have previously seen but have not been able to catch yet. On a Raspberry Pi you are using our ffmpeg-based video decoder implementation, which is exercised much less on our side than the Android/iOS HW decoder implementations.

As you observe, the YUV output is not affected by the drone pipeline restarts due to photo triggers,

There is an outage in delivery of about 2-4 seconds:

2022-08-08 17:02:16,011 [ERROR] ulog - pdraw_source_coded_video - copyOutputFrame:527: getOutputMemory err=11(Resource temporarily unavailable)
2022-08-08 17:02:16,017 [ERROR] ulog - pdraw_dmxstrm - processFrame:2821: StreamDemuxerNet#1#front: copyOutputFrame err=11(Resource temporarily unavailable)
2022-08-08 17:02:16,093 [ERROR] ulog - pdraw_source_coded_video - copyOutputFrame:527: getOutputMemory err=11(Resource temporarily unavailable)
2022-08-08 17:02:16,094 [ERROR] ulog - pdraw_dmxstrm - processFrame:2821: StreamDemuxerNet#1#front: copyOutputFrame err=11(Resource temporarily unavailable)
2022-08-08 17:02:16,096 [ERROR] ulog - pdraw_source_coded_video - copyOutputFrame:527: getOutputMemory err=11(Resource temporarily unavailable)
2022-08-08 17:02:16,097 [ERROR] ulog - pdraw_dmxstrm - processFrame:2821: StreamDemuxerNet#1#front: copyOutputFrame err=11(Resource temporarily unavailable)

Can I help to hunt that down?