Product: Anafi
Hey,
I have an Android app that renders the video from the livestream. So far so good. The issue is that when a photo capture is triggered, the onContentZoneChange callback reports a zero-size content zone, so my GL surface rect collapses to 0 and the video is corrupted for some time. All the while, playState says the camera is playing, onFrameReady keeps being called, and the streaming server also reports that it is enabled.
Am I missing some step that I need to take when I have a live stream view and trigger a picture? How do I know when the renderer is not in a healthy state?
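For now, the only mitigation I can think of on the callback side is to ignore the transient empty zone and keep the last valid one. A minimal sketch of what I mean, assuming the callback hands me an android.graphics.Rect (contentZone and the surrounding handler are my own code, not GroundSDK API):

```kotlin
import android.graphics.Rect

// Inside my sink callback implementation; `contentZone` is my own field.
private val contentZone = Rect()  // last non-empty zone, applied later on the GL thread

override fun onContentZoneChange(zone: Rect) {
    // An empty zone shows up transiently around the photo trigger; keeping
    // the previous non-empty zone avoids collapsing my GL surface rect.
    if (!zone.isEmpty) {
        contentZone.set(zone)
    }
}
```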
Update: I see that GLRenderSink.Renderer.renderFrame() returns false when this bad state exists, but by then it is already too late: my framebuffer is bound and noise has been written into it. In theory I could render into a second framebuffer and discard it when the render fails, but that is suboptimal, and I still need to try it.
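In case it clarifies the idea, here is a rough sketch of that second-framebuffer approach. Everything here is my own naming and setup (EGL handling, resizing, error checks omitted); the only GroundSDK call is GLRenderSink.Renderer.renderFrame(), and the import path is how I believe the class is packaged:

```kotlin
import android.opengl.GLES30
import com.parrot.drone.groundsdk.stream.GLRenderSink

// Scratch-framebuffer guard: render off-screen first, and only copy the
// result on-screen when renderFrame() reports success.
class GuardedFrameDrawer(private val width: Int, private val height: Int) {

    private val fbo = IntArray(1)
    private val tex = IntArray(1)

    // Call once on the GL thread after the surface is created.
    fun init() {
        // Off-screen color attachment the sink renders into.
        GLES30.glGenTextures(1, tex, 0)
        GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, tex[0])
        GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA, width, height,
                0, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null)
        GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_LINEAR)
        GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D, GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR)

        GLES30.glGenFramebuffers(1, fbo, 0)
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, fbo[0])
        GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
                GLES30.GL_TEXTURE_2D, tex[0], 0)
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0)
    }

    // Render into the scratch FBO; noise never reaches the display.
    fun drawFrame(renderer: GLRenderSink.Renderer) {
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, fbo[0])
        val ok = renderer.renderFrame()
        GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0)
        if (ok) {
            // Copy the good frame to the default (on-screen) framebuffer.
            GLES30.glBindFramebuffer(GLES30.GL_READ_FRAMEBUFFER, fbo[0])
            GLES30.glBindFramebuffer(GLES30.GL_DRAW_FRAMEBUFFER, 0)
            GLES30.glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                    GLES30.GL_COLOR_BUFFER_BIT, GLES30.GL_NEAREST)
            GLES30.glBindFramebuffer(GLES30.GL_READ_FRAMEBUFFER, 0)
        }
        // On failure the previous on-screen frame is simply kept.
    }
}
```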