Hello,
I am developing an application with GroundSDK Android that performs remote image processing on the drone's video stream. I want to receive the stream in my app and send each frame to a remote machine, without displaying anything on the device's screen. From my research, it seems that GsdkStreamView, the recommended way to access the stream, will not work unless it is displayed: as I understand it, its capture() method does a pixel copy of the on-screen layout, so if the view is not displayed there is no underlying image to retrieve. As far as I can tell, that leaves only two options.
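For context, this is roughly how I am starting the stream today, following the standard GroundSDK pattern (error handling omitted; the class and method names around it are mine). The GsdkStreamView bound here is exactly the part I would like to eliminate:

```java
import com.parrot.drone.groundsdk.Ref;
import com.parrot.drone.groundsdk.device.Drone;
import com.parrot.drone.groundsdk.device.peripheral.StreamServer;
import com.parrot.drone.groundsdk.device.peripheral.stream.CameraLive;
import com.parrot.drone.groundsdk.stream.GsdkStreamView;

public class StreamStarter {

    private Ref<StreamServer> streamServerRef;
    private Ref<CameraLive> liveStreamRef;

    /** Starts the drone's live stream and renders it into the given view. */
    public void startStream(Drone drone, GsdkStreamView streamView) {
        streamServerRef = drone.getPeripheral(StreamServer.class, streamServer -> {
            if (streamServer != null) {
                streamServer.enableStreaming(true);
                liveStreamRef = streamServer.live(liveStream -> {
                    if (liveStream != null) {
                        // This is the view dependency I want to get rid of.
                        streamView.setStream(liveStream);
                        if (liveStream.playState() != CameraLive.PlayState.PLAYING) {
                            liveStream.play();
                        }
                    }
                });
            }
        });
    }
}
```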
Option 1: use the internal YUVFrameSink class as the stream sink. This appears to do exactly what I want, except that the onFrame() callback delivers a native pointer to an internal C++ struct (sdkcore_frame). I am unfamiliar with JNI, so I have no idea whether there is an easy way to read that pointer into Java as a byte array. If anyone knows how, please let me know!
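To make the question concrete, here is the kind of JNI bridge I imagine would work: wrap the frame's pixel data in a direct ByteBuffer on the native side via JNI's NewDirectByteBuffer, then copy it out in Java. Everything here is hypothetical on my part (FrameBridge, libframebridge, and the accessors for sdkcore_frame are my assumptions, since the struct is internal to GroundSDK):

```java
import java.nio.ByteBuffer;

// Hypothetical bridge: FrameBridge and libframebridge.so are my own names.
// The native side would need access to sdkcore_frame's data pointer and
// length (internal to GroundSDK), then essentially a one-liner such as:
//     return (*env)->NewDirectByteBuffer(env, frameData, frameLen);
public final class FrameBridge {

    static {
        System.loadLibrary("framebridge"); // hypothetical native library
    }

    /** Wraps the native frame's pixel data in a direct ByteBuffer (no copy). */
    public static native ByteBuffer wrapFrame(long sdkcoreFrameHandle);

    /** Copies the frame into a Java byte[] while the native frame is still valid. */
    public static byte[] toByteArray(long sdkcoreFrameHandle) {
        ByteBuffer direct = wrapFrame(sdkcoreFrameHandle);
        byte[] out = new byte[direct.remaining()];
        direct.get(out);
        return out;
    }
}
```

The copy is there because I assume the sdkcore_frame is only valid for the duration of onFrame(), so the data has to leave native memory before the callback returns. Is something along these lines feasible?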
Option 2: use the RTSP stream served by the drone (rtsp://192.168.42.1/live). I have experimented with this method, but its performance is far worse than the internal GroundSDK stream: the latency is enormous, up to a second at times, so the video lags well behind real time. That is unacceptable for my project. If someone knows how to get this latency under 100 ms, please let me know!
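For what it's worth, the only lever I have found so far on the RTSP path is shrinking the player's receive buffer. Here is a sketch of what I have been trying with libVLC for Android; the 150 ms figure, the option set, and the class name are guesses of mine, not validated values:

```java
import android.content.Context;
import android.net.Uri;

import org.videolan.libvlc.LibVLC;
import org.videolan.libvlc.Media;
import org.videolan.libvlc.MediaPlayer;

import java.util.ArrayList;

public final class LowLatencyRtsp {

    /** Opens the drone's RTSP stream with a reduced jitter buffer. */
    public static MediaPlayer open(Context context) {
        ArrayList<String> options = new ArrayList<>();
        options.add("--network-caching=150"); // buffer in ms, far below the default

        LibVLC libVlc = new LibVLC(context, options);

        Media media = new Media(libVlc, Uri.parse("rtsp://192.168.42.1/live"));
        media.addOption(":network-caching=150"); // per-media override, same knob

        MediaPlayer player = new MediaPlayer(libVlc);
        player.setMedia(media);
        media.release(); // the player keeps its own reference
        player.play();
        return player;
    }
}
```

Even with the buffer this small, I still see nothing close to 100 ms, so I suspect the buffering is not the whole story.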
If anyone at Parrot could offer some guidance on how to proceed, I would be very grateful. Thank you for your time and your help!