I’m building out an iOS application with a feature that relies on real-time access to the decoded YUV frames coming from the drone. I have found a couple of posts suggesting that this may not be possible on iOS (Ground SDK - access YUV / H264 frame sink) and others referencing some example code (possibly Android) that interacts with the frames themselves.
I am interested in example code going from the base GroundSdk object (e.g. via accessing Peripherals, Sinks, etc…) all the way to receiving frames and arriving at a pointer to a decoded YUV frame.
Is this available on iOS?
If not, I would also be interested in similar example code for Android.
Unfortunately, we don’t have any examples of this type. But we do have the HelloDrone example, which creates a livestream and may be a good start for what you want to do.
This example is available for both Android and iOS; please also take a look at the documentation for Android and iOS.
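For reference, here is a minimal sketch of that livestream setup on iOS, following the HelloDrone pattern of monitoring the StreamServer peripheral and playing the CameraLive stream (the wrapper class and property names are my own):

```swift
import GroundSdk

class StreamingModel {
    private var streamServerRef: Ref<StreamServer>?
    private var liveStreamRef: Ref<CameraLive>?

    /// Monitors the StreamServer peripheral and starts the camera live
    /// stream, as in the HelloDrone example.
    func startVideoStream(drone: Drone) {
        streamServerRef = drone.getPeripheral(Peripherals.streamServer) { [weak self] streamServer in
            guard let self = self, let streamServer = streamServer else { return }
            // Enable streaming on the drone.
            streamServer.enabled = true
            // Grab the live stream and play it as soon as it is available.
            self.liveStreamRef = streamServer.live { liveStream in
                if let liveStream = liveStream, liveStream.playState != .playing {
                    _ = liveStream.play()
                }
            }
        }
    }
}
```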
I think you should use RawVideoSink (Android / iOS).
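As a rough sketch of the wiring, assuming RawVideoSinkConfig takes a dispatch queue, a frame queue size, and a listener (please verify the exact initializer against the RawVideoSink reference for your GroundSdk version), you would open the sink on the stream obtained above:

```swift
import GroundSdk

/// Opens a raw video sink on a stream. The RawVideoSinkConfig initializer
/// shown here is an assumption; check RawVideoSink in your GroundSdk
/// version for the exact signature.
func openRawVideoSink(on stream: Stream, listener: RawVideoSinkListener) -> StreamSink? {
    let config = RawVideoSinkConfig(
        // Queue on which the listener callbacks are dispatched.
        dispatchQueue: DispatchQueue(label: "com.example.raw-frames"),
        // Maximum number of frames queued before old ones are dropped.
        frameQueueSize: 5,
        listener: listener)
    return stream.openSink(config: config)
}
```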
You should use copiedFrame.planes instead of copiedFrame.nativePtr to access the frame content in Swift.
You can find information about the video format (resolution, plane count, byte ordering of the planes in semi-planar mode, etc.) in the VideoFormat object passed to the didStart() function of your listener.
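Putting the two remarks above together, a listener sketch could look like the following. didStart() receiving the VideoFormat and the copiedFrame.planes access come from the replies above; the other method names, the RawVideoSinkFrame details, and processPlane are assumptions to check against your SDK headers:

```swift
import GroundSdk

class FrameListener: RawVideoSinkListener {

    func didStart(sink: RawVideoSink, videoFormat: VideoFormat) {
        // VideoFormat describes resolution, plane count and the byte
        // ordering of planes (e.g. Y plane + interleaved UV plane in
        // semi-planar mode); cache whatever your pipeline needs here.
        print("raw video sink started: \(videoFormat)")
    }

    func frameReady(sink: RawVideoSink, frame: RawVideoSinkFrame) {
        // Assumption: the delivered frame must be copied if it is used
        // outside this callback (hence `copiedFrame` in the reply above).
        let copiedFrame = frame.copy()
        // Read the YUV content through the planes, not the native pointer.
        for plane in copiedFrame.planes {
            processPlane(plane) // hypothetical: your own YUV consumer
        }
    }

    func didStop(sink: RawVideoSink) {
        print("raw video sink stopped")
    }
}
```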