SDK platform: [iOS]
I am having the hardest time figuring out something in the iOS SDK.
For a research project at school we are using the Mambo FPV drone, and we want to use information extracted externally from its video stream to control the drone.
We are building an iOS app based on the Sample SDK. The goal is to convert each frame of the H.264 video stream, which arrives as an ARCONTROLLER_Frame_t, into a UIImage so that we can detect rectangles/faces in it using Apple's Vision framework.
We have been struggling with this specific part for a long time, and we were hoping you could shed some light on it. Do you have any sample code for converting an ARCONTROLLER_Frame_t to a UIImage, so that we can move on to the detection part? In short: we are extending the iOS SDKSample and integrating the Vision framework to detect rectangles/faces in the video stream, but we are unable to convert the frames into usable UIImages.
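For reference, this is roughly the direction we have been attempting, in case it clarifies where we are stuck. Everything here is our own sketch, not from the Parrot SDK: `H264FrameDecoder`, its method names, and the assumption of one Annex-B NAL unit per frame are all ours. The idea is to build a VideoToolbox decompression session from the SPS/PPS delivered by the SDK's decoder-configuration callback, rewrite the Annex-B start code of each ARCONTROLLER_Frame_t payload into an AVCC length prefix, decode synchronously, and wrap the resulting pixel buffer in a UIImage:

```swift
import UIKit
import VideoToolbox
import Vision

/// Hypothetical helper (not part of the Parrot SDK). Assumes each
/// ARCONTROLLER_Frame_t carries a single Annex-B NAL unit in .data/.used,
/// and that SPS/PPS were captured from the SDK's decoder-config callback.
final class H264FrameDecoder {
    private var formatDesc: CMVideoFormatDescription?
    private var session: VTDecompressionSession?

    /// Call once with the raw SPS and PPS (without start codes).
    func configure(sps: [UInt8], pps: [UInt8]) -> Bool {
        var desc: CMVideoFormatDescription?
        let status = sps.withUnsafeBufferPointer { spsBuf -> OSStatus in
            pps.withUnsafeBufferPointer { ppsBuf -> OSStatus in
                var pointers = [spsBuf.baseAddress!, ppsBuf.baseAddress!]
                var sizes = [sps.count, pps.count]
                return CMVideoFormatDescriptionCreateFromH264ParameterSets(
                    allocator: kCFAllocatorDefault,
                    parameterSetCount: 2,
                    parameterSetPointers: &pointers,
                    parameterSetSizes: &sizes,
                    nalUnitHeaderLength: 4,
                    formatDescriptionOut: &desc)
            }
        }
        guard status == noErr, let d = desc else { return false }
        formatDesc = d
        var s: VTDecompressionSession?
        VTDecompressionSessionCreate(
            allocator: nil, formatDescription: d, decoderSpecification: nil,
            imageBufferAttributes: nil, outputCallback: nil,
            decompressionSessionOut: &s)
        session = s
        return s != nil
    }

    /// Converts one frame (bytes = ARCONTROLLER_Frame_t.data, length = .used).
    func decodeToUIImage(bytes: UnsafePointer<UInt8>, length: Int) -> UIImage? {
        guard let desc = formatDesc, let session = session, length > 4 else {
            return nil
        }
        // Replace the 00 00 00 01 start code with a big-endian NALU length
        // (AVCC layout, which is what VideoToolbox expects).
        var avcc = [UInt8](UnsafeBufferPointer(start: bytes, count: length))
        withUnsafeBytes(of: UInt32(length - 4).bigEndian) {
            avcc.replaceSubrange(0..<4, with: $0)
        }

        var blockBuffer: CMBlockBuffer?
        CMBlockBufferCreateWithMemoryBlock(
            allocator: kCFAllocatorDefault, memoryBlock: nil,
            blockLength: length, blockAllocator: nil, customBlockSource: nil,
            offsetToData: 0, dataLength: length,
            flags: kCMBlockBufferAssureMemoryNowFlag, blockBufferOut: &blockBuffer)
        guard let bb = blockBuffer else { return nil }
        CMBlockBufferReplaceDataBytes(
            with: avcc, blockBuffer: bb, offsetIntoDestination: 0,
            dataLength: length)

        var sampleBuffer: CMSampleBuffer?
        var sampleSize = length
        CMSampleBufferCreateReady(
            allocator: nil, dataBuffer: bb, formatDescription: desc,
            sampleCount: 1, sampleTimingEntryCount: 0, sampleTimingArray: nil,
            sampleSizeEntryCount: 1, sampleSizeArray: &sampleSize,
            sampleBufferOut: &sampleBuffer)
        guard let sample = sampleBuffer else { return nil }

        // Synchronous decode: without the async flag, the output handler
        // runs before DecodeFrame returns.
        var image: UIImage?
        VTDecompressionSessionDecodeFrame(
            session, sampleBuffer: sample, flags: [], infoFlagsOut: nil
        ) { _, _, pixelBuffer, _, _ in
            guard let pb = pixelBuffer else { return }
            var cgImage: CGImage?
            VTCreateCGImageFromCVPixelBuffer(pb, options: nil, imageOut: &cgImage)
            if let cg = cgImage { image = UIImage(cgImage: cg) }
        }
        return image
    }
}
```

If the UIImage is only needed for Vision, it may be simpler to skip it entirely and hand the decoded CVPixelBuffer straight to `VNImageRequestHandler(cvPixelBuffer:options:)` with a `VNDetectRectanglesRequest` or `VNDetectFaceRectanglesRequest`, avoiding the CGImage round-trip. Is this roughly the right approach for the Mambo stream, or does the SDK expose a decoded frame somewhere we have missed?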