I’m developing computer vision services, and when the drone is flying, the do_step function sometimes isn’t called for a few seconds (which is critical for our autonomous flight).
I’m trying to debug this, but it’s hard because it only happens when flying with a real drone; with the simulated drone everything works perfectly. Since we can’t access the drone’s logs and don’t know how to write our own, I’m also wondering which video feed I should use for a computer vision service based on the front camera.
I started from airsdk-sample and changed the VIPC stream to use fcam_streaming. Is this the best way to create computer vision services for autonomous flight? I can see a lot of mutex mechanisms in the service, and I’m wondering whether this feed is the right one for my purpose, since fcam_streaming is also used by the streaming process.
Otherwise, is there another way to do front-camera-based computer vision?