We are trying to use the computer-vision service to register an ArUco marker, seen by the vertical camera, as a tracked object, acquire that object's position coordinates, and write specific values to the log for visual inspection. For navigation and landing we are building on the Hello mission and the road_runner example from AirSDK.
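For context on the position part of our goal: independent of whichever AirSDK service ends up providing the detection, recovering a marker's ground position from a downward-facing camera is a simple pinhole projection. A minimal sketch of that geometry, with a hypothetical helper name that is not part of any Parrot API:

```python
# Hypothetical helper -- NOT an AirSDK function, just the projection math.
def pixel_to_ground(u, v, fx, fy, cx, cy, altitude):
    """Project a marker's pixel centre onto the ground plane.

    Assumes a pinhole camera pointing straight down (vertical camera),
    intrinsics (fx, fy) in pixels, principal point (cx, cy), and the
    drone at `altitude` metres above a flat ground. Returns the (x, y)
    offset of the marker, in metres, relative to the point directly
    below the camera.
    """
    x = (u - cx) / fx * altitude
    y = (v - cy) / fy * altitude
    return x, y

# Example: marker centre at pixel (420, 240) in a 640x480 image,
# focal length 500 px, drone 2 m above the ground.
offset = pixel_to_ground(420, 240, 500.0, 500.0, 320.0, 240.0, 2.0)
print(offset)  # marker is 0.4 m along the image x-axis from the nadir point
```

This is only the last step; the open question for us is which AirSDK service delivers the marker detection and pixel coordinates in the first place.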
Is Computer Vision in the AirSDK of the Anafi Ai a library?
In the documentation we see Computer Vision both in the telemetry list and in the messages list:
Messages list - 7.7
Telemetry list - 7.7
What is the difference between the messages list and the telemetry list?
Could Parrot provide examples of how to trigger and use these computer-vision services? Is it possible to use the built-in services from within an AirSDK mission?