I want to use computer vision to register the vehicle in front of me as a tracked object, acquire the position coordinates of that object, and output specific values to the log for visual inspection.
1. To that end, I would like to understand the following two points. Could you answer them?
- Is the Computer Vision feature in the Anafi AI AirSDK a library?
- What is the difference between Telemetry list - 7.7 and Messages list - 7.7?
2. Could you also tell me how to implement the above concretely?
Computer Vision is a drone service rather than a standalone library. It is used to track targets on the stream from the drone's front camera.
Telemetry is used to retrieve data from the drone. You can also publish your own telemetry, as in the example mission Road Runner, which exchanges telemetry between an AirSDK service and a guidance mode.
The messages in the messages list are used to communicate with the drone's services, such as the computer vision service. All the commands you can send and the events you can receive are listed there.
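As an illustration of the original goal (writing the tracked values to the log for visual inspection), here is a minimal Python sketch. The `TrackingBox` fields here are assumptions made for illustration, not the actual payload of the computer vision service's events; only the standard `logging` module is real.

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("tracking")

@dataclass
class TrackingBox:
    """Hypothetical tracking-event payload; field names are assumptions."""
    x: float           # normalised box centre, 0..1
    y: float
    width: float       # normalised box size, 0..1
    height: float
    confidence: float

def format_box(box: TrackingBox) -> str:
    # One compact log line per event, easy to grep during visual inspection
    return (f"target box centre=({box.x:.3f}, {box.y:.3f}) "
            f"size=({box.width:.3f}, {box.height:.3f}) conf={box.confidence:.2f}")

def on_tracking_event(box: TrackingBox) -> None:
    # Called for each tracking update received from the service
    log.info(format_box(box))

on_tracking_event(TrackingBox(0.52, 0.61, 0.10, 0.08, 0.93))
```

In a real mission you would call `on_tracking_event` from whatever callback the SDK invokes when a tracking event message arrives.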
For your mission, you can use the computer vision service's box suggestions, for example, to recognise a target car and track it. You will then need to write your own position-estimation algorithm to recover the position of your target (in your case, a car). To do this, you can use telemetry to retrieve the drone's position and the orientation of the front camera.
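The position-estimation step can be sketched with plain geometry, independent of the SDK: intersect the camera ray through the box centre with a flat ground plane. Everything below is an assumption-laden illustration (NED coordinates, flat ground at z = 0, a linear angle-per-pixel approximation, and an illustrative field-of-view value), not the SDK's own algorithm.

```python
import math

def estimate_target_position(drone_ned, cam_yaw, cam_pitch, box_u, box_v,
                             hfov=math.radians(69.0), aspect=16 / 9):
    """Estimate target NED coordinates by intersecting the camera ray
    through the box centre with a flat ground plane at z = 0 (NED, down positive).

    drone_ned : (north, east, down) drone position in metres
    cam_yaw   : camera heading in rad (0 = north, positive towards east)
    cam_pitch : camera pitch in rad (negative = looking down)
    box_u/v   : box centre in normalised image coords in [-1, 1],
                (0, 0) = image centre, u right, v down
    hfov      : assumed horizontal field of view (illustrative value)
    """
    vfov = 2 * math.atan(math.tan(hfov / 2) / aspect)
    # Linear small-angle approximation: image offset -> angular offset
    yaw = cam_yaw + box_u * hfov / 2
    pitch = cam_pitch - box_v * vfov / 2
    if pitch >= 0:
        return None  # ray points at or above the horizon: no ground hit
    n0, e0, d0 = drone_ned
    altitude = -d0  # height above ground (down is positive in NED)
    ground_range = altitude / math.tan(-pitch)
    return (n0 + ground_range * math.cos(yaw),
            e0 + ground_range * math.sin(yaw),
            0.0)
```

For example, a drone 10 m above the ground looking 45 degrees down along north places a centred target roughly 10 m north of it. A real implementation would replace the assumed coordinates, field of view, and flat-ground assumption with the actual values retrieved from telemetry and the camera calibration.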
I hope this has been useful to you.