I’m trying to implement visual-inertial monocular SLAM with the Parrot Anafi, and I’m hoping to get more information from its sensors.
Since most of the sensor data (such as linear accelerations and angular velocities from the IMU) cannot be accessed, so far I can only run SLAM with the monocular camera alone.
So I hope the Olympe API will expose more raw sensor data in a future release.
Two more questions:
- What is the difference between TakeOff and AutoTakeOffMode?
- How is hand launch mode implemented? Is it handled in the app or in the Anafi firmware? If it’s in the firmware, could an API for it be exposed?