Need access to more information, such as IMU data (after filtering) and position (from the optical flow sensor)

I’m trying to implement visual-inertial monocular SLAM with the Parrot Anafi, and I’m hoping to get more information from the onboard sensors.

Since most of the sensor data (such as linear accelerations and angular velocities) cannot be accessed, so far I can only run SLAM with the monocular camera alone.

So I hope the Olympe API will expose more sensor data in the future.

Two more questions:

  1. What is the difference between TakeOff and AutoTakeOffMode?
  2. How is the hand launch mode implemented? Is it implemented in the app or in the Anafi firmware? If it’s in the firmware, can we have an API for it?

I believe auto takeoff is also referred to as smart takeoff. Basically, you call it, and if hand launch is enabled and the criteria are met, it ramps the motors. Otherwise, it performs a full motor spin-up.

There is also a shake callback you can use to detect when the drone has been tossed. It’s available on both the Anafi and the Bebop 2.

Regarding additional sensors: the firmware doesn’t expose much beyond inertial movement. You won’t find anything for optical flow, if that’s what you’re looking for.


For the record, AutoTakeOff was deprecated on Bebop 2 drones and is not supported by the Anafi.
For a hand launch (and contrary to what synman said), you should use the UserTakeOff command instead.

The hand launch is performed automatically by the firmware after it has received the UserTakeOff command.
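A minimal hand-launch sketch with Olympe, based on the description above. The connection IP and the `state=1` arming value are assumptions taken from the usual Anafi setup and the ARSDK `ardrone3.Piloting.UserTakeOff` message; adapt them to your environment. The Olympe imports are deferred inside the function so the file loads even without the SDK installed.

```python
def hand_launch(drone_ip="192.168.42.1"):
    """Arm the hand-launch procedure, assuming Olympe is installed
    and a drone is reachable at drone_ip."""
    # Deferred imports: the module stays importable without Olympe.
    import olympe
    from olympe.messages.ardrone3.Piloting import UserTakeOff

    drone = olympe.Drone(drone_ip)
    drone.connect()
    # Send UserTakeOff: the firmware then waits for the throw and
    # ramps the motors on its own, as described above.
    drone(UserTakeOff(state=1)).wait()
    drone.disconnect()
```

After this call, the drone should spin its propellers as soon as the firmware detects the launch motion; no further command is needed for the takeoff itself.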

Have you checked out the video streaming metadata exposed by Olympe?
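For reference, a sketch of reading that per-frame metadata with Olympe. The callback registration API (`set_streaming_callbacks`) and the `frame.vmeta()` accessor follow Olympe’s streaming interface, but the exact method names have changed between Olympe versions, so treat this as an assumption to check against your SDK release; the IP and duration are placeholders.

```python
def print_frame_metadata(drone_ip="192.168.42.1", duration_s=5):
    """Print the metadata attached to each decoded video frame.
    Assumes Olympe is installed and a drone is reachable."""
    import time
    import olympe

    def yuv_cb(frame):
        # vmeta() returns (metadata_type, metadata_dict); the dict
        # carries per-frame telemetry such as attitude and speed.
        _meta_type, meta = frame.vmeta()
        print(meta)

    drone = olympe.Drone(drone_ip)
    drone.connect()
    drone.set_streaming_callbacks(raw_cb=yuv_cb)
    drone.start_video_streaming()
    time.sleep(duration_s)  # let a few seconds of frames arrive
    drone.stop_video_streaming()
    drone.disconnect()
```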


Thanks for the answers!
Is it possible to add access to the angular velocities and linear accelerations in the SDK?
And another question about the speed in the NorthEastDown frame:
When the Anafi is landed, the speed values don’t update and always read zero. It would be great if the speed could be read in any flight state.