FollowMe Sphinx

Is it possible to simulate the FollowMe feature using the Olympe SDK?
https://developer.parrot.com/docs/olympe/arsdkng_followme.html#

In Sphinx, you have to mark an actor as being tracked. Example:

parrot-ue4-empty -ams-path=DefaultPath,Jasper:*

See Populate the scene with actors - 2.9.1

OK, that makes Jasper appear and, with the help of a Sphinx CLI command, also run. But you might agree: the whole “How to do FollowMe” via the Olympe SDK is pretty much undocumented, or am I overlooking something? Is there any sample that successfully makes the drone follow that guy in the simulation?

I found this, but it only takes off, hovers and lands immediately.

import olympe
from olympe.messages.ardrone3.Piloting import TakeOff, Landing
from olympe.messages.ardrone3.PilotingState import FlyingStateChanged
from olympe.messages import follow_me

drone = olympe.Drone("10.202.0.1")
drone.connect()
# Take off and wait for the hovering state
drone(
    TakeOff() >> FlyingStateChanged(state="hovering", _timeout=60)
).wait()
# Use the controller (this host) as the tracked target
drone(
    follow_me.set_target_is_controller(1, _timeout=60)
).wait()
# Start follow-me in "leash" mode and expect the "follow" behavior to be reported
drone(
    follow_me.start(mode="leash", _timeout=60)
    >> follow_me.state(mode="leash", behavior="follow")
).wait()
drone(Landing()).wait()

I mean, I didn’t expect this to work OOB, but I thought I would at least get a hint of how to do it via the SDK. However, there is not much error information, just a “deprecated” PilotedPOI message telling me lat=lon=0 and status UNAVAILABLE…

Not very helpful, though.
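
(For completeness, here is a minimal sketch of how I would watch the follow_me events from Olympe to get more feedback than the console log; it assumes the standard olympe.EventListener / olympe.listen_event API and is not verified beyond that:)

import olympe
from olympe.messages import follow_me

class FollowMeWatcher(olympe.EventListener):
    # Print every follow_me.state and follow_me.mode_info event the drone sends

    @olympe.listen_event(follow_me.state(_policy="wait"))
    def on_state(self, event, scheduler):
        print("follow_me.state:", event.args)

    @olympe.listen_event(follow_me.mode_info(_policy="wait"))
    def on_mode_info(self, event, scheduler):
        print("follow_me.mode_info:", event.args)

drone = olympe.Drone("10.202.0.1")
with FollowMeWatcher(drone):
    assert drone.connect()
    # ... TakeOff / set_target_is_controller / follow_me.start as in the script above ...
    drone.disconnect()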

To be more specific:

The documentation states that the tracked subject/object has a red arrow on top

Even though the Sphinx simulator logs Jasper_0 as the tracked person on startup, I can’t see that red arrow:

Hello,

For the arrow problem, what version of Parrot-Sphinx are you using?
Can you run this command?

parrot-ue4-empty -version

Hmm. Ok, maybe a good point. 2.8.2

apt says parrot-sphinx is already the newest version (2.8.2-focal).

Are you sure that your apt config is up to date? See Installation procedure - 2.9.1
The latest version is 2.9.1.

Note that, as explained here, when running “Follow me” mode, only the GPS signal is used to track the actor, not the front camera.

Uhm… OK. Updated to 2.9.1, and the red arrow is there.

But the script is still not working, or I’m not getting its wisdom. I can see that the nose of the drone is following the moving person, but I would expect the drone to follow the person itself. What am I missing?

only the GPS signal is used to track the actor, not the front camera

In other words: There is no way to simulate visual tracking, right?

2022-10-04 14:51:29,019 [INFO] 	olympe.drone.ANAFI-0000000 - _recv_cmd_cb - follow_me.mode_info(mode='look_at', missing_requirements='drone_calibrated|drone_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', improvements='drone_calibrated|drone_gps_good_accuracy|target_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', list_flags='')
2022-10-04 14:51:29,019 [INFO] 	olympe.drone.ANAFI-0000000 - _recv_cmd_cb - follow_me.mode_info(mode='geographic', missing_requirements='drone_calibrated|drone_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', improvements='drone_calibrated|drone_gps_good_accuracy|target_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', list_flags='')
2022-10-04 14:51:29,019 [INFO] 	olympe.drone.ANAFI-0000000 - _recv_cmd_cb - follow_me.mode_info(mode='relative', missing_requirements='drone_calibrated|drone_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', improvements='drone_calibrated|drone_gps_good_accuracy|target_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', list_flags='')
2022-10-04 14:51:29,020 [INFO] 	olympe.drone.ANAFI-0000000 - _recv_cmd_cb - follow_me.mode_info(mode='leash', missing_requirements='drone_calibrated|drone_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', improvements='drone_calibrated|drone_gps_good_accuracy|target_gps_good_accuracy|target_barometer_ok|drone_far_enough|drone_high_enough|image_detection|target_good_speed|drone_close_enough', list_flags='')

What does all that mean?

From the current documentation:

command message olympe.messages.follow_me.configure_geographic(use_default, distance, elevation, azimuth, _timeout=10, _no_expect=False, _float_tol=(1e-07, 1e-09))

Then:

* **distance** (*float*) – The distance leader-follower in meters (not used when arg start is at 0)
* **elevation** (*float*) – The elevation leader-follower in rad (not used when arg start is at 0)
* **azimuth** (*float*) – The azimuth north-leader-follower in rad (not used when arg start is at 0)

Hmm. What does “arg start is at 0” mean?

Only to read, at the end:

TODO

Remove unsupported message follow_me.configure_geographic

Honestly, this is really not helpful.

On Anafi / Anafi Thermal / Anafi USA, the follow me feature depends on visual tracking algorithms that are bundled within the FreeFlight 6 application. Using Olympe, the follow me feature uses the GPS and barometer info from the controller, without the visual tracking.

The logic is a bit convoluted, but the documentation of this message says:

  • missing_requirements (BitfieldOf(olympe.enums.follow_me.input, u16)) – List of missing requirements to enter this mode on start. Bit is 0 if the input is missing, 1 if the input is ok. For example, if the first bit is equal to 0, it means that the mode is not available because the drone is not calibrated. If at least one input is missing, the drone won’t be able to follow the target. It won’t use any fallback either.
  • improvements (BitfieldOf(olympe.enums.follow_me.input, u16)) – List of inputs that can improve the mode. Bit is 0 if the input is missing, i.e.: the mode can be improved if this input would be fulfilled, 1 if the input is ok. For example, if the first bit is equal to 0, it means that the mode would be improved if the drone would be calibrated. If at least one input is missing, a downgraded mode will be used.

Here, for every follow_me mode you are missing the target_gps_good_accuracy bit from the requirements. The improvements bit field has all its bits set to 1, so there is nothing you can do to “improve” the follow me tracking (even if you’re still missing the target_gps_good_accuracy requirement…).
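
To make that reading concrete, here is a tiny sketch applied to the 'leash' log line above; it only does string handling on the flag names as they are printed, so it makes no assumption about Olympe’s bitfield objects:

# Flag names exactly as printed in the follow_me.mode_info log line for mode='leash'.
# Per the docs above, a listed name means "this bit is 1 / this input is OK".
missing_requirements = (
    "drone_calibrated|drone_gps_good_accuracy|target_barometer_ok|"
    "drone_far_enough|drone_high_enough|image_detection|"
    "target_good_speed|drone_close_enough"
)
improvements = (
    "drone_calibrated|drone_gps_good_accuracy|target_gps_good_accuracy|"
    "target_barometer_ok|drone_far_enough|drone_high_enough|"
    "image_detection|target_good_speed|drone_close_enough"
)

ok_inputs = set(missing_requirements.split("|"))
all_inputs = set(improvements.split("|"))  # here, improvements lists every input

# Whatever is absent from the requirements field is what blocks follow_me.start
print("blocking inputs:", all_inputs - ok_inputs)
# -> blocking inputs: {'target_gps_good_accuracy'}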

The “TODO: Remove unsupported message ***” note is present for messages that were available for legacy drone products (Parrot Bebop 1/2 here). Those messages are not supported by Anafi.

@ndessart We are looking to implement our own visual tracking for the FollowMe piloting interface, similar to what is done by FreeFlight 6. Is this code part of the open source codebase for FreeFlight? If so, I cannot seem to find it. If not, can you elaborate on how you arrive at an azimuth and elevation for sendTargetDetectionInfo? Are you computing an estimate of the GPS location of the tracked object and then converting that into azimuth/elevation based on the drone’s own GPS location?

Unfortunately, I have to admit that this doesn’t necessarily make me any smarter. Nevertheless, thank you for the answer.

Let’s try it again in a different way:

  • Script as shown above
  • Works against a Sphinx instance with Jasper and Co.
  • Aside from the oddities mentioned above, I can see that the drone’s nose follows Jasper as he runs
  • I would have expected the drone to take off and follow the guy at some distance, but it just takes off and hovers on the spot

Question: Can this work with the expected result on the simulator?

All explanations regarding the “requirements” and “improvements” are still nebulous. This should really be better documented.

From the explanations about the “visual FollowMe” I understand that this does not work in the context “Olympe/Python/Simulator”. Please confirm.

OpenFlight 7 is the open source core of FreeFlight 7 for Anafi Ai. My answer to Neil was for Anafi / Anafi Thermal / Anafi USA. For Anafi Ai, the drone itself runs all visual tracking algorithms. Those algorithms are not publicly available. I’m not familiar with GroundSDK iOS and cannot answer your next questions. Please create a new thread in the Anafi AI / GroundSDK forum category.

Hi @ndessart We are also working on the Anafi and Anafi USA. We are using the Android implementation of GroundSDK for control and are offloading the RTSP stream to an edge node for processing. The intent is to use object detection on the edge to find targets and then compute azimuth/elevation to send as the tracking information in the FollowMe API. This is similar to what I presume the FreeFlight app does when you draw a bounding box around a target and it follows it as the object moves. So I am looking for the algorithms that FreeFlight uses to drive that API when you initiate a FollowMe.
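
For reference, the azimuth/elevation computation we have in mind is basic pinhole-camera geometry along the following lines; the field-of-view values and sign conventions below are illustrative assumptions, not the drone’s actual camera specs:

import math

def bbox_to_azimuth_elevation(cx, cy, img_w, img_h, hfov_deg, vfov_deg):
    """Convert a bounding-box center (pixels) to azimuth/elevation (radians)
    relative to the optical axis, using a simple pinhole camera model.

    cx, cy    : bounding-box center in pixels (origin at top-left)
    img_w/h   : frame size in pixels
    hfov/vfov : horizontal / vertical field of view in degrees (assumed known)
    """
    # Normalized offsets from the image center, in [-0.5, 0.5]
    nx = (cx - img_w / 2.0) / img_w
    ny = (cy - img_h / 2.0) / img_h
    # Pinhole model: pixel offset maps through tan() of half the field of view
    azimuth = math.atan(2.0 * nx * math.tan(math.radians(hfov_deg) / 2.0))
    elevation = -math.atan(2.0 * ny * math.tan(math.radians(vfov_deg) / 2.0))
    return azimuth, elevation

# Example: detection centered slightly right of and below the image center
print(bbox_to_azimuth_elevation(1100, 620, 1920, 1080, 69.0, 43.0))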

Could you please share the olympe logs and firmware logs so that we can have a better understanding of what is happening exactly? Thanks

“It does not work”, I confirm. It’s just not available out of the box. As I said, the visual tracking algorithms are not publicly available. You would have to reimplement them on your end and then send the bounding rect coordinates (from the live video streaming) of your target to the drone firmware (with the correct frame timestamp).
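
Roughly, something like the following sketch; the follow_me.target_image_detection message and the parameter names used here should be double-checked against the Olympe/arsdk reference for your firmware version, and all values are placeholders:

import time

import olympe
from olympe.messages import follow_me

drone = olympe.Drone("10.202.0.1")
assert drone.connect()

# Placeholder values, as if produced by an external detector on one video frame.
# Parameter names assume the arsdk followme "target_image_detection" command.
drone(follow_me.target_image_detection(
    target_azimuth=0.10,      # rad, horizontal angle to the target (assumed convention)
    target_elevation=-0.05,   # rad, vertical angle to the target
    change_of_scale=0.0,      # relative change of the target size between frames
    confidence_index=200,     # detector confidence, assumed 0..255
    is_new_selection=1,       # 1 when a new target has just been selected
    timestamp=int(time.time() * 1000),  # placeholder; must be the analyzed frame's timestamp
)).wait()

drone.disconnect()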

Yes, it could be better documented, point taken…

Thanks, that makes it much clearer. I will send you the logs tomorrow. Thanks for the offer to have a look at them.

As I said, the visual tracking algorithms are not publicly available. You would have to reimplement them on your end and then send the bounding rect coordinates (from the live video streaming) of your target to the drone firmware (with the correct frame timestamp).

I recall (from another thread) that you recommended OpenPose for this, but I guess any other object detection algorithm would do the job as well(?)

Anyway, this answer sheds some light on the matter; at least I know where to go now.

Thanks.

Hello there,

I think the main problem here is the distance between the drone and the actor. Anafi4k requires a minimum distance of roughly 15 meters. I have written a little Olympe script that should work in the parrot-ue4-empty world with the following command line:

parrot-ue4-empty -ams-path=DefaultPath,Jasper:*

Olympe script:

import olympe
import logness
from olympe.messages.ardrone3.Piloting import TakeOff, Landing, moveBy
from olympe.messages.ardrone3.PilotingState import FlyingStateChanged
from olympe.messages.follow_me import (
    start, stop, state, set_target_is_controller
)
from olympe.messages.rth import return_to_home, state as rth_state

logness.update_config({
    "handlers": {
        "olympe_log_file": {
            "class": "logness.FileHandler",
            "formatter": "default_formatter",
            "filename": "olympe.log"
        },
    },
    "loggers": {
        "olympe": {
            "handlers": ["olympe_log_file"]
        },
    }
})

drone = olympe.Drone("10.202.0.1")

assert drone.connect()

takeoff = drone(
    TakeOff()
    >> FlyingStateChanged(state="hovering", _timeout=60)
).wait()

assert takeoff, "Check takeoff"

# Move about 35 m away from the target first (well beyond the ~15 m minimum distance)
moveby = drone(
    moveBy(dX=-25.0, dY=-25.0, dZ=-1.0, dPsi=0.0, _timeout=30)
    >> FlyingStateChanged(state="hovering")
).wait()

assert moveby, "Check moveBy"

# Cycle through the leash, geographic and relative follow-me modes, then hover
follow = drone(
    set_target_is_controller(1)
    >> start(mode="leash", _no_expect=True)
    >> state(mode="leash", behavior="follow")
    >> stop()
    >> start(mode="geographic", _no_expect=True)
    >> state(mode="geographic", behavior="follow")
    >> stop()
    >> start(mode="relative", _no_expect=True)
    >> state(mode="relative", behavior="follow")
    >> stop()
    >> FlyingStateChanged(state="hovering")
).wait(_timeout=60)

assert follow, "Check follow me"

return_home = drone(
    return_to_home()
    >> rth_state(state="available", reason="finished")
).wait(_timeout=120)

assert return_home, "Check return to home"

landing = drone(
    Landing()
    >> FlyingStateChanged(state="landed")
).wait(_timeout=60)

assert landing, "Check landing"

Could you try it?