FollowMe Sphinx

Quick stupid question: I know how to obtain the logs from a real drone, but how do I do that from the simulator? There seems to be no web server running (as confirmed by Olympe):

2022-10-19 09:22:19,675 [ERROR]         olympe.media - _get_all_media - The webserver is unavailable

The simulated drone's IP address is 10.202.0.1, so the webserver should be available at http://10.202.0.1.
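Not from the thread itself, but as an illustrative sanity check: the media endpoint that Olympe polls (the path appears in the log excerpts in this thread) can be probed directly with the Python standard library. The IP is the default Sphinx address quoted above; the function names are my own.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

SPHINX_DRONE_IP = "10.202.0.1"  # default simulated drone IP in Sphinx

def media_api_url(ip: str) -> str:
    """Build the media listing URL that Olympe polls on the drone."""
    return f"http://{ip}/api/v1/media/medias"

def webserver_answers(ip: str, timeout: float = 2.0) -> bool:
    """Return True if the drone's web server responds at all, even with an
    HTTP error status (the server answered but the media API failed)."""
    try:
        urlopen(media_api_url(ip), timeout=timeout)
        return True
    except HTTPError:
        return True   # server answered, just with an error status
    except URLError:
        return False  # nothing listening at that address

if __name__ == "__main__":
    print(media_api_url(SPHINX_DRONE_IP))
```

This only distinguishes "no server at all" from "server up but API unhappy", which is the distinction that matters later in this thread.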

Are there any other related log entries before this error?

Ah, stupid me, sorry. Sure, there it is, the web page. But where are the logs?

Are there any other related log entries before this error?

This is all I get at WARNING level from Olympe, once I start my Olympe application:

2022-10-19 09:43:24,971 [WARNING]       olympe.media - _get_all_media - HTTP 541: http://192.168.188.118:80/api/v1/media/medias b'Media Not Yet Indexed'
2022-10-19 09:43:25,265 [WARNING]       olympe.drone.ANAFI-0000000 - _recv_cmd_cb - Unknown message id 0xa2000005
2022-10-19 09:43:25,265 [WARNING]       olympe.drone.ANAFI-0000000 - _recv_cmd_cb - Unknown message id 0xa2000004
2022-10-19 09:43:25,708 [ERROR]         ulog - pdraw_sink_coded_video - VideoDecoder#2: coded video media format H264/AVCC not supported
2022-10-19 09:43:25,974 [WARNING]       olympe.media - _get_all_media - HTTP 541: http://192.168.188.118:80/api/v1/media/medias b'Media Not Yet Indexed'
2022-10-19 09:43:26,975 [WARNING]       olympe.media - _get_all_media - HTTP 541: http://192.168.188.118:80/api/v1/media/medias b'Media Not Yet Indexed'
2022-10-19 09:43:27,977 [ERROR]         olympe.media - _get_all_media - The webserver is unavailable
2022-10-19 09:43:27,977 [WARNING]       olympe.media - aconnect - Media are not yet indexed

Do you want me to provide Olympe logs at INFO level?

The webserver is there, but it returns an HTTP 541 error. The media on this physical drone are not indexed, so the media web API is unavailable (not the web server itself).
On Anafi / Anafi Thermal / Anafi USA, this error can be explained if there is no SD card in the drone.

Thanks, but I’m still missing the logs.

OK, it’s not quite clear what it is doing. I’m trying to screen-record it:

  • It takes off
  • It flies backwards, away to the south-west
  • It points its nose at the stationary Jasper
  • It climbs
  • It flies forwards to the north-east
  • It lands

EDIT: Here is the video: Sphinx followme, first attempt - YouTube

Yes, I’m used to this: the MediaServer does not work on Sphinx.

@ndessart However, I can’t find any logs on the simulator. Where are they supposed to be?

I was trying to find an appropriate API in the Olympe 7.4 spec, to no avail. I thought it would become clear to me yesterday, but it has not. Through which API function could I communicate the “bounding rect coordinates of the target”?

Please elaborate

@Neil Our understanding thus far is that there is such an API, namely the OnboardTracker; however, it is only available on the ANAFI AI. For the Anafi/USA/Thermal, it looks like we are limited to FollowMe, which is slightly different: it does not take a bounding box, but rather an azimuth, an elevation, and a change of scale. We are also looking at this from the Android GroundSDK rather than Olympe, but the APIs seem to be present in both. Hence my earlier question about how to compute the required FollowMe arguments from the bounding boxes we receive from our object-detection DNN.

We are attempting to first use the gimbal FOV/pose and the drone’s GPS to approximate the GPS location of the target object. We will then convert that into azimuth/elevation based on the drone’s current GPS location relative to where we believe we saw the target. We still have to figure out an algorithm to estimate the change of scale so that the drone knows whether it needs to move backwards or forwards to stay with the target.

The phone effectively does all of this when you run FreeFlight; it seems to take the video stream and compute these parameters from the changes in the bounding box displayed as an overlay. Unfortunately, as @ndessart said, this code is not open source, so we are trying to do something similar on the edge server where we offload the stream.
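A minimal sketch of the GPS-to-azimuth/elevation step described above, under a flat-earth approximation that only holds at short range. This is my own illustrative code, not Parrot’s, and the conventions (azimuth clockwise from north, in radians) are assumptions that would need checking against the actual FollowMe API.

```python
import math

def azimuth_elevation(drone_lat, drone_lon, drone_alt,
                      target_lat, target_lon, target_alt):
    """Approximate azimuth (rad, clockwise from north) and elevation (rad)
    from the drone to the target, using a local flat-earth approximation.
    All inputs in degrees/metres; valid for short ranges only."""
    R = 6371000.0  # mean earth radius in metres
    lat0 = math.radians(drone_lat)
    # local north/east/up offsets of the target relative to the drone
    north = math.radians(target_lat - drone_lat) * R
    east = math.radians(target_lon - drone_lon) * R * math.cos(lat0)
    up = target_alt - drone_alt
    azimuth = math.atan2(east, north) % (2 * math.pi)
    elevation = math.atan2(up, math.hypot(north, east))
    return azimuth, elevation
```

For example, a target due east of the drone at the same altitude yields an azimuth of π/2 and an elevation of 0.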

@teiszler I see. Thanks for the detailed answer. I knew there was a trap :slight_smile: From inspecting the Olympe SDK I also found that API function, and I knew there would be a DNN and CV nightmare ahead. That’s why I was thrilled by the statement that I would “just” have to forward the bounding box and the timestamp… It sounded logical to me: they (the Parrot SDK) know all about the intrinsics and extrinsics of the camera, so there should be a way to calculate a 3D pose from that. Turns out it is not that easy.
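To illustrate the intrinsics idea: if the camera’s field of view is known, a bounding-box centre pixel can at least be turned into a viewing direction under an ideal pinhole model. This is a sketch under that assumption, not the actual Parrot calibration; the FOV value in the test is illustrative.

```python
import math

def pixel_to_angles(u, v, width, height, hfov_deg):
    """Convert a pixel (u, v) into azimuth/elevation angles relative to the
    camera's optical axis, assuming an ideal pinhole camera with square
    pixels. hfov_deg is the horizontal field of view in degrees."""
    # focal length in pixels, derived from the horizontal FOV
    fx = (width / 2) / math.tan(math.radians(hfov_deg) / 2)
    fy = fx  # square-pixel assumption
    cx, cy = width / 2, height / 2
    az = math.atan2(u - cx, fx)   # positive = right of image centre
    el = math.atan2(cy - v, fy)   # positive = above image centre
    return az, el
```

Combined with the gimbal pose, these camera-relative angles could then be rotated into the world frame; that second step is where the extrinsics come in and where it stops being easy.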

Oh boy, I miss the RealSense SDK and its clear ins and outs…

A quick look around gave me this: 3D Object Detection and Pose Estimation with Deep Learning in OpenCV Python - YouTube

That sounds familiar. He calculates rotation and translation from object detection, which looks promising. I will check this out.
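Along the same lines, a crude way to get a distance estimate and a FollowMe-style change of scale from bounding-box heights, assuming a pinhole camera and a known real-world target height (e.g. roughly 1.7 m for a person). This is my own sketch, not from the video or the SDK.

```python
import math

def estimate_distance(pixel_height, real_height_m, image_height_px, vfov_deg):
    """Estimate camera-to-target distance from the target's apparent size,
    assuming a pinhole camera and a known real-world target height."""
    # vertical focal length in pixels from the vertical FOV
    fy = (image_height_px / 2) / math.tan(math.radians(vfov_deg) / 2)
    return real_height_m * fy / pixel_height

def change_of_scale(prev_box_height, curr_box_height):
    """Ratio of apparent target sizes between two frames: > 1 means the
    target appears larger (it got closer), < 1 means it got farther away."""
    return curr_box_height / prev_box_height
```

In practice the box height is noisy frame to frame, so some smoothing (e.g. an exponential moving average over a few frames) would be needed before feeding such a ratio to the drone.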

Summary:

  • I asked a while ago how to make FollowMe run with the simulator. I was asked to provide logs from the simulator. I couldn’t find them anywhere and need guidance on where to look.

  • I got a sample and commented on my results, providing a video. I am missing input here: what is the expected behaviour of the new sample? I still see just the nose moving and some uncoordinated flight movements, which are most likely caused by the commands, not by anything I would consider to be “follow me”.

@priols, @ndessart: please elaborate on how to proceed.

“Is there anybody out there?” (Pink Floyd, “The Wall”, 1979)

Rubbish, the guy is just a salesman.

@Neil I’m sorry that the free support we provide does not seem to meet your expectations. But please keep in mind that the support provided by Parrot on this forum is done on a best-effort basis by the developers themselves. This means that you have access to people who actually know the system in detail, but also that these people cannot spend all their time on support. Since you subscribed, you have posted numerous topics and dozens of messages. There is no way we can keep up with this. And frankly the tone you use in your messages does not help. Please keep it short and be courteous.

That said, the FollowMe feature on Anafi 4K/Thermal/USA was always meant to be closed source. As we explained, the visual tracking algorithm is running on the received video stream in the FreeFlight6 app. Although re-implementing a third-party equivalent of the visual tracking algorithm is technically possible (all the necessary commands to send to the drone exist), we cannot provide more detailed documentation and support on this topic.

If you want to run the FollowMe feature in the simulator, a first step to check that it is working is to use the FreeFlight6 app connected to the simulated drone.

And frankly the tone you use in your messages does not help.

Maybe you could give me some examples? Otherwise I will treat this “mimimi” as what it is: mimimi. And honestly, I have been put through the revolving door too often now. Have you looked at the state of my posts? Do you see that there are several issues for which you asked for “logs”, and once I provided them, nothing happened anymore? Especially in this thread I was asked to provide simulator logs; there simply are none.

And why are you pointing me to the FreeFlight app? Isn’t the Olympe SDK supposed to be a replacement? What is so difficult about giving me an answer in time? This topic, “How to successfully follow me from Olympe to simulator”, has now been open for 21 days.

Honestly, this is not even third-class support.

But OK, I will not ask anymore.
Thanks