Hi,
I’m working on a real-time detection and tracking application. Currently we use a different drone and stream video frames to a Windows machine for ML processing. KLV metadata is included in the stream, and our application needs it to localize the tracked objects observed by the drone.
We need both the video frames and the video metadata (primarily the drone’s GPS coordinates, plus information about the frame and zoom) streamed in real time. I’ve had a look through the relevant ANAFI documentation. This page has some useful information, and I’ve been learning about the provided SDK and tools for accessing this data.
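To make the requirement concrete, this is roughly the per-frame record our tracker consumes once the stream is decoded. The field names are purely illustrative (they are ours, not from any Parrot API):

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class FrameRecord:
    """Per-frame record our tracker consumes (illustrative names, not a Parrot API)."""
    timestamp_us: int      # capture timestamp, used to pair metadata with the frame
    latitude_deg: float    # drone GPS position at capture time
    longitude_deg: float
    altitude_m: float
    zoom_level: float      # camera zoom factor at capture time
    frame: Any             # the decoded video frame itself
```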
Ultimately, what we need is to control the drone via the FlightController 3 while simultaneously picking up the video frames and metadata for processing on a Windows machine. A forum thread here details a way to both use the controller and collect an RTSP stream. This works for us, and I can get video frames at the expected URL 192.168.53.1/live on Windows. However, when I inspected the incoming video stream I could not find an easy way to access the frame-synced metadata. This led me to look into the provided SDK to see about getting the metadata along with our video stream.
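For reference, on the Windows side I’m grabbing frames with something along these lines. OpenCV is just what I happened to use to confirm the video comes through, and the rtsp:// prefix on the URL is my assumption about the scheme; as far as I can tell this path exposes no per-frame metadata at all:

```python
import cv2

# Live stream exposed while the controller is connected (RTSP scheme assumed).
cap = cv2.VideoCapture("rtsp://192.168.53.1/live")

while True:
    ok, frame = cap.read()  # decoded frame only; no frame-synced metadata is available here
    if not ok:
        break
    # ... hand the frame to the detection/tracking pipeline ...

cap.release()
```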
I read through the documentation and forums, and it’s clear that Windows is not currently supported by the SDK and surrounding tools. In the interest of learning more, I went ahead and set up an Ubuntu VM and installed PDrAW and Olympe.
As I understand it, Olympe is not suitable for our use case, because the ANAFI will not allow simultaneous connections from the controller and the Python application. Nonetheless, I disconnected the controller and gave it a try. With no controller connected I can connect to the drone at 192.168.42.1 as I’d expect, and running the streaming example did capture some data from the ANAFI. The example recorded a very short video (4 seconds, at a low bitrate and with interruptions) and captured the metadata using the provided libraries. I see the following errors while running the streaming.py example, and I notice that while some metadata is present in the resulting file, the GPS coordinates are conspicuously absent. Any idea what might be going on here?
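For context, the relevant part of my test is essentially the raw-frame callback from the streaming example, where I read the per-frame metadata with yuv_frame.vmeta(). This is a sketch from memory; the callback registration calls and the exact structure returned by vmeta() may differ between Olympe versions:

```python
import time
import olympe

DRONE_IP = "192.168.42.1"  # direct Wi-Fi connection, controller disconnected

def yuv_frame_cb(yuv_frame):
    # Per-frame metadata from the drone. On my setup most fields show up,
    # but the GPS location fields are missing/empty.
    print(yuv_frame.vmeta())

drone = olympe.Drone(DRONE_IP)
drone.connect()

# Olympe 7.x style; older releases used drone.set_streaming_callbacks()
# and drone.start_video_streaming() instead.
drone.streaming.set_callbacks(raw_cb=yuv_frame_cb)
drone.streaming.start()

time.sleep(10)  # let a few seconds of frames and metadata come through

drone.streaming.stop()
drone.disconnect()
```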
I then reverted to the old controller connection setup and attempted to run PDrAW itself to verify connectivity. However, I get an error almost as soon as I start the application, following the user guide. I can ping the drone at 192.168.53.1 and view the stream with VLC inside the Ubuntu VM at 192.168.53.1/live, so I know networking is not the problem.
Perhaps I’m missing a dependency, or there is a compatibility issue with the VM. I haven’t yet dug through the source code to trace the error, but that’s probably my next step. Is this a known problem?
What is the best way to tap directly into libvideo-metadata and extract the metadata from an RTSP stream? Ultimately I’m guessing we may end up with some custom C code running in WSL to handle this. Has much testing been done with that library under WSL?
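As a stopgap for recorded files I’ve been considering something like the snippet below, which just shells out to the vmeta-extract tool built from libvideo-metadata. The “input file, output file” invocation is my assumption from skimming the project; I haven’t verified the exact arguments, and it obviously doesn’t address the live RTSP case I’m really after:

```python
import subprocess
from pathlib import Path

def extract_metadata(recording: Path, out_file: Path) -> None:
    """Dump per-frame metadata from a recorded MP4 with vmeta-extract.

    Assumes vmeta-extract is on PATH and takes an input file followed by an
    output file; check the tool's usage text for the exact syntax.
    """
    subprocess.run(["vmeta-extract", str(recording), str(out_file)], check=True)

# extract_metadata(Path("flight.mp4"), Path("flight_metadata.json"))
```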
For now, the VM setup is just for learning and understanding. That said, if making something Windows-compatible requires a large effort, we may run with the Ubuntu setup for some time. Any ideas for how to get at the metadata within Ubuntu while retaining the ability to use the FlightController?