Integrating Sphinx with an external camera simulation

I am interested in running Sphinx with an external camera simulation (a custom OpenGL application, not Gazebo), together with the official controllers (e.g. FreeFlight 6). The ideal setup would be for Sphinx to simulate the drone, telemetry data, etc., while the video is streamed from the external camera simulation. For now this can be limited to the ANAFI 4K.

I understand how to disable the Gazebo-based camera simulation (with_front_cam) and how to get the drone and gimbal pose from Sphinx / Gazebo. How do I continue from here?
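As a starting point for driving an external renderer, the pose Sphinx exposes can be read from its telemetry stream (e.g. the output of `tlm-data-logger`). The sketch below assumes a `section.field: value` line format and the `omniscient_anafi.worldPosition` field names from my Sphinx version; both are assumptions you should verify against your own installation.

```python
# Hypothetical parser for Sphinx telemetry as printed by tlm-data-logger.
# The "section.field: value" line format and the field names are
# assumptions -- check them against your Sphinx version.

def parse_tlm_lines(lines, prefix="omniscient_anafi.worldPosition"):
    """Collect the numeric fields under `prefix` into a dict,
    e.g. {"x": ..., "y": ..., "z": ...} for the drone world position."""
    pose = {}
    for line in lines:
        key, sep, value = line.partition(":")
        key = key.strip()
        if sep and key.startswith(prefix + "."):
            try:
                # Strip "<prefix>." and keep the trailing field name.
                pose[key[len(prefix) + 1:]] = float(value)
            except ValueError:
                pass  # ignore non-numeric fields
    return pose
```

The same function can be pointed at a different prefix (e.g. a gimbal attitude section) to extract the camera orientation for your OpenGL application.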

  1. Assuming I have raw RGB frames or H.264 frames ready to be streamed, is there any way to get the IP address and port of the controller (e.g. FreeFlight 6) for video streaming?
  2. Do I need to insert metadata?



Forwarding your own frames directly to the controller is quite difficult: you would have to start your own RTSP server on rtsp://anafi.local:554/live and bypass the one started by the firmware, bearing in mind that this server stays active even with with_front_cam=0 (it then serves test video data).
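To make the scale of "start your own RTSP server" concrete, here is a minimal, stdlib-only sketch of the first two steps of the RTSP handshake (OPTIONS and DESCRIBE) for a `/live` mount point. Everything in it is an illustrative assumption: the port, the SDP contents, the handler class. A real replacement would additionally need SETUP/PLAY handling, RTP delivery of the H.264 stream, and would have to bind port 554 on the address the controller resolves for anafi.local.

```python
# Sketch of an RTSP OPTIONS/DESCRIBE responder (stdlib only).
# This only illustrates the handshake a client performs against
# rtsp://<host>:<port>/live; it does NOT stream any video.
import socketserver

# Placeholder SDP advertising an H.264 video track.
SDP = (
    "v=0\r\n"
    "o=- 0 0 IN IP4 127.0.0.1\r\n"
    "s=live\r\n"
    "m=video 0 RTP/AVP 96\r\n"
    "a=rtpmap:96 H264/90000\r\n"
)

class RtspHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Request line, e.g. "DESCRIBE rtsp://host/live RTSP/1.0".
        request = self.rfile.readline().decode()
        headers = {}
        while True:
            line = self.rfile.readline().decode().strip()
            if not line:
                break
            key, _, value = line.partition(":")
            headers[key.strip().lower()] = value.strip()
        cseq = headers.get("cseq", "0")
        method = request.split(" ", 1)[0]
        if method == "OPTIONS":
            body, extra = "", "Public: OPTIONS, DESCRIBE, SETUP, PLAY\r\n"
        elif method == "DESCRIBE":
            body, extra = SDP, "Content-Type: application/sdp\r\n"
        else:
            # SETUP/PLAY and RTP transport are out of scope here.
            self.wfile.write(b"RTSP/1.0 501 Not Implemented\r\n\r\n")
            return
        response = (
            "RTSP/1.0 200 OK\r\n"
            f"CSeq: {cseq}\r\n"
            f"{extra}"
            f"Content-Length: {len(body)}\r\n"
            "\r\n"
            f"{body}"
        )
        self.wfile.write(response.encode())

# Usage sketch (554 needs root; any port works for a local experiment):
#   socketserver.TCPServer(("0.0.0.0", 8554), RtspHandler).serve_forever()
```

In practice you would not write this by hand: a library such as GStreamer's gst-rtsp-server handles DESCRIBE/SETUP/PLAY and RTP packetization for you, leaving only the problem of feeding it your frames and winning the bind on port 554 over the firmware's own server.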

Hi, thanks for the reply.

Do you think it would be easier to somehow push raw frames into the firmware (as a general approach; I know it is not a supported use case)? I assume the simulation sends raw frames to the firmware for encoding and streaming.

The protocol used to send frames to the firmware is based on undisclosed proprietary software, so I can’t help you with that.