I am interested in running Sphinx with an external camera simulation (a custom OpenGL application, not Gazebo), together with the official controller apps (e.g. FreeFlight 6). The ideal setup would be Sphinx simulating the drone, telemetry data, etc., while the video stream comes from the external camera simulation. For now this can be limited to the ANAFI 4K.
I understand how to disable the Gazebo-based camera simulation (via `with_front_cam`) and how to get the drone and gimbal pose from Sphinx/Gazebo. How do I continue from here?
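For context, this is roughly how I consume the pose at the moment: I read the telemetry text output and pull out the pose fields. This is only a minimal sketch; the sample text and key names below are illustrative placeholders, not necessarily the exact output format Sphinx produces.

```python
# Minimal sketch: parse "key: value" telemetry lines into a pose dict.
# The sample text and the key names are illustrative placeholders; the
# real telemetry output from Sphinx may use different sections/names.

def parse_pose(lines):
    """Collect float-valued 'key: value' pairs into a dict."""
    pose = {}
    for line in lines:
        key, sep, value = line.partition(":")
        if not sep:
            continue
        try:
            pose[key.strip()] = float(value)
        except ValueError:
            pass  # skip lines whose value is not a number
    return pose

sample = """\
worldPosition.x: 1.50
worldPosition.y: -0.25
worldPosition.z: 2.00
worldAttitude.yaw: 0.78
"""

pose = parse_pose(sample.splitlines())
print(pose["worldPosition.z"])  # -> 2.0
```

I then feed that pose into my OpenGL renderer each frame, so the rendered view tracks the simulated gimbal.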
- Assuming I have raw RGB frames or H.264 frames ready to be streamed, is there any way to discover the IP address and port the controller (e.g. FreeFlight 6) expects the video stream on?
- Do I need to insert per-frame metadata into the stream (e.g. in H.264 SEI NAL units) for the controller to accept it, and if so, in what format?
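To make the first question concrete, this is the kind of sender I have in mind once the endpoint is known — a sketch assuming a plain RTP/H.264 stream over UDP. The host and port are placeholders (discovering the real ones is exactly what I'm asking), and I don't know yet whether FreeFlight 6 accepts plain RTP or requires Parrot's own streaming protocol.

```python
# Sketch: build a gst-launch-1.0 command that pushes H.264 over RTP/UDP.
# HOST and PORT are placeholders; whether FreeFlight 6 accepts a plain
# RTP stream like this at all is part of my question.

def rtp_sender_cmd(host, port, src="videotestsrc"):
    """Return a gst-launch-1.0 command line streaming H.264 over RTP."""
    elements = [
        src,                                   # stand-in for my OpenGL frames
        "videoconvert",
        "x264enc tune=zerolatency",            # encode to H.264
        "rtph264pay config-interval=1 pt=96",  # packetize as RTP
        f"udpsink host={host} port={port}",
    ]
    return "gst-launch-1.0 " + " ! ".join(elements)

print(rtp_sender_cmd("10.0.0.2", 5004))
```

In the real setup I would replace `videotestsrc` with an `appsrc` fed from my OpenGL application, but the packetization/transport question is the same either way.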