Low real-time factor on a good computer


I’m trying Sphinx on a high-end computer with these resources:

  • CPU i9-9940X 3.30GHz 14 Core
  • GPU RTX 2080 11 GB
  • 32 GB DDR

I’m on Ubuntu 18.04 with nvidia-driver-440 installed.

Unfortunately, on parrot-ue-empty, without any action (no take-off, etc.), I have a very low real-time factor, around 58%:

I can see with nvidia-smi that UE and Gazebo are both using my GPU:

Thu May  5 17:32:28 2022       
| NVIDIA-SMI 440.33.01    Driver Version: 440.33.01    CUDA Version: 10.2     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce RTX 208...  On   | 00000000:B3:00.0  On |                  N/A |
| 31%   46C    P0    60W / 250W |   2473MiB / 11018MiB |     37%      Default |
| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|    0      1267      G   /usr/lib/xorg/Xorg                            18MiB |
|    0      1314      G   /usr/bin/gnome-shell                          73MiB |
|    0      5269      G   /usr/lib/xorg/Xorg                            18MiB |
|    0      5402      G   /usr/bin/gnome-shell                         254MiB |
|    0     13560    C+G   ...e4-empty/Empty/Binaries/Linux/UnrealApp  1837MiB |
|    0     13991      G   gzserver                                       9MiB |
|    0     18984      G   /usr/lib/xorg/Xorg                           142MiB |
|    0     19077      G   /usr/bin/gnome-shell                         106MiB |

Any idea why it is so slow, or what I should check?



The RT factor should not be this low with an RTX 2080. Mine is above 90% with a GTX 980.
Have you tried the latest driver: nvidia-driver-510?

Note that the main bottleneck issues are always due to cameras. The more cameras you visualize, the lower your RT factor will be.


Hi @ocrave,

I was visualizing 3 cameras in Sphinx (Vertical, Front and Stereo). As soon as I removed them from the viewer, the real-time factor went close to 1 🙂

My algorithms use these images onboard, so they are computed anyway, just not displayed. Is it really just the display of 3 already-computed images that slows the simulator down so much?


The frames are synchronized with the scene with a precision of 1 ms. They are sent to the firmware via a shared memory. Each camera has its own set of clients; if there are no clients, the camera is disabled (no render). Since it is a shared memory, only one rendering is performed per frame, and the same buffer is read by all of the camera's clients. When you visualize a camera in Unreal, with sphinx-cli, or when you start receiving frames in the firmware, you are just creating a new client that reads the same frames from the camera's buffer pool as the other clients.
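To illustrate the model described above, here is a minimal conceptual sketch (this is not the actual Sphinx API; the class and method names are invented for illustration) of a camera that renders once per frame into a shared buffer, serves all registered clients from that single render, and is disabled when nobody is listening:

```python
# Conceptual sketch, NOT the real Sphinx implementation: one render per
# frame, shared by all registered clients; no clients => camera disabled.
class SimCamera:
    def __init__(self, name):
        self.name = name
        self.clients = set()
        self.last_frame = None  # stands in for the shared-memory buffer pool

    def add_client(self, client_id):
        # A viewer window, sphinx-cli, or the firmware each just registers
        # as one more client on the same camera.
        self.clients.add(client_id)

    def remove_client(self, client_id):
        self.clients.discard(client_id)

    def tick(self, scene_time):
        # Render only when at least one client is listening.
        if not self.clients:
            return None  # camera disabled: no render, no GPU cost
        # A single render fills the shared buffer; every client reads it.
        self.last_frame = f"frame@{scene_time}"
        return self.last_frame


cam = SimCamera("front")
assert cam.tick(0.001) is None           # no clients -> no render
cam.add_client("firmware")
cam.add_client("viewer")
assert cam.tick(0.002) == "frame@0.002"  # one render serves both clients
```

In this model, closing the viewer only removes one client; as long as the firmware still consumes the frames, the camera keeps rendering.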

If the 3 cameras are already used by your algorithms, you should not see a significant drop in the RT factor, since they already have at least one client in the firmware. Can you tell us which camera contributes the most to this drop?

Note that the follow and user cameras (Unreal cameras) also have an effect on the RT factor. The larger your window, the more they consume since their resolution is proportional to the window size. You can decrease the Resolution Scale in Edit > App Settings to reduce their consumption.
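As a rough illustration of why the Resolution Scale helps (the exact scaling formula in Unreal may differ), render cost grows with the number of pixels, so halving the scale quarters the pixels rendered:

```python
# Rough illustration: render cost is roughly proportional to pixel count,
# and both dimensions are multiplied by the resolution scale.
def rendered_pixels(width, height, resolution_scale):
    return int(width * resolution_scale) * int(height * resolution_scale)


full = rendered_pixels(1920, 1080, 1.0)  # 2,073,600 pixels
half = rendered_pixels(1920, 1080, 0.5)  # 518,400 pixels
assert half / full == 0.25               # half the scale -> 25% of the pixels
```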

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.