Telemetry sample tweaking

Hello there,

While extracting data from the telemetry tool, I get these unusual timestamp records:

(305, 990000)
(305, 992000)
(305, 994000)
(305, 998000)
(305, 999000)
(306, 0)
(305, 988000)
(305, 961000)
(305, 967000)
(305, 972000)
(305, 973000)
(305, 978000)
(305, 979000)
(305, 992000)
(305, 997000)
(305, 988000)
(305, 991000)
(306, 2000)
(306, 4000)

where the first field is clearly the seconds component, and the second one seems to be related to sub-second sampling. However, the samples are received out of order. How is this related to the SAMPLE_RATE and MSG_RATE parameters used by the tool? As of now, I've set both to the same value (1000). How does this affect the program?
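As a side note, if the tuples above are `(seconds, microseconds)` pairs, restoring chronological order is straightforward, since Python compares tuples lexicographically (seconds first, then microseconds). This is just a sketch assuming that interpretation of the two fields:

```python
# Sketch: restore chronological order of out-of-order (seconds, microseconds)
# timestamp tuples. Lexicographic tuple comparison also handles the rollover
# from (305, 999000) to (306, 0) correctly.
records = [
    (305, 990000), (305, 992000), (305, 994000), (305, 998000),
    (305, 999000), (306, 0), (305, 988000), (305, 961000),
]

ordered = sorted(records)  # sorts by seconds, then microseconds
print(ordered[0])   # earliest sample: (305, 961000)
print(ordered[-1])  # latest sample: (306, 0)
```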

Thanks in advance!

Could I use it to, e.g., save up to X messages from the same second and discard the rest, in such a way that the kept ones are evenly distributed within that second?
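In case it helps to make the idea concrete, here is a rough sketch of the kind of filtering I mean (done on my side, not in the telemetry interface): bucket samples by their seconds value, and when a bucket exceeds X, keep X entries at evenly spaced positions. The `downsample` helper and the tuple layout are my own assumptions, not part of the tool:

```python
from collections import defaultdict

def downsample(records, max_per_second):
    """Keep at most max_per_second samples from each second, chosen so
    they are roughly evenly spread across that second.
    records: iterable of (seconds, microseconds) tuples."""
    buckets = defaultdict(list)
    for sec, usec in sorted(records):
        buckets[sec].append((sec, usec))

    kept = []
    for sec in sorted(buckets):
        group = buckets[sec]
        if len(group) <= max_per_second:
            kept.extend(group)
        else:
            # Pick evenly spaced indices across the sorted group.
            step = len(group) / max_per_second
            kept.extend(group[int(i * step)] for i in range(max_per_second))
    return kept
```

For example, `downsample(samples, 5)` on ten samples from the same second keeps five of them, spread across that second.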

Ok, so my real question is this:

Given that I am processing all the telemetry topic samples received and filtering/storing some of them (e.g. only the drone's and pedestrian's poses), I end up with a ~70-150 MB file containing thousands of rows with only subtle differences between them (e.g. one pose coordinate changes). As a result, a single second can produce more than 50 rows.

So what I currently do is let it run, and only when closing the log do I remove the "duplicated" rows so that I keep one data row per second. However, I'd like this filtering to be set up implicitly in the telemetry interface. Is there any way to reduce/increase the sample/message rate to limit the data overflow?
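For reference, my current post-processing step is essentially the following (a simplified sketch; the real rows carry pose data, and the assumption here is that each row is a tuple starting with the seconds timestamp):

```python
def one_row_per_second(rows):
    """Collapse a log to one row per second, keeping the first row seen
    for each distinct seconds value.
    rows: iterable of tuples whose first element is the seconds timestamp."""
    seen = set()
    out = []
    for row in sorted(rows):
        sec = row[0]
        if sec not in seen:
            seen.add(sec)
            out.append(row)
    return out
```

This works, but it only runs at log-close time, which is why I'd prefer a rate setting in the interface that prevents the excess rows from being produced in the first place.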