The arg of PCMD: timestampAndSeqNum


Hi, I’m trying to write an SDK for Windows 10. The function I’m currently writing deals with PCMD.

I don’t understand what the parameter “timestampAndSeqNum” means. The XML file describes it as “Command timestamp in milliseconds (low 24 bits) + command sequence number (high 8 bits)”.

I’m not sure what timestamp I should set. For example, when I send a PCMD command to the drone, should the timestamp be the current time, or just 0?

Second, does the sequence number refer to the buffer ID? For example, PCMD doesn’t need an Ack, so its buffer ID is 10; does its sequence number then refer to that buffer ID? Am I right?

I would really appreciate it if someone could save me QAQ

Best Regards,



This is used by the drone for debug purposes. You can leave it at 0, as is done in libARController.
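
If you ever do want to fill it, the packing described by the XML would look like this (just a sketch of the bit layout quoted above; the drone doesn’t require it):

```python
def pack_timestamp_and_seq(timestamp_ms, seq_num):
    # Low 24 bits: command timestamp in milliseconds.
    # High 8 bits: command sequence number.
    return ((seq_num & 0xFF) << 24) | (timestamp_ms & 0xFFFFFF)
```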



Hi @Djavan :slight_smile:

Thanks for your reply, it works for me :smile:

Now I can keep the connection alive and send PCMD to the drone. But I still have a problem:

I found that after takeoff the drone does not hover in place; it keeps drifting forward or to the right. (I.e. I send a PCMD with all arguments set to zero every 50 ms, then send a takeoff command; the drone takes off but keeps drifting forward or sideways at the same time.)
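
For reference, my hover loop looks roughly like this (a simplified sketch; the drone IP is the Bebop 2 default as far as I know, 54321 is the c2d_port from my discovery reply, and the ARNetworkAL frame layout and ARDrone3/Piloting/PCMD IDs are my reading of the protocol docs):

```python
import socket, struct, time

DRONE_IP = '192.168.42.1'  # Bebop 2 default address (assumption)
C2D_PORT = 54321           # c2d_port from the discovery reply

def net_frame(frame_type, buffer_id, seq, payload):
    # ARNetworkAL header: type (u8) | buffer id (u8) | seq (u8) | total size (u32 LE)
    return struct.pack('<BBBI', frame_type, buffer_id, seq, 7 + len(payload)) + payload

def pcmd_payload(flag=0, roll=0, pitch=0, yaw=0, gaz=0, ts_and_seq=0):
    # ARDrone3 (project 1) | Piloting (class 0) | PCMD (command 2) | arguments, little-endian
    return struct.pack('<BBHBbbbbI', 1, 0, 2, flag, roll, pitch, yaw, gaz, ts_and_seq)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 0
for _ in range(200):  # ~10 seconds of "hover" commands
    frame = net_frame(2, 10, seq & 0xFF, pcmd_payload())  # type 2 = DATA, buffer 10 = non-ack
    sock.sendto(frame, (DRONE_IP, C2D_PORT))
    seq += 1
    time.sleep(0.05)  # one PCMD every 50 ms
```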

But when I use FreeFlight Pro, this does not happen.

Am I missing some setting related to stabilization? :confounded:

Best Regards,


Do you send the AllStates/AllSettings commands?



I will check it tomorrow! Thanks for your quick reply :smile:

Have a nice day :relaxed:

Best Regards,



It works!! It was because I missed the command "AllSettings"; I was only sending "AllStates". I guess these two commands are something like an initialization step?

My command order is:
Discover() -> AllStates() -> AllSettings() -> PCMD(0, 0, 0, 0) in a loop -> TakeOff()

*All these commands are sent as the non-ACK type.
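
My Discover step is basically a short TCP JSON handshake (a sketch; 192.168.42.1 and TCP port 44444 are the Bebop 2 defaults as far as I know, and I assume the JSON strings are null-terminated):

```python
import json, socket

DRONE_IP = '192.168.42.1'  # Bebop 2 default address (assumption)
DISCOVERY_PORT = 44444     # ARDiscovery TCP port (assumption)

def discover(d2c_port=43210):
    # Send our connection JSON, read back the drone's configuration JSON.
    s = socket.create_connection((DRONE_IP, DISCOVERY_PORT))
    s.sendall(json.dumps({'controller_type': 'computer',
                          'controller_name': 'Halley',
                          'd2c_port': d2c_port}).encode() + b'\x00')
    reply = s.recv(4096).rstrip(b'\x00')
    s.close()
    return json.loads(reply)
```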

Everything goes well and my drone flies stably :slight_smile:

Now I’m going to write a function for the video stream, which is the hardest part :frowning:

Right now I don’t know where I should start; could you give me some advice?

I think I should send the "VideoEnable" command first, but how do I capture the video stream and decode the encoded data?

I really appreciate your help!!

Best Regards,


The video stream relies on the RTP protocol.
You can find more information about it in this blog post and in this one.

Please also note that you should only send the PCMD on the non-ack buffer.




Thanks for your support again!

I will check these references in the coming days. Hope everything goes well. :joy:

P.S. I noticed that only PCMD should be sent to the non-ACK buffer, but I found that Discover/AllStates/AllSettings/TakeOff also work fine when sent to the non-ACK buffer.

But I will definitely lose some error reports from the drone. :sweat_smile:

Best Regards,



Indeed, the drone will accept any command from any buffer (i.e. you can also send the PCMDs in the acknowledged buffer), but you might run into issues regarding buffer sizes, command reliability & command latency.

The main idea is that you should send only the PCMDs (and the CameraOrientation commands) on the non-ack link, as these commands are loss tolerant by nature (i.e. periodic with a short enough period to ignore a single loss). Other commands (non periodic ones) should be sent in the acknowledged buffer.
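
Concretely, the split looks like this (a sketch; these are the frame types and controller-to-drone buffer IDs as I understand them from the protocol documentation):

```python
# ARNetworkAL frame types
TYPE_ACK           = 1  # acknowledgement of a DATA_WITH_ACK frame
TYPE_DATA          = 2  # fire-and-forget data
TYPE_DATA_LOW_LAT  = 3  # fire-and-forget, high-priority data
TYPE_DATA_WITH_ACK = 4  # data that must be acknowledged (and retried)

C2D_NONACK_BUFFER = 10  # PCMD & CameraOrientation: periodic, loss tolerant
C2D_ACK_BUFFER    = 11  # everything else: one-shot, must not be lost
```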




Thanks for the detailed information :slight_smile:, now I know why those commands should be sent to the ACK buffer. Even though they still work fine, things might go wrong due to latency or a wrong buffer size, etc. I will correct it :slight_smile:

By the way, is the video stream transport over the RTP protocol (i.e. stream v2, VLC) the same as the one described in ARSDK_protocol?

If so, I’m not sure which transport type my Bebop 2 uses. When I send the discover command, the returned JSON message shows:

{ "status": 0, "c2d_port": 54321, "arstream_fragment_size": 65000, "arstream_fragment_maximum_number": 4, "arstream_max_ack_interval": -1, "c2d_update_port": 51, "c2d_user_port": 21, "arstream2_server_stream_port": 5004, "arstream2_server_control_port": 5005, "arstream2_max_packet_size": 1500, "arstream2_max_latency": 0, "arstream2_max_network_latency": 200, "arstream2_max_bitrate": -1, "arstream2_parameter_sets": "TBD" }

Should I implement the transport type mentioned in ARSDK_Protocols or in this blog?

Best Regards,



You should use the RTP (known as ARStream v2 inside the SDK) protocol when using any drone other than the Jumping Sumo (&variants).

The ARStream v1 protocol is deprecated and, while still present on current firmwares, will probably be removed from future firmwares.

To select the v2 protocol, you have to include the following keys in the JSON you send during the connection:

  • "arstream2_client_stream_port", which is the UDP port on which you want to receive the RTP data (i.e. the video)
  • "arstream2_client_control_port", which is the UDP port on which you want to receive the RTP control data
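
For example, the full connection JSON could look like this (the controller name is just a placeholder and the port values are arbitrary; pick any free UDP ports on your side):

{"controller_type": "computer", "controller_name": "MyController", "d2c_port": 43210, "arstream2_client_stream_port": 55004, "arstream2_client_control_port": 55005}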



@Nicolas @Djavan

Hi, thanks for the reply again. :blush:
The JSON string I send to the drone in the discover function was

{"controller_type": "computer", "controller_name": "Halley", "d2c_port": "43210"}.

Do you mean that I need to add "arstream2_client_stream_port" and "arstream2_client_control_port", making the JSON string look like

{"controller_type": "computer", "controller_name": "Halley", "d2c_port": "43210", "arstream2_client_stream_port": "55004", "arstream2_client_control_port": "55005"} :confused:

After connecting, I send the video enable command, whose frame looks like:

| 1 (1B) project_ARDrone | 21 (1B) class_MEDIAStream | 0 (2B) command_VideoEnable | 1 (1B) arg_enable |
(fields in this order, multi-byte values little-endian)
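
In Python terms, I build that payload like this (a small check of the layout above):

```python
import struct

# project 1 | class 21 | command 0 (u16 little-endian) | enable = 1
payload = struct.pack('<BBHB', 1, 21, 0, 1)
print(payload.hex())  # -> '0115000001'
```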

Then I use UDP to receive the data on port 55004, but nothing is received. :cry:

EDIT: Sorry, I just found a mistake in my receiving code; now I can receive the data! :sweat_smile:


Additionally, I’m not sure whether the right command order for the video stream is:

  1. Discover() (i.e. connect the drone and init the stream port)
  2. enableVideo() (i.e. send the video enable command)
  3. getImageData() (i.e. get the stream data from the arstream2_client_stream_port)
  4. decodeImageData() (i.e. decode the received data)

EDIT: I just figured it out!! All I need to do is receive the RTP data from the UDP port and decode it as an H.264 video stream!!!
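
For anyone else who gets stuck here, my receive loop now looks roughly like this (a simplified sketch; 55004 is the arstream2_client_stream_port I sent at connection):

```python
import socket, struct

STREAM_PORT = 55004  # arstream2_client_stream_port sent at connection

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', STREAM_PORT))

while True:
    packet, _ = sock.recvfrom(2048)  # arstream2_max_packet_size is 1500
    if len(packet) < 12:
        continue  # too short to contain the fixed RTP header
    # Fixed 12-byte RTP header (big-endian): V/P/X/CC, M/PT, seq, timestamp, SSRC
    flags, m_pt, seq, timestamp, ssrc = struct.unpack('!BBHII', packet[:12])
    payload = packet[12:]  # H.264 payload, to be depacketized per RFC 6184
```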

P.S. I have to say that I really appreciate all your patience in answering me!! Thanks!!! :blush:

Best Regards,


@Nicolas @Djavan

Hi, I think that I’m starting to get the point, but I still have one last question…

Each time I receive a packet from the drone, it consists of a 12-byte header followed by the image data.

I found that the data in each RTP packet represents part of a frame, but how large is a whole frame? (I.e. how many packets should I combine in a buffer before decoding them as H.264?)

Thanks for the reply :grin:

P.S. Really sorry to bother you guys, but I’m really interested in this, and I really appreciate all your support!! :confounded:

Best Regards,



I’m no expert in RTP / H.264, but here are the reference documents about RTP and H.264 payloading.

Not the most readable documents, but since our stream protocol is compliant with these standards, you can get all your answers from them :wink:

Another way would be to search through the libARStream2 source code, to see how it is implemented by the SDK.
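
One concrete hint for your frame question: in RTP (RFC 3550) the header’s marker bit is set on the packet that ends an access unit, i.e. a frame, for H.264 payloads (RFC 6184). So a naive grouping, ignoring packet reordering and FU-A depayloading, could look like:

```python
def split_frames(packets):
    # Accumulate RTP payloads until a packet with the marker bit
    # (bit 0x80 of the second header byte) ends the access unit.
    frame = b''
    for packet in packets:
        frame += packet[12:]    # strip the fixed 12-byte RTP header
        if packet[1] & 0x80:    # marker bit: last packet of this frame
            yield frame
            frame = b''
```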



@Nicolas @Djavan Hi, sorry for the late reply!

Now I’m working on ARStream v2 with RTP/H.264.

I will report back if I make any progress!!

Thanks for all your support !! :slight_smile:

Best Regards,