Sumo-Drone Protocol: Sending a movement command stops video transfer


#1

Hi everybody,

I connect to the Sumo and send a video-enabling command. The drone then sends back video perfectly, at a frame rate of about 10 frames per second. But as soon as I send any other command (jump, move, etc.), the frame rate drops to almost zero (1 or 2 frames per second or less) - sometimes it even stops completely. The only way I can recover is by rebooting the drone.

Does anyone know, or have an idea, why the drone behaves that way?

cheers
Stefan


#2

Hi,

See @Djavan's answer here: Having fun with jumping sumo; some observations about jpg feed on developing program by following protocols doc

Regards,
Nicolas.


#3

OK, thanks for pointing me there, but I am not sure I understand what I have to do.

Let me summarize what I understood:

  • as long as no cmd has been sent, no metric is generated, which is why the video works fine
  • as soon as the first cmd is sent, the metric is generated, which slows down the video.

So my question is: what should I do?

  • does that mean that as soon as I send the first pcmd I need to keep on sending commands (like every 50 ms) to let the drone know that “I am still alive”?
  • If that is the case, which cmd should I send when I don’t want the drone to do anything (hence standing still) to keep the video alive and not disturb the metrics?

(I did read the whole thread, but I may have overlooked the actual solution.)

cheers and thanks for responding so fast, Nicolas.

Stefan


#4

hi stefan, yes, unless I am mistaken, you will need to send some form of movement command to the drone … simply speaking, not just any command will do (say, a tap or some such); it has to be a movement. But that movement can be 0, 0, so you aren’t necessarily forced to move …

one good thing, though, that I learned is that, if you followed the thread, the frequency is somewhat up to you, although there is a lower bound …


#5

but, stefan, there is a bunch of other tricky stuff in the protocol - which basically boils down to you needing to respond to acks and pings/pongs, I think; otherwise they are just resent …

good luck & let me know if I can help, otherwise …


#6

Good that you pointed out that I can send a movement like 0,0. That may do the trick (although it means that from the moment a movement starts, I need a thread continuously taking care of just that).

The strange thing, though, is why they did this, as there is actually a ping/pong which should be enough for the drone to know I am still alive. Anyway, I will try sending the move command more regularly. From what I have learned so far, I shouldn’t send it too often (lower bound) but also not too seldom (because of the metrics).
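To make that concrete, here is a minimal sketch of what I have in mind (in Java, since the library we build on is Java); `sendPcmdZero` is just a placeholder for whatever code actually writes the move(0,0) packet to the drone's UDP socket - it is not a real ARSDK call:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical keepalive sketch: sendPcmdZero stands in for whatever code
// actually builds and sends a move(0, 0) packet to the drone. The interval
// is a parameter because of the lower bound on send frequency discussed above.
class PcmdKeepalive {
    private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
    private final Runnable sendPcmdZero;

    PcmdKeepalive(Runnable sendPcmdZero) {
        this.sendPcmdZero = sendPcmdZero;
    }

    // Fire move(0, 0) at a fixed rate so the drone keeps streaming video.
    void start(long intervalMs) {
        timer.scheduleAtFixedRate(sendPcmdZero, 0, intervalMs, TimeUnit.MILLISECONDS);
    }

    void stop() {
        timer.shutdownNow();
    }
}
```

What the right interval actually is, I don't know yet; the thread only says there is a lower bound.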


#7

Fortunately I am not starting from scratch but from code that Alex Bischof has written, and he has already taken care of all that. I also tried to understand the arsdk source (which is really C-ish … and hard).

One thing I stumbled over is that there is a video ack command (13). Any idea what this is for?


#8

I agree regarding the ping/pong, and I don’t understand it either - it seems that should be sufficient. Or rather, even better, I think there ought to be a command that lets you set the video as constant, and set the rate, given the capabilities of the underlying device … right now, I think the frame rate is adjusted based on some conditions …


#9

give me a sec about the video ack, but a note about it appears in the arsdk protocols doc …

ok, it’s at the end, and it reads:


ARStream acks are deprecated and are no longer enabled on most products. The description here is only for compatibility with older Bebop Drone firmwares.

I spent a bunch of time reading and re-reading the protocols doc and I think it is pretty good, save for not having a good note about what you (& I, earlier) complained about. That’s not so bad over-all, other than that when you are developing something, it is natural to feel that you have written something incorrectly …


#10

I am amazed how fast you answer and I really appreciate that a lot :slight_smile: !

Can you point me to the protocol doc that YOU are referring to as the best? Is it this one: http://developer.parrot.com/docs/bebop/ARSDK_Protocols.pdf?

Cheers
Stefan

PS: btw, if you are curious (and others who are reading along) what it is for:

  1. Here is the library that Tobias and Alex have developed and I am trying to tweak for the video: https://github.com/Devoxx4KidsDE/drone-controller
  2. Here is a Play app that bridges it to allow calling it via HTTP: https://github.com/Devoxx4KidsDE/sumo4scratch
  3. And here is the workshop for kids to control the Sumo with Scratch: https://github.com/Devoxx4KidsDE/workshop-jumping-sumo-4-scratch

#11

hi stefan - yes, exactly, this is the only protocol document that I know about …

hey, that’s a nice program and I give you extra respect for working on a project for teaching kids. if I can help you with something quite specific, please let me know …

while I was working on my little “research” project, in a few cases I had to do some packet capture from the iphone app to track down exactly which packets I wanted to send, so perhaps I can help with that … in my case, I have a small mac mini and it was able to capture everything I needed …

but, it seems to me that you are on the right track now regarding the video - which is, essentially, to have a thread that sends move commands at some interval. in my case, I use a queue that I can push commands to. I pull from the queue at some interval, and if there is no movement on the queue, then I send move(0,0) …

anyway, I have some quite rough code that I intend to write up quite soon, perhaps even today, with all my findings. In my case, as I was saying elsewhere, I wanted to read from the video feed and make navigation decisions based on things on the floor, like colored tape and so on … but I wanted to build the app from the bottom up, as much as possible …

when I finally post everything I’m hoping that it will be quite easy to understand, and perhaps there’s something in there for you to use …


#12

What you could actually help me with is how to capture the traffic between the iphone (or android) and the drone. Only now, by your mention, am I aware that the device controlling the drone doesn’t need to be the one that wireshark is running on. However, I am a bit new to wireshark. Can you give me guidance on how to set it up? My setup is as follows:

  • Drone providing wifi network
  • Android (or iPad) running FreeFlight Jumping
  • Macbook running wireshark

How do I set up wireshark correctly to capture the packets? With the standard settings in the options, I see almost no packets. When I check the monitor checkbox and set the link layer to “802.11 plus radiotap header”, a lot comes in. Is that the right setting? Do you have display filters to ease working with wireshark?

The idea of capturing the video and interpreting the floor is very cool and would, I imagine, fit fantastically with what we do with the kids. I am always thinking of cool ideas the kids would have fun with.

cheers
Stefan


#13

hi, so, take a look at google for “wireshark mac wifi monitor mode” and you should be able to find the right way … it won’t be hard for you … it’s just that the mac wifi has to be able to go into monitor mode … none of my other computers’ networking cards would do it, except the mac’s …

after I captured the packets, I had to re-assemble them, since each packet is no larger than the MTU of the device, so that was a bit of a struggle …

then there was the question of how to read the packets and all that - which is in that protocols doc … and then, of course, filtering down to just what you are looking for … in my case, I was trying to figure out how to take a jpg snapshot - a control exists in the app, but I didn’t see the command documented elsewhere … so I had a bit of a clue about what I was looking for in particular, and by then I had spent quite some time with the packets and their formatting. anyway, I hope that the above helps …
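for what it's worth, the frame layout I worked from is the one in the protocols doc: each frame starts with a 1-byte data type, a 1-byte buffer id, a 1-byte sequence number, and a 4-byte little-endian total size (header included). a rough Java sketch of parsing one captured datagram - the class and field names are mine, not from any SDK:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of the ARNetworkAL frame header described in the protocols doc:
// type (u8), buffer id (u8), sequence number (u8), total size (u32,
// little-endian, including the 7-byte header), then the payload.
class ArFrame {
    final int type, bufferId, seq, totalSize;
    final byte[] payload;

    ArFrame(int type, int bufferId, int seq, int totalSize, byte[] payload) {
        this.type = type; this.bufferId = bufferId; this.seq = seq;
        this.totalSize = totalSize; this.payload = payload;
    }

    static ArFrame parse(byte[] datagram) {
        ByteBuffer buf = ByteBuffer.wrap(datagram).order(ByteOrder.LITTLE_ENDIAN);
        int type = buf.get() & 0xFF;
        int id = buf.get() & 0xFF;
        int seq = buf.get() & 0xFF;
        int size = buf.getInt();              // total size, header included
        byte[] payload = new byte[size - 7];  // 7 = header bytes
        buf.get(payload);
        return new ArFrame(type, id, seq, size, payload);
    }
}
```

note that a single UDP datagram can hold several such frames back to back, so in practice you loop until the buffer is exhausted.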

good luck …


#14

I was finally able to capture it, and yes, it needs to be in monitor mode and I set it to 802.11. Still, as you say, it is a real hassle to filter out the relevant packets and put them back together. On a normal TCP/IP layer wireshark seems to reassemble the segments automatically, but on 802.11, at least, I did not get that to work.

Anyway, I will now focus on automatically sending PCMD(0,0) to the drone as soon as the first PCMD has been sent, to keep the drone busy. I will let you know if that finally did the trick.

Thanks for all your help so far. I do appreciate it.


#15

hi stefan - for the sake of correctness, btw, those packets are UDP rather than TCP …


#16

Here is something I learned so far:

  • It is true that sending a command regularly (I use 100 ms) keeps the drone sending video images
  • However, this makes sending commands in general much more complicated, because
    ** A move of 1 second needs to be split into 10 single commands, one every 100 ms
    ** If a ping is received from the drone, it must still be handled in the meantime
    ** If you don’t send any command for some time, you need to fill the void with a command that keeps the drone thinking you are still alive, for which I use, as discussed above, a move(0 speed, 0 turn)
    ** With all this you still have to make sure that the sequence numbers stay right!
    ** You have to be careful that move(0,0) commands do not interfere with a command that is currently making the drone move, because move(0,0) suspends all movement instantly.

So imagine I want the drone to move for 2 seconds: I create/split it into 20 move commands, one every 100 milliseconds. I push these to a queue to be processed. The main thread that sends commands to the drone processes the queue. Only when the queue is empty does it start artificially sending the move(0,0) command. I can confirm that this works.
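Assuming the 100 ms interval mentioned above, the splitting can be sketched like this (simplified, and all names are illustrative, not the actual drone-controller code):

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Sketch of the splitting approach: a move of duration d becomes d / 100 ms
// identical PCMDs on a queue; a sender thread drains one entry per tick and,
// when the queue is empty, falls back to move(0, 0) as a keepalive filler.
class MoveQueue {
    static final int TICK_MS = 100;
    private final Queue<int[]> queue = new ArrayDeque<>(); // {speed, turn}

    // Enqueue one PCMD per tick for the requested duration.
    void move(int speed, int turn, int durationMs) {
        for (int i = 0; i < durationMs / TICK_MS; i++) {
            queue.add(new int[] { speed, turn });
        }
    }

    // Called by the sender thread once per tick: real command, or filler.
    int[] nextCommand() {
        int[] cmd = queue.poll();
        return cmd != null ? cmd : new int[] { 0, 0 };
    }
}
```

The filler only ever kicks in on an empty queue, which is exactly the "when the queue is empty and only then" rule above, so it cannot cut a running move short.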

However, I must not do that with the jump command. Say I want the drone to jump and then wait in my program for some time (because the drone needs to jump and land): I actually have to send only one command and let the processing thread fill the void with move(0,0). Outside my main thread (the part of the program where the drone receives commands from a user or from a sequence of actions), I need to wait “until the drone has landed” and only then push the next commands. I ran into this issue because I automatically split the jump command, which was intended to wait for 3 seconds, into 30 commands, which made the drone jump 30 times (arggghhh)! So different commands need to be treated differently. I have not yet implemented that part, but for everyone running into the same issues, I will keep posting my results.


#17

Hi,

In libARController, we have an internal thread which sends PCMD from a cache at a fixed rate, and an API (setPilotingPCMD()) to set values in this cache. With this kind of setup, to do a 2-second forward move, your main thread only has to run the following pseudocode:

void front_2_sec()
{
    setPilotingPCMD(front, 0);
    sleep(2);
    setPilotingPCMD(0, 0);
}

By doing this, you also ensure that all commands are sent from a single thread. This avoids both sequence number issues and “end of commands queue detection”.

Please also note that the PCMD command is the only command affecting the video stream, and thus the only command that should be sent periodically. There is no need to repeat other commands (such as the jump command, as you figured ;))

Regards,
Nicolas.


#18

Thanks, Nicolas, for jumping in and bringing more light into this:

That clarifies a lot. Currently we are actually using two threads (I am not yet aware why). I need to find out (the guys who did it aren’t currently within reach…).

  • What do you mean by “sends PCMD from a cache at a fixed rate”?

  • Does that mean you give the thread a command that it should repeat for 2 seconds, and it repeats it for that time?

  • What delay are you using?

  • So when setting 0,0, does the cache in the background continuously send PCMD(0,0), or do you stop periodically sending PCMDs after setting it to 0 once? (That would be a cool trick.)

  • Or are you sending PCMDs all of the time, either PCMD(x,y) or PCMD(0,0)? But what do you do then if it is currently sending PCMD(0,0)s and you want to jump? Send one jump command and then PCMD(0,0) again?

In the samples I actually see many different “setPilotingPCMD*” methods, like setPilotingPCMDFlag, probably all writing into the cache. After answering the above, and if you think it would help me, could you point me to the implementation file of libARController in the ARSDK that does the sending out of the cache?

It would be great if you could clarify even more :slight_smile:
Stefan


#19

omg, I finally understood the whole trick. :grinning:

I only understood after I read your code. The whole thing is much easier than I thought. I had thought I needed to keep sending PCMD(0,0) until the next command actually comes in. The only mistake I made was not sending PCMD(0,0) ONCE at the end.

That does the trick. So to sum up:

  • PCMDs have to be repeated for as long as they should have an effect
  • At the end, a series of PCMDs needs to be “closed” with a PCMD(0,0) to tell the device that it did not miss a move command. Finally it makes sense to me why it was implemented that way.

Can you still tell me the delay time you are using?


#20

hi stefan, I think nicolas will answer better, but what I think he’s doing, or what makes sense to me, is that from a “piloting thread” he writes what the pcmd should be to some location - let’s call it “the move store” - and then a loop in a “movement thread” reads from that move store at some fixed interval and, at least for as long as you want the video feed, sends the move commands it reads from the move store.

in the case above, he writes “move to the front, some value” to the store and then goes to sleep for 2 seconds - in his piloting thread. Behind the scenes, so to speak, his movement thread is reading from the store and writing packets, perhaps once every 100 ms or what have you. When his piloting thread wakes up and writes 0,0 to the store - for standing still - the movement thread just keeps going in its fashion, writing whatever it reads from the store. In this case it reads and then writes 0,0, and the drone stands still. Again, this loop never stops and, at least insofar as the computer controlling the actions is consistent, those packets are consistent.
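that move-store pattern could be sketched roughly like this - class and method names are mine, not the real libARController API, and `send` stands in for whatever code builds and writes the actual UDP packet:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

// Sketch of the cache-plus-sender-thread pattern: the piloting thread only
// writes the latest values into the "move store"; a single dedicated thread
// transmits whatever the store holds at a fixed rate. Having one sender
// thread also keeps the sequence numbers consistent.
class PcmdStore {
    private volatile int speed = 0;
    private volatile int turn = 0;
    private final ScheduledExecutorService sender = Executors.newSingleThreadScheduledExecutor();

    // Piloting thread: just update the store, no packet I/O here.
    void setPilotingPCMD(int speed, int turn) {
        this.speed = speed;
        this.turn = turn;
    }

    int[] current() { return new int[] { speed, turn }; }

    // Sender thread: read the store and transmit at a fixed interval, forever.
    void startSending(Consumer<int[]> send, long intervalMs) {
        sender.scheduleAtFixedRate(() -> send.accept(current()), 0, intervalMs, TimeUnit.MILLISECONDS);
    }

    void stopSending() { sender.shutdownNow(); }
}
```

with this, the front_2_sec() pseudocode above reduces to two store writes and a sleep, while the sender thread keeps the video alive in between …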

now, in terms of your message queue, I’d imagine that you might have several threads writing to it, responding to things it reads, such as acks or pings and so on … you might make it a deque or some form of priority queue if you absolutely had to elevate a response, as perhaps you would want to do with a ping/pong.

as I wrote elsewhere in this thread, I think it would help if the video feed could be made fixed, and perhaps put on a separate socket. Perhaps as we begin to look at higher-class items from parrot, such as the bebop, we might find more robustness and the ability to do this … As another example, I wish I could change the angle of the camera, to point it more downward - for something I am doing … But overall, I think the jumping sumo is exactly at the price point and sophistication level that we wanted …