Can downloading the Mambo's pictures remotely be quick?


Note: “file structure” probably has a precise technical meaning; I’m using it loosely to describe how a JPEG might be organized differently from a PNG file.

I’m working on my capstone and decided to use amymcgovern’s pymambo package from GitHub, and I realized that picture download hasn’t been implemented yet. I was thinking of taking a stab at it myself, but my time is precious: I have about 10 weeks until the semester is over, and I wanted opinions on whether transferring pictures over Bluetooth can realistically be quick.

Part of my capstone has my Mambo divide up the space in front of it into a grid, move based on offsets to accurately traverse this grid, and take a picture below it at each stop to look for a tag. Each time a picture is taken, it is transferred over to my computer, which scans it for a tag and then sends commands to the Mambo based on the result. However, due to the low battery life, I figure I should take into account how long it would take to send over the pictures that the Mambo takes.
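To make the loop concrete, here’s a rough sketch of that grid scan written against pyparrot’s API (which the replies below recommend over pymambo); `fetch_and_scan_for_tag` is a hypothetical placeholder for the transfer-and-check step this whole question is about, and the movement powers/durations are made-up values you’d have to tune:

```python
from pyparrot.Minidrone import Mambo  # older pyparrot versions: from pyparrot.Mambo import Mambo

def fetch_and_scan_for_tag(mambo):
    """Hypothetical: pull the latest picture to the laptop and scan for a tag."""
    return False  # not implemented yet -- the subject of this thread

mambo = Mambo("e0:14:d0:63:3d:d0", use_wifi=False)  # example BLE address
if mambo.connect(num_retries=3):
    mambo.safe_takeoff(5)
    for row in range(3):                  # 3x3 grid = 9 spaces
        for col in range(3):
            mambo.take_picture()          # snapshot is stored on the Mambo
            mambo.smart_sleep(2)
            if fetch_and_scan_for_tag(mambo):
                print("tag found at", row, col)
            if col < 2:
                # move one cell along the row; reverse direction on odd rows
                pitch = 30 if row % 2 == 0 else -30
                mambo.fly_direct(roll=0, pitch=pitch, yaw=0,
                                 vertical_movement=0, duration=1)
        if row < 2:
            # shift sideways to the next row (serpentine pattern)
            mambo.fly_direct(roll=30, pitch=0, yaw=0,
                             vertical_movement=0, duration=1)
    mambo.safe_land(5)
    mambo.disconnect()
```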

All I have seen says you can only send strings over the Bluetooth connection between the Mambo and whatever is connected to it, so this makes me think I have to send the pictures over in a string format.
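For what it’s worth, if strings really are all the link carries, the standard trick is to base64-encode the raw image bytes into an ASCII string and decode them on the other end. This is plain Python, independent of pymambo, and note that base64 inflates the payload by about 33%, which matters when transfer speed is the whole question:

```python
import base64

# sender side: raw image bytes -> ASCII-safe string
with open("picture.jpg", "rb") as f:
    payload = base64.b64encode(f.read()).decode("ascii")

# receiver side (the computer): string -> the original bytes
with open("received.jpg", "wb") as f:
    f.write(base64.b64decode(payload))
```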

I’m not sure I can even assume that the picture file on the Mambo is the same file type as the file you end up with when using the smartphone Parrot controller app. (So if you end up with a JPEG, I don’t know whether the data on the Mambo is technically more raw, also a JPEG, or in some unconventional structure that just works for what is needed.)

I was thinking that maybe I could convert it to grayscale on the Mambo, then send it over in a simpler format, as binary or hexadecimal, and convert it back when it arrives on my computer. I don’t know, I’m just trying to get creative.
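In case it’s useful, you can at least measure what grayscale would buy you on the computer side with Pillow (doing it *on* the Mambo would need a native binary running on the drone). A quick experiment, assuming a sample picture.jpg on disk:

```python
import os
from PIL import Image  # pip install Pillow

img = Image.open("picture.jpg")
img.convert("L").save("picture_gray.jpg")  # "L" = 8-bit grayscale

print("color:    ", os.path.getsize("picture.jpg"), "bytes")
print("grayscale:", os.path.getsize("picture_gray.jpg"), "bytes")
```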

Please give me any input you can, even opinions. If you give me a lead, I can chase it! The more complicated the solution, the more substantial my capstone may be, haha. I also understand I will probably have to add more commands or files to the package that Amy McGovern provided on her GitHub.


1. Do you know what kind of file structure is used to store images on the Mambo?
2. Do you happen to know how to access this file?
3. Is it possible to process the image so it loses detail and becomes easier to send (like converting it to grayscale)?
4. Even if I make it lose more detail, do you think the transfer time from the Mambo to my computer will be too long to do a lot of checks before it dies? I imagine my drone will have to scan about 9 spaces max (a rough battery-guard idea follows this list).
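For question 4, one rough guard I’m considering (assuming the library exposes battery data; in pyparrot it’s reported in `mambo.sensors.battery`): stop scanning and land once the battery gets low, so the drone never dies mid-transfer:

```python
# example battery guard inside the scan loop; the 20% threshold is arbitrary
if mambo.sensors.battery < 20:
    print("battery low, landing before the next transfer")
    mambo.safe_land(5)
```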



1. The file stored on the Mambo is exactly the one you get on your phone. So, yes, JPEG.
2. File transfer over Bluetooth is described here: Picture download on mambo. As said in that thread, the file transfer protocol we made over BLE is quite messy. You might find some libraries that have already implemented it in Python.
3. I don’t know; you could try to have a binary on the Mambo that converts pictures to grayscale images.
4. Transfer time on a smartphone is around 20 seconds; it should be the same on a computer.


Can we use a USB-to-LAN connection and telnet into the Mambo to transfer pictures?


You really need to be using pyparrot. I have posted that pymambo is no longer maintained.


Also, I still haven’t implemented the picture transfer because it is quite slow over BLE. The protocol isn’t well documented either, so I was unable to get it to work at all (despite that thread), and once I had the wifi camera working it was no longer a priority, so I stopped looking at it. There is no good way to use photos in flight if they take 20 seconds to transfer.

@Djavan can we transfer the photos over wifi instead? If so, I’ll go back to figuring out the protocol…


To tell you the truth, I’m really okay with it taking 20 seconds to transfer; with about 9 spaces to scan, that’s roughly 3 minutes of transfer time total. I can just note in my capstone that in a real deployment of my project, you would have drones with better hardware and better software.

pymambo has been working out so far; is there a huge difference with pyparrot? I’ll be sure to use it, but since I have already been using your pymambo, I just want to know whether I should expect to run into things that worked with my pymambo code but not with pyparrot.

Also, if I’m understanding the implications correctly, are you Amy McGovern? Thank you for sharing your pymambo/pyparrot code! I was struggling for so long trying to use the Parrot developer tutorial because it was inconsistent! You’ve made my capstone easier, and I have been able to get to the more important part of my capstone (getting 2 drones to work together). And so far, things look kind of promising!


Yes, I am Amy McGovern. I am a computer science professor and I am glad to hear that it is helping your capstone!

The difference is that I’ve been fixing bugs in pyparrot and not in pymambo. pyparrot also supports wifi, not just BLE. If you have a Mambo with the camera, you get a lot more information from the sensors over wifi! That was Parrot’s choice in terms of how much data they send, and it is based on how slow BLE is.
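For reference, switching the connection type in pyparrot is mostly a one-argument change; a minimal wifi connection sketch (the import path depends on the pyparrot version):

```python
from pyparrot.Minidrone import Mambo  # older versions: from pyparrot.Mambo import Mambo

# use_wifi=True talks to the camera Mambo over wifi instead of BLE;
# the BLE address is only needed for Bluetooth, so it can be empty here
mambo = Mambo("", use_wifi=True)
if mambo.connect(num_retries=3):
    mambo.ask_for_state_update()   # request the richer wifi sensor data
    print("battery:", mambo.sensors.battery)
    mambo.disconnect()
```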


Okay I’ll be sure to use it!

By the way I have had this idea and maybe you would be interested in it!

So, from what I know, the Mambo and many other drones use ultrasound to detect when they’re too close to a surface just below them, or even to know when they’ve left the ground.

If you can access the ultrasound readings, then you can make the Mambo do a flip and use the ultrasound to detect an obstacle in front of the drone while the bottom is facing the direction it will be looking at once it has finished its flip! I have seen how fast the drone flips, so maybe it’s not possible, but there should be a spike in its readings if it is next to a wall.
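Both pieces are exposed in pyparrot (`flip()` and the altitude value in `mambo.sensors`), so the experiment is easy to sketch, even if the sample rate turns out to be the catch (see the reply below):

```python
# hypothetical flip-and-listen experiment; assumes a connected, flying Mambo
readings = []
mambo.flip(direction="front")       # issue the flip, then poll while it happens
for _ in range(6):
    mambo.smart_sleep(0.5)          # ~2 Hz polling, matching the sensor rate
    readings.append(mambo.sensors.altitude)

# a sudden jump between consecutive readings *might* indicate a nearby wall
deltas = [after - before for before, after in zip(readings, readings[1:])]
print(deltas)
```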


Although we get the altitude readings from the Mambo, they are only at 2 Hz, so the frequency isn’t high enough; the flip itself is over in about a second, so you’d catch at most one or two readings mid-flip. I’ve been hoping @Djavan will get the sensor rate up to at least 5 Hz, as I know it is calculated internally at higher than 2 Hz.