Streaming video errors on Raspberry Pi

I'm having an issue streaming video on a Raspberry Pi 3B+. I get errors when calling yuv_frame.info() inside the YUV callback:

Traceback (most recent call last):
  File "_ctypes/callbacks.c", line 234, in 'calling callback function'
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 763, in <lambda>
    lambda *args: self._video_sink_queue_event(*args),
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 837, in _video_sink_queue_event
    while self._process_stream(id):
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 873, in _process_stream
    if not self._process_stream_buffer(id, video_frame):
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 909, in _process_stream_buffer
    cb(video_frame)
  File "ground-control.py", line 87, in yuv_frame_cb
    info = yuv_frame.info()
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 261, in info
    frame = self._get_pdraw_video_frame()
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 156, in _get_pdraw_video_frame
    ctypes.byref(self._pdraw_video_frame))
ctypes.ArgumentError: argument 4: <class 'TypeError'>: expected LP_c_ulong instance instead of LP_c_ulonglong

Any ideas?

This error occurs with ANAFI software version 1.6.0.
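In case it helps, the error itself looks like plain ctypes type checking: the generated prototype expects a pointer to one integer width while the caller supplies a pointer to another. Here is a minimal, Olympe-free sketch of the same mismatch. Note that on 32-bit Raspbian c_ulong (4 bytes) and c_ulonglong (8 bytes) are distinct classes, while on 64-bit x86 they are the same width, so this sketch forces distinct widths with c_uint vs c_ulonglong to misbehave the same way everywhere:

```python
import ctypes

# A prototype whose argument is declared as a pointer to a 32-bit unsigned
# integer, standing in for the generated binding's declaration.
proto = ctypes.CFUNCTYPE(None, ctypes.POINTER(ctypes.c_uint))
fn = proto(lambda p: None)

fn(ctypes.pointer(ctypes.c_uint(0)))  # matching pointer width: fine

try:
    fn(ctypes.pointer(ctypes.c_ulonglong(0)))  # wrong width: rejected
except ctypes.ArgumentError as err:
    # Same "expected ... instance instead of ..." ArgumentError as above
    print(err)
```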

Hi,

Disclaimer:
RPis are not officially supported, and I don't have an RPi3 on hand, so I won't be able to reproduce this issue right now. That being said, it shouldn't be too difficult, and we will eventually support it.

It seems that your Python binding of libpdraw has some type inconsistencies. The Python bindings for Olympe's C dependencies are generated during the build step in out/olympe-linux/final/usr/lib/python/site-packages/olympe_deps.py. Can you share this file with us? Thanks
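In case it helps you spot the inconsistency yourself, here is a hypothetical little helper (the function name and the sample text are made up for illustration) that lists the lines mentioning c_ulong / c_ulonglong declarations; you can point it at the generated olympe_deps.py:

```python
def find_long_decls(text):
    """Return (line_number, line) pairs mentioning c_ulong or c_ulonglong."""
    return [(n, line) for n, line in enumerate(text.splitlines(), 1)
            if "c_ulong" in line]

# Illustration on a fabricated two-declaration excerpt (not the real file):
sample = ("foo.argtypes = [POINTER(c_ulong)]\n"
          "bar = 1\n"
          "baz.argtypes = [POINTER(c_ulonglong)]")
print(find_long_decls(sample))

# Against the generated bindings, e.g.:
# with open("out/olympe-linux/final/usr/lib/python/site-packages/olympe_deps.py") as f:
#     print(find_long_decls(f.read()))
```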

Have you followed the official installation procedure on the RPi3? (I wouldn't bet on a cross-compiled version at this point.)

Edit: I guess that you've followed the procedure described by @oupscamille in the "Olympe 1.01 on Raspberry pi zero and pi3 A+" topic.

Nicolas

1 Like

Hey. So I am able to use the following code (Streaming.py with some changes for a physical drone connection) on a Linux laptop:

import time
import csv
import cv2
import math
import os
import shlex
import subprocess
import tempfile
import olympe
import threading
from concurrent.futures import ThreadPoolExecutor
import olympe_deps as od
from olympe.messages.ardrone3.Piloting import TakeOff, Landing, PCMD
from olympe.messages.ardrone3.Piloting import moveBy
from olympe.messages.ardrone3.PilotingState import FlyingStateChanged
from olympe.messages.ardrone3.PilotingSettings import MaxTilt
from olympe.messages.ardrone3.GPSSettingsState import GPSFixStateChanged
from olympe.messages.ardrone3.GPSState import NumberOfSatelliteChanged
from olympe.messages.ardrone3.GPSSettingsState import HomeChanged
from olympe.messages.ardrone3.PilotingState import PositionChanged
from olympe.messages.ardrone3.Piloting import Emergency
from olympe.messages.ardrone3.Piloting import NavigateHome
from olympe.messages.rth import return_to_home


class ParrotDrone:

    def __init__(self):
        # Create the olympe.Drone object from its IP address
        self.drone = olympe.Drone("192.168.53.1", mpp=True, drone_type=od.ARSDK_DEVICE_TYPE_ANAFI4K, loglevel=2)
        self.tempd = tempfile.mkdtemp(prefix="olympe_streaming_test_")
        print("Olympe streaming example output dir: {}".format(self.tempd))
        self.h264_frame_stats = []
        self.h264_stats_file = open(
            os.path.join(self.tempd, 'h264_stats.csv'), 'w+')
        self.h264_stats_writer = csv.DictWriter(
            self.h264_stats_file, ['fps', 'bitrate'])
        self.h264_stats_writer.writeheader()

    def start(self):
        # Connect to the drone
        self.drone.connection()
        self.drone.set_streaming_output_files(
            h264_data_file=os.path.join(self.tempd, 'h264_data.264'),
            h264_meta_file=os.path.join(self.tempd, 'h264_metadata.json'),
            # Here, we don't record the (huge) raw YUV video stream
            # raw_data_file=os.path.join(self.tempd,'raw_data.bin'),
            # raw_meta_file=os.path.join(self.tempd,'raw_metadata.json'),
        )

        # Setup your callback functions to do some live video processing
        self.drone.set_streaming_callbacks(
            raw_cb=self.yuv_frame_cb,
            h264_cb=self.h264_frame_cb
        )
        # Start video streaming
        self.drone.start_video_streaming()

    def yuv_frame_cb(self, yuv_frame):
        """
        This function will be called by Olympe for each decoded YUV frame.
            :type yuv_frame: olympe.VideoFrame
        """
        # the VideoFrame.info() dictionary contains some useful information
        # such as the video resolution
        info = yuv_frame.info()
        height, width = info["yuv"]["height"], info["yuv"]["width"]

        # convert pdraw YUV flag to OpenCV YUV flag
        cv2_cvt_color_flag = {
            olympe.PDRAW_YUV_FORMAT_I420: cv2.COLOR_YUV2BGR_I420,
            olympe.PDRAW_YUV_FORMAT_NV12: cv2.COLOR_YUV2BGR_NV12,
        }[info["yuv"]["format"]]

        # yuv_frame.as_ndarray() is a 2D numpy array with the proper "shape"
        # i.e. (3 * height / 2, width) because it's a YUV I420 or NV12 frame

        # Use OpenCV to convert the yuv frame to RGB
        cv2frame = cv2.cvtColor(yuv_frame.as_ndarray(), cv2_cvt_color_flag)

        # Use OpenCV to show this frame
        cv2.imshow("Olympe Streaming Example", cv2frame)
        cv2.waitKey(1)  # please OpenCV for 1 ms...

    def h264_frame_cb(self, h264_frame):
        """
        This function will be called by Olympe for each new h264 frame.
            :type h264_frame: olympe.VideoFrame
        """

        # Get a ctypes pointer and size for this h264 frame
        frame_pointer, frame_size = h264_frame.as_ctypes_pointer()

        # For this example we will just compute some basic video stream stats
        # (bitrate and FPS), but we could choose to resend the stream over
        # another interface or decode it with our preferred hardware decoder.

        # Compute some stats and dump them in a csv file
        info = h264_frame.info()
        frame_ts = info["ntp_raw_timestamp"]
        if not bool(info["h264"]["is_sync"]):
            if len(self.h264_frame_stats) > 0:
                while True:
                    start_ts, _ = self.h264_frame_stats[0]
                    if (start_ts + 1e6) < frame_ts:
                        self.h264_frame_stats.pop(0)
                    else:
                        break
            self.h264_frame_stats.append((frame_ts, frame_size))
            h264_fps = len(self.h264_frame_stats)
            h264_bitrate = (
                8 * sum(map(lambda t: t[1], self.h264_frame_stats)))
            self.h264_stats_writer.writerow(
                {'fps': h264_fps, 'bitrate': h264_bitrate})

    def stop(self):
        # Properly stop the video stream and disconnect
        self.drone.disconnection()

    def getgpsfix(self):
        print("\n\n GET GPS FUNC CALLED \n\n")
        i = 0

        while True:
            print("\n\n THE LOOP IS STARTING \n\n")
            print("Is GPS Fixed:")
            print("Latitude:", self.drone.get_state(PositionChanged)["latitude"])
            print("Longitude:", self.drone.get_state(PositionChanged)["longitude"])
            print("Altitude:", self.drone.get_state(PositionChanged)["altitude"])
            print("GPS Fix:", self.drone.get_state(GPSFixStateChanged)["fixed"])
            print("Current Drone Mode:", self.drone.get_state(FlyingStateChanged)["state"])
            time.sleep(3)
            i += 1
            print("\n\n Loop Number {} \n\n".format(i))
            print("\n\n THE LOOP IS ENDING \n\n")

    def gpslock(self):
        while True:
            gpslock = self.drone.get_state(GPSFixStateChanged)["fixed"]
            if gpslock == 0:
                print("\n\n DRONE DROP INITIALISING \n\n")
                # self.drone(Emergency(_timeout=5))
                print("\n\n DRONE DROPPED \n\n")

            else:
                print("\n\n GeoFence Pass \n\n")


if __name__ == "__main__":
    streaming_example = ParrotDrone()
    # creating thread
    startdrone = threading.Thread(target=streaming_example.start)
    print("TASK 2 STARTED \n\n\n")
    gps = threading.Thread(target=streaming_example.getgpsfix)
    print("TASK 2 OVER \n\n\n")

# ------------------- Main Function Execution ------------------------------------------

    print("MAIN :: Drone Connection Starting \n\n\n")
    startdrone.start()
    print("MAIN :: Drone Connection Successful \n\n\n")
    time.sleep(5)
    print("MAIN :: GPS Starting  \n\n\n")
    gps.start()
    # streaming_example.gpslock()

    while True:
        time.sleep(0.1)

But I am not able to run the same code on the Raspberry Pi 3B+; I get the following:

W pdraw_dmxstrm: failed to get an input buffer (-11)
W pdraw_dmxstrm: failed to get an input buffer (-11)
I pdraw_decavc: frame input: flush pending, discard frame
I pdraw_decavc: frame input: flush pending, discard frame
[h264 @ 0x6a247540] Reinit context to 1280x720, pix_fmt: yuv420p
W pdraw_dmxstrm: failed to get an input buffer (-11)
I pdraw_decavc: frame input: flush pending, discard frame
[h264 @ 0x6a247540] Reinit context to 1280x720, pix_fmt: yuv420p
W pdraw_dmxstrm: failed to get an input buffer (-11)
I pdraw_decavc: frame input: flush pending, discard frame
[h264 @ 0x6a247540] Reinit context to 1280x720, pix_fmt: yuv420p

I have connected the RPi 3B+ to the SkyController 3 via its USB-C port. My Olympe is running on the RPi.

I have built Olympe on the RPi 3B+ using this thread.

Also, @ndessart, if you have any suggestions or a code snippet to get the Parrot Anafi (physical drone) video feed on a Raspberry Pi, please share them with me.

@eubmabe @ocrave can you guys also suggest me something, please?
@eubmabe you stated here that you were able to take photos; were you able to get the Parrot Anafi live camera feed with Olympe running on an RPi 3B+?

Hi cPabz,

The Raspberry Pi is not supported by GroundSDK/Olympe. There are actually already some known issues with the RPi*.

I don't have much experience with the RPi, but at this point I'd recommend compiling Olympe natively (no cross-compiling) on the RPi running Raspbian Buster. I wouldn't recommend any other distribution.

That being said, I don’t see anything fundamentally wrong about your script.

*NOTE: Olympe currently does not correctly send commands containing float parameters.

I've filed a bug report on Python ctypes, which doesn't use the libffi API correctly. ctypes should be calling ffi_prep_cif_var for variadic functions instead of ffi_prep_cif. But since ctypes doesn't have any way to know it's dealing with a variadic function, it just always calls ffi_prep_cif for every function.

This has absolutely no consequence on x86-64. Conversely, on ARM it has dramatic consequences, since the ABI of variadic functions (with float parameters) differs from that of normal functions. So when the arsdk_cmd_enc variadic function is called through ctypes, its arguments end up in the wrong registers. That explains why gimbal.set_target (which takes float parameters) doesn't work on the Raspberry Pi.

The next release of Olympe will work around this issue: we will simply stop calling variadic functions through ctypes.
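To make the ABI point concrete with plain ctypes (nothing Olympe-specific): libc's printf is variadic, yet ctypes prepares it like a regular call. On x86-64 the call below happens to behave correctly; on ARM the float argument can end up in the wrong register, which is exactly the failure mode described above. A sketch, assuming a glibc-based Linux for the fallback library name:

```python
import ctypes
import ctypes.util

# Load the C library (the fallback name assumes a glibc-based Linux).
libc = ctypes.CDLL(ctypes.util.find_library("c") or "libc.so.6")

# printf is variadic; C's default argument promotion turns float into
# double, so we must pass c_double. ctypes still sets the call up with
# ffi_prep_cif rather than ffi_prep_cif_var, which only coincidentally
# matches the variadic ABI on x86-64.
n = libc.printf(b"value: %f\n", ctypes.c_double(1.5))
```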

Thanks for the quick reply! Okay, I have built Olympe on the RPi natively using this thread.

This is the output of the following command on the RPi: pi@raspberrypi:~ $ cat /etc/os-release

PRETTY_NAME="Raspbian GNU/Linux 10 (buster)"
NAME="Raspbian GNU/Linux"
VERSION_ID="10"
VERSION="10 (buster)"
VERSION_CODENAME=buster
ID=raspbian
ID_LIKE=debian
HOME_URL="http://www.raspbian.org/"
SUPPORT_URL="http://www.raspbian.org/RaspbianForums"
BUG_REPORT_URL="http://www.raspbian.org/RaspbianBugs"

Could you also suggest a single-board computer (similar to the RPi) that can be used with the SkyController 3? From your post above, can I assume that if I have a board that is not ARM-based, running Ubuntu 18.04 or Debian 9+, I can build Olympe natively without any complications or issues?

Also, when are you guys planning to publish the next release? I am just very curious!!
And sorry to bother you guys so much! Keep up the good work!!

Next week, if all goes according to plan.

That's really cool! Can't wait to test it out! Once again, thank you and good job; you guys are really fast and helpful!! Keep us posted about the update… Thank you!!

When I said that the Raspberry Pi is not (officially) supported by GroundSDK/Olympe, I meant that we haven't been testing it extensively. I currently have an RPi4 on my desk (at home, yes!) that I use now and then, but what we need is a proper CI pipeline running on the RPi.

We already know about one showstopper issue encountered by the community on the RPi: commands that contain floating-point arguments (see my message above). This will hopefully be resolved in the next GSDK release. This bug impacts every ARM target (including the RPi). Since most (all?) SBCs are based on ARM CPUs, you'll have the same kind of problem with other boards.
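As a quick sanity check, you can ask Python which architecture it is running on, and hence whether the variadic-call bug above applies to your board:

```python
import platform

# Typically 'armv7l' (32-bit Raspbian) or 'aarch64' on an RPi, versus
# 'x86_64' on a desktop/laptop where the variadic-call bug is not visible.
print(platform.machine())
```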

Beyond that, to properly support the RPi I think we would at least need to leverage the RPi's H.264 hardware decoder through libmmal. libpdraw, which Olympe uses for video streaming, should already support the RPi hardware decoder via libmmal if you build GroundSDK natively on Raspbian (as you did).

To sum up, everything needed to support the RPi with Olympe should be in place for the next 1.2.0 release, but since we don't actively test the SDK on the RPi, I'm still expecting some minor bugs in features I haven't tested yet (like video streaming on the RPi).

Once we have a properly working SDK on the RPi, backed by a CI pipeline, I think we will "officially support" the RPi (I have no timeframe for the moment). In the meantime, I'd really appreciate it if you continued sharing your experiments here. Whenever I can and have some time (not today or tomorrow, sorry :wink: ), I'll try to reproduce your issue(s).

Thank you

1 Like

Hey guys, any news about the update?

Still on track :slight_smile: We are generating the docs as we speak, all from remote places!

As far as Olympe is concerned, 1.2.0 is live!

1 Like

This is a very interesting track. I love working with my Pi with my 3D printer and engraver (I have some cool workflows for them).

Frankly, having GroundSDK (I know we're talking Olympe here) available on it opens up a ton of possibilities, from WiFi extensions to custom controllers with native, or other, UIs, etc.

Now I just need to find a free 3 months to take something like that on!

1 Like

Hey! I had a different issue this time: the streaming.py file runs on the RPi, and a new window for the video feed pops up, but I get this error.

2020-04-21 02:21:20,215 [ERROR] 	olympe.pdraw.ANAFI-G045901 - _process_stream - _process_stream_buffer exception:
Traceback (most recent call last):
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 1008, in _process_stream
    self._process_stream_buffer(id_, video_frame)
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 1028, in _process_stream_buffer
    vmeta_type, vmeta = video_frame.vmeta()
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 312, in vmeta
    frame = self._get_pdraw_video_frame()
  File "/home/pi/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 182, in _get_pdraw_video_frame
    ctypes.byref(self._pdraw_video_frame))
ctypes.ArgumentError: argument 4: <class 'TypeError'>: expected LP_c_ulong instance instead of LP_c_ulonglong

Any idea why this issue occurs, or is there a workaround? I am sorry I keep troubling you all.
@ndessart @Jerome
Thank you in advance!!

I compiled Olympe natively on an RPi 4B running Raspbian Buster. I'm able to control the drone's gimbal, take off, move, and take photos, but when I try to stream video, I get the same error! Has anyone managed to get it working?
<class 'TypeError'>: expected LP_c_ulong instance instead of LP_c_ulonglong

I have reached the same blocker; I am able to complete all the same tasks you mention, but when I try to stream video, I encounter the same pdraw error that @cPabz experiences.
I am running the software on a Raspberry Pi 4 (8GB) with Raspbian Buster.
Using Olympe's video streaming example, I get the same error.
Has anyone else found a solution?

Hey, I'm also encountering an error when trying to stream from an RPi 4 running Ubuntu 18.04. The error is the following:

2021-01-04 01:15:04,246 [INFO] olympe.pdraw.ANAFI-G032797 - _media_added - _media_added id : 2
Traceback (most recent call last):
  File "_ctypes/callbacks.c", line 234, in 'calling callback function'
  File "/home/ubuntu/code/parrot-groundsdk/out/olympe-linux/final/usr/lib/python/site-packages/olympe_deps.py", line 81, in <lambda>
    type
    ((lambda callback: lambda *args: callback(*args))(
  File "/home/ubuntu/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 840, in _media_added
    if (media_info.contents._2.video.format !=
AttributeError: 'struct_pdraw_media_info' object has no attribute '_2'
2021-01-04 01:15:04,249 [INFO] olympe.pdraw.ANAFI-G032797 - _media_added - _media_added id : 1
Traceback (most recent call last):
  File "_ctypes/callbacks.c", line 234, in 'calling callback function'
  File "/home/ubuntu/code/parrot-groundsdk/out/olympe-linux/final/usr/lib/python/site-packages/olympe_deps.py", line 81, in <lambda>
    type
    ((lambda callback: lambda *args: callback(*args))(
  File "/home/ubuntu/code/parrot-groundsdk/packages/olympe/src/olympe/arsdkng/pdraw.py", line 840, in _media_added
    if (media_info.contents._2.video.format !=
AttributeError: 'struct_pdraw_media_info' object has no attribute '_2'

Any ideas on how to fix this?

1 Like