ARFrame returns null array on getByteData() call


#1

I’m trying to convert an ARFrame to a Bitmap by creating it from the frame’s byte data. Now I have a few questions/issues.

  1. Is it even possible to do so? Is there an easy way to perform such a conversion?
  2. Why does the function return null? If the frame itself were null I’d simply get a NullPointerException, but the function actually returns a null array.

I’d be glad if someone was able to help me out here!


#2

Hi,

You can’t. ARFrame content for the Bebop 2 is not bitmap-encoded but H.264-encoded.
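
You can sanity-check this yourself: the SDK delivers Annex-B H.264 byte streams, so the data from frame.getByteData() should begin with a 00 00 00 01 (or 00 00 01) start code rather than any bitmap header. A minimal sketch (looksLikeAnnexB is a hypothetical helper, not an SDK call):

```java
public class FrameCheck {
    // Returns true if the buffer starts with an Annex-B start code
    // (00 00 01 or 00 00 00 01), which H.264 NAL units are prefixed with.
    static boolean looksLikeAnnexB(byte[] d) {
        if (d == null || d.length < 4) return false;
        if (d[0] == 0 && d[1] == 0 && d[2] == 1) return true;
        return d[0] == 0 && d[1] == 0 && d[2] == 0 && d[3] == 1;
    }

    public static void main(String[] args) {
        // frame.getByteData() from a real ARFrame would go here instead
        byte[] idrLike = {0, 0, 0, 1, 0x65};     // typical IDR NAL prefix
        byte[] bmpLike = {0x42, 0x4D, 0, 0, 0};  // "BM" bitmap file magic
        System.out.println(looksLikeAnnexB(idrLike)); // true
        System.out.println(looksLikeAnnexB(bmpLike)); // false
    }
}
```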


#3

Ah, alright, thanks!

I’ve never worked with H.264-encoded data. I found some information online, but it is all described as a VIDEO encoding standard.

So when I receive an ARFrame, which should obviously represent a single image rather than a video, how can I work with it? Is it somehow possible to convert such a single frame into either an NV21-encoded image or a Bitmap?

To provide a little more insight, this is the situation:
I’m trying to implement object tracking, using BoofCV’s object tracking framework. The problem is, it can only work with the so-called ImageUInt8 data type. It provides two conversion functions:
NV21 -> ImageUInt8
Bitmap -> ImageUInt8
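
As an aside, if you ever do get NV21 data, the grayscale conversion is nearly free: the first width × height bytes of an NV21 buffer are the luma (Y) plane, which is already an 8-bit gray image, exactly what ImageUInt8 wraps. A minimal sketch (extractLuma is a hypothetical helper; BoofCV’s NV21 converter does the equivalent for you):

```java
public class Nv21Gray {
    // NV21 layout: width*height luma (Y) bytes, followed by interleaved
    // V/U chroma bytes. The Y plane alone is an 8-bit grayscale image,
    // which is all a grayscale tracker needs.
    static byte[] extractLuma(byte[] nv21, int width, int height) {
        byte[] gray = new byte[width * height];
        System.arraycopy(nv21, 0, gray, 0, width * height);
        return gray;
    }

    public static void main(String[] args) {
        // 2x2 image: 4 luma bytes followed by one V/U chroma pair
        byte[] nv21 = {10, 20, 30, 40, (byte) 128, (byte) 128};
        byte[] gray = extractLuma(nv21, 2, 2);
        System.out.println(java.util.Arrays.toString(gray)); // [10, 20, 30, 40]
    }
}
```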


#4

You mean like this? :slight_smile:


#5

Yes, pretty much. Everything works fine when using the tablet’s camera, but I can’t seem to get any compatible image format from the drone.


#6

It’s the same process I documented here: https://stackoverflow.com/questions/37507569/bebopvideoview-to-mat/40905561

And instead of converting to a Mat, you want to convert to a GrayU8 (assuming TLD), like so:

    // SIZE_MULTIPLIER, the corner points a/b/c, storage, width, height
    // and frame are fields of the surrounding class; bitmap comes from
    // the video view.
    if (SIZE_MULTIPLIER == 0) {
        SIZE_MULTIPLIER = bitmap.getWidth() / 320;
    }

    width = (b.x - a.x) / SIZE_MULTIPLIER;
    height = (c.y - a.y) / SIZE_MULTIPLIER;

    final Bitmap compressed = Bitmap.createScaledBitmap(bitmap, Math.round(bitmap.getWidth() / SIZE_MULTIPLIER), Math.round(bitmap.getHeight() / SIZE_MULTIPLIER), true);
    storage = ConvertBitmap.declareStorage(compressed, storage);

    // Convert the downscaled bitmap to BoofCV's 8-bit gray image type
    frame = ConvertBitmap.bitmapToGray(compressed, null, GrayU8.class, storage);

#7

Thank you, I will look into that! :+1:


#8

Unfortunately I ran into another problem. Although the conversion now apparently works, I can’t display the frame anymore. The app only displays a black image. I don’t get any errors, just the black image.

Below is my BebopVideoView, implemented as a TextureView. I hope someone finds a solution!

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.util.AttributeSet;
import android.util.Log;
import android.view.Surface;
import android.view.TextureView;

import com.parrot.arsdk.arcontroller.ARCONTROLLER_STREAM_CODEC_TYPE_ENUM;
import com.parrot.arsdk.arcontroller.ARControllerCodec;
import com.parrot.arsdk.arcontroller.ARFrame;

import java.nio.ByteBuffer;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

public class BebopVideoView extends TextureView implements TextureView.SurfaceTextureListener {

    private static final String CLASS_NAME = BebopVideoView.class.getSimpleName();

    private static final String TAG = "BebopVideoView";
    private static final String VIDEO_MIME_TYPE = "video/avc";
    private static final int VIDEO_DEQUEUE_TIMEOUT = 33000;
    private static final int VIDEO_WIDTH = 640;
    private static final int VIDEO_HEIGHT = 368;
    private Surface surface;
    private MediaCodec mediaCodec;
    private boolean surfaceCreated = false;
    private boolean codecConfigured = false;
    private ByteBuffer[] buffers;

    private Lock readyLock;

    private ByteBuffer spsBuffer;
    private ByteBuffer ppsBuffer;

    public BebopVideoView(Context context) {
        this(context, null);
    }

    public BebopVideoView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public BebopVideoView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        readyLock = new ReentrantLock();
        setSurfaceTextureListener(this);
    }

    public void displayFrame(ARFrame frame) {
        readyLock.lock();

        if ((mediaCodec != null)) {
            if (codecConfigured) {
                // Here we have either a good PFrame, or an IFrame
                int index = -1;

                try {
                    index = mediaCodec.dequeueInputBuffer(VIDEO_DEQUEUE_TIMEOUT);
                } catch (IllegalStateException e) {
                    Log.e(TAG, "Error while dequeue input buffer");
                }
                if (index >= 0) {
                    ByteBuffer b;
                    if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
                        b = mediaCodec.getInputBuffer(index);
                    } else {
                        b = buffers[index];
                        b.clear();
                    }

                    if (b != null) {
                        b.put(frame.getByteData(), 0, frame.getDataSize());
                    }

                    try {
                        mediaCodec.queueInputBuffer(index, 0, frame.getDataSize(), 0, 0);
                    } catch (IllegalStateException e) {
                        Log.e(TAG, "Error while queue input buffer");
                    }
                }
            }

            // Try to display previous frame
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex;
            try {
                outIndex = mediaCodec.dequeueOutputBuffer(info, 0);

                while (outIndex >= 0) {
                    mediaCodec.releaseOutputBuffer(outIndex, true);
                    outIndex = mediaCodec.dequeueOutputBuffer(info, 0);
                }
            } catch (IllegalStateException e) {
                Log.e(TAG, "Error while dequeuing output buffer (outIndex)");
            }
        }

        readyLock.unlock();
    }


    public void configureDecoder(ARControllerCodec codec) {
        readyLock.lock();

        if (codec.getType() == ARCONTROLLER_STREAM_CODEC_TYPE_ENUM.ARCONTROLLER_STREAM_CODEC_TYPE_H264) {
            ARControllerCodec.H264 codecH264 = codec.getAsH264();

            spsBuffer = ByteBuffer.wrap(codecH264.getSps().getByteData());
            ppsBuffer = ByteBuffer.wrap(codecH264.getPps().getByteData());
        }

        if ((mediaCodec != null) && (spsBuffer != null)) {
            configureMediaCodec();
        }

        readyLock.unlock();
    }

    private void configureMediaCodec() {
        try {
            final MediaFormat format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, VIDEO_WIDTH, VIDEO_HEIGHT);
            format.setByteBuffer("csd-0", spsBuffer);
            format.setByteBuffer("csd-1", ppsBuffer);

            mediaCodec = MediaCodec.createDecoderByType(VIDEO_MIME_TYPE);
            mediaCodec.configure(format, surface, null, 0);
            mediaCodec.start();

            if (android.os.Build.VERSION.SDK_INT < android.os.Build.VERSION_CODES.LOLLIPOP) {
                buffers = mediaCodec.getInputBuffers();
            }

            codecConfigured = true;
        } catch (Exception e) {
            Log.e(CLASS_NAME, "configureMediaCodec", e);
        }
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        this.surface = new Surface(surface);
        surfaceCreated = true;
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        if (mediaCodec != null) {
            if (codecConfigured) {
                mediaCodec.stop();
                mediaCodec.release();
            }
            codecConfigured = false;
            mediaCodec = null;
        }

        if (surface != null) surface.release();
        if (this.surface != null) this.surface.release();

        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
    }
}

Thanks in advance!


#9

EDIT: The Bitmap I get from the TextureView actually appears to be an empty image (or transparent, if you will).


#10

You see a valid video stream being rendered in the view? Where are you calling getBitmap() from?


#11

No, the video stream isn’t rendered.
I call getBitmap() in the onFrameReceived function in the Activity.


#12

Let me take a closer look at your view when I have more time. There’s likely something subtle missing.

It is completely appropriate for the bitmap to be black if there is no video stream being rendered.


#13

That is a very expensive call to make at 24/26/30 fps. I would seriously consider a separate thread for it with an interval closer to 5 or so FPS.

You’ll need to do some tweaking regardless. You want the video pipeline to run as fast as possible and sift off the bitmap as appropriate.
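
The throttling itself can be a simple time gate around the getBitmap() call on the capture thread; here is a minimal sketch (CaptureThrottle and the 5 fps figure are my choices, not SDK names):

```java
public class CaptureThrottle {
    private final long intervalNanos;
    private long lastCapture;
    private boolean started = false;

    public CaptureThrottle(double fps) {
        this.intervalNanos = (long) (1_000_000_000L / fps);
    }

    // Returns true at most once per interval. Guard the expensive
    // textureView.getBitmap() call with this so the video pipeline keeps
    // running at full rate while the tracker only sees a few fps.
    public synchronized boolean shouldCapture(long nowNanos) {
        if (started && nowNanos - lastCapture < intervalNanos) return false;
        started = true;
        lastCapture = nowNanos;
        return true;
    }

    public static void main(String[] args) {
        CaptureThrottle t = new CaptureThrottle(5.0); // ~200 ms between grabs
        // In production you would pass System.nanoTime() instead.
        System.out.println(t.shouldCapture(0L));           // true
        System.out.println(t.shouldCapture(100_000_000L)); // false, only 100 ms later
        System.out.println(t.shouldCapture(250_000_000L)); // true
    }
}
```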


#14

Oh right, looking at the code again, I actually call the method in a separate Thread. But thanks nonetheless!


#15

Still couldn’t find a solution :confused: