How to get a Bitmap from an ARFrame?


#1

Product: [Bebop/Bebop2]
Hi all,
I’m working with the latest (precompiled) version of the Parrot SDK in Android Studio (on Windows). My ultimate goal is to use a Bebop 2 drone to read barcodes from the camera feed with the Google Barcode API. As far as I know, the only way that API can detect barcodes is by reading them from bitmaps, and I’ve (literally) spent HOURS this last week trying everything to convert an ARFrame to a bitmap, with no success.
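For context, detecting from a bitmap with that API looks roughly like this (a simplified sketch based on the Mobile Vision docs; detector setup details omitted):

BarcodeDetector detector = new BarcodeDetector.Builder(context)
        .setBarcodeFormats(Barcode.ALL_FORMATS)
        .build();
if (detector.isOperational()) {
    Frame frame = new Frame.Builder().setBitmap(myBitmap).build();
    SparseArray<Barcode> barcodes = detector.detect(frame);
}

so everything hinges on getting a Bitmap out of each ARFrame.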
I’m aware that there are some related questions (and answers) on this issue in this forum, but none of them have been useful to me, probably because they are quite old and the samples have changed since then.

The method I’ve been experimenting with the most is frame.getByteData() inside onFrameReceived, in BebopActivity.java. I’d like to think it returns some sort of H264-encoded byte array, which could then be decoded to a YUV image… but all its values are always -100 from byte 45 onwards, so I must be wrong.

If someone has any clue about how the getByteData() method works, or about the general issue in the title, I’d be extremely grateful :slight_smile:


#2

Back the ARFrame with a TextureView instead of a SurfaceView and use the TextureView’s getBitmap() method from a separate thread to retrieve a bitmap.
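Something along these lines (just a sketch; the executor and method name are made up):

private final ExecutorService mGrabber = Executors.newSingleThreadExecutor();

private void grabFrame(final TextureView videoView) {
    mGrabber.execute(new Runnable() {
        @Override
        public void run() {
            // getBitmap() copies the frame currently held by the backing SurfaceTexture
            Bitmap bmp = videoView.getBitmap();
            if (bmp != null) {
                // hand bmp off to whatever needs it
            }
        }
    });
}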

There may be some Java code floating around that can manipulate the frame’s byte buffer directly, but I didn’t do much research on that front. I already had a TextureView available to me, and it was easy.

Back in the day, with the original ARDrone SDK, I used to manipulate frames directly in Java to convert them to bitmaps. Back then it was just a matter of flipping a couple of color-related bits to get them in the right order. The modern SDK / Bebop drone does a bit more, encoding-wise.


#3

Hey @synman, thanks for your reply. Extending H264VideoView from TextureView instead of SurfaceView sounds promising, and yep, I’m trying to go that way. Unfortunately I’m not very familiar with Android Studio and I’m having some problems doing it :persevere:.
If I’m not mistaken, the class should be defined as
public class H264VideoView extends TextureView implements TextureView.SurfaceTextureListener
The only errors I get after changing this are related to the getHolder() method, as it is exclusive to SurfaceView. I’ve tried to overcome them by writing, first,

private void customInit() {
    mReadyLock = new ReentrantLock();
    this.setSurfaceTextureListener(this);
}

and writing this in configureMediaCodec():

SurfaceTexture surfaceTexture = this.getSurfaceTexture();
Surface surface = new Surface(surfaceTexture);
mMediaCodec.configure(format, surface, null, 0);

and, although I don’t get compilation errors, it looks like mMediaCodec is always null when I call it in displayFrame, so I don’t receive the drone’s camera feed anymore :confused:

Do you have any clue on how I could adapt H264VideoView more successfully…? :smiley:
Thanks


#4

See here


#5

Okay @synman, you were right, those posts did contain the solutions I needed, thank you a lot for your help! :slight_smile:
As I don’t want anyone else to suffer as much as I did, here is how I changed the H264VideoView class to obtain the bitmaps.

package com.parrot.sdksample.view;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.util.AttributeSet;
import android.util.Log;
import android.view.Surface;
import android.view.TextureView;

import com.parrot.arsdk.arcontroller.ARCONTROLLER_STREAM_CODEC_TYPE_ENUM;
import com.parrot.arsdk.arcontroller.ARControllerCodec;
import com.parrot.arsdk.arcontroller.ARFrame;

import java.io.IOException;
import java.nio.ByteBuffer;

public class H264VideoView extends TextureView implements TextureView.SurfaceTextureListener {

    private static final String TAG = "H264VideoView";
    private static final String VIDEO_MIME_TYPE = "video/avc";
    private static final int VIDEO_DEQUEUE_TIMEOUT = 33000; // microseconds, ~ one frame at 30 fps

    private MediaCodec mMediaCodec;
    private Surface mSurface;

    private boolean mIsCodecConfigured = false;

    private ByteBuffer mSpsBuffer;
    private ByteBuffer mPpsBuffer;

    // Input buffers are only fetched up front on pre-Lollipop devices
    private ByteBuffer[] mBuffers;

    private static final int VIDEO_WIDTH = 854;
    private static final int VIDEO_HEIGHT = 480;

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        // Wrap the SurfaceTexture in a Surface that MediaCodec can render into
        mSurface = new Surface(surfaceTexture);
        initMediaCodec(VIDEO_MIME_TYPE);
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
        releaseMediaCodec();
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {

    }


    public H264VideoView(Context context) {
        super(context);
        customInit();
    }

    public H264VideoView(Context context, AttributeSet attrs) {
        super(context, attrs);
        customInit();
    }

    public H264VideoView(Context context, AttributeSet attrs, int defStyleAttr) {
        super(context, attrs, defStyleAttr);
        customInit();
    }

    private void customInit() {
        // Register for the SurfaceTexture callbacks implemented above
        setSurfaceTextureListener(this);
    }

    public void displayFrame(ARFrame frame) {
        if (mMediaCodec != null) {
            if (mIsCodecConfigured) {
                // Here we have either a good P-frame or an I-frame
                int index = -1;

                try {
                    index = mMediaCodec.dequeueInputBuffer(VIDEO_DEQUEUE_TIMEOUT);
                } catch (IllegalStateException e) {
                    Log.e(TAG, "Error while dequeueing input buffer");
                }
                if (index >= 0) {
                    ByteBuffer b;
                    if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
                        b = mMediaCodec.getInputBuffer(index);
                    } else {
                        b = mBuffers[index];
                        b.clear();
                    }

                    // Copy the raw H264 data of this ARFrame into the codec's input buffer
                    if (b != null) {
                        b.put(frame.getByteData(), 0, frame.getDataSize());
                    }

                    try {
                        mMediaCodec.queueInputBuffer(index, 0, frame.getDataSize(), 0, 0);
                    } catch (IllegalStateException e) {
                        Log.e(TAG, "Error while queueing input buffer");
                    }
                }
            }

            // Render any decoded frames; releaseOutputBuffer(..., true) sends them
            // to the Surface backing this TextureView
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            int outIndex;
            try {
                outIndex = mMediaCodec.dequeueOutputBuffer(info, 0);

                while (outIndex >= 0) {
                    mMediaCodec.releaseOutputBuffer(outIndex, true);
                    outIndex = mMediaCodec.dequeueOutputBuffer(info, 0);
                }
            } catch (IllegalStateException e) {
                Log.e(TAG, "Error while dequeueing output buffer");
            }
        }
    }

    public void configureDecoder(ARControllerCodec codec) {
        if (codec.getType() == ARCONTROLLER_STREAM_CODEC_TYPE_ENUM.ARCONTROLLER_STREAM_CODEC_TYPE_H264) {
            ARControllerCodec.H264 codecH264 = codec.getAsH264();

            // Keep the SPS/PPS NAL units; the decoder cannot be configured without them
            mSpsBuffer = ByteBuffer.wrap(codecH264.getSps().getByteData());
            mPpsBuffer = ByteBuffer.wrap(codecH264.getPps().getByteData());
        }

        if ((mMediaCodec != null) && (mSpsBuffer != null)) {
            configureMediaCodec();
        }
    }

    private void configureMediaCodec() {
        mMediaCodec.stop();
        MediaFormat format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, VIDEO_WIDTH, VIDEO_HEIGHT);
        // csd-0 / csd-1 carry the SPS and PPS codec-specific data
        format.setByteBuffer("csd-0", mSpsBuffer);
        format.setByteBuffer("csd-1", mPpsBuffer);

        mMediaCodec.configure(format, mSurface, null, 0);
        mMediaCodec.start();

        if (android.os.Build.VERSION.SDK_INT < android.os.Build.VERSION_CODES.LOLLIPOP) {
            mBuffers = mMediaCodec.getInputBuffers();
        }

        mIsCodecConfigured = true;
    }

    private void initMediaCodec(String type) {
        try {
            mMediaCodec = MediaCodec.createDecoderByType(type);
        } catch (IOException e) {
            Log.e(TAG, "Exception", e);
        }

        if ((mMediaCodec != null) && (mSpsBuffer != null)) {
            configureMediaCodec();
        }
    }

    private void releaseMediaCodec() {
        if (mMediaCodec != null) {
            if (mIsCodecConfigured) {
                mMediaCodec.stop();
                mMediaCodec.release();
            }
            mIsCodecConfigured = false;
            mMediaCodec = null;
        }
    }

}

and then, in BebopActivity.java,

    private int i = 0; // frame counter

    @Override
    public void onFrameReceived(ARFrame frame) {
        mVideoView.displayFrame(frame);
        if (i++ % 30 == 0) { // grab a bitmap roughly once per second (~30 fps stream)
            Bitmap myBitmap = mVideoView.getBitmap();
            // ... pass myBitmap to the barcode detector ...
        }
    }

All this being said, I must admit that I expected better image quality from these bitmaps. If anyone with a deeper understanding of how the Bebop camera can be manipulated knows how to improve the quality, don’t hesitate to comment :stuck_out_tongue:
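Edit: one thing that might help, though I haven’t tried it yet: getBitmap() with no arguments returns a bitmap at the view’s current size, so a small TextureView means a small bitmap. TextureView also has a getBitmap(int width, int height) overload, so requesting the stream’s native resolution should give the detector more pixels to work with:

Bitmap myBitmap = mVideoView.getBitmap(854, 480); // the stream resolution used in H264VideoView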