Using the raw camera byte[] array for augmented reality
I'm developing an Augmented Reality app, so I need to capture the camera preview, add visual effects to it, and display it on screen. I would like to do this using the onPreviewFrame method of PreviewCallback. This gives me a byte[] containing the raw image data (YUV420-encoded) to work with.
Even though I have searched for a solution for many hours, I cannot find a way to convert this byte[] into any image format I can work with, or even draw on the screen. Preferably, I would convert the byte[] data to some RGB format that can be used both for computation and for drawing.
Is there a proper way to do this?
I stumbled upon the same issue a few months back when I had to do some edge detection on the camera frames. The following works perfectly for me; try it out.
import java.io.ByteArrayOutputStream;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.view.SurfaceHolder;

public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    camera.setPreviewCallback(new PreviewCallback() {
        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Parameters parameters = camera.getParameters();
            int width = parameters.getPreviewSize().width;
            int height = parameters.getPreviewSize().height;

            // Wrap the raw preview bytes (NV21 is the default preview format)
            // and compress the frame to JPEG in memory.
            ByteArrayOutputStream outstr = new ByteArrayOutputStream();
            Rect rect = new Rect(0, 0, width, height);
            YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
            yuvimage.compressToJpeg(rect, 100, outstr);

            // Decode the JPEG bytes into an RGB Bitmap you can process and draw.
            Bitmap bmp = BitmapFactory.decodeByteArray(outstr.toByteArray(), 0, outstr.size());
        }
    }); // setPreviewCallback takes the anonymous callback as its argument
}
You can now use the bitmap for all your processing. Grab the pixel you are interested in and you can comfortably do your RGB or HSV work on it.
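For example, here is a minimal sketch of reading one pixel from the decoded bitmap and splitting it into channels; the sample coordinates x and y are hypothetical, not part of the original answer:

int x = 0, y = 0; // hypothetical sample coordinates inside the frame
int pixel = bmp.getPixel(x, y); // packed ARGB color at (x, y)
int r = Color.red(pixel);   // 0..255
int g = Color.green(pixel); // 0..255
int b = Color.blue(pixel);  // 0..255

// android.graphics.Color can also convert to HSV for hue-based effects.
float[] hsv = new float[3];
Color.RGBToHSV(r, g, b, hsv); // hsv[0] = hue, hsv[1] = saturation, hsv[2] = value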
Imran Nazar has written a two-part tutorial on augmented reality which you may find useful. Although he eventually uses the NDK, the first part and most of the second part cover what you need using just Java.
I believe Bitmap.createBitmap is the method you need.
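For instance, a hedged sketch, assuming you have already converted the YUV preview data into an int[] of packed ARGB pixels (the rgb array and its fill step are assumptions, not from this answer):

int[] rgb = new int[width * height]; // assumed output of your own YUV-to-RGB conversion
// ... fill rgb from the preview frame ...
Bitmap bmp = Bitmap.createBitmap(rgb, width, height, Bitmap.Config.ARGB_8888);
// bmp can now be drawn to a Canvas or used for further processing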