OnPreviewFrame data image to imageView - android

I'm trying to get the byte[] from the camera preview, convert it to a Bitmap, and display it in an ImageView with imageView.setImageBitmap().
I've managed to start the preview and display it on a SurfaceView, but I don't know how to convert the byte[] data (which I think comes in YUV format) into an RGB bitmap to display it in an ImageView.
The code I'm trying is the following:
camera = Camera.open();
parameters = camera.getParameters();
camera.setParameters(parameters);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
camera.setPreviewDisplay(surfaceHolder);
camera.setPreviewCallback(this);
camera.startPreview();
and the preview callback is this:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    ByteArrayOutputStream outstr = new ByteArrayOutputStream();
    Rect rect = new Rect(0, 0, width, height);
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    yuvimage.compressToJpeg(rect, 100, outstr);
    Bitmap bmp = BitmapFactory.decodeByteArray(outstr.toByteArray(), 0, outstr.size());
    imgView1.setImageBitmap(bmp);
}
The preview works, but the ImageView remains empty.
Any ideas?

It is possible that you did not open the Camera on the UI thread. Either way, you need to make sure setImageBitmap is called on the UI thread:
@Override
public void onPreviewFrame(final byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 50, out);
    byte[] bytes = out.toByteArray();
    final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    MyActivity.this.runOnUiThread(new Runnable() {
        @Override
        public void run() {
            ((ImageView) findViewById(R.id.loopback)).setImageBitmap(bitmap);
        }
    });
}
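If converting and decoding every frame turns out to be too heavy, a simple variation is to throttle the conversion and only handle every Nth frame (frameCounter is a hypothetical field added just for this sketch):
private int frameCounter = 0;

@Override
public void onPreviewFrame(final byte[] data, Camera camera) {
    // Skip most frames; only convert every 10th one
    if (frameCounter++ % 10 != 0) {
        return;
    }
    // ... same YuvImage -> JPEG -> Bitmap conversion and runOnUiThread call as above ...
}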

Related

Android Camera: processing the frame data in previewCallback causes increasing native memory

I use camera.addCallbackBuffer(data); to reuse the buffer and avoid running out of memory. My code in previewCallback looks like this:
checkFace(data, camera);
camera.addCallbackBuffer(data);
In the method checkFace I just convert data to a bitmap and then use FaceDetector to check for faces. I have also tried calling camera.addCallbackBuffer(data); after converting the data, but the native memory shown in the Android Studio profiler keeps growing:
After my app has been running for about 10 minutes, the "Native" size increases from 10 MB to 250 MB.
When my app has been running for about 4 hours, it crashes and Logcat prints:
E/IMemory (17967): cannot map BpMemoryHeap (binder=0x11515160), size=462848, fd=70 (Out of memory)
I think this may be because the "Native" memory keeps increasing.
CODE:
camera1.setPreviewCallbackWithBuffer(previewCallback1);
camera1.addCallbackBuffer(buffer1);
camera1.startPreview();
...
private Camera.PreviewCallback previewCallback1 = (data, camera) -> {
    checkFace(data, camera);
    camera.addCallbackBuffer(data);
};
// convert data to a bitmap, then check for faces in the bitmap
private void checkFace(byte[] data, Camera camera) {
    ...
    ...run on a new Thread...
    Bitmap bitmap = BitmapUtil.ByteToBitmap(data, camera.getParameters().getPreviewSize());
    ...
    FaceDetector detector = new FaceDetector(bitmap.getWidth(), bitmap.getHeight(), numberOfFace);
    ...then get the result of the face detection
}
// convert frame data to a bitmap
public static Bitmap ByteToBitmap(byte[] data, Camera.Size previewSize) {
    ByteArrayOutputStream baos = null;
    Bitmap bitmapOut = null;
    try {
        int w = previewSize.width;
        int h = previewSize.height;
        YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, w, h, null);
        baos = new ByteArrayOutputStream();
        yuvimage.compressToJpeg(new Rect(0, 0, w, h), 60, baos);
        byte[] jdata = baos.toByteArray();
        bitmapOut = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
        if (null == bitmapOut) {
            return bitmapOut;
        }
        jdata = null;
        yuvimage = null;
        Matrix matrix = new Matrix();
        matrix.postRotate(90);
        bitmapOut = Bitmap.createBitmap(bitmapOut, 0, 0, w, h, matrix, false);
    } catch (Exception e) {
    } finally {
        try {
            if (baos != null) {
                baos.flush();
                baos.close();
            }
        } catch (Exception e) {
        }
    }
    return bitmapOut;
}
So, what should I do to resolve this?
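(For reference, buffer1 in the code above is not shown being allocated; with setPreviewCallbackWithBuffer the buffer is normally sized from the preview dimensions and format. A minimal sketch, assuming an NV21 preview:)
Camera.Parameters params = camera1.getParameters();
Camera.Size size = params.getPreviewSize();
// NV21 uses 12 bits per pixel, so the buffer needs width * height * 3 / 2 bytes
int bufferSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
byte[] buffer1 = new byte[bufferSize];
camera1.addCallbackBuffer(buffer1);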

Saving PreviewImage has a green tint - Android

There's a green tint after saving the image. The preview image on the surface holder looks normal. What can I do to remove it?
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            if (saveImg) {
                int width = parameters.getPreviewSize().width;
                int height = parameters.getPreviewSize().height;
                YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
                byte[] bytes = out.toByteArray();
                final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                File file = new File(Environment.getExternalStorageDirectory() + "/newimage.png");
                ByteArrayOutputStream os = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, os);
                byte[] blobArray = os.toByteArray();
                FileOutputStream fos = new FileOutputStream(file);
                fos.write(blobArray);
                fos.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
I ended up setting this value for the white-balance to remove the green tint:
parameters.set("whitebalance", WHITE_BALANCE_FLUORESCENT);
mCamera.setParameters(parameters);
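The same setting can also be applied through the typed Camera.Parameters API, guarded by the list of supported modes; a minimal sketch, assuming mCamera and parameters as above:
List<String> modes = parameters.getSupportedWhiteBalance();
if (modes != null && modes.contains(Camera.Parameters.WHITE_BALANCE_FLUORESCENT)) {
    parameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_FLUORESCENT);
    mCamera.setParameters(parameters);
}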

How to convert NV21 image format to bitmap?

I am getting preview data from the camera. It is in NV21 format. I want to save the preview to the SD card, i.e. as a bitmap. My code gets the image and saves it, but in the gallery it is not the captured preview, just a black rectangle. Here is the code.
public void processImage() {
    Bitmap bitmap = null;
    if (flag == true) {
        flag = false;
        if (mCamera != null) {
            Camera.Parameters parameters = mCamera.getParameters();
            int imageFormat = parameters.getPreviewFormat();
            if (imageFormat == ImageFormat.NV21) {
                Toast.makeText(mContext, "Format: NV21", Toast.LENGTH_SHORT).show();
                int w = parameters.getPreviewSize().width;
                int h = parameters.getPreviewSize().height;
                YuvImage yuvImage = new YuvImage(mData, imageFormat, w, h, null);
                Rect rect = new Rect(0, 0, w, h);
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                yuvImage.compressToJpeg(rect, 100, baos);
                byte[] jData = baos.toByteArray();
                bitmap = BitmapFactory.decodeByteArray(jData, 0, jData.length);
            } else if (imageFormat == ImageFormat.JPEG || imageFormat == ImageFormat.RGB_565) {
                Toast.makeText(mContext, "Format: JPEG||RGB_565", Toast.LENGTH_SHORT).show();
                bitmap = BitmapFactory.decodeByteArray(mData, 0, mData.length);
            }
        }
        if (bitmap != null) {
            saveImage(bitmap);
            Toast.makeText(mContext, "Image Saved", Toast.LENGTH_SHORT).show();
        } else {
            Toast.makeText(mContext, "Bitmap Null", Toast.LENGTH_SHORT).show();
        }
    }
}
If you want to save the NV21 preview image so that it can be viewed in the gallery, the easiest way is to create a YuvImage from the NV21 byte array and then compress it to a JPEG in a file output stream, like the code below:
FileOutputStream fos = new FileOutputStream(Environment.getExternalStorageDirectory() + "/imagename.jpg");
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, fos);
fos.close();
Please note that you probably want to change the path to save the image.
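A minimal sketch of how this could be wired into onPreviewFrame, taking the width and height from the preview size (the path and the error handling are placeholders):
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    try {
        FileOutputStream fos = new FileOutputStream(
                Environment.getExternalStorageDirectory() + "/imagename.jpg");
        YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        yuvImage.compressToJpeg(new Rect(0, 0, size.width, size.height), 100, fos);
        fos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}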

Android-Ocr using Tesseract in Portrait

I used the OCR sample in this link: https://github.com/rmtheis/android-ocr
Everything is working fine, but I want it in portrait view. I followed the steps in this link, Zxing Camera in Portrait mode on Android, to enable the OCR (tess-two) in portrait mode. The view is portrait now, but the camera is still taking the picture in landscape mode.
Any help ?
final class PreviewCallback implements Camera.PreviewCallback {

    private static final String TAG = PreviewCallback.class.getSimpleName();

    private final CameraConfigurationManager configManager;
    private Handler previewHandler;
    private int previewMessage;

    PreviewCallback(CameraConfigurationManager configManager) {
        this.configManager = configManager;
    }

    void setHandler(Handler previewHandler, int previewMessage) {
        this.previewHandler = previewHandler;
        this.previewMessage = previewMessage;
    }

    // (NV21) format.
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Point cameraResolution = configManager.getCameraResolution();
        Handler thePreviewHandler = previewHandler;
        if (cameraResolution != null && thePreviewHandler != null) {
            Message message = thePreviewHandler.obtainMessage(previewMessage, cameraResolution.x,
                    cameraResolution.y, data);
            message.sendToTarget();
            previewHandler = null;
        } else {
            Log.d(TAG, "Got preview callback, but no handler or resolution available");
        }
    }
}
Are you using the preview data with this method:
public void onPreviewFrame(byte[] data, Camera camera) {}
If yes, then I can help you, since I am doing a very similar project (which will be open-sourced soon).
Here is the code that I am using to rotate the preview image:
public static Bitmap getBitmapImageFromYUV(byte[] data, int width,
        int height, int degree, Rect rect) {
    Bitmap bitmap = getBitmapImageFromYUV(data, width, height, rect);
    return rotateBitmap(bitmap, degree, rect);
}

public static Bitmap rotateBitmap(Bitmap source, float angle, Rect rect) {
    Matrix matrix = new Matrix();
    matrix.postRotate(angle);
    source = Bitmap.createBitmap(source, 0, 0, source.getWidth(),
            source.getHeight(), matrix, true);
    source = Bitmap.createBitmap(source, rect.left, rect.top, rect.width(), rect.height());
    if (mShouldSavePreview)
        saveBitmap(source);
    return source;
}

public static Bitmap getBitmapImageFromYUV(byte[] data, int width,
        int height, Rect rect) {
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    yuvimage.compressToJpeg(new Rect(0, 0, width, height), 90, baos);
    byte[] jdata = baos.toByteArray();
    BitmapFactory.Options bitmapFatoryOptions = new BitmapFactory.Options();
    bitmapFatoryOptions.inPreferredConfig = Bitmap.Config.ARGB_8888;
    Bitmap bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length, bitmapFatoryOptions);
    Log.d(TAG, "getBitmapImageFromYUV w:" + bmp.getWidth() + " h:" + bmp.getHeight());
    return bmp;
}
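A usage sketch of these helpers from onPreviewFrame; note that the crop Rect is applied after the rotation, so for a 90 degree rotation of the full frame its width and height are swapped relative to the preview size:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    // After a 90 degree rotation the bitmap is size.height wide and size.width tall
    Rect fullFrame = new Rect(0, 0, size.height, size.width);
    Bitmap portrait = getBitmapImageFromYUV(data, size.width, size.height, 90, fullFrame);
    // hand the portrait bitmap to the OCR / recognition step here
}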
Guys, I found the solution!
Replace the following code in the function ocrDecode(byte[] data, int width, int height) in the DecodeHandler.java file:
beepManager.playBeepSoundAndVibrate();
activity.displayProgressDialog();
// *************SHARNOUBY CODE
// Rotate the NV21 luma (Y) plane 90 degrees clockwise so the frame
// reaches the OCR engine in portrait orientation
byte[] rotatedData = new byte[data.length];
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++)
        rotatedData[x * height + height - y - 1] = data[x + y * width];
}
// Swap width and height to match the rotated frame
int tmp = width;
width = height;
height = tmp;
//******************************
// Launch OCR asynchronously, so we get the dialog box displayed
// immediately
new OcrRecognizeAsyncTask(activity, baseApi, rotatedData, width, height)
        .execute();
...the problem was in the switch case in the function handleMessage(Message message): the second case, which calls the rotation code, was never triggered.

Getting image from SurfaceView to ImageView?

I'm having a little trouble getting an image/drawable or a bitmap from a SurfaceView that works as a camera preview.
final CameraSurfaceView cameraSurfaceView = new CameraSurfaceView(this);
LinearLayout ll = (LinearLayout)findViewById(R.id.linearLayout1);
ll.addView(cameraSurfaceView); // THIS WORKS
ImageView ivCam = (ImageView) findViewById(R.id.ivCam);
ivCam.setImageBitmap(cameraSurfaceView.getDrawingCache()); // THIS DOESN'T :(
Any suggestions? Thanks!
EDIT:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    final CameraSurfaceView cameraSurfaceView = new CameraSurfaceView(this);
    LinearLayout ll = (LinearLayout) findViewById(R.id.LLS);
    ll.addView(cameraSurfaceView); // THIS WORKS
}
///////////////////////////////////////////////////////////////////////////////////////////////
public class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback {

    private SurfaceHolder holder;
    private Camera camera;

    public CameraSurfaceView(Context context) {
        super(context);
        // Initiate the Surface Holder properly
        this.holder = this.getHolder();
        this.holder.addCallback(this);
        this.holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        try {
            this.camera = Camera.open();
            this.camera.setPreviewDisplay(this.holder);
            this.camera.setPreviewCallback(new PreviewCallback() {
                public void onPreviewFrame(byte[] _data, Camera _camera) {
                    Camera.Parameters params = _camera.getParameters();
                    int w = params.getPreviewSize().width;
                    int h = params.getPreviewSize().height;
                    int format = params.getPreviewFormat();
                    YuvImage image = new YuvImage(_data, format, w, h, null);
                    ByteArrayOutputStream out = new ByteArrayOutputStream();
                    Rect area = new Rect(0, 0, w, h);
                    image.compressToJpeg(area, 50, out);
                    Bitmap bm = BitmapFactory.decodeByteArray(out.toByteArray(), 0, out.size());
                    ImageView ivCam = (ImageView) findViewById(R.id.imageView1);
                    ivCam.setImageBitmap(bm); /// NULL POINT EX HERE!
                }
            });
        } catch (IOException ioe) {
            ioe.printStackTrace(System.out);
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        this.camera.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        this.camera.stopPreview();
        this.camera.release();
        this.camera = null;
    }

    public Camera getCamera() {
        return this.camera;
    }
}
It's far more complicated than that. The background of the SurfaceView is not the camera preview. You have to have a class that implements Camera.PreviewCallback. Once you have that, you can get a byte array containing the image that the preview sends. On some phones, you can set the preview format to JPEG, in which case you can decode it directly with BitmapFactory. On other phones that don't support that feature, you'll get by default a YUV 4:2:0 image that you have to convert into a JPEG image.
On Android 2.2+, you can convert the YUV image to a JPEG like so:
int w = params.getPreviewSize().width;
int h = params.getPreviewSize().height;
int format = params.getPreviewFormat();
YuvImage image = new YuvImage(data, format, w, h, null);
ByteArrayOutputStream out = new ByteArrayOutputStream();
Rect area = new Rect(0, 0, w, h);
image.compressToJpeg(area, 50, out);
Bitmap bm = BitmapFactory.decodeByteArray(out.toByteArray(), 0, out.size());
ivCam.setImageBitmap(bm);
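Note that onPreviewFrame is delivered on the event thread the camera was opened from; if the decoding is moved off the UI thread, the result can be handed back with View.post(). A short sketch, assuming ivCam has been looked up from the Activity's layout rather than from the SurfaceView:
final Bitmap preview = bm;
ivCam.post(new Runnable() {
    @Override
    public void run() {
        ivCam.setImageBitmap(preview);
    }
});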
If you're targeting older models, you have to use a conversion algorithm like the one here.
http://blog.tomgibara.com/post/132956174/yuv420-to-rgb565-conversion-in-android
A SO source:
Getting frames from Video Image in Android
EDIT:
If all you want is to show the camera view, then you just add the SurfaceView that your camera is using to a layout that is already displayed, like you did in your question. It's already displaying it.
