There's a green tint after saving the image. The preview image on the surface holder looks normal. What can I do to remove this?
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            if (saveImg) {
                int width = parameters.getPreviewSize().width;
                int height = parameters.getPreviewSize().height;
                YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
                byte[] bytes = out.toByteArray();
                final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                File file = new File(Environment.getExternalStorageDirectory() + "/newimage.png");
                ByteArrayOutputStream os = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, os);
                byte[] blobArray = os.toByteArray();
                FileOutputStream fos = new FileOutputStream(file);
                fos.write(blobArray);
                fos.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});
I ended up setting the white balance to fluorescent, which removed the green tint:
parameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_FLUORESCENT);
mCamera.setParameters(parameters);
I have to save android.media.Image objects as bitmaps and eventually compress them to JPEG. The saved images come out half picture, half grey, and they are also rotated.
image = frame.acquireCameraImage();
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
ByteBuffer buffer1 = planes[1].getBuffer();
ByteBuffer buffer2 = planes[2].getBuffer();
int format = image.getFormat();
int width = image.getWidth();
int height = image.getHeight();
Rect cropRect = new Rect(0, 0, width, height);
byte[] arr = new byte[buffer.remaining()];
buffer.get(arr);
byte[] arr1 = new byte[buffer1.remaining()];
buffer1.get(arr1);
byte[] arr2 = new byte[buffer2.remaining()];
buffer2.get(arr2);
image.close();
byte[] c = new byte[arr.length + arr1.length + arr2.length];
System.arraycopy(arr, 0, c, 0, arr.length);
System.arraycopy(arr1, 0, c, arr.length, arr1.length);
System.arraycopy(arr2, 0, c, arr1.length, arr2.length);
YuvImage yuvImage = new YuvImage(c, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream b = new ByteArrayOutputStream();
yuvImage.compressToJpeg(cropRect, 90, b);
byte[] jpegData = b.toByteArray();
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferQualityOverSpeed = true;
// int scaleFactor;
// options.inSampleSize = scaleFactor;
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegData, 0, jpegData.length, options);
saveBitmap(bitmap);
Note that the second argument of BitmapFactory.decodeByteArray is the byte offset into the array, not a quality setting, so it should stay 0; changing it would only make the decoder skip the start of the JPEG data. The grey half of the image comes from assembling the NV21 array incorrectly: the third System.arraycopy writes arr2 at offset arr1.length instead of arr.length + arr1.length, so the tail of the buffer is never filled. Also keep in mind that YUV_420_888 planes have row and pixel strides, so simply concatenating the three plane buffers only yields valid NV21 data when the strides are tight and the chroma order matches; the rotation has to be fixed separately, e.g. with a Matrix when creating the final bitmap.
Hope it helps.
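The offset arithmetic can be checked in plain Java, independent of Android. This sketch uses synthetic buffer contents; the sizes follow the 4:2:0 layout, where each chroma array holds a quarter as many bytes as the luma array:

```java
public class Nv21Assembly {
    // Concatenate the Y plane and two chroma planes into one NV21-sized
    // buffer. The destination offset of the third copy must be the sum of
    // the lengths of BOTH previous arrays, not just the second one.
    public static byte[] assemble(byte[] y, byte[] chromaA, byte[] chromaB) {
        byte[] c = new byte[y.length + chromaA.length + chromaB.length];
        System.arraycopy(y, 0, c, 0, y.length);
        System.arraycopy(chromaA, 0, c, y.length, chromaA.length);
        // Using chromaA.length alone here would overwrite the chromaA data
        // and leave the end of the buffer as zeros (grey after decoding).
        System.arraycopy(chromaB, 0, c, y.length + chromaA.length, chromaB.length);
        return c;
    }

    public static void main(String[] args) {
        int w = 4, h = 4;
        byte[] y = new byte[w * h];         // 16 luma bytes
        byte[] u = new byte[w * h / 4];     // 4 chroma bytes
        byte[] v = new byte[w * h / 4];     // 4 chroma bytes
        java.util.Arrays.fill(v, (byte) 3);
        byte[] c = assemble(y, u, v);
        System.out.println(c.length);        // 24 = w*h*3/2
        System.out.println(c[c.length - 1]); // 3: last chroma byte was actually written
    }
}
```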
I use camera.addCallbackBuffer(data); to reuse the buffer and avoid running out of memory. My code in the previewCallback looks like this:
checkFace(data, camera);
camera.addCallbackBuffer(data);
In the method checkFace I just convert data to a bitmap and then use FaceDetector to check for faces. I have also tried calling camera.addCallbackBuffer(data); after converting the data, but the native memory shown in the Android Studio profiler keeps growing:
After the app has been running for about 10 minutes, "Native" memory increases from 10 MB to 250 MB.
When the app has been running for about 4 hours, it crashes and Logcat prints:
E/IMemory (17967): cannot map BpMemoryHeap (binder=0x11515160), size=462848, fd=70 (Out of memory)
I think this may be caused by the steadily increasing native memory.
CODE:
camera1.setPreviewCallbackWithBuffer(previewCallback1);
camera1.addCallbackBuffer(buffer1);
camera1.startPreview();
...
private Camera.PreviewCallback previewCallback1 = (data, camera) -> {
    checkFace(data, camera);
    camera.addCallbackBuffer(data);
};

// convert data to bitmap then check face from the bitmap
private void checkFace(byte[] data, Camera camera) {
    ...
    ...run on new Thread...
    Bitmap bitmap = BitmapUtil.ByteToBitmap(data, camera.getParameters().getPreviewSize());
    ...
    FaceDetector detector = new FaceDetector(bitmap.getWidth(), bitmap.getHeight(), numberOfFace);
    ...then get the result of face detection
}
// convert frame data to bitmap
public static Bitmap ByteToBitmap(byte[] data, Camera.Size previewSize) {
    ByteArrayOutputStream baos = null;
    Bitmap bitmapOut = null;
    try {
        int w = previewSize.width;
        int h = previewSize.height;
        YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, w, h, null);
        baos = new ByteArrayOutputStream();
        yuvimage.compressToJpeg(new Rect(0, 0, w, h), 60, baos);
        byte[] jdata = baos.toByteArray();
        bitmapOut = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
        if (null == bitmapOut) {
            return bitmapOut;
        }
        jdata = null;
        yuvimage = null;
        Matrix matrix = new Matrix();
        matrix.postRotate(90);
        bitmapOut = Bitmap.createBitmap(bitmapOut, 0, 0, w, h, matrix, false);
    } catch (Exception e) {
        // ignored
    } finally {
        try {
            if (baos != null) {
                baos.flush();
                baos.close();
            }
        } catch (Exception e) {
            // ignored
        }
    }
    return bitmapOut;
}
So, what should I do to resolve this?
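Per-frame allocation is the usual cause of this kind of growth: every preview frame allocates a JPEG buffer and two bitmaps while frames keep arriving faster than detection finishes. One common mitigation is to drop frames while a detection is still in flight instead of queuing more work. A minimal plain-Java sketch of that gate (illustrative names, not an Android API):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class FrameGate {
    // True while a frame is being processed; further frames are dropped.
    private final AtomicBoolean busy = new AtomicBoolean(false);

    // Returns true if the caller won the right to process this frame.
    public boolean tryAcquire() {
        return busy.compareAndSet(false, true);
    }

    // Called when processing finishes, allowing the next frame through.
    public void release() {
        busy.set(false);
    }

    public static void main(String[] args) {
        FrameGate gate = new FrameGate();
        System.out.println(gate.tryAcquire()); // true: first frame is processed
        System.out.println(gate.tryAcquire()); // false: second frame is dropped
        gate.release();
        System.out.println(gate.tryAcquire()); // true: gate is free again
    }
}
```

In the callback above this would mean calling tryAcquire() before starting checkFace's worker thread and release() when it finishes, so at most one conversion is alive at a time.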
I am getting preview data from the camera in NV21 format. I want to save the preview to the SD card as a bitmap. My code gets the image and saves it, but in the gallery it is not the captured preview, just a black rectangle. Here is the code.
public void processImage() {
    Bitmap bitmap = null;
    if (flag) {
        flag = false;
        if (mCamera != null) {
            Camera.Parameters parameters = mCamera.getParameters();
            int imageFormat = parameters.getPreviewFormat();
            if (imageFormat == ImageFormat.NV21) {
                Toast.makeText(mContext, "Format: NV21", Toast.LENGTH_SHORT).show();
                int w = parameters.getPreviewSize().width;
                int h = parameters.getPreviewSize().height;
                YuvImage yuvImage = new YuvImage(mData, imageFormat, w, h, null);
                Rect rect = new Rect(0, 0, w, h);
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                yuvImage.compressToJpeg(rect, 100, baos);
                byte[] jData = baos.toByteArray();
                bitmap = BitmapFactory.decodeByteArray(jData, 0, jData.length);
            } else if (imageFormat == ImageFormat.JPEG || imageFormat == ImageFormat.RGB_565) {
                Toast.makeText(mContext, "Format: JPEG||RGB_565", Toast.LENGTH_SHORT).show();
                bitmap = BitmapFactory.decodeByteArray(mData, 0, mData.length);
            }
        }
        if (bitmap != null) {
            saveImage(bitmap);
            Toast.makeText(mContext, "Image Saved", Toast.LENGTH_SHORT).show();
        } else {
            Toast.makeText(mContext, "Bitmap Null", Toast.LENGTH_SHORT).show();
        }
    }
}
If you want to save the NV21 preview image so it can be viewed in the gallery, the easiest way is to create a YuvImage from the NV21 byte array and then compress it to a JPEG through a file output stream, like the code below:
FileOutputStream fos = new FileOutputStream(Environment.getExternalStorageDirectory() + "/imagename.jpg");
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, fos);
fos.close();
Please note that you probably want to change the path to save the image.
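One sanity check worth doing before constructing the YuvImage: an NV21 frame holds width*height luma bytes plus width*height/2 interleaved chroma bytes, so the array length must be width*height*3/2, or compressToJpeg will fail or produce garbage. A small plain-Java helper (illustrative, no Android dependency):

```java
public class Nv21Size {
    // Expected byte count of an NV21 frame: a full-resolution Y plane
    // (w*h bytes) plus a half-size interleaved VU plane (w*h/2 bytes).
    public static int expectedLength(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        System.out.println(expectedLength(640, 480)); // 460800
    }
}
```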
I'm trying to get the byte[] from the camera preview, convert it to a bitmap, and display it in an ImageView with imageView.setImageBitmap().
I've managed to start the preview and display it on a SurfaceView, but I don't know how to convert the byte[] data (which I think comes in a YUV format) into an RGB bitmap to display in an ImageView.
The code I'm trying is the following:
camera = Camera.open();
parameters = camera.getParameters();
camera.setParameters(parameters);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
camera.setPreviewDisplay(surfaceHolder);
camera.setPreviewCallback(this);
camera.startPreview();
and the preview callback is this:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    ByteArrayOutputStream outstr = new ByteArrayOutputStream();
    Rect rect = new Rect(0, 0, width, height);
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    yuvimage.compressToJpeg(rect, 100, outstr);
    Bitmap bmp = BitmapFactory.decodeByteArray(outstr.toByteArray(), 0, outstr.size());
    imgView1.setImageBitmap(bmp);
}
The preview works, but the ImageView remains empty. Any ideas?
It is possible that you did not open the camera on the UI thread. In any case, you need to ensure setImageBitmap is called on the UI thread:
@Override
public void onPreviewFrame(final byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 50, out);
    byte[] bytes = out.toByteArray();
    final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    MyActivity.this.runOnUiThread(new Runnable() {
        @Override
        public void run() {
            ((ImageView) findViewById(R.id.loopback)).setImageBitmap(bitmap);
        }
    });
}
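Compressing to JPEG and decoding back is easy but lossy and slow. As an alternative, the YUV values can be converted to RGB directly with the standard BT.601 integer approximation. A per-pixel sketch in plain Java (no Android dependency; applying it to a full NV21 frame would additionally require indexing the interleaved VU plane):

```java
public class YuvToRgb {
    // Convert one YUV pixel (values 0..255, as in NV21 data) to a packed
    // ARGB int using the common BT.601 integer approximation.
    public static int yuvToArgb(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        return 0xff000000 | (r << 16) | (g << 8) | b;
    }

    // Keep each channel inside the valid 0..255 range.
    static int clamp(int x) { return x < 0 ? 0 : (x > 255 ? 255 : x); }

    public static void main(String[] args) {
        // Neutral chroma (u = v = 128) must give a grey pixel: r == g == b.
        int argb = yuvToArgb(128, 128, 128);
        int r = (argb >> 16) & 0xff, g = (argb >> 8) & 0xff, b = argb & 0xff;
        System.out.println(r == g && g == b); // true
    }
}
```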
How do I cut out the middle area of a bitmap? Here is my sample code:
public void onPictureTaken(byte[] paramArrayOfByte, Camera paramCamera) {
    FileOutputStream fileOutputStream = null;
    try {
        File saveDir = new File("/sdcard/CameraExample/");
        if (!saveDir.exists()) {
            saveDir.mkdirs();
        }
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = 5;
        Bitmap myImage = BitmapFactory.decodeByteArray(paramArrayOfByte, 0, paramArrayOfByte.length, options);
        Bitmap bmpResult = Bitmap.createBitmap(myImage.getWidth(), myImage.getHeight(), Config.RGB_565);
        int length = myImage.getHeight() * myImage.getWidth();
        int[] pixels = new int[length];
        myImage.getPixels(pixels, 0, myImage.getWidth(), 0, 0, myImage.getWidth(), myImage.getHeight());
        Bitmap TygolykovLOL = Bitmap.createBitmap(pixels, 0, myImage.getWidth(), myImage.getWidth(), myImage.getHeight(), Config.RGB_565);
        Paint paint = new Paint();
        Canvas myCanvas = new Canvas(bmpResult);
        myCanvas.drawBitmap(TygolykovLOL, 0, 0, paint);
        // the data is compressed as PNG, so use a .png extension
        fileOutputStream = new FileOutputStream("/sdcard/CameraExample/" + "1ggggqqqqGj2.png");
        BufferedOutputStream bos = new BufferedOutputStream(fileOutputStream);
        bmpResult.compress(CompressFormat.PNG, 100, bos);
        bos.flush();
        bos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
You might want to use the other overload of createBitmap - it has x, y, width and height parameters which you could use to crop the middle portion of the bitmap into a new bitmap.
Something like this:
Bitmap cropped = Bitmap.createBitmap(sourceBitmap, 50, 50, sourceBitmap.getWidth() - 100, sourceBitmap.getHeight() - 100);
which crops away 50 pixels from each edge.
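The resulting size arithmetic can be checked without Android: trimming a margin from every side shrinks each dimension by twice the margin. A tiny plain-Java helper (illustrative names):

```java
public class CenterCrop {
    // Width and height of a bitmap after trimming `margin` pixels from
    // every edge; each dimension loses 2 * margin.
    public static int[] croppedSize(int width, int height, int margin) {
        return new int[] { width - 2 * margin, height - 2 * margin };
    }

    public static void main(String[] args) {
        int[] size = croppedSize(640, 480, 50);
        System.out.println(size[0] + "x" + size[1]); // 540x380
    }
}
```

With a real bitmap this corresponds to Bitmap.createBitmap(source, margin, margin, size[0], size[1]).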