I am getting preview data from the camera in NV21 format, and I want to save the preview frame to the SD card as a bitmap. My code gets the image and saves it, but what shows up in the gallery is not the captured preview, just a black rectangle. Here is the code:
public void processImage() {
    Bitmap bitmap = null;
    if (flag) {
        flag = false;
        if (mCamera != null) {
            Camera.Parameters parameters = mCamera.getParameters();
            int imageFormat = parameters.getPreviewFormat();
            if (imageFormat == ImageFormat.NV21) {
                Toast.makeText(mContext, "Format: NV21", Toast.LENGTH_SHORT).show();
                int w = parameters.getPreviewSize().width;
                int h = parameters.getPreviewSize().height;
                YuvImage yuvImage = new YuvImage(mData, imageFormat, w, h, null);
                Rect rect = new Rect(0, 0, w, h);
                ByteArrayOutputStream baos = new ByteArrayOutputStream();
                yuvImage.compressToJpeg(rect, 100, baos);
                byte[] jData = baos.toByteArray();
                bitmap = BitmapFactory.decodeByteArray(jData, 0, jData.length);
            } else if (imageFormat == ImageFormat.JPEG || imageFormat == ImageFormat.RGB_565) {
                Toast.makeText(mContext, "Format: JPEG||RGB_565", Toast.LENGTH_SHORT).show();
                bitmap = BitmapFactory.decodeByteArray(mData, 0, mData.length);
            }
        }
        if (bitmap != null) {
            saveImage(bitmap);
            Toast.makeText(mContext, "Image Saved", Toast.LENGTH_SHORT).show();
        } else {
            Toast.makeText(mContext, "Bitmap Null", Toast.LENGTH_SHORT).show();
        }
    }
}
If you want to save the NV21 preview image so it can be viewed in the gallery, the easiest way is to create a YuvImage from the NV21 byte array and then compress it to a JPEG through a file output stream, like the code below:
FileOutputStream fos = new FileOutputStream(Environment.getExternalStorageDirectory() + "/imagename.jpg");
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, fos);
fos.close();
Please note that you probably want to change the path to save the image.
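If the saved file still doesn't appear, many gallery apps only show files the media scanner has already indexed. A minimal sketch of triggering a scan for the file written above (context is whatever Context you have at hand; the path matches the example):
MediaScannerConnection.scanFile(
        context,
        new String[] { Environment.getExternalStorageDirectory() + "/imagename.jpg" },
        new String[] { "image/jpeg" }, // MIME type of the saved file
        null);                         // no completion callback needed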
I use camera.addCallbackBuffer(data); to reuse the buffer and avoid running out of memory. My code in the previewCallback looks like:
checkFace(data, camera);
camera.addCallbackBuffer(data);
In the checkFace method I just convert the data to a bitmap and then use FaceDetector to check for faces. I have also tried calling camera.addCallbackBuffer(data); after converting the data, but the native memory shown in the Android Studio profiler keeps growing:
after my app runs for about 10 minutes, "Native" increases from 10 MB to 250 MB.
When my app has been running for about 4 hours, it crashes and Logcat prints:
E/IMemory (17967): cannot map BpMemoryHeap (binder=0x11515160), size=462848, fd=70 (Out of memory)
I think this may be because of the steadily increasing native memory.
CODE:
camera1.setPreviewCallbackWithBuffer(previewCallback1);
camera1.addCallbackBuffer(buffer1);
camera1.startPreview();
...
private Camera.PreviewCallback previewCallback1 = (data, camera) -> {
checkFace(data, camera);
camera.addCallbackBuffer(data);
};
//convert data to bitmap then check face from the bitmap
private void checkFace(byte[] data, Camera camera){
...
...run on new Thread...
Bitmap bitmap = BitmapUtil.ByteToBitmap(data, camera.getParameters().getPreviewSize());
...
FaceDetector detector = new FaceDetector(bitmap.getWidth(), bitmap.getHeight(), numberOfFace);
...then get the result of face detection
}
//convert frame data to bitmap
public static Bitmap ByteToBitmap(byte[] data, Camera.Size previewSize) {
    ByteArrayOutputStream baos = null;
    Bitmap bitmapOut = null;
    try {
        int w = previewSize.width;
        int h = previewSize.height;
        YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, w, h, null);
        baos = new ByteArrayOutputStream();
        yuvimage.compressToJpeg(new Rect(0, 0, w, h), 60, baos);
        byte[] jdata = baos.toByteArray();
        bitmapOut = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
        if (null == bitmapOut) {
            return bitmapOut;
        }
        jdata = null;
        yuvimage = null;
        Matrix matrix = new Matrix();
        matrix.postRotate(90);
        bitmapOut = Bitmap.createBitmap(bitmapOut, 0, 0, w, h, matrix, false);
    } catch (Exception e) {
    } finally {
        try {
            if (baos != null) {
                baos.flush();
                baos.close();
            }
        } catch (Exception e) {
        }
    }
    return bitmapOut;
}
So, what should I do to resolve this?
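One plausible mitigation (my suggestion, assuming the growth comes from allocating a new YuvImage, ByteArrayOutputStream, and two Bitmaps for every single preview frame): drop frames while a detection is still in flight, and recycle each bitmap as soon as the detector is done with it. A rough sketch, where numberOfFace and BitmapUtil come from the code above and the AtomicBoolean is new:
private final AtomicBoolean busy = new AtomicBoolean(false);

private void checkFace(byte[] data, Camera camera) {
    // Skip this frame entirely if the previous one is still being processed.
    if (!busy.compareAndSet(false, true)) return;
    // Copy the frame: the buffer is handed back to the camera via
    // addCallbackBuffer() as soon as this method returns.
    final byte[] frame = Arrays.copyOf(data, data.length);
    final Camera.Size size = camera.getParameters().getPreviewSize();
    new Thread(() -> {
        Bitmap bitmap = BitmapUtil.ByteToBitmap(frame, size);
        try {
            if (bitmap != null) {
                FaceDetector detector = new FaceDetector(
                        bitmap.getWidth(), bitmap.getHeight(), numberOfFace);
                // ... detector.findFaces(...) and handle the result ...
            }
        } finally {
            if (bitmap != null) bitmap.recycle(); // free pixel memory promptly
            busy.set(false);
        }
    }).start();
}
Note also that android.media.FaceDetector only accepts RGB_565 bitmaps, so the decode in ByteToBitmap may need BitmapFactory.Options.inPreferredConfig set to RGB_565 for findFaces to work at all.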
This is the function that I use:
public void vidyoConferenceFrameReceivedCallback(final int participantId, final int width, final int height, final byte[] rawImageBytes) {
    if (selfView == null || selfView.getVisibility() == View.GONE)
        return;
    try {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        YuvImage yuvImage = new YuvImage(rawImageBytes, ImageFormat.NV21, width, height, null);
        yuvImage.compressToJpeg(new Rect(0, 0, width / 2, height / 2), 50, out);
        byte[] imageBytes = out.toByteArray();
        final Bitmap image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
        new Handler(Looper.getMainLooper()).post(new Runnable() {
            @Override
            public void run() {
                selfView.setImageBitmap(image);
                System.gc();
            }
        });
    } catch (Exception e) {
        Logger.info("Error on vidyoConferenceFrameReceivedCallback: " + e.getMessage());
    }
}
And this is being called from a Video SDK which sends the byte array.
I have also tried this function: convertYUV420_NV21toRGB8888
from the following link: Extract black and white image from android camera's NV21 format
And both times this is the image that I get back:
What could go wrong here?
EDIT:
I also tried with RenderScript:
try {
    byte[] outBytes = new byte[W * H * 4];
    rs = RenderScript.create(getActivity());
    yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.RGBA_8888(rs));
    Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs))
            .setX(W).setY(H)
            .setYuvFormat(android.graphics.ImageFormat.NV21);
    Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);
    Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs))
            .setX(W).setY(H);
    Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);
    in.copyFrom(rawImageBytes);
    yuvToRgbIntrinsic.setInput(in);
    yuvToRgbIntrinsic.forEach(out);
    out.copyTo(outBytes);
    final Bitmap bmpout = Bitmap.createBitmap(W, H, Bitmap.Config.ARGB_8888);
    out.copyTo(bmpout);
    new Handler(Looper.getMainLooper()).post(new Runnable() {
        @Override
        public void run() {
            remoteView.setImageBitmap(bmpout);
            System.gc();
        }
    });
} catch (Exception e) {
    Logger.info("Error on vidyoConferenceFrameReceivedRemoteCallback: " + e.getMessage());
}
But the result is the same. How can I verify that the byte array I get from the camera is valid?
I also included a file here, that shows the byte array I receive:
https://www.dropbox.com/s/fbubwpx06ypr61e/byte%20array.txt?dl=0
The image you attached is a very low quality 320x180 I420 frame.
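Since YuvImage only understands NV21 (and YUY2), an I420 buffer has to be repacked before the compressToJpeg call. A minimal sketch, assuming the data really is plain I420 (planar Y, then U, then V, with no row padding):
// Repack planar I420 (Y, U, V) into NV21 (Y, interleaved V/U).
public static byte[] i420ToNv21(byte[] i420, int width, int height) {
    final int ySize = width * height;
    final int uSize = ySize / 4;
    byte[] nv21 = new byte[ySize + 2 * uSize];
    // The Y plane is identical in both layouts.
    System.arraycopy(i420, 0, nv21, 0, ySize);
    // I420 stores U and V as separate planes; NV21 interleaves them as V,U pairs.
    for (int i = 0; i < uSize; i++) {
        nv21[ySize + 2 * i] = i420[ySize + uSize + i]; // V
        nv21[ySize + 2 * i + 1] = i420[ySize + i];     // U
    }
    return nv21;
}
The result can then be fed to the existing YuvImage path unchanged.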
There's a green tint after saving the image. The preview image on the surface holder looks normal. What can I do to remove this?
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            if (saveImg) {
                int width = parameters.getPreviewSize().width;
                int height = parameters.getPreviewSize().height;
                YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
                byte[] bytes = out.toByteArray();
                final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                File file = new File(Environment.getExternalStorageDirectory() + "/newimage.png");
                ByteArrayOutputStream os = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, os);
                byte[] blobArray = os.toByteArray();
                FileOutputStream fos = new FileOutputStream(file);
                fos.write(blobArray);
                fos.close();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
});
I ended up setting this value for the white-balance to remove the green tint:
parameters.set("whitebalance", WHITE_BALANCE_FLUORESCENT);
mCamera.setParameters(parameters);
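The same setting can also be applied through the typed setter on Camera.Parameters instead of the raw string key; as far as I know the two are equivalent:
parameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_FLUORESCENT);
mCamera.setParameters(parameters);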
I'm developing an application that displays a photo from the camera using the camera intent with the extra crop option. The code works fine on most devices, but when I tried it on my brand new Galaxy Note 3 it crashed and didn't run properly. Also, the image taken is still huge, almost 4 MB, which is far too large to display in the ImageView. Can anyone point out a way to avoid this?
Here is my code:
Intent intent = new Intent("android.media.action.IMAGE_CAPTURE");
file = getOutputMediaFile();
intent.putExtra("crop", "true");
intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(file));
intent.putExtra("outputFormat", Bitmap.CompressFormat.JPEG.toString());
intent.putExtra(MediaStore.EXTRA_SCREEN_ORIENTATION, ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
startActivityForResult(intent, ACTION_REQUEST_CAMERA);
and for onActivityResult:
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (resultCode == RESULT_OK) {
        switch (requestCode) {
        case ACTION_REQUEST_CAMERA:
            if (data != null) {
                try {
                    int inWidth = 0;
                    int inHeight = 0;
                    InputStream in = new FileInputStream(file.getAbsolutePath());
                    // decode image size (decode metadata only, not the whole image)
                    BitmapFactory.Options options = new BitmapFactory.Options();
                    options.inJustDecodeBounds = true;
                    BitmapFactory.decodeStream(in, null, options);
                    in.close();
                    in = null;
                    // save width and height
                    inWidth = options.outWidth;
                    inHeight = options.outHeight;
                    // decode full image pre-resized
                    in = new FileInputStream(file.getAbsolutePath());
                    options = new BitmapFactory.Options();
                    // calc rough re-size (this is no exact resize)
                    options.inSampleSize = Math.max(inWidth / 350, inHeight / 550);
                    // decode full image
                    Bitmap roughBitmap = BitmapFactory.decodeStream(in, null, options);
                    // calc exact destination size
                    Matrix m = new Matrix();
                    RectF inRect = new RectF(0, 0, roughBitmap.getWidth(), roughBitmap.getHeight());
                    RectF outRect = new RectF(0, 0, 700, 800);
                    m.setRectToRect(inRect, outRect, Matrix.ScaleToFit.CENTER);
                    float[] values = new float[9];
                    m.getValues(values);
                    // resize bitmap
                    Bitmap resizedBitmap = Bitmap.createScaledBitmap(roughBitmap,
                            (int) (roughBitmap.getWidth() * values[0]),
                            (int) (roughBitmap.getHeight() * values[4]), true);
                    // save image
                    try {
                        FileOutputStream out = new FileOutputStream(file.getAbsolutePath());
                        resizedBitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
                        out.close();
                        fullphoto = resizedBitmap;
                        setPic(file.getAbsolutePath(), camera);
                    } catch (Exception e) {
                        Log.e("Image", e.getMessage(), e);
                    }
                } catch (IOException e) {
                    Log.e("Image", e.getMessage(), e);
                }
            }
            // fullphoto = BitmapFactory.decodeFile(file.getAbsolutePath());
            // photo = decodeSampledBitmapFromFile(file.getAbsolutePath(), 100, 100);
            // camera.setImageBitmap(imghelper.getRoundedCornerBitmap(fullphoto, 10));
            iscamera = "Yes";
            firsttime = false;
            break;
        }
    }
}
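Worth noting, though this is an assumption on my part rather than something confirmed in the thread: the "crop" extra is an undocumented hack that many devices, Samsung models included, simply do not honor, which would explain both the crash and the full-size 4 MB output. A common fallback is to drop the extra and downsample the captured file yourself, using the standard power-of-two inSampleSize pattern that the commented-out decodeSampledBitmapFromFile call hints at:
// Sketch of the usual subsampled decode: read only the bounds first,
// pick the largest power-of-two sample size that still covers the
// requested dimensions, then decode for real.
public static Bitmap decodeSampledBitmapFromFile(String path, int reqWidth, int reqHeight) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true; // bounds only, no pixel allocation
    BitmapFactory.decodeFile(path, options);

    int inSampleSize = 1;
    while (options.outWidth / (inSampleSize * 2) >= reqWidth
            && options.outHeight / (inSampleSize * 2) >= reqHeight) {
        inSampleSize *= 2; // the decoder handles powers of two most efficiently
    }

    options.inJustDecodeBounds = false;
    options.inSampleSize = inSampleSize;
    return BitmapFactory.decodeFile(path, options);
}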
I'm trying to get the byte[] from the camera preview, convert it to a bitmap, and display it in an ImageView with imageView.setImageBitmap().
I've managed to start the preview and display it on a SurfaceView, but I don't know how to convert the byte[] data (which comes in a YUV format, I think) into an RGB bitmap to display in an ImageView.
The code I'm trying is the following:
camera = Camera.open();
parameters = camera.getParameters();
camera.setParameters(parameters);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
camera.setPreviewDisplay(surfaceHolder);
camera.setPreviewCallback(this);
camera.startPreview();
and the preview callback is this:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    ByteArrayOutputStream outstr = new ByteArrayOutputStream();
    Rect rect = new Rect(0, 0, width, height);
    YuvImage yuvimage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    yuvimage.compressToJpeg(rect, 100, outstr);
    Bitmap bmp = BitmapFactory.decodeByteArray(outstr.toByteArray(), 0, outstr.size());
    imgView1.setImageBitmap(bmp);
}
The preview works but the ImageView remains empty. Any ideas?
It is possible you did not open the Camera on the UI thread; in any case, you need to ensure setImageBitmap is called on the UI thread:
@Override
public void onPreviewFrame(final byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    int width = parameters.getPreviewSize().width;
    int height = parameters.getPreviewSize().height;
    YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(), width, height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, width, height), 50, out);
    byte[] bytes = out.toByteArray();
    final Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    MyActivity.this.runOnUiThread(new Runnable() {
        @Override
        public void run() {
            ((ImageView) findViewById(R.id.loopback)).setImageBitmap(bitmap);
        }
    });
}
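A small caveat worth adding (my observation, not part of the original answer): camera.getParameters() is relatively expensive, and onPreviewFrame fires for every frame, so it is cheaper to cache the preview size and format once, for example right after startPreview(), and reuse them in the callback. A sketch with hypothetical fields:
// Cache what onPreviewFrame needs once, instead of querying per frame.
private int previewWidth, previewHeight, previewFormat;

private void cachePreviewParameters(Camera camera) {
    Camera.Parameters p = camera.getParameters();
    previewWidth = p.getPreviewSize().width;
    previewHeight = p.getPreviewSize().height;
    previewFormat = p.getPreviewFormat();
}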