Why does BitmapFactory.decodeByteArray not work with camera previews? - android

I'm trying to use face detection with the camera, so I need to convert the byte array supplied to the preview callback into a bitmap, using this code:
Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        Bitmap mBitmap = BitmapFactory.decodeByteArray(data, 0, data.length, options);
        if (mBitmap == null) faceDetected = false;
        else faceDetected = (findFace(mBitmap) != null);
    }
};
Unfortunately, mBitmap is always null, and options.outHeight and options.outWidth are always -1, which indicates a decode error.
There are no diagnostics, so I can't tell what is going wrong.
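The likely cause: decodeByteArray only understands compressed image formats such as JPEG or PNG, while onPreviewFrame delivers raw YUV pixel data (NV21 by default), so the decode fails silently. A minimal sketch of the usual workaround, assuming the preview format is the default NV21 — compress the frame to JPEG in memory, then decode that:

```java
import java.io.ByteArrayOutputStream;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;

// Sketch: wrap the raw NV21 preview bytes in a YuvImage, compress to JPEG
// in memory, then decode that JPEG into a Bitmap.
Bitmap previewToBitmap(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 90, out);
    byte[] jpeg = out.toByteArray();
    return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
}
```

This does a full JPEG round trip per frame, so for real-time face detection a direct YUV-to-RGB conversion is cheaper, but it is the simplest way to get a valid Bitmap.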

Related

Android Camera.takePicture() saves pictures with 176x144 pixels

I have written an app with a camera preview. When I take a photo, the app saves it at 176x144 pixels. Can I change the width and the height?
I use the Camera API, not Camera2.
When I take photos with the stock camera app, it saves images at 2368x4208 pixels.
This is where the photos are taken:
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap rawImg = BitmapFactory.decodeByteArray(data, 0, data.length);
Log.e("TakePicture", "Picture callback reached");
shotAnalyser.startAnalysation(rawImg, data);
}
Here I grab the data and create a Bitmap:
public void startAnalysation(final Bitmap rawImg, final byte[] data){
Bitmap rawBitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
// Determine the required image properties
int width = rawBitmap.getWidth();
int height = rawBitmap.getHeight();
}
Here I tried to change the picture size:
public Camera getCameraInstance() {
    Camera c = null;
    try {
        releaseCameraAndPreview();
        c = Camera.open(Camera.CameraInfo.CAMERA_FACING_BACK); // attempt to get a Camera instance
        Log.e(LOG, "CameraInstance: " + c + " RUNS");
        Camera.Parameters parameters = c.getParameters();
        parameters.set("jpeg-quality", 70);
        parameters.setPictureFormat(PixelFormat.JPEG);
        parameters.setPictureSize(2048, 1232);
        c.setParameters(parameters);
    } catch (Exception e) {
        // camera is unavailable, or setParameters rejected the requested values
    }
    return c; // returns null if the camera is unavailable
}
I've googled a lot, but I can't find anything.
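A likely explanation: setPictureSize only accepts sizes the device actually reports via getSupportedPictureSizes(). If 2048x1232 is not supported, setParameters throws a RuntimeException, the empty catch block swallows it, and the camera stays on its default size. A sketch that queries the hardware and picks the largest supported size instead of hard-coding one:

```java
import java.util.List;
import android.hardware.Camera;

// Sketch: ask the hardware which picture sizes it supports and select the
// largest one, rather than guessing a resolution.
void setLargestPictureSize(Camera camera) {
    Camera.Parameters params = camera.getParameters();
    List<Camera.Size> sizes = params.getSupportedPictureSizes();
    Camera.Size largest = sizes.get(0);
    for (Camera.Size s : sizes) {
        if ((long) s.width * s.height > (long) largest.width * largest.height) {
            largest = s;
        }
    }
    params.setPictureSize(largest.width, largest.height);
    camera.setParameters(params); // throws if the parameters are rejected
}
```

Logging the contents of getSupportedPictureSizes() is also a quick way to see which resolutions your device offers.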

Not able to convert byte[] to Bitmap in onPreviewFrame() callback method

I am not able to convert a byte array to a Bitmap in the preview callback method. I'm using the following code:
public void onPreviewFrame(byte[] data, Camera camera) {
    Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
    Mat orig = new Mat(bmp.getHeight(), bmp.getWidth(), CvType.CV_8UC3);
    Bitmap myBitmap32 = bmp.copy(Bitmap.Config.ARGB_8888, true);
    Utils.bitmapToMat(myBitmap32, orig);
}
bmp is null. How can I correctly convert this data to a bitmap?
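bmp is null for the same reason as in the question above: the preview bytes are raw NV21 YUV, not a compressed image, so BitmapFactory cannot decode them. Since the goal here is an OpenCV Mat anyway, a hedged alternative (assuming the default NV21 preview format) is to skip the Bitmap entirely and convert the YUV bytes with Imgproc:

```java
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

// Sketch: convert a raw NV21 preview frame straight into a BGR Mat.
Mat previewToMat(byte[] data, int width, int height) {
    // NV21 stores height rows of Y followed by height/2 rows of interleaved VU,
    // so the single-channel YUV Mat is 1.5x the image height.
    Mat yuv = new Mat(height + height / 2, width, CvType.CV_8UC1);
    yuv.put(0, 0, data);
    Mat bgr = new Mat();
    Imgproc.cvtColor(yuv, bgr, Imgproc.COLOR_YUV2BGR_NV21);
    return bgr;
}
```

The width and height here are the preview size from camera.getParameters().getPreviewSize(), not the picture size.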

How to retrieve a new picture taken by the camera as OpenCV Mat on Android?

I am trying to take a picture with an Android device. The picture must be converted to a Mat so it can serve as input to a computation whose results I want to expose through an API.
In which format does Android provide the byte[] data in its callback, and how do I convert it to an OpenCV Mat in BGR color format?
The first problem, how to take the picture without a SurfaceView, is solved: I used a SurfaceTexture, which does not have to be visible.
mCamera = Camera.open();
mCamera.setPreviewTexture(new SurfaceTexture(10));
So I was able to start the preview and take a picture. But in which format is the byte[] data and how to convert it to an OpenCV BGR Mat?
mCamera.startPreview();
mCamera.takePicture(null, null, null, new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        Log.e(MainActivity.APP_ID, "picture-taken");
        android.hardware.Camera.Size pictureSize = camera.getParameters().getPictureSize();
        Mat mat = new Mat(new Size(pictureSize.width, pictureSize.height), CvType.CV_8U);
        mat.put(0, 0, data);
        mat.reshape(0, pictureSize.height);
        // Imgproc.cvtColor(mat, mat, Imgproc.COLOR_YUV420sp2RGBA);
        ......
As tokan pointed out in a comment on the question, this solution works great, because the byte[] delivered to the JPEG picture callback is JPEG-encoded and can therefore be decoded with imdecode:
android.hardware.Camera.Size pictureSize = camera.getParameters().getPictureSize();
Mat mat = new Mat(new Size(pictureSize.width, pictureSize.height), CvType.CV_8U);
mat.put(0, 0, data);
Mat img = Imgcodecs.imdecode(mat, Imgcodecs.CV_LOAD_IMAGE_UNCHANGED);
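To answer the BGR part of the question: imdecode returns the image in OpenCV's native BGR channel order, so no further color conversion is needed. A slightly more compact sketch of the same idea, assuming OpenCV 3.x or later where the loader flags are named IMREAD_*:

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfByte;
import org.opencv.imgcodecs.Imgcodecs;

// Sketch: decode the JPEG bytes from onPictureTaken directly into a BGR Mat.
Mat jpegBytesToBgrMat(byte[] data) {
    return Imgcodecs.imdecode(new MatOfByte(data), Imgcodecs.IMREAD_COLOR);
}
```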

Android byte[] to image in Camera.onPreviewFrame

When I try to convert the byte[] from Camera.onPreviewFrame to a Bitmap using BitmapFactory.decodeByteArray, I get the error SkImageDecoder::Factory returned null.
Following is my code:
public void onPreviewFrame(byte[] data, Camera camera) {
    Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
}
This was hard to find! Since API 8, there is a YuvImage class in android.graphics. It's not an Image descendant, so all you can do with it is save it as JPEG, but you can compress it to an in-memory stream and then decode that into a Bitmap if that's what you need.
import android.graphics.YuvImage;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        Camera.Parameters parameters = camera.getParameters();
        Size size = parameters.getPreviewSize();
        YuvImage image = new YuvImage(data, parameters.getPreviewFormat(),
                size.width, size.height, null);
        File file = new File(Environment.getExternalStorageDirectory()
                .getPath() + "/out.jpg");
        FileOutputStream filecon = new FileOutputStream(file);
        image.compressToJpeg(
                new Rect(0, 0, image.getWidth(), image.getHeight()), 90,
                filecon);
        filecon.close();
    } catch (IOException e) {
        Toast toast = Toast
                .makeText(getBaseContext(), e.getMessage(), Toast.LENGTH_LONG);
        toast.show();
    }
}
Since Android 3.0 you can use a TextureView with a SurfaceTexture to display the camera, and then call mTextureView.getBitmap() to retrieve a friendly RGB preview frame.
A very skeletal example of how to do this is given in the TextureView docs. Note that you'll have to set your application or activity to be hardware accelerated by putting android:hardwareAccelerated="true" in the manifest.
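A minimal sketch of that approach, assuming a field mTextureView that is already attached to the camera preview (the field name is illustrative, not from the original code):

```java
import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.view.TextureView;

// Sketch: grab an RGB frame each time the TextureView receives a new
// preview frame via its listener.
TextureView.SurfaceTextureListener listener = new TextureView.SurfaceTextureListener() {
    @Override public void onSurfaceTextureAvailable(SurfaceTexture s, int w, int h) { }
    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture s, int w, int h) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture s) { return true; }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture s) {
        Bitmap frame = mTextureView.getBitmap(); // RGB copy of the current frame
        // ... hand the frame to face detection, upload, etc.
    }
};
```

Note that getBitmap() copies the frame on every call, so it is convenient but not the fastest path for high frame rates.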
I found the answer after a long time. Here it is:
Instead of using BitmapFactory, I used my own method to decode the byte[] data into a valid image format. To do that, you need to know which picture format the camera is using, by calling camera.getParameters().getPictureFormat(). This returns a constant defined in ImageFormat. Once you know the format, use the appropriate decoder.
In my case, the byte[] data was in a YUV format, so I looked for YUV-to-BMP conversion and that solved my problem.
You can try this. This example sends camera frames to a server:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        byte[] jpeg = convertYuvToJpeg(data, camera);
        StringBuilder dataBuilder = new StringBuilder();
        dataBuilder.append("data:image/jpeg;base64,")
                .append(Base64.encodeToString(jpeg, Base64.DEFAULT));
        mSocket.emit("newFrame", dataBuilder.toString());
    } catch (Exception e) {
        Log.d("########", "ERROR");
    }
}

public byte[] convertYuvToJpeg(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    YuvImage image = new YuvImage(data, ImageFormat.NV21,
            size.width, size.height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int quality = 20; // low quality keeps the JPEG small enough to stream
    image.compressToJpeg(new Rect(0, 0, size.width, size.height), quality, baos);
    return baos.toByteArray();
}

Best way to scale size of camera picture before saving to SD

The code below is executed as the jpeg picture callback after TakePicture is called. If I save data to disk, it is a 1280x960 jpeg. I've tried to change the picture size but that's not possible as no smaller size is supported. JPEG is the only available picture format.
PictureCallback jpegCallback = new PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream out = null;
Bitmap bm = BitmapFactory.decodeByteArray(data, 0, data.length);
Bitmap sbm = Bitmap.createScaledBitmap(bm, 640, 480, false);
data.length is something like 500k, as expected. After executing BitmapFactory.decodeByteArray(), bm has a height and width of -1, so it appears the operation is failing.
It's unclear to me whether Bitmap can handle JPEG data. I would think not, but I have seen some code examples that seem to indicate it can.
Does data need to be in bitmap format before decoding and scaling?
If so, how do I do this?
Thanks!
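For what it's worth, BitmapFactory does decode JPEG data directly. When the goal is a smaller image, the standard technique is to read only the bounds first and then decode with inSampleSize, so the JPEG is downsampled during the decode itself rather than decoded at full size and scaled afterwards. A sketch under those assumptions:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

// Sketch: two-pass decode. First pass reads only the dimensions; second
// pass decodes with a power-of-two inSampleSize so the result is already
// close to the target size.
Bitmap decodeScaled(byte[] data, int targetW, int targetH) {
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inJustDecodeBounds = true; // dimensions only, no pixel allocation
    BitmapFactory.decodeByteArray(data, 0, data.length, opts);

    int sample = 1;
    while (opts.outWidth / (sample * 2) >= targetW
            && opts.outHeight / (sample * 2) >= targetH) {
        sample *= 2; // inSampleSize is rounded to a power of two anyway
    }
    opts.inJustDecodeBounds = false;
    opts.inSampleSize = sample;
    return BitmapFactory.decodeByteArray(data, 0, data.length, opts);
}
```

This uses far less memory than decoding the full 1280x960 image and calling createScaledBitmap on it.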
In your surfaceCreated, you can set the camera's picture size, as shown in the code below:
public void surfaceCreated(SurfaceHolder holder) {
    camera = Camera.open();
    try {
        camera.setPreviewDisplay(holder);
        Camera.Parameters p = camera.getParameters();
        p.set("jpeg-quality", 70);
        p.setPictureFormat(PixelFormat.JPEG);
        p.setPictureSize(640, 480);
        camera.setParameters(p);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
