I'm writing an Android application that saves a JPEG snapshot from the camera when the user clicks a button. Unfortunately, the JPEG file my code saves looks corrupted. It appears to be caused by my call to parameters.setPreviewSize (see code snippet below): if I remove that call, the image saves fine; however, without it I can't set the preview size, and setDisplayOrientation also appears to have no effect.
My app is targeting API Level 8 (Android 2.2), and I'm debugging on an HTC Desire HD. Not quite sure what I'm doing wrong here... any help would be very much appreciated!
Cheers,
Scottie
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
// Now that the size is known, set up the camera parameters and begin
// the preview.
Camera.Parameters parameters = mCamera.getParameters();
Camera.Size size = getBestPreviewSize(w,h);
// This next call is required in order for preview size to be set and
// setDisplayOrientation to take effect...
// Unfortunately it's also causing JPEG to be created wrong
parameters.setPreviewSize(size.width, size.height);
parameters.setPictureFormat(ImageFormat.JPEG);
mCamera.setParameters(parameters);
mCamera.setDisplayOrientation(90);
mCamera.startPreview();
}
// This is the snapshot button event handler
public void onSnapshotButtonClick(View target) {
//void android.hardware.Camera.takePicture(ShutterCallback shutter,
// PictureCallback raw, PictureCallback jpeg)
mPreview.mCamera.takePicture(null, null, mPictureCallback);
}
// This saves the camera snapshot as a JPEG file on the SD card
Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() {
public void onPictureTaken(byte[] imageData, Camera c) {
if (imageData != null) {
FileOutputStream outStream = null;
try {
    String myJpgPath = String.format(
            "/sdcard/%d.jpg", System.currentTimeMillis());
    outStream = new FileOutputStream(myJpgPath);
    outStream.write(imageData);
    Log.d("TestApp", "onPictureTaken - wrote bytes: "
            + imageData.length);
    c.startPreview();
    Toast.makeText(getApplicationContext(), String.format("%s written", myJpgPath), Toast.LENGTH_SHORT).show();
} catch (IOException e) {
    // FileNotFoundException is a subclass of IOException
    e.printStackTrace();
} finally {
    // Close the stream even when writing fails
    if (outStream != null) {
        try {
            outStream.close();
        } catch (IOException ignored) {
        }
    }
}
}
}
};
Another workaround is to match the aspect ratio between preview and picture sizes, i.e. setPreviewSize(w1,h1) and setPictureSize(w2,h2) with w1/h1 ~ w2/h2 (small differences seem to be OK). E.g. for the Desire HD S, w1=800,h1=480 with w2=2592,h2=1552 works, as does w1=960,h1=720 with w2=2592,h2=1952 (if you don't mind distorted images ;-)
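To make that concrete, here is a minimal sketch of an aspect-ratio-aware selection (a hypothetical variant of getBestPreviewSize; it assumes the picture size has already been set on the parameters):
// Hypothetical helper: pick the supported preview size whose aspect ratio
// best matches the current picture size, so preview and picture agree.
private Camera.Size getAspectMatchedPreviewSize(Camera.Parameters parameters) {
    Camera.Size picture = parameters.getPictureSize();
    double targetRatio = (double) picture.width / picture.height;
    Camera.Size best = null;
    double smallestDiff = Double.MAX_VALUE;
    for (Camera.Size candidate : parameters.getSupportedPreviewSizes()) {
        double diff = Math.abs((double) candidate.width / candidate.height - targetRatio);
        if (diff < smallestDiff) {
            smallestDiff = diff;
            best = candidate;
        }
    }
    return best;
}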
I assume that you are using a common implementation of the getBestPreviewSize(w,h) method that is floating about, where you cycle through the different getSupportedPreviewSizes() to find the best match. Although I am not certain as to why it causes the images to be distorted, I have found that calling the parameters.setPreviewSize(size.width, size.height) method with the output of the getBestPreviewSize method is what is causing the problem on the HTC Desire. I have also verified that by commenting it out, the distorted image issue goes away.
I'm trying to transport the image info from the Android camera to ROS in real time. However, I get an OOM (out-of-memory) problem. I'm new to Android-ROS and have almost no experience dealing with this kind of problem.
Here's some information about my demo (if you guys need more, please comment):
1. The main activity: public class MainActivity extends RosActivity implements NodeMain, SurfaceHolder.Callback, Camera.PreviewCallback
2. Dependencies: OpenCV for Android (3.2.0).
3. ROS message type: android_cv_bridge.
I'm trying to publish the image messages in the onPreviewFrame() function, with code like this:
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
Camera.Size size = camera.getParameters().getPreviewSize();
YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
Bitmap bmp = null;
if(yuvImage != null){
ByteArrayOutputStream baos = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, baos);
bmp = BitmapFactory.decodeByteArray(baos.toByteArray(), 0, baos.size());
try{
baos.flush();
baos.close();
}
catch(IOException e){
e.printStackTrace();
}
image = imagePublisher.newMessage();
Time curTime = connectedNode.getCurrentTime();
image.setEncoding("rgba8");
image.getHeader().setStamp(curTime);
image.getHeader().setFrameId("camera");
curTime = null;
if(isOpenCVInit){
Mat mat_image = new Mat(bmp.getHeight(), bmp.getWidth(), CvType.CV_8UC4, new Scalar(0));
Bitmap copyBmp = bmp.copy(Bitmap.Config.ARGB_8888, true);
// bitmap to mat
Utils.bitmapToMat(copyBmp, mat_image);
// mat to cvImage
CvImage cvImage = new CvImage(image.getHeader(), "rgba8", mat_image);
try {
imagePublisher.publish(cvImage.toImageMsg(image));
} catch (IOException e) {
e.printStackTrace();
}
mat_image.release();
mat_image = null;
if(!bmp.isRecycled()) {
bmp.recycle();
bmp = null;
}
if(!copyBmp.isRecycled()) {
copyBmp.recycle();
copyBmp = null;
}
cvImage =null;
image = null;
}
}
yuvImage = null;
System.gc();
}
The imagePublisher is initialized here:
@Override
public void onStart(ConnectedNode connectedNode) {
this.connectedNode = connectedNode;
imagePublisher = connectedNode.newPublisher(topic_name, sensor_msgs.Image._TYPE);
}
Well, I have tried my best to avoid the OOM problem. I also tried dropping OpenCV entirely and just dealing with the bitmap like this:
ChannelBufferOutputStream cbos = new ChannelBufferOutputStream(MessageBuffers.dynamicBuffer());
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.JPEG, 80, baos);
cbos.buffer().writeBytes(baos.toByteArray());
image.setData(cbos.buffer().copy());
cbos.buffer().clear();
imagePublisher.publish(image);
Unfortunately, it got worse. I doubt the way I'm trying to achieve this; is there a better way to do it?
I think your problem might be that your network can't transfer this amount of image data, and the OOM is caused by data stuck in buffers that has not yet been transferred.
I had similar issues when I wanted to transfer images from my Android device. If your problem is the same, you could solve it in several ways:
transfer data via USB tethering; it's generally much faster than WiFi or cellular and can carry even a raw image stream without compression at 30 fps 640x480. For JPEG I think you will be able to stream Full HD at 30 fps.
save the data on the phone to a ROS bag (http://wiki.ros.org/rosbag) and work with it afterwards. You lose real time here, but sometimes that's not needed. For this I actually wrote an application for Android, https://github.com/lamerman/ros_android_bag, and you can also download it directly from Google Play: https://play.google.com/store/apps/details?id=org.lamerman.rosandroidbag&hl=en
try to decrease the bandwidth even further (decrease image size, fps) or increase the network quality
About your second attempt, transferring JPEG instead of RAW data: have a look at this source code, where it's implemented correctly: https://github.com/rosjava/android_core/blob/kinetic/android_10/src/org/ros/android/view/camera/CompressedImagePublisher.java#L80
The problem of transferring via the network certainly applies to raw images, but it may also affect compressed ones if the image size is big and the frame rate is high.
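For illustration, here is the gist of that linked CompressedImagePublisher, adapted to the names used in the question; it assumes imagePublisher is re-declared as a publisher of sensor_msgs.CompressedImage._TYPE instead of sensor_msgs.Image._TYPE:
// Sketch modeled on the linked CompressedImagePublisher: compress the NV21
// preview bytes straight to JPEG and publish a CompressedImage, skipping the
// Bitmap/Mat round-trip (and its per-frame allocations) entirely.
private final ChannelBufferOutputStream stream =
        new ChannelBufferOutputStream(MessageBuffers.dynamicBuffer());

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    sensor_msgs.CompressedImage image = imagePublisher.newMessage();
    image.setFormat("jpeg");
    image.getHeader().setStamp(connectedNode.getCurrentTime());
    image.getHeader().setFrameId("camera");
    YuvImage yuvImage =
            new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
    yuvImage.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, stream);
    image.setData(stream.buffer().copy());
    stream.buffer().clear();
    imagePublisher.publish(image);
}
Reusing one ChannelBufferOutputStream across frames keeps per-frame garbage low, which is exactly what the OOM-prone Bitmap path was missing.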
I would like to get frames from a phone's camera, so I tried capturing video and used Matlab to find the video's frames per second: I got 250 frames per 10 seconds. But when I use
public void onPreviewFrame(byte[] data, Camera camera) {}
on Android, I only get 70 frames per 10 seconds.
Do you know why? I put my code below:
private Camera.PreviewCallback previewCallBack = new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
System.out.println("Get frame " + frameNumber);
if (data == null)
throw new NullPointerException();
Camera.Parameters p = camera.getParameters();
Camera.Size size = p.getPreviewSize();
if (frameNumber == 0) {
startTime = System.currentTimeMillis();
}
// Log.e("GetData", "Get frame " + frameNumber);
frameNumber++;
camera.addCallbackBuffer(data);
}
};
That's true; the Android video recorder does not use Camera.PreviewCallback, and it may be much faster than what you get with Java callbacks. The reason is that it can send the video frames from the camera to the hardware encoder inside the kernel, without ever putting the pixels into user space.
However, I have reliably achieved 30 FPS in Java on advanced devices, like the Nexus 4 or Galaxy S3. The secrets are: avoid garbage collection by using Camera.setPreviewCallbackWithBuffer(), and push the callbacks off the UI thread by using a HandlerThread.
Naturally, the preview callback itself should be optimized as thoroughly as possible. In your sample, the call to camera.getParameters() is slow and can be avoided. No allocations (new) should be made.
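As an illustration, here is a minimal sketch of that setup (names are mine; surface handling is omitted). Camera callbacks are delivered on the looper of the thread that opened the camera, so the camera is opened from the HandlerThread:
// Illustrative sketch: open the camera on a HandlerThread so onPreviewFrame()
// runs off the UI thread, and pre-register buffers to avoid per-frame garbage.
final HandlerThread cameraThread = new HandlerThread("CameraThread");
cameraThread.start();
new Handler(cameraThread.getLooper()).post(new Runnable() {
    @Override
    public void run() {
        mCamera = Camera.open(); // callbacks will arrive on this thread
        Camera.Parameters params = mCamera.getParameters();
        Camera.Size s = params.getPreviewSize();
        int bufferSize = s.width * s.height
                * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
        for (int i = 0; i < 3; i++) {
            mCamera.addCallbackBuffer(new byte[bufferSize]);
        }
        mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                // ... process data off the UI thread ...
                camera.addCallbackBuffer(data); // recycle the buffer
            }
        });
        // setPreviewDisplay(...) and startPreview() as usual
    }
});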
I am using the Zxing library for scanning only QRcode 39 in my application. Thanks to Sean for the wonderful work. It works fine, but the problem is that it takes a long time to scan. I am scanning with both the front camera and the rear camera.
I am using zxing project as library to my application.
With the help of the CameraInfo API, I find the front camera index and pass it through an intent to ScanCard, which extends CaptureActivity --> CameraManager.
public class ScanCard extends CaptureActivity {
@Override
public void handleDecode(Result rawResult, Bitmap barcode) {
// TODO Auto-generated method stub
super.handleDecode(rawResult, barcode);
mScanResult = rawResult.getText().toString();
}
}
In the CameraManager class, I changed things accordingly to use the front-facing camera for scanning, as shown below.
public void openDriver(SurfaceHolder holder, int myCamera)
throws IOException {
Camera theCamera = camera;
if (theCamera == null) {
theCamera = Camera.open(myCamera);
if (theCamera == null) {
throw new IOException();
}
camera = theCamera;
}
theCamera.setPreviewDisplay(holder);
if (!initialized) {
initialized = true;
configManager.initFromCameraParameters(theCamera);
if (requestedFramingRectWidth > 0 && requestedFramingRectHeight > 0) {
setManualFramingRect(requestedFramingRectWidth,
requestedFramingRectHeight);
requestedFramingRectWidth = 0;
requestedFramingRectHeight = 0;
}
}
configManager.setDesiredCameraParameters(theCamera);
SharedPreferences prefs = PreferenceManager
.getDefaultSharedPreferences(context);
reverseImage = prefs.getBoolean(PreferencesActivity.KEY_REVERSE_IMAGE,
false);
}
What should I do to make the scan faster? Thanks for the help.
While surfing around, I came across NimbleDroid. Is it good to go with NimbleDroid?
The core folder of https://github.com/zxing/zxing is enough to deal with Android. You don't need to use the android-xxx projects.
If you want to scan codes faster, you should use the ZBar library (http://zbar.sourceforge.net/), but it is under the GPL licence.
EDIT
int bitmapWidth = bitmap.getWidth();
int bitmapHeight = bitmap.getHeight();
int[] pixels = new int[bitmapWidth * bitmapHeight];
bitmap.getPixels(pixels, 0, bitmapWidth, 0, 0, bitmapWidth, bitmapHeight);
bitmap.recycle();
RGBLuminanceSource source = new RGBLuminanceSource(bitmapWidth, bitmapHeight, pixels);
BinaryBitmap binaryBitmap = new BinaryBitmap(new HybridBinarizer(source));
Reader reader = new MultiFormatReader();
try {
return reader.decode(binaryBitmap).toString();
} catch (Exception e) {
// nothing happens - entry is just not available in this frame
}
return null;
bitmap is a Bitmap object created from the camera video preview.
Here you can find an explanation of how to set up the camera preview.
On the Camera object you should set a PreviewCallback that gives you bytes which can be converted into a Bitmap. Zxing has a nice API, but their Android app is poor; the core library is all you need.
HTH
PS. Google "zbar android" - the first link contains an API for Android shared on GitHub...
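For what it's worth, here is a sketch of the decode path above adapted to work straight from the NV21 preview bytes, skipping the Bitmap conversion; the classes are from com.google.zxing (plus java.util for the hints), the helper name is mine, and restricting POSSIBLE_FORMATS to the one symbology you need is usually the biggest speed win:
// Illustrative: decode directly from NV21 preview bytes; restricting the
// possible formats makes MultiFormatReader try far fewer decoders per frame.
public Result decodePreviewFrame(byte[] data, int width, int height) {
    PlanarYUVLuminanceSource source = new PlanarYUVLuminanceSource(
            data, width, height, 0, 0, width, height, false);
    BinaryBitmap binaryBitmap = new BinaryBitmap(new HybridBinarizer(source));
    Map<DecodeHintType, Object> hints =
            new EnumMap<DecodeHintType, Object>(DecodeHintType.class);
    hints.put(DecodeHintType.POSSIBLE_FORMATS, EnumSet.of(BarcodeFormat.CODE_39));
    try {
        return new MultiFormatReader().decode(binaryBitmap, hints);
    } catch (NotFoundException e) {
        return null; // no barcode in this frame
    }
}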
Instead of adding all the packages of the zxing library, why don't you try adding the zxing project as a library in your application?
The following code is tested on the HTC Desire S, Galaxy S II, and the emulator. It works fine, but surprisingly it doesn't work on the Galaxy S Duos (GT-S7562). What happens is that all calls succeed with no exception, but the callbacks are never called.
public class CameraManager implements PictureCallback {
private final static String DEBUG_TAG = "CameraManager";
public void TakePicture() {
try {
_camera = Camera.open(cameraId);
Log.d(DEBUG_TAG, "Camera.TakePicture.open");
SurfaceView view = new SurfaceView(CameraManager.this.getContext());
_camera.setPreviewDisplay(view.getHolder());
Log.d(DEBUG_TAG, "Camera.TakePicture.setPreviewDisplay");
_camera.startPreview();
Log.d(DEBUG_TAG, "Camera.TakePicture.startPreview");
AudioManager manager = (AudioManager) CameraManager.super.getContext().getSystemService(Context.AUDIO_SERVICE);
Log.d(DEBUG_TAG, "Camera.TakePicture.AudioManager.ctor()");
manager.setStreamVolume(AudioManager.STREAM_SYSTEM, 0 , AudioManager.FLAG_REMOVE_SOUND_AND_VIBRATE);
Log.d(DEBUG_TAG, "Camera.TakePicture.setStreamVolume");
Camera.ShutterCallback shutter = new Camera.ShutterCallback() {
@Override
public void onShutter() {
AudioManager manager = (AudioManager) CameraManager.super.getContext().getSystemService(Context.AUDIO_SERVICE);
Log.d(DEBUG_TAG, "Camera.TakePicture.Shutter.AudioManager.ctor()");
manager.setStreamVolume(AudioManager.STREAM_SYSTEM, manager.getStreamMaxVolume(AudioManager.STREAM_SYSTEM) , AudioManager.FLAG_ALLOW_RINGER_MODES);
Log.d(DEBUG_TAG, "Camera.TakePicture.Shutter.setStreamVolume");
}
};
Camera.PictureCallback rawCallback = new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
if (data != null) {
Log.i(DEBUG_TAG, "Picture taken::RAW");
_camera.stopPreview();
_camera.release();
} else {
Log.wtf(DEBUG_TAG, "Picture NOT taken::RAW");
}
}
};
_camera.takePicture(shutter, rawCallback, CameraManager.this);
Log.d(DEBUG_TAG, "Camera.TakePicture.taken");
} catch (Exception err) {
err.printStackTrace();
Log.d(DEBUG_TAG, "Camera.TakePicture.Exception:: %s" + err.getMessage());
}
}
@Override
public void onPictureTaken(byte[] data, Camera camera) {
if (data != null) {
Log.i(DEBUG_TAG, "Picture taken::JPG");
_camera.stopPreview();
_camera.release();
} else {
Log.wtf(DEBUG_TAG, "Picture NOT taken::JPG");
}
}
}
Here's the logcat output for the execution of the above code. As you can see, the callbacks are not called:
[ 10-16 01:39:18.711 3873:0xf21 D/CameraManager ]
Camera.TakePicture.open
[ 10-16 01:39:18.891 3873:0xf21 D/CameraManager ]
Camera.TakePicture.setFrontCamera
[ 10-16 01:39:18.901 3873:0xf21 D/CameraManager ]
Camera.TakePicture.setPreviewDisplay
[ 10-16 01:39:18.901 3873:0xf21 D/CameraManager ]
Camera.TakePicture.startPreview
[ 10-16 01:39:18.901 3873:0xf21 D/CameraManager ]
Camera.TakePicture.AudioManager.ctor()
[ 10-16 01:39:19.001 3873:0xf21 D/CameraManager ]
Camera.TakePicture.setStreamVolume
[ 10-16 01:39:19.041 3873:0xf21 D/CameraManager ]
Camera.TakePicture.taken
I have also checked SO for similar problems with the Galaxy S and found the following code; I used it with no success:
Camera.Parameters parameters = camera.getParameters();
parameters.set("camera-id", 2);
// (800, 480) is also supported front camera preview size at Samsung Galaxy S.
parameters.setPreviewSize(640, 480);
camera.setParameters(parameters);
I was wondering if anyone could tell me what's wrong with my code, or whether there is some limitation of this model that doesn't allow taking pictures without showing a preview surface. If so, could you please let me know of any possible workaround? Note that this code is executed from an Android service.
The documentation is explicit: you must start the preview if you want to take a picture. From your code, it is not clear why the preview surface is not showing. IIRC, in Honeycomb and later you cannot play with the preview surface coordinates to move it off screen, but you can usually hide the preview surface behind some image view.
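One workaround I have seen used (an assumption on my part, not something from your post) is to attach a 1x1 preview surface to the window manager from the service, which keeps the preview alive without showing anything noticeable; it requires the SYSTEM_ALERT_WINDOW permission:
// Sketch: a tiny preview surface attached from a service; requires
// android.permission.SYSTEM_ALERT_WINDOW in the manifest.
SurfaceView preview = new SurfaceView(this);
WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
        1, 1,
        WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
        PixelFormat.TRANSLUCENT);
WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
wm.addView(preview, lp);
// once the surface is created, call setPreviewDisplay(preview.getHolder()),
// startPreview(), and only then takePicture()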
Camera.takePicture with a rawCallback requires calling addRawImageCallbackBuffer
(I ran into the problem too and had to go to the source code to figure it out.) When Camera.takePicture is called with a non-null second argument (the raw PictureCallback), the user must call Camera.addRawImageCallbackBuffer() at least once before takePicture() to start the supply of buffers for the data to be returned in. If this is not done, the image is discarded (and apparently the callbacks are not called).
This is a block comment from android.hardware.Camera.java for addRawImageCallbackBuffer():
Adds a pre-allocated buffer to the raw image callback buffer queue.
Applications can add one or more buffers to the queue. When a raw image
frame arrives and there is still at least one available buffer, the
buffer will be used to hold the raw image data and removed from the
queue. Then raw image callback is invoked with the buffer. If a raw
image frame arrives but there is no buffer left, the frame is
discarded. Applications should add buffers back when they finish
processing the data in them by calling this method again in order
to avoid running out of raw image callback buffers.
The size of the buffer is determined by multiplying the raw image
width, height, and bytes per pixel. The width and height can be
read from {@link Camera.Parameters#getPictureSize()}. Bytes per pixel
can be computed from
{@link android.graphics.ImageFormat#getBitsPerPixel(int)} / 8,
using the image format from {@link Camera.Parameters#getPreviewFormat()}.
This method is only necessary when the PictureCallback for raw image
is used while calling {@link #takePicture(Camera.ShutterCallback,
Camera.PictureCallback, Camera.PictureCallback, Camera.PictureCallback)}.
Please note that by calling this method, the mode for
application-managed callback buffers is triggered. If this method has
never been called, null will be returned by the raw image callback since
there is no image callback buffer available. Furthermore, when a supplied
buffer is too small to hold the raw image data, raw image callback will
return null and the buffer will be removed from the buffer queue.
@param callbackBuffer the buffer to add to the raw image callback buffer
queue. The size should be width * height * (bits per pixel) / 8. A
null callbackBuffer will be ignored and won't be added to the queue.
@see #takePicture(Camera.ShutterCallback,
Camera.PictureCallback, Camera.PictureCallback, Camera.PictureCallback)
Try your code with the 'raw' callback argument to takePicture() set to null.
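If you do want the raw callback, supplying the buffer looks roughly like this; note that addRawImageCallbackBuffer is hidden (@hide) in the public SDK on the releases I have checked, so this sketch invokes it via reflection, with the buffer size computed as the comment above describes:
// Sketch: queue a raw-image buffer before takePicture(). The method is not
// in the public SDK on many releases, hence the reflection; treat this as
// an illustration rather than a guaranteed API.
Camera.Parameters p = _camera.getParameters();
Camera.Size pictureSize = p.getPictureSize();
int bufferSize = pictureSize.width * pictureSize.height
        * ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
try {
    Method add = Camera.class.getMethod("addRawImageCallbackBuffer", byte[].class);
    add.invoke(_camera, new byte[bufferSize]);
} catch (Exception e) {
    Log.w(DEBUG_TAG, "addRawImageCallbackBuffer unavailable", e);
}
_camera.takePicture(shutter, rawCallback, CameraManager.this);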
I need to obtain raw preview data from the Camera object at least 15 frames per second, but I can only get a frame every 110 milliseconds, which means I get no more than 9 frames per second. I summarize my code below.
Camera mCamera = Camera.open();
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewFrameRate(30);
parameters.setPreviewFpsRange(15000,30000);
mCamera.setParameters(parameters);
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
//dataBufferSize stands for the byte size for a picture frame
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.setPreviewDisplay(videoCaptureViewHolder);
//videoCaptureViewHolder is a SurfaceHolder object
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
private long timestamp=0;
public synchronized void onPreviewFrame(byte[] data, Camera camera) {
Log.v("CameraTest","Time Gap = "+(System.currentTimeMillis()-timestamp));
timestamp=System.currentTimeMillis();
//do picture data process
camera.addCallbackBuffer(data);
return;
}
});
mCamera.startPreview();
In the summarized code above, dataBufferSize and videoCaptureViewHolder are defined and calculated or assigned in other statements.
When I run my code, I can see the preview on the screen and I get the log below:
...
V/CameraTest( 5396): Time Gap = 105
V/CameraTest( 5396): Time Gap = 112
V/CameraTest( 5396): Time Gap = 113
V/CameraTest( 5396): Time Gap = 115
V/CameraTest( 5396): Time Gap = 116
V/CameraTest( 5396): Time Gap = 113
V/CameraTest( 5396): Time Gap = 115
...
This means onPreviewFrame(byte[] data, Camera camera) is called every 110 milliseconds, so I can get no more than 9 frames per second. And no matter what preview frame rate I set with setPreviewFrameRate() and what preview fps range I set with setPreviewFpsRange(), the log stays the same.
Would someone give me some help with this problem? I need to obtain raw preview data from the Camera object at least 15 frames per second. Thank you in advance.
I put my entire code below.
CameraTest.java
package test.cameratest;
import java.io.IOException;
import java.util.Iterator;
import java.util.List;
import android.app.Activity;
import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.hardware.Camera.ErrorCallback;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.Size;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.SurfaceHolder.Callback;
public class CameraTestActivity extends Activity {
SurfaceView mVideoCaptureView;
Camera mCamera;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
mVideoCaptureView = (SurfaceView) findViewById(R.id.video_capture_surface);
SurfaceHolder videoCaptureViewHolder = mVideoCaptureView.getHolder();
videoCaptureViewHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
videoCaptureViewHolder.addCallback(new Callback() {
public void surfaceDestroyed(SurfaceHolder holder) {
}
public void surfaceCreated(SurfaceHolder holder) {
startVideo();
}
public void surfaceChanged(SurfaceHolder holder, int format,
int width, int height) {
}
});
}
private void startVideo() {
SurfaceHolder videoCaptureViewHolder = null;
try {
mCamera = Camera.open();
} catch (RuntimeException e) {
Log.e("CameraTest", "Camera Open filed");
return;
}
mCamera.setErrorCallback(new ErrorCallback() {
public void onError(int error, Camera camera) {
}
});
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewFrameRate(30);
parameters.setPreviewFpsRange(15000,30000);
List<int[]> supportedPreviewFps=parameters.getSupportedPreviewFpsRange();
Iterator<int[]> supportedPreviewFpsIterator=supportedPreviewFps.iterator();
while(supportedPreviewFpsIterator.hasNext()){
int[] tmpRate=supportedPreviewFpsIterator.next();
StringBuffer sb=new StringBuffer();
sb.append("supportedPreviewRate: ");
for(int i=tmpRate.length,j=0;j<i;j++){
sb.append(tmpRate[j]+", ");
}
Log.v("CameraTest",sb.toString());
}
List<Size> supportedPreviewSizes=parameters.getSupportedPreviewSizes();
Iterator<Size> supportedPreviewSizesIterator=supportedPreviewSizes.iterator();
while(supportedPreviewSizesIterator.hasNext()){
Size tmpSize=supportedPreviewSizesIterator.next();
Log.v("CameraTest","supportedPreviewSize.width = "+tmpSize.width+"supportedPreviewSize.height = "+tmpSize.height);
}
mCamera.setParameters(parameters);
if (null != mVideoCaptureView)
videoCaptureViewHolder = mVideoCaptureView.getHolder();
try {
mCamera.setPreviewDisplay(videoCaptureViewHolder);
} catch (Throwable t) {
}
Log.v("CameraTest","Camera PreviewFrameRate = "+mCamera.getParameters().getPreviewFrameRate());
Size previewSize=mCamera.getParameters().getPreviewSize();
int dataBufferSize=(int)(previewSize.height*previewSize.width*
(ImageFormat.getBitsPerPixel(mCamera.getParameters().getPreviewFormat())/8.0));
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.addCallbackBuffer(new byte[dataBufferSize]);
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
private long timestamp=0;
public synchronized void onPreviewFrame(byte[] data, Camera camera) {
Log.v("CameraTest","Time Gap = "+(System.currentTimeMillis()-timestamp));
timestamp=System.currentTimeMillis();
try{
camera.addCallbackBuffer(data);
}catch (Exception e) {
Log.e("CameraTest", "addCallbackBuffer error");
return;
}
return;
}
});
try {
mCamera.startPreview();
} catch (Throwable e) {
mCamera.release();
mCamera = null;
return;
}
}
private void stopVideo() {
if(null==mCamera)
return;
try {
mCamera.stopPreview();
mCamera.setPreviewDisplay(null);
mCamera.setPreviewCallbackWithBuffer(null);
mCamera.release();
} catch (IOException e) {
e.printStackTrace();
return;
}
mCamera = null;
}
public void finish(){
stopVideo();
super.finish();
};
}
AndroidManifest.xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="test.cameratest"
android:versionCode="1"
android:versionName="1.0">
<uses-sdk android:minSdkVersion="9" android:targetSdkVersion="10" android:maxSdkVersion="10"/>
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MOUNT_UNMOUNT_FILESYSTEMS"/>
<uses-permission android:name="android.permission.READ_CONTACTS"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.PROCESS_OUTGOING_CALLS"/>
<uses-permission android:name="android.permission.CALL_PHONE"/>
<uses-permission android:name="android.permission.BOOT_COMPLETED"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.WRITE_SETTINGS" />
<application android:icon="#drawable/icon" android:label="#string/app_name">
<activity android:name=".CameraTestActivity"
android:label="#string/app_name">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
</manifest>
I'm afraid you cannot. The preview frame rate setting is a hint for the camera application (which runs in a separate process), and it is free to accept or silently ignore it. It is also not related to preview frame retrieval.
When you request a preview frame, you just tell the external camera application that you would like to have it. The buffer for it is allocated in the camera application and then passed to your activity via an mmapped memory segment; this takes time.
You may get the desired performance on some devices, but not necessarily on the one you are playing with.
If you need a defined frame rate, you will have to capture video and then parse/decompress the resulting binary stream, as sketched below.
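A minimal sketch of that video-capture route, assuming mCamera is already open and previewing on videoCaptureViewHolder (the output path is illustrative):
// Sketch: record H.264 video and parse/decode the file afterwards instead
// of relying on per-frame preview callbacks.
mCamera.unlock(); // hand the camera over to MediaRecorder
MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(mCamera);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoFrameRate(30); // still only a request to the hardware
recorder.setOutputFile("/sdcard/capture.mp4"); // illustrative path
recorder.setPreviewDisplay(videoCaptureViewHolder.getSurface());
try {
    recorder.prepare();
    recorder.start();
} catch (IOException e) {
    e.printStackTrace();
}
// later: recorder.stop(); recorder.release(); mCamera.reconnect();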
My experience with the camera stuff has been fiddly and hardware dependent, so try running it on other hardware sometime if you can.
It might also be worth trying some more camera settings. Check out
setRecordingHint(boolean hint).
You could even try FOCUS_MODE.
Thanks for including a code sample, btw.
This should not be a problem. My androangelo app (it's in the Market) gets up to 30 frames per second (at least, I implemented a rate brake to slow it down).
Please check carefully whether your log is filled with garbage-collector statements. This is the case if too few buffers are added. That was the trick for me: I ended up adding 20 (!) buffers to the camera, as sketched below.
Then the processing of each frame should take place in a separate thread. While an image is in the thread for processing, the callback should skip the current frame.
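For concreteness, the buffer advice amounts to something like this, with dataBufferSize computed exactly as in the question's code:
// Register plenty of buffers so the camera never stalls waiting for one
// while a frame is still being processed; this also avoids per-frame
// allocation and the GC churn it causes.
for (int i = 0; i < 20; i++) {
    mCamera.addCallbackBuffer(new byte[dataBufferSize]);
}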
In my understanding, Android does not let you set a fixed frame rate, nor does it guarantee that the fps value you specify will be respected; this is due to the frame exposure time, which is set by the camera hardware or firmware. The frame rate you observe may be a function of lighting conditions. For example, a certain phone may give you a 30 fps preview rate in daylight but only 7 fps when filming in a low-light condition.
One thing that seems to increase the fluidity of the preview, if not the actual FPS necessarily, is setting the previewFormat to YV12 if supported. It's fewer bytes to copy, 16-byte aligned, and possibly optimized in other ways:
// PREVIEW FORMATS
List<Integer> supportedPreviewFormats = parameters.getSupportedPreviewFormats();
Iterator<Integer> supportedPreviewFormatsIterator = supportedPreviewFormats.iterator();
while(supportedPreviewFormatsIterator.hasNext()){
Integer previewFormat =supportedPreviewFormatsIterator.next();
// 16 ~ NV16 ~ YCbCr
// 17 ~ NV21 ~ YCbCr ~ DEFAULT
// 4 ~ RGB_565
// 256~ JPEG
// 20 ~ YUY2 ~ YcbCr ...
// 842094169 ~ YV12 ~ 4:2:0 YCrCb comprised of WXH Y plane, W/2xH/2 Cr & Cb. see documentation
Log.v("CameraTest","Supported preview format:"+previewFormat);
if (previewFormat == ImageFormat.YV12) {
parameters.setPreviewFormat(previewFormat);
Log.v("CameraTest","SETTING FANCY YV12 FORMAT");
}
}
http://developer.android.com/reference/android/graphics/ImageFormat.html#YV12 describes the format. This plus a few spare buffers gives me "Time Gaps" as low as 80, which is still not "good enough", but... better? (Actually I've got one at 69, but really they're more around 90 on average.) I'm not sure how much the logging is slowing things down.
Setting the preview size to 320x240 (versus 1280x720) gets things down to the 50-70 ms range, so maybe that's what you need to do? Admittedly, that little data may be a lot less useful.
// all tested on Nexus4
I usually declare a global boolean lockCameraUse. The callback function then usually looks like this.
public void onPreviewFrame(byte[] data, Camera camera) {
if (lockCameraUse) {
camera.addCallbackBuffer(data);
return;
}
lockCameraUse = true;
// processinng data
// done processing data
camera.addCallbackBuffer(data);
lockCameraUse = false;
return;
}
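One refinement (my adaptation, not the original poster's code): if onPreviewFrame and the frame processing ever run on different threads, a plain boolean gives no visibility guarantees, so an AtomicBoolean is safer:
// Adaptation: AtomicBoolean makes the skip-frame flag's test-and-set atomic
// and visible across threads.
private final AtomicBoolean busy = new AtomicBoolean(false);

public void onPreviewFrame(byte[] data, Camera camera) {
    if (!busy.compareAndSet(false, true)) {
        camera.addCallbackBuffer(data); // still busy; skip this frame
        return;
    }
    try {
        // ... process data ...
    } finally {
        camera.addCallbackBuffer(data); // recycle the buffer either way
        busy.set(false);
    }
}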