I am using OpenCV to attempt to do some live video processing. Since the processing is fairly heavy, it delays the output frames significantly, making the live stream look choppy.
I'd like to offload some of the processing into an AsyncTask. I've tried it and it actually makes the video much smoother. However, it ends up starting a large amount of Tasks at once, and then they will slowly start returning with some results.
Is there any way to slow this down and wait for a result, either by using synchronized blocks or some other method?
On each camera frame, I start one of these tasks. DoImgProcessing does the long processing and returns a string result.
private class LongOperation extends AsyncTask<Mat, Void, String> {
    @Override
    protected String doInBackground(Mat... params) {
        Mat inputFrame = params[0];
        cropToCenter(inputFrame);
        return doImgProcessing(inputFrame);
    }

    @Override
    protected void onPostExecute(String result) {
        Log.d(TAG, "on post execute: " + result);
    }

    @Override
    protected void onPreExecute() {
        Log.d(TAG, "on pre execute");
    }
}
public Mat onCameraFrame(Mat inputFrame) {
    inputFrame.copyTo(mRgba); // this will be used for the live stream
    LongOperation op = new LongOperation();
    op.execute(inputFrame);
    return mRgba;
}
I would do something like this:
// Example value for a timeout.
private static final long TIMEOUT = 1000L;
private BlockingQueue<Mat> frames = new LinkedBlockingQueue<Mat>();
Thread worker = new Thread() {
    @Override
    public void run() {
        while (running) {
            Mat inputFrame;
            try {
                // poll() can also be interrupted via Thread.interrupt(),
                // so we don't have to wait out the full timeout.
                inputFrame = frames.poll(TIMEOUT, TimeUnit.MILLISECONDS);
            } catch (InterruptedException e) {
                break;
            }
            if (inputFrame == null) {
                continue; // timed out, no frame available yet
            }
            cropToCenter(inputFrame);
            String result = doImgProcessing(inputFrame);
        }
    }
};
worker.start();
public Mat onCameraFrame(Mat inputFrame) {
    inputFrame.copyTo(mRgba); // this will be used for the live stream
    frames.offer(inputFrame); // offer() never blocks on an unbounded queue
    return mRgba;
}
onCameraFrame() puts the frame on the queue; the worker thread polls it from the queue.
This decouples the reception and the processing of the frames. You can monitor the growth of the queue using frames.size().
This is a typical producer-consumer example.
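As a self-contained illustration of the same producer-consumer pattern in plain Java (a String queue and a "STOP" sentinel stand in for the Mat frames and the running flag, purely for demonstration):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class ProducerConsumerDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> frames = new LinkedBlockingQueue<>();
        StringBuilder results = new StringBuilder();

        // Consumer: analogous to the worker thread polling frames.
        Thread worker = new Thread(() -> {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    // Wait up to 100 ms for a frame; null means timeout.
                    String frame = frames.poll(100, TimeUnit.MILLISECONDS);
                    if (frame == null) continue;       // timed out, keep polling
                    if (frame.equals("STOP")) break;   // sentinel ends the loop
                    results.append(frame).append(' '); // "process" the frame
                }
            } catch (InterruptedException ignored) { }
        });
        worker.start();

        // Producer: analogous to onCameraFrame() handing off frames.
        for (int i = 0; i < 3; i++) frames.put("frame" + i);
        frames.put("STOP");
        worker.join(); // join() gives a happens-before edge, so reading results is safe
        System.out.println(results.toString().trim());
    }
}
```

A bounded ArrayBlockingQueue plus offer() would additionally let the producer drop frames when the consumer falls behind, instead of letting the queue grow.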
If you're doing this on each frame, it sounds like you need a thread instead. An AsyncTask is for when you want to do a one-off activity on another thread. Here you want to do it repeatedly. Just create a thread, and when it finishes a frame have it post a message to a handler to run the post step on the UI thread. It can wait on a semaphore at the top of its loop for the next frame to be ready.
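A minimal, plain-Java sketch of that loop, with a java.util.concurrent.Semaphore standing in for the "next frame is ready" signal and println() standing in for posting a message to a Handler (both are stand-ins for the Android pieces, not the real APIs):

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.atomic.AtomicReference;

public class FrameLoopDemo {
    private static final Semaphore frameReady = new Semaphore(0);
    private static final AtomicReference<String> latestFrame = new AtomicReference<>();
    private static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (running) {
                try {
                    frameReady.acquire(); // wait at the top of the loop for the next frame
                } catch (InterruptedException e) {
                    return; // interrupted during shutdown
                }
                String frame = latestFrame.get();
                if (frame != null) {
                    System.out.println("processed " + frame); // stand-in for the post step
                }
            }
        });
        worker.start();

        // Camera-callback side: publish the newest frame, then signal the worker.
        for (int i = 0; i < 2; i++) {
            latestFrame.set("frame" + i);
            frameReady.release();
            Thread.sleep(50); // purely to keep the demo output deterministic
        }
        running = false;
        worker.interrupt();
        worker.join();
    }
}
```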
Related
I am trying to write an application that triggers the Android camera at a fixed time interval. I was testing it with TimerTask; however, I read that I am not supposed to trigger the camera again until the JPEG is ready. Is there a method of triggering the camera at a fixed interval, letting the JPEG come when it's ready, then triggering it again and letting that next JPEG come when it's ready, etc., without causing some sort of heap overflow? Is there a way to do this with camera2?
Here are the relevant methods I have so far:
PictureCallback onPicTake = new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] bytes, Camera camera) {
        Log.d("data size", "" + bytes.length);
        Log.d("taken", "taken");
        new SaveImageTask(getStorage()).execute(bytes);
        resetCam();
    }
};
Camera.ShutterCallback onShutter = new Camera.ShutterCallback() {
    @Override
    public void onShutter() {
        AudioManager mgr = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
        // playSoundEffect() expects a sound-effect constant such as FX_KEY_CLICK
        mgr.playSoundEffect(AudioManager.FX_KEY_CLICK);
    }
};
private class CameraTrigger extends TimerTask {
    public void run() {
        mCamera.takePicture(onShutter, null, onPicTake);
    }
}
preview.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View arg0) {
        timer = new Timer();
        timer.schedule(new CameraTrigger(), 0, 1000);
    }
});
private void resetCam() {
    mCamera.startPreview();
    preview.setCamera(mCamera);
}
There is nothing terribly wrong in your code, as long as you know for sure that onPictureTaken() will not take more than 1000 ms.
One optimization that I would suggest is counterintuitively not to save the picture in a background task, but rather do it on the callback thread.
The reason is that the huge memory chunk of bytes cannot be easily garbage collected this way. From the point of view of JVM, the following pattern does not put a burden on garbage collector:
byte[] bytes = new byte[1024 * 1024]; // ~1 MB
// fill bytes with something
onPreviewFrame(bytes);
// nobody needs bytes again
// bytes memory is reclaimed
But if there are outstanding references to bytes, it may be hard for the GC to decide what to reclaim, and you can see spikes of CPU usage, the app not responding, and eventually even TimerTask callbacks being delayed.
Note that it is not healthy to use onPictureTaken() on the main (UI) thread. To keep the camera callbacks in background, you need to open the camera on a secondary Looper thread (see this example).
I'm developing an application which requires heavy image processing using camera input and real-time results display. I've decided to use OpenGL and OpenCV along with Android's normal camera API. So far it has become a bit of a multithreading nightmare, and unfortunately I feel very restricted by the lack of documentation on the onPreviewFrame() callback.
I am aware from the documentation that onPreviewFrame() is called on the thread which acquires the camera using Camera.open(). What confuses me is how this callback is scheduled - it seems to be at a fixed framerate. My current architecture relies on the onPreviewFrame() callback to initiate the image processing/display cycle, and it seems to go into deadlock when I block the camera callback thread for too long, so I suspect that the callback is inflexible when it comes to scheduling. I'd like to slow down the framerate to test this, but my device doesn't support this.
I started with the code over at http://maninara.blogspot.ca/2012/09/render-camera-preview-using-opengl-es.html. This code is not very parallel, and it is only meant to display exactly the data which the camera returns. For my needs, I adapted the code to draw bitmaps, and I use a dedicated thread to buffer the camera data to another dedicated heavy-lifting image processing thread (all outside of the OpenGL thread).
Here is my code (simplified):
CameraSurfaceRenderer.java
class CameraSurfaceRenderer implements GLSurfaceView.Renderer, SurfaceTexture.OnFrameAvailableListener,
Camera.PreviewCallback
{
static int[] surfaceTexPtr;
static CameraSurfaceView cameraSurfaceView;
static FloatBuffer pVertex;
static FloatBuffer pTexCoord;
static int hProgramPointer;
static Camera camera;
static SurfaceTexture surfaceTexture;
static Bitmap procBitmap;
static int[] procBitmapPtr;
static boolean updateSurfaceTex = false;
static ConditionVariable previewFrameLock;
static ConditionVariable bitmapDrawLock;
// MarkerFinder extends CameraImgProc
static MarkerFinder markerFinder = new MarkerFinder();
static Thread previewCallbackThread;
static
{
previewFrameLock = new ConditionVariable();
previewFrameLock.open();
bitmapDrawLock = new ConditionVariable();
bitmapDrawLock.open();
}
CameraSurfaceRenderer(Context context, CameraSurfaceView view)
{
rendererContext = context;
cameraSurfaceView = view;
// … // Load pVertex and pTexCoord vertex buffers
}
public void close()
{
// … // This code usually doesn’t have the chance to get called
}
@Override
public void onSurfaceCreated(GL10 unused, EGLConfig config)
{
// .. // Initialize a texture object for the bitmap data
surfaceTexPtr = new int[1];
surfaceTexture = new SurfaceTexture(surfaceTexPtr[0]);
surfaceTexture.setOnFrameAvailableListener(this);
//Initialize camera on its own thread so preview frame callbacks are processed in parallel
previewCallbackThread = new Thread()
{
@Override
public void run()
{
try {
camera = Camera.open();
} catch (RuntimeException e) {
// … // Notify the user with a Toast on the UI thread
}
assert camera != null;
//Callback set on CameraSurfaceRenderer class, but executed on worker thread
camera.setPreviewCallback(CameraSurfaceRenderer.this);
try {
camera.setPreviewTexture(surfaceTexture);
} catch (IOException e) {
Log.e(Const.TAG, "Unable to set preview texture");
}
Looper.prepare();
Looper.loop();
}
};
previewCallbackThread.start();
// … // More OpenGL initialization stuff
}
@Override
public void onDrawFrame(GL10 unused)
{
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
synchronized (this)
{
surfaceTexture.updateTexImage();
}
// Binds bitmap data to texture
bindBitmap(procBitmap);
// … // Acquire shader program attributes, render
GLES20.glFlush();
}
@Override
public synchronized void onFrameAvailable(SurfaceTexture surfaceTexture)
{
cameraSurfaceView.requestRender();
}
@Override
public void onPreviewFrame(byte[] data, Camera camera)
{
Bitmap bitmap = markerFinder.exchangeRawDataForProcessedImg(data, null, camera);
// … // Check for null bitmap
previewFrameLock.block();
procBitmap = bitmap;
previewFrameLock.close();
bitmapDrawLock.open();
}
void bindBitmap(Bitmap bitmap)
{
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, procBitmapPtr[0]);
bitmapDrawLock.block();
if (bitmap != null && !bitmap.isRecycled())
{
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
}
bitmapDrawLock.close();
previewFrameLock.open();
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height)
{
GLES20.glViewport(0, 0, width, height);
// … // Set camera parameters
camera.startPreview();
}
void deleteTexture()
{
GLES20.glDeleteTextures(1, surfaceTexPtr, 0);
}
}
CameraImgProc.java (abstract class)
public abstract class CameraImgProc
{
CameraImgProcThread thread = new CameraImgProcThread();
Handler handler;
ConditionVariable bufferSwapLock = new ConditionVariable(true);
Runnable processTask = new Runnable()
{
@Override
public void run()
{
imgProcBitmap = processImg(lastWidth, lastHeight, cameraDataBuffer, imgProcBitmap);
bufferSwapLock.open();
}
};
int lastWidth = 0;
int lastHeight = 0;
Mat cameraDataBuffer;
Bitmap imgProcBitmap;
public CameraImgProc()
{
thread.start();
handler = thread.getHandler();
}
protected abstract Bitmap allocateBitmapBuffer(int width, int height);
public final Bitmap exchangeRawDataForProcessedImg(byte[] data, Bitmap dirtyBuffer, Camera camera)
{
Camera.Parameters parameters = camera.getParameters();
Camera.Size size = parameters.getPreviewSize();
// Wait for worker thread to finish processing image
bufferSwapLock.block();
bufferSwapLock.close();
Bitmap freshBuffer = imgProcBitmap;
imgProcBitmap = dirtyBuffer;
// Reallocate buffers if size changes to avoid overflow
assert size != null;
if (lastWidth != size.width || lastHeight != size.height)
{
lastHeight = size.height;
lastWidth = size.width;
if (cameraDataBuffer != null) cameraDataBuffer.release();
//YUV format requires 1.5 times as much information in vertical direction
cameraDataBuffer = new Mat((lastHeight * 3) / 2, lastWidth, CvType.CV_8UC1);
imgProcBitmap = allocateBitmapBuffer(lastWidth, lastHeight);
// Buffers had to be resized, therefore no processed data to return
cameraDataBuffer.put(0, 0, data);
handler.post(processTask);
return null;
}
// If program did not pass a buffer
if (imgProcBitmap == null)
imgProcBitmap = allocateBitmapBuffer(lastWidth, lastHeight);
// Exchange data
cameraDataBuffer.put(0, 0, data);
// Give img processing task to worker thread
handler.post(processTask);
return freshBuffer;
}
protected abstract Bitmap processImg(int width, int height, Mat cameraData, Bitmap dirtyBuffer);
class CameraImgProcThread extends Thread
{
volatile Handler handler;
@Override
public void run()
{
Looper.prepare();
handler = new Handler();
Looper.loop();
}
Handler getHandler()
{
        while (handler == null)
        {
            try {
                // Busy-wait until the Looper thread has created its Handler
                Thread.sleep(5);
            } catch (InterruptedException e) {
                // Do nothing
            }
        }
return handler;
}
}
}
I want an application which is robust, no matter how long it takes for the CameraImgProc.processImg() function to finish. Unfortunately, the only possible solution when camera frames are being fed in at a fixed rate is to drop frames when the image processing hasn't finished yet, or else I'll quickly have a buffer overflow.
My questions are as follows:
Is there any way to slow down the Camera.PreviewCallback frequency on demand?
Is there an existing Android API for getting frames on demand from the camera?
Are there existing solutions to this problem which I can refer to?
onPreviewFrame() is called on the thread which acquires the camera
using Camera.open()
That's a common misunderstanding. The key word that is missing from this description is "event". To schedule the camera callbacks to a non-UI thread, you need an "event thread", a synonym for HandlerThread. Please see my explanation and sample elsewhere on SO. Using a plain thread to open the camera, as in your code, is not useless, because this call itself may take a few hundred milliseconds on some devices, but an event thread is much, much better.
Now let me address your questions: no, you cannot control the schedule of camera callbacks.
You can use setOneShotPreviewCallback() if you want to receive callbacks at 1 FPS or less. Your mileage may vary, and it depends on the device, but I would recommend using setPreviewCallbackWithBuffer() and simply returning from onPreviewFrame() if you want to check the camera more often. The performance hit from these empty callbacks is minor.
Note that even when you offload the callbacks to a background thread, they are blocking: if it takes 200 ms to process a preview frame, the camera will wait. Therefore, I usually send the byte[] to a worker thread and quickly release the callback thread. I don't recommend slowing down the flow of preview callbacks by processing them in blocking mode, because after you release the thread, the next callback will deliver a frame with an undefined timestamp. Maybe it will be a fresh one, or maybe it will be one buffered a while ago.
You can schedule the callback indirectly in later platform releases (>4.0). You can set up the buffers that the callback will use to deliver the data. Typically you set up two buffers: one to be written by the camera HAL while you read from the other one. No new frame will be delivered to you (by calling your onPreviewFrame()) until you return a buffer that the camera can write to. This also means that the camera will drop frames.
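The frame-dropping behavior of such a fixed buffer pool can be sketched in plain Java (the ArrayBlockingQueue stands in for the camera's internal list of free buffers; the frame count and buffer sizes are illustrative, not Android APIs):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BufferRecyclingDemo {
    public static void main(String[] args) throws InterruptedException {
        // Pool of two reusable buffers, like calling addCallbackBuffer() twice.
        BlockingQueue<byte[]> freeBuffers = new ArrayBlockingQueue<>(2);
        freeBuffers.put(new byte[16]);
        freeBuffers.put(new byte[16]);

        int delivered = 0, dropped = 0;
        for (int frame = 0; frame < 5; frame++) {
            byte[] buf = freeBuffers.poll(); // camera HAL takes a free buffer
            if (buf == null) {
                dropped++;                   // no free buffer: this frame is dropped
                continue;
            }
            delivered++;                     // onPreviewFrame() would be called here
            if (frame % 2 == 0) {
                freeBuffers.put(buf);        // consumer returns the buffer promptly
            }                                // odd frames: buffer kept (still processing)
        }
        System.out.println("delivered=" + delivered + " dropped=" + dropped);
    }
}
```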
In my android application, on a certain activity I need to create screenshots of views without actually displaying them. I have been successful in achieving this by inflating the views and saving them as bitmaps.
But in some scenarios the number of these bitmaps is large enough that it takes a lot of time to create them, and the UI on the phone becomes unresponsive. Is there any way I can do this whole process in the background? I have already tried implementing it in an AsyncTask, but that does not work because it's not allowed to inflate views in an AsyncTask.
Any suggestions are highly appreciated.
AsyncTask's doInBackground() method works on another thread; that's the reason you are not able to inflate the views there.
First, check whether you have one layout or many. If you have many, then try the approach below.
I have not tested this; it's just a sample for you.
public class Task extends AsyncTask<Void, Integer, Void>
{
private ArrayList<Integer> layoutIds;
private View currentView;
private LayoutInflater inflater;
private Object lock = new Object();
public Task(Context context) {
super();
inflater = LayoutInflater.from(context);
}
@Override
protected void onPreExecute() {
super.onPreExecute();
}
@Override
protected Void doInBackground(Void... params) {
Bitmap temp;
for (int i = 0; i < layoutIds.size(); i++) {
temp = Bitmap.createBitmap(100, 100, Config.ARGB_8888);
Canvas canvas = new Canvas(temp);
synchronized (lock) {
publishProgress(i);
try {
// Wait for the UI Thread to inflate the layout.
lock.wait();
}
catch (InterruptedException e) {
}
}
currentView.draw(canvas);
// Now save this bitmap
try {
FileOutputStream stream = new FileOutputStream(new File(Environment.getExternalStorageDirectory(), "File_" + i + ".png"));
temp.compress(Bitmap.CompressFormat.PNG, 100, stream);
stream.flush();
stream.close();
} catch (Exception e) {
}
finally
{
if(temp != null)
{
temp.recycle();
temp = null;
}
}
}
return null;
}
@Override
protected void onProgressUpdate(Integer... values) {
synchronized (lock) {
currentView = inflater.inflate(layoutIds.get(values[0]), null);
// Notify async thread that inflate is done.
lock.notifyAll();
}
}
}
EDITED
Here we have two threads: one is the AsyncTask (which runs on a thread pool) and the other is the UI thread.
With the synchronized block we make sure that only one thread at a time can hold the monitor of the object lock.
So if one thread is executing the block inside synchronized, no other thread that also has a synchronized block on that object can execute it; i.e., the other thread has to wait until the active thread either waits on the lock or completes its execution inside the synchronized block.
For more explanation, see this.
Here, we used the synchronized block to wait for the UI thread to complete.
So when it executes lock.wait(), the AsyncTask thread will wait until another thread calls notify on the same object. And when lock.notifyAll() is called, all the threads waiting on that object (here, the AsyncTask thread) are resumed.
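The same wait/notifyAll handshake can be reduced to a runnable plain-Java sketch (the "view inflation" is simulated with a String; note the while-loop guard, which also protects against spurious wakeups, something the bare wait() above does not do):

```java
public class HandshakeDemo {
    private static final Object lock = new Object();
    private static String inflatedView = null; // stands in for the inflated View

    public static void main(String[] args) throws InterruptedException {
        // "Background" thread: waits until the other thread has produced the result.
        Thread background = new Thread(() -> {
            synchronized (lock) {
                while (inflatedView == null) { // guard against spurious wakeups
                    try {
                        lock.wait(); // releases the monitor while waiting
                    } catch (InterruptedException e) {
                        return;
                    }
                }
                System.out.println("drawing " + inflatedView);
            }
        });
        background.start();

        Thread.sleep(50); // simulate the "UI" thread doing the inflate work
        synchronized (lock) {
            inflatedView = "layout_0"; // produce the result...
            lock.notifyAll();          // ...then wake the waiter
        }
        background.join();
    }
}
```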
An AsyncTask is divided into onPreExecute(), onProgressUpdate() and onPostExecute(), which all happen on the main UI thread, allowing you to inflate the view. Only doInBackground() actually happens on the other thread. If you can do all your calculations in that method and only inflate in onProgressUpdate() or onPostExecute(), it might help.
Also if you can give us your code it might help us to find a way to make it more CPU efficient.
Anyhow, Android will try to force close your app if the UI thread isn't responding for more than 5 seconds, and there isn't much to do about it (as far as I know).
ExecutorService exec = Executors.newFixedThreadPool(8);
List<Future<Object>> results = new ArrayList<Future<Object>>();

// submit tasks
for (int i = 0; i < 8; i++) {
    results.add(exec.submit(new ThreadTask()));
}
...
// stop the pool from accepting new tasks
exec.shutdown();

// wait for results
for (Future<Object> result : results) {
    Object obj = result.get();
}
class ThreadTask implements Callable<Object> {
    public Object call() {
        // execute download
        // Inside this method I need to pause the thread for several seconds
        ...
        return result;
    }
}
As shown above in the comment I need to pause the thread for several seconds. Hope you can help me with this.
Thanks for your time!
Just call Thread.sleep(timeInMillis) - that will pause the current thread.
So:
Thread.sleep(5000); // Sleep for 5 seconds
Obviously you shouldn't do this from a UI thread, or your whole UI will freeze...
Note that this simple approach won't allow the thread to be woken up other than by interrupting it. If you want to be able to wake it up early, you could use Object.wait() on a monitor which is accessible to whichever code needs to wake it up; that code could use Object.notify() to wake the waiting thread up. (Alternatively, use a higher-level abstraction such as Condition or Semaphore.)
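A sketch of that wait/notify approach in plain Java (the 5-second timeout and the AtomicBoolean guard against a missed notification are illustrative choices, not requirements):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class WakeableSleepDemo {
    public static void main(String[] args) throws InterruptedException {
        final Object monitor = new Object();
        final AtomicBoolean woken = new AtomicBoolean(false);
        long start = System.currentTimeMillis();

        Thread sleeper = new Thread(() -> {
            synchronized (monitor) {
                // Guard against a missed notification: only wait if not yet woken.
                if (!woken.get()) {
                    try {
                        monitor.wait(5000); // "sleep" up to 5 s, but wakeable early
                    } catch (InterruptedException ignored) { }
                }
            }
        });
        sleeper.start();

        Thread.sleep(100); // simulate the waker doing something else first
        synchronized (monitor) {
            woken.set(true);
            monitor.notify(); // wake the sleeper well before its 5 s timeout
        }
        sleeper.join();

        long elapsed = System.currentTimeMillis() - start;
        System.out.println(elapsed < 5000 ? "woken early" : "timed out");
    }
}
```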
You could implement a new thread which is not the UI thread. Something like this might do it for you:
class ThreadTask implements Callable<Object> {
    public Object call() {
        Thread createdToWait = new Thread() {
            public void run() {
                // ---some code
                try {
                    sleep(1000); // call this to pause the execution of this thread
                } catch (InterruptedException e) {
                }
                // ---code to be executed after the pause
            }
        };
        createdToWait.start();
        return result;
    }
}
I am new to Android programming and threads. I want to get a picture from a remote server and display it (that works so far ^^).
But the picture is from a camera, so I need a new one as soon as I have shown the one I downloaded before. That means that the thread should never stop grabbing pictures (as long as the activity exists).
Also, I want to establish just one connection to the server and then do HTTP GETs, so I need a "connection" parameter that the thread can use.
To get an idea - it should work something like this (but obviously it does not):
private class DownloadImageTask extends AsyncTask<URLConnection, Void, Bitmap> {
    /** The system calls this to perform work in a worker thread and
     * delivers it the parameters given to AsyncTask.execute() */
    private URLConnection connection = null;

    protected Bitmap doInBackground(URLConnection... connection) {
        this.connection = connection[0];
        return getImageFromServer(connection[0]);
    }

    protected void onPostExecute(Bitmap result) {
        pic.setImageBitmap(result);
        this.doInBackground(connection);
    }
}
It might be better to use a Thread here, since AsyncTask is for a task that ends at some point, and here you want it to run continuously. Something like the code below could work for you. Apart from that, you might be better off using a local Service.
protected volatile boolean keepRunning = true;

private Runnable r = new Runnable() {
    public void run() {
        // methods are a bit bogus, but it should give you an idea
        URLConnection c = createNewUrlConnection();
        while (keepRunning) {
            Bitmap result = getImageFromServer(c);
            // that probably needs to be wrapped in runOnUiThread()
            pic.setImageBitmap(result);
        }
        c.close();
    }
};
private Thread t = null;

onResume() {
    keepRunning = true;
    t = new Thread(r);
    t.start();
}

onPause() {
    keepRunning = false;
    t = null;
}
You should add some delay to it, but to fix this I think it should look like this:
private class DownloadImageTask extends AsyncTask<URLConnection, Void, Bitmap> {
    /** The system calls this to perform work in a worker thread and
     * delivers it the parameters given to AsyncTask.execute() */
    private URLConnection connection = null;

    protected Bitmap doInBackground(URLConnection... connection) {
        this.connection = connection[0];
        return getImageFromServer(connection[0]);
    }

    protected void onPostExecute(Bitmap result) {
        pic.setImageBitmap(result);
        this.execute("...");
    }
}
An AsyncTask can only be executed once:
"The task can be executed only once (an exception will be thrown if a second execution is attempted.)"
See the documentation on AsyncTask.
I suggest it is better if you use a service to download, or even a thread can be used, like this:
public void run() {
    while (true) {
        // get image...
    }
}