I'm building a camera capture application with SurfaceView.
The application's overview:
It can detect faces in real time (on the camera view).
It can record video.
I'm using the android.media.MediaRecorder class for saving the video.
myRecorder = new MediaRecorder();
myRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Per the MediaRecorder docs, the output format must be set before the
// encoder settings and the output file, and everything before prepare().
myRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
myRecorder.setVideoFrameRate(30);
myRecorder.setVideoSize(320, 240);
myRecorder.setOutputFile(Environment.getExternalStorageDirectory() + "/Movies/sample.3gp");
myRecorder.prepare();
myRecorder.start();
After the above steps, a RuntimeException occurs at the step below (camera is an android.hardware.Camera object):
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    //Log.d(TAG, "onPreviewFrame: ");
    int width = camera.getParameters().getPreviewSize().width;   // <-- RuntimeException here
    int height = camera.getParameters().getPreviewSize().height;
03-22 22:54:09.134 27875-27875/? E/AndroidRuntime: FATAL EXCEPTION: main
Process: wbcompany.co.jp.facedetector3, PID: 27875
java.lang.RuntimeException: getParameters failed (empty parameters)
at android.hardware.Camera.native_getParameters(Native Method)
at android.hardware.Camera.getParameters(Camera.java:2019)
at wbcompany.co.jp.facedetector3.CameraView.onPreviewFrame(CameraView.java:150)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1192)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:6189)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:866)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:756)
Before calling myRecorder.start(), this exception does not occur.
I have no idea how to solve this error. Please suggest a solution.
My runtime environment:
Android 7.0 / API level 24
This is a strange error message, but the case is very real. While the camera is in use by MediaRecorder, it is not accessible for other purposes, including getParameters().
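For reference, the documented way to record from a camera your app has already opened is to unlock it and hand the same instance to MediaRecorder, instead of letting MediaRecorder open the camera on its own. A minimal sketch, assuming an mCamera field that holds your preview camera:

mCamera.unlock();                          // let MediaRecorder take over the hardware
myRecorder = new MediaRecorder();
myRecorder.setCamera(mCamera);             // share the already-opened camera
myRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// ... the rest of the recorder configuration as above ...
// After myRecorder.stop(), call mCamera.reconnect() to use the camera again.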
Generally speaking, access to camera parameters may be very inefficient on some devices. It is strongly recommended not to call camera.getParameters() for every preview frame. Create member fields in your CameraView class, or in the activity that embeds it, and store the width and height there when you start the preview; they will not change unless you explicitly stop the camera and change its configuration.
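A minimal sketch of that caching, assuming an mCamera field and a startPreview() helper (both are assumptions, not from the original post):

private int mPreviewWidth;
private int mPreviewHeight;

private void startPreview() {
    // Read the size once; it stays fixed until the camera is reconfigured.
    Camera.Size size = mCamera.getParameters().getPreviewSize();
    mPreviewWidth = size.width;
    mPreviewHeight = size.height;
    mCamera.startPreview();
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    int width = mPreviewWidth;    // no per-frame getParameters() call
    int height = mPreviewHeight;
    // ... face detection on data ...
}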
If I am not missing something, your onPreviewFrame() callback happens on the main (UI) thread. Good practice is to call Camera.open() on a separate HandlerThread, to keep preview processing from slowing down the UI.
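A minimal sketch of opening the camera on a HandlerThread (mCamera is again an assumed field):

HandlerThread cameraThread = new HandlerThread("CameraThread");
cameraThread.start();
Handler cameraHandler = new Handler(cameraThread.getLooper());
cameraHandler.post(new Runnable() {
    @Override
    public void run() {
        // Camera callbacks are dispatched to the looper of the thread that
        // called open(), so onPreviewFrame() will arrive on this thread.
        mCamera = Camera.open();
        // ... set parameters, setPreviewDisplay(), startPreview() ...
    }
});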
Related
I have a custom class that extends TextureView. This TextureView has a SurfaceTexture, mSurfaceTexture, which is a single-buffered surface.
As per the Android documentation, here:
In single buffered mode the application is responsible for serializing access to the image content buffer. Each time the image content is to be updated, the releaseTexImage() method must be called before the image content producer takes ownership of the buffer.
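For context, the handshake that quote describes looks roughly like this in single-buffered mode; texName and the current GL context are assumed to be set up elsewhere:

SurfaceTexture st = new SurfaceTexture(texName, true /* singleBufferMode */);
// ... the producer queues a frame; onFrameAvailable() fires ...
st.updateTexImage();   // the consumer latches the image content
// ... sample the texture while drawing ...
st.releaseTexImage();  // hand the buffer back before the producer writes again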
So I'm calling it whenever the image producer takes ownership. It works fine on Android 10; however, on Android 9 I get the following error:
11-05 19:05:53.960 23442 24509 E AndroidRuntime: java.lang.IllegalStateException: Unable to release texture contents (see logcat for details)
11-05 19:05:53.960 23442 24509 E AndroidRuntime: at android.graphics.SurfaceTexture.nativeReleaseTexImage(Native Method)
11-05 19:05:53.960 23442 24509 E AndroidRuntime: at android.graphics.SurfaceTexture.releaseTexImage(SurfaceTexture.java:252)
I'm calling ANativeWindow_fromSurface. After using it, should I do something else as well, like releasing the native window?
Any idea why this is happening? Has anyone else had a similar issue?
I have trained my model using ssd_mobilenet_v2_quantized_coco, which was also a long, painstaking process of digging. Once training succeeded, the model correctly detected images on my laptop, but on my phone the app crashes as soon as an object is detected. I used the TF Lite Android app available on GitHub. I did some debugging in Android Studio and get the following error log when an object is detected and the app crashes:
I/tensorflow: MultiBoxTracker: Processing 0 results from 314
I/tensorflow: DetectorActivity: Preparing image 506 for detection in bg thread.
I/tensorflow: DetectorActivity: Running detection on image 506
I/tensorflow: MultiBoxTracker: Processing 0 results from 506
I/tensorflow: DetectorActivity: Preparing image 676 for detection in bg thread.
I/tensorflow: DetectorActivity: Running detection on image 676
E/AndroidRuntime: FATAL EXCEPTION: inference
Process: org.tensorflow.lite.demo, PID: 3122
java.lang.ArrayIndexOutOfBoundsException: length=80; index=-2147483648
at java.util.Vector.elementData(Vector.java:734)
at java.util.Vector.get(Vector.java:750)
at org.tensorflow.demo.TFLiteObjectDetectionAPIModel.recognizeImage(TFLiteObjectDetectionAPIModel.java:213)
at org.tensorflow.demo.DetectorActivity$3.run(DetectorActivity.java:247)
at android.os.Handler.handleCallback(Handler.java:873)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:193)
at android.os.HandlerThread.run(HandlerThread.java:65)
My guess is that the labels in the .txt file are somehow being misread, because of this line:
at org.tensorflow.demo.TFLiteObjectDetectionAPIModel.recognizeImage(TFLiteObjectDetectionAPIModel.java:213)
and that line corresponds to the following code:
labels.get((int) outputClasses[0][i] + labelOffset)
However, I don't know what to change in labels.txt. Possibly I need to edit that txt as suggested here. Any other suggestions and explanations of possible causes are appreciated.
Update: I added ??? to labels.txt and compiled/ran, but I am still getting the same error as above.
P.S. I trained ssd_mobilenet_v2_coco (the model without quantization) as well, and it works without crashing in the app. My guess is that quantization converts label indices differently, perhaps resulting in the out-of-bounds error for labels.
Yes, it is because the label output sometimes contains garbage values. For a quick fix, guard the class index before it is used:

int classIndex = (int) outputClasses[0][i];
// The crash shows a negative index (Integer.MIN_VALUE), so check both
// bounds, not just the upper one.
if (classIndex < 0 || classIndex > 10) {
    outputClasses[0][i] = -1;
}

Here 10 is the number of classes the model was trained for; you can change it accordingly.
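As a side note, the specific index in the crash (-2147483648, i.e. Integer.MIN_VALUE) is exactly what Java produces when a float outside the int range is cast to int, which supports the garbage-output theory:

// Casting NaN to int yields 0; casting a float beyond the int range
// (e.g. negative infinity) yields Integer.MIN_VALUE.
float garbage = Float.NEGATIVE_INFINITY;
int index = (int) garbage;
System.out.println(index);   // prints -2147483648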
I have been working on a mobile application that analyzes frames looking for specific objects. The processing was too heavy, and I kept getting
05-08 17:44:24.909: I/Choreographer(31606): Skipped 114 frames! The application may be doing too much work on its main thread.
So I switched the image processing to threads; now it is much faster, but I am not able to recognize any objects. The data (the different frames) is not updating, and I don't know why. Here is what I'm doing in pseudocode (SurfaceHolder.Callback, Camera.PreviewCallback and camera.addCallbackBuffer(data) are implemented):
public void onPreviewFrame(byte[] data, Camera camera)
{
    ImageProcessor np = new ImageProcessor(data);
    np.start();
    results = np.getResults();
}
From the debugging I have done so far, I know that start() analyzes the whole frame, but the data is not updating; it stays stuck at the very first frame. This does not happen if I do it on the main thread, like this:
public void onPreviewFrame(byte[] data, Camera camera)
{
    ImageProcessor np = new ImageProcessor();
    np.process(data);
    results = np.getResults();
}
This works, but it forces me to skip many frames. The answer may be easy, but I could not find it online.
Forgive me if I am posting a very noob question.
Thanks in advance.
That's because in the single-threaded case, np.process() completes before you execute results = ..., but in the threaded case, results = ... runs immediately after starting the thread, before processing has finished, unless getResults() itself waits for the thread to complete.
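One way to restructure this, sketched below under the assumption that ImageProcessor keeps its process()/getResults() API: hand each frame to a single worker thread and publish the results only when processing has actually finished. The imports are java.util.concurrent.ExecutorService, java.util.concurrent.Executors and java.util.concurrent.atomic.AtomicReference; the result type is left as Object because the question doesn't show it.

// Fields in the class implementing Camera.PreviewCallback.
private final ExecutorService worker = Executors.newSingleThreadExecutor();
private final AtomicReference<Object> latestResults = new AtomicReference<>();

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // Copy the frame: the buffer is recycled via addCallbackBuffer() and
    // may be overwritten while the worker is still reading it.
    final byte[] frame = data.clone();
    camera.addCallbackBuffer(data);
    worker.execute(new Runnable() {
        @Override
        public void run() {
            ImageProcessor np = new ImageProcessor();
            np.process(frame);
            latestResults.set(np.getResults());  // publish only when done
        }
    });
}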
I am trying to encode a 30 frames per second video using MediaCodec through the Camera's PreviewCallback (onPreviewFrame). The video I encode always plays back too fast (this is not desired).
So I tried to check the number of frames coming into my camera's preview by setting up an int frameCount variable to count them. I am expecting 30 frames per second, because I set up my camera's preview at 30 fps (as shown below), but the result I get back is not the same.
I ran the onPreviewFrame callback for 10 seconds, and the frameCount I got back was only about 100 frames. This is bad, because I am expecting 300 frames. Are my camera parameters set up correctly? Is this a limitation of Android's camera preview callback? And if so, is there any other camera callback that can return the camera's image data (NV21, YUV, YV12) at 30 frames per second?
Thanks for reading and taking the time to help out. I would appreciate any comments and opinions.
Here is an example an encoded video using Camera's onPreviewFrame:
http://www.youtube.com/watch?v=I1Eg2bvrHLM&feature=youtu.be
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewFormat(ImageFormat.NV21);
parameters.setPictureSize(previewWidth,previewHeight);
parameters.setPreviewSize(previewWidth, previewHeight);
// parameters.setPreviewFpsRange(30000,30000);
parameters.setPreviewFrameRate(30);
mCamera.setParameters(parameters);
mCamera.setPreviewCallback(previewCallback);
mCamera.setPreviewDisplay(holder);
No, the Android camera does not guarantee a stable frame rate, especially at 30 FPS. For example, it may choose a longer exposure in low lighting conditions.
But there are some ways we, app developers, can make things worse.
First, by using setPreviewCallback() instead of setPreviewCallbackWithBuffer(). This may cause unnecessary pressure on the garbage collector; a buffer-based setup is sketched after this list.
Second, if onPreviewFrame() arrives on the main (UI) thread, any UI action will directly delay the arrival of camera frames. To keep onPreviewFrame() on a separate thread, you should open() the camera on a secondary Looper thread. Here I explained in detail how this can be achieved: Best use of HandlerThread over other similar classes.
Third, check that your per-frame processing takes well under the ~33 ms frame period of a 30 FPS stream, say less than 20 ms.
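A minimal sketch of the buffer-based callback mentioned in the first point, assuming an mCamera field:

Camera.Parameters params = mCamera.getParameters();
Camera.Size size = params.getPreviewSize();
// Buffer size: width * height * bits-per-pixel / 8 for the preview format.
int bufferSize = size.width * size.height
        * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;
// Preallocate a few buffers so frames reuse memory instead of allocating.
for (int i = 0; i < 3; i++) {
    mCamera.addCallbackBuffer(new byte[bufferSize]);
}
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // ... process data ...
        camera.addCallbackBuffer(data);  // hand the buffer back for reuse
    }
});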
I know this is a common question; however, this stack trace shows something else is wrong. You can see that even though setDisplay(holder) is called inside surfaceCreated, it still throws IllegalArgumentException. This isn't a rare exception either, happening ~125,000 times in ~3,000,000 clip views yesterday. I can assure you that mCurrentPlayer is initialized correctly as well.
surfaceCreated:
@Override
public void surfaceCreated(SurfaceHolder holder) {
mIsSurfaceCreated = true;
mCurrentPlayer.setDisplay(holder);
}
surfaceDestroy:
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
mIsSurfaceCreated = false;
// Could be called after player was released in onDestroy.
if (mCurrentPlayer != null) {
mCurrentPlayer.setDisplay(null);
}
}
Stacktrace:
java.lang.IllegalArgumentException: The surface has been released
at android.media.MediaPlayer._setVideoSurface(Native Method)
at android.media.MediaPlayer.setDisplay(MediaPlayer.java:660)
at com.xxx.xxx.view.VideoPlayerView.surfaceCreated(VideoPlayerView.java:464)
at android.view.SurfaceView.updateWindow(SurfaceView.java:543)
at android.view.SurfaceView.access$000(SurfaceView.java:81)
at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:169)
at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:590)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:1644)
at android.view.ViewRootImpl.handleMessage(ViewRootImpl.java:2505)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:4945)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:511)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
at dalvik.system.NativeStart.main(Native Method)
Any ideas on what else could be going wrong? Is SurfaceHolder potentially destroying the surface on a background thread and then waiting for the main thread (currently occupied by surfaceCreated) to finish its block before it can call surfaceDestroyed on the main thread (which I don't think even locks can fix)? Something else?
Update: after drilling down a little further, I found out what causes "The surface has been released" to be thrown:
which references android_view_Surface_getSurface, which can be found here:
This is where my lack of C++ knowledge hurts. It looks like it tries to lock the surface, and if it can't, the surface returned will be null. Once it is returned as null, IllegalArgumentException is thrown.
I've just fought a similar problem.
My investigation shows that there is a bug in SurfaceView which causes an invalid surface to be passed to the surfaceCreated callback method.
This is the commit in the Android repo that fixes it: link.
It seems the fix was introduced in the Android sources in version 4.2, and from my application's crashes I see that the crash caused by an invalid surface occurred only on 4.0 and 4.1.
So I can assume that before 4.0 it was valid to pass an invalid surface to the MediaPlayer; a change in the SurfaceView/MediaPlayer logic in 4.0 made this invalid, but the code in SurfaceView was not updated until 4.2 (which fixes this problem in SurfaceView).
I have checked the Android git repo, and indeed the version tagged android-4.0.1_r1 does not include the fix, while the version tagged android-4.2.1_r1 includes it.
So, to work around it on platforms that don't contain the fix, a manual check that the surface is valid before setting it on the MediaPlayer is needed, but only on platforms 4.0 and later:
@Override
public void surfaceCreated(final SurfaceHolder holder) {
    final Surface surface = holder.getSurface();
    if (surface == null) return;
    // On pre-Ice Cream Sandwich (4.0) versions, invalid surfaces seem to be
    // accepted (or at least do not cause a crash).
    final boolean invalidSurfaceAccepted =
            Build.VERSION.SDK_INT < Build.VERSION_CODES.ICE_CREAM_SANDWICH;
    final boolean invalidSurface = !surface.isValid();
    if (invalidSurface && !invalidSurfaceAccepted) return;
    _mediaPlayer.setDisplay(holder);
}
In this way, on older platforms an invalid surface will still be set on the media player and video will play back; on 4.0-4.1 it will throw invalid surfaces away (and I think surfaceCreated will be called again with a valid surface); and on 4.2 and later surfaceCreated will simply not be called with an invalid surface.
I had issues like this in the past using Android VideoViews/MediaPlayers. It turned out that the underlying SurfaceView was getting garbage collected. I solved it by adding an OnPreparedListener to the MediaPlayer and then holding an explicit reference to it in my class while I was using it. Maybe this helps.
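A minimal sketch of that pattern, reusing the mCurrentPlayer field from the question; the listener lives in a field rather than a local so an explicit reference survives while the player is in use:

// Held in a field, not a local, so the reference survives until release.
private final MediaPlayer.OnPreparedListener mOnPreparedListener =
        new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();  // safe to start once preparation completes
            }
        };

private void initPlayer() {
    mCurrentPlayer = new MediaPlayer();
    mCurrentPlayer.setOnPreparedListener(mOnPreparedListener);
}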