I know this is a common question, but the stack trace below shows something else is wrong. You can see that even though setDisplay(holder) is called inside surfaceCreated, it still throws IllegalArgumentException. This isn't a rare exception either: yesterday it happened ~125,000 times across ~3,000,000 clip views. I can also assure you that mCurrentPlayer is initialized correctly.
surfaceCreated:
@Override
public void surfaceCreated(SurfaceHolder holder) {
    mIsSurfaceCreated = true;
    mCurrentPlayer.setDisplay(holder);
}
surfaceDestroyed:
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    mIsSurfaceCreated = false;
    // Could be called after player was released in onDestroy.
    if (mCurrentPlayer != null) {
        mCurrentPlayer.setDisplay(null);
    }
}
Stacktrace:
java.lang.IllegalArgumentException: The surface has been released
at android.media.MediaPlayer._setVideoSurface(Native Method)
at android.media.MediaPlayer.setDisplay(MediaPlayer.java:660)
at com.xxx.xxx.view.VideoPlayerView.surfaceCreated(VideoPlayerView.java:464)
at android.view.SurfaceView.updateWindow(SurfaceView.java:543)
at android.view.SurfaceView.access$000(SurfaceView.java:81)
at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:169)
at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:590)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:1644)
at android.view.ViewRootImpl.handleMessage(ViewRootImpl.java:2505)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:4945)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:511)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
at dalvik.system.NativeStart.main(Native Method)
Any ideas on what else could be going wrong? Is SurfaceHolder potentially destroying the surface on a background thread and then waiting for the main thread (currently occupied by surfaceCreated) to finish its block before it can call surfaceDestroyed on the main thread (which I don't think even locks could fix)? Something else?
Update -- After drilling down a little further, I found out what causes "The surface has been released" to be thrown:
Which references android_view_Surface_getSurface, which can be found here:
This is where my lack of C++ knowledge hurts. It looks like it tries to lock the surface, and if it can't, the surface it returns will be null. Once it comes back null, IllegalArgumentException is thrown.
I've just fought a similar problem.
My investigation shows that there is a bug in SurfaceView which causes an invalid surface to be passed to the surfaceCreated callback method.
This is the commit in the Android repo that fixes it: link.
It seems that the fix was introduced into the Android sources in version 4.2. And, from my application's crash reports, I can see that the crash caused by an invalid surface occurred only on 4.0 and 4.1.
So I can assume that before 4.0 it was valid to pass an invalid surface to the MediaPlayer, and that a change in the logic of SurfaceView/MediaPlayer in 4.0 made this no longer valid. But the code in SurfaceView was not updated until 4.2 (in which this problem is fixed).
I have checked the Android git repo, and indeed the version tagged android-4.0.1_r1 does not include the fix while the version tagged android-4.2.1_r1 does.
So, to work around it on platforms that don't contain the fix, a manual check that the surface is valid before setting it on the MediaPlayer is needed, but only on 4.0 and later:
@Override
public void surfaceCreated(final SurfaceHolder holder) {
    final Surface surface = holder.getSurface();
    if (surface == null) return;

    // On pre-Ice Cream Sandwich (4.0) versions invalid surfaces seem to be
    // accepted (or at least do not cause a crash).
    final boolean invalidSurfaceAccepted =
            Build.VERSION.SDK_INT < Build.VERSION_CODES.ICE_CREAM_SANDWICH;
    final boolean invalidSurface = !surface.isValid();
    if (invalidSurface && !invalidSurfaceAccepted) return;

    _mediaPlayer.setDisplay(holder);
}
In this way, on older platforms an invalid surface will still be set on the media player and video will play back; on 4.0-4.1 invalid surfaces are discarded (and I think surfaceCreated will be called again with a valid surface); and on 4.2 and later surfaceCreated is simply never called with an invalid surface.
I had issues like this in the past using Android VideoViews/MediaPlayers. It turned out that the underlying SurfaceView was getting garbage collected. I solved it by adding an OnPreparedListener to the MediaPlayer and then holding an explicit reference to it in my class while I was using it. Maybe this helps.
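For illustration, here is a minimal sketch of that idea (not from the original answer): keep strong references to the player and the view in fields, and only start playback from the OnPreparedListener. The field and method names (mPlayer, mVideoView, startClip) are hypothetical, and the usual android.media/android.view imports are assumed.
// Hypothetical sketch: hold explicit references and start playback only when prepared.
private MediaPlayer mPlayer;      // explicit reference kept while the clip is in use
private SurfaceView mVideoView;   // keep the view strongly referenced as well

private void startClip(Context context, Uri uri, SurfaceHolder holder) {
    mPlayer = new MediaPlayer();
    mPlayer.setDisplay(holder);
    mPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start();  // only start once the player is actually prepared
        }
    });
    try {
        mPlayer.setDataSource(context, uri);
        mPlayer.prepareAsync();  // prepare asynchronously, off the UI thread
    } catch (IOException e) {
        mPlayer.release();
        mPlayer = null;
    }
}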
Related
I'm working on a video processing app. The app has one Activity that contains a Fragment. The Fragment in turn contains a VideoSurfaceView derived from GLSurfaceView, which I use to show users a preview of the video with an effect applied (using OpenGL). After previewing, users can start processing the video.
To process the video, I mainly apply the method described here.
Everything works fine on most devices, except the Oppo Mirror 3 (Android 4.4). On this device, every time I try to create a Surface using MediaCodec.createInputSurface(), it throws java.lang.IllegalStateException with code -38.
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/SoftAVCEncoder: internalSetParameter: StoreMetadataInBuffersParams.nPortIndex not zero!
E/OMXNodeInstance: OMX_SetParameter() failed for StoreMetaDataInBuffers: 0x80001001
E/ACodec: [OMX.google.h264.encoder] storeMetaDataInBuffers (output) failed w/ err -2147483648
E/OMXNodeInstance: createInputSurface requires COLOR_FormatSurface (AndroidOpaque) color format
E/ACodec: [OMX.google.h264.encoder] onCreateInputSurface returning error -38
E/VideoProcessing: java.lang.IllegalStateException
at android.media.MediaCodec.createInputSurface(Native Method)
at com.ltpquang.android.core.processing.codec.VideoEncoder.<init>(VideoEncoder.java:46)
at com.ltpquang.android.core.VideoProcessing.setupVideo(VideoProcessing.java:200)
at com.ltpquang.android.core.VideoProcessing.<init>(VideoProcessing.java:167)
at com.ltpquang.android.ui.activity.PreviewEditActivity.lambda$btNext$12(PreviewEditActivity.java:723)
at com.ltpquang.android.ui.activity.PreviewEditActivity.access$lambda$12(PreviewEditActivity.java)
at com.ltpquang.android.ui.activity.PreviewEditActivity$$Lambda$13.run(Unknown Source)
at java.lang.Thread.run(Thread.java:841)
Playing around a little bit, I observed that:
BEFORE creating and adding the VideoSurfaceView to the layout, I can create the MediaCodec encoder and obtain the input surface successfully. I can create as many as I want, provided I release the previous MediaCodec before creating a new one; otherwise I can obtain one and only one input surface, regardless of how many MediaCodecs I create.
AFTER creating and adding the VideoSurfaceView to the layout, there is no chance I can get the input surface from the MediaCodec; it always throws java.lang.IllegalStateException.
I've tried removing the VideoSurfaceView from the layout and setting it to null before creating the surface, but no luck.
I also tried the suggestions from here and here, but they didn't help.
From this, it seems that my device can only get the software codec, so I can't create the input surface.
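(For reference, and not part of the original question: one way to see whether any hardware AVC encoder on the device even claims support for surface input is to walk MediaCodecList, as in this hedged sketch. The log above suggests only the software encoder OMX.google.h264.encoder was selected.)
// Sketch: list "video/avc" encoders and report which support COLOR_FormatSurface,
// the format required by createInputSurface(). Uses the pre-API-21 MediaCodecList calls.
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (!type.equalsIgnoreCase("video/avc")) continue;
        MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
        boolean surfaceInput = false;
        for (int format : caps.colorFormats) {
            if (format == MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface) {
                surfaceInput = true;
                break;
            }
        }
        Log.d("CodecCheck", info.getName() + " surfaceInput=" + surfaceInput);
    }
}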
My questions are:
Why is that?
If the device's resources are limited, what can I do (release something, for example) to continue the process?
If it is related to the software codec, what should I do? How can I detect and release the resource?
Is this related to GL contexts? If yes, what should I do? Should I manage the contexts myself?
I'm working on a camera capture application with SurfaceView.
The application's overview:
It can detect faces in real time (on the camera view).
It can store the movie.
I'm using the android.media.MediaRecorder class for saving the movie.
myRecorder = new MediaRecorder();
myRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
myRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
myRecorder.setVideoFrameRate(30);
myRecorder.setVideoSize(320, 240);
// setOutputFile() should be called after setOutputFormat() and before prepare().
myRecorder.setOutputFile(Environment.getExternalStorageDirectory() + "/Movies/sample.3gp");
myRecorder.prepare();
myRecorder.start();
After the above steps, a RuntimeException occurs in the step below (camera is an android.hardware.Camera object).
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    //Log.d(TAG, "onPreviewFrame: ");
    int width = camera.getParameters().getPreviewSize().width;   // <-- RuntimeException here
    int height = camera.getParameters().getPreviewSize().height;
03-22 22:54:09.134 27875-27875/? E/AndroidRuntime: FATAL EXCEPTION: main
Process: wbcompany.co.jp.facedetector3, PID: 27875
java.lang.RuntimeException: getParameters failed (empty parameters)
at android.hardware.Camera.native_getParameters(Native Method)
at android.hardware.Camera.getParameters(Camera.java:2019)
at wbcompany.co.jp.facedetector3.CameraView.onPreviewFrame(CameraView.java:150)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1192)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:6189)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:866)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:756)
Before calling myRecorder.start(), this exception does not occur.
I have no idea how to solve this error.
Please give me a solution to this problem.
My runtime environment:
Android 7.0 / API level 24
This is a strange error message, but the cause is very real. When the camera is busy with MediaRecorder, it will not be accessible for other uses.
Generally speaking, access to camera parameters may be very inefficient on some devices. It is strongly recommended not to call camera.getParameters() for every preview frame. Create fields in your CameraView class (or in the activity that embeds it) and store the width and height there when you start the preview; they will not change unless you explicitly stop the camera and change its configuration.
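A minimal sketch of that caching, assuming a CameraView that starts the preview itself and implements Camera.PreviewCallback (field and method names here are illustrative, not from the original code):
// Illustrative sketch: read the preview size once, then reuse the cached values.
private int mPreviewWidth;
private int mPreviewHeight;

private void startPreview(Camera camera, SurfaceHolder holder) throws IOException {
    Camera.Size size = camera.getParameters().getPreviewSize();
    mPreviewWidth = size.width;    // cached once; valid until the configuration changes
    mPreviewHeight = size.height;
    camera.setPreviewDisplay(holder);
    camera.setPreviewCallback(this);
    camera.startPreview();
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // Use mPreviewWidth / mPreviewHeight here instead of calling getParameters().
}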
If I am not missing something, your onPreviewFrame() callback happens on the main (UI) thread. Good practice is to call Camera.open() on a separate HandlerThread, to keep preview processing from slowing down the UI.
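For illustration, a hedged sketch of that setup (names are illustrative); Camera delivers its callbacks to the event loop of the thread that called open():
// Illustrative sketch: open the camera on a HandlerThread so that
// onPreviewFrame() is delivered on that thread instead of the UI thread.
private HandlerThread mCameraThread;
private Handler mCameraHandler;
private Camera mCamera;

private void openCameraInBackground() {
    mCameraThread = new HandlerThread("CameraThread");
    mCameraThread.start();
    mCameraHandler = new Handler(mCameraThread.getLooper());
    mCameraHandler.post(new Runnable() {
        @Override
        public void run() {
            mCamera = Camera.open();  // callbacks now arrive on mCameraThread
        }
    });
}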
I'm having a problem with OpenSL ES on Android. I'm using OpenSL to play sound effects. Currently I'm creating a new player each time I play a sound. (I know this isn't terribly efficient, but it's "good enough" for the time being.)
After a while of playback, I start to get these errors:
E/libOpenSLES(25131): Too many objects
W/libOpenSLES(25131): Leaving Engine::CreateAudioPlayer (SL_RESULT_MEMORY_FAILURE)
I'm tracking my create/destroy pattern and I never go above 4 outstanding objects at any given time, well below the system limit of 32. Of course, this is assuming that the Destroy is properly working.
My only guess right now is that I'm doing something incorrectly when I clean up the player objects. One possible issue is that the Destroy is often called in the context of the player callback (basically destroying the player after it's finished playing), although I can't find any reference suggesting this is a problem. Are there any other cleanup steps I should be taking besides "Destroy"-ing the player object? Do the Interfaces need to be cleaned up somehow as well?
-- Added --
After more testing, it happens consistently after the 30th player is created (there is an engine and a mix too, so that brings the total to 32 objects). So I must not be destroying the object properly. Here's the code--I'd love to know what's going wrong:
SLuint32 playerState = 0;
SLresult result = (*pPlayerObject)->GetState(pPlayerObject, &playerState);
return_if_fail(result);

if (playerState == SL_OBJECT_STATE_REALIZED)
{
    (*pPlayerObject)->AbortAsyncOperation(pPlayerObject);
    (*pPlayerObject)->Destroy(pPlayerObject);
}
else
{
    __android_log_print(1, LOG_TAG, "Player object in unexpected state (%d)", playerState);
    return 1002;
}
The check if (playerState == SL_OBJECT_STATE_REALIZED) is not needed; just do it unconditionally.
AbortAsyncOperation is already called inside Destroy, so it is not needed either.
So just (*pPlayerObject)->Destroy(pPlayerObject); should be enough.
Edit:
I tested this and found the solution.
You cannot call Destroy() from the player callback. You should add the player to a "destroy" list and destroy it somewhere else, for example on the main thread.
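As an illustration of that deferral pattern (not from the original answer), here is a sketch at the Java layer, assuming a hypothetical JNI helper nativeDestroyPlayer(long) that wraps the (*player)->Destroy(player) call:
// Hypothetical sketch: queue finished players from the callback and destroy them
// later on the main thread, never inside the OpenSL callback itself.
private final Handler mMainHandler = new Handler(Looper.getMainLooper());

// Called (via JNI) from the OpenSL play-status callback.
void onPlayerFinished(final long playerHandle) {
    mMainHandler.post(new Runnable() {
        @Override
        public void run() {
            nativeDestroyPlayer(playerHandle);  // hypothetical JNI wrapper around Destroy()
        }
    });
}

private native void nativeDestroyPlayer(long playerHandle);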
In the sample code for RandomMusicPlayer, reset() is called right before release():
// stop and release the Media Player, if it's available
if (releaseMediaPlayer && mPlayer != null) {
    mPlayer.reset();
    mPlayer.release();
    mPlayer = null;
}
Is this really necessary? Shouldn't the release take care of everything that would possibly need reset?
Per the documentation, release() can be executed at any time. It is not necessary to call reset() first, nor is it necessary to set the player to null afterwards (GC should dispose of it in due time).
From the docs:
After release(), the object is no longer available.
That said, I've run into a few issues with MediaPlayer and its documentation. It's a very complex object to work with and tends to be a little buggy at times, especially after throwing an error (with no explanation of the error code anywhere to be found!).
Shouldn't the release take care of everything that would possibly need reset?
Well, the MediaPlayer can be quite tricky. You need to understand the states that a MediaPlayer can be in and the calls that are allowed in those different states. The state diagram and valid/invalid states are here - http://developer.android.com/reference/android/media/MediaPlayer.html#StateDiagram
The reason the code sample you provided calls reset() is just a defensive measure to uninitialize the MediaPlayer object and clean everything up properly. Strictly speaking, everything should be fine if you just call release(), but I'm not 100% sure about that.
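For what it's worth, a minimal defensive sketch of that pattern (assuming an mPlayer field as in the sample; the helper name is illustrative):
// Defensive release: reset() first to leave any error state, then release().
private void releasePlayer() {
    if (mPlayer != null) {
        try {
            mPlayer.reset();
        } catch (IllegalStateException ignored) {
            // reset() itself can fail if the player has already been released.
        }
        mPlayer.release();
        mPlayer = null;  // drop the reference; not strictly required, but harmless
    }
}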
I've been developing a live wallpaper using GLWallpaperService, and have gotten good results overall. It runs rock-solid in the emulator and looks good. I've dealt with OpenGL many times before so have a solid command of how to do things... unfortunately I'm having a hell of a time getting this to actually be stable on the actual hardware.
The basic symptom occurs when you slide the physical keyboard on a Motorola Droid in and out a few times. This causes the wallpaper to be destroyed/recreated several times in quick succession, which would be fine, as I have my assets clearing in onDestroy and reloading in onSurfaceChanged. The problem is that after a few iterations of this (four or five, maybe) the calls to onSurfaceChanged stop completely, and I get an endless string of this printed to the log:
04-02 00:53:18.088: WARN/SharedBufferStack(1032): waitForCondition(ReallocateCondition) timed out (identity=337, status=0). CPU may be pegged. trying again.
Is there something I should be implementing here aside from the Android-typical onSurfaceCreated/onSurfaceChanged/onSurfaceDestroyed triumvirate? Browsing through the WallpaperService and WallpaperRenderer classes doesn't turn up anything obvious to me.
I had a similar problem. The error was that I needed to call unlockCanvasAndPost:
Canvas c = null;
try {
    c = holder.lockCanvas(null);
    synchronized (holder) {
        instance.doDraw(c);
    }
} finally {
    // do this in a finally so that if an exception is thrown
    // during the above, we don't leave the Surface in an
    // inconsistent state
    if (c != null) {
        holder.unlockCanvasAndPost(c);
    }
}