Playing video in SurfaceView - Android

My app plays a video inside a SurfaceView, and I stop the MediaPlayer in onPause(). However, when I return from another activity, the SurfaceView is all black and only starts showing frames again once I restart playback. Is there a way to preserve the last frame when playback stops? Thanks a lot.

I used two SurfaceViews: the first for playing videos and the second for displaying pictures.
When I return from pause and the MediaPlayer is not playing, I grab the current frame of the video with this code:
MediaMetadataRetriever mediaMDRetriever = new MediaMetadataRetriever();
try {
    mediaMDRetriever.setDataSource(mediaPath);
    // grab a representative frame (nearest sync frame by default)
    picBitmap = mediaMDRetriever.getFrameAtTime();
} catch (Exception e) {
    // picBitmap stays null if the frame can't be extracted
}
and draw it on the second SurfaceView (a sketch of the drawing step is given after the clearing code below).
You can clear the second SurfaceView when needed with the following code:
Canvas surfCanvas = surfHolder2.lockCanvas();
surfCanvas.drawColor(0, PorterDuff.Mode.CLEAR);
surfHolder2.unlockCanvasAndPost(surfCanvas);
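For completeness, here is a minimal sketch of the drawing step itself, assuming the same surfHolder2 and picBitmap names as above; the destination rect simply stretches the extracted frame to fill the surface:
if (picBitmap != null) {
    Canvas surfCanvas = surfHolder2.lockCanvas();
    if (surfCanvas != null) {
        // stretch the frame over the whole surface
        Rect dst = new Rect(0, 0, surfCanvas.getWidth(), surfCanvas.getHeight());
        surfCanvas.drawBitmap(picBitmap, null, dst, null);
        surfHolder2.unlockCanvasAndPost(surfCanvas);
    }
}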

Related

ExoPlayer - displaying a preview when calling seekTo multiple times

I'm using ExoPlayer for video playback, and I have a seekbar that lets the user move back or forward during playback.
What I want to achieve is for the user to be able to seek while also previewing the frame at the current seekbar position.
The problem is that the player displays a black frame until the user lets go of the seekbar. I think it's because seekTo is called multiple times and the player just doesn't have enough time to load the frame?
Any idea how to work around this?
This is how I call seekTo inside the seekbar listener:
time_range.setOnRangeSeekbarChangeListener { minValue, maxValue ->
    val mediaUri = Uri.parse(media?.dataUrl)
    player?.prepare(buildClipMediaSource(mediaUri, startTime.toInt(), endTime.toInt()))
    player?.seekTo((startTime * 1000f).toLong())
    player?.playWhenReady = true
}
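No answer is recorded here, but as a hedged sketch of what usually helps: prepare the clip once, outside the listener, and only call seekTo while the user scrubs, since re-preparing on every change tears the renderer down and shows black; relaxing seek accuracy lets ExoPlayer paint a nearby sync frame quickly. This assumes an ExoPlayer 2.x SimpleExoPlayer, reuses the question's buildClipMediaSource helper, and uses placeholder variable names (mediaUri, startTimeMs, endTimeMs, scrubPositionMs):
// done once, before scrubbing starts
player.setSeekParameters(SeekParameters.CLOSEST_SYNC); // faster, slightly less exact seeks
player.prepare(buildClipMediaSource(mediaUri, startTimeMs, endTimeMs));
player.setPlayWhenReady(false); // keep it paused so the previewed frame stays on screen

// in the seekbar listener: seek only, no prepare()
player.seekTo(scrubPositionMs);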

Continue recording video in background - SurfaceTexture, GLSurfaceView

My first experience with background video recording on Android was with JavaCV's FFmpegRecorder. It's easy to implement: create the camera instance in the activity, set the PreviewCallback listener in the service, send the bytes to FFmpegRecorder in onPreviewFrame, and of course don't destroy (disconnect) the camera in onPause or onStop.
But FFmpegRecorder isn't that good (CPU and memory usage).
So I found INDExOS m4m library (by Intel): https://github.com/INDExOS/media-for-mobile
It has CameraCapturerActivity.java - https://github.com/INDExOS/media-for-mobile/blob/master/samples/src/main/java/org/m4m/samples/CameraCapturerActivity.java
It really doesn't seem to use many resources.
I decided to try recording in background mode. I simply commented out its onPause method, where the stop-recording and stop-preview calls are made, but then it just doesn't record anything (it freezes on the last frame) until I return to the activity.
When I set a PreviewCallback listener on this class, onPreviewFrame keeps delivering bytes in the background just fine, so it seems the SurfaceTexture onFrameAvailable callbacks that deliver frames inside the m4m library stop once onPause is called from the activity.
The library has two onFrameAvailable listeners:
The first is in PreviewRender.java - https://github.com/INDExOS/media-for-mobile/blob/master/android/src/main/java/org/m4m/android/PreviewRender.java#L241; this class seems to contain everything related to displaying frames in the view, so it should not matter for recording video.
The second is in CameraSource.java - https://github.com/INDExOS/media-for-mobile/blob/master/android/src/main/java/org/m4m/android/CameraSource.java#L222;
this seems to be the main class that receives frames, and I guess it is the one used for video recording.
But those two classes still seem to be coupled in quite a few ways.
For example, if I comment out the createPreview method in CameraCapturerActivity.java:
private void createPreview() {
    surfaceView = new GLSurfaceView(getApplicationContext());
    surfaceView.setDebugFlags(GLSurfaceView.DEBUG_CHECK_GL_ERROR);
    ((RelativeLayout) findViewById(R.id.camera_layout)).addView(surfaceView, 0);
    preview = capture.createPreview(surfaceView, camera);
    preview.setFillMode(fillMode);
    if (getRequestedOrientation() == ActivityInfo.SCREEN_ORIENTATION_PORTRAIT) {
        capture.setOrientation(90);
    } else if (getRequestedOrientation() == ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE) {
        capture.setOrientation(0);
    }
    preview.start();
}
the app runs fine (of course I don't see any frames), but recording doesn't work when I press the record button; so recording fails whenever the preview was never created.
So I need help understanding how all of this works: how can I pause the preview but keep recording in the background when I leave the activity, and then resume the preview when I return to it? I haven't worked with SurfaceTexture or GLSurfaceView before, only with an ordinary SurfaceView and its holder callbacks (surfaceCreated, surfaceChanged, ...).
I just don't see anything in the project similar to surfaceDestroyed that would stop recording when the user leaves the activity.
I see the OpenGL API and textures are also used in the Grafika project: https://github.com/google/grafika
So I believe there are people who have worked with something like this and know how SurfaceTexture and its callbacks (onFrameAvailable, ...) work.
Of course many things come down to the m4m library's own design, but it is still hard to understand when you haven't worked with any of this (OpenGL, SurfaceTexture, ...).
UPDATE
Now I know a little about EGLContext: it has to be made current for a specific surface (the preview surface or the recording encoder's input surface).
I managed to get the Grafika continuous-capture example recording in the background: https://github.com/google/grafika/blob/master/src/com/android/grafika/ContinuousCaptureActivity.java
In that sample class I commented out everything in onPause, removed mDisplaySurface, and made a few other changes.
onFrameAvailable now looks like this:
@Override   // SurfaceTexture.OnFrameAvailableListener; runs on arbitrary thread
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.d(TAG, "frame available");
    if (mEglCore == null) {
        return;
    }
    mEncoderSurface.makeCurrent();
    mCameraTexture.updateTexImage();
    mCameraTexture.getTransformMatrix(mTmpMatrix);
    if (!mFileSaveInProgress) {
        GLES20.glViewport(0, 0, VIDEO_WIDTH, VIDEO_HEIGHT);
        mFullFrameBlit.drawFrame(mTextureId, mTmpMatrix);
        drawExtra(mFrameNum, VIDEO_WIDTH, VIDEO_HEIGHT);
        mCircEncoder.frameAvailableSoon();
        mEncoderSurface.setPresentationTime(mCameraTexture.getTimestamp());
        mEncoderSurface.swapBuffers();
    }
    mFrameNum++;
    //mHandler.sendEmptyMessage(MainHandler.MSG_FRAME_AVAILABLE);
}
So now when I press the home button, it still records frames to a file that I can watch later.
Next I need to get back to the m4m library, because it also records audio and has utilities for frame processing.
In Grafika, all videos are recorded to internal storage.
Try changing the video's output path so you can open the file in another player, something like this:
File outputFile = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS);
boolean isPresent = true;
if (!outputFile.exists()) {
    isPresent = outputFile.mkdir();
}
if (isPresent) {
    outputFile = new File(outputFile.getAbsolutePath(), "camera-test.mp4");
} else {
    // Failure
}
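Then point the sample's output at that file. In Grafika's ContinuousCaptureActivity the path is hard-coded in onCreate (assuming the field is still called mOutputFile), so the change is roughly:
// was: mOutputFile = new File(getFilesDir(), "continuous-capture.mp4");
mOutputFile = outputFile;
// note: writing to the public Downloads directory also needs the WRITE_EXTERNAL_STORAGE permission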
You can change the CameraCaptureActivity (Show + camera capture) example to do what you want. Comment out the onPause method and change openCamera like this:
private void openCamera(int desiredWidth, int desiredHeight) {
    if (mCamera != null) {
        return;
    }

Android MediaRecorder saving empty file and weird camera behaviours

I am having a headache with Camera API 1 on Android. After reading everything I could find, I made a sample app that works OK. It creates a service which then operates the camera in the background, so there is no preview and no visible activity. To achieve this I use a dummy SurfaceHolder, like this:
protected class MySurfaceHolder implements SurfaceHolder {
    private final Surface surface;
    private final SurfaceTexture surfaceTexture;

    public MySurfaceHolder() {
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        if (textures.length > 0) {
            this.surfaceTexture = new SurfaceTexture(textures[0]);
            this.surface = new Surface(this.surfaceTexture);
        } else {
            this.surface = null;
            this.surfaceTexture = null;
        }
    }
    [...]
}
and then I use it like this
// simplified version of my code
try {
    initializeCamera();                             // open camera and set Camera.Parameters
    camera.setPreviewDisplay(new MySurfaceHolder());
    camera.startPreview();
    camera.unlock();
    initializeMediaRecorder();                      // create MediaRecorder, set video/audio parameters
    mediaRecorder.prepare();
    mediaRecorder.start();
    // wait until recording finishes, then exit
} finally {
    stopRecording();
}
The Camera and MediaRecorder initialization methods are exactly as the documentation says they should be (and they work).
Almost everything operates as it should. Sometimes, however, under unknown circumstances, MediaRecorder creates nearly empty files: around 32 kB containing only headers and metadata about the video, with no frames. The longer I record like this, the bigger the file gets (a few kB every few seconds); after 1 minute the file weighs about 80 kB. The funny thing is that I know the camera is working and capturing frames (I debugged it a little by showing preview frames), but the frames are not written to the output file.
Also, when this happens I am not able to record in FHD (1920x1080): I get a "start failed" message, and at that point the camera is not capturing frames. The same thing can happen when I use a wrong (unsupported) video size. I suppose in that case the error is thrown at the mediaRecorder.start(); line and stopRecording(); is invoked, but I am not sure.
After some time, or after some unknown action, the problem suddenly disappears (I don't know when or how). It definitely happens on Android 5.1, but may happen on other versions as well.
Could this bug be related to my custom surface code?
What could cause the MediaRecorder to not write frames into a file?
Why am I not able to record in FHD while at the same time being able to record in HD (1280x720)?
Is there any alternative to MediaRecorder that would let me avoid these bugs?
Could this happen when another app tries to grab the Camera object, thus disrupting the current recording? If so, how do I regain access to the Camera object? (Apparently I am not able to do this right now on some devices.)
EDIT:
I think I might have a clue. I am calling
camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
    // ... get the current frame
});
camera.startPreview();
to get a preview frame of the current recording. It appears that the bug occurs, at random times, when I use this method to grab a preview frame. It seems flawed, because not all devices react to it properly (sometimes there is simply no preview frame). Is there any other, better method of getting the current preview frame without a real surface?
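One alternative worth trying, sketched purely as an assumption rather than a verified fix: register a continuous, buffer-backed preview callback before camera.unlock() instead of re-arming setOneShotPreviewCallback, so grabbing a frame never has to touch startPreview() while MediaRecorder owns the camera (whether preview callbacks keep firing during MediaRecorder recording is still device-dependent):
// set up once, before camera.unlock() / MediaRecorder
Camera.Size previewSize = camera.getParameters().getPreviewSize();
int bufferSize = previewSize.width * previewSize.height
        * ImageFormat.getBitsPerPixel(ImageFormat.NV21) / 8; // NV21 is the default preview format
camera.addCallbackBuffer(new byte[bufferSize]);
camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // ... use the latest frame, then return the buffer for reuse
        cam.addCallbackBuffer(data);
    }
});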

How can I play a video on a surface again after drawing black on it?

I have a video player that gets reset and loaded with a new video within a single activity. One of the problems with this approach was that the last frame of the previous video was still shown when the next video was loaded. Several people have the same issue.
I solved this by doing the following in the onCompletion listener, similar to what is explained in this question and this other one:
mPlayer.stop();
mPlayer.release();
Canvas canvas = mPlayView.getHolder().lockCanvas();
canvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
mPlayView.getHolder().unlockCanvasAndPost(canvas);
mPlayer = null;
Then later, I call this again:
mPlayer = new MediaPlayer();
mPlayer.setDataSource(videoPath);
mPlayer.setDisplay(mSurfaceHolder);
mPlayer.setScreenOnWhilePlaying(true);
mPlayer.setOnPreparedListener(this);
mPlayer.setOnCompletionListener(this);
mPlayer.setOnVideoSizeChangedListener(this);
mPlayer.setOnErrorListener(this);
mPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mPlayer.prepare();
// wait for onPrepared callback and play
The problem is that the video play surface is still black: no video is shown anymore.
How do I "reset" the canvas and make it show the video again?
Or if that does not work, how do I clear the canvas of the last frame instead? I tried this but the next loaded video still shows the last frame of the previous one.
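A likely explanation, hedged since it is not verified against this exact code: once lockCanvas() has been used on a SurfaceHolder, the surface stays connected to the software renderer, and MediaPlayer cannot attach to it again until the surface is destroyed and recreated. One workaround that avoids touching the video surface at all is to stack a plain black View over the SurfaceView and toggle it instead of drawing:
// blackCover is an ordinary View (android:background="#000") laid out on top of the SurfaceView
blackCover.setVisibility(View.VISIBLE); // hide the stale last frame between videos
// ... prepare the next MediaPlayer as before, then in onPrepared():
blackCover.setVisibility(View.GONE);    // reveal the surface once the new video starts
mPlayer.start();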

VideoView and step play (frame by frame)

I would like to step through a video frame by frame with a VideoView.
I have this:
mVideoView.seekTo(mVideoView.getCurrentPosition()+1);
But after this I do not see the new frame until I press play... I do not want that; I just want to see the next frame.
Also, can I do the same for the previous frame?
Thanks in advance.
It will not seek while paused. But you can always pause right after seeking, provided you use a SurfaceView with a MediaPlayer, because VideoView does not expose the onSeekComplete callback.
public void seek() {
    mediaplayer.start();
    mediaplayer.seekTo(mediaplayer.getCurrentPosition() + 1);
}

public void onSeekComplete(MediaPlayer mp) {
    mediaplayer.pause();
}
This will seek and display something. Unfortunately, you will find that it still does not let you step through the video frame by frame.
You can use MediaMetadataRetriever to select the frame you want and then get it as a Bitmap using MediaMetadataRetriever.getFrameAtTime(long timeUs). It works much better than using a MediaPlayer for this.
I hope it is useful.
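A minimal sketch of that MediaMetadataRetriever approach, assuming a known or estimated frame interval and an ImageView to show the extracted frames (videoPath, imageView and the 30 fps figure are placeholders); OPTION_CLOSEST asks for the exact frame rather than the nearest sync frame, which is slower but is what frame stepping needs:
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(videoPath);

long frameIntervalUs = 1_000_000L / 30;  // assume ~30 fps; use the real frame rate if you know it
long positionUs = 0;

// step one frame forward and display it
positionUs += frameIntervalUs;
Bitmap frame = retriever.getFrameAtTime(positionUs, MediaMetadataRetriever.OPTION_CLOSEST);
if (frame != null) {
    imageView.setImageBitmap(frame);
}

// stepping backwards is the same with positionUs -= frameIntervalUs
retriever.release(); // when finished stepping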
