Android TextureView with MediaPlayer does not update for every frame

I am using Android MediaPlayer to play a video file on a TextureView. I count the number of times the onSurfaceTextureUpdated callback is made; on some devices it is lower than the number of frames in the video, while on others it matches the total frame count.
Why is this happening? Isn't the SurfaceTexture supposed to update for every frame? Could it be a TextureView or a codec implementation issue?

This is a memory issue.
Try finishing other activities and releasing unneeded data before playing the video on the TextureView.

Related

Android - SurfaceTexture.updateTexImage and MediaPlayer

I play a video with mediaPlayer and modify it using surfaceTexture and OpenGL ES 2.0.
The documentation says that
surfaceTexture.updateTexImage();
will "update the texture image to the most recent frame from the image stream".
So if I call updateTexImage twice, the texture image will not necessarily be the 2nd frame of the video?
If this is the case, then I guess there is no way to control the speed of the video with MediaPlayer and OpenGL?
Yes, if you call updateTexImage twice, the result may not be the 2nd frame of the video.
There is no way to play the video faster (increase fps) than the input rate. However, by timing your calls to updateTexImage you can slow playback down (reduce fps) by skipping frames.
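The slow-down idea above can be sketched as a small pure-Java helper. This is only an illustration under assumed names (FrameSkipper and shouldRender() are hypothetical, not part of any Android API): in your OnFrameAvailableListener you would still call updateTexImage() for every arriving frame to latch the buffer, but only draw the frames the helper selects.

```java
// Hypothetical helper illustrating frame skipping: given a source frame
// rate and a lower target rate, decide which incoming frames to actually
// draw. Uses integer accumulation, so the selection is exact.
class FrameSkipper {
    private final int sourceFps;
    private final int targetFps;
    private int acc;

    FrameSkipper(int sourceFps, int targetFps) {
        this.sourceFps = sourceFps;
        this.targetFps = targetFps;
    }

    /** Returns true if the current incoming frame should be drawn. */
    boolean shouldRender() {
        acc += targetFps;
        if (acc >= sourceFps) {
            acc -= sourceFps;   // carry the remainder to the next frame
            return true;
        }
        return false;
    }
}
```

For example, with an assumed 30 fps source, FrameSkipper(30, 15) selects every other frame and FrameSkipper(30, 10) selects every third frame.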

Android - Encode a video while decoding it

I modify a video with glsl shaders, using a SurfaceTexture and OpenGL ES 2.0. I can also encode the result video with MediaCodec.
The problem is that the only way I've found to decode the video is with MediaPlayer and SurfaceTexture, but MediaPlayer doesn't have a frame-by-frame decoding option. So right now it's like live encoding/decoding; there is no pause.
I've also tried using seekTo / pause / start, but it never updates the texture.
So would it be possible to do step-by-step decoding instead, to follow the encoding process? I'm afraid my current method is not very accurate.
Thanks in advance!
Yes, instead of using MediaPlayer, you need to use MediaExtractor and MediaCodec to decode it (into the same SurfaceTexture that you're already using with MediaPlayer).
An example of this would be ExtractMpegFramesTest at http://bigflake.com/mediacodec/, possibly also DecodeEditEncodeTest (or for a >= Android 5.0 async version of it, see https://github.com/mstorsjo/android-decodeencodetest).
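As a rough sketch of what those examples do (assuming API 21+; videoPath and surfaceTexture come from your existing setup, and selectVideoTrack is a hypothetical helper that scans the extractor's tracks for a MIME type starting with "video/"); error handling, format changes, and IOExceptions are omitted, so see ExtractMpegFramesTest for a complete version:

```java
// Sketch only: skeleton of frame-by-frame decoding with MediaExtractor +
// MediaCodec, rendering into the SurfaceTexture already used for GL.
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(videoPath);
int trackIndex = selectVideoTrack(extractor);   // hypothetical helper
extractor.selectTrack(trackIndex);
MediaFormat format = extractor.getTrackFormat(trackIndex);

MediaCodec decoder =
        MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
decoder.configure(format, new Surface(surfaceTexture), null, 0);
decoder.start();

MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false, outputDone = false;
while (!outputDone) {
    if (!inputDone) {
        int inIndex = decoder.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            int size = extractor.readSampleData(decoder.getInputBuffer(inIndex), 0);
            if (size < 0) {
                decoder.queueInputBuffer(inIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                decoder.queueInputBuffer(inIndex, 0, size,
                        extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }
    }
    int outIndex = decoder.dequeueOutputBuffer(info, 10000);
    if (outIndex >= 0) {
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            outputDone = true;
        }
        // Releasing with 'true' sends the frame to the SurfaceTexture; you can
        // pause as long as you like between frames, which MediaPlayer can't do.
        decoder.releaseOutputBuffer(outIndex, info.size != 0);
    }
}
decoder.stop(); decoder.release(); extractor.release();
```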
EDIT: I was wrong; MediaPlayer's stream cannot be used frame by frame. It seems to only work at "real" speed.
I've actually managed to do it with MediaPlayer, following this answer:
stackoverflow - SurfaceTexture.OnFrameAvailableListener stops being called
Using counters, you can speed the video stream up or down and synchronize it with the preview or the encoding.
But if you want to do a real seek to a particular frame, then mstorsjo's solution is far better. In my case, I just wanted to make sure the encoding process was not running faster or slower than the video input stream.

Video is in low fps when playing on TextureView

I'm creating video filter for Android app, so I'm using TextureView to play video and filter on its SurfaceTexture.
But the FPS of the video is always lower than the original (30 fps).
On a Galaxy S3, onSurfaceTextureUpdated() is only entered 5~8 times per second, whether or not a filter is applied. On a stronger device, such as a Samsung Galaxy J, it increases to 10~13 times per second.
Note that the video is loaded from the SD card.
Does someone know the reason?
mVideoPlayer.reset();
mVideoPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mVideoPlayer.setDataSource(mVideoPath);
mVideoPlayer.setSurface(new Surface(surfaceTexture));
mVideoPlayer.setLooping(true);
// Register the listener before prepareAsync() so onPrepared() cannot be missed.
mVideoPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    public void onPrepared(MediaPlayer mp) { mp.start(); }
});
mVideoPlayer.prepareAsync();
If the video frame is being accessed from software, then the data has to be copied a couple of times, and that will kill your FPS.
If you send the video directly to the Surface associated with the TextureView, you won't see a slowdown, but you will also have no opportunity to filter it. (Grafika has video players based on SurfaceView and TextureView, with no noticeable difference in performance. SurfaceView is slightly more efficient, but not dramatically so.)
To get real-time filtering at 30fps you can use the GPU, as in this example, though there are limits to what you can do with that.
After checking more devices, I found that the main cause of this issue is memory.
So I finish the previous activity and release its data before starting this one.
The FPS of the video then increased to 10+ on the S3.
Thanks,

Camera Preview Frames interlaced with Media playback Error

I took the code from Grafika's Double Decoder example and changed it so that one TextureView outputs from MediaPlayer and the other TextureView outputs the camera preview. It would work fine, except that the TextureView with the MediaPlayer output gets frames of the camera preview interlaced into it. It's really funky behavior, and I am not doing anything dynamic to reconfigure the output viewport.
Both TextureViews and SurfaceTextures are separate and independently assigned to their respective outputs, the MediaPlayer output and the camera preview output:
mediaPlayer.setSurface(surface);
camera.setPreviewTexture(surfaceTexture);
Running the MediaPlayer code and the camera code on separate threads does not help. It seems like the SurfaceTexture frame buffers are somehow being shared, and frames are being dropped or interfering with each other. Could this be a performance issue, even though the original example was able to decode two movies in parallel?

Android MediaCodec API video plays too fast

I'm currently working with Android Jelly Bean MediaCodec API to develop a simple video player.
I extract the tracks and play audio and video in separate threads. The problem is that the video track always plays too fast.
Where could the problem be hidden?
Both audio and video are treated almost the same way, except that audio is played via AudioTrack and video is rendered to the surface.
If you render frames at maximum speed you'll hit 60fps on most devices. You need to pace it according to the presentation time stamps provided by the encoder.
For example, if the input is a format supported by Android (e.g. a typical .mp4 file), you can use the MediaExtractor class to extract each frame. The time stamp can be retrieved with getSampleTime(). You want to delay rendering by the difference between timestamps on consecutive frames -- don't assume that the first frame will have a timestamp of zero.
Also, don't assume that video frames appear at a constant rate (e.g. 30fps). For some sources the frames will arrive unevenly.
See the "Play video (TextureView)" example in Grafika, particularly the SpeedControlCallback class. The gen-eight-rects.mp4 video uses variable frame durations to exercise it. If you check the "Play at 60fps" box, the presentation time stamps are ignored.
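The pacing rule above (delay each frame by the difference between consecutive presentation time stamps, without assuming the first stamp is zero) can be sketched as a small pure-Java helper. PtsPacer and delayUs are hypothetical names; the timestamps would come from MediaExtractor.getSampleTime():

```java
// Hypothetical helper computing how long to wait before rendering each
// frame, based on presentation time stamps in microseconds.
class PtsPacer {
    private long prevPtsUs = -1;

    /** Returns the delay in microseconds before the frame with this PTS
     *  should be rendered: 0 for the first frame (no assumption that its
     *  PTS is zero), and clamped to 0 for out-of-order timestamps. */
    public long delayUs(long ptsUs) {
        long delay = (prevPtsUs < 0) ? 0 : Math.max(0, ptsUs - prevPtsUs);
        prevPtsUs = ptsUs;
        return delay;
    }
}
```

The caller would sleep for the returned interval (or schedule rendering that far in the future) before releasing the output buffer, so playback follows the stream's own timing, including variable frame durations.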
