I'm creating a video filter for an Android app, so I'm using a TextureView to play video and apply a filter on its SurfaceTexture.
But the video's FPS is always lower than the original (30fps).
As I checked on a Galaxy S3, onSurfaceTextureUpdated() is entered only 5~8 times per second, whether or not a filter is applied. On a stronger device, such as the Samsung Galaxy J, it increases to 10~13 times per second.
Note that the video is loaded from the SD card.
Does anyone know the reason?
mVideoPlayer.reset();
mVideoPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mVideoPlayer.setDataSource(mVideoPath);
mVideoPlayer.setSurface(new Surface(surfaceTexture));
mVideoPlayer.setLooping(true);
// Set the listener before prepareAsync() so onPrepared() can't be missed.
mVideoPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
});
mVideoPlayer.prepareAsync();
If the video frames are being accessed from software, the data has to be copied a couple of times, and that will kill your FPS.
If you send the video directly to the Surface associated with the TextureView, you won't see a slowdown, but you will also have no opportunity to filter it. (Grafika has video players based on SurfaceView and TextureView, with no noticeable difference in performance. SurfaceView is slightly more efficient, but not dramatically so.)
To get real-time filtering at 30fps you can use the GPU, as in this example, though there are limits to what you can do with that.
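For reference, here is roughly what the GPU route looks like. This is a minimal sketch, not the Grafika code itself: it assumes a standard GLES 2.0 pipeline that receives each video frame from the SurfaceTexture as an external OES texture, and it applies a grayscale filter in the fragment shader (the class and attribute names are illustrative).

// Illustrative GLES 2.0 shader pair for filtering video frames.
// MediaPlayer decodes into the SurfaceTexture, which is sampled as an
// external OES texture; the per-pixel filter lives in the fragment shader.
public final class FilterShaders {
    public static final String VERTEX_SHADER =
            "attribute vec4 aPosition;\n"
            + "attribute vec2 aTexCoord;\n"
            + "varying vec2 vTexCoord;\n"
            + "void main() {\n"
            + "    gl_Position = aPosition;\n"
            + "    vTexCoord = aTexCoord;\n"
            + "}\n";

    public static final String FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n"
            + "precision mediump float;\n"
            + "uniform samplerExternalOES uTexture;\n" // the decoded video frame
            + "varying vec2 vTexCoord;\n"
            + "void main() {\n"
            + "    vec4 color = texture2D(uTexture, vTexCoord);\n"
            + "    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));\n"
            + "    gl_FragColor = vec4(vec3(gray), color.a);\n"
            + "}\n";
}

The important point is that the frames never leave the GPU: no pixel data is copied to or from software, which is what keeps 30fps within reach.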
After checking with more devices, I found that the main cause of this issue is memory.
So I now finish the previous activity and release its data before going to this activity.
The video's FPS then increases to 10+ on the S3.
Thanks,
I have a simple Android media player that can play multiple videos simultaneously on a single screen. Basically, the player screen is divided into 4 parts, with 4 MediaPlayer instances glued together, and each part plays a given video.
It works almost OK when my video files are stored locally on the device; there are synchronization problems, but minor ones. But when I pass in a URL for HTTP streaming, the synchronization problems become significant. What is the problem, and generally, how can I remove these synchronization problems?
The only thing I could do was first instantiate the MediaPlayers and prepare() them, then call start() one after the other, so that at least the start times would be close to each other. It doesn't have much effect, though.
Here is the method that returns each of the MediaPlayer instances:
MediaPlayer mediaPreparation(String filename, boolean setMute) {
    String url = "myURL"; // your URL here

    // create MediaPlayer instance
    MediaPlayer mediaPlayer = new MediaPlayer();
    if (setMute) {
        mediaPlayer.setVolume(0, 0);
    }
    try {
        mediaPlayer.setDataSource(url);
        mediaPlayer.prepare();
    } catch (IOException e) {
        e.printStackTrace(); // don't swallow the error silently
    }
    mediaPlayer.setLooping(true);
    // mediaPlayer.start();
    return mediaPlayer;
}
And then I start them one by one:
mp[0].start();
mp[1].start();
mp[2].start();
mp[3].start();
In streaming cases there is always a risk of data not being continuously available, so players buffer quite a few frames before starting playback, and multiple streams may take different amounts of time to buffer a sufficient quantity. One approach you can try is MediaCodec; see https://developer.android.com/reference/android/media/MediaCodec.html.
In particular, go through releaseOutputBuffer() and its variants. They give you more control over rendering (you can even alter the timestamp if required, though I wouldn't advise it, as playback won't be smooth). You can keep track of whether all 4 instances have decoded a frame with a particular timestamp, and then render them all at once.
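Here is a rough sketch of that gating idea. It assumes each of the 4 MediaCodec instances is drained on its own thread; the class and method names are illustrative, not part of the MediaCodec API.

import android.media.MediaCodec;
import java.util.concurrent.BrokenBarrierException;
import java.util.concurrent.CyclicBarrier;

public class SyncedRenderer {
    // One barrier shared by the 4 decoder threads.
    private final CyclicBarrier frameBarrier = new CyclicBarrier(4);

    // Call this from each decoder's drain loop once dequeueOutputBuffer()
    // has returned a valid output buffer index for that codec.
    public void renderWhenAllReady(MediaCodec codec, int outputIndex) {
        try {
            frameBarrier.await(); // block until all 4 decoders have a frame ready
        } catch (InterruptedException | BrokenBarrierException e) {
            Thread.currentThread().interrupt();
            return;
        }
        // true = render this buffer to the codec's output Surface now.
        codec.releaseOutputBuffer(outputIndex, true);
    }
}

Note that this only aligns frame release, not buffering: you would still want all 4 codecs fed and primed before the first await().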
I'm not sure if any Android media player offers this functionality.
I suspect there may be device dependencies too, as different devices have different HW capabilities for decoding and playing multiple videos, and if some of your videos have to fall back to SW decoding they will be much slower.
It may not meet your needs, but a common way to provide a grid of videos like this on an end device is to merge the videos together on the server side and deliver the result to the device as a single video stream.
Update
One other thing to be aware of if you use MediaCodec and leverage the HW codecs: videos with different video profiles can also have different decoding latencies.
This comes down to how the videos are encoded. In simple terms, if a particular frame refers to information from a frame that comes after it (a common compression approach), then the decoder needs to buffer that frame until it has the referred-to frame as well. Simpler compression approaches, such as the Baseline profile, do not use this technique, so they don't have to buffer and hence may have lower latency. This also appears to differ between HW vendors; see this note from Intel, in particular the low-latency section at the end:
https://software.intel.com/en-us/android/articles/android-hardware-codec-mediacodec
I suspect the best approach to this particular aspect is to aim for the lowest common denominator: either use only the Baseline profile, or delay all video display by some factor longer than the maximum latency you can expect from any individual video.
I am writing a video player for Android. So far I have been able to capture frames with the help of av_read_frame and avcodec_decode_video2 and update them via SDL 2.0. I have followed dranger's tutorial02.c: http://dranger.com/ffmpeg/ .
The pseudocode is:
while (1)
{
    1. Read a packet.
    2. Check whether it is a video frame; if not, go to step 3.
       2.1 If it is a video frame, update with SDL_UpdateYUVTexture.
    3. Handle SDL events.
    4. Clear the renderer.
    5. Present the renderer.
}
I wonder, do I need to take care of video synchronization (dts/pts calculation) when I only need to display the video?
This scenario works well on the Samsung device, but not on other phones.
What would be your advice?
It depends. If you're OK with the fact that your video will a) play as fast as the device can decode it and b) play at a different speed on different devices, and even on the same device depending on other processes, then you don't need to synchronize and can just dump the frames as soon as they're decoded.
Otherwise you still need to synchronize the video output to the PTS. Since you don't have audio, and thus won't have an audio clock, your only option is to synchronize the video to the system clock, which actually makes things simpler.
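Since the rest of this page uses Java, here is the clock-sync idea sketched in Java rather than C; the same arithmetic drops straight into the dranger-style loop. It assumes ptsMicros is the frame's PTS already converted to microseconds using the stream's time_base.

// Minimal sketch of syncing frame presentation to the system clock.
public class VideoClock {
    private long startRealtimeUs = -1;
    private long firstPtsUs;

    // Returns how long to wait (in microseconds) before showing a frame
    // with this PTS; late frames get 0.
    public long delayUntilUs(long ptsMicros) {
        long nowUs = System.nanoTime() / 1000;
        if (startRealtimeUs < 0) { // first frame starts the clock
            startRealtimeUs = nowUs;
            firstPtsUs = ptsMicros;
            return 0;
        }
        long dueUs = startRealtimeUs + (ptsMicros - firstPtsUs);
        return Math.max(0, dueUs - nowUs);
    }
}

Sleep for the returned delay before presenting each frame; frames that repeatedly come back with 0 are running late and are candidates for dropping.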
I am working on an application that does some real-time image processing on camera frames. For that, I use the preview callback's onPreviewFrame method. This works fine for cameras whose preview frames have a resolution of at least 640x480. But when the camera does not support such a large preview resolution, the application is programmed to refuse to process those frames. Now, the problem I have is with phones like the Sony Xperia Go. It is a very nice device that can record video at up to 1280x720, but unfortunately its maximum camera preview size is 480x320, which is too small for my needs.
What I would like to know is how to obtain these larger camera frames (up to 1280x720 or more). It has to be possible, because the camera application is able to record videos at that resolution, so it must somehow be able to access those larger frames. How can I do the same from my application?
The application has to support Android 2.1 and later, but I would be very happy even with a solution that only works on Android 4.0 or newer.
This question is similar to http://stackoverflow.com/questions/8839109/processing-android-video-frame-by-frame-while-recording, but I don't need to save the video; I only need those high-resolution video frames...
It seems the only thing you can do is decode frames from the MediaRecorder data.
You could use ffmpeg to decode the recorder data from a LocalSocket.
The following open-source projects may help:
ipcamera-for-android: https://code.google.com/p/ipcamera-for-android/
spydroid-ipcamera: https://code.google.com/p/spydroid-ipcamera/
You should probably take a look at the OpenCV library.
It has methods that allow you to receive full frames.
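For example, here is a minimal sketch of the OpenCV4Android per-frame callback; note that whether it actually delivers frames larger than the Camera preview sizes still depends on the device.

import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.CameraBridgeViewBase.CvCameraViewFrame;
import org.opencv.core.Mat;

// Whatever Mat you return from onCameraFrame is what gets drawn to the preview.
public class FrameListener implements CameraBridgeViewBase.CvCameraViewListener2 {
    @Override public void onCameraViewStarted(int width, int height) { }
    @Override public void onCameraViewStopped() { }

    @Override
    public Mat onCameraFrame(CvCameraViewFrame inputFrame) {
        Mat rgba = inputFrame.rgba(); // the frame as an RGBA Mat
        // ... process rgba in place here ...
        return rgba;
    }
}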
My impression is that the video preview size is small, and the preview is slow, slower than the configured video recording frame rate.
I once looked for solutions to this. It seems a better way is to get the video stream from the video recorder and then process the data from that stream directly.
You can find some examples in Android IP-camera projects.
You can use this library:
https://github.com/natario1/CameraView
It has an addFrameProcessor listener whose process function receives a Frame parameter.
If you need to record video while processing frames, use CameraView's takeVideoSnapshot function; takeVideo stops frame processing until the video recording completes, at least in the latest version I tested (2.6.4).
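Here is a rough usage sketch based on the 2.x package layout; check the library's docs for your exact version, and note that R.id.camera is just an assumed layout id.

import androidx.annotation.NonNull;
import com.otaliastudios.cameraview.CameraView;
import com.otaliastudios.cameraview.frame.Frame;
import com.otaliastudios.cameraview.frame.FrameProcessor;

// Inside your Activity, after setContentView():
CameraView cameraView = findViewById(R.id.camera);
cameraView.addFrameProcessor(new FrameProcessor() {
    @Override
    public void process(@NonNull Frame frame) {
        // Frame data is only valid inside this call.
        byte[] data = frame.getData();
        // ... run your image processing on data here ...
    }
});
// To record while still receiving frames, call takeVideoSnapshot(file)
// instead of takeVideo(file), as noted above.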
I implemented a video recorder in my code and it runs perfectly on almost all devices, except the HTC One X. There the video recording gets stuck (the first image doesn't change), and when I try to open the file I receive a pop-up: "Cannot play video. Sorry, this video cannot be played."
Here are my settings:
mMediaRecorder.setPreviewDisplay(mSurfaceHolder.getSurface());
// Use the same frame rate for both, since internally
// if the frame rate is too large, it can cause camera to become
// unstable. We need to fix the MediaRecorder to disable the support
// of setting frame rate for now.
mMediaRecorder.setVideoFrameRate(mProfile.videoFrameRate);
//mMediaRecorder.setVideoSize(mVideoWidth, mVideoHeight);
mMediaRecorder.setVideoSize(640,480); // Works On Note(not on HTC One X)
mMediaRecorder.setVideoEncodingBitRate(MAXIMAL_PERMITTED_VIDEO_ENCODING_BITRATE);
// mMediaRecorder.setVideoEncoder(mProfile.videoCodec);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
// mMediaRecorder.setAudioEncoder(mProfile.audioCodec);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
Thanks
I adapted some code for recording video from this question, How can I capture a video recording on Android?, set it to 640x480, and it ran fine on my AT&T One X:
https://raw.github.com/lnanek/Misc/master/HtcOneXVideoRecord/src/com/htc/sample/videorecord/RecordVideo.java
So it isn't 640x480 that is failing in and of itself. What value are you setting for the bit rate? Have you considered using profiles instead, which are built-in supported combinations? For example, you would set:
mRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
That sets the resolution, bit rate, etc. to values that work together. There are various constants for high-quality recording, low quality, and so on.
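A minimal sketch of the profile-based setup, assuming camera is your already-opened (and unlocked) Camera, and outputPath and surfaceHolder are your own values:

import android.media.CamcorderProfile;
import android.media.MediaRecorder;

// The profile supplies a resolution, frame rate, and bit rate combination
// that the device claims to support together.
MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(camera); // call camera.unlock() before handing it over
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
recorder.setOutputFile(outputPath);          // your file path
recorder.setPreviewDisplay(surfaceHolder.getSurface());
recorder.prepare();
recorder.start();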
I have a game in which a "ding" sound is made for each point scored. Sometimes you can score points in very quick succession. I do not allow overlapping sounds, so I wait for the MediaPlayer's isPlaying() function to go false before playing the next sound.
On every phone I've tried so far (admittedly all 2.2 or 2.3) the result is a pleasing rapid-fire succession of sounds.
But just now I tried a Samsung Galaxy S II with 4.0.3. On this device each "ding" is separated by a long gap. The isPlaying() state seems to last almost three times as long as the sound itself: according to Audacity the sound should last about 0.1 seconds, but isPlaying() remains true for 0.28 seconds.
The sound is saved from Audacity in Ogg Vorbis format.
Any idea what's gone wrong?
It's better to use SoundPool for playing rapid-fire samples in games, since the samples are decompressed once and kept in memory. MediaPlayer may be decoding on the fly, causing a delay as it gets ready. I'm not sure why there's such a difference between devices, but I'd give SoundPool a try and see if it improves things.
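A minimal sketch, using the pre-Lollipop constructor to match the API levels in the question; R.raw.ding is an assumed resource name.

import android.media.AudioManager;
import android.media.SoundPool;

// Create the pool and load the sample early (e.g. in onCreate);
// loading is asynchronous, so do it well before the first play().
SoundPool soundPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
final int dingId = soundPool.load(context, R.raw.ding, 1);

// Later, on each point scored, no need to wait for isPlaying():
// overlapping plays are mixed, up to the 4 streams requested above.
soundPool.play(dingId, 1f, 1f, 1, 0, 1f);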