I want to have an FPS counter that displays the frame rate of the video as it is being played, so I can see the FPS count change from when the video is playing at regular speed to when it goes into slow motion.
Is there an API in Android SDK that allows us to track every time the frames are rendered when the video plays?
I am developing an Android app which plays a video from a server where I store my video content. But it takes too much time to start playing a video: at 300-400 kb/s it takes almost 15-20 seconds.
I want to know what factors affect streaming speed and how I can solve my problem.
It could be:
The video bitrate is too high for the connection speed
The player initial buffering is too safe
If you're saying your network is 300-400kbps, you probably want to try converting your video to an adaptive format like HLS or DASH, where the player will detect the user's bandwidth and download the best quality version that can start quickly.
From there it could be player configuration, adjusting the amount of buffered video the player waits for before starting playback. But beware, reducing initial buffering can cause rebuffering later in playback.
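For concreteness, here is a minimal sketch of both ideas, assuming an ExoPlayer 2.x setup with the HLS module on the classpath; the URL and the buffer durations are placeholder assumptions, not values from the question:

import android.content.Context;
import com.google.android.exoplayer2.DefaultLoadControl;
import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;

public final class FastStartPlayerFactory {
    // Builds a player that starts playback after ~1.5 s of buffered media instead of the
    // library default, and points it at an HLS master playlist so the player can pick a
    // rendition that fits a 300-400 kb/s connection.
    public static SimpleExoPlayer create(Context context) {
        DefaultLoadControl loadControl = new DefaultLoadControl.Builder()
                .setBufferDurationsMs(
                        /* minBufferMs= */ 15_000,
                        /* maxBufferMs= */ 50_000,
                        /* bufferForPlaybackMs= */ 1_500,
                        /* bufferForPlaybackAfterRebufferMs= */ 3_000)
                .build();

        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context)
                .setLoadControl(loadControl)
                .build();

        // Placeholder URL; ExoPlayer infers HLS from the .m3u8 extension.
        player.setMediaItem(MediaItem.fromUri("https://example.com/video/master.m3u8"));
        player.prepare();
        player.setPlayWhenReady(true);
        return player;
    }
}

Lowering bufferForPlaybackMs is exactly the trade-off described above: playback starts sooner, but rebuffering later in playback becomes more likely.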
How can I capture slow motion video in my app?
I tried using
mMediaRecorder.setVideoFrameRate(100);
but the app crashes with an IllegalStateException if I set the value to 20 or more.
I have researched a lot. Normal video is between 24 and 30 fps. To see slow motion video we need to capture at 100-120 fps, but the device does not allow that. Yet the default camera app on my device has a Slow motion option, and a few apps on the Play Store can create slow motion videos. I also tried setting a higher setCaptureRate(), but with that a normal-speed video is still captured. In a few places it is mentioned that slow motion video can be accomplished through the OpenCV/JavaCV libraries, but I failed to understand how to use these libraries to capture slow motion video on Android.
From the source you provided (CamcorderProfile), all you have to do is increase the number of images captured per second by using one of the high-speed profiles:
mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH_SPEED_LOW));
or
mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH_SPEED_HIGH));
So, if you capture 100 images per second and show 25 frames per second, that recorded second takes 4 seconds to show. Please read the documentation on the class you are using:
public static final int QUALITY_HIGH_SPEED_LOW
High speed ( >= 100fps) quality level corresponding to the lowest available resolution.
For all the high speed profiles defined below (from QUALITY_HIGH_SPEED_LOW to QUALITY_HIGH_SPEED_2160P), they are similar as normal recording profiles, with just higher output frame rate and bit rate. Therefore, setting these profiles with setProfile(CamcorderProfile) without specifying any other encoding parameters will produce high speed videos rather than slow motion videos that have different capture and output (playback) frame rates. To record slow motion videos, the application must set video output (playback) frame rate and bit rate appropriately via setVideoFrameRate(int) and setVideoEncodingBitRate(int) based on the slow motion factor. If the application intends to do the video recording with MediaCodec encoder, it must set each individual field of MediaFormat similarly according to this CamcorderProfile.
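Putting the quoted documentation into code, a minimal sketch (the slow-motion factor of 4 is an assumption, and whether a device exposes the high-speed profiles at all is hardware dependent):

// Record with a high-speed profile, but write the file with a lower playback
// frame rate and bit rate so the result plays back in slow motion.
if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_HIGH_SPEED_LOW)) {
    CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH_SPEED_LOW);
    int slowMotionFactor = 4;   // assumed: e.g. a 120 fps capture played back at 30 fps

    mMediaRecorder.setProfile(profile);   // sets the high capture frame rate and bit rate
    mMediaRecorder.setVideoFrameRate(profile.videoFrameRate / slowMotionFactor);
    mMediaRecorder.setVideoEncodingBitRate(profile.videoBitRate / slowMotionFactor);
}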
Although I am not able to capture smooth slow motion video without jerks, I am able to convert the captured video into slow motion using FFmpeg, which comes out very smooth and even. For integrating FFmpeg on Android we can use precompiled libraries like ffmpeg-android.
As per the case in question, we can capture video from the camera and then convert it into slow motion using FFmpeg.
To create a slow motion video we can use the command below:
String[] complexCommand = {"-y", "-i", inputFileAbsolutePath, "-filter_complex", "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]", "-map", "[v]", "-map", "[a]", "-b:v", "2097k", "-r", "60", "-vcodec", "mpeg4", outputFileAbsolutePath};
Here,
-y
Overwrite output files without asking
-i
ffmpeg reads from an arbitrary number of input “files” specified by the -i option
-map
Output link labels are referred to with -map.
-b:v
Set the video bitrate
-r
Set frame rate
-vcodec
Set the video codec
-filter_complex filtergraph
Define a complex filtergraph, i.e. one with an arbitrary number of inputs and/or outputs.
The setpts filter works by changing the presentation timestamp (PTS) of each video frame. To slow down your video, you have to use a multiplier greater than 1. For example, if there are two successive frames shown at timestamps 1 and 2, and you want to slow down the video, those timestamps need to become 2 and 4, respectively. Thus, we have to multiply them by 2.0.
You can speed up or slow down audio with the atempo audio filter. The atempo filter is limited to values between 0.5 and 2.0 (so it can slow audio down to no less than half the original speed, and speed it up to no more than double the input). To slow the audio down to half of its speed we have to use an atempo value of 0.5.
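For reference, this is roughly how the complexCommand above could be run with the ffmpeg-android wrapper mentioned earlier; the handler and exception names below are from that library's Java API as I recall it, so treat this as a sketch and check them against the version you use:

// Assumes FFmpeg.getInstance(context).loadBinary(...) has already completed successfully.
FFmpeg ffmpeg = FFmpeg.getInstance(context);
try {
    ffmpeg.execute(complexCommand, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            // The slow motion file has been written to outputFileAbsolutePath.
        }

        @Override
        public void onFailure(String message) {
            // message contains FFmpeg's console output, useful for debugging the filtergraph.
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
    // This wrapper runs only one FFmpeg command at a time.
}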
Check out the FFmpeg video editor tutorial I have written on my blog, which includes creating slow motion video, and the complete code for the tutorial here.
What worked for me was to set the capture rate of mMediaRecorder higher than the recorded frame rate, like:
mMediaRecorder.setVideoFrameRate(profile.videoFrameRate / 2);
mMediaRecorder.setVideoEncodingBitRate(profile.videoBitRate / 2);
mMediaRecorder.setCaptureRate(profile.videoFrameRate);
Where profile is the CamcorderProfile set with QUALITY_HIGH (I can't go higher since I'm using an LG G2, API 19).
Here in my case, profile.videoFrameRate is equal to 30.
More info about setCaptureRate in the official documentation:
Set video frame capture rate. This can be used to set a different video frame capture rate than the recorded video's playback rate. This method also sets the recording mode to time lapse. In time lapse video recording, only video is recorded. Audio related parameters are ignored when a time lapse recording session starts, if an application sets them.
The recorded video comes out twice as long as the real-time capture. However, setting the capture rate disables the audio. In my case, my maximum capture rate seems to be 30 fps, and the result is then played back at 15 fps.
Hope it helps.
Try this code. It will help...
myCamera = getCameraInstance();
mediaRecorder = new MediaRecorder();
myCamera.unlock();                       // unlock the camera so MediaRecorder can use it
mediaRecorder.setCamera(myCamera);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
mediaRecorder.setOutputFile(outputFilePath);   // outputFilePath: placeholder for your target file path
mediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
mediaRecorder.prepare();                 // throws IOException
mediaRecorder.start();
setCaptureRate:
http://developer.android.com/reference/android/media/MediaRecorder.html#setCaptureRate%28double%29
setVideoFrameRate:
http://developer.android.com/reference/android/media/MediaRecorder.html#setVideoFrameRate%28int%29
What is the difference between the setCaptureRate() and setVideoFrameRate() APIs in Android's MediaRecorder class?
From documentation for setCaptureRate():
Set video frame capture rate. This can be used to set a different video frame capture rate than the recorded video's playback rate. This method also sets the recording mode to time lapse. In time lapse video recording, only video is recorded. Audio related parameters are ignored when a time lapse recording session starts, if an application sets them.
Let's use 0.1 fps as the parameter: setCaptureRate(0.1) means one frame per 10 seconds, and 0.001 would be one frame per 1000 seconds.
I've found an example here.
and setVideoFrameRate() is what you know:
Sets the frame rate of the video to be captured
Typical values are 25 fps, 30 fps, or 60 fps.
You can also see the difference in the parameter type: setCaptureRate() accepts decimals, while setVideoFrameRate() works only with integers.
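To make the contrast concrete, a small sketch (the time-lapse profile and the 30 fps playback rate are assumptions; the 0.1 fps capture rate is the value from the example above):

MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);

// Time-lapse profiles record video only, so no audio source is needed.
recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_TIME_LAPSE_HIGH));

// setCaptureRate takes a double: frames grabbed per second of wall-clock time.
recorder.setCaptureRate(0.1);      // one frame every 10 seconds

// setVideoFrameRate takes an int: the playback frame rate written into the file.
recorder.setVideoFrameRate(30);    // 5 minutes of real time becomes about 1 second of video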
I'm very new to developing for Android through Adobe AIR, and semi-new to using ActionScript 3.
I'm recoding an old ActionScript 2 game I made as ActionScript 3 for Android deployment, and the coding part has been going well, no problems. But one strange issue has me stumped!
When I skip past a movie clip on a timeline, any sound on the timeline of that movie clip, or of any movie clips within it, starts playing! I'm not sure how the movie clip even exists after being skipped, or why it would ignore any stop() calls on its timeline and play the sound anyway.
For example, I have my main menu on frame 3. One of the functions for a button there sends the main timeline to gotoAndStop frame 6, where my actual game screen is located.
However, there is a mission briefing movie clip on frame 4, and the audio for this movie clip plays after skipping from frame 3 to frame 6, even though, if it were somehow to exist, it should not play the audio, since there is a stop() on its frame 1 before the streaming audio begins.
I had similar problems with map objects in my massive map-sections movie clip, e.g. skipping to map number 100 would play various sounds from movie clips present on frames 1-99 of that map MC, over and over.
Any ideas how this could happen? I always assumed timeline frames skipped with a gotoAndWhatever were not actually loaded, and even if they were, any MCs present on skipped frames should not exist after that frame is no longer the current timeline frame.
I have been having hardware problems with my pc and a new one comes in a week, but I don't think this is the reason for this problem.
Any ideas?
////EDIT UPDATE
OK, so I did a test. I started a new Adobe AIR file. I then created a movie clip with an event sound inside it on frame 99, then added a stop on frame 100. I then placed that movie clip on frame 5 of the main timeline.
On frame 1 of the main timeline, I put a gotoAndStop(10) and added a blank frame there for it to land on. This works fine: no sounds played and my MC was skipped properly.
BUT when I converted that movie clip into a movie clip again (so two layers of movie clip now, with the sound inside the child movie clip), the sound plays when skipped, over and over again, ignoring the stop I placed after it... I hope I explained that well enough. I'm stumped!!
Taken from http://forums.adobe.com/message/4451672:
Click on the frame that has the audio. Open the Properties panel and check the Sync drop-down. It is set to Event by default. Change it to Stream. Now make sure the frame span that holds the audio has enough frames to play the full audio.
By using Sync: Event you are asking Flash to play the audio in that frame regardless of whether the playhead is in that frame or not. Stream only plays while the playhead is inside the clip; the second the playhead leaves the clip, when set to Stream, the sound stops playing.
So in your screenshot, after you set Sync: Stream you will need to add frames (F5) to the layer that contains the audio (you can see the waveform). If you don't, you'll only hear as many frames of the audio as you extend.
I'm currently working with Android Jelly Bean MediaCodec API to develop a simple video player.
I extract the tracks and play audio and video in separate threads. The problem is that the video track always plays too fast.
Where could the problem be hidden?
Both audio and video are treated almost the same way, except audio is played via AudioTrack and video is rendered to the surface.
If you render frames at maximum speed you'll hit 60fps on most devices. You need to pace it according to the presentation time stamps provided by the encoder.
For example, if the input is a format supported by Android (e.g. a typical .mp4 file), you can use the MediaExtractor class to extract each frame. The time stamp can be retrieved with getSampleTime(). You want to delay rendering by the difference between timestamps on consecutive frames -- don't assume that the first frame will have a timestamp of zero.
Also, don't assume that video frames appear at a constant rate (e.g. 30fps). For some sources the frames will arrive unevenly.
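As a rough sketch of that pacing logic (assuming an already-configured MediaExtractor/MediaCodec decode loop; the variable names are illustrative): the decoder output's BufferInfo.presentationTimeUs carries through the sample times the extractor reported via getSampleTime(), and each frame is released only once the wall clock catches up to it.

long playbackStartNs = System.nanoTime();
long firstPtsUs = -1;

// Inside the decode loop, after dequeueOutputBuffer() has returned outIndex >= 0:
long ptsUs = bufferInfo.presentationTimeUs;
if (firstPtsUs < 0) {
    firstPtsUs = ptsUs;                      // don't assume the first frame's PTS is zero
}
long targetNs = playbackStartNs + (ptsUs - firstPtsUs) * 1000L;
long sleepMs = (targetNs - System.nanoTime()) / 1_000_000L;
if (sleepMs > 0) {
    try {
        Thread.sleep(sleepMs);               // wait until this frame's presentation time
    } catch (InterruptedException ignored) {
    }
}
decoder.releaseOutputBuffer(outIndex, /* render= */ true);   // render the frame to the Surface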
See the "Play video (TextureView)" example in Grafika, particularly the SpeedControlCallback class. The gen-eight-rects.mp4 video uses variable frame durations to exercise it. If you check the "Play at 60fps" box, the presentation time stamps are ignored.