I have two local media files, an MP4 video and an M4A audio file, which have to be played in sync. I use a separate MediaPlayer object for each, and all start/pause calls are made on both players simultaneously.
Sometimes I see a drift between audio and video right after the players start, sometimes after tapping pause/resume.
I added logging and can see that after pausing the media players their positions differ (e.g. the MediaPlayer with the audio file reports 1820 ms, the MediaPlayer with the video file 1760 ms).
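For reference, the measurement is roughly the following (player and tag names are mine):

    audioPlayer.pause();
    videoPlayer.pause();
    // Positions read immediately after pausing often disagree:
    Log.d(TAG, "audio=" + audioPlayer.getCurrentPosition()
            + "ms video=" + videoPlayer.getCurrentPosition() + "ms");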
Another interesting thing is that seekTo() works well with the audio file, while with the video file it's really unpredictable.
What is the reason for this behavior, and what solutions are available to fix it?
I think this effect on the video timing depends on how the keyframes, which each contain a full frame of information, were encoded. It's only possible to seek to one of these keyframes.
Video editing software gets around this by decoding from the preceding keyframe and applying the deltas from the intervening frames up to the exact seek position.
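As an aside, newer Android versions expose seek modes that do this decode-forward step for you; a minimal sketch, assuming a videoPlayer instance and API 26+:

    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        // SEEK_CLOSEST decodes forward from the keyframe to the exact frame
        // (frame-accurate, but slower than a plain seek).
        videoPlayer.seekTo(targetMs, MediaPlayer.SEEK_CLOSEST);
    } else {
        videoPlayer.seekTo((int) targetMs); // snaps to a nearby sync (key) frame
    }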
Related
Whenever I call setDisplay() on a MediaPlayer while it is playing (e.g. setDisplay(null) when backgrounding a video so that only the audio keeps playing), the media player stutters and rewinds by some small amount of time. Once I comment out these lines, essentially restricting the application to audio-only playback, the stutters and consequent rewinds no longer occur. Any ideas what the issue may be?
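For context, the backgrounding logic is roughly the following (a simplified sketch; mediaPlayer is my player field):

    @Override
    protected void onStop() {
        super.onStop();
        // Detach the video surface so only the audio keeps playing.
        mediaPlayer.setDisplay(null);
    }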
EDIT:
So I think the cause of the problem might be the Android MediaPlayer choosing the closest keyframe to resume from on each display change, because it can't render in between keyframes. Any ideas?
Has anyone tried using ExoPlayer to achieve this?
I tried looking online with no success.
When I say gapless playback, I am referring to the problem of using MediaPlayer to play local videos back to back. After the first video finishes playing, there is a noticeable delay of 1 second before the second video starts.
Hoping this question helps in understanding this issue further.
For reference please look at the following question:
Android: MediaPlayer gapless or seamless Video Playing
ExoPlayer 2, which is now officially released, seems to support gapless playback using the ConcatenatingMediaSource class. From its developer guide:
Transitions between sources are seamless. There is no requirement that the sources being concatenated are of the same format (e.g. it’s fine to concatenate a video file containing 480p H264 with one that contains 720p VP9). The sources may even be of different types (e.g. it’s fine to concatenate a video with an audio only stream).
And the example code:
    MediaSource firstSource = new ExtractorMediaSource(firstVideoUri, ...);
    MediaSource secondSource = new ExtractorMediaSource(secondVideoUri, ...);
    // Plays the first video, then the second video.
    ConcatenatingMediaSource concatenatedSource =
        new ConcatenatingMediaSource(firstSource, secondSource);
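To actually play it, the concatenated source is handed to the player like any other source; a minimal sketch, assuming an already-initialized SimpleExoPlayer named player:

    // One prepare() call covers both sources; playback crosses the
    // boundary without re-preparing.
    player.prepare(concatenatedSource);
    player.setPlayWhenReady(true);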
EDIT: ExoPlayer 2 already supported gapless playback at the time this answer was first written, but back then it had not yet shipped as a stable release.
You will most likely never be able to achieve perfectly gapless playback of multiple tracks with ExoPlayer or the Android MediaPlayer. Neither has been written to support queuing multiple tracks seamlessly, and I imagine it will stay out of scope for both of them.
You can get close to gapless playback by using two different player instances: once you have started and played the first, load the second and start its playback the moment the first finishes. As long as you prepare the second video during playback of the first, the switch can be effectively gapless; a sketch of this follows below.
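A minimal sketch of that idea with two MediaPlayers (paths and field names are assumptions; error handling omitted):

    MediaPlayer current = new MediaPlayer();
    MediaPlayer next = new MediaPlayer();

    current.setDataSource(firstPath);  // hypothetical local file
    current.prepare();
    current.start();

    // Prepare the second player while the first is still playing.
    next.setDataSource(secondPath);    // hypothetical local file
    next.prepare();

    // Hand off the moment the first finishes.
    current.setOnCompletionListener(mp -> {
        next.start();
        mp.release();
    });

For local files on API 16+, MediaPlayer#setNextMediaPlayer() performs a similar handoff natively.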
To take it further, you can also use two different surface textures for rendering the videos: once the first video reaches the end, fade out its texture and fade in the new one, resulting in a nice seamless video effect.
Because of the nature of playing multiple videos at once, you will most likely want to create your own timer for tracking playback time and deciding when to switch to the next video, rather than relying on the callbacks from ExoPlayer or the Android MediaPlayer. This lets you keep track of the time more accurately, without repeatedly querying multiple video codecs.
I know this is not the answer you've been looking for, but it's the only reasonable one. The only way to guarantee no gaps in playback is to download the entire file first and begin playback when the download is done. Otherwise, if you lose connectivity before the file has finished downloading, a pause is unavoidable.
I just tried switching to ExoPlayer from the standard MediaPlayer implementation, and the gap is the same, if not worse. However, I have used a very simple method: restarting the player when the playback state changes to ended (see the sketch below). I don't know if there's a proper way to do it, perhaps with two different ExoPlayers.
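Roughly, the restart looks like this (ExoPlayer 2.x listener API; buildMediaSource() and nextUri are placeholders of mine):

    player.addListener(new Player.DefaultEventListener() {
        @Override
        public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
            if (playbackState == Player.STATE_ENDED) {
                // Re-prepare with the next video and resume immediately.
                player.prepare(buildMediaSource(nextUri));
                player.setPlayWhenReady(true);
            }
        }
    });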
Trying to read from the same MP4 file with a MediaCodec (decoder) instance and a MediaPlayer instance doesn't work as expected. When pausing (or seeking) and then resuming playback, the MediaPlayer's position is unpredictable and will often jump ahead of the MediaCodec decoder by 100-500 ms.
In my application, the decoder is handling H.264 video frames and the MediaPlayer is playing AAC audio. Both have been initialized with the same on-device MP4 file.
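For illustration, the setup is along these lines (a simplified sketch; the file path and the output Surface are assumptions, and error handling is omitted):

    String path = "/sdcard/Movies/clip.mp4"; // hypothetical local file

    // Video: MediaExtractor feeds an H.264 MediaCodec decoder bound to a Surface.
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(path);
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime != null && mime.startsWith("video/")) {
            extractor.selectTrack(i);
            MediaCodec decoder = MediaCodec.createDecoderByType(mime);
            decoder.configure(format, surface, null, 0); // surface: assumed output
            decoder.start();
            break;
        }
    }

    // Audio: a plain MediaPlayer opened on the same file.
    MediaPlayer audioPlayer = new MediaPlayer();
    audioPlayer.setDataSource(path);
    audioPlayer.prepare();
    audioPlayer.start();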
Some observations:
Playback from the beginning of the file (without pausing or seeking) works fine
The audio nearly always jumps ahead of the video after seeking or pausing
Profiling showed that the "jump" appears to happen just after resuming playback; prior to resuming, both components report the same playback position. Just after resuming, there is a large and sudden change in the MediaPlayer's position.
The audio remains ahead of the video by a constant amount after the jump
Letting each component use its own identical copy of the MP4 works fine, and that's how I solved the problem for now. I am aware that MediaCodec can be used to process audio, but I would prefer to avoid that approach.
After combing through the relevant AOSP code (MediaCodec, MediaPlayer, and some JNI classes), it isn't clear to me how these components would be interfering with one another. Do they share a low-level Binder object? Is the MediaPlayer somehow (re)using the cursor that the MediaCodec decoder is using to fill its input buffers?
I'm working on an Android app that plays video (using a VideoView). The video is meant to have both music (left and right channels) and narration, but I want to be able to selectively turn off the narration track in the MediaPlayer.
Is the correct way to do this to encode my MP4 video file with three audio tracks (left, right, and narration) and then turn off the narration audio track with deselectTrack()?
It's not clear to me from the documentation whether MediaPlayer can handle more than two audio tracks.
If the audio tracks are limited to two, would it make sense to run two MediaPlayers simultaneously (syncing them up with seekTo()) when I want the narration track to play?
Thanks.
Sorry to burst your bubble, but...
1) You have a misunderstanding about what a "track" denotes. A track can have multiple channels (e.g., a stereo track has left and right channels). As I understand it, stereo is the extent of the Android AudioTrack implementation at present. I haven't yet checked if the OpenSL implementation is more extensive than the Java API.
2) Only 1 audio track can be selected at a time, so you wouldn't be able to have background and narration simultaneously in the way you were thinking.
3) Audio tracks can only be selected in the prepared state (i.e., not after playback has started). The documentation notes that this limitation is not ideal, so it will probably change in the future. If not for this problem, your goal could be accomplished with two audio tracks encoded in the stream, one with both background & narration, the other with just background; a sketch of track selection follows below.
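For illustration, selecting an audio track while the player is still in the prepared state would look something like this (track indices depend on how the file was muxed):

    // Enumerate the tracks and pick the desired audio track before start().
    MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
    for (int i = 0; i < tracks.length; i++) {
        if (tracks[i].getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO) {
            player.selectTrack(i); // e.g. the background+narration track
            break;
        }
    }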
You will probably find it difficult to synchronize two MediaPlayers, though I haven't tried. Maybe that approach would be acceptable for your situation, but be forewarned that seekTo() isn't accurate; it depends on the encoding of the files.
Something I would try in your place is to have two completely encoded videos, one with narration and one without. Use two MediaPlayers and keep them both prepared; when you want to switch, use seekTo() to put the other one at (or near) the current position, as sketched below. That way you don't have to worry about continuous synchronization. If the video is large, though, this method could use significantly more resources.
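A rough sketch of that switch, assuming both files are local and both players stay prepared (names are illustrative):

    // Only one player is attached to the display at a time.
    void switchTo(MediaPlayer from, MediaPlayer to, SurfaceHolder holder) {
        int positionMs = from.getCurrentPosition(); // seekTo() may snap to a keyframe
        from.pause();
        to.seekTo(positionMs);
        to.setDisplay(holder); // move the video surface to the now-active player
        to.start();
    }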
I want to stream an MP3 audio file and play it through the Android MediaPlayer, and I also want to cache the file so that the MediaPlayer doesn't have to re-stream recently played tracks.
I tried the prepareAsync() method, but it doesn't give me access to the buffered content, so I decided to stream the audio file myself and then pass it to the MediaPlayer for playback. I achieved this by following this article, but the approach has a problem: while the file is being transferred to the MediaPlayer, the player goes into its error state, which makes playback behave inconsistently.
When the MediaPlayer enters its error state it doesn't come out of it automatically, so I am forced to create a new MediaPlayer and re-supply it with the downloaded file. This workaround causes the user to experience an undesired pause in the song.
So, does anyone have an improved version of the code given in the link above? Or does anyone know a better solution to this problem, or an existing library for streaming audio files on Android?
Thanks
The link you provided looks like a less-than-ideal solution (not to mention outdated). What you probably want is a local proxy server that gives you access to the byte data before the MediaPlayer gets it. See my answer here for a little more explanation; a rough sketch of the idea follows.
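To make the idea concrete, here is a heavily simplified sketch of such a proxy (remoteUrl and cacheFile are placeholders, and a real implementation must parse the request and handle Range headers):

    // Tiny localhost proxy: mirrors the remote stream into a cache file
    // while serving the same bytes to the MediaPlayer.
    ServerSocket server = new ServerSocket(0); // any free local port
    new Thread(() -> {
        try (Socket client = server.accept();
             OutputStream out = client.getOutputStream();
             InputStream in = new URL(remoteUrl).openStream();
             FileOutputStream cache = new FileOutputStream(cacheFile)) {
            client.getInputStream().read(new byte[8192]); // crudely consume the GET request
            out.write("HTTP/1.1 200 OK\r\nContent-Type: audio/mpeg\r\n\r\n".getBytes());
            byte[] buf = new byte[8192];
            for (int n; (n = in.read(buf)) != -1; ) {
                cache.write(buf, 0, n); // keep a copy for replays
                out.write(buf, 0, n);   // feed the MediaPlayer in real time
            }
        } catch (IOException e) { /* placeholder error handling */ }
    }).start();

    MediaPlayer player = new MediaPlayer();
    player.setDataSource("http://127.0.0.1:" + server.getLocalPort() + "/track.mp3");
    player.prepareAsync();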