I am developing a media player app for learning purposes and I want it to have a crossfading feature, but I have no clue where to start. I tried searching on the internet with no luck. I am using the Android MediaPlayer class for all media-player-related operations. Does anyone know a workaround to achieve this?
Thanks for your support.
Try using AudioTrack instead of MediaPlayer. Generally, I'd suggest the following plan:
Study the sources of an app that uses AudioTrack. A good player can be found here
This is an AAC audio player that uses JNI for AAC decoding.
Find an MP3 decoding library. It should be either a Java one (look at this for example; there may be other Java libraries, but I have not used them) or a C/C++ library (in which case you will also use it through JNI).
Once you have a simple working MP3 player, add manual crossfading (this should be easy if you know the basics of digital audio).
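As an illustration of the crossfading step, here is a minimal sketch that mixes the tail of the outgoing track with the head of the incoming track, assuming you already have their decoded 16-bit PCM samples in short[] buffers (all names here are illustrative, not from a real library):
short[] mixCrossfade(short[] outgoingPcm, short[] incomingPcm) {
    int length = Math.min(outgoingPcm.length, incomingPcm.length);
    short[] mixed = new short[length];
    for (int i = 0; i < length; i++) {
        float fadeIn = (float) i / length;   // ramps 0.0 -> 1.0 over the crossfade window
        float fadeOut = 1.0f - fadeIn;       // ramps 1.0 -> 0.0 over the same window
        int sample = (int) (outgoingPcm[i] * fadeOut + incomingPcm[i] * fadeIn);
        // Clamp to the 16-bit range to avoid wrap-around distortion.
        mixed[i] = (short) Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sample));
    }
    return mixed;
}
The mixed buffer can then be written to an AudioTrack running in MODE_STREAM, e.g. audioTrack.write(mixed, 0, mixed.length).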
Try using two MediaPlayer objects one after another with a crossfade, as in this class: https://github.com/psaravan/JamsMusicPlayer/blob/f165057dd664727ed06b9fac2c27557e5fb7e7ee/jamsMusicPlayer/src/main/java/com/jams/music/player/Services/AudioPlaybackService.java
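The rough idea (a sketch of the approach, not code taken from that class; all names are illustrative and the fields are assumed to live in your playback service) is to prepare the second MediaPlayer in advance and ramp the two volumes in opposite directions with a Handler:
MediaPlayer current;            // already playing
MediaPlayer next;               // already prepared with the next track
Handler handler = new Handler();
int steps = 20;
long stepMs = 100;              // 20 x 100 ms = roughly a 2 second crossfade

void startCrossfade() {
    next.setVolume(0f, 0f);
    next.start();
    fadeStep(0);
}

void fadeStep(final int step) {
    float in = (float) step / steps;
    current.setVolume(1f - in, 1f - in);   // fade the old track out
    next.setVolume(in, in);                // fade the new track in
    if (step < steps) {
        handler.postDelayed(new Runnable() {
            @Override public void run() { fadeStep(step + 1); }
        }, stepMs);
    } else {
        current.stop();
        current.release();
        current = next;
    }
}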
When the second MediaPlayer starts, the first one pauses for a while (on some devices, mostly Samsung), so the transition is not smooth. MediaPlayer has this issue; it was reported to Google long ago but is still not resolved (https://issuetracker.google.com/issues/36931073), so we can do nothing about it. Instead, I used ExoPlayer for playing audio and it works very smoothly without any pause.
Let me rephrase my question; I wrote it in a hurry.
Current situation:
I have set up a digital video recorder to record broadcasts provided via DVB-C. It is running on a Raspberry Pi 3B using TVHeadend and Jetty/Cling to provide UPnP and other ways to access the media files. For watching recordings, I wrote an Android player app using IJKPlayer, which runs on smartphones, Fire TV and Android TV.
One hassle when playing media files that are currently being recorded is that IJKPlayer does not support timeshifting. That means when I start playing a file that is still recording, I can only watch the length known to the player at that moment; anything recorded afterwards cannot be played, and I need to exit the player activity and start it again. I have worked around that issue by "simulating" a completed recording using a custom servlet implementation. Since the complete length of the recording is already known, I can use ffmpeg to accomplish this.
Future situation:
I plan to move from IJKPlayer to ExoPlayer, because it supports hardware playback and is much faster when playing H.264 media. I could of course use the same solution as above, but from what I have found out so far, ExoPlayer can support media files that are currently being recorded by using the Timeline class. However, I can't find any useful documentation or a good example. Hence, I would appreciate any help with the Timeline object.
Regards
Harry
Looks like my approach won't work; at least, I didn't find a solution. The problem is that the server returns the stream size as it is at player start time, and I didn't find a way to update the media duration for "regular" files.
However, I can solve the problem by changing the server side. Instead of serving a regular file, I convert the file to an m3u8 (HLS) playlist in real time using ffmpeg. I then hand the m3u8 URI to the player, and it updates the duration of the stream while playing, without any additional code on the client side.
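For reference, this is roughly how the client side can look with ExoPlayer 2 once the server exposes the HLS playlist (a sketch assuming the exoplayer-hls module is on the classpath; the URI and user-agent string are placeholders):
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory("my-dvr-app");
MediaSource hlsSource = new HlsMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse("http://recorder.local:8080/live/recording.m3u8"));
SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context);
player.prepare(hlsSource);
player.setPlayWhenReady(true);
Because the source is HLS, ExoPlayer keeps refetching the playlist, so the reported duration grows while the recording is still being written.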
I've been struggling for days to build an application that, given an RTSP link, displays the video on the screen.
It's a live video stream coming from an IP camera in H.264 format (it's not a file).
I couldn't get it done using the native MediaPlayer or VideoView (it starts showing the video and then freezes after 2-3 seconds), but since the stream works fine when I view it in the VLC player, I've decided to use a VLC MediaPlayer in my application.
I've found some sources such as vlc-android and exoplayer_vlc, but I would just like to have a library jar that I can import into my application and use its classes. Is there such a jar somewhere? How can I make it work?
Thanks in advance
Has anyone tried using ExoPlayer to achieve gapless playback?
I tried looking online with no success.
When I say gapless playback, I am referring to the problem of using the media player to play local videos back to back. After the first video is done playing, there is a noticeable delay of 1 second before the second video starts.
Hoping this question helps in understanding this issue further.
For reference please look at the following question:
Android: MediaPlayer gapless or seamless Video Playing
ExoPlayer 2, which is now officially released, seems to support gapless playback using the ConcatenatingMediaSource class. From its developer guide:
Transitions between sources are seamless. There is no requirement that the sources being concatenated are of the same format (e.g. it’s fine to concatenate a video file containing 480p H264 with one that contains 720p VP9). The sources may even be of different types (e.g. it’s fine to concatenate a video with an audio only stream).
And the example code:
MediaSource firstSource = new ExtractorMediaSource(firstVideoUri, ...);
MediaSource secondSource = new ExtractorMediaSource(secondVideoUri, ...);
// Plays the first video, then the second video.
ConcatenatingMediaSource concatenatedSource =
    new ConcatenatingMediaSource(firstSource, secondSource);
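The concatenated source can then be prepared on the player just like any single MediaSource, e.g. player.prepare(concatenatedSource).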
EDIT: ExoPlayer 2 supports gapless playback, but at the time of writing it is still unreleased as a stable version.
You will most likely never be able to achieve perfect gapless playback of multiple tracks with ExoPlayer or the Android MediaPlayer. Neither has been written to support starting multiple tracks, and I imagine it will stay out of scope for both of them.
You can get close to gapless playback by using two different player instances: once you have started and played the first, you can load the second and start its playback as soon as the first finishes. With this method you can have a practically gapless solution, as long as you prepare the second video during playback of the first.
To take it further, you can also use two different surface textures for rendering the videos: once the first video reaches the end, you fade out its texture and fade in the new one, resulting in a nice seamless video effect.
Because of the nature of playing multiple videos at once you will most likely want to create your own timer for incrementing the time and deciding when to switch to the next video, rather than trying to use the callbacks from ExoPlayer or Android Media. This will allow you to keep track of the time in a more accurate fashion, without needing to keep talking to multiple video codecs.
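If you go the two-instance route with ExoPlayer, a minimal sketch could look like this (assuming ExoPlayer 2, with firstSource and secondSource already built; here the switch is triggered from the player callback rather than a custom timer, purely to keep the example short):
SimpleExoPlayer playerA = ExoPlayerFactory.newSimpleInstance(context);
SimpleExoPlayer playerB = ExoPlayerFactory.newSimpleInstance(context);

playerA.prepare(firstSource);
playerA.setPlayWhenReady(true);

// Pre-buffer the second video while the first one is playing.
playerB.prepare(secondSource);
playerB.setPlayWhenReady(false);

playerA.addListener(new Player.EventListener() {
    @Override
    public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
        if (playbackState == Player.STATE_ENDED) {
            playerB.setPlayWhenReady(true);   // start the pre-buffered player
        }
    }
});
Each player would render to its own surface/texture, which is what makes the fade between the two videos possible.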
I know this is not the answer you've been looking for, but it's the only reasonable answer. The sole way to ensure no gaps in playback is to download the entire file first and begin playback when it's done. Otherwise, in the event that you lose connectivity before the file is finished downloading, pausing is inescapable.
I just tried switching to ExoPlayer from the standard MediaPlayer implementation and the gap is the same, if not worse. However, I used a very simple method of restarting the player when the state changes to ended. I don't know if there's a better, more proper way to do it, perhaps with two different ExoPlayers.
I know we can play an MP3 file with MediaPlayer.
But can we play MP3+G on Android?
I looked in the Android supported media formats documentation, but I didn't see it listed:
http://developer.android.com/guide/appendix/media-formats.html
Is there any workaround or library to do this?
Thanks
I don't "think" that Android is going to support MP3+G playback anytime soon. That being said, an MP3+G "file" is either one zipped file (with the two files inside) or two separate files with the same name except for the file extension. So other than playing the MP3, there is really nothing else that MediaPlayer can do, and changing MediaPlayer in the Android framework to get this to work would not be portable from device to device.
Workaround 1
Use FFmpeg to transcode and mux these files into a supported format such as MP4. Here is an example of someone using ffmpeg to mux MP3+G into FLV.
Workaround 2
Another option would be to use VLC for Android, which is in pre-alpha and can be found here. I'm not sure that VLC for Android supports MP3+G, but libVLC does support decoding the two files, so I'm guessing it would work, or you could alter the code a bit to get it working. I checked out the VLC for Android code recently and I have to say it's a CPU hog, but since MP3 and CDG are generally smaller, less CPU-intensive files, I think Android devices could handle the workload using VLC.
Workaround 3
As for more complex options, you could use the Android NDK and write a decoder yourself (this would take a lot of time).
Hope some of this helps you.
I have found the solution:
http://code.google.com/p/cdg-toolkit/
It was written in Java, so you would need to port it to Android first if you want to use it.
I'm currently implementing sound effect mixing on Android via OpenSL ES. I have an initial implementation going, but I've encountered some issues.
My implementation is as follows:
1) For each sound effect I create several AudioPlayer objects (one for each simultaneous instance of the sound), each using an SLDataLocator_AndroidFD data source that in turn refers to an OGG file. For example, if I have a gun-firing sound (let's call it gun.ogg) that is played in rapid succession, I use around five AudioPlayer objects that all refer to the same gun.ogg audio source and the same output mix object.
2) When I need to play that sound effect, I search through all the AudioPlayer objects I created and find one that isn't currently in the SL_PLAYSTATE_PLAYING state and use it to play the effect.
3) Before playing a clip, I seek to the start of it using SLPlayItf::SetPosition.
This is working out alright so far, but there is some crackling noise that occurs when playing sounds in rapid succession. I read on the Android NDK newsgroup that OpenSL on Android has problems with switching data sources. Has anyone come across this issue?
I'm also wondering if anyone else has seen or come up with a sound mixing approach for OpenSL on Android. If so, does your approach differ from mine? Any advice on the crackling noise?
I've scoured the internet for OpenSL documentation and example code, but there isn't much out there with regard to mixing (only loading, which I've already figured out). Any help would be greatly appreciated.
Creating many instances of audio players is probably not the best approach. Unfortunately, the Android (2.3) implementation of OpenSL ES doesn't support SLDynamicSourceItf, which would be similar to OpenAL's source-binding interface. One approach would be to create multiple stream players; you would then search for a stream player that isn't currently playing and start streaming your sound effect to it from memory. It's not ideal, but it's doable.
You should probably not use the ogg format for sound effects either. You're better off with WAV (PCM) as it won't need to be decoded.
Ogg is fine for streaming background music.