How to play a multi-track video file through Android MediaPlayer?

I'm working on an app that plays video files. I'm using the Android MediaPlayer class to play them.
Problem:
I want to play, let's say, a video file with multiple embedded audio tracks, and then allow users to choose between the tracks at runtime through an interface.
Is it even possible with Android MediaPlayer?
I've seen many applications that have this feature, like MX Player and VLC for Android.

Yes, Android MediaPlayer supports playback of multiple embedded audio tracks.
You can use the selectTrack API to achieve this.
The syntax is as follows:
public void selectTrack (int index)
index int: the index of the track to be selected. The valid range of the index is 0..total number of tracks - 1. The total number of tracks, as well as the type of each individual track, can be found by calling the getTrackInfo() method.
Example usage:
MediaPlayer mplayer = new MediaPlayer();
MediaPlayer.TrackInfo[] trackInfo = mplayer.getTrackInfo();
for (int i = 0; i < trackInfo.length; i++) {
    if (trackInfo[i].getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO) {
        mplayer.selectTrack(i);
        break;
    }
}

Related

Does ExoPlayer create a windowIndex for each media source in ConcatenatingMediaSource?

I'm working on an app that streams a list of mp3 files. To do this I've used ExoPlayer with a ConcatenatingMediaSource, like this:
private fun createMediaSource(
    tracks: List<Track>
): MediaSource = ConcatenatingMediaSource(true).apply {
    tracks.forEach { track ->
        val mediaSource = ProgressiveMediaSource
            .Factory(DefaultDataSourceFactory(context))
            .createMediaSource(MediaItem.fromUri(track.getFullUri()))
        addMediaSource(mediaSource)
    }
}
This works great: the files play as a list with no errors at all. However, what's required of me is to play all these streams as a single stream, where I show the total length of all streams on the seek bar and the user can seek seamlessly between them.
Of course I'm not using the video player provided by ExoPlayer, because I need the seek bar to span all media sources, which apparently is not possible to do with ExoPlayer UI.
So this is the logic I've used when the user tries to seek:
exoPlayer.apply {
    var previousTracksLength = 0L
    var windowIndex = 0
    var currentItemLength = 0L
    run loop@{
        tracksList.forEachIndexed { index, track ->
            currentItemLength = track.getLengthMillis()
            previousTracksLength += currentItemLength
            if (newPositionMillis < previousTracksLength) {
                windowIndex = index
                return@loop
            }
        }
    }
    val positionForCurrentTrack = newPositionMillis - (previousTracksLength - currentItemLength)
    pause()
    if (windowIndex == currentWindowIndex) {
        seekTo(positionForCurrentTrack)
    } else {
        seekTo(windowIndex, positionForCurrentTrack)
    }
    play()
}
This works amazingly well when the ConcatenatingMediaSource has only 3 or fewer media sources, but if it's bigger than that, weird behavior starts showing up: I might want to seek just 10 seconds forward and the player moves more than 2 minutes instead.
After debugging, it was obvious to me that when I call seekTo(windowIndex, positionForCurrentTrack), ExoPlayer is seeking to a window that's not mapped to a specific media source in the ConcatenatingMediaSource!
And here come my questions:
Does ExoPlayer create a single window for each media source in the ConcatenatingMediaSource or not?
And if not, is there a way to force it to do that?
This is not really an answer, but the explanation for why, when I called seekTo(windowIndex, position), the player seemed to ignore the windowIndex and actually seek to a completely unexpected position: the media type was mp3!
Apparently many devs have suffered the same issue, where the player's seek position is out of sync with the real position of the media being played when it's an mp3.
More details for anyone having weird issues when playing mp3 files with ExoPlayer:
https://github.com/google/ExoPlayer/issues/6787#issuecomment-568180969
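A workaround often suggested for this mp3 seeking drift is enabling constant-bitrate seeking in the extractors factory, so that seek positions of CBR mp3 streams line up with the real playback position. A rough Java sketch of wiring that into a factory like the one in the question (treat the exact setup as an assumption for your ExoPlayer version):
// Sketch: build the ProgressiveMediaSource.Factory with constant-bitrate seeking enabled for mp3
DefaultExtractorsFactory extractorsFactory = new DefaultExtractorsFactory()
        .setConstantBitrateSeekingEnabled(true);
ProgressiveMediaSource.Factory mediaSourceFactory =
        new ProgressiveMediaSource.Factory(new DefaultDataSourceFactory(context), extractorsFactory);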

How do I get the index of the current track being played using ExoPlayer?

I am working on an Android project that involves the use of Google's ExoPlayer.
I have a list of video sources which I build a playlist from using the following code:
for (int i = 0; i < vidList.length(); i++) {
    MediaSource source = new ExtractorMediaSource(Uri.parse(vidList.getJSONObject(i).getString("url")),
            buildDataSourceFactory(bandwidthMeter), extractorsFactory, mainHandler, HomeFragment.this);
    mediaSources.add(source);
    captions.add(vidList.getJSONObject(i).getString("caption"));
}
mediaSource = new ConcatenatingMediaSource(mediaSources.toArray(new MediaSource[mediaSources.size()]));
I then call
exoPlayer.prepare(mediaSource, false, false)
and the videos play in succession fine. I would like to display the caption of the currently playing video in a TextView, so I have a separate list that holds the "caption" values for each video.
From scouring through the code, I see that I can get the currently playing video in the playlist like this:
exoPlayer.getCurrentPeriodIndex()
This seems to work and returns the index, except for one problem: it returns the value 0 twice as playback starts. That is, the video at index 0 returns period 0, and so does the video at index 1. This only occurs at indexes 0 and 1; after that everything looks fine, except that getCurrentPeriodIndex() returns theAccurateIndex - 1.
I see this also happening in the demo ExoPlayer application.
Is there a better way to determine what track is currently playing in the playlist?
Thanks.
To find the currently playing track, you need to reference the currentWindowIndex field of the ExoPlayer. It looks like this in Java:
exoPlayer.getCurrentWindowIndex()
I'm not sure what getCurrentPeriodIndex() does; the docs don't elaborate, and I don't like speculating.
exoPlayer.getCurrentWindowIndex() is Deprecated.
Use exoPlayer.getCurrentMediaItemIndex() instead.
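On recent ExoPlayer versions you can also let the player push the change to you instead of polling the index, and update the caption when the playlist advances. A minimal sketch (captionTextView and the captions list from the question are assumed to be aligned with the playlist order):
exoPlayer.addListener(new Player.Listener() {
    @Override
    public void onMediaItemTransition(MediaItem mediaItem, int reason) {
        // called on automatic transitions, seeks to another item, and playlist changes
        int index = exoPlayer.getCurrentMediaItemIndex();
        captionTextView.setText(captions.get(index));
    }
});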

MediaCodec audio/video muxing issues on Android

I am transcoding videos based on the example given by Google (https://android.googlesource.com/platform/cts/+/master/tests/tests/media/src/android/media/cts/ExtractDecodeEditEncodeMuxTest.java)
Basically, transcoding of MP4 files works, but on some phones I get weird results. If, for example, I transcode a video with audio on an HTC One, the code won't give any errors, but the file can't be played afterwards on the phone. If I have a 10-second video, it jumps to almost the last second and you only hear some crackling noise. If you play the video with VLC, the audio track is completely muted.
I did not alter the code in terms of encoding/decoding, and the same code gives correct results on a Nexus 5 or Moto X, for example.
Does anybody have an idea why it might fail on that specific device?
Best regards and thank you,
Florian
I made it work on Android 4.4.2 devices with the following changes:
Set the AAC profile to AACObjectLC instead of AACObjectHE:
private static final int OUTPUT_AUDIO_AAC_PROFILE = MediaCodecInfo.CodecProfileLevel.AACObjectLC;
When creating the output audio format, use the sample rate and channel count of the input format instead of fixed values:
MediaFormat outputAudioFormat = MediaFormat.createAudioFormat(OUTPUT_AUDIO_MIME_TYPE,
        inputFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE),
        inputFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT));
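For completeness, the CTS sample also sets the bit rate and the AAC profile from the first change on this format before handing it to the encoder; a hedged reminder of what that looks like (OUTPUT_AUDIO_BIT_RATE is an assumed constant from that sample):
outputAudioFormat.setInteger(MediaFormat.KEY_BIT_RATE, OUTPUT_AUDIO_BIT_RATE);       // assumed constant, e.g. 128 * 1024
outputAudioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, OUTPUT_AUDIO_AAC_PROFILE); // AACObjectLC from the first change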
Put a check just before muxing the audio track to control the presentation timestamps (to avoid the "timestampUs X < lastTimestampUs X for Audio track" error):
if (audioPresentationTimeUsLast == 0) { // defined at the beginning of the method
    audioPresentationTimeUsLast = audioEncoderOutputBufferInfo.presentationTimeUs;
} else {
    if (audioPresentationTimeUsLast > audioEncoderOutputBufferInfo.presentationTimeUs) {
        audioEncoderOutputBufferInfo.presentationTimeUs = audioPresentationTimeUsLast + 1;
    }
    audioPresentationTimeUsLast = audioEncoderOutputBufferInfo.presentationTimeUs;
}
// Write data
if (audioEncoderOutputBufferInfo.size != 0) {
    muxer.writeSampleData(outputAudioTrack, encoderOutputBuffer, audioEncoderOutputBufferInfo);
}
Hope this helps...
If the original CTS tests fail, you need to go to the device vendor and ask for fixes.

Cricket Audio: while playing many CkSound effects, is it best to call destroy on each CkSound?

Using the Cricket Audio sound engine (iOS & Android), how would I set up a machine-gun-type sound effect? I need to be able to play many instances of a sound per second, and the sound effects need to layer on top of each other.
My solution is to create a new CkSound instance and forget about it. I don't see an easy way to destroy the sound without a complex sound-tracking method. Will this cause memory problems, since I am creating thousands of CkSounds over the course of a play session? I really don't want to have to keep track of individual sounds for garbage collection.
// Example sound effect call
void SoundManager::playEffect(const char* name) {
    // I make a sound, play it, and forget about it
    sound = CkSound::newBankSound(g_bank, name);
    sound->play();
}
I don't recommend creating instances and never destroying them, as this is a memory leak: your app will use more and more memory as time goes on.
You could try something like this.
to initialize:
const int k_maxSounds = 5; // maximum number of sound instances to be playing at once
CkSound* g_sounds[k_maxSounds];
for (int i = 0; i < k_maxSounds; ++i)
{
    g_sounds[i] = CkSound::newBankSound(g_bank, name);
}
to play another sound instance, find the first available instance and play it:
for (int i = 0; i < k_maxSounds; ++i)
{
    if (!g_sounds[i]->isPlaying())
    {
        g_sounds[i]->play();
        break;
    }
}
-Steve, Cricket Audio creator (answered via email)

Android : multiple audio tracks in a VideoView?

I've got some .MP4 video files that must be played in a VideoView in an Android activity. These videos include several audio tracks, each corresponding to a user language (e.g. English, French, Japanese...).
I've had unexpected trouble finding any help or documentation for providing such a feature. I'm currently able to load the video and play it in a VideoView with a MediaController, but not to change audio tracks.
I'm not sure the Android SDK provides any easy way to do this, which leaves me quite clueless about how to solve my problem. I was thinking of extracting every audio track, loading the one I want into a MediaPlayer depending on the language, then making the audio and video play together. But I fear that sync issues could arise and prevent me from doing this.
If you have any clue or advice to help me get started with this problem, you're more than welcome.
No 3rd party library required:
mVideoView.setVideoURI(Uri.parse("")); // set video source
mVideoView.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        MediaPlayer.TrackInfo[] trackInfoArray = mp.getTrackInfo();
        for (int i = 0; i < trackInfoArray.length; i++) {
            // you can switch out the language comparison logic to whatever works for you
            if (trackInfoArray[i].getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO
                    && trackInfoArray[i].getLanguage().equals(Locale.getDefault().getISO3Language())) {
                mp.selectTrack(i);
                break;
            }
        }
        return true;
    }
});
As far as I can tell, audio tracks should be tagged with 3-letter ISO 639-2 language codes in order to be recognized correctly.
I haven't tested it myself yet, but it seems that the Vitamio library has support for multiple audio tracks (among other interesting features). It is API-compatible with the VideoView class from Android.
You would probably have to use Vitamio's VideoView.setAudioTrack() to set the audio track (for example, based on locale). See the Vitamio API docs for details.
Now you can play multiple audio tracks through ExoPlayer.
Here are the details:
https://exoplayer.dev/track-selection.html (ExoPlayer Track Selection)
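If you go the ExoPlayer route, a common pattern is to let DefaultTrackSelector prefer an audio track by language instead of selecting track indices by hand. A rough sketch, treating the exact builder calls as an assumption tied to your ExoPlayer version ("fra" stands in for the French track mentioned in the question):
DefaultTrackSelector trackSelector = new DefaultTrackSelector(context);
trackSelector.setParameters(
        trackSelector.buildUponParameters()
                .setPreferredAudioLanguage("fra")); // ISO 639-2 code of the wanted audio track
ExoPlayer player = new ExoPlayer.Builder(context)
        .setTrackSelector(trackSelector)
        .build();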
The VideoView class can't support your requirement. You'd have to parse the container yourself to get the audio stream data you want and play it with the AudioTrack class on the Java layer.
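For the manual approach that last answer hints at, the extraction step would typically use MediaExtractor to match the audio track on its language tag; decoding with MediaCodec and output through AudioTrack are left out of this sketch, and the helper below is only illustrative:
// Illustrative helper: find the index of the audio track matching a 3-letter ISO 639-2 code
static int findAudioTrack(MediaExtractor extractor, String targetLanguage) {
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        String language = format.getString(MediaFormat.KEY_LANGUAGE);
        if (mime != null && mime.startsWith("audio/") && targetLanguage.equals(language)) {
            return i; // selectTrack(i) on the extractor, then feed its samples to a decoder and an AudioTrack
        }
    }
    return -1;
}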
