Playing dynamic url in exoplayer - android

I have a URL that changes every 3 seconds, so I make a request to the API every 2 seconds and refresh the URL. Each URL is only a valid m3u8 file for about 3 seconds; only the query parameters change each time, and it returns the same stream, just under a different link.
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory();
HlsMediaSource hlsMediaSource =
        new HlsMediaSource.Factory(dataSourceFactory)
                .createMediaSource(MediaItem.fromUri(dataItem.getVideo()));
concatenatingMediaSource = new ConcatenatingMediaSource();
concatenatingMediaSource.addMediaSource(hlsMediaSource);
player.setMediaSource(concatenatingMediaSource);
player.prepare();
player.setPlayWhenReady(true);
private void setLiveStreamData(String id) {
    Call<LiveStreamData> liveStreamDataCall = RetrofitBuilder.newCreate().getStreamLive(id);
    liveStreamDataCall.enqueue(new Callback<LiveStreamData>() {
        @Override
        public void onResponse(@NotNull Call<LiveStreamData> call, @NotNull Response<LiveStreamData> response) {
            if (response.isSuccessful() && response.body() != null) {
                DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory();
                HlsMediaSource hlsMediaSource =
                        new HlsMediaSource.Factory(dataSourceFactory)
                                .createMediaSource(MediaItem.fromUri(response.body().getUrl()));
                concatenatingMediaSource.addMediaSource(hlsMediaSource);
            }
        }
        @Override
        public void onFailure(@NotNull Call<LiveStreamData> call, @NotNull Throwable t) {
            Log.e(TAG, "onFailure: ", t);
        }
    });
}
I don't think I'm adding the media source to ExoPlayer correctly, because after 3 seconds ExoPlayer keeps playing the first link and throws an error; by then the old URL no longer returns an m3u8 file.
How can I set up such a structure correctly?
Playback error
com.google.android.exoplayer2.ExoPlaybackException: Source error

It looks like your use case is a Live HLS stream.
For live streams you should not have to worry about manually re-requesting the playlist (m3u8) file yourself when it updates, as the player will recognise that it is a live stream and request updates itself.
This is actually specified in the HLS RFC, along with guidance so that the player does not generate too many requests and overload the server:
The client MUST periodically reload a Media Playlist file to learn what media is currently available, unless it contains an EXT-X-PLAYLIST-TYPE tag with a value of VOD, or a value of EVENT and the EXT-X-ENDLIST tag is also present.
However, the client MUST NOT attempt to reload the Playlist file more frequently than specified by this section, in order to limit the collective load on the server.
(HLS RFC: https://datatracker.ietf.org/doc/html/rfc8216)
One important check is to make sure the manifest is correctly formatted for Live streams and in particular that it does not contain the EXT-X-ENDLIST tag as noted above and in the Apple HLS guidelines:
In live sessions, the index file is updated by removing media URIs from the file as new media files are created and made available. The EXT-X-ENDLIST tag isn't present in the live playlist, indicating that new media files will be added to the index file as they become available.
More info including the above at this link: https://developer.apple.com/documentation/http_live_streaming/example_playlists_for_http_live_streaming/live_playlist_sliding_window_construction
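In practice that means you should only need to hand the player the playlist URL once and let it handle refreshing the live playlist. A minimal sketch, reusing the same classes as the question's code (streamUrl stands in for dataItem.getVideo() or response.body().getUrl()):
// Hand ExoPlayer the .m3u8 URL once; for a live playlist (no EXT-X-ENDLIST)
// the player reloads the media playlist itself, so no ConcatenatingMediaSource
// and no periodic Retrofit refresh should be needed.
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory();
HlsMediaSource hlsMediaSource = new HlsMediaSource.Factory(dataSourceFactory)
        .createMediaSource(MediaItem.fromUri(streamUrl));
player.setMediaSource(hlsMediaSource);
player.prepare();
player.setPlayWhenReady(true);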

Related

LibVlc android getting all tracks

I can't find much documentation on the process of getting all of the media tracks (video, audio and subtitles) using libVLC on Android.
From what I understand, I have to parse the media, and I'm doing it like this:
Media media = new Media(libVLC, Uri.parse(url));
media.setEventListener(new IMedia.EventListener() {
    @Override
    public void onEvent(IMedia.Event event) {
        switch (event.type) {
            case IMedia.Event.ParsedChanged:
                if (event.getParsedStatus() == IMedia.ParsedStatus.Done) {
                    Log.i("App", "Parse done, track count " + media.getTrackCount());
                    Gson gson = new Gson();
                    for (int i = 0; i < media.getTrackCount(); i++) {
                        Log.i("App", "Track " + i + ": " + gson.toJson(media.getTrack(i)));
                    }
                }
                break;
        }
    }
});
media.parseAsync();
vlc.setMedia(media);
vlc.play();
The results I get from this are odd: sometimes I get one track only, the video track, but sometimes I also get the audio track, so two tracks total.
The problem is that the media also has a subtitle track, so there must be a way for me to get all three tracks (playing the exact same media with VLC on Windows does indeed show all three tracks).
What am I doing wrong?
Edit: I need a way to dynamically get all tracks, the media could have n tracks so I don't know the exact number. This is just a test and I know there are three tracks.
Thanks
If you are not able to get the tracks from the Media object, use the VLC MediaPlayer object instead; it provides methods to get the audio, video and subtitle tracks.
mMediaPlayer!!.setEventListener { event ->
    when (event.type) {
        MediaPlayer.Event.Opening -> {
            val audioTracks = mMediaPlayer!!.audioTracks
            val subtitleTracks = mMediaPlayer!!.spuTracks
            val videoTracks = mMediaPlayer!!.videoTracks
        }
    }
}
You can iterate over the lists to get individual tracks.
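For example, a rough sketch of that iteration in Java (this assumes the libVLC MediaPlayer exposes the tracks as TrackDescription arrays with an id and a name, and that the arrays can be null before the tracks are known):
// Sketch only: list the subtitle (SPU) tracks reported by the player.
MediaPlayer.TrackDescription[] spuTracks = mMediaPlayer.getSpuTracks();
if (spuTracks != null) {
    for (MediaPlayer.TrackDescription track : spuTracks) {
        Log.i("App", "Subtitle track " + track.id + ": " + track.name);
    }
    // A specific track can then be selected by its id, e.g. mMediaPlayer.setSpuTrack(spuTracks[0].id);
}
The same pattern applies to the audio and video track lists.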

MediaPlayer prepare() hangs on custom device

I'm currently implementing streaming internet audio on a custom tablet (i.e. not Samsung, Google, etc.). I developed the code and tested it on a Samsung tablet, which worked fine. When installing on the custom tablet, the MediaPlayer hangs on the call to prepare(). I have implemented OnPreparedListener, OnErrorListener, OnBufferingUpdateListener, and OnCompletionListener, all with logs for basic troubleshooting. I am implementing the streaming in a class that extends Service and uses a separate thread. I have tried both prepare() and prepareAsync().
To clarify, when I say the MediaPlayer hangs, I mean I receive NO error messages, crashes, log output, etc. The app continues to function, but onPrepared(...) never gets called.
I configure the player like this:
private void configureMediaPlayer(String streamUrl) {
    mMediaPlayer = new MediaPlayer();
    AudioAttributes attributes = new AudioAttributes.Builder()
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .build();
    mMediaPlayer.setAudioAttributes(attributes);
    Log.d(TAG, "Media player configured");
    playStation(streamUrl);
}
I attempt to play the music stream like this:
private void playStation(String streamUrl) {
    Log.d(TAG, "playStation: " + streamUrl);
    if (mMediaPlayer.isPlaying()) {
        mMediaPlayer.stop();
        mMediaPlayer.reset();
        Log.d(TAG, "Stopped media player.");
    }
    try {
        mMediaPlayer.setOnPreparedListener(this);
        mMediaPlayer.setOnErrorListener(this);
        mMediaPlayer.setOnBufferingUpdateListener(this);
        mMediaPlayer.setOnCompletionListener(this);
        mMediaPlayer.setDataSource(streamUrl);
        new Handler(getMainLooper()).post(() -> Log.d(TAG, "Main thread: " + Thread.currentThread().getId()));
        Log.d(TAG, "Preparing stream on thread: " + Thread.currentThread().getId() + ".");
        mMediaPlayer.prepare();
        Log.d(TAG, "Waiting for media player to be prepared");
    } catch (IOException e) {
        // TODO: Gracefully inform user of failure
        e.printStackTrace();
    }
}
and I implement onPrepared(MediaPlayer mp) to start playing the stream:
@Override
public void onPrepared(MediaPlayer mp) {
    Log.d(TAG, "Media player prepared. Starting playback.");
    mp.start();
}
Again, this works great on the Samsung tablet, but, due to the nature of the project, we have to use custom tablets provided by the client. We use VideoViews in some places, which seem to work and play audio, but those are local files on the device (not sure if that matters). The devices use a 4G network and all web API calls work fine, so I don't think it's anything network-related. I just find it odd that it hangs forever (I've let it sit for 30 minutes with no crashes, errors, logs, etc.). The app is totally responsive during this time.
I'm not sure if there is some other configuration I can use which might help. Any help would be greatly appreciated.
UPDATE
I've tried loading a local audio file by replacing:
setDataSource(streamUrl);
with:
setDataSource(this, Uri.parse(filename));
and it played the audio without a problem.
UPDATE 2
I've now tried passing in an invalid URL to the setDataSource(...) method and the OnErrorListener is called with the error that it can't load the resource. Also, I passed in the URL that should work, but stripped a few characters from the end. It seemed to recognize it as a potentially valid stream because it posted error logs and retried loading the resource 10 times before failing in the OnErrorListener.
Here are the logs I receive when prepare() (also tried with prepareAsync()) is called:
05-15 11:02:33.149 252-26756/? D/FslExtractor: FslExtractor::FslExtractor mime=audio/mpeg
05-15 11:02:33.150 252-26756/? D/FslExtractor: FslExtractor::Init BEGIN
GetLibraryName lib_mp3_parser_arm11_elinux.3.0.so
load parser name lib_mp3_parser_arm11_elinux.3.0.so
FslExtractor::CreateParserInterface success
05-15 11:02:33.150 252-26756/? I/FslExtractor: Core parser MP3PARSER_03.01.15 build on Nov 17 2016 13:55:34
05-15 11:02:33.150 252-26756/? D/FslExtractor: createParser2 flag=5e,err=0
05-15 11:02:33.150 252-26756/? I/FslExtractor: mReadMode=0,mNumTracks=1
bSeekable 1
05-15 11:02:33.150 252-26756/? D/FslExtractor: FslExtractor::ParseMediaFormat BEGIN
ParseAudio index=0,type=5,subtype=2829696637
It looks in these logs like it's beginning to parse the audio, but nothing happens afterward.
Well, since there were no crashes or decent logs that pointed to anything, I switched to using Google's ExoPlayer, which seems to be working just fine.
private void configureMediaPlayer(String streamUrl) {
    Log.d(TAG, "configureMediaPlayer: " + streamUrl);
    DefaultBandwidthMeter meter = new DefaultBandwidthMeter();
    mMediaPlayer = ExoPlayerFactory.newSimpleInstance(this,
            new DefaultTrackSelector(
                    new AdaptiveTrackSelection.Factory(meter)));
    DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this,
            Util.getUserAgent(this, "appName"), meter);
    mMediaSourceFactory = new ExtractorMediaSource.Factory(dataSourceFactory);
    mMediaPlayer.setPlayWhenReady(true);
    MediaSource mediaSource = mMediaSourceFactory.createMediaSource(Uri.parse(streamUrl));
    Log.d(TAG, "playTuneInStation: media source set");
    mMediaPlayer.addListener(this);
    mMediaPlayer.prepare(mediaSource);
}
If anybody has any ideas of why the original idea using MediaPlayer didn't work, please feel free to comment or post another answer. Thanks.

How to send readable part of mp4 while recording it

I'm working on an app that records video in the background and sends it to a server in parts, by reading bytes and storing them in a byte array. For now the algorithm is pretty simple:
start recording;
read part of the video file into a byte array;
send the byte array via POST (with the help of Retrofit).
The problem occurs if the connection is interrupted and the last part isn't sent. The server can't produce a readable video file, because the moov atom is only written after recording stops. My question: is it somehow possible to make complete video files from the byte-array parts, or in any other way? I can change the video codec if that would solve the problem.
p.s. I can only send data via POST.
p.p.s. I can't change anything on the server side, including streaming video directly to the server.
SOLUTION
I decided to record small chunks of video recursively. The following solution is for the original Camera API; if you're using Camera2 or something else, you can try the same algorithm.
In the service class that records video, make sure the MediaRecorder is configured like this:
mediaRecorder.setMaxDuration(10000);
// or
mediaRecorder.setMaxFileSize(10000);
Then implement the OnInfoListener interface like this:
mediaRecorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
    @Override
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
            // Use the next condition if you decided to use max file size:
            // if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED)
            stopRecording();
            setRecordingStatus(false);
            startRecording(surfaceHolder);
        }
    }
});
Don't forget to pass the surfaceHolder instance to the next iteration, otherwise you may get an "Application lost surface" error.
Next, declare a FileObserver in the onCreate method:
FileObserver fileObserver = new FileObserver(pathToFolder, FileObserver.CLOSE_WRITE) {
    // The FileObserver.CLOSE_WRITE mask means this observer is triggered when a file
    // opened for writing is closed (i.e. the system has finished writing the chunk).
    @Override
    public void onEvent(int event, String path) {
        // Here path is the name of the file (with extension), not the full path to the file.
        if (event == FileObserver.CLOSE_WRITE && path.endsWith(".mp4")) {
            String name = String.valueOf(Long.parseLong(path.substring(0, path.length() - 4)) / 1000);
            sendNewVideo(pathToFolder + "/" + path, name);
        }
    }
};
In onStartCommand method:
fileObserver.startWatching();
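The sendNewVideo(...) call above is not shown in the original code. As a rough illustration only, it could be a Retrofit multipart upload along these lines (the uploadApi field, its interface and the uploadChunk endpoint are hypothetical names, not part of the original project):
// Hypothetical sketch of sendNewVideo(...) as a Retrofit multipart POST.
private void sendNewVideo(String filePath, String name) {
    File videoFile = new File(filePath);
    RequestBody fileBody = RequestBody.create(MediaType.parse("video/mp4"), videoFile);
    MultipartBody.Part part =
            MultipartBody.Part.createFormData("video", videoFile.getName(), fileBody);
    RequestBody nameBody = RequestBody.create(MediaType.parse("text/plain"), name);
    uploadApi.uploadChunk(nameBody, part).enqueue(new Callback<ResponseBody>() {
        @Override
        public void onResponse(Call<ResponseBody> call, Response<ResponseBody> response) {
            // Optionally delete the local chunk once the server confirms receipt.
        }
        @Override
        public void onFailure(Call<ResponseBody> call, Throwable t) {
            // Keep the file and retry later if the connection dropped.
        }
    });
}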

How to use ExoPlayer

I want to use ExoPlayer in my app. Could you please tell me what the simplest example is? I have tried to follow https://github.com/google/ExoPlayer/ but it's not easy for me. I tried to import the library as a module, and then I received a bintray-release error.
As stated in the main Readme.md, you can import ExoPlayer as you would any other dependency:
In your app build.gradle, under dependencies, add:
compile 'com.google.android.exoplayer:exoplayer:rX.X.X'
The current version is r1.5.1 as of October 27, 2015. See here.
Old question but since there are too few simple ExoPlayer tutorials out there, I wrote this up. I recently converted an app I have from using Android's default media player to ExoPlayer. The performance gains are amazing and it works on a wider range of devices. It is a bit more complicated, however.
This example is tailored specifically to playing an HTTP audio stream, but by experimenting you can probably adapt it easily to anything else. It uses the latest v1.x of ExoPlayer, currently r1.5.11:
First, put this in your build.gradle (Module: app) file, under "dependencies":
compile 'com.google.android.exoplayer:exoplayer:r1.5.11'
Also your class should implement ExoPlayer.Listener:
...implements ExoPlayer.Listener
Now here's the relevant code to play an http audio stream:
private static final int RENDERER_COUNT = 1; // since we want to render simple audio
private static final int BUFFER_SEGMENT_SIZE = 64 * 1024; // for http mp3 audio stream use these values
private static final int BUFFER_SEGMENT_COUNT = 256; // for http mp3 audio stream use these values
private ExoPlayer exoPlayer;
// for http mp3 audio stream, use these values
int minBufferMs = 1000;
int minRebufferMs = 5000;
// Prepare ExoPlayer
exoPlayer = ExoPlayer.Factory.newInstance(RENDERER_COUNT, minBufferMs, minRebufferMs);
// String with the url of the stream to play
String stream_location = "http://audio_stream_url";
// Convert String URL to Uri
Uri streamUri = Uri.parse(stream_location);
// Settings for ExoPlayer
Allocator allocator = new DefaultAllocator(BUFFER_SEGMENT_SIZE);
String userAgent = Util.getUserAgent(ChicagoPoliceRadioService.this, "ExoPlayer_Test");
DataSource dataSource = new DefaultUriDataSource(ChicagoPoliceRadioService.this, null, userAgent);
ExtractorSampleSource sampleSource = new ExtractorSampleSource(
        streamUri, dataSource, allocator, BUFFER_SEGMENT_SIZE * BUFFER_SEGMENT_COUNT);
MediaCodecAudioTrackRenderer audioRenderer = new MediaCodecAudioTrackRenderer(sampleSource, MediaCodecSelector.DEFAULT);
// Attach the listener we implemented in this class to this ExoPlayer instance
exoPlayer.addListener(this);
// Prepare ExoPlayer
exoPlayer.prepare(audioRenderer);
// Set full volume
exoPlayer.sendMessage(audioRenderer, MediaCodecAudioTrackRenderer.MSG_SET_VOLUME, 1f);
// Play!
exoPlayer.setPlayWhenReady(true);
There are three callback methods:
@Override
public void onPlayWhenReadyCommitted() {
    // No idea what would go here, I left it empty
}
// Called when ExoPlayer state changes
@Override
public void onPlayerStateChanged(boolean playWhenReady, int playbackState) {
    // If playbackState equals STATE_READY (4), that means ExoPlayer is set to
    // play and there are no errors
    if (playbackState == ExoPlayer.STATE_READY) {
        // ExoPlayer prepared and ready, no error
        // Put code here, same as "onPrepared()"
    }
}
// Called on ExoPlayer error
@Override
public void onPlayerError(ExoPlaybackException error) {
    // ExoPlayer error occurred
    // Put your error code here
}
And when you're done playing do the usual:
if (exoPlayer != null) {
    exoPlayer.stop();
    exoPlayer.release();
}
NOTE: I'm still not 100% sure about the details of all of the ExoPlayer settings. I've never tried playing video. Note this is for version 1.5.x of ExoPlayer, 2.0 changed a lot and I still haven't figured it out. I do highly recommend this code to anyone who has an app that streams audio from the web as the performance gains are incredible and for my app it fixed an issue with Samsung phones that would only play about 30sec of audio before stopping.
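For anyone on ExoPlayer 2.x: the API changed substantially from v1.5.x, so the code above is not a drop-in fit. A rough, untested sketch of the same HTTP audio stream playback with a recent 2.x release (class names are from the 2.x API, not the original answer) looks something like this:
// Sketch for a recent ExoPlayer 2.x release; not a line-for-line equivalent of the v1.5.x code above.
SimpleExoPlayer exoPlayer = new SimpleExoPlayer.Builder(context).build();
exoPlayer.setMediaItem(MediaItem.fromUri("http://audio_stream_url"));
exoPlayer.addListener(new Player.Listener() {
    @Override
    public void onPlaybackStateChanged(int playbackState) {
        if (playbackState == Player.STATE_READY) {
            // Roughly where onPlayerStateChanged(..., STATE_READY) fired in v1.x
        }
    }
});
exoPlayer.prepare();
exoPlayer.setPlayWhenReady(true);
// And when you're done:
exoPlayer.release();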

How to play video in video view from url based on buffer %age in android?

I have a link to a video on an S3 server and I am playing this video in a VideoView. The video plays properly, but the problem is that it first downloads the entire video and then plays it.
I want it to play from the buffer instead: if 20% of the video has downloaded, it should play that part and keep downloading (like YouTube). Here is what I have done:
FFmpegMediaMetadataRetriever mediaMetadataRetriever = new FFmpegMediaMetadataRetriever();
AWSCredentials myCredentials = new BasicAWSCredentials(
        "AKIAIGOIY4LLB7EMACGQ",
        "7wNQeY1JC0uyMaGYhKBKc9V7QC7X4ecBtyLimt2l");
AmazonS3 s3client = new AmazonS3Client(myCredentials);
GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest(
        "mgvtest", videoUrl);
URL objectURL = s3client.generatePresignedUrl(request);
try {
    mediaMetadataRetriever.setDataSource(videoUrl);
} catch (Exception e) {
    utilDialog.showDialog("Unable to load this video",
            utilDialog.ALERT_DIALOG);
    pb.setVisibility(View.INVISIBLE);
}
videoView.setVideoURI(Uri.parse(videoUrl));
MediaController myMediaController = new MediaController(this);
// myMediaController.setMediaPlayer(videoView);
videoView.setMediaController(myMediaController);
videoView.setOnCompletionListener(myVideoViewCompletionListener);
videoView.setOnPreparedListener(MyVideoViewPreparedListener);
videoView.setOnErrorListener(myVideoViewErrorListener);
videoView.requestFocus();
videoView.start();
Listeners
MediaPlayer.OnCompletionListener myVideoViewCompletionListener = new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer arg0) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "End of Video",
        //         Toast.LENGTH_LONG).show();
    }
};
MediaPlayer.OnPreparedListener MyVideoViewPreparedListener = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        pb.setVisibility(View.INVISIBLE);
        imgScreenshot.setVisibility(View.VISIBLE);
        tvScreenshot.setVisibility(View.VISIBLE);
        // final Animation in = new AlphaAnimation(0.0f, 1.0f);
        // in.setDuration(3000);
        // tvScreenshot.startAnimation(in);
        Animation animation = AnimationUtils.loadAnimation(
                getApplicationContext(), R.anim.zoom_in);
        tvScreenshot.startAnimation(animation);
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                tvScreenshot.setVisibility(View.INVISIBLE);
            }
        }, 3000);
    }
};
MediaPlayer.OnErrorListener myVideoViewErrorListener = new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "Error!!!",
        //         Toast.LENGTH_LONG).show();
        return true;
    }
};
To be able to start playing an mp4 video before it has fully downloaded, the video has to have the metadata at the start of the file rather than the end - unfortunately, with standard mp4 the default is usually to have it at the end.
The metadata is in an 'atom' or 'box' (basically a data structure within the mp4 file) and can be moved to the start. This is usually referred to as faststart, and tools such as ffmpeg will allow you to do this. The following is an extract from the ffmpeg documentation:
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file, it can be moved to the start for better playback by adding faststart to the movflags, or using the qt-faststart tool).
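For example, a typical invocation (shown for illustration; assumes a reasonably recent ffmpeg build) that remuxes without re-encoding and moves the moov atom to the front would be:
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4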
There are other tools and software which will allow you to do this also - e.g. the one mentioned in the ffmpeg extract above:
http://multimedia.cx/eggs/improving-qt-faststart/
If you actually want full streaming, where the server breaks the file into chunks and these are downloaded one by one by the client, then you probably want to use one of the adaptive bit rate protocols (Apple's HLS, MS's Smooth Streaming, Adobe Adaptive Streaming or the newer open standard DASH). This also allows you to offer different bit rates for different network conditions. You will need a server that supports this functionality to use these techniques. This may be overkill if you just want a simple site with a single video and will not have too much traffic.
Actually, you have to set up CloudFront with S3 so you can stream S3 videos.
Check out this link for more information:
http://www.miracletutorials.com/s3-streaming-video-with-cloudfront/
