I'm currently implementing streaming internet audio on a custom tablet (i.e. not Samsung, Google, etc.). I developed and tested the code on a Samsung tablet, where it worked fine. When installed on the custom tablet, however, the MediaPlayer hangs on the call to prepare(). I have implemented OnPreparedListener, OnErrorListener, OnBufferingUpdateListener, and OnCompletionListener, all with logs for basic troubleshooting. The streaming is implemented in a class that extends Service and uses a separate thread. I have tried both prepare() and prepareAsync().
To clarify, when I say the MediaPlayer hangs, I mean I receive NO error messages, crashes, or log output. The app continues to function, but onPrepared(...) never gets called.
I configure the player like this:
private void configureMediaPlayer(String streamUrl) {
    mMediaPlayer = new MediaPlayer();
    AudioAttributes attributes = new AudioAttributes.Builder()
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .build();
    mMediaPlayer.setAudioAttributes(attributes);
    Log.d(TAG, "Media player configured");
    playStation(streamUrl);
}
I attempt to play the music stream like this:
private void playStation(String streamUrl) {
    Log.d(TAG, "playStation: " + streamUrl);
    if (mMediaPlayer.isPlaying()) {
        mMediaPlayer.stop();
        mMediaPlayer.reset();
        Log.d(TAG, "Stopped media player.");
    }
    try {
        mMediaPlayer.setOnPreparedListener(this);
        mMediaPlayer.setOnErrorListener(this);
        mMediaPlayer.setOnBufferingUpdateListener(this);
        mMediaPlayer.setOnCompletionListener(this);
        mMediaPlayer.setDataSource(streamUrl);
        new Handler(getMainLooper()).post(() -> Log.d(TAG, "Main thread: " + Thread.currentThread().getId()));
        Log.d(TAG, "Preparing stream on thread: " + Thread.currentThread().getId() + ".");
        mMediaPlayer.prepare();
        Log.d(TAG, "Waiting for media player to be prepared");
    } catch (IOException e) {
        //TODO: Gracefully inform user of failure
        e.printStackTrace();
    }
}
and I implement onPrepared(MediaPlayer mp) to start playing the stream:
@Override
public void onPrepared(MediaPlayer mp) {
    Log.d(TAG, "Media player prepared. Starting playback.");
    mp.start();
}
Again, this works great on the Samsung tablet but, due to the nature of the project, we have to use custom tablets provided by the client. We use VideoViews in some places, which seem to work and play audio, but those play local files on the device (not sure if that matters). The devices use a 4G network and all web API calls work fine, so I don't think it's anything network-related. I just find it odd that it hangs forever (I've let it sit for 30 minutes with no crashes, errors, or logs). The app is totally responsive during this time.
I'm not sure if there is some other configuration I can use which might help. Any help would be greatly appreciated.
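For anyone debugging a similar silent hang: one way to at least surface the failure is to arm a watchdog right before prepareAsync() and disarm it in onPrepared(). This is a minimal sketch using plain java.util.concurrent; the class name, timeout value, and callback are my own, not part of the app above.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Fires a timeout callback unless onPrepared() disarms it in time.
public class PrepareWatchdog {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private ScheduledFuture<?> pending;

    // Call right before mediaPlayer.prepareAsync().
    public synchronized void arm(long timeoutMs, Runnable onTimeout) {
        pending = scheduler.schedule(onTimeout, timeoutMs, TimeUnit.MILLISECONDS);
    }

    // Call from onPrepared() to cancel the pending timeout.
    public synchronized void disarm() {
        if (pending != null) pending.cancel(false);
    }

    public void shutdown() { scheduler.shutdownNow(); }
}
```

The timeout handler can then log the stall, reset the player, or fall back to another playback path, rather than waiting forever with no output.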
UPDATE
I've tried loading a local audio file by replacing:
setDataSource(streamUrl);
with:
setDataSource(this, Uri.parse(filename));
and it played the audio without a problem.
UPDATE 2
I've now tried passing an invalid URL to the setDataSource(...) method, and the OnErrorListener is called with an error saying it can't load the resource. I also passed in the URL that should work but with a few characters stripped from the end. The player seemed to recognize it as a potentially valid stream, because it posted error logs and retried loading the resource 10 times before failing in the OnErrorListener.
Here are the logs I receive when prepare() (also tried with prepareAsync()) is called:
05-15 11:02:33.149 252-26756/? D/FslExtractor: FslExtractor::FslExtractor mime=audio/mpeg
05-15 11:02:33.150 252-26756/? D/FslExtractor: FslExtractor::Init BEGIN
GetLibraryName lib_mp3_parser_arm11_elinux.3.0.so
load parser name lib_mp3_parser_arm11_elinux.3.0.so
FslExtractor::CreateParserInterface success
05-15 11:02:33.150 252-26756/? I/FslExtractor: Core parser MP3PARSER_03.01.15 build on Nov 17 2016 13:55:34
05-15 11:02:33.150 252-26756/? D/FslExtractor: createParser2 flag=5e,err=0
05-15 11:02:33.150 252-26756/? I/FslExtractor: mReadMode=0,mNumTracks=1
bSeekable 1
05-15 11:02:33.150 252-26756/? D/FslExtractor: FslExtractor::ParseMediaFormat BEGIN
ParseAudio index=0,type=5,subtype=2829696637
From these logs it looks like it begins to parse the audio, but nothing happens afterward.
Well, since there were no crashes or decent logs that pointed to anything, I switched to using Google's ExoPlayer, which seems to be working just fine.
private void configureMediaPlayer(String streamUrl) {
    Log.d(TAG, "configureMediaPlayer: " + streamUrl);
    DefaultBandwidthMeter meter = new DefaultBandwidthMeter();
    mMediaPlayer = ExoPlayerFactory.newSimpleInstance(this,
            new DefaultTrackSelector(
                    new AdaptiveTrackSelection.Factory(meter)));
    DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this,
            Util.getUserAgent(this, "appName"), meter);
    mMediaSourceFactory = new ExtractorMediaSource.Factory(dataSourceFactory);
    mMediaPlayer.setPlayWhenReady(true);
    MediaSource mediaSource = mMediaSourceFactory.createMediaSource(Uri.parse(streamUrl));
    Log.d(TAG, "configureMediaPlayer: media source set");
    mMediaPlayer.addListener(this);
    mMediaPlayer.prepare(mediaSource);
}
If anybody has any ideas of why the original idea using MediaPlayer didn't work, please feel free to comment or post another answer. Thanks.
Related
I have a URL that changes every 3 seconds, and I make a request every 2 seconds to refresh it. Each refresh returns a valid m3u8 file. Only the query parameters in the URL change every 3 seconds; the server returns the same content, just under a different link.
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory();
HlsMediaSource hlsMediaSource =
        new HlsMediaSource.Factory(dataSourceFactory)
                .createMediaSource(MediaItem.fromUri(dataItem.getVideo()));
concatenatingMediaSource = new ConcatenatingMediaSource();
concatenatingMediaSource.addMediaSource(hlsMediaSource);
player.setMediaSource(concatenatingMediaSource);
player.prepare();
player.setPlayWhenReady(true);
private void setLiveStreamData(String id) {
    Call<LiveStreamData> liveStreamDataCall = RetrofitBuilder.newCreate().getStreamLive(id);
    liveStreamDataCall.enqueue(new Callback<LiveStreamData>() {
        @Override
        public void onResponse(@NotNull Call<LiveStreamData> call, @NotNull Response<LiveStreamData> response) {
            if (response.isSuccessful() && response.body() != null) {
                DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory();
                HlsMediaSource hlsMediaSource =
                        new HlsMediaSource.Factory(dataSourceFactory)
                                .createMediaSource(MediaItem.fromUri(response.body().getUrl()));
                concatenatingMediaSource.addMediaSource(hlsMediaSource);
            }
        }

        @Override
        public void onFailure(@NotNull Call<LiveStreamData> call, @NotNull Throwable t) {
            Log.e(TAG, "onFailure: ", t);
        }
    });
}
I may not be setting up ExoPlayer correctly, because after 3 seconds ExoPlayer keeps playing the first link and gives an error; by then the old URL no longer returns an m3u8 file.
How can I set up such a structure correctly?
Playback error
com.google.android.exoplayer2.ExoPlaybackException: Source error
It looks like your use case is a Live HLS stream.
For Live playback you should not have to worry about manually re-requesting the playlist file yourself when it updates, as the player will recognise that it is a Live stream and request playlist updates itself.
This is actually specified in the HLS RFC along with guidance so the player does not generate too many requests and overload the server:
The client MUST periodically reload a Media Playlist file to learn what media is currently available, unless it contains an EXT-X-PLAYLIST-TYPE tag with a value of VOD, or a value of EVENT and the EXT-X-ENDLIST tag is also present.
However, the client MUST NOT attempt to reload the Playlist file more frequently than specified by this section, in order to limit the collective load on the server.
(HLS RFC: https://datatracker.ietf.org/doc/html/rfc8216)
One important check is to make sure the manifest is correctly formatted for Live streams and in particular that it does not contain the EXT-X-ENDLIST tag as noted above and in the Apple HLS guidelines:
In live sessions, the index file is updated by removing media URIs from the file as new media files are created and made available. The EXT-X-ENDLIST tag isn't present in the live playlist, indicating that new media files will be added to the index file as they become available.
More info including the above at this link: https://developer.apple.com/documentation/http_live_streaming/example_playlists_for_http_live_streaming/live_playlist_sliding_window_construction
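That check can be automated. A small sketch (plain Java; the class and method names are mine) that classifies a playlist as live only when it carries neither EXT-X-ENDLIST nor a VOD playlist type:

```java
public class PlaylistCheck {
    // Returns true if the playlist looks like a live (sliding-window)
    // playlist per RFC 8216: no EXT-X-ENDLIST tag and not marked as VOD.
    public static boolean looksLikeLivePlaylist(String playlist) {
        for (String line : playlist.split("\r?\n")) {
            String l = line.trim();
            if (l.equals("#EXT-X-ENDLIST")) return false;
            if (l.equals("#EXT-X-PLAYLIST-TYPE:VOD")) return false;
        }
        return true;
    }
}
```

Running this against the manifest your server produces every 3 seconds should quickly tell you whether the player sees a live stream it will keep reloading, or a finished VOD playlist it will play once and then error on.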
I'm using Cordova to build my mobile app and I need to record sounds.
I'm using the media-capture plugin which launches the default android recorder app within this function:
private void captureAudio() {
Intent intent = new Intent(android.provider.MediaStore.Audio.Media.RECORD_SOUND_ACTION);
this.cordova.startActivityForResult((CordovaPlugin) this, intent, CAPTURE_AUDIO);
}
The problem is that after I get the file path and try to getAudioVideoData (which contains information like "duration"), the audio recording format (which defaults to .amr) apparently cannot be parsed, and an exception is thrown.
private JSONObject getAudioVideoData(String filePath, JSONObject obj, boolean video) throws JSONException {
    MediaPlayer player = new MediaPlayer();
    try {
        player.setDataSource(filePath);
        player.prepare();
        obj.put("duration", player.getDuration() / 1000);
        if (video) {
            obj.put("height", player.getVideoHeight());
            obj.put("width", player.getVideoWidth());
        }
    } catch (IOException e) {
        Log.d(LOG_TAG, "Error: loading video file");
    }
    return obj;
}
I know that the problem is the media format, because on my older Android device running 4.4.4, the Sound Recorder app has settings where I can change the file type, and if I set it to .wav, then getAudioVideoData works!
I have tried to add the following inside captureAudio() before startActivityForResult():
intent.putExtra(android.provider.MediaStore.Audio.Media.ENTRY_CONTENT_TYPE, "audio/aac");
intent.putExtra(android.provider.MediaStore.Audio.Media.MIME_TYPE, "audio/aac");
intent.putExtra(android.provider.MediaStore.Audio.Media.CONTENT_TYPE, "audio/aac");
...but with no success.
I couldn't find a way to influence the output of the Sound Recorder app via the intent, but I solved the main problem, which was that I couldn't read the recorded audio file's metadata (the duration property).
Fixed with this PR: https://github.com/apache/cordova-plugin-media-capture/pull/50
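If anyone needs the duration without going through MediaPlayer at all: AMR-NB is simple enough to parse directly, since every frame represents 20 ms of audio and the frame size is determined by the frame-type bits of its first byte. A sketch below; the frame-size table follows the common AMR-NB file layout, and the class and method names are my own.

```java
public class AmrDuration {
    // Frame sizes in bytes (including the 1-byte frame header) for
    // AMR-NB frame types 0..7; other types (SID/NO_DATA) are treated
    // as 1-byte frames here for simplicity.
    private static final int[] FRAME_SIZES = {13, 14, 16, 18, 20, 21, 27, 32};

    // Returns the duration in milliseconds, or -1 if the "#!AMR\n"
    // magic header is missing.
    public static long durationMs(byte[] amr) {
        byte[] magic = "#!AMR\n".getBytes(java.nio.charset.StandardCharsets.US_ASCII);
        if (amr.length < magic.length) return -1;
        for (int i = 0; i < magic.length; i++) {
            if (amr[i] != magic[i]) return -1;
        }
        int pos = magic.length;
        long frames = 0;
        while (pos < amr.length) {
            int frameType = (amr[pos] >> 3) & 0x0F;  // FT bits of the frame header
            int size = frameType < FRAME_SIZES.length ? FRAME_SIZES[frameType] : 1;
            pos += size;
            frames++;
        }
        return frames * 20;  // each AMR frame covers 20 ms
    }
}
```

This only handles single-channel AMR-NB files, which is what the stock recorder produces, but it sidesteps the parser that was throwing the exception.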
I have a link to a video on an S3 server and I am playing it in a VideoView. The video plays properly, but the problem is that it first downloads the entire video and then plays it.
I want it to buffer progressively: if 20% of the video has downloaded, it should play that part while the rest keeps downloading (like YouTube). Here is my code; what I have done is:
FFmpegMediaMetadataRetriever mediaMetadataRetriever = new FFmpegMediaMetadataRetriever();
AWSCredentials myCredentials = new BasicAWSCredentials(
        "YOUR_AWS_ACCESS_KEY_ID",
        "YOUR_AWS_SECRET_ACCESS_KEY");
AmazonS3 s3client = new AmazonS3Client(myCredentials);
GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest(
        "mgvtest", videoUrl);
URL objectURL = s3client.generatePresignedUrl(request);
try {
    mediaMetadataRetriever.setDataSource(videoUrl);
} catch (Exception e) {
    utilDialog.showDialog("Unable to load this video",
            utilDialog.ALERT_DIALOG);
    pb.setVisibility(View.INVISIBLE);
}
videoView.setVideoURI(Uri.parse(videoUrl));
MediaController myMediaController = new MediaController(this);
// myMediaController.setMediaPlayer(videoView);
videoView.setMediaController(myMediaController);
videoView.setOnCompletionListener(myVideoViewCompletionListener);
videoView.setOnPreparedListener(MyVideoViewPreparedListener);
videoView.setOnErrorListener(myVideoViewErrorListener);
videoView.requestFocus();
videoView.start();
Listeners
MediaPlayer.OnCompletionListener myVideoViewCompletionListener = new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer arg0) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "End of Video",
        // Toast.LENGTH_LONG).show();
    }
};

MediaPlayer.OnPreparedListener MyVideoViewPreparedListener = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        pb.setVisibility(View.INVISIBLE);
        imgScreenshot.setVisibility(View.VISIBLE);
        tvScreenshot.setVisibility(View.VISIBLE);
        // final Animation in = new AlphaAnimation(0.0f, 1.0f);
        // in.setDuration(3000);
        // tvScreenshot.startAnimation(in);
        Animation animation = AnimationUtils.loadAnimation(
                getApplicationContext(), R.anim.zoom_in);
        tvScreenshot.startAnimation(animation);
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                tvScreenshot.setVisibility(View.INVISIBLE);
            }
        }, 3000);
    }
};

MediaPlayer.OnErrorListener myVideoViewErrorListener = new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "Error!!!",
        // Toast.LENGTH_LONG).show();
        return true;
    }
};
To be able to start playing an mp4 video before it has fully downloaded, the video has to have the metadata at the start of the file rather than the end; unfortunately, with standard mp4 the default is usually to have it at the end.
The metadata lives in an 'atom' or 'box' (basically a data structure within the mp4 file) and can be moved to the start. This is usually referred to as faststart, and tools such as ffmpeg will allow you to do this. The following is an extract from the ffmpeg documentation:
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file, it can be moved to the start for better playback by adding faststart to the movflags, or using the qt-faststart tool).
There are other tools and software which will allow you do this also - e.g. the one mentioned in the ffmpeg extract above:
http://multimedia.cx/eggs/improving-qt-faststart/
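The atom ordering can also be verified programmatically: an MP4 file is a sequence of top-level boxes, each starting with a 4-byte big-endian size followed by a 4-byte type, and "faststart" simply means the moov box appears before the mdat box. A rough sketch (box layout per the ISO base media file format; it ignores 64-bit box sizes for brevity, and the class name is mine):

```java
public class Mp4BoxScan {
    // Returns the byte offset of the first top-level box with the given
    // 4-character type, or -1 if not found. Assumes 32-bit box sizes.
    public static long findBox(byte[] mp4, String type) {
        long pos = 0;
        while (pos + 8 <= mp4.length) {
            long size = ((mp4[(int) pos] & 0xFFL) << 24)
                      | ((mp4[(int) pos + 1] & 0xFFL) << 16)
                      | ((mp4[(int) pos + 2] & 0xFFL) << 8)
                      |  (mp4[(int) pos + 3] & 0xFFL);
            String boxType = new String(mp4, (int) pos + 4, 4,
                    java.nio.charset.StandardCharsets.US_ASCII);
            if (boxType.equals(type)) return pos;
            if (size < 8) break;  // malformed or 64-bit size; stop scanning
            pos += size;
        }
        return -1;
    }

    // True if the metadata (moov) precedes the media data (mdat),
    // i.e. the file can begin playback while still downloading.
    public static boolean isFastStart(byte[] mp4) {
        long moov = findBox(mp4, "moov");
        long mdat = findBox(mp4, "mdat");
        return moov >= 0 && mdat >= 0 && moov < mdat;
    }
}
```

Checking the first few kilobytes of the S3 object this way would confirm whether re-encoding with faststart is actually needed.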
If you actually want full streaming where the server breaks the file into chunks and these are downloaded one by one by the client, then you probably want to use one of the adaptive bit rate protocols (Apple's HLS, MS's Smoothstreaming, Adobe Adaptive Streaming or the new open standard DASH). This also allows you have different bit rates to allow for different network conditions. You will need a server that can support this functionality to use these techniques. This may be overkill if you just want a simple site with a single video and will not have too much traffic.
Actually you have to start cloudfront with s3, so can stream s3 videos,
checkout this link for more information:
http://www.miracletutorials.com/s3-streaming-video-with-cloudfront/
I'm using libGDX and face the problem that background music does not loop flawlessly on various Android devices (a Nexus 7 running Lollipop, for example). Whenever the track loops (i.e. jumps from the end to the start), a clearly noticeable gap is audible. How can the background music be played in a loop without the disturbing gap?
I've already tried various approaches:
Ensuring the number of samples in the track is an exact multiple of the track's sample rate (as mentioned somewhere here on SO).
Various audio formats like .ogg, .m4a, .mp3 and .wav (.ogg seems to be the solution of choice here on SO, but unfortunately it does not work in my case).
Using Android's MediaPlayer with setLooping(true) instead of the libGDX Music class.
Using Android's MediaPlayer.setNextMediaPlayer(). The code looks like the following, and it plays the two tracks without a gap in between, but unfortunately, as soon as the second MediaPlayer finishes, the first does not start again!
/* initialization */
afd = context.getAssets().openFd(filename);
firstBackgroundMusic.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
firstBackgroundMusic.prepare();
firstBackgroundMusic.setOnCompletionListener(this);
secondBackgroundMusic.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
secondBackgroundMusic.prepare();
secondBackgroundMusic.setOnCompletionListener(this);
firstBackgroundMusic.setNextMediaPlayer(secondBackgroundMusic);
secondBackgroundMusic.setNextMediaPlayer(firstBackgroundMusic);
firstBackgroundMusic.start();

@Override
public void onCompletion(MediaPlayer mp) {
    mp.stop();
    try {
        mp.prepare();
    } catch (IOException e) { e.printStackTrace(); }
}
Any ideas what's wrong with the code snippet?
Just for the record:
It turned out to be unsolvable. In the end, we looped the background music several times inside the file itself. This way the gap appears less frequently. It's no real solution to the problem, but the best workaround we could find.
This is an old question, but I will give my solution in case anyone has the same problem.
The solution requires the use of the Audio extension (deprecated, but it works just fine; if you can't find the link online, here are the jars that I am using). It also requires some external storage space.
The outline is the following:
Extract the raw music data with a decoder (the VorbisDecoder class for ogg, or Mpg123Decoder for mp3) and save it to external storage (you can check whether the file already exists, so that it only needs to be extracted once, because it takes some time).
Create a RandomAccessFile using the file you just saved to the external storage
While playing set the RandomAccessFile pointer to the correct spot in the file and read a data segment
Play the above data segment with the AudioDevice class
Here is some code
Extract the music file and save it to external storage. Here file is the FileHandle of the internal music file; it is an ogg, which is why we use VorbisDecoder:
FileHandle external = Gdx.files.external("data/com.package.name/music/" + file.name());
file.copyTo(external);
VorbisDecoder decoder = new VorbisDecoder(external);
FileHandle extreactedDataFile = Gdx.files.external("data/com.package.name/music/" + file.nameWithoutExtension() + ".mdata");
if(extreactedDataFile.exists()) extreactedDataFile.delete();
// samples receives the decoded PCM data; shortBytes is its little-endian byte view
short[] samples = new short[4096];
byte[] shortBytes = new byte[samples.length * 2];
ShortBuffer sbuffer = ByteBuffer.wrap(shortBytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
while(true){
    if(LogoScreen.shouldBreakMusicLoad) break;
    int num = decoder.readSamples(samples, 0, samples.length);
    sbuffer.put(samples, 0, num);
    sbuffer.position(0);
    extreactedDataFile.writeBytes(shortBytes, 0, num * 2, true);
    if(num <= 0) break;
}
external.delete();
Create a RandomAccessFile pointing to the file we just created:
if(extreactedDataFile.exists()){
    try {
        raf = new RandomAccessFile(Gdx.files.external(extreactedDataFile.path()).file(), "r");
        raf.seek(0);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Create a buffer so we can translate the bytes read from the file into a short array that is fed to the AudioDevice:
public byte[] rafbufferBytes = new byte[length * 2];  // length = number of shorts per read chunk
public short[] rafbuffer = new short[length];
public ShortBuffer sBuffer = ByteBuffer.wrap(rafbufferBytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
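The byte-to-short translation that wrapped buffer performs can be sanity-checked in isolation. A tiny self-contained sketch (the class and method names are mine, values arbitrary):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class PcmView {
    // Interprets little-endian PCM bytes as 16-bit samples. The
    // ShortBuffer is a live view over the byte array, so no copy is
    // made until we read the samples out.
    public static short[] toShorts(byte[] pcmBytes) {
        ShortBuffer view = ByteBuffer.wrap(pcmBytes)
                .order(ByteOrder.LITTLE_ENDIAN)
                .asShortBuffer();
        short[] samples = new short[view.remaining()];
        view.get(samples);
        return samples;
    }
}
```

Because the view shares storage with the byte array, refilling rafbufferBytes from the file and re-reading through the ShortBuffer (after resetting its position) is enough; no per-chunk allocation is needed.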
When we want to play the file, we create an AudioDevice and a new thread in which we constantly read from the RandomAccessFile and feed the data to the AudioDevice:
device = Gdx.audio.newAudioDevice((int) rate /* the sample rate in Hz, e.g. 44100 */, MusicPlayer.mono /* is the music mono? */);
currentBytes = 0;  // set the file to the beginning
playbackThread = new Thread(new Runnable() {
    @Override
    public synchronized void run() {
        while (playing) {
            if (raf != null) {
                try {
                    int length = raf.read(rafbufferBytes);
                    if (length <= 0) {
                        ocl.onCompletion(DecodedMusic.this);
                        length = raf.read(rafbufferBytes);
                    }
                    sBuffer.get(rafbuffer);
                    sBuffer.position(0);
                    if (length > 20) {
                        try {
                            device.writeSamples(rafbuffer, 0, length / 2);
                            fft.spectrum(rafbuffer, spectrum);
                            currentBytes += length;
                        } catch (com.badlogic.gdx.utils.GdxRuntimeException ex) {
                            ex.printStackTrace();
                            device = Gdx.audio.newAudioDevice((int) rate, MusicPlayer.mono);
                        }
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
});
playbackThread.setDaemon(true);
playbackThread.start();
And when we want to seek to a position:
public void seek(float pos){
    currentBytes = (int) (rate * pos);
    try {
        raf.seek(currentBytes * 4);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
My app, among other things, streams clips of songs through SoundCloud. It's been working great for the past few weeks, but last night, in my MediaPlayer's prepare() method, it started getting error (1, -1004), which is ridiculously weird because I can still play the streams that I get in my desktop browser. Has something recently changed on SoundCloud that could have broken this?
This is a snippet of my code that I used to confirm that this was a problem:
String url = replacementUrlEdit.getText().toString();
try {
    mediaPlayer.setDataSource(url);
} catch (IOException e) {
    showInfoPopup("exception preparing: " + e);
    e.printStackTrace();
}
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
});
mediaPlayer.prepareAsync();
Pretty standard stuff. The url that I end up passing to the MediaPlayer is the redirected stream url, in a format like https://ec-media.soundcloud.com/XXXXXXXXXX.128.mp3?xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx&AWSAccessKeyId=xxxxxxxxxxxxxxxxxxxxxxxx&Expires=1406322087&Signature=xxxxxxxxxxxxxxxxxx%3D
The error I get is this:
07-25 16:56:29.801 4425-4696/com.example E/MediaPlayer﹕ error (1, -1004)
07-25 16:56:29.861 4425-4425/com.example E/MediaPlayer﹕ Error (1,-1004)
I have also tried preparing synchronously, in which case I get an IOException with error 0x1.
How can I fix this?