Streams stopped working on Android MediaPlayer - android

My app, among other things, streams clips of songs through SoundCloud. It's been working great for the past few weeks, but last night my MediaPlayer started failing in prepare() with error (1, -1004), which is ridiculously weird because I can still play the same streams in my desktop browser. Has something recently changed on SoundCloud that could have broken this?
This is a snippet of the code I used to confirm the problem:
String url = replacementUrlEdit.getText().toString();
try {
    mediaPlayer.setDataSource(url);
} catch (IOException e) {
    showInfoPopup("exception preparing: " + e);
    e.printStackTrace();
}
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
});
mediaPlayer.prepareAsync();
Pretty standard stuff. The URL I end up passing to the MediaPlayer is the redirected stream URL, in a format like https://ec-media.soundcloud.com/XXXXXXXXXX.128.mp3?xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx&AWSAccessKeyId=xxxxxxxxxxxxxxxxxxxxxxxx&Expires=1406322087&Signature=xxxxxxxxxxxxxxxxxx%3D
The error I get is this:
07-25 16:56:29.801 4425-4696/com.example E/MediaPlayer﹕ error (1, -1004)
07-25 16:56:29.861 4425-4425/com.example E/MediaPlayer﹕ Error (1,-1004)
I have also tried preparing synchronously, in which case I get an IOException with error 0x1.
How can I fix this?
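For reference, the pair MediaPlayer reports through onError(what, extra) maps to named constants in the android.media.MediaPlayer documentation; (1, -1004) decodes as MEDIA_ERROR_UNKNOWN with MEDIA_ERROR_IO, a generic file/network I/O failure. A small lookup helper (a sketch; the numeric values below are the documented constants) makes such logs readable:

```java
/** Decodes MediaPlayer.onError(what, extra) codes into the names documented
 *  on android.media.MediaPlayer (numeric values taken from those docs). */
class MediaErrorNames {
    static String what(int code) {
        switch (code) {
            case 1:   return "MEDIA_ERROR_UNKNOWN";
            case 100: return "MEDIA_ERROR_SERVER_DIED";
            default:  return "what=" + code;
        }
    }

    static String extra(int code) {
        switch (code) {
            case -1004: return "MEDIA_ERROR_IO";          // file or network I/O failure
            case -1007: return "MEDIA_ERROR_MALFORMED";   // stream does not match the expected format
            case -1010: return "MEDIA_ERROR_UNSUPPORTED"; // format not supported by this device
            case -110:  return "MEDIA_ERROR_TIMED_OUT";   // operation took too long
            default:    return "extra=" + code;
        }
    }
}
```

With redirected stream URLs, one cause of MEDIA_ERROR_IO that gets reported on older MediaPlayer builds is failure to follow HTTPS redirects, so resolving the redirect chain yourself and passing the final URL to setDataSource() is one thing worth trying.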

Related

MediaPlayer prepare() hangs on custom device

I'm currently implementing streaming internet audio on a custom tablet (i.e. not Samsung, Google, etc.). I developed the code and tested it on a Samsung tablet, where it worked fine. When installed on the custom tablet, the MediaPlayer hangs on the call to prepare(). I have implemented OnPreparedListener, OnErrorListener, OnBufferingUpdateListener, and OnCompletionListener, all with logs for basic troubleshooting. I implement the streaming in a class that extends Service and uses a separate thread. I have tried both prepare() and prepareAsync().
To clarify, when I say the MediaPlayer hangs, I mean I receive NO error messages, crashes, log output, etc. The app continues to function, but onPrepared(...) never gets called.
I configure the player like this:
private void configureMediaPlayer(String streamUrl) {
    mMediaPlayer = new MediaPlayer();
    AudioAttributes attributes = new AudioAttributes.Builder()
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .build();
    mMediaPlayer.setAudioAttributes(attributes);
    Log.d(TAG, "Media player configured");
    playStation(streamUrl);
}
I attempt to play the music stream like this:
private void playStation(String streamUrl) {
    Log.d(TAG, "playStation: " + streamUrl);
    if (mMediaPlayer.isPlaying()) {
        mMediaPlayer.stop();
        mMediaPlayer.reset();
        Log.d(TAG, "Stopped media player.");
    }
    try {
        mMediaPlayer.setOnPreparedListener(this);
        mMediaPlayer.setOnErrorListener(this);
        mMediaPlayer.setOnBufferingUpdateListener(this);
        mMediaPlayer.setOnCompletionListener(this);
        mMediaPlayer.setDataSource(streamUrl);
        new Handler(getMainLooper()).post(() -> Log.d(TAG, "Main thread: " + Thread.currentThread().getId()));
        Log.d(TAG, "Preparing stream on thread: " + Thread.currentThread().getId() + ".");
        mMediaPlayer.prepare();
        Log.d(TAG, "Waiting for media player to be prepared");
    } catch (IOException e) {
        // TODO: Gracefully inform user of failure
        e.printStackTrace();
    }
}
and I implement onPrepared(MediaPlayer mp) to start playing the stream:
@Override
public void onPrepared(MediaPlayer mp) {
    Log.d(TAG, "Media player prepared. Starting playback.");
    mp.start();
}
Again, this works great on the Samsung tablet, but, due to the nature of the project, we have to use custom tablets provided by the client. We use VideoViews in some places, which seem to work and play audio, but those are local files on the device (not sure if that matters). The devices use a 4G network and all web API calls work fine, so I don't think it's anything network-related. I just find it odd that it hangs forever (I've let it sit for 30 minutes with no crashes, errors, or logs). The app is totally responsive during this time.
I'm not sure if there is some other configuration I can use which might help. Any help would be greatly appreciated.
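One way to stop waiting forever is to treat a silent prepare as a failure after a deadline. Below is a sketch (not from the question) of a small watchdog built on plain java.util.concurrent: start it just before prepareAsync(), call prepared() from onPrepared(), and have onTimeout reset the player or fall back to another playback path if it fires first.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/** Treats a silent prepare as a failure: if prepared() is not called within
 *  timeoutMs, onTimeout runs once (e.g. to reset the player and retry). */
class PrepareWatchdog {
    private final CountDownLatch done = new CountDownLatch(1);
    private final AtomicBoolean timedOut = new AtomicBoolean(false);

    PrepareWatchdog(long timeoutMs, Runnable onTimeout) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.schedule(() -> {
            // fire only if onPrepared() never arrived
            if (done.getCount() > 0 && timedOut.compareAndSet(false, true)) {
                onTimeout.run();
            }
            scheduler.shutdown();
        }, timeoutMs, TimeUnit.MILLISECONDS);
    }

    /** Call from onPrepared(); returns false if the timeout already fired. */
    boolean prepared() {
        done.countDown();
        return !timedOut.get();
    }
}
```

On the Android side you would create the watchdog right before mMediaPlayer.prepareAsync() and, inside onTimeout, post the reset/retry back to the main thread.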
UPDATE
I've tried loading a local audio file by replacing:
setDataSource(streamUrl);
with:
setDataSource(this, Uri.parse(filename));
and it played the audio without a problem.
UPDATE 2
I've now tried passing in an invalid URL to the setDataSource(...) method and the OnErrorListener is called with the error that it can't load the resource. Also, I passed in the URL that should work, but stripped a few characters from the end. It seemed to recognize it as a potentially valid stream because it posted error logs and retried loading the resource 10 times before failing in the OnErrorListener.
Here are the logs I receive when prepare() (also tried with prepareAsync()) is called:
05-15 11:02:33.149 252-26756/? D/FslExtractor: FslExtractor::FslExtractor mime=audio/mpeg
05-15 11:02:33.150 252-26756/? D/FslExtractor: FslExtractor::Init BEGIN
GetLibraryName lib_mp3_parser_arm11_elinux.3.0.so
load parser name lib_mp3_parser_arm11_elinux.3.0.so
FslExtractor::CreateParserInterface success
05-15 11:02:33.150 252-26756/? I/FslExtractor: Core parser MP3PARSER_03.01.15 build on Nov 17 2016 13:55:34
05-15 11:02:33.150 252-26756/? D/FslExtractor: createParser2 flag=5e,err=0
05-15 11:02:33.150 252-26756/? I/FslExtractor: mReadMode=0,mNumTracks=1
bSeekable 1
05-15 11:02:33.150 252-26756/? D/FslExtractor: FslExtractor::ParseMediaFormat BEGIN
ParseAudio index=0,type=5,subtype=2829696637
It looks in these logs like it's beginning to parse the audio, but nothing happens afterward.
Well, since there were no crashes or decent logs that pointed to anything, I switched to using Google's ExoPlayer, which seems to be working just fine.
private void configureMediaPlayer(String streamUrl) {
    Log.d(TAG, "configureMediaPlayer: " + streamUrl);
    DefaultBandwidthMeter meter = new DefaultBandwidthMeter();
    mMediaPlayer = ExoPlayerFactory.newSimpleInstance(this,
            new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(meter)));
    DataSource.Factory dataSourceFactory = new DefaultDataSourceFactory(this,
            Util.getUserAgent(this, "appName"), meter);
    mMediaSourceFactory = new ExtractorMediaSource.Factory(dataSourceFactory);
    mMediaPlayer.setPlayWhenReady(true);
    MediaSource mediaSource = mMediaSourceFactory.createMediaSource(Uri.parse(streamUrl));
    Log.d(TAG, "configureMediaPlayer: media source set");
    mMediaPlayer.addListener(this);
    mMediaPlayer.prepare(mediaSource);
}
If anybody has any ideas of why the original idea using MediaPlayer didn't work, please feel free to comment or post another answer. Thanks.

android stream audio by passing cookies as header

I have to play an mp3 audio file from a server URL. To get the audio to play, I need to pass a Cookie in the headers, so I used setDataSource(context, uri, headers) to play the media. It's not working for me.
It gives me errors from source returned error -1008, 10 retries left down to source returned error -1008, 0 retries left.
Can anyone help me find a solution? Thanks in advance.
MediaPlayer mediaPlayer;
Map<String, String> headers = new HashMap<>();
headers.put("Cookie", "CloudFront-Signature=xx; Domain=xx; Path=xx");
headers.put("Cookie", "CloudFront-Policy:xx; Domain=xx; Path=xx");
headers.put("Cookie", "CloudFront-Key-Pair-Id=xx; Domain=xx; Path=XX");
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(MediaPlayerService.this, Uri.parse("URL"), headers);
mediaPlayer.prepare();
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        mediaPlayer.start();
    }
});
Following are the links I referred to:
1. http://stackoverflow.com/questions/731603/android-mediaplayer-urls-with-cookie
2. http://stackoverflow.com/questions/9727098/android-add-cookie-to-mediaplayer-requests
3. http://stackoverflow.com/questions/2932362/how-to-stream-authenticated-content-with-mediaplayer-on-android
Finally, I figured out the solution by referring to this link: https://github.com/yixia/VitamioBundle/issues/177#issuecomment-48872226. I was able to add cookies to the MediaPlayer headers as shown below. Audio playback works on Android 4.0, 5.0 and 6.0 without any issue. Hope this helps someone.
Important note: adding \r\n at the end did the trick. Thanks to the forum above.
Map<String, String> headers = new HashMap<String, String>();
headers.put("Cookie", "CloudFront-Policy=xxx; CloudFront-Signature=xxx; CloudFront-Key-Pair-Id=xxx; Domain=xx; Path=xxx\r\n");
mediaPlayer = new MediaPlayer();
Uri uri = Uri.parse("url");
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(context, uri, headers);
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        mediaPlayer.start();
    }
});
mediaPlayer.prepare();
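Worth noting: the snippet in the question put three values under the same "Cookie" key, and a HashMap keeps only one entry per key, so only the last put() ever reached the server. A small sketch (a hypothetical helper, not from the answer) that folds any number of cookies into the single header value MediaPlayer expects, with the \r\n workaround applied:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Folds several cookies into the single "Cookie" header value MediaPlayer
 *  expects. A HashMap can hold only one entry per key, so calling
 *  put("Cookie", ...) three times (as in the question) keeps only the last. */
class CookieHeader {
    static Map<String, String> build(Map<String, String> cookies) {
        StringBuilder value = new StringBuilder();
        for (Map.Entry<String, String> c : cookies.entrySet()) {
            if (value.length() > 0) value.append("; ");
            value.append(c.getKey()).append('=').append(c.getValue());
        }
        Map<String, String> headers = new LinkedHashMap<>();
        // the trailing \r\n matches the workaround in the accepted answer above
        headers.put("Cookie", value.append("\r\n").toString());
        return headers;
    }
}
```

The resulting map is what you would pass as the third argument of setDataSource(context, uri, headers).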

Android - set Sound Recorder mime type from Intent

I'm using Cordova to build my mobile app and I need to record sounds.
I'm using the media-capture plugin, which launches the default Android recorder app via this function:
private void captureAudio() {
    Intent intent = new Intent(android.provider.MediaStore.Audio.Media.RECORD_SOUND_ACTION);
    this.cordova.startActivityForResult((CordovaPlugin) this, intent, CAPTURE_AUDIO);
}
The problem is that after I get the file path and try getAudioVideoData() (which returns information like duration), the audio recording format (which defaults to .amr) apparently cannot be parsed, and an exception is thrown.
private JSONObject getAudioVideoData(String filePath, JSONObject obj, boolean video) throws JSONException {
    MediaPlayer player = new MediaPlayer();
    try {
        player.setDataSource(filePath);
        player.prepare();
        obj.put("duration", player.getDuration() / 1000);
        if (video) {
            obj.put("height", player.getVideoHeight());
            obj.put("width", player.getVideoWidth());
        }
    } catch (IOException e) {
        Log.d(LOG_TAG, "Error: loading video file");
    }
    return obj;
}
I know the problem is the media format, because on my older Android device running 4.4.4 the Sound Recorder app has settings where I can change the file type, and if I set it to .wav, then getAudioVideoData() works!
I have tried to add the following inside captureAudio() before startActivityForResult():
intent.putExtra(android.provider.MediaStore.Audio.Media.ENTRY_CONTENT_TYPE, "audio/aac");
intent.putExtra(android.provider.MediaStore.Audio.Media.MIME_TYPE, "audio/aac");
intent.putExtra(android.provider.MediaStore.Audio.Media.CONTENT_TYPE, "audio/aac");
...but with no success.
I couldn't find a way to influence the output of Sound Recorder app via intent, but I solved the main problem, which was that I couldn't read recorded audio file's metadata (duration property).
Fixed with this PR: https://github.com/apache/cordova-plugin-media-capture/pull/50

How to play video in video view from url based on buffer %age in android?

I have a link to a video on an S3 server, and I am playing this video in a VideoView. The video plays properly, but the problem is that it first downloads the entire video and then plays it.
I want it to buffer progressively: if 20% of the video has downloaded, it should play that part while the rest keeps downloading (like YouTube). Here is my code so far:
FFmpegMediaMetadataRetriever mediaMetadataRetriever = new FFmpegMediaMetadataRetriever();
AWSCredentials myCredentials = new BasicAWSCredentials(
        "AKIAXXXXXXXXXXXXXXXX",
        "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
AmazonS3 s3client = new AmazonS3Client(myCredentials);
GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest("mgvtest", videoUrl);
URL objectURL = s3client.generatePresignedUrl(request);
try {
    mediaMetadataRetriever.setDataSource(videoUrl);
} catch (Exception e) {
    utilDialog.showDialog("Unable to load this video", utilDialog.ALERT_DIALOG);
    pb.setVisibility(View.INVISIBLE);
}
videoView.setVideoURI(Uri.parse(videoUrl));
MediaController myMediaController = new MediaController(this);
// myMediaController.setMediaPlayer(videoView);
videoView.setMediaController(myMediaController);
videoView.setOnCompletionListener(myVideoViewCompletionListener);
videoView.setOnPreparedListener(MyVideoViewPreparedListener);
videoView.setOnErrorListener(myVideoViewErrorListener);
videoView.requestFocus();
videoView.start();
Listeners
MediaPlayer.OnCompletionListener myVideoViewCompletionListener = new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer arg0) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "End of Video",
        //         Toast.LENGTH_LONG).show();
    }
};

MediaPlayer.OnPreparedListener MyVideoViewPreparedListener = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        pb.setVisibility(View.INVISIBLE);
        imgScreenshot.setVisibility(View.VISIBLE);
        tvScreenshot.setVisibility(View.VISIBLE);
        // final Animation in = new AlphaAnimation(0.0f, 1.0f);
        // in.setDuration(3000);
        // tvScreenshot.startAnimation(in);
        Animation animation = AnimationUtils.loadAnimation(getApplicationContext(), R.anim.zoom_in);
        tvScreenshot.startAnimation(animation);
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                tvScreenshot.setVisibility(View.INVISIBLE);
            }
        }, 3000);
    }
};

MediaPlayer.OnErrorListener myVideoViewErrorListener = new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // Toast.makeText(PlayRecordedVideoActivity.this, "Error!!!",
        //         Toast.LENGTH_LONG).show();
        return true;
    }
};
To be able to start playing an mp4 video before it has fully downloaded, the file has to have its metadata at the start rather than the end; unfortunately, standard mp4 files usually default to having it at the end.
The metadata lives in an 'atom' or 'box' (basically a data structure within the mp4 file) and can be moved to the start. This is usually referred to as faststart, and tools such as ffmpeg will let you do it. The following is an extract from the ffmpeg documentation:
The mov/mp4/ismv muxer supports fragmentation. Normally, a MOV/MP4 file has all the metadata about all packets stored in one location (written at the end of the file; it can be moved to the start for better playback by adding faststart to the movflags, or by using the qt-faststart tool).
There are other tools and software which will allow you do this also - e.g. the one mentioned in the ffmpeg extract above:
http://multimedia.cx/eggs/improving-qt-faststart/
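If you want to verify a file rather than trust the encoder, the top level of an MP4 is a sequence of boxes, each starting with a 32-bit big-endian size and a 4-byte type; a faststart file has its moov box before mdat. A minimal sketch of that check (it ignores 64-bit extended sizes, which a real tool must handle):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

/** Reports whether an MP4's moov (metadata) box precedes its mdat box,
 *  i.e. whether the file is "faststart" and can begin playing before the
 *  download finishes. A sketch: it does not handle size == 1 (64-bit
 *  extended size) or size == 0 ("box extends to end of file"). */
class FaststartCheck {
    static boolean isFaststart(InputStream in) {
        DataInputStream data = new DataInputStream(in);
        try {
            while (true) {
                long size = data.readInt() & 0xFFFFFFFFL; // 32-bit box size
                byte[] type = new byte[4];
                data.readFully(type);                     // 4-byte box type
                String name = new String(type, StandardCharsets.US_ASCII);
                if (name.equals("moov")) return true;     // metadata first: streamable
                if (name.equals("mdat")) return false;    // media data first: full download needed
                long toSkip = size - 8;                   // size includes the 8 header bytes
                while (toSkip > 0) {
                    long skipped = data.skip(toSkip);
                    if (skipped <= 0) return false;
                    toSkip -= skipped;
                }
            }
        } catch (IOException eof) {
            return false; // ran out of data without seeing moov or mdat
        }
    }
}
```

Running this over the first few kilobytes of a download tells you whether the remuxing step worked before you ship the file.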
If you actually want full streaming where the server breaks the file into chunks and these are downloaded one by one by the client, then you probably want to use one of the adaptive bit rate protocols (Apple's HLS, MS's Smoothstreaming, Adobe Adaptive Streaming or the new open standard DASH). This also allows you have different bit rates to allow for different network conditions. You will need a server that can support this functionality to use these techniques. This may be overkill if you just want a simple site with a single video and will not have too much traffic.
Actually, you have to set up CloudFront in front of S3 so you can stream S3 videos.
Check out this link for more information:
http://www.miracletutorials.com/s3-streaming-video-with-cloudfront/

libGDX/Android: How to loop background music without the dreaded gap?

I'm using libGDX and face the problem that background music does not loop flawlessly on various Android devices (a Nexus 7 running Lollipop, for example). Whenever the track loops (i.e. jumps from the end back to the start), a clearly audible gap can be heard. How can the background music be played in a loop without this disturbing gap?
I've already tried various approaches like:
Ensuring the number of samples in the track is an exact multiple of the track's sample rate (as mentioned somewhere here on SO).
Various audio formats like .ogg, .m4a, .mp3 and .wav (.ogg seems to be the solution of choice here on SO, but unfortunately it does not work in my case).
Using Android's MediaPlayer with setLooping(true) instead of libGDX's Music class.
Using Android's MediaPlayer.setNextMediaPlayer(). The code looks like the following, and it plays the two tracks without a gap in between, but unfortunately, as soon as the second MediaPlayer finishes, the first one does not start again!
/* initialization */
afd = context.getAssets().openFd(filename);
firstBackgroundMusic.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
firstBackgroundMusic.prepare();
firstBackgroundMusic.setOnCompletionListener(this);
secondBackgroundMusic.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
secondBackgroundMusic.prepare();
secondBackgroundMusic.setOnCompletionListener(this);
firstBackgroundMusic.setNextMediaPlayer(secondBackgroundMusic);
secondBackgroundMusic.setNextMediaPlayer(firstBackgroundMusic);
firstBackgroundMusic.start();

@Override
public void onCompletion(MediaPlayer mp) {
    mp.stop();
    try {
        mp.prepare();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Any ideas what's wrong with the code snippet?
Just for the record:
It turned out to be unsolvable. In the end, we looped the background music several times inside the file itself, so the gap appears less frequently. It's no real solution to the problem, but the best workaround we could find.
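Since that workaround already involves editing the audio file, another small mitigation (a sketch, not from the original answer) is to trim the track so the loop ends at a zero crossing; an abrupt jump from a loud final sample back to the first sample is part of what makes the seam click so audible:

```java
/** Trims a 16-bit PCM buffer so the loop ends at a zero crossing. Scans
 *  backwards from the end for the last sign change (or exact zero) and
 *  returns the number of samples to keep. */
class LoopTrim {
    static int trimmedLength(short[] samples) {
        for (int i = samples.length - 1; i > 0; i--) {
            boolean crossing = samples[i] == 0
                    || (samples[i] > 0) != (samples[i - 1] > 0);
            if (crossing) return i + 1;
        }
        return samples.length; // no crossing found: keep everything
    }
}
```

For stereo audio you would scan frame by frame rather than sample by sample, and ideally make the start of the track end-compatible the same way.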
This is an old question, but I will give my solution in case anyone has the same problem.
The solution requires the libGDX audio extension (deprecated, but it works just fine; if you can't find the link online, here are the JARs I am using) and some external storage space.
The outline is the following:
Extract the raw music data with a decoder (the VorbisDecoder class for ogg or Mpg123Decoder for mp3) and save it to external storage (you can check whether the file already exists so it only needs to be extracted once, because this takes some time).
Create a RandomAccessFile for the file you just saved to external storage.
While playing, set the RandomAccessFile pointer to the correct spot in the file and read a data segment.
Play that data segment with the AudioDevice class.
Here is some code.
Extract the music file and save it to external storage; file is the FileHandle of the internal music file (an ogg here, hence the VorbisDecoder):
FileHandle external = Gdx.files.external("data/com.package.name/music/" + file.name());
file.copyTo(external);
VorbisDecoder decoder = new VorbisDecoder(external);
FileHandle extreactedDataFile = Gdx.files.external("data/com.package.name/music/" + file.nameWithoutExtension() + ".mdata");
if (extreactedDataFile.exists()) extreactedDataFile.delete();
short[] samples = new short[2048];                // decode buffer (size is illustrative)
byte[] shortBytes = new byte[samples.length * 2]; // the same data viewed as bytes
ShortBuffer sbuffer = ByteBuffer.wrap(shortBytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
while (true) {
    if (LogoScreen.shouldBreakMusicLoad) break;
    int num = decoder.readSamples(samples, 0, samples.length);
    sbuffer.put(samples, 0, num);
    sbuffer.position(0);
    extreactedDataFile.writeBytes(shortBytes, 0, num * 2, true);
    if (num <= 0) break;
}
external.delete();
external.delete();
Create a RandomAccessFile pointing to the file we just created:
if (extreactedDataFile.exists()) {
    try {
        raf = new RandomAccessFile(Gdx.files.external(extreactedDataFile.path()).file(), "r");
        raf.seek(0);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Create a buffer so we can translate the bytes read from the file into a short array that gets fed to the AudioDevice:
public byte[] rafbufferBytes=new byte[length*2];
public short[] rafbuffer=new short[length];
public ShortBuffer sBuffer=ByteBuffer.wrap(rafbufferBytes).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
When we want to play the file, we create an AudioDevice and a new thread in which we constantly read from the RandomAccessFile and feed the data to the AudioDevice:
device = Gdx.audio.newAudioDevice((int) rate /* the Hz of the music, e.g. 44100 */, MusicPlayer.mono);
currentBytes = 0; // start from the beginning of the file
playbackThread = new Thread(new Runnable() {
    @Override
    public synchronized void run() {
        while (playing) {
            if (raf != null) {
                try {
                    int length = raf.read(rafbufferBytes);
                    if (length <= 0) { // end of data: notify the listener, then read again
                        ocl.onCompletion(DecodedMusic.this);
                        length = raf.read(rafbufferBytes);
                    }
                    sBuffer.get(rafbuffer);
                    sBuffer.position(0);
                    if (length > 20) {
                        try {
                            device.writeSamples(rafbuffer, 0, length / 2);
                            fft.spectrum(rafbuffer, spectrum);
                            currentBytes += length;
                        } catch (com.badlogic.gdx.utils.GdxRuntimeException ex) {
                            ex.printStackTrace();
                            device = Gdx.audio.newAudioDevice((int) rate, MusicPlayer.mono);
                        }
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
});
playbackThread.setDaemon(true);
playbackThread.start();
And when we want to seek to a position:
public void seek(float pos) {
    currentBytes = (int) (rate * pos);
    try {
        raf.seek(currentBytes * 4);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
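The raf.seek(currentBytes * 4) above hard-codes 4 bytes per frame, which is correct only for 16-bit stereo PCM. A general form of that conversion (a sketch), handy if the track is mono or uses a different sample size:

```java
/** Converts a playback position in seconds to a byte offset into raw PCM
 *  data: bytesPerFrame = channels * bitsPerSample / 8. The seek() above
 *  hard-codes 4 for 16-bit stereo. */
class PcmOffset {
    static long byteOffset(double seconds, int sampleRateHz, int channels, int bitsPerSample) {
        long frame = (long) (seconds * sampleRateHz);          // frame index at that time
        long bytesPerFrame = (long) channels * bitsPerSample / 8;
        return frame * bytesPerFrame;
    }
}
```

For example, one second into a 44.1 kHz 16-bit stereo track lands 176400 bytes into the file.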
