Streaming to VideoView only plays on Wifi when using Samsung phones - Android

I am using the following library to stream YouTube videos to an Android application.
http://code.google.com/p/android-youtube-player/source/browse/trunk/OpenYouTubeActivity/src/com/keyes/youtube/OpenYouTubePlayerActivity.java?r=3
I am successfully able to play videos on HTC and Motorola phones over 3G and Wifi. However, on Samsung Galaxy (Epic 4G) and Samsung Galaxy II phones I am only able to play over Wifi. 3G gives me this error: "Cannot play video. Sorry, this video cannot be played."
I have tried forcing low-quality YouTube streaming, but this did not help. I see in my log that start() is being called in both cases (3G/Wifi). Is this an issue with VideoView? Is there a workaround?
Edit 2
The videos are coming from the YouTube API. I have attempted using both embedded and normal streams, as well as the lowest-quality stream available (varies per video). Also, I do not think it is an encoding issue, since the same videos play correctly over Wifi.
Edit 1
I also receive the following output regardless of whether the video plays (Wifi) or fails to (3G).
01-30 15:22:38.305: E/MediaPlayer(3831): error (1, -1)
01-30 15:22:38.305: E/MediaPlayer(3831): callback application
01-30 15:22:38.305: E/MediaPlayer(3831): back from callback
01-30 15:22:38.309: E/MediaPlayer(3831): Error (1,-1)
According to this link, these errors mean the following (I think):
/* Definition of first error event in range (not an actual error code). */
const PVMFStatus PVMFErrFirst = (-1);

/* Return code for general failure */
const PVMFStatus PVMFFailure = (-1);

/* Return code for general success */
const PVMFStatus PVMFSuccess = 1;

This further adds to the confusion.

Yes, as you suspect, this is an issue in VideoView; similar issues also appear in MediaPlayer. I've encountered similarly strange issues: I had problems where the video played only on 3G and not on Wi-Fi. This usually happens on 2.1 and some 2.2 devices, but not on higher API levels as far as I've seen.
So what I can recommend is the following:
First, check whether the running device is one that may have issues, with something like this:
// Define a static list of known devices with issues
static List<String> sIssueDevices = Arrays.asList("HTC Desire", "LG-P500", "etc");

if (Build.VERSION.SDK_INT < 9) {
    if (sIssueDevices.contains(Build.DEVICE)) {
        // This device may have issues streaming; take appropriate action
    }
}
So this was the simplest part: detecting whether the running device may have issues streaming the video. Now, what I did, and what may also help you, is to buffer the video from YouTube into a file on the SD card and set that file as the source for your VideoView. Here are some code snippets that show my approach:
private class GetYoutubeFile extends Thread {
    private String mUrl;
    private String mFile;

    public GetYoutubeFile(String url, String file) {
        mUrl = url;
        mFile = file;
    }

    @Override
    public void run() {
        super.run();
        try {
            File bufferingDir = new File(Environment.getExternalStorageDirectory()
                    + "/YoutubeBuff");
            if (!bufferingDir.exists()) {
                bufferingDir.mkdirs(); // make sure the buffer directory exists
            }
            File bufferFile = new File(bufferingDir.getAbsolutePath(), mFile);
            BufferedOutputStream bufferOS = new BufferedOutputStream(
                    new FileOutputStream(bufferFile));
            URL url = new URL(mUrl);
            URLConnection connection = url.openConnection();
            connection.setRequestProperty("User-Agent", "Mozilla");
            connection.connect();
            InputStream is = connection.getInputStream();
            BufferedInputStream bis = new BufferedInputStream(is, 2048);
            byte[] buffer = new byte[16384];
            int numRead;
            boolean started = false;
            while ((numRead = bis.read(buffer)) != -1 && !mActivityStopped) {
                bufferOS.write(buffer, 0, numRead);
                bufferOS.flush();
                mBuffPosition += numRead;
                // Once enough data has been buffered, start playback
                if (mBuffPosition > 120000 && !started) {
                    Log.e("Player", "BufferHIT:StartPlay");
                    setSourceAndStartPlay(bufferFile);
                    started = true;
                }
            }
            Log.i("Buffering", "Read -1? " + numRead + " stop:" + mActivityStopped);
            bufferOS.close();
            bis.close();
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
public void setSourceAndStartPlay(File bufferFile) {
    try {
        mPlayer.setVideoPath(bufferFile.getAbsolutePath());
        mPlayer.prepare();
        mPlayer.start();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Another issue arises when the VideoView stops playing before the end of the file because not enough data has been buffered into the file yet. For this you need to set an OnCompletionListener, and if you are not at the end of the video, restart playback from the last position:
public void onCompletion(MediaPlayer mp) {
    mPlayerPosition = mPlayer.getCurrentPosition();
    try {
        mPlayer.reset();
        mPlayer.setVideoPath(
                new File("/mnt/sdcard/YoutubeBuff/" + mBufferFile).getAbsolutePath());
        mPlayer.seekTo(mPlayerPosition);
        mPlayer.start();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
In the end, the GetYoutubeFile thread is of course started in the onCreate() method:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // init views, player, etc.
    // start buffering; pass your stream URL and buffer file name
    new GetYoutubeFile(mUrl, mFileName).start();
}
I think some modifications and adaptations will have to be made to this code, and it may not be the best approach, but it helped me and I couldn't find any alternative.

I have tackled this problem in my own way. First, always read your logcat. If you see
Error (1,-1)
it means you will get the "Sorry, this video cannot be played" message. In that case, finish the activity, show a custom progress bar, and download the video. After downloading, save it in a temporary folder, play it from there, and delete the folder after playback; a sketch of this fallback follows the logcat snippet below.
To read the logcat:
try {
    Process process = Runtime.getRuntime().exec("logcat -d");
    BufferedReader bufferedReader = new BufferedReader(
            new InputStreamReader(process.getInputStream()));
    StringBuilder log = new StringBuilder();
    String line;
    while ((line = bufferedReader.readLine()) != null) {
        log.append(line);
    }
} catch (IOException e) {
    // ignore; the log simply won't be available
}
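The answer above describes the download-and-play fallback only in prose, so here is a minimal sketch of what it could look like inside an Activity. The names (logShowsPlaybackError, downloadAndPlay, mVideoView) are illustrative assumptions, not part of the original post:

// Check the captured logcat output for the failure quoted above
private boolean logShowsPlaybackError(String log) {
    return log.contains("Error (1,-1)");
}

// Download the stream to a temporary file, then play the local copy.
// In a real app, run this off the UI thread behind the custom progress bar.
private void downloadAndPlay(String videoUrl) throws IOException {
    File tempDir = new File(getCacheDir(), "videoTemp"); // temporary folder
    tempDir.mkdirs();
    File tempFile = new File(tempDir, "video.mp4");
    InputStream in = new URL(videoUrl).openConnection().getInputStream();
    FileOutputStream out = new FileOutputStream(tempFile);
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n);
    }
    out.close();
    in.close();
    mVideoView.setVideoPath(tempFile.getAbsolutePath());
    mVideoView.start();
    // Delete tempDir in an OnCompletionListener once playback finishes.
}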
Also read the following answer. It is just user experience, not linked to the app, but it sometimes happens in the default application as well:
Heaps of videos, even if I have full 3G network coverage, will say "Sorry, this video cannot be played". One day I got so fed up with it, I just kept pressing 'Okay' to dismiss the message, and then pressed the video again only to see the "Sorry, this video cannot be played" message again. I repeated this process (in my blind anger), and eventually, after about 5 tries, the video decided to miraculously play!
This method pretty much works for me every time. Most videos won't play the first time, but if I am just persistent and keep telling it to play even though it tells me it 'can't', it will eventually play! Although for some videos I've had to press 'Okay', press the video, press 'Okay', press the video, and so on, about 20 times before it actually decided to play. Those times I have been incredibly close to throwing my phone on the floor out of frustration with YouTube not working.
I wish there was a way to fix this problem. No one seems to have come up with a solution. Everyone just says "oh yeah I have the same problem" but no one contributes anything. GOOGLE, SOLVE THIS PROBLEM ON YOUR PHONES. THIS SEEMS TO BE HAPPENING WORLDWIDE, ON A RANGE OF ANDROID PHONES.

This message ("Cannot play video. Sorry, this video cannot be played.") often comes from inappropriate encoding of the video. I struggled with VideoView for a while; now the correctly encoded videos play on all tested devices, whether using Wifi or 3G. Let me know if you want to know how to encode the videos. And for streaming the videos I used the demo from the Android SDK APIs, and it works flawlessly.

Related

MediaPlayer on Android Wear OS. Why do I get an IOException after prepare? Prepare failed status=0x1

I would like to create an app on Wear OS which plays back an online stream. The following code works fine on Android but not on Wear OS. Does anyone have an idea why I get the "Prepare failed" status?
MediaPlayer mediaPlayer;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    binding = ActivityMainBinding.inflate(getLayoutInflater());
    setContentView(binding.getRoot());
    mTextView = binding.text;
    String url = "....the url like http://streamserver.com/stream";
    mediaPlayer = new MediaPlayer();
    mediaPlayer.setAudioAttributes(new AudioAttributes.Builder()
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .build());
    try {
        mediaPlayer.setDataSource(url);
        mediaPlayer.prepare(); // here the exception takes place, after around 15 seconds of waiting
    } catch (IOException e) {
        e.printStackTrace();
    }
    mediaPlayer.start();
}
If I change the code to mediaPlayer.prepareAsync() with mediaPlayer.setOnPreparedListener(...), it also doesn't work.
Logcat shows: E/MediaPlayerNative: error (1, -2147483648)
Any idea why it doesn't work on Wear OS but works on Android?
Thanks
Jason
OK, I found the solution and the error. The error occurs if the URL is an HTTP connection instead of HTTPS. If the source is only accessible over HTTP, there are at least two solutions.
1.) This one worked for me: enter the following line in the manifest, on the application element:
android:usesCleartextTraffic="true"
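For clarity, a minimal sketch of where the attribute goes (everything else in the element stays unchanged):

<application
    android:usesCleartextTraffic="true"
    ... >
</application>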
2.) Add a network security config file; info can be found at
https://developer.android.com/training/articles/security-config.html
That solves the problem.
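A minimal sketch of option 2, assuming the file lives at res/xml/network_security_config.xml and is referenced from the manifest:

<!-- res/xml/network_security_config.xml -->
<network-security-config>
    <base-config cleartextTrafficPermitted="true" />
</network-security-config>

<!-- AndroidManifest.xml, on the application element -->
<application android:networkSecurityConfig="@xml/network_security_config" ... >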

MediaPlayer can't play music from SD Storage

I'm trying to play music from a ListView (which takes its data from a file path). But every time I click, I get an error like this:
09-14 09:58:42.996 1229-1276/? W/AudioTrack﹕ AUDIO_OUTPUT_FLAG_FAST denied by client
Even if I use the file path directly, it still doesn't work.
Here is my code:
private MediaPlayer mMediaPlayer;
private File dir = new File(Environment.getExternalStorageDirectory() + "/MyOwnMusicFolder");
private File[] files;

public void playSong(int position) { // position of the item in the ListView
    if (mMediaPlayer != null) {
        if (mMediaPlayer.isPlaying()) {
            mMediaPlayer.pause();
        }
        try {
            mMediaPlayer.setDataSource(dir + File.separator + files[position].getName());
            mMediaPlayer.prepare();
            mMediaPlayer.start();
        } catch (IOException e) {
            // something...
        }
    }
}
Edit: I'm using Android Studio and the Android Studio emulator.
Edit 2: My mp3 files are completely normal.
"Most likely, the tap sound got a AUDIO_OUTPUT_FLAG_FAST in order to use low-latency playback if possible, but the AudioTrack class considered the track settings to be incompatible with the low-latency audio output, so the flag got removed and the track got treated as if the flag hadn't been set to begin with. So I wouldn't consider this to be something to worry about.
As for the reason why the flag got denied; I'd still say that the most probable reason is a sample rate mismatch. The log in the question you linked to appears to have been added in this commit to the AOSP. But if we look at the master branch of the code base used on many Qualcomm-based devices we see that it still has the "AUDIO_OUTPUT_FLAG_FAST denied by client" log in the case were there was a sample rate mismatch. Which logs you get depends on the exact implemetation running on your device (i.e. which device and Android version you're running)."
The answer was taken from here. Credits to Michael.
This is most likely caused by a sample-rate mismatch, and it should not affect the program at runtime IF it is running on an actual device. Ref:

can the android emulator play audio

I wanted to record sound and pass it through to the phone's speaker, but I could not get the recording code to work (the app crashes; SEE MY ATTEMPT HERE), so I am now trying to see whether the emulator can do anything related to audio at all. I copied a 1-second recording, in both wav (16-bit PCM, 44k sample rate, mono) and mp3 (recording and conversion both done in Audacity), to the sdcard. I can see the files in the IDE's file explorer, so I guess the sdcard is being detected properly by the emulator. But I could not get the emulator's built-in music player to detect them (why??).
As a second attempt, I copied the code HERE into the sample hello-world Android app. Here's the main activity class:
public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        // String PATH_TO_FILE = "/sdcard/asMP3.mp3";
        // String PATH_TO_FILE = Environment.getExternalStorageDirectory().getPath() + "/asMP3.mp3";
        String PATH_TO_FILE = Environment.getExternalStorageDirectory().getPath() + "/wavSigned16bitPCM.wav";
        MediaPlayer mp1 = new MediaPlayer();
        try {
            mp1.setDataSource(PATH_TO_FILE);
            mp1.prepare();
            mp1.start();
            Toast.makeText(getApplicationContext(), "HERE", Toast.LENGTH_SHORT).show();
        } catch (IllegalArgumentException e) {
            e.printStackTrace();
        } catch (IllegalStateException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }
}
I assumed this would start playing the sound as soon as the app starts. The Toast shows up, so I know the code is executing. The program does not crash, but nothing else happens either; there is no sound in this case as well (why?).
As a third attempt, I used the code HERE and added the files I wanted to play to res/raw as it says. This program does not crash either, but I still cannot hear anything.
So the question is: is it possible to do anything at all related to audio on the emulator? Looking at THIS QUESTION it seems it should be possible, so why isn't it happening in my program? Do I need to set any permissions in the manifest for audio output as well?
----EDIT----
I have also seen THIS, but if I use the -useaudio option the emulator just says -useaudio is an unknown option, and emulator -help does not list it, when it is clearly shown as an option on the developer website, which moreover says that useaudio is enabled by default. So why isn't my emulator playing any sound?
--- UPDATE ---
It seems the audio features do not work if the emulator has been started using a snapshot. If not, the audio features still may or may not work depending on the computer. Please see HERE.
Yes, you can do audio-related work on the emulator.
Your code sequence should be:
mp1 = new MediaPlayer();
mp1.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp1.setDataSource(PATH_TO_FILE);
mp1.prepare();
mp1.start();
And for setting permissions in the manifest file:
use RECORD_AUDIO (plus WRITE_EXTERNAL_STORAGE if you save recordings to the sdcard) for the record feature;
for the playing feature, playback of local files normally needs no extra permission, though READ_EXTERNAL_STORAGE may be required on newer API levels.
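For example, a typical manifest sketch (adjust to your app's needs):

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />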
There are guide articles on the official Android developer website:
Guide to audio capture and
Guide to media playback.
And if it doesn't work, post your logcat screenshot.

H264 encoder in android?

I've been having some problems while trying to fix a simple video recording app*. I think I followed the sequence of steps correctly. The following is a simplification of the part of the code that is giving me problems. This code is executed only as a callback once a button is pressed:
if (mRecorder != null) {
    mRecorder.reset();
    mRecorder.release();
}
mRecorder = new MediaRecorder();
if (mViewer.hasSurface) {
    mRecorder.setPreviewDisplay(mViewer.holder.getSurface());
    Log.d(TAG, "Surface has been set");
}
try {
    Log.d(TAG, "Sleeping for 4000 ms");
    Thread.sleep(4000);
    Log.d(TAG, "Waking up");
} catch (InterruptedException e) {
    Log.e(TAG, "InterruptedException");
    e.printStackTrace();
}
mRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mRecorder.setVideoFrameRate(12);
mRecorder.setVideoSize(176, 144);
mRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mRecorder.setMaxDuration(MAX_DURATION_TEST);

String targetFile = "/sdcard/webcamera/temp.mp4";
File localFile = new File(targetFile);
if (localFile.exists()) {
    Log.d(TAG, "Local file exists");
} else {
    Log.d(TAG, "Local file does not exist");
}
mRecorder.setOutputFile(targetFile);
try {
    mRecorder.prepare();
    bPrepared = true;
    Log.i(TAG, "prepared");
    return;
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    Log.e(TAG, "IOException");
    Log.e(TAG, "Message: " + e.getMessage());
    for (StackTraceElement element : e.getStackTrace()) {
        Log.e(TAG, element.toString());
    }
}
The important thing I don't understand here is that whenever I set the video encoder to MPEG_4_SP it works; whenever I set the encoder to H264 it just does not. The problem is that this piece of code is just part of a bigger project, and the rest of it expects this video to be encoded with H264.
I'm testing on a Samsung Galaxy I-7500 running Froyo, by the way, and I think the Galaxy I-9000 has the same problem.
The puzzling thing for me is that, according to this documentation:
http://developer.android.com/guide/appendix/media-formats.html, MPEG_4_SP encoding should not be supported at all, while H264 is supported since Honeycomb. So why is it working with MPEG_4_SP at all? And is it possible to make it work with H264?
The error I get is not really clear:
07-11 00:01:40.626: ERROR/MediaSource(1386): Message: prepare failed.
07-11 00:01:40.766: ERROR/MediaSource(1386): android.media.MediaRecorder._prepare(Native Method)
07-11 00:01:40.766: ERROR/MediaSource(1386): android.media.MediaRecorder.prepare(MediaRecorder.java:508)
07-11 00:01:40.766: ERROR/MediaSource(1386): com.appdh.webcamera.MediaSource.prepareOutput(MediaSource.java:74)
07-11 00:01:40.766: ERROR/MediaSource(1386): com.appdh.webcamera.MainActivity.startDetectCamera(MainActivity.java:312)
*Actually, the app is a little more complicated than just that, as it also streams the video over LAN, but the part I am concerned with here has nothing to do with that. You can check out this interesting project here: http://code.google.com/p/ipcamera-for-android/
As you already wrote, H.264 encoding support can only be expected from devices running Honeycomb and later, which currently means only tablets. If you need H.264, you should test for the prepare failure and either tell the user that the device is not supported or, better, block devices without H.264 using Market filters; a sketch of such a runtime check follows the links below. Or you can compile ffmpeg for Android, like several other projects do. Have a look at these links:
http://odroid.foros-phpbb.com/t338-ffmpeg-compiled-with-android-ndk
http://bambuser.com/opensource
FFmpeg on Android
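A minimal sketch of that runtime check, assuming mRecorder has been configured as in the question (this is an illustration, not code from the original answer):

// Attempt to prepare with H.264 and detect unsupported devices at runtime.
// Note: after a failed prepare() the recorder must be reset() and fully
// reconfigured before another encoder can be tried.
mRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
boolean h264Supported;
try {
    mRecorder.prepare();
    h264Supported = true;
} catch (IOException e) {
    h264Supported = false;
} catch (IllegalStateException e) {
    h264Supported = false;
}
if (!h264Supported) {
    // tell the user the device is not supported, or fall back to MPEG_4_SP
}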
You can also use JCodec.
It supports Android and has a few samples for it.
The easiest way to include it with Gradle is:
compile 'com.github.jcodec:jcodec:0.2.0-vg4'
but for the latest improvements and bug fixes you need to build from the latest commits (there has been no new release since 2016).

Stagefright media delay in 2.2 after setting data source?

My first post here. This website has been very useful for learning Android programming; thanks to everyone.
I have a simple app that loads an MP3 stream and plays it. It works fine on 1.6 and 2.1, but on 2.2 it doesn't quite work right. My service seems to have a problem starting: it gives me an ANR and the dialog where I have to tap "Wait", and then finally the service starts. Why is the service taking so long to start up?
Here's my simple code that sets the source and plays the audio:
public class MyActivityService extends Service {

    MediaPlayer player = new MediaPlayer();

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }

    @Override
    public void onCreate() {
        super.onCreate();
        try {
            player.setDataSource("URL OF MUSIC FILE");
        } catch (IllegalArgumentException e) {
            e.printStackTrace();
        } catch (IllegalStateException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        try {
            player.prepare(); // blocking call on the main thread; see the answer below
            player.setVolume(1, 1);
            player.setLooping(true);
            player.start();
            Toast.makeText(getApplicationContext(), "Audio Service Started.", Toast.LENGTH_SHORT).show();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
ADDENDUM:
Turns out it wasn't my service not starting, it's the audio that's not starting quickly enough in 2.2...
I got rid of the ANRs when my service started and stopped by putting the onCreate() and onDestroy() methods of my service in their own threads, which is probably how it should have been from the beginning? Sorry, just learning.
But the delay that's the real problem remains. To clarify:
For example, my code as it is right now works fine, just how I want it to, in 2.1 AND 2.2 when I set the data source to an MP3 file like this: http://dl.dropbox.com/u/6916184/TestSongStream.mp3
BUT when I set the data source to an audio stream location like this: http://d.liveatc.net/kjfk_twr or this: http://relay.radioreference.com:80/192236577 it works correctly ONLY in 2.1.
In 2.2 the sound does start playing eventually, but it takes "StagefrightPlayer" about 25 seconds after setting the data source for the audio to start, as shown here:
07-31 02:54:30.176: INFO/StagefrightPlayer(34): setDataSource('http://relay.radioreference.com:80/192236577')
07-31 02:54:55.228: INFO/AwesomePlayer(34): calling prefetcher->prepare()
07-31 02:54:56.231: INFO/Prefetcher(34): [0x2cc28] cache below low water mark, filling cache.
07-31 02:54:56.337: INFO/AwesomePlayer(34): prefetcher is done preparing
07-31 02:54:56.347: DEBUG/AudioSink(34): bufferCount (4) is too small and increased to 12
07-31 02:54:57.337: ERROR/AwesomePlayer(34): Not sending buffering status because duration is unknown.
It also takes the same amount of time, about 25 seconds, to stop the media player after player.stop() is called in the onDestroy() method of my service, after which onDestroy() continues on to Toast a message and cancel a notification.
Is that just the way it is with 2.2? If so, I can work around the delays; that's not a problem. Or, more likely, is it something I am doing wrong? But it works exactly as I want in 1.6 and 2.1!
Would posting more code help?
My code is very simple. There are no audio controls or anything like that. Simply a start audio button and stop audio button that start and stop a service that plays an audio stream.
Thank you for any help!
Do not call prepare() from the main application thread, particularly for a stream, because it may take much longer than you're allowed before an ANR. Use prepareAsync() instead. I have no idea if that is the root of your particular problem, but it would certainly be one cause of an ANR in your current implementation.
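For illustration, here is a minimal sketch of that change applied to the service code above (an assumption of how it could look, not the poster's actual fix):

player = new MediaPlayer();
try {
    player.setDataSource("URL OF MUSIC FILE");
} catch (IOException e) {
    e.printStackTrace();
}
// Start playback only once buffering has finished; the main thread is never blocked.
player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setVolume(1, 1);
        mp.setLooping(true);
        mp.start();
    }
});
player.prepareAsync(); // returns immediately, unlike prepare()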
