Can't play streaming audio in Android web browser

Ok, I am lost. I have two different streams and can use either one, but neither will work on a PHP or HTML web page on Android. I've tried the HTML5 AUDIO tag and a few other JavaScript libraries. A search of this board hasn't helped me much, just dead ends.
The Mp3 stream is http://8713.live.streamtheworld.com:80/WWWQFMMOB_SC
The AAC stream is http://5913.live.streamtheworld.com:80/WWWQFMAAC_SC
These are actually inside .pls files: playerservices.streamtheworld.com/pls/WWWQFM.pls (MP3 stream) and playerservices.streamtheworld.com/pls/WWWQFMAAC.pls (AAC stream).
I can use any of them on an iPhone, but on Android... no luck. It doesn't have to be in the same code as the iPhone player.
Any ideas?? I'm not a pro at code by far, but I'm not bad at adapting, changing and following the logic. Thanks!!

Related

Playing RTSP videos with h264 output in Android

I need to stream RTSP links within a VideoView. RTSP links with a .mov output, such as rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov, work fine. However, one RTSP link I got from a confidential source has an H.264 output, according to VLC Player.
TL;DR: how do you implement streaming for these kinds of RTSP links, and if there's no clear way to code it, are there any external libraries for Android Studio that handle this easily? I'm kind of at a loss here.
EDIT: Changed title. "Streaming" can be interpreted differently, as in sending RTSP videos from your Android device. That's NOT what I want to accomplish. A lot of examples on GitHub are heavily focused on SD card storage and sending video outward, but I am still looking for a way to play RTSP videos with H.264 output in my application.
Try MediaPlayerSDK from VXG. It is the only one that seems to work, and it is open source.
https://github.com/VideoExpertsGroup/MediaPlayerSDK
However, according to some posts this can be done with FFmpeg as well. I didn't try it, but you can give it a shot.
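For reference, the plain VideoView baseline the question describes looks roughly like the sketch below; if this is what fails on the confidential H.264 link, the SDK or FFmpeg route above is the fallback. This is only a minimal sketch: the layout file name and the video_view id are assumptions for illustration, and the URL is the Big Buck Bunny test link from the question.

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_player); // assumed layout containing a VideoView

        VideoView videoView = (VideoView) findViewById(R.id.video_view); // assumed id
        videoView.setMediaController(new MediaController(this));
        // The test URL from the question; a given H.264 RTSP stream may still fail here
        // if the device's framework decoder rejects it.
        videoView.setVideoURI(Uri.parse("rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov"));
        videoView.requestFocus();
        videoView.start();
    }
}

If this shows the stock "Can't play this video" dialog on the problem stream, that is usually the point where an external player SDK or an FFmpeg-based library becomes necessary.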

How to make audio in swiffy HTML5 files play in mobile browsers?

I have taken this problem to various forums, and here once (where it was blocked as off-topic). It seems on-topic to me - a specific programming problem, a software algorithm, and a software tool commonly used by programmers.
I have a website of Flash SWF animations with sound. I converted the SWF to HTML5 using Google's swiffy. On desktop PCs, using Chrome, FF and IE, there is sound, but not in Mobile Safari and Android browsers. And not in Chrome on my iPad.
The audio is embedded in the code generated by swiffy, and it looks like it's MP3 encoded in base64 (see "data:audio/mpeg;base64," and "format":"MP3" in the html).
Since there is no developer forum for swiffy, and I get no replies from the feedback form, I looked around to identify what's stopping Mobile Safari, Android browsers and iPad Chrome from playing the sound. For Safari, I find things like "You cannot preload sound files", but since the sound is embedded in swiffy, there's no sound file to preload. Why there is no sound in the Android browser and iPad Chrome, both Google products, is also a mystery.
I imagine there's no hack that will solve the problem, if Google hasn't managed to, but insights are appreciated.
I know it's quite late for you, but I think some people may face the same problem and come to this post, so I would like to offer my answer here.
As Victor said, you can use getURL to trigger the sound externally. If I were using getURL, I would use the Web Audio API on the JavaScript side, because with the Web Audio API it is possible to play multiple tracks simultaneously on mobile. I integrated the SoundJS library into my project.
However, since you have to change every audio frame into a getURL call, it actually costs you quite a lot of time. I found another option: edit the swiffy core once and for all so that it reuses a native Audio object (but that also means you can only play one sound at a time on mobile).
Originally, swiffy creates a native Audio object to play the base64-encoded sound. Because of mobile restrictions, it stays silent if the sound is not triggered by an input event. However, it is possible to reuse an Audio object that is already active.
First, I have a button that starts the animation, which also creates an Audio object and plays it (with a zero-length base64-encoded sound):
// Create and "unlock" the Audio object from a user gesture, using an empty base64 MP3 as the source
audio = new Audio("data:audio/mpeg;base64,");
audio.play();
In the swiffy core script, search for the keyword "new Audio" and you will find the part where swiffy plays sound. Instead of creating a new Audio object, make swiffy reuse your existing one:
// Point the already-unlocked Audio object at swiffy's base64-encoded sound data and play it
audio.src = a.sound;
audio.play();
After that, without further changes, your SWF should be able to play sound even on mobile.
However, since this is quite a hack, it may cause some unexpected problems.
You can make the swiffy Flash communicate with the main HTML file, so inside ActionScript you can send a getURL call to JavaScript, where you have a hidden HTML5 MP3 player. Something like:
getURL("Javascript:playSound();");
In the HTML, you need a player for that JavaScript function to trigger, something like:
<!-- the "player" id and the playSound() name simply match the getURL call above -->
<audio id="player">
<source src="test.ogg" type="audio/ogg">
<source src="test.mp3" type="audio/mpeg">
</audio>
<script>function playSound() { document.getElementById("player").play(); }</script>
I'm still working on this, but I hope it helps you.

Android: Create app to stream flash videos

I am trying to create an Android App to stream live/archived videos from my church's website.
However, I ran into a problem because all of the streams serve .flv (Flash) videos and/or Flash players...
I have successfully been able to load .3gp videos in a VideoView, but because Android doesn't support Flash natively, I tried to open the videos via a WebView.
This didn't work. At least, not for the links that I am working with. However, I can open youtube.com and click on any video to play it - but I can't play any of the streams from the church website.
My question:
Is there any way for me to make this work?
I have access to:
1) an RTSP stream of .f4v
2) an HTTP stream of .m3u8
3) an RTMP stream of .f4v
I have spent two days searching the web for ideas or fixes, and nothing I find seems to work for my particular case.
It seems to me that the only option is to have the church stream direct .3gp/mp4 files that I can access.
Otherwise, I have no clue how to make .f4v files work. No luck with WebViews yet...
Do any of you have any suggestions for me?
P.S. I will also have to create an iOS app, so I'm looking for a solution that will work on both platforms.
Thanks for your time!
To answer my own question:
It seems that the Android emulator cannot play Flash/m3u8 files.
However, my Nexus 7 handles them just fine with both VideoView and WebView!
A cool library I found is Vitamio, which is supposed to solve the problem I had, though I didn't end up using it.
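For what it's worth, on a real device the .m3u8 (HLS) option usually goes straight into a VideoView. Below is only a rough sketch: the stream URL is a placeholder for the church's actual stream, and the VideoView is assumed to come from your own layout.

import android.media.MediaPlayer;
import android.net.Uri;
import android.widget.VideoView;

public final class HlsHelper {
    // Point an existing VideoView at an HLS stream; this works on real devices (Android 3.0+),
    // while the emulator often refuses to play it, as noted above.
    public static void playHls(VideoView videoView) {
        videoView.setVideoURI(Uri.parse("http://example.com/live/stream.m3u8")); // placeholder URL
        videoView.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            @Override
            public boolean onError(MediaPlayer mp, int what, int extra) {
                // Returning false lets VideoView show its default error dialog
                return false;
            }
        });
        videoView.start();
    }
}

HLS is also the format iOS plays natively in its video element, which helps with the requirement to support both platforms.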

stream audio file from FTP server to Android App

I have an FTP server set up that holds audio files in one of its directories. I would like to stream the audio from the server and play it on my Android phone instead of downloading it and then playing it back. Also, is it possible to stream it to the MediaPlayer in Android for playback?
The FTP protocol does not support streaming audio or video.
However, you could set up a streaming server on the same box that will do it for you. I've used VLC to stream video and it's pretty easy to set up. Should work for audio too.
http://www.videolan.org/doc/streaming-howto/en/index.html
You can stream video over FTP. It is just a basic transfer protocol and once you have the data streaming to your device you can do what you want with it. Take a look at this tutorial if you want to set up streaming to your phone:
https://www.digitaldrugs.co.uk/wordpress/?p=37
Sure, it is possible; the only problem I see is that your media files should be in a continuous, streamable format, such as MP3. See SHOUTcast streaming, for example; it works over HTTP.
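A rough sketch of that HTTP approach on the Android side, assuming the file is exposed over HTTP rather than FTP (the URL below is only a placeholder):

import android.media.AudioManager;
import android.media.MediaPlayer;
import java.io.IOException;

public final class HttpAudioStreamer {
    // Stream an MP3 over HTTP with MediaPlayer; prepareAsync() buffers in the background
    // instead of blocking the UI thread.
    public static MediaPlayer play(String url) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource(url); // e.g. "http://example.com/audio/track.mp3" (placeholder)
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // start playback once enough data has been buffered
            }
        });
        player.prepareAsync();
        return player;
    }
}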
yxplayer is what you want, but it might be a bit limited
You can stream MP3 over FTP, the same way you can download an MP3 from FTP and listen to it before the download has finished. There are file managers/explorers, FX for one, that will do this, but all of its streaming features are part of a trial, or by now maybe a paid unlockable feature. Look for an open-source remedy.

HTML5 audio on Android 2.1 / Opera Mobile 11

I'm trying to create an HTML5 page with embedded audio. The page has to work on a ZTE BASE Lutea smartphone (Android 2.1) with Opera Mobile 11. The audio format is MP3, but OGG would be fine too. All files have to be stored locally on the smartphone's SD card.
My problem:
While audio plays fine when loaded from a web server, it fails when the files are local. This is definitely not a source path error: the audio file is preloaded and I can alert the correct audio duration. Then, when the audio play() method is called, it throws a MEDIA_ERR_DECODE.
This is always the same no matter if I use MP3, OGG or the original WAV file. Any idea why there is a decoding problem of local audio files?
EDIT: When I'm opening a local HTML page "file://localhost/sdcard/index.html" in Opera and the page embeds an audio "http://localhost/audio.mp3", it works. Of course I had to install a local webserver and this would be a very bad solution.
Why is the HTTP protocol needed to play an audio file, does something like a local sandbox exist on android?
EDIT: I found that the Video object is able to play local sounds without problems. Unfortunately it's useless to me because I need invisible audio playback, and in Opera Mobile there is no way to prevent the video from going fullscreen.
With PhoneGap, you should be able to play local audio.
Opera 12 has the same issue: if the audio is played from the cache, it does not work. I am using AppCache.
This worked in Firefox Mobile, so we used that.
