Can an FLV AAC stream be played on Android?

I'm trying to build a radio player, and the client is providing a stream that is an FLV container with AAC audio.
When I read the headers, it shows up as audio/aacp.
I have tried every approach I could find:
1) Streaming through MediaPlayer (does not work; see the sketch after this question)
2) Using the NPR approach of a local proxy stream (I get a broken-pipe exception)
3) Playing it in chunks (plays, but it needs the SD card and the playback is not great)
4) Using the GPL'd FAAD2 library, but I would have to pay the royalty fee
Can someone help me figure this issue out?
The last option I have is to ask my client to change the stream to an MP3 container (which I know works).
Regards,
Hari
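
(For context, option 1 above in its simplest form boils down to something like the sketch below; the stream URL is a placeholder. That this path fails squares with FLV not being among Android's supported media containers.)

import android.media.AudioManager;
import android.media.MediaPlayer;
import java.io.IOException;

public class RadioPlayer {
    private final MediaPlayer player = new MediaPlayer();

    // Minimal streaming setup; the URL is a placeholder.
    public void play(String streamUrl) throws IOException {
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource(streamUrl); // e.g. "http://example.com/live"
        player.setOnPreparedListener(MediaPlayer::start);
        player.prepareAsync(); // buffers in the background, then starts
    }

    public void stop() {
        player.reset();
        player.release();
    }
}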

Related

Get video/audio tracks or a MediaStream from a WebView and send them to the server via WebRTC

I have to get the video/audio tracks or, if possible, a MediaStream object from an Android WebView that is playing an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks out of the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from the running HLS stream in the WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio and video tracks. Any ideas and examples would be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
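
If you do go the do-it-yourself route, the org.webrtc Android library lets you push decoded frames into a video source by hand. A rough sketch, assuming a recent version of the library; the track id is arbitrary, and the actual demuxing/decoding of the HLS segments (the hard part) is not shown:

import org.webrtc.JavaI420Buffer;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.VideoFrame;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

public class HlsToWebRtcBridge {
    private final VideoSource videoSource;
    private final VideoTrack videoTrack;

    public HlsToWebRtcBridge(PeerConnectionFactory factory) {
        // A source we can feed manually instead of wiring up a camera capturer.
        videoSource = factory.createVideoSource(/* isScreencast= */ false);
        videoTrack = factory.createVideoTrack("hls-video", videoSource);
    }

    // Call this for every frame you decode out of the HLS stream.
    public void onDecodedFrame(JavaI420Buffer i420, long timestampNs) {
        VideoFrame frame = new VideoFrame(i420, /* rotation= */ 0, timestampNs);
        videoSource.getCapturerObserver().onFrameCaptured(frame);
        frame.release();
    }

    public VideoTrack track() {
        return videoTrack; // add this to your PeerConnection or MediaStream
    }
}

Audio is less straightforward, since the library's audio path normally goes through the device's audio module rather than an injectable track source.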

How to play live AAC stream on Android with html5 audio element

I am trying to embed an HTML5 audio tag in a page to allow playing a live AAC+ stream coming from an Icecast server.
According to the media formats developer's guide, Android supports playback for several AAC flavors, either inside an MPEG-4 container or in ADTS.
I have successfully played AAC-encoded audio files in an MPEG-4 container, thus:
<audio controls="controls">
  <source src="http://www.example.com/audio/program1.mp4" type="audio/mp4"/>
</audio>
However, I have not been able to play any live AAC stream (which, as far as I understand, Icecast outputs as ADTS) with the audio tag. I have tried setting different types (e.g., "audio/aac", which the player says it can "probably" play) as well as different file extensions for the stream URL. Nothing works. The player, by the way, initializes as if everything is OK; then, when you press the play button, nothing happens (other than the play button changing to a pause icon).
The only way I have been able to play a live AAC stream is by using a URL pointing to a .sdp manifest containing a link to an RTSP version of the stream. The browser then hands off the stream to the native audio player or another audio app, which plays it after a brief buffering period. This is not an option for us, as we would like to use a simple Icecast server for our stream.
Is there just no way to play a live AAC stream on Android via HTTP? It seems iOS supports it, but not Android.
From the lack of responses to the contrary, I have to conclude that the answer to the original question is, "No, it is not possible to use the HTML5 audio tag to play a live AAC+ stream from an Icecast server".
I am posting an answer to share what I ended up doing.
My first inclination was simply to set up a second Icecast stream using MP3 instead of AAC. This will work, but you must be willing to accept the buffering delay that Android's audio player introduces with MP3 streams. Unfortunately, at 64 kbps Android makes you wait over 40 seconds before it will start playing the MP3 stream. Admittedly, 64 kbps is not very good quality for MP3, but even at 128 kbps the buffering takes over 20 seconds, enough for listeners to conclude that the stream is down. So MP3 is not an option for us.
My eventual solution was to ask our CDN to add a Wowza application that pulls from the AAC+ Icecast stream and transmuxes it to HLS.
Now my audio tag looks like this:
<audio controls="controls">
  <source src="http://www.example.com/wowza/stream.m3u8"/>
  <source src="http://www.example.com/audio/aac"/>
</audio>
Note that I had to list the HLS source first, because otherwise the Android device will actually pick the Icecast stream and try to play it, which it can't (you'd think it would know enough not to do that).
So in the end Android does play a live AAC+ stream with no delay, as long as it is delivered via HLS, and not directly from Icecast. I must say I was very disappointed with Android, both its lack of support for direct Icecast AAC+ and for its poor handling of live MP3 streams, especially since the competition (iOS) handles everything you throw at it without blinking.

Is MP4 a streaming protocol or a file format?

I am currently using Wowza to stream videos, and I am trying to integrate Wowza, Android, and a ChromeCast device (CCD). According to this document, https://developers.google.com/cast/docs/media, Google Cast supports the "MP4 protocol".
So, my question is this: is MP4 a streaming protocol, file format, or both?
In the ChromeCast Android demo applications, they simply pass a URL like this http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 as metadata to the CCD.
To me, this implies that no server is required to stream the MP4 file. Meaning, I won't even need Wowza as an intermediary party to stream.
Is this understanding correct?
It seems that the client player will then be responsible for interacting with the MP4 file directly (e.g., seek, pause, stop, play, etc.).
While you've already accepted an answer, and gotten your app to work (which was likely your ultimate goal), I thought it might be helpful to answer your question as well about what MP4 really is.
MP4 is a video container format; inside the MP4 container is video stream data (generally encoded in the H.264 format) and audio stream data (often encoded in the AAC format). The client player can interact with it directly because the Chromecast's browser has HTML5 video support for interpreting the MP4 container format and playing back the H.264 video and AAC audio, but it isn't "streaming" in the way that term is often used ... it's just downloading it from your web server in chunks and playing it back. There's nothing wrong with this if it's performing as you'd like (in fact, this is one of the big benefits of HTML5 video, that it doesn't need a streaming server backend), but if you actually want true media streaming (to leverage things such as adaptive bitrate switching, licensing, and so forth), you would have the MP4 file served via Wowza rather than via your web server.
If you simply have an MP4 file, just pass its URL and it should work fine, just like the sample (CastVideos) projects that we have on GitHub.
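
To make "just pass its URL" concrete on the Android sender side, a minimal sketch using the Cast SDK's MediaInfo builder might look like this (the title string is arbitrary):

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.MediaMetadata;

public final class CastSample {
    // Describes media the Chromecast fetches directly over HTTP;
    // no intermediate streaming server is involved for a plain MP4.
    public static MediaInfo buildSampleMedia() {
        MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        metadata.putString(MediaMetadata.KEY_TITLE, "Big Buck Bunny");

        return new MediaInfo.Builder(
                "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
                .setContentType("video/mp4")
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setMetadata(metadata)
                .build();
    }
}

You would then hand the MediaInfo to the sender's media channel (RemoteMediaClient in current SDKs, RemoteMediaPlayer in older ones) to start playback.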

Android MediaPlayer radio streaming with WMA codec

I'm trying to play a radio audio stream with MediaPlayer on Android 4.0.3 (API 15).
Some of the stations work, but a lot of them fail, for example:
http://switch3.castup.net/cunet/gm.asp?ai=31&ar=88FM
This station (and many others) returns the following error:
error (1, -2147483648).
I checked the codec being used by that station, and it's WMA.
The media player doesn't support this codec, but I know there is an application ("Radio Israel") that can play this station.
My question is: is there any workaround for playing that stream?
Thanks
WMA is a Windows format, and Android is Linux-based. You could use a proxy server of your own to transcode into MP3 (or another format), then stream from there.

Stream audio file from FTP server to Android app

I have an FTP server set up that holds audio files in one of its directories. I would like to stream the audio from the server and play it on my Android phone, instead of downloading it and then playing it back. Also, is it possible to stream it to the MediaPlayer in Android for playback?
The FTP protocol does not support streaming audio or video.
However, you could set up a streaming server on the same box that will do it for you. I've used VLC to stream video, and it's pretty easy to set up. It should work for audio too.
http://www.videolan.org/doc/streaming-howto/en/index.html
You can stream video over FTP. It is just a basic transfer protocol, and once you have the data streaming to your device you can do what you want with it. Take a look at this tutorial if you want to set up streaming to your phone:
https://www.digitaldrugs.co.uk/wordpress/?p=37
Sure it is possible; the only problem I see is that your media files should be in a continuous file format, such as MP3. See SHOUTcast streaming, for example; it works via HTTP.
yxplayer is what you want, but it might be a bit limited
You can stream MP3 over FTP, the same way you can download an MP3 from FTP and listen to it before it has finished downloading. There are file managers/explorers, FX for one, that will do this, but all of its streaming features are trial-only or by now perhaps a paid unlock. Look for an open-source remedy.
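
To make the "listen to it before it's finished downloading" idea concrete, here is a rough sketch of the download-then-play approach, using Apache Commons Net for the FTP side. The library choice, host, credentials, and file names are all assumptions, and real code would run the transfer on a background thread (and could start playback after an initial buffer instead of waiting for the whole file):

import android.media.MediaPlayer;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class FtpAudioPlayer {

    // Copies the remote file into the app cache, then plays it.
    public void downloadAndPlay(File cacheDir) throws IOException {
        File local = new File(cacheDir, "track.mp3");

        FTPClient ftp = new FTPClient();
        try {
            ftp.connect("ftp.example.com");        // placeholder host
            ftp.login("user", "password");         // placeholder credentials
            ftp.setFileType(FTP.BINARY_FILE_TYPE); // don't ASCII-mangle audio
            try (FileOutputStream out = new FileOutputStream(local)) {
                ftp.retrieveFile("/audio/track.mp3", out);
            }
        } finally {
            if (ftp.isConnected()) {
                ftp.disconnect();
            }
        }

        MediaPlayer player = new MediaPlayer();
        player.setDataSource(local.getAbsolutePath());
        player.setOnPreparedListener(MediaPlayer::start);
        player.prepareAsync();
    }
}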
