Problem playing media from RTSP URL in Android

I have made an Android streaming application that plays media from online URLs, using the standard MediaPlayer class.
As per the Android documentation, MediaPlayer supports the RTSP protocol for audio and video playback:
http://developer.android.com/guide/appendix/media-formats.html
But when I try to play media from an RTSP URL, it connects, but I am not able to hear any media.
Following is one of those RTSP URLs:
rtsp://sfera.live24.gr/sfera4132
Does anybody have an idea how to play RTSP URLs through the Android MediaPlayer?
Thanks

The link you provided has 3 audio tracks; the first and last appear to be silent and don't contain any valid audio.
The middle track has audio (as per VLC). I don't know how Android deals with multiple audio tracks. I imagine you may get better results if you use links that contain at most 1 audio and 1 video track.
I expect that for an RTSP stream with multiple audio tracks, Android will only play the first one, as there is no user interface for selecting a specific audio stream; that is likely why you aren't hearing any audio.
If this is a stream from your own server, you should adjust the SDP file so the valid audio track comes first. If it is not from your server, I don't know what your options are.
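For what it's worth, the basic MediaPlayer setup for an RTSP source looks roughly like the sketch below (error handling trimmed; the URL is the one from the question). It won't fix a track-ordering problem, but it rules out setup mistakes such as preparing a network stream synchronously:

import java.io.IOException;
import android.media.AudioManager;
import android.media.MediaPlayer;

// Minimal sketch: play an RTSP stream with the stock MediaPlayer.
// prepareAsync() is used because preparing a network stream on the
// main thread would block; playback starts from the prepared callback.
void playRtsp(String url) throws IOException {
    MediaPlayer player = new MediaPlayer();
    player.setAudioStreamType(AudioManager.STREAM_MUSIC);
    player.setDataSource(url); // e.g. "rtsp://sfera.live24.gr/sfera4132"
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start(); // begin playback once the stream is buffered
        }
    });
    player.prepareAsync();
}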

Related

WebView getVideo/Audio tracks or MediaStream and send it to the server via WebRTC

I have to get video/audio tracks, or if possible a MediaStream object, from an Android WebView which plays an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get audio and video tracks from the running HLS stream in the WebView? I need them because I have to send them via PeerConnection (WebRTC), which accepts a MediaStream or audio tracks and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
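To make that concrete, here is a rough sketch of the playlist-fetching step in plain Java (the playlist URL is hypothetical, and a real player also handles variant playlists, live playlist reloading, encryption, and so on):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;

// Sketch: fetch an HLS media playlist and list its segment URLs.
// Lines starting with '#' are tags; the remaining lines name chunks
// (relative or absolute URLs) that a player downloads in order.
void listSegments(String playlistUrl) throws IOException {
    URL base = new URL(playlistUrl); // e.g. "http://example.com/live/stream.m3u8" (hypothetical)
    try (InputStream in = base.openStream();
         BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.isEmpty() || line.startsWith("#")) continue;
            System.out.println("segment: " + new URL(base, line)); // resolve relative names
        }
    }
}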
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).
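If you do go the do-it-yourself route on Android, the libwebrtc bindings (org.webrtc) let you push decoded frames into a video track through the source's CapturerObserver. A rough sketch, assuming an already-initialized PeerConnectionFactory and eliding the actual demux/decode into the I420 buffer:

import org.webrtc.JavaI420Buffer;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.VideoFrame;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

// Sketch: feed self-decoded frames into a WebRTC video track.
// Assumes PeerConnectionFactory.initialize(...) was already called.
VideoTrack trackFromDecodedFrames(PeerConnectionFactory factory) {
    VideoSource source = factory.createVideoSource(false); // false = not a screencast
    VideoTrack track = factory.createVideoTrack("hls-video", source);

    // For each frame you decode from the HLS segments:
    JavaI420Buffer buffer = JavaI420Buffer.allocate(1280, 720);
    // ... copy the decoded Y/U/V planes into buffer here ...
    VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, System.nanoTime());
    source.getCapturerObserver().onFrameCaptured(frame);
    frame.release();

    return track;
}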

Playing an audio file in reverse using ExoPlayer

I want to play an audio file backward using ExoPlayer (or another media player) on an Android device.
It is said to be very difficult to play video backward:
https://github.com/google/ExoPlayer/issues/2191
But if I can reverse the PCM sample data stream going into the player, I can play the audio backward.
I'm now trying to reverse part of a music file using android.media.MediaExtractor and android.media.MediaCodec, but I can't create a reversed audio stream because I don't know how to create the header of the sample data in a short array.
The best option would be to play the audio file backward in ExoPlayer; the second option is to create a reversed audio stream and play it; failing that, any method that plays audio backward will do.
Thanks for your attention. I’m looking forward to your reply.
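One note on the second option: once MediaExtractor/MediaCodec have decoded a chunk to raw PCM, there is no header to construct; the decoder output is just interleaved samples. Reversing them means swapping whole frames (one sample per channel) so the channels stay aligned. A minimal sketch for 16-bit PCM, whose output you could feed to an AudioTrack:

// Reverse 16-bit interleaved PCM in place, frame by frame.
// A frame is one sample per channel; swapping whole frames keeps
// left/right channels aligned.
static void reversePcm(short[] pcm, int channelCount) {
    int frameCount = pcm.length / channelCount;
    for (int i = 0, j = frameCount - 1; i < j; i++, j--) {
        for (int ch = 0; ch < channelCount; ch++) {
            short tmp = pcm[i * channelCount + ch];
            pcm[i * channelCount + ch] = pcm[j * channelCount + ch];
            pcm[j * channelCount + ch] = tmp;
        }
    }
}

Note that this reverses a single buffer; to reverse a whole file you also have to emit the decoded buffers themselves in reverse order (decode everything to memory or a temporary file first, since MediaCodec only decodes forward).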

How to play live AAC stream on Android with html5 audio element

I am trying to embed an html5 audio tag in a page to allow playing a live AAC+ stream coming from an Icecast server.
According to the media formats developer's guide, Android supports playback of several AAC flavors, either inside an MPEG-4 container or in ADTS.
I have successfully played AAC-encoded audio files in an MPEG-4 container, thus:
<audio controls="controls">
<source src="http://www.example.com/audio/program1.mp4" type="audio/mp4"/>
</audio>
However, I have not been able to play any AAC live stream (which, as far as I understand, is output by Icecast using ADTS) with the audio tag. I have tried setting different types (e.g., "audio/aac", which the player says it can "probably" play) as well as different file extensions for the stream URL. Nothing works. The player, by the way, initializes as if everything is OK, then when you press the play button nothing happens (other than the play button changing to a pause icon).
The only way I have been able to play a live AAC stream is by using a URL pointing to a .sdp manifest containing a link to an RTSP version of the stream. The browser then hands off the stream to the native audio player or another audio app, which plays it after a brief buffering period. This is not an option for us, as we would like to use a simple Icecast server for our stream.
Is there just no way to play a live AAC stream on Android via HTTP? It seems iOS supports it, but not Android.
From the lack of responses to the contrary, I have to conclude that the answer to the original question is, "No, it is not possible to use the HTML5 audio tag to play a live AAC+ stream from an Icecast server".
I am posting an answer to share what I ended up doing.
My first inclination was simply to set up a second Icecast stream using MP3 instead of AAC. This will work, but you must be willing to accept the buffering delay that Android's audio player introduces with MP3 streams. Unfortunately, at 64 kbps Android makes you wait for over 40 seconds before it will start playing the MP3 stream. Admittedly, 64 kbps is not very good quality for MP3, but even at 128 kbps the buffering takes over 20 seconds, enough for listeners to conclude that the stream is down. So MP3 is not an option for us.
My eventual solution was to ask our CDN to add a Wowza application that pulls from the AAC+ Icecast stream and transmuxes it to HLS.
Now my audio tag looks like this:
<audio controls="controls">
<source src="http://www.example.com/wowza/stream.m3u8"/>
<source src="http://www.example.com/audio/aac"/>
</audio>
Note that I had to list the HLS source first, because otherwise the Android device will actually pick the Icecast stream and try to play it, which it can't (you'd think it would know enough not to do that).
So in the end Android does play a live AAC+ stream with no delay, as long as it is delivered via HLS, and not directly from Icecast. I must say I was very disappointed with Android, both its lack of support for direct Icecast AAC+ and for its poor handling of live MP3 streams, especially since the competition (iOS) handles everything you throw at it without blinking.

Audio Microphone Live Streaming in Android

I am new to live streaming. I have a problem creating a live stream from a recording. I can upload an audio file to a server and play it from a URL using MediaPlayer, but that is not what I want: I want my speech broadcast to everyone as I speak, and the broadcast to stop when I finish speaking. Is this feasible or not? If feasible, then how?
Can I do this with Amazon CloudFront?
As far as I know, you can use Spydroid.
It is basically for video streaming, but you can use it for audio streaming as well; you will have to alter its code for audio-only streaming.
Secondly, you will need a media server. I would suggest the Red5 media server, as it is open source.
Red5 supports both audio and video streaming, but you will have to study it a little.
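Whichever server you choose, the capture side is the same: read raw PCM from the microphone with AudioRecord and hand each buffer to your encoder/packetizer. A minimal capture-loop sketch (the streaming flag and sendToServer are placeholders for whatever Spydroid/Red5 publishing code you end up with; the RECORD_AUDIO permission is required):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

volatile boolean streaming = true; // flip to false to stop broadcasting

// Sketch: capture raw PCM from the microphone. Each filled buffer
// would be encoded (e.g. AAC via MediaCodec) and published.
void captureMic() {
    int sampleRate = 44100;
    int bufSize = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, bufSize);
    short[] buffer = new short[bufSize / 2];
    recorder.startRecording();
    while (streaming) {
        int read = recorder.read(buffer, 0, buffer.length);
        if (read > 0) {
            sendToServer(buffer, read); // placeholder: encode + publish here
        }
    }
    recorder.stop();
    recorder.release();
}

void sendToServer(short[] pcm, int length) {
    // placeholder for the encode-and-publish step
}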

Android Mediaplayer :: How to detect streaming content type (audio or video)

I am completely new to playing streaming content, either audio or video, using MediaPlayer.
Using a post available here, I am able to display RTSP content (.3gp video) in my MediaPlayer implementation.
How can I identify whether the streaming content is an audio-only stream or an audio/video stream, using the MediaPlayer class or the streaming link?
I could be wrong here, but I believe the only Java API for getting information about the content stream being played is MediaPlayer.OnInfoListener. I'm not sure how helpful that API actually is, though. You might also want to try stream scrapers (I believe that is what they are called) to get stream data and see whether there are both audio and video channels to make a determination.
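On API 16 and above, another option is to ask MediaPlayer directly once it is prepared: getTrackInfo() reports each track's type, so a stream with no video track is audio-only. A minimal sketch, to be called from the OnPreparedListener:

import android.media.MediaPlayer;

// Sketch (API 16+): check whether a prepared MediaPlayer has a video track.
static boolean hasVideoTrack(MediaPlayer mp) {
    for (MediaPlayer.TrackInfo info : mp.getTrackInfo()) {
        if (info.getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_VIDEO) {
            return true;
        }
    }
    return false; // only audio (or other) tracks found
}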
