How to broadcast HLS using Android MediaRecorder?

I'd like to broadcast HLS using Android MediaRecorder. I'm going to save the stream to a socket and read from it (a known hack for handling live streaming without saving to a file). How can I broadcast the stream as HLS? I believe I need to split the stream into HLS chunks. Any suggestions?
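A minimal sketch of the socket hack described above (the socket name is arbitrary, MPEG_2_TS requires API 26, and camera preview setup is omitted; on older APIs MediaRecorder only writes MP4/3GPP, which would need remuxing into MPEG-TS before it can be cut into HLS chunks):

    import android.media.MediaRecorder;
    import android.net.LocalServerSocket;
    import android.net.LocalSocket;
    import android.net.LocalSocketAddress;
    import java.io.InputStream;

    public class SocketRecorder {
        // Requires the RECORD_AUDIO and CAMERA permissions.
        public InputStream start() throws Exception {
            // Pipe MediaRecorder's output through a local socket pair
            // instead of a file (the hack mentioned above).
            LocalServerSocket lss = new LocalServerSocket("mr_hls");
            LocalSocket receiver = new LocalSocket();
            receiver.connect(new LocalSocketAddress("mr_hls"));
            LocalSocket sender = lss.accept();

            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            // MPEG_2_TS (API 26+) is directly segmentable into HLS chunks.
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_2_TS);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.setOutputFile(sender.getFileDescriptor());
            recorder.prepare();
            recorder.start();

            // Read this stream on a worker thread and cut it into .ts files
            // (TS packets are 188 bytes), regenerating the .m3u8 as you go.
            return receiver.getInputStream();
        }
    }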

Related

WebView getVideo/Audio tracks or MediaStream and send it to the server via WebRTC

I have to get video/audio tracks or, if it's possible, a MediaStream object from an Android WebView that plays an HLS stream (".m3u8").
I load the HLS stream in the WebView using loadUrl("...m3u8"). It runs without issues, but I can't figure out how to get the live video and audio tracks from the HLS stream. I have read a lot of articles and was not able to find any solution. So my question is: is it possible to get the audio and video tracks from a running HLS stream in a WebView? I need them because I have to send them via a PeerConnection (WebRTC), which accepts a MediaStream or audio tracks and video tracks. Any ideas and examples will be helpful. Thank you in advance.
HLS works by defining a playlist (your .m3u8 file) which points to chunks of the video (say segment00000.ts, segment00001.ts, ...). It can contain different streams for different resolutions, but the idea (very simplified) is that your player can download a chunk, play it right away, and download the next one in the meantime.
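For illustration, a minimal playlist might look like this (a live playlist omits #EXT-X-ENDLIST and keeps appending segments while advancing #EXT-X-MEDIA-SEQUENCE):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment00000.ts
    #EXTINF:10.0,
    segment00001.ts
    #EXT-X-ENDLIST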
A WebRTC video track takes RTP packets. I don't know exactly what the Android API exposes, but it feels like either you can pass it an m3u8 as a source (though that may be a bit weird for a generic API), or you can't, and you have to read the HLS stream yourself, extract RTP frames, and pass them to WebRTC.
For instance, gstreamer's hlsdemux seems to be able to read from an HLS source. From there you could extract RTP frames and feed them to your WebRTC track.
Probably that's a bit lower-level than you hoped, but if the Android API doesn't do it for you, you have to do it yourself (or find a library that does it).

Stream 16 bit PCM to Wowza via Android Device

I have an android device that is reading a 16 bit PCM audio stream from an external bluetooth device. I've been able to write that data to an AudioTrack and play it on the Android device, so I know that the stream works. What I'd like to do is send this stream to a Wowza server, so that I can then access the live stream through a web app.
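That playback path is roughly the sketch below, assuming 16-bit mono PCM at 44100 Hz; the InputStream stands in for however the Bluetooth bytes arrive:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;
    import java.io.IOException;
    import java.io.InputStream;

    public class PcmPlayer {
        // Play a 16-bit mono PCM stream as it arrives.
        public static void play(InputStream source) throws IOException {
            int rate = 44100; // assumed sample rate
            int minBuf = AudioTrack.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, rate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf, AudioTrack.MODE_STREAM);
            track.play();
            byte[] buf = new byte[minBuf];
            int n;
            while ((n = source.read(buf)) > 0) {
                track.write(buf, 0, n);
            }
            track.stop();
            track.release();
        }
    }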
I've been able to successfully send other streams to Wowza using the libstreaming library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
I've tried opening a socket and sending out each packet that I read, but I have no idea whether this is the right approach. Does anyone have a good suggestion for implementing this?

Audio Microphone Live Streaming in Android

I am new to live streaming. I am having trouble creating a live stream of a recording. I can upload an audio file to a server and play it from a URL using MediaPlayer, but that is not what I want: I want my speech to be broadcast to everyone as I speak, and the broadcast to stop when I finish speaking. Is this feasible? If so, how?
Can I do this with Amazon CloudFront?
As far as I know, you can use Spydroid.
It is primarily for video streaming, but you can use it for audio streaming as well; you will have to alter its code for audio streaming.
Secondly, you will need a media server. I would suggest the Red5 media server, as it is open source.
Red5 supports both audio and video streaming, but you will have to study it a little.
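Whatever server you pick, the capture side is roughly the following sketch; the OutputStream is a placeholder for your transport (Red5 itself speaks RTMP, so in practice a streaming library would sit in between):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import java.io.IOException;
    import java.io.OutputStream;

    public class MicBroadcaster {
        private volatile boolean broadcasting = true;

        // Capture raw 16-bit PCM from the mic and push it to an upload
        // stream. Requires the RECORD_AUDIO permission.
        public void run(OutputStream uploadStream) throws IOException {
            int rate = 44100; // assumed sample rate
            int minBuf = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    rate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBuf);
            rec.startRecording();
            byte[] buf = new byte[minBuf];
            while (broadcasting) { // flips to false when the speech ends
                int n = rec.read(buf, 0, buf.length);
                if (n > 0) uploadStream.write(buf, 0, n);
            }
            rec.stop();
            rec.release();
        }

        public void stopBroadcast() { broadcasting = false; }
    }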

Android Live microphone sound Streaming out

I would like to stream the microphone and run an HTTP server on Android, so that a user can just go to http://xxxx.xxx.xx.xxx/xxx.wav and listen to what I say. How can I do this?
I would try to develop a small HTTP server which serves an FLV stream.
You can take ipcamera-for-android as an example. This app serves an FLV video stream; however, you could reuse the server and the FLV encoder parts.
Since FLV supports PCM streams, you can simply copy the microphone buffer into your stream.
Another variant
You can encode the microphone stream using the built-in MediaRecorder.AudioEncoder.AAC. Afterwards you can simply serve the AAC as a stream to your client.
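A rough sketch of that variant, assuming audioIn is an InputStream fed by a MediaRecorder configured with OutputFormat.AAC_ADTS and AudioEncoder.AAC (for example via the same local-socket trick); the port is arbitrary and only a single client is handled:

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class AacHttpServer {
        // Serve one client the raw ADTS/AAC byte stream over HTTP.
        public static void serve(InputStream audioIn) throws Exception {
            ServerSocket server = new ServerSocket(8080);
            Socket client = server.accept(); // single listener for simplicity
            OutputStream out = client.getOutputStream();
            out.write(("HTTP/1.1 200 OK\r\n"
                    + "Content-Type: audio/aac\r\n"
                    + "Connection: close\r\n\r\n").getBytes("US-ASCII"));
            byte[] buf = new byte[4096];
            int n;
            while ((n = audioIn.read(buf)) != -1) {
                out.write(buf, 0, n); // relay encoded audio as it arrives
                out.flush();
            }
            client.close();
            server.close();
        }
    }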

Android MediaPlayer :: How to detect streaming content type (audio or video)

I am completely new to playing streaming audio or video content using MediaPlayer.
Using the post available here, I am able to play RTSP content (.3gp video) in my MediaPlayer implementation.
How can I identify whether the streaming content is an audio-only stream or an audio/video stream, using the MediaPlayer class or the streaming link?
I could be wrong here, but I believe the only API available in Java for getting information about the content stream being played is MediaPlayer.OnInfoListener. I am not sure how helpful that API actually is, though. You might also want to try stream scrapers (that is what I believe they are called) to get stream data and see whether there are both audio and video channels, to make a determination.
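Another option worth trying is MediaPlayer.getTrackInfo() (API 16+), which reports each track's type once the player is prepared; a sketch, with the URL as a placeholder:

    import android.media.MediaPlayer;
    import android.util.Log;
    import java.io.IOException;

    public class StreamTypeProbe {
        public static void probe(String url) throws IOException {
            MediaPlayer mp = new MediaPlayer();
            mp.setDataSource(url); // e.g. an RTSP or HTTP stream URL
            mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer player) {
                    boolean hasVideo = false;
                    // getTrackInfo() is only valid after prepare completes.
                    for (MediaPlayer.TrackInfo info : player.getTrackInfo()) {
                        if (info.getTrackType()
                                == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_VIDEO) {
                            hasVideo = true;
                        }
                    }
                    Log.d("StreamType", hasVideo ? "audio/video" : "audio only");
                }
            });
            mp.prepareAsync();
        }
    }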
