Streaming an audio file in Android via RTP

I am looking for a way to stream a prerecorded MP3 or WAV file over the internet using SIP and RTP. The main stumbling block so far has been how to get a stream from a file and pace it correctly, so that it can be delivered via RTP just like a live stream from a microphone or video camera.

Related

Is android.net.rtp only for VoIP applications?

Since API 12, Android has had its own RTP package, android.net.rtp.
There are four classes in this package: RtpStream, AudioStream, AudioCodec and AudioGroup.
However, it seems that all of those classes are only for VoIP applications. I'm trying to stream a 'static' audio file (such as an MP3 file on the SD card) over RTP to a VLC player, and it appears to be impossible with android.net.rtp.
I tried this example: http://androidsourcecode.blogspot.com/2013/10/android-rtp-sample-receiving-via-vlc.html
The result: the sound from my Android phone's microphone was sent to the VLC player on my Mac just fine.
So, is it possible to stream a 'static' file by decoding it with Android's native media decoder and pushing the decoded data through AudioRecord into an AudioStream?
Put simply, is a fake-microphone pipeline (MP3 file -> MediaPlayer -> AudioRecord -> AudioStream) possible?
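android.net.rtp offers no public way to feed an AudioStream from anything other than the AudioGroup's microphone path, so the fake-microphone pipeline above seems to be a dead end. The usual workaround is to decode the file yourself and build the RTP packets by hand, sending them over a plain DatagramSocket. A minimal sketch of the RFC 3550 header construction (the class name and parameters are illustrative, not part of any Android API):

```java
import java.nio.ByteBuffer;

// Sketch of a hand-rolled RTP packetizer (RFC 3550 fixed header).
// Payload type 0 = PCMU; other codecs use other payload type numbers.
public class RtpPacketizer {
    private final int ssrc;     // stream identifier, chosen randomly per stream
    private int sequence;       // increments by 1 per packet
    private long timestamp;     // advances in media samples, not wall-clock time

    public RtpPacketizer(int ssrc) { this.ssrc = ssrc; }

    // Wrap one chunk of encoded audio in a 12-byte RTP header.
    public byte[] packetize(byte[] payload, int payloadType, int samplesPerPacket) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                  // V=2, no padding, no extension, no CSRC
        buf.put((byte) (payloadType & 0x7F));  // marker bit clear
        buf.putShort((short) sequence++);      // sequence number
        buf.putInt((int) timestamp);           // media timestamp
        buf.putInt(ssrc);                      // SSRC
        buf.put(payload);
        timestamp += samplesPerPacket;
        return buf.array();
    }
}
```

Each packet would then be sent with DatagramSocket.send(), sleeping for one packet duration (e.g. 20 ms for 160 PCMU samples) between sends so the receiver gets the stream in real time.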

How to broadcast HLS using Android MediaRecorder?

I'd like to broadcast HLS using Android MediaRecorder. I'm going to write the stream to a socket and read from it (a known hack for handling live streaming without saving to a file). How can I broadcast the stream as HLS? I believe I need to split the stream into HLS chunks. Any suggestions?
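MediaRecorder itself won't emit HLS: the socket stream has to be remuxed into short MPEG-TS segments, and a media playlist has to be republished as each segment completes. As a sketch of the playlist side only (RFC 8216 format; the class and segment names are hypothetical):

```java
import java.util.List;

// Sketch: build a live HLS media playlist for already-segmented .ts chunks.
public class HlsPlaylist {
    public static String build(int mediaSequence, int targetDuration,
                               List<String> segments, double segDuration) {
        StringBuilder sb = new StringBuilder();
        sb.append("#EXTM3U\n");
        sb.append("#EXT-X-VERSION:3\n");
        sb.append("#EXT-X-TARGETDURATION:").append(targetDuration).append('\n');
        sb.append("#EXT-X-MEDIA-SEQUENCE:").append(mediaSequence).append('\n');
        for (String seg : segments) {
            sb.append("#EXTINF:").append(segDuration).append(",\n");
            sb.append(seg).append('\n');
        }
        // No #EXT-X-ENDLIST tag: a live playlist keeps sliding forward,
        // dropping old segments and bumping MEDIA-SEQUENCE as new ones arrive.
        return sb.toString();
    }
}
```

The hard part remains cutting the raw MediaRecorder output into valid TS segments; the playlist is just the index clients poll.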

Streaming audio/video from Android

I am writing an Android app to stream audio/video to a Wowza server in RTSP interleaved mode, using AAC and H.264 encoders. I created packetizers for both audio and video. The problem I am facing is that when I send both streams simultaneously, I lose the video stream; I only get audio on Wowza and in VLC. When I do not stream audio, video works just fine. This shows that my packetizers and RTP streaming code perform as expected. It looks as if I cannot send video fast enough to sustain the stream.
Similarly architected code on iOS provides stable video and audio feed.
Thank you
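One detail worth checking in interleaved mode: audio and video share a single TCP connection, and every RTP packet must be wrapped in a 4-byte interleaved frame header (RFC 2326 §10.12). If the audio and video packetizers write to the socket from separate threads, frames can interleave mid-write and the server typically drops one stream. A sketch of the framing (class name hypothetical); all writes for both channels should then go through one writer thread:

```java
import java.nio.ByteBuffer;

// Sketch: RTSP interleaved framing (RFC 2326 §10.12).
// Each RTP packet travels on the RTSP TCP connection as:
//   '$' | channel id | 16-bit big-endian length | packet bytes
public class InterleavedFramer {
    public static byte[] frame(int channel, byte[] rtpPacket) {
        ByteBuffer buf = ByteBuffer.allocate(4 + rtpPacket.length);
        buf.put((byte) '$');
        buf.put((byte) channel);
        buf.putShort((short) rtpPacket.length); // big-endian by default
        buf.put(rtpPacket);
        return buf.array();
    }
}
```

Serializing all frame writes through a single queue-fed thread guarantees a frame is never split by the other stream, which is one plausible cause of the video stream starving while audio continues.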

Android Live microphone sound Streaming out

I would like to stream live microphone audio and run an HTTP server on Android, so that a user can simply open http://xxxx.xxx.xx.xxx/xxx.wav and listen to what I say.
How can I do that?
I would try to develop a small HTTP server which serves an FLV stream.
You can take ipcamera-for-android as an example. That app serves an FLV video stream, but you could reuse the server and the FLV encoder parts.
Since FLV supports PCM streams, you can simply copy the microphone buffer into your stream.
Another variant
You can encode the microphone stream using the built-in MediaRecorder.AudioEncoder.AAC. Afterwards you can simply serve the AAC as a stream to your client.
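For the .wav URL specifically, the catch is that a WAV file begins with a 44-byte header whose size fields cannot be known for a live stream. A sketch of the canonical PCM header layout (class name hypothetical; for a live stream, passing a very large dataLength such as Integer.MAX_VALUE is a common convention and most players keep reading):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: 44-byte canonical WAV (PCM) header, little-endian throughout.
public class WavHeader {
    public static byte[] build(int sampleRate, int channels, int bitsPerSample, int dataLength) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        ByteBuffer buf = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        buf.put("RIFF".getBytes());
        buf.putInt(36 + dataLength);                          // RIFF chunk size
        buf.put("WAVE".getBytes());
        buf.put("fmt ".getBytes());
        buf.putInt(16);                                       // PCM fmt chunk size
        buf.putShort((short) 1);                              // audio format: 1 = PCM
        buf.putShort((short) channels);
        buf.putInt(sampleRate);
        buf.putInt(byteRate);
        buf.putShort((short) (channels * bitsPerSample / 8)); // block align
        buf.putShort((short) bitsPerSample);
        buf.put("data".getBytes());
        buf.putInt(dataLength);                               // data chunk size
        return buf.array();
    }
}
```

The server would write this header once per HTTP connection, then keep copying raw AudioRecord buffers into the response body.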

Android VideoView save RTSP stream

I'm playing videos from a few Cisco cameras on my Android Nexus One using a VideoView. While this works fine, I'm unsure whether it's possible to save the video to a file.
I'm opening a URL like rtsp://192.168.1.22:554/live.sdp
How can I save it to the SD card? Handle it like a file, maybe? Is that possible?
You can implement your own RTSP client (or use a library) which pipes the incoming RTP packets into a file.
If you also want to play the video stream, you can point the media player at the local address of your RTSP server and pipe the same RTP packets to the media player as well.
If you need one, you can find an open source RTSP server/client here
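The piping step amounts to stripping the RTP header from each packet and writing the payload onward. A sketch (class name hypothetical); note that for H.264 the payload still needs NAL-unit depacketization per RFC 6184 before the bytes form a playable elementary stream, so the raw payload dump alone is not yet a valid movie file:

```java
import java.util.Arrays;

// Sketch: strip the RTP header (RFC 3550) from an incoming packet, returning the payload.
public class RtpDepacketizer {
    public static byte[] payload(byte[] packet) {
        int csrcCount = packet[0] & 0x0F;              // number of CSRC identifiers
        boolean hasExtension = (packet[0] & 0x10) != 0;
        int offset = 12 + 4 * csrcCount;               // fixed header + CSRC list
        if (hasExtension) {
            int extWords = ((packet[offset + 2] & 0xFF) << 8) | (packet[offset + 3] & 0xFF);
            offset += 4 + 4 * extWords;                // extension header + body
        }
        return Arrays.copyOfRange(packet, offset, packet.length);
    }
}
```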
