MediaRecorder video RTP/RTSP streams - Android

I am able to record Android's camera using the MediaRecorder class.
I heard that Sipdroid (VideoCamera.java) sends the recorded video as RTP streams.
I tried that and was able to do the same.
But how do I receive the RTP streams and play them on a PC?
I also heard that in Sipdroid, the server (PBXes) side converts these streams into RTSP streams and passes them back to the VideoCamera.java file.
Can anyone help me with converting RTP streams to RTSP streams?

There is no such thing as an RTSP stream: RTSP is a session management protocol (as is SIP) that lets you set up streaming over RTP. Among other things, the port numbers are communicated during session initiation. When RTSP is used, the actual media is still sent over RTP. Read the RFCs (RFC 2326 for RTSP, RFC 3550 for RTP) for more information on either protocol.
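For illustration, the port negotiation mentioned above happens in RTSP's SETUP exchange: the client proposes the UDP ports it will listen on for RTP and RTCP in the Transport header. A minimal sketch of building such a request (the URL and port numbers are made-up examples):

```java
public class RtspSetupExample {
    // Build a minimal RTSP SETUP request (RFC 2326). The client proposes
    // the UDP ports it will listen on for RTP and RTCP via the
    // Transport header; the server replies with its own server_port pair.
    static String buildSetup(String url, int cseq, int rtpPort, int rtcpPort) {
        return "SETUP " + url + " RTSP/1.0\r\n"
             + "CSeq: " + cseq + "\r\n"
             + "Transport: RTP/AVP;unicast;client_port=" + rtpPort + "-" + rtcpPort + "\r\n"
             + "\r\n";
    }

    public static void main(String[] args) {
        System.out.print(buildSetup("rtsp://example.com/stream/track1", 3, 5004, 5005));
    }
}
```

After SETUP and PLAY succeed, the media itself arrives on those RTP/RTCP ports, exactly as the answer says.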

Related

Streaming audio from an Android device to another

How would I go about streaming audio from one device to another over the internet? I'm aware of sending basic data using Java sockets, but am wondering how to:
Start streaming midway through a file (say, during the middle of a song)
Decide what format is needed for the data being sent. MediaPlayer can take a URL as a data source, so how should the audio be represented when being sent from the server side?
Thanks
Having implemented a music streaming app, I can share a little with you.
If you want to stream and use the Android MediaPlayer class, MP3 or OGG is your best bet for a format.
If your architecture is client-server, i.e. a real server on the internet serving streams to Android devices, then just stream MP3 or OGG bytes over HTTP. Just point MediaPlayer at a URL on your server.
If your architecture is peer-to-peer with your own custom socket code, you can create a "proxy HTTP" server that listens on localhost on a dedicated thread. You point your MediaPlayer instance at your local in-process socket server (e.g. http://localhost:54321/MyStream.mp3). Then you have to implement code to parse the HTTP GET request from MediaPlayer, and proxy the stream bytes between your custom P2P socket protocol and listeners connected to your local HTTP server. A lot of radio streaming apps do exactly this in order to parse the ICECAST metadata from the MP3 stream. Here's the code I use for my radio streaming app that does this.
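The localhost-proxy trick described above can be sketched with plain java.net sockets. This is a minimal, assumption-laden illustration, not the author's actual code: the "P2P source" is stood in for by an in-memory byte array, and the class and path names are made up.

```java
import java.io.*;
import java.net.*;

// Minimal localhost "proxy" server: accepts one HTTP GET (e.g. from
// MediaPlayer), sends response headers, then streams bytes from a source.
public class LocalHttpProxy {
    private final byte[] source;        // stands in for the real P2P stream
    private final ServerSocket server;

    public LocalHttpProxy(byte[] source) throws IOException {
        this.source = source;
        // Bind to an ephemeral port on loopback only.
        this.server = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
    }

    public int port() { return server.getLocalPort(); }

    // Serve a single request, then close. A real proxy would loop and
    // pump bytes from the P2P socket instead of a fixed array.
    public void serveOnce() throws IOException {
        try (Socket client = server.accept()) {
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()));
            String line;
            while ((line = in.readLine()) != null && !line.isEmpty()) {
                // Consume the GET request line and headers.
            }
            OutputStream out = client.getOutputStream();
            out.write(("HTTP/1.0 200 OK\r\n"
                     + "Content-Type: audio/mpeg\r\n"
                     + "Content-Length: " + source.length + "\r\n"
                     + "\r\n").getBytes("US-ASCII"));
            out.write(source);
            out.flush();
        }
        server.close();
    }
}
```

MediaPlayer would then be pointed at http://127.0.0.1:&lt;port&gt;/MyStream.mp3 while serveOnce() runs on a background thread.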
For the "start midway through the file" scenario, you might find my MP3 Stream Reader class useful. It wraps an InputStream (file, socket stream, etc.) and syncs to the next valid frame from wherever you started. Just call read_next_chunk to get the next block of audio and its format. MediaPlayer might do most of this heavy lifting for you, so this might not be needed.
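As a rough illustration of the "sync to the next valid frame" idea (not the author's actual class): an MPEG audio frame header begins with an 11-bit sync word of all ones, so a reader can scan for a 0xFF byte followed by a byte whose top three bits are set:

```java
import java.io.*;

// Scan an InputStream for the next MPEG audio frame sync word
// (11 set bits: a 0xFF byte followed by a byte with the top 3 bits set).
// Returns the number of bytes skipped before the sync, or -1 at EOF.
public class Mp3Sync {
    public static int skipToNextFrame(InputStream in) throws IOException {
        int skipped = 0;
        int prev = in.read();
        int cur;
        while ((cur = in.read()) != -1) {
            if (prev == 0xFF && (cur & 0xE0) == 0xE0) {
                return skipped; // prev/cur are the first two header bytes
            }
            prev = cur;
            skipped++;
        }
        return -1;
    }
}
```

A production reader would go further and validate the bitrate, sample-rate, and layer fields of the candidate header (0xFF bytes also occur inside audio data), but the scanning idea is the same.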

Measure Stream Latency in RTSP Stream

I have an RTSP server to stream data and an Android client. On the client I have used Android's VideoView widget to play the stream. I just gave the server's URL to VideoView and started it, so I have not written any code to stream and decode the data. I want to measure the stream latency in this case.
Is there some way to measure the latency here?

Play music with RTP Android

Is it possible to send a music data stream over RTP from an Android mobile?
Here is an example for RTP:
http://androidsourcecode.blogspot.in/2013/10/android-rtp-sample-receiving-via-vlc.html

android RTP send and receive program

I am new to Android programming and need some pointers on Android RTP programming. Questions:
How to capture the microphone audio data on an Android device?
How to construct an RTP packet from the captured microphone audio data without using an API?
How to transmit an RTP packet to another Android device?
How to play a received RTP packet on Android?
For transmitting and receiving RTP packets, I would suggest looking into the jlibrtp library. Basically you initialize it with two DatagramSockets (one for sending RTP data and one for receiving RTCP data), define a payload type, add a recipient, and send byte arrays. I believe it handles the RTP timestamps by itself, but you have to make sure your payload is already formatted per the RFC recommendations.
Here is an example of how you would set up an RTP session
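For the second question above (constructing the packet without a library), the fixed 12-byte RTP header from RFC 3550 can be packed by hand. A minimal sketch, where the payload type, sequence number, timestamp, and SSRC are placeholder values you would manage per-stream:

```java
import java.nio.ByteBuffer;

// Pack a minimal RTP packet (RFC 3550, fixed 12-byte header, no CSRC
// list or extensions) around an audio payload.
public class RtpPacket {
    public static byte[] build(int payloadType, int seq, long timestamp,
                               long ssrc, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F)); // M=0, 7-bit payload type
        buf.putShort((short) seq);            // sequence number (wraps)
        buf.putInt((int) timestamp);          // RTP media timestamp
        buf.putInt((int) ssrc);               // synchronization source ID
        buf.put(payload);                     // e.g. one frame of audio
        return buf.array();
    }
}
```

The resulting byte array can then be sent in a DatagramPacket over UDP; the sequence number increments per packet and the timestamp advances by the number of samples per packet.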
Answers to your questions:
1. Use the Android media APIs: AudioRecord for recording the voice data and AudioTrack for playing it back, both in PCM format.
2. Go through this link.
3. You have to use SIP for transmitting packets.
4. Go through this link.

How to develop a video player that receives video stream on wi-fi broadcasting/multicasting?

General video players connect to the media server through unicast,
but I need a player that receives the media stream using multicast/broadcast.
Scenario:
Media Server ---> AP --(multicast/broadcast video stream)--> player (Android phone)
Is there any Android SDK support for this?
Or is there any solution that does not require developing a software codec and an RTP stack?
James.
Here is a post about Android and multi-cast support: How to receive Multicast packets on Android
The question about a multicast video streaming protocol is a separate issue. There should be nothing Android-specific required (assuming that receiving multicast data is all you need from Android).
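On the receiving side, the standard java.net multicast classes are available on Android too (on most devices you must also hold a WifiManager.MulticastLock, or the Wi-Fi chip may filter multicast frames). A minimal sketch; the group address and port below are arbitrary examples:

```java
import java.net.*;
import java.util.Arrays;

// Join a multicast group and receive one UDP datagram.
// On Android, additionally acquire a WifiManager.MulticastLock first.
public class MulticastReceiver {
    public static byte[] receiveOne(String group, int port, int timeoutMs)
            throws Exception {
        try (MulticastSocket sock = new MulticastSocket(port)) {
            sock.joinGroup(InetAddress.getByName(group));
            sock.setSoTimeout(timeoutMs);
            byte[] buf = new byte[1500];   // typical MTU-sized buffer
            DatagramPacket pkt = new DatagramPacket(buf, buf.length);
            sock.receive(pkt);
            return Arrays.copyOf(pkt.getData(), pkt.getLength());
        }
    }
}
```

Each received datagram would then be an RTP packet to hand to your depacketizer/decoder; this sketch only covers getting the bytes off the network.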
Getting the new codec to show up as a video-playing app in Android is a separate issue. See this question:
How to add a new video codec to Android?
