I have an RTSP server to stream data and an Android client. In the client I have used Android's "VideoView" widget to play the stream: I just gave the server's URL to the VideoView and started it, so I have not written any code to stream or decode the data myself. I want to measure the stream latency in this case.
Is there some way to measure the latency in this case?
I have to build a live-streaming app where I get the input video from an external device (glasses with a camera) and broadcast that stream to a remote server. The input is a stream of bitmaps, which is also displayed in a view in my MainActivity. My idea was to capture the screen, or part of it, and then upload that portion.
How can I do that?
I am working on something that requires raw audio data to be sent to the speaker in real time on a device running Android.
Example
I have a formula that generates the waveform. How do I send that data to the speaker for playback in Android?
The simplest way to write raw PCM data is via the AudioTrack class: https://developer.android.com/reference/android/media/AudioTrack
You can operate it in streaming mode if needed.
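For instance, here is a minimal sketch (not the asker's actual code) that evaluates a sine-wave formula and writes the samples to an AudioTrack in streaming mode; the sample rate, buffer size, and frequency are illustrative choices:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class TonePlayer {
        // Generate a sine wave from a formula and push it to the speaker.
        public void playSine(double frequencyHz, int seconds) {
            int sampleRate = 44100;
            int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    minBuf, AudioTrack.MODE_STREAM);
            track.play();

            short[] buffer = new short[sampleRate / 10];          // ~100 ms of samples
            long sample = 0;
            long total = (long) sampleRate * seconds;
            while (sample < total) {
                for (int i = 0; i < buffer.length; i++, sample++) {
                    double t = sample / (double) sampleRate;
                    buffer[i] = (short) (Math.sin(2 * Math.PI * frequencyHz * t) * Short.MAX_VALUE);
                }
                track.write(buffer, 0, buffer.length);            // blocks until the data is queued
            }
            track.stop();
            track.release();
        }
    }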
How would I go about streaming audio from one device to another over the internet? I'm aware of sending basic data using Java sockets, but wondering how to:
Start streaming midway through a file (say, during the middle of a song)
What format is needed for the data being sent? MediaPlayer can take a URL as a data source, so how should the audio be represented when being sent from the server side?
Thanks
Having implemented a music streaming app, I can share a little with you.
If you want to stream and use the Android MediaPlayer class, MP3 or OGG is your best bet for a format.
If your architecture is client-server, i.e. a real server on the Internet serving streams to Android devices, then just stream MP3 or OGG bytes over HTTP and point MediaPlayer at a URL on your server.
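As a rough sketch (the URL is a placeholder, and real code would also handle errors and release the player):

    import android.media.MediaPlayer;
    import java.io.IOException;

    public class Mp3StreamExample {
        // Point MediaPlayer at an MP3 served over plain HTTP and start once buffered.
        void playRemoteMp3() {
            MediaPlayer player = new MediaPlayer();
            try {
                player.setDataSource("http://example.com/streams/MyStream.mp3"); // placeholder URL
                player.setOnPreparedListener(MediaPlayer::start);
                player.prepareAsync();   // buffer in the background, start in the callback
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }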
If your architecture is peer-to-peer with your own custom socket code, you can create a "proxy HTTP" server that listens on localhost on a dedicated thread. You point your MediaPlayer instance at your local in-process socket server (e.g. http://localhost:54321/MyStream.mp3). Then you have to implement code to parse the HTTP GET request from MediaPlayer and proxy the stream bytes between your custom P2P socket protocol and the listeners connected to your local HTTP server. A lot of radio streaming apps do exactly this so as to parse the ICECAST metadata from the MP3 stream. Here's the code I use for my radio streaming app that does this.
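A stripped-down sketch of the idea (not the answerer's actual code: openP2pStream() is a hypothetical stand-in for your P2P source, and real code would parse the GET request and handle errors properly):

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Local "proxy http" server: MediaPlayer connects to
    // http://localhost:54321/MyStream.mp3 and is fed bytes from our own source.
    class LocalHttpProxy extends Thread {
        @Override
        public void run() {
            try (ServerSocket server = new ServerSocket(54321)) {
                while (true) {
                    Socket client = server.accept();              // MediaPlayer connects here
                    InputStream request = client.getInputStream();
                    // Real code: read and parse the GET line and headers from `request`.
                    OutputStream out = client.getOutputStream();
                    out.write(("HTTP/1.1 200 OK\r\n"
                             + "Content-Type: audio/mpeg\r\n"
                             + "\r\n").getBytes("UTF-8"));
                    byte[] buf = new byte[4096];
                    InputStream mp3Source = openP2pStream();      // hypothetical P2P byte source
                    int n;
                    while ((n = mp3Source.read(buf)) != -1) {
                        out.write(buf, 0, n);                     // proxy MP3 bytes to MediaPlayer
                    }
                    client.close();
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        private InputStream openP2pStream() {
            // Placeholder for whatever custom P2P protocol supplies the MP3 bytes.
            throw new UnsupportedOperationException("wire up your P2P source here");
        }
    }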
For the "start midway through the file" scenario, you might find my MP3 Stream Reader class useful. It wraps an InputStream (file, socket stream, etc..) and syncs to the next valid frame from where ever you started from. Just call read_next_chunk to get the next block of audio and its format. MediaPlayer might do most of this heavy lifting for you, so this might not be needed.
I'm working on a project where I stream video over Wi-Fi Direct from one Android device to another using RTP over UDP.
Essentially, the first Android device hosts an RTSP server and listens for connections. Once the client Android device connects to it (over Wi-Fi Direct) and starts listening for packets, the first device begins to stream the video content.
I'm aware that RTP packet headers carry a 32-bit timestamp at bit offset 32-64 (bytes 4-7), but I do not know how to access the contents of a packet and read just that part of its header.
Currently, I am using libvlc to play the streamed video on the device, but I would like to measure the latency between the two devices, either by extracting the timestamps from the packets on arrival or in some other way (maybe VLC can help?).
Edit: I am going to try to learn from the code posted here, but still await any replies.
Edit2: So I'm trying to go simpler. I've instead made a client activity that connects to the remote host and reads packets directly using Android's DatagramSocket API. However, my server activity doesn't start serving even after the client activity says it is connected. I'm not sure what needs to be done to let the server know there is a client ready to be served; the MediaPlayer and VLC APIs were both able to start streaming video once they connected. What am I missing? Do I need to do more than DatagramSocket.connect(ipaddress, port)?
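For the timestamp-extraction part, a minimal sketch along these lines (port and buffer size are placeholders) would read raw UDP packets and pull the 32-bit RTP timestamp out of bytes 4-7 of each header. Note that DatagramSocket.connect() on UDP performs no handshake the sender can see; as the answer at the end of this page points out, port numbers are negotiated during RTSP session setup, which is presumably what MediaPlayer and VLC were doing for you.

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    public class RtpTimestampReader {
        // Receive RTP-over-UDP packets and print the 32-bit timestamp,
        // which sits at bytes 4-7 of the RTP header, big-endian.
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(5004);    // local port the sender targets (placeholder)
            byte[] buf = new byte[2048];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            while (true) {
                socket.receive(packet);                          // blocks until a packet arrives
                byte[] d = packet.getData();
                long rtpTimestamp = ((d[4] & 0xFFL) << 24)
                                  | ((d[5] & 0xFFL) << 16)
                                  | ((d[6] & 0xFFL) << 8)
                                  |  (d[7] & 0xFFL);
                System.out.println("RTP timestamp: " + rtpTimestamp);
            }
        }
    }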
I am able to record from Android's camera using the MediaRecorder class.
I heard that Sipdroid (videocamera.java) sends the recorded video as RTP streams.
I tried that and was able to do the same.
But how do I receive the RTP streams and play them on a PC?
I also heard that in Sipdroid, on the server (PBXes) side, these streams are converted into RTSP streams and passed back to the videocamera.java file.
Can anyone help me with converting RTP streams to RTSP streams?
There is no such thing as an RTSP stream: RTSP is a session-management protocol (as is SIP) that lets you set up streaming over RTP. Amongst other things, the port numbers are communicated during session initiation. When RTSP is used, the actual media is still sent using RTP. Read the RFCs for more information on either protocol.
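For illustration, a typical RTSP SETUP exchange looks roughly like this (the URL, ports, and session ID are made up); it only negotiates where the RTP packets will go, and the media itself then flows over RTP on those ports:

    SETUP rtsp://server.example.com/stream/trackID=0 RTSP/1.0
    CSeq: 3
    Transport: RTP/AVP;unicast;client_port=5004-5005

    RTSP/1.0 200 OK
    CSeq: 3
    Session: 12345678
    Transport: RTP/AVP;unicast;client_port=5004-5005;server_port=6256-6257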