Android: how to stream video to a remote server

I have to build a live-stream app, where I get the input video from an external device (glasses with a camera) and broadcast that stream to a remote server. The input consists of a stream of bitmaps, and this stream is also displayed in a view in my MainActivity. My idea was to capture the screen, or part of it, and then upload that portion.
How can I do that?
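One simple way to ship a stream of compressed frames to a server is a length-prefixed wire format over a plain socket. The sketch below is my own illustration, not a standard protocol: each frame is written as a 4-byte big-endian length followed by the frame bytes (on Android, `jpegBytes` might come from `bitmap.compress(Bitmap.CompressFormat.JPEG, quality, baos)`).

```java
import java.io.*;

// One possible wire format (illustrative, not a standard): each
// compressed frame is sent as a 4-byte big-endian length prefix
// followed by the frame payload.
public class FrameStreamer {
    public static void writeFrame(DataOutputStream out, byte[] jpegBytes) throws IOException {
        out.writeInt(jpegBytes.length); // length prefix
        out.write(jpegBytes);           // frame payload
        out.flush();
    }

    public static byte[] readFrame(DataInputStream in) throws IOException {
        int len = in.readInt();
        byte[] frame = new byte[len];
        in.readFully(frame);            // block until the whole frame arrives
        return frame;
    }
}
```

Note that sending individual JPEGs is easy but bandwidth-hungry; for real live video an encoder such as MediaCodec (H.264) plus RTP is the more usual route.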

Related

Audio call's ICE fails after video call connects in the same Activity on Android, using WebRTC and JanusGateway

I am receiving a video stream (including audio) and a separate audio stream, both from the same server.
I started with the video stream and connected the call on one screen - working fine.
Then I worked with the audio stream to connect a call on another screen - also working fine.
Now I am trying to merge them onto one screen, adding the video call and the audio call to the same Activity.
The video call works, but the audio call tries to connect, ICE fails, and the audio call hangs up.
Separately, both the video call and the audio call work. But when I add them to one screen, the audio call, which connects after the video call, fails.
I am using two separate signalling processes (SDP, ICE, etc.) to connect to the two streams from the same server as peers.
Where am I going wrong?

How do I send the results of a formula to the audio stream in Android?

I am working on something that requires raw data to be sent to the speaker in real time on a device running Android.
For example, I have a formula that generates a waveform. How do I send that data to the speaker for playback in Android?
The simplest way to write raw PCM data is via the AudioTrack class: https://developer.android.com/reference/android/media/AudioTrack
You can operate it in a streaming mode, if needed.
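For instance, here is a sketch (class and method names are my own) of turning a formula into a 16-bit PCM buffer; on Android, the resulting samples would be written to an AudioTrack created in `MODE_STREAM`:

```java
// Generates 16-bit PCM samples for a sine wave. On Android, these
// samples would be pushed to a streaming AudioTrack, e.g.:
//   audioTrack.write(buffer, 0, buffer.length);
public class ToneGenerator {
    public static short[] sineWave(double freqHz, int sampleRate, int numSamples) {
        short[] buffer = new short[numSamples];
        for (int i = 0; i < numSamples; i++) {
            double t = (double) i / sampleRate;
            // Scale the [-1, 1] sine output to the signed 16-bit PCM range.
            buffer[i] = (short) (Math.sin(2 * Math.PI * freqHz * t) * Short.MAX_VALUE);
        }
        return buffer;
    }
}
```

In streaming mode you would generate and `write()` successive buffers in a loop on a background thread, so the formula can keep producing audio indefinitely.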

Streaming audio from an Android device to another

How would I go about streaming audio from one device to another over the internet? I'm aware of sending basic data using Java sockets, but am wondering how to:
Start streaming midway through a file (say, during the middle of a song)
What format is needed for the data being sent? MediaPlayer can take a URL as a data source, so how should the audio be represented when being sent from the server side?
Thanks
Having implemented a music streaming app, I can share a little with you.
If you want to stream and use the Android MediaPlayer class, MP3 or OGG is your best bet for a format.
If your architecture is client-server, i.e. a real server on the Internet serving streams to Android devices, then just stream MP3 or OGG bytes over HTTP. Just point MediaPlayer to a URL on your server.
If your architecture is peer-to-peer with your own custom socket code, you can create a "proxy HTTP" server that listens on localhost on a dedicated thread. You point your MediaPlayer instance to your local in-process socket server (e.g. http://localhost:54321/MyStream.mp3). Then you have to implement code to parse the HTTP GET request from MediaPlayer, then proxy the stream bytes between your custom P2P socket protocol and the listeners connected to your local HTTP server. A lot of radio streaming apps do exactly this so as to parse the ICECAST metadata from the MP3 stream. Here's the code I use for my radio streaming app that does this.
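The local-proxy idea can be sketched roughly like this (class and method names are my own; a real implementation would handle multiple clients and proper HTTP parsing):

```java
import java.io.*;
import java.net.*;

// Minimal local "HTTP proxy" sketch: listens on localhost and serves
// bytes from an arbitrary InputStream as an HTTP response, so that
// MediaPlayer can be pointed at http://127.0.0.1:<port>/stream.
public class LocalStreamProxy extends Thread {
    private final ServerSocket server;
    private final InputStream source; // e.g. your P2P socket stream

    public LocalStreamProxy(InputStream source) throws IOException {
        this.server = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
        this.source = source;
    }

    public int getPort() { return server.getLocalPort(); }

    @Override
    public void run() {
        try (Socket client = server.accept()) {
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream()));
            // Consume the GET request line and headers sent by the player.
            String line;
            while ((line = in.readLine()) != null && !line.isEmpty()) { }
            OutputStream out = client.getOutputStream();
            out.write(("HTTP/1.1 200 OK\r\n"
                    + "Content-Type: audio/mpeg\r\n"
                    + "Connection: close\r\n\r\n").getBytes("US-ASCII"));
            // Proxy the stream bytes through to the connected client.
            byte[] buf = new byte[4096];
            int n;
            while ((n = source.read(buf)) != -1) out.write(buf, 0, n);
            out.flush();
        } catch (IOException ignored) {
        } finally {
            try { server.close(); } catch (IOException ignored) { }
        }
    }
}
```

MediaPlayer would then be given `"http://127.0.0.1:" + proxy.getPort() + "/stream"` as its data source.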
For the "start midway through the file" scenario, you might find my MP3 Stream Reader class useful. It wraps an InputStream (file, socket stream, etc.) and syncs to the next valid frame from wherever you started. Just call read_next_chunk to get the next block of audio and its format. MediaPlayer might do most of this heavy lifting for you, so this might not be needed.
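The frame-sync idea behind such a reader can be sketched as follows (my own simplified version, not the answerer's class): scan the buffer for the 11-bit MPEG frame sync, i.e. a `0xFF` byte followed by a byte whose top three bits are set. A real reader would also validate the version/layer/bitrate fields and confirm that another frame follows at the expected offset.

```java
// Scans a byte buffer for a candidate MP3 frame header. The MPEG
// frame sync is 11 set bits: 0xFF followed by a byte with its top
// three bits set (0xE0 mask).
public class Mp3Sync {
    public static int findFrameSync(byte[] data, int from) {
        for (int i = from; i + 1 < data.length; i++) {
            if ((data[i] & 0xFF) == 0xFF && (data[i + 1] & 0xE0) == 0xE0) {
                return i; // candidate frame header offset
            }
        }
        return -1; // no sync word found in this buffer
    }
}
```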

Measure Stream Latency in RTSP Stream

I have an RTSP server to stream data and an Android client. In the client I have used Android's "VideoView" widget to play the stream. I just gave VideoView the URL of the server and started it, so I have not written any code to stream and decode the data myself. I want to measure the stream latency in this case.
Is there some way to measure the latency in this setup?

MediaRecorder video RTP/RTSP streams

I am able to record Android's camera using the MediaRecorder class.
I heard that Sipdroid (videocamera.java) sends the recorded video as RTP streams.
I tried that and was able to do the same.
But how do I receive the RTP streams and play them on a PC?
I also heard that in Sipdroid, on the server (PBXes) side, these streams are converted into RTSP streams and passed back to the videocamera.java file.
Can anyone help me with converting RTP streams to RTSP streams?
There is no such thing as an RTSP stream: RTSP is a session management protocol (as is SIP) and lets you set up streaming using RTP. Amongst other things, the port numbers are communicated during session initiation. When RTSP is used, the actual media is still sent using RTP. Read the RFCs for more info on either protocol.
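To make the split concrete, here is an illustrative sketch of building the text of an RTSP SETUP request (per RFC 2326; the URL and port values are made up). The Transport header is where the client proposes its RTP/RTCP ports; after SETUP and PLAY succeed, the actual media arrives over RTP on those ports, not over the RTSP connection.

```java
// Builds the text of a minimal RTSP SETUP request (RFC 2326).
// RTSP itself is plain text over TCP; the media travels separately
// over RTP/UDP on the ports negotiated in the Transport header.
public class RtspRequest {
    public static String setup(String url, int cseq, int rtpPort) {
        return "SETUP " + url + " RTSP/1.0\r\n"
                + "CSeq: " + cseq + "\r\n"
                + "Transport: RTP/AVP;unicast;client_port="
                + rtpPort + "-" + (rtpPort + 1) + "\r\n\r\n";
    }
}
```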
