Initiating an RTSP connection over cellular with Android

The majority of SIM accounts are public dynamic, and most if not all cellular providers do not allow incoming connections to public dynamic IP addresses (on 3G at least, perhaps not on 4G/LTE). So the issue with connecting is not dynamic IPs as such, but blocked incoming ports.
Given that, if I wanted to stream video from an Android phone on demand (based on information gleaned from this conversation: Streaming video from Android camera to server), what would be the chain of events to properly initiate a connection?
My idea of this (roughly):
The app on the Android phone initiates, and keeps open, some sort of connection to a media server (Wowza or similar).
At some point, when the server wants video from the phone, it uses that open connection to request a video stream.
The Android phone pushes an RTSP stream to the server.
Is this correct, and if so, what type of connection should I use as the permanent control connection? Also, is it possible to push RTSP, or would I have to do something else?
Thanks!

I know this is an old question, but if anybody else is searching for something similar, the following is now available:
http://developer.android.com/guide/google/gcm/index.html
This essentially allows a message to be sent from a server to an app on an Android device (it replaces C2DM, which did a similar thing).
Update
Google GCM has now in turn been replaced by Firebase Cloud Messaging:
https://firebase.google.com/docs/cloud-messaging/
Using a cloud-based app messaging service like this, the steps would be (a sketch of the message-handling code follows the list):
Add a message subscription service to your app (e.g. Firebase)
The app registers with the cloud messaging service when it starts up
When the server wants video from the phone (as noted in the question above), the server sends a message to the app
The app opens a connection to the streaming server and starts to stream video to it.
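To make that concrete, here is a minimal sketch of the message-handling side on the phone, assuming Firebase Cloud Messaging, a hypothetical data-message format (the "command" and "target" keys), and a placeholder startStreamingTo() where your actual streaming client (e.g. a Wowza publisher) would go:

```kotlin
import com.google.firebase.messaging.FirebaseMessagingService
import com.google.firebase.messaging.RemoteMessage

// Sketch only: reacts to a hypothetical "start_stream" data message from
// your server by opening the outgoing stream from the last step above.
class StreamRequestService : FirebaseMessagingService() {

    override fun onMessageReceived(remoteMessage: RemoteMessage) {
        if (remoteMessage.data["command"] == "start_stream") {
            // The server wants video: open an *outgoing* connection to the
            // streaming server and start publishing to it.
            val target = remoteMessage.data["target"] ?: return
            startStreamingTo(target)
        }
    }

    override fun onNewToken(token: String) {
        // Send the fresh registration token to your server so it can
        // address this device later (the registration step above).
        registerTokenWithServer(token)
    }

    private fun startStreamingTo(url: String) { /* your streaming client */ }
    private fun registerTokenWithServer(token: String) { /* app-specific */ }
}
```

The service also has to be declared in the app manifest with the com.google.firebase.MESSAGING_EVENT intent filter.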
Note: there is a comment below about how this approach does not allow an incoming connection from the server to the Android phone.
An incoming connection, in fact, is not how streaming from a phone typically works anyway. The phone makes an 'outgoing' connection to a streaming server and streams the video to it; other devices wanting to see the video then stream it from there.
There are several reasons why this is the preferred approach. A key one is that supporting a quality streaming service that plays back on most common devices, browsers, OSs, etc. requires transcoding the video into multiple bit rates (and in some cases multiple encodings), then packaging and serving it in the appropriate streaming format. Doing all this on the mobile device would be very compute- and storage-intensive.

Related

Server software for handling multiple mp3/ogg streams

I'm looking for advice regarding an aspect of a project I'm working on.
I'm developing a demo Android app for a not-for-profit that specialises in services for the vision impaired. The plan is that, among other things, the app will enable users to stream the organisation's specialised audiobooks.
For the sake of demoing/development, I need to set up some sort of server which will, on request from a device running the app:
directly transfer certain XML/HTML index files to the Android phone (no streaming necessary)
stream .ogg and .mp3 audio files to the device
serve more than one client device at a time
start a stream from a specific point within an MP3/Ogg file, on request from the phone app
I've had a look at Icecast as an MP3/Ogg streaming solution, but my knowledge of servers is a bit limited (I've only ever done some basic work in Flask). Would I need to run it in tandem with something that can serve files and handle requests generically?
I'm basically just looking for a good solution/tool to implement this. The server side doesn't need to be completely fleshed out, just to fit the bill above, as my focus is developing the phone-app side for now. For the sake of a demo, something straightforward and well documented would suit best.
You don't need a special server for this. Any HTTP server that supports range requests will be fine; this includes Nginx, Apache, etc. There is no need to spoon-feed clients media data from the server: the streaming and buffering aspect is handled automatically by the client, through the TCP window size and by outright closing the connection and reconnecting later as needed. A sketch of the range-request mechanics follows below.
Icecast is meant for radio-style streams where everyone listening hears the same thing at roughly the same time. Since that's not an aspect you want, stick to any normal HTTP server.
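As a quick sketch of those range-request mechanics (the URL is a placeholder, and in practice MediaPlayer/ExoPlayer issue requests like this for you when the user seeks):

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Ask any ordinary HTTP server (Nginx, Apache, ...) for an audio file
// starting at a byte offset, which is all "start a stream from a specific
// point" requires at the HTTP level.
fun openFromOffset(fileUrl: String, offset: Long): HttpURLConnection {
    val conn = URL(fileUrl).openConnection() as HttpURLConnection
    conn.setRequestProperty("Range", "bytes=$offset-")  // start mid-file
    // 206 Partial Content confirms the server honoured the Range header.
    check(conn.responseCode == HttpURLConnection.HTTP_PARTIAL) {
        "Server ignored the Range header"
    }
    return conn
}

// Usage, e.g. resume an audiobook chapter from the 1 MiB mark:
// openFromOffset("https://media.example.org/book1/ch01.mp3", 1_048_576)
//     .inputStream.use { /* feed to a decoder */ }
```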

Android live video streaming from one device to another

I am building an Android application that will stream video calls from one device to another Android device. For that I am using the Wowza video streaming API (Media Engine), and with it I am able to stream video from the Android app to the web; but is device-to-device video streaming possible?
If you are planning to develop all the infrastructure yourself, then these are the points to evaluate and conclude on.
What Technology is used
WebRTC is the technology used to support video calling. WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via APIs (check out the WebRTC details here). It was introduced by Google in 2011 and allows real-time communication between two browsers or mobiles.
Concepts involved
1. Data streams and Hardware
WebRTC helps with setting up and identifying hardware (microphone, camera, and speakers), and with identifying the network, via a STUN server (see: What is a STUN server). On mobiles this hardware comes built in.
2. Audio Video CODECS
Google has made the audio/video codecs required for these features open source. Generally this means G.711 audio for phones (it still varies in specific cases), and VP8 or VP9 for video.
3. Peer Discovery
To make a call, each party's address is required, but on the internet most IPs are dynamic. To solve this, a server needs to keep track of who is online; this can be done using XMPP, SIP, or a custom protocol. So for anyone to receive a call, the caller has to check with the server, or the other way around.
4. STUN Server
Once signalling (peer discovery) is done, a STUN server is required. This server helps each device determine its external IP address, as well as whether two or more devices can talk to each other directly.
5. TURN Server
If a peer-to-peer session is not possible, then a TURN server is required. The TURN server basically shifts the bits for you through open holes in the firewall between the two clients. This is needed because of asymmetric firewalls and the possibility of punching holes on different ports in firewalls. (A configuration sketch follows below.)
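For illustration, here is how points 4 and 5 typically surface in code if you build on the libwebrtc Android library (org.webrtc); the server URLs and credentials are placeholders, and the signalling from point 3 still has to be implemented separately:

```kotlin
import org.webrtc.PeerConnection

// Sketch only: ICE server configuration for a PeerConnection.
fun buildRtcConfig(): PeerConnection.RTCConfiguration {
    val iceServers = listOf(
        // STUN (point 4): lets each peer discover its public IP and port.
        PeerConnection.IceServer.builder("stun:stun.example.com:3478")
            .createIceServer(),
        // TURN (point 5): relays the media when no direct path exists.
        PeerConnection.IceServer.builder("turn:turn.example.com:3478")
            .setUsername("demo-user")       // placeholder credentials
            .setPassword("demo-secret")
            .createIceServer()
    )
    return PeerConnection.RTCConfiguration(iceServers)
}
```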
Alternatively, you can use a provider like Sinch, which already handles and configures these basic requirements, so that you only need to concentrate on the mobile front end.
Check out SINCH ANDROID SAMPLE as well

How does a SIP based app scheme work?

This is an abstract question on how the SIP protocol works. Let us say I have a SIP server (Asterisk/Yate) and two Android devices that wish to connect to each other for an audio call. (I am looking for a purely VoIP call; no need for telephone numbers or carrier information.)
How would this work? Do the packets have to pass through the server, or does the connection happen between the endpoints? If the packets have to pass through the server, does the SIP server also provide profiles, or do profiles have to be created by a third party?
I need to understand how the scheme works in order to start planning and building the system.
I have read lots of technical documentation, but none of it shows an abstraction of the system. If you can provide me with resources, that would be great too.
Thanks
Since your devices do not know where each other is located (IP/port), they contact a SIP server or proxy.
The SIP server matches a dialplan and sends the request on, either modified (server) or unmodified (proxy), to the other side.
In the INVITE request, each peer sends its address/port and information about its RTP media stream.
If that information passes through unchanged (proxy), the peers can see each other's RTP details and send RTP packets directly.
It is also possible to send another INVITE after the call is bridged, called a re-INVITE, with information about a new RTP stream (this can be audio on a different IP/port, or video).
There is nothing called a 'profile' in the SIP standard, sorry; that is a client-side notion (see the sketch below).
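For completeness: while SIP itself has no profiles, client APIs do use the term for a locally stored account. Here is a sketch using Android's built-in android.net.sip API, with hypothetical account details; note this API is deprecated on recent Android versions and not present on every device:

```kotlin
import android.content.Context
import android.net.sip.SipAudioCall
import android.net.sip.SipManager
import android.net.sip.SipProfile

// Sketch only: the "profile" lives on the device; the SIP server just sees
// an ordinary REGISTER followed by an INVITE.
fun callOverSip(context: Context) {
    val manager = SipManager.newInstance(context) ?: return  // null = no SIP support
    val profile = SipProfile.Builder("alice", "sip.example.com")  // placeholder account
        .setPassword("secret")
        .build()
    manager.open(profile)  // registers with the SIP server

    val listener = object : SipAudioCall.Listener() {
        override fun onCallEstablished(call: SipAudioCall) {
            call.startAudio()  // RTP now flows, directly or via the server
        }
    }
    // Sends the INVITE; 30-second timeout waiting for the call to connect.
    manager.makeAudioCall(profile.uriString, "sip:bob@sip.example.com", listener, 30)
}
```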
Anyway, it seems a bad idea to start planning a VoIP system if you have limited real experience with SIP servers.
There are a lot of articles (including Wikipedia) and videos on YouTube on the topic of "how SIP works"; there is no way to put all of that here in one answer.

How to synchronize the displays of multiple Android devices?

I was thinking of using multiple Android devices (e.g. Nexus 7 tablets) to build a photo/video wall, and I'm wondering a) whether it is possible and b) how to synchronize the display across all these devices. Google showed off its Chrome Racer experiment, so clearly it is possible to synchronize displays across many devices.
So here are my questions:
what technology should I use to synchronize the displays? Android? Chrome? Please point me to existing code if possible.
what's the minimum lag between devices that could be achieved in such a setup?
can video and sound playback also be started simultaneously on multiple devices (think video wall)?
what kind of architecture should be considered for such a project? Centralized server that sends out commands? Should devices talk to each other?
I'm very curious about suggestions!
EDIT:
blinkendroid is the only app I've found so far that might do the job. Pros? Cons? Alternatives?
Technically the screen isn't shared; the game's state is shared, and the phones all render that state as they understand it.
Just a bit of background about Chrome Racer. We have a case study on it here, but it doesn't fully cover the question you are asking.
The primary technology used for communication in Racer is WebSockets. WebSockets allow a client to push messages to, and receive messages from, a server in near real time.
Racer starts a game session by giving it a unique ID and holding open a WebSocket to the user. Anyone who subsequently joins the game is told to use the same ID, and the server creates a WebSocket to them as well; now the server knows all the participants.
When a game starts, a message is broadcast to all the participants asking them to get ready to start. During this phase the server works out how long it takes to round-trip messages to each of the clients, so that it can estimate the latency between devices and attempt to compensate for it on the slower clients.
Now that the server knows about the clients, the game can start properly. As the users play, their commands are pushed to the server over the WebSocket; the server then relays each message out to all the connected clients (like a satellite does), and it does the same thing for every user connected to the session. This is how the game's state is shared.
As each client receives the commands broadcast to it from the server, it updates its internal representation of the game and renders that to the screen.
And that is about it. (A minimal sketch of the client side of this pattern follows below.)
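For anyone wanting to experiment with the same pattern on Android, here is a minimal client-side sketch using OkHttp's WebSocket support; the endpoint and message format are made up for illustration, and applyRemoteCommand() stands in for your own state/render code:

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.Response
import okhttp3.WebSocket
import okhttp3.WebSocketListener

// Sketch only: join a shared session by ID. Every message this client sends
// is relayed by the server to all other participants, and vice versa.
fun joinSession(sessionId: String): WebSocket {
    val request = Request.Builder()
        .url("wss://sync.example.com/session/$sessionId")  // placeholder URL
        .build()
    return OkHttpClient().newWebSocket(request, object : WebSocketListener() {
        override fun onOpen(webSocket: WebSocket, response: Response) {
            // Give the server a timestamp so it can measure our round-trip
            // time (the latency-compensation phase described above).
            webSocket.send("""{"type":"ping","sent":${System.currentTimeMillis()}}""")
        }

        override fun onMessage(webSocket: WebSocket, text: String) {
            // A relayed command from another participant: update the local
            // state and re-render.
            applyRemoteCommand(text)
        }
    })
}

fun applyRemoteCommand(json: String) { /* update state, redraw */ }
```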
Actually, we wanted to use the WebRTC Data Channel because it can reduce the number of hops the data has to make to reach a client. In our solution today a client pings the server and the server relays the message (two hops); we could halve the latency by sending directly to the other user (which is the goal of WebRTC). Unfortunately, WebRTC was not ubiquitous enough at the time to deploy as a solution.
WebSockets mean the web. You don't need the web to sync multiple devices in the same physical place: for video/music syncing, native apps using local offline technologies like Bluetooth or Wi-Fi sound more reliable.

How to make an Android device an RTSP server

I am curious to know whether there is a way to make my Android device act as an RTSP server and stream video to any other capable device.
I have no understanding of RTSP servers or of the other protocols that would have to be followed to make this happen.
Any help is appreciated.
Thanks,
SKU
There is nothing to worry about with this problem. I saw a solution here:
http://techsplurge.com/5080/get-vlc-media-player-for-android-with-unofficial/ Take a read and get your answer.
AFAIK, the architecture you should adopt is to put a remote machine in place.
On this machine you must have a server installed (Flash Media Server, Red5, etc.). It will host the application that receives the incoming stream and outputs it.
Then you can stream from a device (what you called the "server") to the remotely hosted application (e.g. www.remoteserver.com:1234/myApp). To play the stream, just put that URL into the input of a media container (http://developer.android.com/reference/android/widget/VideoView.html); Android natively supports the RTSP protocol via that VideoView container. (A playback sketch follows below.)
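A playback-side sketch with that example URL; the surrounding Activity and layout are omitted:

```kotlin
import android.net.Uri
import android.widget.VideoView

// Sketch only: point a VideoView at the RTSP URL served by the remote
// application and start playback once the stream is buffered.
fun playRtsp(videoView: VideoView) {
    videoView.setVideoURI(Uri.parse("rtsp://www.remoteserver.com:1234/myApp"))
    videoView.setOnPreparedListener { mediaPlayer -> mediaPlayer.start() }
}
```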
Hope it helps.
I am trying to implement something similar. One suggestion I received was this: when you make your device an RTSP server and someone wants to stream video from it, that is peer-to-peer streaming. In this case it is better to involve a server in between, acting like an encoder/decoder. When another client sends an RTSP SETUP request, it still reaches your device; but when you are ready to stream video, send the payload (using RTP) to the server, and the server in turn sends it on to the requesting device. That way you can avoid the firewall problems possible on a LAN or with routers (in the case of Wi-Fi networks). A toy relay sketch follows below.
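To illustrate the relay idea, here is a toy sketch of the middle server's forwarding loop; host and ports are placeholders, and a real relay would also handle RTCP, multiple viewers, and session teardown:

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetSocketAddress

// Sketch only: receive RTP packets from the publishing phone on one UDP
// port and forward each packet to the requesting viewer.
fun relayRtp(listenPort: Int, viewerHost: String, viewerPort: Int) {
    val socket = DatagramSocket(listenPort)
    val viewer = InetSocketAddress(viewerHost, viewerPort)
    val buffer = ByteArray(2048)
    while (true) {
        val packet = DatagramPacket(buffer, buffer.size)
        socket.receive(packet)  // RTP packet from the phone
        socket.send(DatagramPacket(packet.data, packet.length, viewer))
    }
}
```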
