Server software for handling multiple mp3/ogg streams - Android

I'm looking for advice regarding an aspect of a project I'm working on.
I'm developing a demo Android app for a not-for-profit that specialises in services for the vision impaired. The plan is that, among other things, the app will enable users to stream this organization's specialised audiobooks.
For the sake of demoing/development I need to establish some sort of server which will, upon a request from a device running the app:
directly transfer certain xml/html index files to the android phone (no streaming necessary)
stream .ogg and .mp3 audio files to the device
serve more than one client device at a time
start a stream from a specific point within an mp3/ogg file, upon a request from the phone app
I've had a look at Icecast as an mp3/ogg streaming solution, but my knowledge of servers is a bit limited (I've only ever done some basic work in Flask). Would I need to run this in tandem with something that can generically serve files / handle requests?
I'm basically just looking for a good solution/tool to implement this with. The server side doesn't need to be completely fleshed out, just fit the bill above, as my focus is developing the phone app side for now. For the sake of a demo, something straightforward / well documented would suit best.

You don't need a special server for this. Any HTTP server that supports range requests will be fine. This includes Nginx, Apache, etc. There is no need to spoon-feed clients media data from the server. This streaming and buffering aspect is handled automatically on the client, through TCP window size and outright closing the connection and connecting again later as-needed.
Icecast is meant for radio-style streams where everyone listening hears the same thing at roughly the same time. Since that's not an aspect you want, stick to any normal HTTP server.
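To make the seeking requirement concrete: starting playback from a specific point is just a matter of sending a Range header. A minimal client-side sketch using HttpURLConnection (the URL and byte offset are placeholders, not your organisation's actual paths):

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeRequestDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical audiobook URL; replace with your own server's path.
        URL url = new URL("http://example.org/audiobooks/chapter1.mp3");
        long startByte = 1_000_000; // seek/resume point within the file

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Ask the server for everything from startByte to the end of the file.
        conn.setRequestProperty("Range", "bytes=" + startByte + "-");

        // 206 Partial Content means the server honoured the range request.
        System.out.println("Status: " + conn.getResponseCode());

        try (InputStream in = conn.getInputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                // Feed the bytes to your decoder/player here.
            }
        }
        conn.disconnect();
    }
}
```

In practice you can usually hand the URL straight to Android's MediaPlayer, which issues its own range requests when you seek; the manual version above is only to show what happens on the wire.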

Related

Does WebRTC cause any load on the broadcaster side?

Imagine someone is broadcasting audio or video worldwide through WebRTC, i.e. one-to-many communication (an app like Periscope, which I think is not done using WebRTC). Will it be affected by the broadcaster's limited bandwidth? Will it increase the load on the broadcaster's side, causing packet loss that degrades the quality of the communication? As this topic is new and very little content is available on the net, please suggest some good books and online tutorials.
WebRTC facilitates peer-to-peer communication. Going by this logic, a simple WebRTC broadcast application will depend heavily on the broadcaster's bandwidth, because the number of outbound media streams will equal the number of recipients.
This is one of the main reasons why WebRTC gateways or media servers (the terminology varies) have been developed. In this case the broadcaster simply sends a single stream to the intermediary gateway or server, and the other recipients then connect to that gateway or server and receive the stream from there.
To put it in simple terms, you basically add a central WebRTC client to which everyone connects.
You can read more at Janus, Kurento, Licode, etc.
Or find official RTCPeerConnection documentation here.

How to synchronize the display of multiple Android devices?

I was thinking of using multiple Android devices (e.g. Nexus 7 tablets) to build a photo / video wall and I'm wondering a) whether it is possible and b) how to synchronize the display of all these devices. Google showed off its Chrome racer experiment so clearly it is possible to synchronize displays across many devices.
So here are my questions:
what technology should I use to synchronize the displays? Android? Chrome? Please point me to existing code if possible.
what's the minimum lag between devices that could be achieved in such a setup?
can video and sound playback also be started simultaneously on multiple devices (think video wall)?
what kind of architecture should be considered for such a project? Centralized server that sends out commands? Should devices talk to each other?
I'm very curious about suggestions!
EDIT:
blinkendroid is the only app I've found so far that might do the job. Pros? Cons? Alternatives?
Technically the screen isn't shared; the game's state is shared, and the phones all render that state as they understand it.
Just a bit of background about Chrome Racer. We have a case-study on here, but it doesn't fully cover the question you are asking.
The primary technology used for communication in Racer is WebSockets. WebSockets allow one client to push and receive messages from a server in near-realtime.
Racer starts a session for a game by giving it a unique ID and holds open a WebSocket to the user. Anyone who subsequently joins a game is told to use the same ID, and the server creates a WebSocket to them as well. Now the server knows all the participants.
When a game starts, a message is broadcast to all the participants asking them to get ready. During this phase the server works out how long it takes to round-trip messages to each of the clients, so that it can estimate the latency between devices and attempt to compensate for it on the slower clients.
Now that the server knows about the clients, the game can start properly. As the users play, their commands are pushed to the server over the WebSocket. The server then relays each message out to all the connected clients (like a satellite does), and it does the same thing for every user connected to the session. This is how the game's state is shared.
As each client receives the commands that are broadcast to it from the server it updates its internal representation of the game and renders that to the screen.
And that is about it.
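For a feel of the round-trip measurement described above, here is a minimal sketch using Java 11's built-in WebSocket client (on Android you would typically use a library such as OkHttp instead; the endpoint URL and the echo-style message format are assumptions for illustration):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;

public class LatencyProbe {
    public static void main(String[] args) throws Exception {
        WebSocket.Listener listener = new WebSocket.Listener() {
            @Override
            public CompletionStage<?> onText(WebSocket ws, CharSequence data, boolean last) {
                // Assumes the server echoes the ping payload back unchanged.
                long sentAt = Long.parseLong(data.toString());
                long rttMs = (System.nanoTime() - sentAt) / 1_000_000;
                System.out.println("Round trip: " + rttMs + " ms");
                ws.request(1); // ask for the next incoming message
                return null;
            }
        };

        // Hypothetical game-session endpoint.
        WebSocket ws = HttpClient.newHttpClient()
                .newWebSocketBuilder()
                .buildAsync(URI.create("wss://example.org/session/1234"), listener)
                .join();

        // Send a few pings; a server doing latency compensation would average
        // these per client and delay the faster clients accordingly.
        for (int i = 0; i < 5; i++) {
            ws.sendText(Long.toString(System.nanoTime()), true).join();
            Thread.sleep(500);
        }
        ws.sendClose(WebSocket.NORMAL_CLOSURE, "done").join();
    }
}
```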
Actually, we wanted to use WebRTC Data Channels because they can reduce the number of hops that data has to make to reach the client. In our solution today a client pings the server and the server relays the message (2 hops); we could halve the latency if we could send it directly to the other user (which is the goal of WebRTC). Unfortunately, WebRTC was not ubiquitous enough to deploy this as a solution at the time.
WebSockets mean web. You don't need the web to sync multiple devices in the same physical place... For video/music syncing, native apps using local offline technologies like Bluetooth or WiFi sound more reliable.

How to communicate between two Android devices

I'm developing an online pong game in which two players could play between them.
I thought that, for this, players would have to connect to a server, and it would tell the players who is online to play. The server would also save rankings and other stuff.
But for the match itself, at first I thought about using the server too (sending coordinates, etc.), but I think that is not the best design because it is really slow.
So I'm thinking that Android devices should be able to communicate directly with each other, right? Any ideas? They have an ID...
If they can... the server could send the ID of the opponent, and with that, the match would proceed by communicating between the mobile devices instead of through the server.
Need some help pls!
You can set up a direct connection between the phones, definitely. Make it so the server coordinates the matchup, sends each player the other player's data (IP and such).
You'll have to use/develop a server/client system between the players. One of the players will act as a server and the other will connect directly to it. Make sure they can properly identify each other. You can make the central server decide which player will act as the match server. A simple UDP connection over the network will do the trick.
This scheme will save you on bandwidth for the central server and probably be faster for the players. However, it IS one more subsystem you have to code.
Make sure you properly weigh those factors and remember that a fast deployment is sometimes better than no deployment at all. (SOMETIMES)
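As a rough illustration of the "one player acts as the match server" idea, here is a minimal UDP exchange in plain Java (the port number and message format are arbitrary; on Android this would run off the main thread):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class PongLink {
    // Player acting as the match "server": waits for the opponent's paddle updates.
    static void host(int port) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(port)) {
            byte[] buf = new byte[64];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                String msg = new String(packet.getData(), 0, packet.getLength(), StandardCharsets.UTF_8);
                System.out.println("Opponent: " + msg); // e.g. "paddle:42"
            }
        }
    }

    // Joining player: the central server has told it the host's IP during matchmaking.
    static void join(String hostIp, int port, String update) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            byte[] data = update.getBytes(StandardCharsets.UTF_8);
            DatagramPacket packet =
                    new DatagramPacket(data, data.length, InetAddress.getByName(hostIp), port);
            socket.send(packet);
        }
    }
}
```

Keep in mind that UDP packets can be lost or reordered, and phones behind carrier NAT may not accept unsolicited incoming packets, which is another reason the central server is useful for the initial handshake.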

Initiating an RTSP connection over cellular with Android

The majority of SIM accounts are public dynamic. Most if not all cellular providers do not allow incoming connections to public dynamic IP addresses (3G anyway, maybe not 4G/LTE).
The issue of connecting is not one of dynamic IPs, but rather of blocked incoming ports.
So, if I wanted to stream video from an Android phone on demand (based on information gleaned from this conversation (Streaming video from Android camera to server)), what would be the chain of events to properly initiate a connection?
My idea of this (roughly):
The app on the Android phone initiates and keeps open some sort of connection to a media server (Wowza or something).
At some point, when the server wants video from the phone, it uses the open connection to request a video stream.
The Android phone pushes an RTSP stream to the server.
Is this correct, and if so, what type of connection should I use as the permanent control connection? Also, is it possible to push RTSP, or would I have to do something else?
Thanks!
I know this is an old question but if anybody else is searching for something similar the following is now available:
http://developer.android.com/guide/google/gcm/index.html
This essentially allows a message to be sent from a server to an app on an Android device (it replaces C2DM which did a similar thing).
Update
Google GCM has now in turn been replaced by Firebase Cloud Messaging:
https://firebase.google.com/docs/cloud-messaging/
Using a cloud-based app messaging service like this, the steps would be:
Add a message subscription service to your app (e.g. Firebase)
The app registers with the cloud messaging service when it starts up
When the server wants video from the phone (as noted in the question above), the server sends a message to the app
The app opens a connection to the streaming server and starts to stream video to the server.
Note: there is a comment below about how this approach does not allow an incoming connection from the server to the Android phone.
This, in fact, is not how streaming from a phone typically works. The phone actually makes an 'outgoing' connection to a streaming server, which it then streams the video to. Other devices wanting to see the video then stream it from there.
There are several reasons why this is the preferred approach. One of the key ones is that supporting a quality streaming service that will play back on most common devices, browsers, OSs, etc. requires transcoding the video into multiple bitrates (and, in some cases, multiple encodings) and packaging and serving it in the appropriate streaming packaging format. Doing all of this on the mobile device would be very compute- and storage-intensive.
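A rough sketch of how steps 2-4 above might look on the Android side with Firebase Cloud Messaging (the "start-stream" data key, the helper methods, and the class name are assumptions for illustration):

```java
import com.google.firebase.messaging.FirebaseMessagingService;
import com.google.firebase.messaging.RemoteMessage;

public class StreamControlService extends FirebaseMessagingService {

    @Override
    public void onMessageReceived(RemoteMessage message) {
        // Assumed convention: the server sends a data message such as
        // {"command": "start-stream"} when it wants the phone to begin pushing video.
        String command = message.getData().get("command");
        if ("start-stream".equals(command)) {
            // Open an *outgoing* connection to the streaming/ingest server
            // and start publishing (app-specific, hypothetical helper).
            startStreamingToServer();
        }
    }

    @Override
    public void onNewToken(String token) {
        // Upload the registration token so the server knows how to reach this device
        // (app-specific, hypothetical helper).
        sendTokenToAppServer(token);
    }

    private void startStreamingToServer() { /* app-specific */ }

    private void sendTokenToAppServer(String token) { /* app-specific */ }
}
```

The service also has to be declared in the app's manifest so Firebase can deliver messages to it.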

TCP-based RPC server (Erlang or something similar?) for iOS/Android app communication

I'm building native mobile applications in both iOS and Android. These apps require "realtime" updates from and to the server, same as any other network-based application does (Facebook, Twitter, social games like Words with Friends, etc)
I think using HTTP long polling for this is overkill, in the sense that long polling can be detrimental to battery life, especially with a lot of TCP setup/teardown. It might make sense to have the mobile applications use persistent TCP sockets to establish a connection to the server, and send RPC-style commands to the server for all web service communication. This, of course, would require a server that can handle the long-lived TCP connections and speak to a web service once it makes sense of the data passed down the TCP pipe. I'm thinking of passing data in plain text using JSON or XML.
Perhaps an Erlang-based RPC server would do well for a network-based application like this. It would allow the mobile apps to send and receive data from the server over one connection, without the repeated setup/teardown that individual HTTP requests would incur using something like NSURLConnection on iOS. Since no web browser is involved, we don't need to deal with the nuances of HTTP at the mobile client level. A lot of these "COMET" and long-polling/streaming servers are built with HTTP in mind. I'm thinking that just using a plain-text protocol over TCP is good enough, will make the client more responsive, allow the client to receive updates from the server, and preserve battery life better than the traditional long polling and streaming models.
Does anyone currently do this with their native iOS or Android app? Did you write your own server or is there something open sourced out there that I can begin working with today instead of reinventing the wheel? Is there any reason why using just a TCP based RPC service is a worse decision than using HTTP?
I also looked into HTTP pipelining, but it doesn't look to be worth the trouble when it comes to implementing it on the clients. Also, I'm not sure if it would allow for bi-directional communication in the client<->server communication channel.
Any insight would be greatly appreciated.
Using TCP sockets with your own protocol rolled out can be better than HTTP, especially given the limited resources on mobile devices. Erlang will do quite well here, but let's start with your protocol. Erlang excels at this, especially with its bit syntax expressions. Still, you could use plain text if you wish: JSON (which needs a parser, e.g. mochijson2.erl from the Mochiweb library) or XML (which needs a parser, e.g. Erlsom).
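If you do go the plain-text-over-TCP route, the client needs some framing so it knows where one JSON message ends and the next begins. A common, simple choice is a length prefix. A minimal client-side sketch in Java (host, port, and message shape are placeholders):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class FramedJsonClient {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("rpc.example.org", 9000);
             DataOutputStream out = new DataOutputStream(socket.getOutputStream());
             DataInputStream in = new DataInputStream(socket.getInputStream())) {

            // Frame = 4-byte big-endian length, followed by that many bytes of UTF-8 JSON.
            byte[] request = "{\"method\":\"ping\",\"id\":1}".getBytes(StandardCharsets.UTF_8);
            out.writeInt(request.length);
            out.write(request);
            out.flush();

            // Read the reply frame the same way; readFully blocks until all bytes
            // have arrived, so partial TCP reads are handled for us.
            int replyLength = in.readInt();
            byte[] reply = new byte[replyLength];
            in.readFully(reply);
            System.out.println(new String(reply, StandardCharsets.UTF_8));
        }
    }
}
```

On the Erlang side, the matching socket option is {packet, 4}, which adds and strips the same 4-byte length prefix for you.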
I have personally worked on a project in which we used raw TCP sockets between our Erlang servers and mobile devices. However, depending on the port numbers you choose, routers along the way may block or drop packets according to the security policies of the service providers. Still, I think HTTP can work: people chat on Facebook Mobile, send tweets, etc. from their devices, and I'm sure these social engines use some kind of long polling or server push, but over HTTP. Mobile devices have advanced in capability of late.
Rolling your own TCP-based protocol comes with a number of challenges: port selection, parsing of data at both the client and the server, security issues, etc. Using HTTP lets you think about the actual problem rather than spending time fixing protocol issues on the client or server. The devices you've mentioned, Android and iOS (iPad, iPhone, etc.), are very capable of handling HTTP COMET (long polling). I'm sure that if you follow the standards for web applications on mobile devices, as well as the W3C Mobile Web Best Practices, your app will function well using HTTP.
Using HTTP methods will speed up the work, and there are a lot of libraries in the SDKs of these devices that will help you prototype the solution you want, compared with rolling your own TCP-based plain-text protocol. To back up this reasoning, look through these W3C findings.
Finally, let me talk about the HTTP benefits on these devices. If you use web technologies for mobile devices, such as Opera Widgets, PhoneGap, Sencha Touch, and jQuery Mobile, their SDKs and libraries have optimizations already done for you, or well-documented ways in which your app can be made efficient. Further still, these technologies have APIs to access the native device's resources, like battery status, SMS, MMS, GSM broadcast channels, contacts, lighting, GPS, and memory, all exposed as JavaScript APIs. It would be harder (less flexible) to use native programming languages like J2ME, Mobile Python, or Symbian C++/Qt, compared with web technologies like CSS3, HTML5, and the JavaScript tools mentioned above. In my experience, using those web tools will also make your app easy to distribute through, say, the Ovi Store or the Apple Store.
Note that if you use HTTP, testing will be easy: all you need is a public domain so the app on the mobile device can locate your servers over the Internet. If you roll your own TCP/IP protocol, network routers may interfere with the port number you use unless you plan on using port 80 or another well-known port, and even then your server's IP would have to be made public. There is a shortcut to this: if you put your TCP server behind the same ISP as your test phone's Internet connection, the ISP's routers will see both source and destination as inside its own network. But all in all, there are challenges with rolling your own protocol.
Edit: Using HTTP, you will also benefit from REST. Web servers implemented in Erlang (especially Yaws and Mochiweb) excel at REST services. Look at this article: RESTful services with Yaws. For Mochiweb, there is an interesting article, "A Million-User Comet Application Using Mochiweb", which is broken into three parts. Further still, you could look at the solution given to this question.
There are ZeroMQ builds for Android and iOS. Java and ObjC bindings exist as well.
HTTP was created for infrequent requests with large responses. It is highly inefficient for transferring a large number of small data chunks. In a typical situation, the HTTP headers can be twice the size of the actual payload. The only strong side of HTTP is its familiarity, its "one size fits all" karma.
If you want lightweight and fast solution, I guess ZeroMQ can be a perfect solution.
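For a feel of what this looks like, here is a rough request/reply sketch using the pure-Java JeroMQ binding (the endpoint and payload are placeholders, and exact class names can differ between binding versions):

```java
import org.zeromq.SocketType;
import org.zeromq.ZContext;
import org.zeromq.ZMQ;

public class ZmqPing {
    public static void main(String[] args) {
        try (ZContext context = new ZContext()) {
            // REQ/REP is the simplest pattern to demo; PUB/SUB or DEALER/ROUTER
            // fit a push-style mobile backend better.
            ZMQ.Socket socket = context.createSocket(SocketType.REQ);
            socket.connect("tcp://server.example.org:5555");

            socket.send("ping");              // tiny payload, no HTTP headers around it
            String reply = socket.recvStr();  // blocks until the reply arrives
            System.out.println(reply);
        }
    }
}
```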
One reason to go with HTTP instead of a custom service is that it's widely supported on a transport level.
With mobile devices, a user might be on Wi-Fi at a hotel, airport, coffee shop, or corporate LAN. In some cases this means having to connect via proxy. Your application's users will be happiest if the application is able to use the device's proxy settings to connect. This provides the least surprise -- if web browsing works, then the application should work also.
HTTP is simple enough that it isn't difficult to write a server that will accept HTTP requests from a custom client. If you decide to go this route, the best solution is the one that you don't have to support. If you can write something in Erlang that is supportive of application changes, then it sounds like a reasonable solution. If you're not comfortable doing so then PHP or J2EE gets bonus points for the availability of cheap labor.
While HTTP does benefit from being widely supported, some successful projects are based on other protocols. The Sipdroid developers found that persistent TCP connections do greatly improve battery life. Their article on the topic doesn't address the server side but it does give a high-level description of their approach on the client.
Erlang is very well suited for your use case. I'd prefer using TCP over HTTP for the sake of saving battery life on the phone as you noted already.
Generally, getting the communication between the device and the server up and running will be very easy. The protocol you use between the two is what will require the most work. However, writing protocols in Erlang is strikingly straightforward when using gen_fsm.
You should check out metajack's talk at the Erlang Factory, which highlights his solution to a very similar use case for his iPhone game Snack Words.
I work on an application that connects a Microsoft HTTP server to mobile devices over long-lived HTTP/HTTPS connections, to allow push-type data to be sent to the mobile. It works, but there are lots of little gotchas on the mobile side.
For the client to get "packets" of data, we put the HTTP connection into chunked encoding mode so that each packet arrives in one chunk.
Not all native HTTP APIs on every mobile platform support calling you back when a "chunk" of data has arrived; the ones that don't normally wait until all the data from the server has arrived before calling the application back with it. Platforms that support callbacks with partial data (that I have found):
Symbian
Windows Mobile
Platforms that don't support partial data callbacks:
iOS
Blackberry
For the platforms that don't support partial callbacks, we wrote our own HTTP connection code with chunked encoding support using the native socket support. It's actually not very hard.
Don't rely on one chunk being exactly one of your packets; HTTP proxies or the native HTTP API implementations may break that assumption.
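In other words, the client has to reassemble its own message boundaries from whatever byte runs the connection delivers. A minimal sketch, assuming newline-delimited messages inside a long-lived response body (the URL is a placeholder):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PushChannel {
    public static void main(String[] args) throws Exception {
        // Hypothetical long-lived push endpoint; the server keeps the response
        // open and writes one newline-terminated message per event.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://push.example.org/channel").openConnection();
        conn.setReadTimeout(0); // never time out while waiting for the next message

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            // readLine() doesn't care how the bytes were split into chunks by the
            // transfer encoding or by any proxy in the middle.
            while ((line = reader.readLine()) != null) {
                handleMessage(line);
            }
        }
    }

    private static void handleMessage(String message) {
        System.out.println("push: " + message);
    }
}
```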
On iOS, the background multitasking rules mean you can't keep this connection going while your application is in the background. You really need to use Apple's Push Notification service and live with its limitations.
Never trust mobile cellular networks. I have seen the weirdest things, like the server side seeing the HTTP connection drop and then reconnect (with a replay of the original HTTP request) while, on the mobile end, you don't see any drop in the connection. Basically, treat the connection as unreliable: data can go missing. We ended up implementing a TCP-like sequence number scheme to ensure we didn't lose data.
Using HTTP/HTTPS makes it easier to get past firewall rules on customer sites.
I'm not sure using HTTP/HTTPS long-lived connections was the wisest decision we ever made, but it was made long before I turned up, so I have to live with the fallout of it.
As an alternative, we are looking at WebSockets as well, but with the WebSocket spec in a state of flux at the moment, and generally not being easy to follow, I don't know whether it will work out or not.
So that is my experience with using HTTP/HTTPS as a long-lived realtime connection.
Your mileage may vary.
It all depends on what data you are sending - the size of it, the criticality of timeliness, frequency of update etc.
If you are looking for reasonably lazy updates and verbose data (JSON, say), then go with an HTTP COMET pattern, as you will find it much easier to get through standard network gear, as other answers have highlighted. If you are behind a corporate firewall/proxy, for example, HTTP will be a much safer bet.
However, if you are doing fast things with small data sizes, then go with something homegrown and leverage a TCP connection. It's much more to the point and you'll find the performance, in real terms, much better. Use simple data structures and fast operators to slice your data up as you need it.
Again, as other posters have noted, battery usage is a big concern. You will eat the battery, practically burning a hole in your pocket, if you are not careful. It is very easy to turn a battery that lasts two days into one that lasts six hours.
Lastly, don't trust the network if you are time sensitive. If you are not, then a long poll over HTTP will be just fine for you. But if you are looking for high-performance messaging, be acutely aware that a mobile network is not an end-to-end TCP connection. Your requests will vary in trip time and latency.
So, back to what you want to do with the app. As you are building for iOS (native, obviously) and Android, I would leverage the Apple Push Notification service and its framework. Build your back-end services to talk to that, and also provide interfaces for non-Apple devices (i.e. HTTP- or TCP-level listeners). That way you have one platform and multiple "gateways" for your apps. You can then support RIM via their push service too if you want.
