I was thinking of using multiple Android devices (e.g. Nexus 7 tablets) to build a photo/video wall, and I'm wondering a) whether it is possible and b) how to synchronize the display of all these devices. Google showed off its Chrome Racer experiment, so clearly it is possible to synchronize displays across many devices.
So here are my questions:
What technology should I use to synchronize the displays? Android? Chrome? Please point me to existing code if possible.
What's the minimum lag between devices that could be achieved in such a setup?
Can video and sound playback also be started simultaneously on multiple devices (think video wall)?
What kind of architecture should be considered for such a project? A centralized server that sends out commands? Should devices talk to each other?
I'm very curious about suggestions!
EDIT:
blinkendroid is the only app I've found so far that might do the job. Pros? Cons? Alternatives?
Technically the screen isn't shared; the game state is shared, and the phones all render that state as they understand it.
Just a bit of background about Chrome Racer. We have a case-study on here, but it doesn't fully cover the question you are asking.
The primary technology used for communication in Racer is WebSockets. WebSockets allow a client to push messages to, and receive messages from, a server in near real time.
Racer starts a session for a game by giving it a unique ID and holding a WebSocket open to the user. Anyone who subsequently joins the game is told to use the same ID, and the server opens a WebSocket to them as well. Now the server knows all the participants.
When a game starts, a message is broadcast to all the participants asking them to get ready to start. During this phase the server works out how long it takes to round-trip messages to each client, so that it can estimate the latency between devices and attempt to compensate for it on the slower clients.
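That latency measurement is essentially a timestamped ping: the server notes when the message went out, the client echoes it straight back, and the difference is the round trip. Here is a minimal sketch of the idea in plain Java, with invented delays and numbers (this is not Racer's actual code):

```java
// Hedged sketch of the "work out the round trip" phase; the sleep stands in for a
// real network echo, and all numbers are illustrative only.
public class RttProbe {

    public static void main(String[] args) throws InterruptedException {
        // Server side: timestamp a ping, wait for the client to echo it back.
        long pingSentAt = System.currentTimeMillis();
        Thread.sleep(40); // stand-in for the real network round trip
        long rtt = System.currentTimeMillis() - pingSentAt;

        // Do this for every client, keep the worst case, then broadcast an
        // absolute start time far enough in the future that even the slowest
        // client hears about it before it arrives.
        long worstRtt = Math.max(rtt, 120); // pretend the slowest client measured 120 ms
        long startAt = System.currentTimeMillis() + 2 * worstRtt;
        System.out.println("Measured RTT " + rtt + " ms; start the race at " + startAt);
    }
}
```

Each client can then delay its own start locally so that every screen changes state at roughly the same wall-clock moment.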
Now that the server knows about the clients, the game can start properly. As the users play, their commands are pushed to the server over the WebSocket. The server then relays each message out to all the connected clients (like a satellite does), and it does the same thing for every user connected to the session. This is how the game state is shared.
As each client receives the commands broadcast from the server, it updates its internal representation of the game and renders that to the screen.
And that is about it.
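To make the relay step concrete, here is a minimal sketch of a broadcast server built on the open-source Java-WebSocket library (org.java_websocket). The class name, port, and log lines are illustrative only; Racer's real server also tracks session IDs and does the latency compensation described above, but the core loop is just "receive from one socket, send to all the others".

```java
import java.net.InetSocketAddress;
import org.java_websocket.WebSocket;
import org.java_websocket.handshake.ClientHandshake;
import org.java_websocket.server.WebSocketServer;

public class RelayServer extends WebSocketServer {

    public RelayServer(int port) {
        super(new InetSocketAddress(port));
    }

    @Override
    public void onOpen(WebSocket conn, ClientHandshake handshake) {
        System.out.println("Client joined: " + conn.getRemoteSocketAddress());
    }

    @Override
    public void onMessage(WebSocket conn, String message) {
        // The "satellite" step: every command a player sends is pushed back out
        // to every other participant.
        for (WebSocket client : getConnections()) {
            if (client != conn) {
                client.send(message);
            }
        }
    }

    @Override
    public void onClose(WebSocket conn, int code, String reason, boolean remote) {
        System.out.println("Client left: " + conn.getRemoteSocketAddress());
    }

    @Override
    public void onError(WebSocket conn, Exception ex) {
        ex.printStackTrace();
    }

    @Override
    public void onStart() {
        System.out.println("Relay listening on " + getPort());
    }

    public static void main(String[] args) {
        new RelayServer(8887).start();
    }
}
```

A fuller version would keep a map from session ID to the sockets in that session and only relay within a session.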
Actually, we wanted to use the WebRTC Data Channel because it can reduce the number of hops that data has to make to reach the client. In our solution today a client pings the server and the server relays the message (2 hops); we could cut the latency roughly in half if we could send it directly to the other user (which is the goal of WebRTC). Unfortunately, WebRTC was not ubiquitous enough at the time to deploy it as a solution.
WebSockets means the web. You don't need the web to sync multiple devices in the same physical place... For video/music syncing, native apps using local offline technologies like Bluetooth or Wi-Fi sound more reliable.
Related
I'm looking for advice regarding an aspect of a project I'm working on.
I'm developing a demo Android app for a not-for-profit that specialises in services for the vision impaired. The plan is that, among other things, the app will enable users to stream this organisation's specialised audiobooks.
For the sake of demo'ing/development I need to establish some sort of server which will, pending a request from a device running the app:
directly transfer certain xml/html index files to the android phone (no streaming necessary)
stream .ogg and .mp3 audio files to the device
serve more than one client device at a time
start a stream from a specific point within an mp3/ogg file, pending a request from the phone app
I've had a look at Icecast as an mp3/ogg streaming solution, but my knowledge of servers is a bit limited (I've only ever done some basic work in Flask). Would I need to run this in tandem with something that can generically serve files / handle requests?
I'm basically just looking for a good solution/tool to implement this. The server side doesn't need to be completely fleshed out, just fit the bill above, as my focus is developing the phone-app side for now. For the sake of a demo, something straightforward and well documented would suit best.
You don't need a special server for this. Any HTTP server that supports range requests will be fine. This includes Nginx, Apache, etc. There is no need to spoon-feed clients media data from the server. The streaming and buffering aspect is handled automatically on the client, through TCP window sizing and by outright closing the connection and reconnecting later as needed.
Icecast is meant for radio-style streams where everyone listening hears the same thing at roughly the same time. Since that's not an aspect you want, stick to any normal HTTP server.
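To see why a plain HTTP server is enough, here is a hedged sketch of what "start a stream from a specific point" looks like from the phone side: an ordinary request with a Range header. The URL and offset are placeholders; any server with range support (Nginx, Apache) answers with 206 Partial Content and sends only the requested bytes.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeFetch {

    public static void main(String[] args) throws Exception {
        // Placeholder URL: any .ogg/.mp3 served by Nginx, Apache, etc.
        URL url = new URL("http://example.org/audiobooks/chapter01.ogg");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Range", "bytes=1048576-");      // start 1 MiB into the file
        System.out.println("Status: " + conn.getResponseCode()); // expect 206 Partial Content

        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            int n, total = 0;
            while ((n = in.read(buf)) != -1 && total < 64 * 1024) {
                total += n; // in a real app this would feed a decoder/player
            }
            System.out.println("Read " + total + " bytes starting mid-file");
        } finally {
            conn.disconnect();
        }
    }
}
```

In practice, Android's MediaPlayer should issue these range requests for you when you hand it an HTTP URL and call seekTo(), so the phone side may not need any manual HTTP code at all.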
This question follows on from Unity3D -- Send message to other mobile phones in the same vicinity
However, I made mistake of restricting to Unity3D.
So I would like to re-ask the question without that constraint.
Let us say we have 20 mobile phone users in a cave (so no Wi-Fi networks / no GPS).
One user hits a button, and every other user's screen flashes (within a few milliseconds).
How to accomplish this?
What if everyone is using an iPhone?
What if there is a mix of iPhone and android users?
Finally, is there any solution that would cover a wider range of phones?
You need some kind of network so that the phones can share data. Bluetooth has a range of roughly 10 m (it depends on the devices, though). Since all the phones are running the same app, they should be linked into a network and communicate. Please check:
http://developer.android.com/samples/BluetoothLeGatt/index.html
You can make one device act as the server and have it communicate with the other devices.
https://github.com/polyclef/BluetoothChatMulti
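For completeness, here is a hedged sketch of the "one device as server" idea using classic Bluetooth RFCOMM rather than the BLE GATT sample linked above. The UUID, service name, and message are placeholders, and the usual Bluetooth permissions still apply.

```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothServerSocket;
import android.bluetooth.BluetoothSocket;
import java.io.IOException;
import java.util.List;
import java.util.UUID;
import java.util.concurrent.CopyOnWriteArrayList;

public class FlashServer extends Thread {

    // Placeholder UUID; both sides of the app must agree on it.
    private static final UUID APP_UUID =
            UUID.fromString("8ce255c0-200a-11e0-ac64-0800200c9a66");

    private final List<BluetoothSocket> clients = new CopyOnWriteArrayList<>();

    @Override
    public void run() {
        try {
            BluetoothServerSocket server = BluetoothAdapter.getDefaultAdapter()
                    .listenUsingRfcommWithServiceRecord("flash", APP_UUID);
            while (true) {
                clients.add(server.accept()); // blocks until another phone connects
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /** Called when the user hits the button: tell every connected phone to flash. */
    public void broadcastFlash() {
        for (BluetoothSocket socket : clients) {
            try {
                socket.getOutputStream().write("FLASH\n".getBytes());
            } catch (IOException e) {
                clients.remove(socket); // drop phones that have gone away
            }
        }
    }
}
```

Each client phone would call createRfcommSocketToServiceRecord() with the same UUID, connect(), and flash its screen whenever it reads a FLASH line. Note that classic Bluetooth only supports a handful of simultaneous connections per device, so 20 phones would likely need some chaining or relaying.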
If you have installed the app on all of the devices, then in all probability yes. If the device supports push (pretty much any smartphone does), you can use the push service to synchronize the devices based on geofencing (i.e. within 10 m of my location). There are some other discovery routes you could try too (without using the B word), such as pinging other devices.
The app would need to provide some sort of server service if it were to create its own private network based on the IP addresses of the devices it found nearby, as those devices would have to connect to the phone acting as the server. The network interface shouldn't be important, but connecting the satellite devices to the server is. You could try doing it based on which device can provide data services, i.e. act as a hotspot. You can easily connect devices to networks programmatically.
At that point you're faced with the classic client-server problem. There is going to be a huge amount of work to get devices configured: network creation, client-server infrastructure if it has to be done without data, packet optimization. Very expensive and very high risk, depending on how many restrictions there are.
Search for "how to make an HTML5 group chat" and then build on that example.
You could send commands to the chat delimited by a / character, which a bit of JavaScript could then execute.
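A rough sketch of that command convention (the command name and handling are made up, and it's shown in Java to match the Android context, though the same few lines work in JavaScript): whichever transport the group chat uses, each client just checks for a leading / before treating a line as ordinary chat.

```java
public class ChatCommandParser {

    public static void handle(String line) {
        if (line.startsWith("/")) {
            String command = line.substring(1).trim();
            if (command.equals("flash")) {
                System.out.println("flash the screen now"); // hook your UI code here
            } else {
                System.out.println("unknown command: " + command);
            }
        } else {
            System.out.println("chat: " + line); // ordinary chat message
        }
    }

    public static void main(String[] args) {
        handle("hello everyone");
        handle("/flash");
    }
}
```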
Good Luck with your design.
Danny117
I'm trying to make an application for android such that I can control my computer mouse by moving on the phone screen. This means it needs to be quick and responsive.
So far I have the websocket server written that listens for movement, which works great when using a laptop's browser as the websocket client. However, I've tried several websocket clients for android, but they're all very slow and unresponsive.
Is it possible to create a websocket connection with android that can deliver real time communication? How? If not, any alternative solutions?
Thanks!
What you are looking for is the fastest possible streamed data, where missing some packets is acceptable because the packets don't have to build up a persistent state.
So in your case the UDP transport protocol would be the best choice, as it offers speed at the price of reliable delivery. You might have some messages dropped, but the ones that are delivered will get there relatively fast (and most will be delivered).
You will also need to implement some extrapolation to predict mouse movements if you want the feeling of being "in the same time" on both sides.
Clicks, though, should be delivered reliably, along with the specific location of each click, so that they can be properly simulated.
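A hedged sketch of the phone side under those assumptions: movement deltas go out as fire-and-forget UDP datagrams (the host, port, and payload format are placeholders), while clicks would go over a separate reliable channel such as TCP.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class MouseSender {

    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket();
        InetAddress desktop = InetAddress.getByName("192.168.1.50"); // your PC's LAN address
        for (int i = 0; i < 100; i++) {
            byte[] payload = ("MOVE " + i + "," + i).getBytes();     // fake touch deltas
            socket.send(new DatagramPacket(payload, payload.length, desktop, 9876));
            Thread.sleep(16);                                        // ~60 samples per second
        }
        socket.close();
    }
}
```

The desktop side would sit in a loop on DatagramSocket.receive() and feed the deltas to, for example, java.awt.Robot.mouseMove().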
I'm developing a piece of software which consists on mobile clients and a machine acting like a server. This is for a highly trusted environment (not public), so I don't care much about security.
I want the clients to be allowed to perform a certain action only if they are, say, 2 meters from the server.
As the client is actually an HTML5 app, it would be better if the server performs the check, not the clients (maybe the clients can send their position to the server and then it performs the check), but if that cannot be done, it doesn't matter.
I have run out of ideas about how this can be done. I have thought about Bluetooth and geolocation, but can they detect whether the client is within about 5 meters?
Is this even remotely possible?
You can use sound waves to do this. These links might help you get started:
How to estimate distance between two android devices? (bluetooth preferred)
http://www.ehow.com/how_6075947_measure-two-locations-using-sound.html
http://iqtainment.wordpress.com/acoustic-ruler/
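The gist of the acoustic-ruler approach in those links, as a hedged sketch (the timing numbers are invented): one device chirps and starts a timer, the other replies with its own chirp after a known fixed delay, and half of the remaining round-trip time multiplied by the speed of sound gives the distance.

```java
public class AcousticRange {

    private static final double SPEED_OF_SOUND_M_PER_S = 343.0; // in air at ~20 °C

    static double distanceMeters(double roundTripSeconds, double replyDelaySeconds) {
        // Subtract the fixed processing delay on the far device, then halve the
        // remaining time because the sound travelled out and back.
        return (roundTripSeconds - replyDelaySeconds) / 2.0 * SPEED_OF_SOUND_M_PER_S;
    }

    public static void main(String[] args) {
        System.out.println(distanceMeters(0.215, 0.200) + " m"); // ~2.6 m for these numbers
    }
}
```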
I'm developing an online pong game in which two players could play between them.
I thought that, for that, players would have to connect to a server, and it would tell the players who is online to play. The server will also save rankings and other stuff.
But for the gameplay itself, at first I thought of using the server for the match too (sending coordinates, etc.), but I think that is not the best design because it is really slow.
So I'm thinking that Android devices must be able to communicate between themselves, right? Any ideas? They have an ID...
If they can... the server could send the ID of the opponent, and with that, the match would start, communicating everything between the mobile devices and not with the server.
Need some help pls!
You can set up a direct connection between the phones, definitely. Make it so the server coordinates the matchup, sends each player the other player's data (IP and such).
You'll have to use/develop a server/client system between the players. One of the players will act as a server and the other will connect directly to it. Make sure they can properly identify each other. You can make the central server decide which player will act as the match server. A simple UDP connection over the network will do the trick.
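A hedged sketch of that hand-off (the addresses, port, and message format are placeholders): once the central server has told each phone the other's IP and which of them hosts the match, the host binds a UDP port and the other phone sends its updates straight to it.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class PongPeer {

    // Phone chosen by the central server to host the match.
    static void runAsMatchServer() throws Exception {
        try (DatagramSocket socket = new DatagramSocket(6000)) { // port announced by the central server
            byte[] buf = new byte[64];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet); // blocks until the opponent's first paddle update arrives
            System.out.println("Opponent says: "
                    + new String(packet.getData(), 0, packet.getLength()));
        }
    }

    // The other phone: talks directly to the opponent, not to the central server.
    static void runAsClient(String opponentIp) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            byte[] payload = "PADDLE_Y 120".getBytes(); // example paddle position
            socket.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getByName(opponentIp), 6000));
        }
    }

    public static void main(String[] args) throws Exception {
        if (args.length == 0) {
            runAsMatchServer();
        } else {
            runAsClient(args[0]); // pass the opponent's IP handed out by the central server
        }
    }
}
```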
This scheme will save you on bandwidth for the central server and probably be faster for the players. However, it IS one more subsystem you have to code.
Make sure you properly weigh those factors and remember that a fast deployment is sometimes better than no deployment at all. (SOMETIMES)