I have to build an application for Android to stream video and audio to a desktop application through a server. Latency is important. I also have to make sure that the Android streaming can be controlled from the PC (the user should be able to switch the camera or turn off the microphone).
I thought about using the WebRTC protocol for the communication, but it seems I'm going to have to write a signalling server myself to support the requirement mentioned above.
Is there a better way to implement this whole thing? Also, I can't find any good docs or libraries for Android streaming (no Retrofit analogues, obviously).
P.S. I'm thinking about using JavaFX via TornadoFX for the desktop application.
You certainly don't need to create your own signaling server. I would suggest using something like the Kurento media server, or a derivative of Kurento like OpenVidu. Both are open source and free and have lots of great, active support via Google Groups. Depending on how much specific customization you need, one or the other might be the better fit: OpenVidu allows for less customization, since most of the stuff under the hood is already done for you, whereas Kurento lets you modify and customize almost everything under the hood and on the front end, starting from examples that can be changed at the code level.
I have used it extensively on projects in the past and think it meets most, if not all, of your requirements. Scaling can be a bit challenging, but it is still much easier than pure P2P WebRTC, since everything is relayed through a central server, and it is certainly doable depending on your requirements and implementation. Additionally, you can record, process and transcode video server-side.
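Whichever server you pick, the remote-control requirement (switching the camera or muting the microphone from the PC) usually boils down to a small control message that the Android client reacts to. As a rough sketch, assuming the libwebrtc Android library (org.webrtc) and a hypothetical onControlMessage callback wired to your signaling connection or a WebRTC data channel:

    import org.webrtc.AudioTrack;
    import org.webrtc.CameraVideoCapturer;

    public class StreamController {
        private final CameraVideoCapturer capturer; // created from a camera enumerator
        private final AudioTrack micTrack;          // the local audio track being streamed

        public StreamController(CameraVideoCapturer capturer, AudioTrack micTrack) {
            this.capturer = capturer;
            this.micTrack = micTrack;
        }

        // Hypothetical entry point: called whenever the desktop sends a control
        // message over the signaling connection or a data channel.
        public void onControlMessage(String command) {
            switch (command) {
                case "switch_camera":
                    capturer.switchCamera(null); // flips between front and back camera
                    break;
                case "mute_mic":
                    micTrack.setEnabled(false);  // stops sending audio without renegotiating
                    break;
                case "unmute_mic":
                    micTrack.setEnabled(true);
                    break;
            }
        }
    }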
I want to implement my own encryption before the call data go into the GSM network, i.e. I want the call stream in the form of bits, I will apply my own encryption algorithm and then send it on to the network; my app on the other side (the receiver's end) will receive the data, decrypt it and turn it back into audio.
I want to know whether this is feasible, and if it is, how? I mean I want to use the cell-phone network, e.g. Airtel, Vodafone, etc.
If that is not possible, it would be of great help if I could do it over the internet (2G or 3G)?
Any guidance on this would help; I just want a direction.
Thanks in advance.
You can quickly create a chat application using Adobe Flex, which will produce an Adobe AIR app that can run on Android (and can also compile an iOS version if desired). The core strength of Adobe Flex is sending audio (and video) data with very little effort on the developer's part.
You can configure your application to use SSL via the RTMPS protocol if you want the transmitted data to be encrypted.
This page shows you how you can create a simple video chat app for Android using Flex http://coenraets.org/blog/2010/07/video-chat-for-android-in-30-lines-of-code/ - if you specifically don't want video, you can send audio-only data.
I can't imagine any reason why this wouldn't be possible, as the networks are just passing data around; I don't think they care whether it's encrypted or not - it's just a series of 1s and 0s.
As to how, that's a little beyond the remit of Stack Exchange - if you have specific problems, post them with code.
There are other similar questions which you could look at:
Basic encryption on Android
https://stackoverflow.com/search?q=android+encryption
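For the encryption step itself, a minimal sketch with the standard javax.crypto API (AES-GCM) might look like the following; how the two phones share the key is out of scope, and the class and method names are just placeholders:

    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.GCMParameterSpec;
    import java.security.SecureRandom;

    public class AudioCrypto {
        private static final int GCM_TAG_BITS = 128;
        private static final int IV_BYTES = 12;

        // In a real app the key must be shared between both phones (out of scope here).
        public static SecretKey newKey() throws Exception {
            KeyGenerator kg = KeyGenerator.getInstance("AES");
            kg.init(256);
            return kg.generateKey();
        }

        // Encrypts one buffer of audio bytes; the IV is prepended to the ciphertext.
        public static byte[] encrypt(SecretKey key, byte[] audioChunk) throws Exception {
            byte[] iv = new byte[IV_BYTES];
            new SecureRandom().nextBytes(iv);
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
            byte[] ct = cipher.doFinal(audioChunk);
            byte[] out = new byte[IV_BYTES + ct.length];
            System.arraycopy(iv, 0, out, 0, IV_BYTES);
            System.arraycopy(ct, 0, out, IV_BYTES, ct.length);
            return out;
        }

        // Reverses encrypt(): splits off the IV and decrypts the rest.
        public static byte[] decrypt(SecretKey key, byte[] packet) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            GCMParameterSpec spec = new GCMParameterSpec(GCM_TAG_BITS, packet, 0, IV_BYTES);
            cipher.init(Cipher.DECRYPT_MODE, key, spec);
            return cipher.doFinal(packet, IV_BYTES, packet.length - IV_BYTES);
        }
    }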
On Android, calls over the GSM (or other cellular) network are handled by the baseband processor, which you don't have direct access to. You talk to it via rild (the Radio Interface Layer daemon), which uses a proprietary library to talk to the actual hardware. So in practice you cannot mess with the mobile network.
A VoIP application would use the data connection, and there you can send/receive pretty much anything you want. If you use a standard technology such as SIP, there are ways to use TLS for the communication channel(s) so that traffic is encrypted. If you are creating your own, you might do something similar by using SSL sockets.
The 'how' part doesn't really fit the SO format, since it's very open ended and depends on how you decide to implement this.
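Still, as a bare illustration of the SSL-socket idea from the previous paragraph (the host, port and framing of the audio payload are placeholders, not a working VoIP stack):

    import javax.net.ssl.SSLSocket;
    import javax.net.ssl.SSLSocketFactory;
    import java.io.OutputStream;

    public class SecureVoipLink {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint: your own relay server terminating TLS.
            SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
            try (SSLSocket socket = (SSLSocket) factory.createSocket("voip.example.com", 5061)) {
                socket.startHandshake();                 // TLS handshake; traffic is now encrypted
                OutputStream out = socket.getOutputStream();
                byte[] audioChunk = new byte[]{ /* encoded audio frame */ };
                out.write(audioChunk);                   // anything written here goes out encrypted
                out.flush();
            }
        }
    }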
We are going to build a multiplayer game.
The idea is that every player has a tablet and is connected to a server.
The server should control the game logic, while the clients (the tablets) will only serve as a frontend to the game.
We need to make a decision about the frameworks/programming language we are going to use.
A cross-platform frontend would be cool, but is not mandatory. It has to run on Android devices at least.
The communication between the server and the client must be bidirectional and realtime.
We don't care about a small amount of delay.
Currently we consider an HTML5 client in combination with a JavaScript server (running on Node.js) to be the best option.
The communication would be managed by the JavaScript library socket.io.
The HTML5 frontend can be run either in-browser or as an app (built with PhoneGap).
However we did not decide yet since we want to be sure to make the right choice.
There might be frameworks that can do a better job.
Does anyone know a better solution?
Play 2 is great for this as well, but Node/socket.io are great choices too. I would use Backbone.js for the front end, as it gives you a lot of flexibility and makes it easy to keep the views in sync. (There is an example of such an app on my GitHub if you're interested.)
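If the tablets end up as native Android apps rather than a pure HTML5/PhoneGap build, the same Node.js/socket.io server can still be driven from Java. A minimal sketch with the socket.io-client Java library, where the URL and the event names ("state", "join") are made up for illustration; the server stays authoritative and the tablet only renders whatever state it receives:

    import io.socket.client.IO;
    import io.socket.client.Socket;
    import org.json.JSONObject;

    public class GameClient {
        public static void main(String[] args) throws Exception {
            // Placeholder URL for the Node.js/socket.io game server.
            Socket socket = IO.socket("http://game.example.com:3000");

            // Payload announcing this tablet; built up front so the lambdas stay simple.
            JSONObject join = new JSONObject().put("player", "tablet-1");

            // The server pushes the authoritative game state; the tablet only renders it.
            socket.on("state", data -> System.out.println("new state: " + data[0]));

            // Tell the server who we are once the connection is up.
            socket.on(Socket.EVENT_CONNECT, data -> socket.emit("join", join));

            socket.connect();
        }
    }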
I'm working on an Android game based on playing cards (Bridge, to be precise), which can be played by four players at a time. There will be a server available over the web, to which the devices will connect, and the server will keep track of the game's progress.
My game is so basic when it comes to graphics that I can build the UI without using any game engine.
While I'm supposed to build the game (the client) for Android, I want to develop a server which can be RE-USED in future ports of the game, even if it is ported to other mobile platforms or even to desktop.
So my first candidate for the server architecture was a RESTful web service, so that I can leverage the server with any client, as long as the client's side supports HTTP methods.
But later I realized that there will be a persistent connection between the devices and the server throughout the game session, so would it be okay to have such a server, where the connection terminates after each request is answered (I'm not sure if that is actually how it works)?
Or shall I use Java's DatagramSocket and DatagramPacket approach to build the server? (Will that ensure re-usability of the server?)
Any other suggestions or recommendations?
Note: I'm not new to Java or Network Programming in Java, but I'm new to both Android Development and creating RESTful services.
While writing for Android, don't plan for a persistent connection. Connections break very often (and often for good reasons, like switching from GSM to wifi). HTTP is a great, popular and proven choice (you get some lower levels of the stack out of your way and can focus on processing the payload).
BTW: saying "RESTful web service" in this context is meaningless - what you need is an HTTP server that serves data and accepts commands, not a mental framework for structuring your game logic as a set of stateful resources.
I think your HTTP-based plan is appropriate for this situation; I don't think the question of connection persistence is relevant for a slow turn-based game such as Bridge.
Edit: as suggested by tdreger, almost all Android docs recommend that you plan for routine connection failure and re-establishment through a different channel; as such, the HTTP connection seems the most resilient solution.
I think your idea of making the server client-independent is correct and important - in this light the HTTP approach is clearly better, in that it will be much easier to code client-side applications in other languages (which you will probably want - JavaScript for a web client and Objective-C for an iOS app).
I also think the Android development will be easier, as Android and Apache have strong support for these HTTP connections.
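To make the "no persistent connection" advice concrete, here is a minimal sketch of the client side using plain HttpURLConnection and a made-up /game/42/state endpoint; on Android this would have to run off the main thread (e.g. on a background thread):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class BridgePoller {
        // Placeholder endpoint; the server returns the current game state as JSON.
        private static final String STATE_URL = "http://bridge.example.com/game/42/state";

        public static String fetchState() throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(STATE_URL).openConnection();
            conn.setRequestMethod("GET");
            conn.setConnectTimeout(5000);
            conn.setReadTimeout(5000);
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                StringBuilder body = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) {
                    body.append(line);
                }
                return body.toString(); // parse JSON and update the UI from here
            } finally {
                conn.disconnect();
            }
        }

        public static void main(String[] args) throws Exception {
            // A slow turn-based game can simply poll every few seconds;
            // each request is a fresh, short-lived connection.
            while (true) {
                System.out.println(fetchState());
                Thread.sleep(3000);
            }
        }
    }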
I'm currently in the early stage of my internship at a company which offers VoIP solutions. I'm basically here to create a custom SIP client app for iPhone. I told them, however, that if I set up the MVC pattern correctly and with portability in mind, there would be minimal code to write when porting to different platforms.
I've chosen to go with MonoTouch (C#/.NET) for high portability and productivity (learning Objective-C is too steep for my timeframe, and its memory management too time-consuming). To gain even more portability, I've been thinking of exposing a C# SIP library as a web service, so that when porting to Android there's even less hooking up to different APIs. Also, for compilation reasons, MonoTouch does not allow the use of dynamic libraries.
My app would communicate with the SIP web service, and the web service in turn with the SIP server.
SIP is very similar to HTTP, but could this solution work? I'll be facing the Real-time Transport Protocol (RTP) as well.
Kind regards
As far as I know, it won't work because, as you mentioned, you will face RTP. You'll probably get a lot of lag in your conversations. Also, you'll have to figure out how you are going to stream the data between the clients and the server.
However, the only way to really know whether this can be done is to build a few prototypes and test these kinds of issues.
I want to develop simple two-way video call functionality and integrate it into my app.
I found two solutions:
Using Android SIP - I will need to handle sending and receiving the streams
Using XMPP (Jingle) - I will need to implement the whole protocol
The problem is that I am pretty new to SIP and don't really understand what the SIP support on Android already handles and how much development will be needed. I know, on the other hand, that XMPP on Android is not easy either, especially when working with video streams.
I would love to hear people's thoughts on which solution would be the best to implement, knowing that I want:
1. a simple working two-way video chat at first
2. to extend the functionality to a system of users (I was thinking that using XMPP with Openfire would cover this easily, but I'm kind of scared about the amount of work needed to integrate Jingle)
If you have any easier solution for integrating audio/video functionality on Android, I would be glad to hear from you.
Both solutions are the same in a lot of ways.
SIP and XMPP both take care only of the signaling. The media part (video streams, UDP, etc.) is handled "elsewhere" and with the same set of protocols: RTP and RTCP for transport and control, H.264 or VP8 for the video codec, and some other codec for voice.
I'd look into WebRTC to see if it has any available code on Android - that would take care of the media parts nicely.
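On the first option: for what it's worth, the built-in android.net.sip API already handles registration and audio calls for you, but it has no video support, so video would still be on you. A rough sketch of what the platform gives you (account details and addresses are placeholders):

    import android.app.PendingIntent;
    import android.content.Context;
    import android.content.Intent;
    import android.net.sip.SipAudioCall;
    import android.net.sip.SipManager;
    import android.net.sip.SipProfile;

    public class SipCallHelper {
        // Opens a SIP profile and places an audio call; all account details are placeholders.
        public static SipAudioCall callPeer(Context context) throws Exception {
            SipManager manager = SipManager.newInstance(context);

            SipProfile me = new SipProfile.Builder("alice", "sip.example.com")
                    .setPassword("secret")
                    .build();

            // Register the profile so incoming calls can be delivered via this broadcast.
            Intent intent = new Intent("com.example.INCOMING_CALL");
            PendingIntent pending = PendingIntent.getBroadcast(
                    context, 0, intent, PendingIntent.FLAG_IMMUTABLE);
            manager.open(me, pending, null);

            SipAudioCall.Listener listener = new SipAudioCall.Listener() {
                @Override
                public void onCallEstablished(SipAudioCall call) {
                    call.startAudio(); // audio only: the platform API has no video support
                }
            };

            // 30-second timeout for call setup.
            return manager.makeAudioCall(me.getUriString(), "sip:bob@sip.example.com", listener, 30);
        }
    }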