Connect two Android apps via HTTP REST and EJB3 - android

I'm writing an Android application in which two devices communicate with each other over the internet. In addition, they also communicate with an EJB3 server via REST, so I decided to kill two birds with one stone and use REST+EJB3 for transferring data between the two paired Android devices.
The scenario I implemented looks like this:
1. Both devices connect to the server and acquire a session id.
2. The first device sends data intended for the second device.
3. The server receives the data but does not end the HTTP request; instead it puts the request into a waiting pool.
4. The second device asks for the data.
5. The server transfers the data to the second device and releases the waiting connection (and thread) of the first device.
If there is no data from the other device, the waiting request times out on the server side and the client sends it again. We have to wait for the data on the server side so that a response can be returned immediately once the data arrives.
In this scheme I see two drawbacks:
- Waiting threads on the server side: they consume server resources and, as a result, limit server throughput.
- If the server thread does not wait for an answer until a timeout, the client has to repeat its requests over and over and wastes a lot of traffic.
What is the best-practice solution for this kind of problem?
P.S.: I forgot to mention that the two devices should exchange data as smoothly and quickly as possible.
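For reference, the waiting-pool step in the scenario above could be sketched with JAX-RS 2.0 asynchronous responses, which park the request without tying up a container thread while the peer has not sent anything yet. The resource path, the 30-second timeout, and the in-memory map are assumptions made only for this sketch:

```java
// Sketch only: a long-polling exchange endpoint using JAX-RS 2.0 async
// responses, so the container thread is released while a device waits.
// The path, timeout value, and in-memory map are assumptions for this sketch.
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;

@Path("/exchange")
public class ExchangeResource {

    // sessionId -> response of the device currently waiting for data
    private static final ConcurrentHashMap<String, AsyncResponse> WAITING =
            new ConcurrentHashMap<>();

    // The receiving device asks for data and is parked until data arrives
    // or the timeout fires (after which the client simply polls again).
    @GET
    @Path("{sessionId}")
    public void poll(@PathParam("sessionId") String sessionId,
                     @Suspended AsyncResponse response) {
        response.setTimeout(30, TimeUnit.SECONDS);
        response.setTimeoutHandler(r -> r.resume("{}")); // empty payload on timeout
        WAITING.put(sessionId, response);
    }

    // The sending device posts data; if its peer is parked, release it immediately.
    @POST
    @Path("{sessionId}")
    public void push(@PathParam("sessionId") String sessionId, String payload) {
        AsyncResponse waiting = WAITING.remove(sessionId);
        if (waiting != null) {
            waiting.resume(payload);
        }
        // otherwise the payload would have to be buffered until the peer polls again
    }
}
```

With this shape the "waiting thread" drawback is largely mitigated, because the container thread is returned to its pool while the AsyncResponse is suspended.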

You will need to use C2DM
http://android-developers.blogspot.com/2010/05/android-cloud-to-device-messaging.html
When a message needs to be sent from A to B, A should connect to the server and, depending on the kind/amount of data, the server will either push the data via C2DM or just tell device B to come back and grab it.
I would store the data on the server anyway; if the push fails, you can retry it. There is no need to reinvent the wheel: most of these issues are already solved in C2DM.

Related

SignalR on Azure App Service with ARR Disabled

Our server scales out to 1-3 instances during specific periods of the day. We use an Azure Redis backplane for SignalR connection persistence. In addition, the server does not have ARR Affinity enabled. By the way, we are using Server-Sent Events for Android and WebSockets for iOS.
The problem is that our mobile users (moto couriers) frequently disconnect from and reconnect to the SignalR server because of their provider when the mobile signal is low.
We have checked everything on the mobile side and are pretty sure we have one and only one SignalR connection at a time. In addition, when they connect we store their connection ids in persistent storage (a SQL database).
When sending a message to a user, we pick the latest connection id stored in the database, so we only ever send to a single connection id per client.
However, we get feedback that messages sent from the server sometimes pop up twice on their phones (most of the time messages are received twice during rush hours, when the server has 2 or 3 instances).
We have not been able to trace why messages are received twice, especially during rush hours.
The question is: is there any chance this is related to ARR Affinity? The Redis backplane uses publish/subscribe, and since couriers disconnect/reconnect frequently they may end up connected to different servers; so when the server sends a message, two servers might try to deliver it and it pops up on the phone twice, even though the client has only one connection.
Additional info:
SignalR DisconnectTimeout = 60 seconds
SignalR KeepAlive = 20 seconds
This does seem to be the reason. When a new connection request is made, you might need to drop the existing connection tracked by the other server, using replication.
If the replication interval is short enough, it will minimise the number of duplicates; for the rest, you might need to solve it on the client side by ignoring a notification whose hash/id has already been received, as sketched below.
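A minimal sketch of that client-side de-duplication, assuming each notification carries some id or hash; the class name and the 200-entry window are made up for illustration:

```java
// Sketch only: remember the ids of recently delivered notifications and drop repeats.
// The message-id field and the window size are assumptions for illustration.
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class NotificationDeduplicator {

    // Bounded, insertion-ordered set of recently seen message ids.
    private final Set<String> seenIds = Collections.newSetFromMap(
            new LinkedHashMap<String, Boolean>() {
                @Override
                protected boolean removeEldestEntry(Map.Entry<String, Boolean> eldest) {
                    return size() > 200; // keep only the most recent ids
                }
            });

    /** Returns true if the message has not been shown before. */
    public synchronized boolean shouldDisplay(String messageId) {
        return seenIds.add(messageId); // add() is false if the id was already present
    }
}
```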

Detecting online status of device and sending back to server

Hi, I am developing an Android application in which I want to show whether the other person is online or not, so that a person can initiate the communication. I have thought about a few solutions:
1) Implement a heartbeat mechanism, in which the device sends a ping request to the server at a fixed interval.
2) The server sends a push-type ping to the client and the client responds to it, so the server knows the client is online.
The first approach causes battery and data issues, while the second introduces push delays that affect the process.
Is there a better solution to this problem, apart from these or an improvised version of the above?
nilkash, virtually any method for checking network connectivity will in the end result in periodic pings between the device and the server. Even a push-type ping actually does the same thing (but it saves battery because push notifications aggregate messages for all applications into a single connection to a Google server). So the best solution is a proper combination of optimizations, and you have to choose them depending on your requirements:
- Server pushes are power efficient, mostly because they reuse the same connection for all applications, but the delay can be huge, something like 10 minutes.
- You can subscribe to connectivity events and send an "online" message to the server once you are online (but not once you are offline, because you are... offline). This gives you immediate online events.
- Do not send pings from the device when there is no connectivity. Your application should be absolutely idle so as not to use battery.
- There is no easy way to find out on the server side when a client goes offline. You have to trade traffic/battery for time resolution: the more often you send pings, the better the resolution. But you can't change the ping interval for pushes, so if you need better resolution you need to use your own connection. You can send other useful data through that connection too.
- If you keep a TCP connection open, your pings can be very data efficient: TCP keep-alive packets are just 60/54 bytes. But then you have to keep connections open with all clients on the server, which may be a problem if you have a lot of clients.
The best combination may be something like this: you always send an "online" message from the client when it becomes online (see the sketch below). You keep a TCP connection open while the application is in the foreground and use the same connection to transfer data to and from the application. When the application goes to the background, you fall back to power-efficient push-based pings and do them on a 10-minute basis.
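A minimal sketch of the "send an online message when connectivity returns" part, assuming a classic BroadcastReceiver for connectivity changes; reportOnline() is a hypothetical placeholder for whatever transport you use:

```java
// Sketch only: when connectivity comes back, report "online" to the server.
// The reportOnline() call is a hypothetical placeholder.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;

public class ConnectivityReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo info = cm.getActiveNetworkInfo();
        if (info != null && info.isConnected()) {
            // We just came online: tell the server immediately (off the main thread).
            new Thread(new Runnable() {
                @Override
                public void run() {
                    reportOnline();
                }
            }).start();
        }
        // No "offline" message is sent here: when we lose connectivity we
        // cannot reach the server anyway, as noted above.
    }

    private void reportOnline() {
        // Hypothetical: POST an "online" heartbeat to your presence endpoint.
    }
}
```

The receiver would be registered for the android.net.conn.CONNECTIVITY_CHANGE action, either in the manifest or in code.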

Android client server app sharing gps data

I am developing an Android application that allows users to share location data with each other and shows it on a map.
I've done it and it's working, but I am looking for better performance, or just a better pattern for an app like this.
For now, my app uses an HTTPS connection between the Android client and a REST servlet on Tomcat.
While a user is logged in, he sends his GPS data to the server every 10 seconds and gets the other users' positions, everything via HTTP POST.
The first thing I want to ask you:
Does anyone know a better solution for syncing data with the server than an Android service with a timed task every 10 seconds? I have a background service that runs all the time and every 10 seconds starts an AsyncTask that asks the server for the users' locations.
And what do you think about the connection method with the server?
Maybe it would be better to use a socket connection?
Thanks in advance for all responses.
You're right: there is no need to establish an HTTP connection every 10 seconds. It's quite inefficient for the clients and your server in terms of CPU usage and data transfer.
There are two solutions I consider more appropriate for your task:
Yes, sockets. A socket connection can be implemented fairly easily on Android: two threads share the connection, one reading data and one writing data (see the sketch below). To receive data while the phone is asleep you could use the GCM service.
A P2P connection. This is quite reasonable because, as I understand it, you don't need to modify or track the transferred data on your server; the clients will just communicate with each other.
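A rough sketch of the first option, assuming a simple line-based protocol; the host, port, and message format are placeholders:

```java
// Sketch only: a socket client with one reader thread and one writer thread.
// The line-based "lat,lon" protocol is an assumption for illustration.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class LocationSocketClient {

    private final BlockingQueue<String> outgoing = new LinkedBlockingQueue<>();

    public void start(final String host, final int port) throws IOException {
        final Socket socket = new Socket(host, port);

        // Reader thread: blocks on incoming data (other users' positions).
        new Thread(new Runnable() {
            @Override public void run() {
                try (BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream()))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        handlePosition(line);
                    }
                } catch (IOException ignored) { }
            }
        }).start();

        // Writer thread: sends our own GPS fixes as they are queued.
        new Thread(new Runnable() {
            @Override public void run() {
                try (PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
                    while (!Thread.currentThread().isInterrupted()) {
                        out.println(outgoing.take());
                    }
                } catch (IOException | InterruptedException ignored) { }
            }
        }).start();
    }

    public void sendPosition(double lat, double lon) {
        outgoing.offer(lat + "," + lon);
    }

    private void handlePosition(String line) {
        // Hypothetical: parse "lat,lon" and update the map marker.
    }
}
```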

Android streaming text

I'm creating an app that has to show the live position of some vehicles. Their position is obtained via GPS on a Raspberry Pi and sent to my server, where it is converted to a JSON file. On the Android device I am creating an app that parses this file and lets the user see the vehicle positions.
I am downloading it via HTTP, and I think that is a good way of solving the problem. But my boss insists that it could be done by streaming (because, as he says, it is not necessary to download data from the server when a vehicle has not moved for about 30 minutes).
How can I set things up so that the Android device is not downloading data but waiting for the server to send it? Is that even possible?
As far as I know, streaming means constantly sending data to the target device, which is constantly receiving it.
The only way I can think of is to run a server on every Android device, send its details (IP, port, etc.) to my server, and from my server connect to every device and send positions only when a vehicle is moving, but this is costly and, I think, not the proper way.
Any ideas or help?
The best way to handle your problem is to implement GCM and work with push notifications.
If the connection is not available at the moment, the message will be sent once the connection is OK again.
If the receiver isn't available at the moment, the same thing happens.
It consumes fewer resources than constant HTTP calls.
See more information at https://developer.android.com/google/gcm/index.html
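As a rough illustration of this approach, the server would push a new position through GCM's HTTP endpoint only when a vehicle actually moves; the API key, registration id, and payload fields below are assumptions for the sketch:

```java
// Sketch only: push a vehicle's new position to a device via the legacy GCM
// HTTP endpoint. SERVER_API_KEY and the registration id are placeholders,
// and the JSON is built by hand just to keep the sketch self-contained.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class GcmPositionPusher {

    private static final String GCM_URL = "https://android.googleapis.com/gcm/send";
    private static final String SERVER_API_KEY = "YOUR_SERVER_API_KEY"; // hypothetical

    public static int pushPosition(String registrationId, double lat, double lon)
            throws Exception {
        String body = "{\"to\":\"" + registrationId + "\","
                + "\"data\":{\"lat\":\"" + lat + "\",\"lon\":\"" + lon + "\"}}";

        HttpURLConnection conn = (HttpURLConnection) new URL(GCM_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "key=" + SERVER_API_KEY);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode(); // 200 means GCM accepted the message
    }
}
```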

How does a server handle web service requests from multiple clients

I just completed an Android application that uses web services to connect to a remote database. I was working on localhost.
Now, I plan to host my web services on a server. Let's say I have my Android application installed on any number of different client smartphones. Each smartphone user calls the web service at the same time.
Now how does the server handle these requests? Does it execute one thread per request? I want to know about the server-side processing in detail. Considering that all phones use GPRS, will there be any sort of delay in such a situation?
BTW, my web services are all SOAP-based and the database server I plan to use later will be SQL Server. I have used the .NET Framework for creating the web services.
This is about the general concept, not Android-specific:
Usually, each of the users sends an HTTP request for the page. The server receives the requests and delegates them to different workers (processes or threads).
Depending on the URL given, the server reads a file and sends it back to the user. If the file is a dynamic file such as a PHP script, the file is executed and its output is sent back to the user.
Once the requested file has been sent back, the server usually closes the connection after a few seconds.
Look at How Web Servers Work
EDIT:
HTTP uses TCP, which is a connection-based protocol. That is, clients establish a TCP connection while they're communicating with the server.
Multiple clients are allowed to connect to the same destination port on the same destination machine at the same time. The server just opens up multiple simultaneous connections.
Apache (and most other HTTP servers) has a multi-processing module (MPM). This is responsible for allocating Apache threads/processes to handle connections. These processes or threads can then run in parallel on their own connections without blocking each other. Apache's MPM also tends to keep "spare" threads or processes open even when no connections are active, which helps speed up subsequent requests.
Note:
One of the most common issues with multi-threading is race conditions: two requests are doing the same thing ("racing" to do it), and if a single resource is involved, one of them is going to win. If they both insert a record into the database, they can't both get the same id; one of them will win. So when writing code you need to keep in mind that other requests are going on at the same time and may modify your database, write files, or change globals.
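As a tiny illustration, assuming the id were generated in application code rather than by the database: a plain shared counter is exactly the kind of global that concurrent requests can corrupt, while an atomic counter hands out each id exactly once:

```java
// Sketch only: two request threads incrementing a plain int can read the
// same value, whereas AtomicInteger hands out each id exactly once.
import java.util.concurrent.atomic.AtomicInteger;

public class IdRace {

    private static int unsafeNextId = 0;                          // shared global
    private static final AtomicInteger safeNextId = new AtomicInteger();

    static int unsafeNewId() {
        return ++unsafeNextId;       // read-modify-write: two threads may get the same id
    }

    static int safeNewId() {
        return safeNextId.incrementAndGet(); // atomic: every caller gets a distinct id
    }
}
```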
The server will maintain a thread pool listening for incoming requests. Upon receiving a request, a thread will process it and return the response. If requests all arrive at the same time and there are fewer of them than the maximum number of threads in the pool, they will all be serviced in parallel (though the actual processing will be interleaved depending on the number of cores/CPUs). If there are more requests than threads, requests will be queued (waiting for a connection) until either a thread frees up or the client request times out.
If you're connecting to the service from a mobile network, there is higher latency in the initial connection but not enough to make a difference.
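A minimal sketch of that thread-pool behaviour, using a plain socket server and a fixed pool purely for illustration (real servlet containers and IIS do essentially the same thing with their own configuration):

```java
// Sketch only: a fixed pool services up to POOL_SIZE requests in parallel;
// anything beyond that waits in the executor's queue until a thread frees up.
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PooledServer {

    private static final int POOL_SIZE = 50;

    public static void main(String[] args) throws IOException {
        ExecutorService pool = Executors.newFixedThreadPool(POOL_SIZE);
        try (ServerSocket server = new ServerSocket(8080)) {
            while (true) {
                Socket client = server.accept();          // incoming request
                pool.submit(() -> handle(client));        // queued if all threads are busy
            }
        }
    }

    private static void handle(Socket client) {
        try (Socket c = client) {
            // Hypothetical: parse the request and write a response.
        } catch (IOException ignored) { }
    }
}
```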
Your question is not really about Android but about mobile development with a web backend.
I don't know how to use .NET for server app development, but taking Apache/PHP/MySQL as an example, each request is handled in a separate thread.
There might be small latency delays while the request travels to the server, but this shouldn't affect the time your server takes to process the request and the data.
One thing to think about is avoiding multiple simultaneous requests from the same client. This is a common implementation problem: since no data has been returned yet, you think there is no pending request and you launch a new one. This just creates unnecessary load on your server (see the sketch below).
Hope that helps!
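A minimal sketch of guarding against that, with callBackend() as a hypothetical placeholder for the actual request: a flag ensures a second request is not started while one is still pending:

```java
// Sketch only: allow at most one request in flight at a time.
import java.util.concurrent.atomic.AtomicBoolean;

public class SingleFlightClient {

    private final AtomicBoolean requestInFlight = new AtomicBoolean(false);

    public void refresh() {
        // Only start a new request if none is currently pending.
        if (!requestInFlight.compareAndSet(false, true)) {
            return; // a request is already running; don't pile another on the server
        }
        new Thread(new Runnable() {
            @Override public void run() {
                try {
                    callBackend();              // hypothetical network call
                } finally {
                    requestInFlight.set(false); // allow the next refresh
                }
            }
        }).start();
    }

    private void callBackend() {
        // Hypothetical: perform the HTTP request and handle the response.
    }
}
```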
a) One instance of the web service (for example, a Spring Boot microservice) runs on the server machine and listens on a port such as 80.
b) This web service (the Spring Boot app) needs a servlet container, usually Tomcat, which has a thread pool configured.
c) Whenever requests come in from different users simultaneously, the container assigns a thread from the pool to each incoming request.
d) Since the server-side web service code consists of beans (in the case of Java) that are mostly singletons, every request thread calls the same singleton APIs. If database access is needed, each request's database work is wrapped in its own transaction via the @Transactional annotation, so concurrent requests do not interfere with each other's database operations (a sketch follows below).
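A minimal sketch of points (a)-(d), assuming Spring Boot with an auto-configured DataSource; the endpoint, table, and column names are made up for illustration:

```java
// Sketch only: a singleton controller/service pair shared by all request
// threads from the container's pool; @Transactional gives each request its
// own database transaction. Endpoint and table names are placeholders.
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
class OrderController {

    private final OrderService service;

    OrderController(OrderService service) {  // singleton bean, injected once at startup
        this.service = service;
    }

    @PostMapping("/orders")
    String create(@RequestParam String item) {
        service.save(item);                  // called concurrently by many request threads
        return "ok";
    }
}

@Service
class OrderService {

    private final JdbcTemplate jdbc;

    OrderService(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    @Transactional                           // each request runs in its own transaction
    public void save(String item) {
        jdbc.update("INSERT INTO orders(item) VALUES (?)", item);
    }
}
```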
