I've read this tutorial about transferring data in a battery-efficient way.
All the lessons are based on one simple concept: polling the server in Android is battery inefficient. For this reason, Google Cloud Messaging is introduced in order to send messages from the server to the device only when needed.
There is only one problem: I'm trying to implement a "mobile cloud", that is, a cloud composed of mobile devices, where each device can join or leave the network with high frequency. So I need some mechanism to detect when a device is no longer reachable. Until now, in all the work I've seen on the topic, the only solution was for the mobile device to periodically ping the main server to say "Hey, I'm still alive!". Obviously this approach kills the battery, but so far I haven't seen or found anything better.
Do you know of any battery-efficient solution to this problem?
There's no reason why pinging the server periodically (a heartbeat) is necessarily wasteful of the battery or inefficient. That depends upon how frequently you need to ping, and whether your ping needs to initiate its own transmission or can piggyback on some other transmission.
Let me explain. Battery inefficiency depends upon whether or not you are increasing the frequency or duration of the transceiver being in an active state. If the transceiver is continually active anyway, such as when it is continuously exchanging data or audio, then a heartbeat adds no additional burden. If it is not active, then there will be additional energy usage, but how much depends upon the frequency of your heartbeat compared to how long a ping causes the transceiver to be powered. Even then, it's probably irrelevant to your application, as I suspect "cloud" means the devices are active and connected.
Let's assume that your heartbeat is such that it will increase the duration of your transceiver being active. There are still techniques you can use to decrease this impact, such as caching your heartbeat and sending it only when it can piggyback on another transmission. Of course, such solutions depend upon whether your heartbeat is implemented in an application, the OS or the kernel.
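To illustrate the idea at the application level (the class and the marker byte below are hypothetical, just to show deferring the beat until other traffic goes out anyway):

```java
// Hypothetical app-level sketch: defer the heartbeat and piggyback it
// on the next outgoing payload instead of waking the radio just for it.
public class PiggybackHeartbeat {
    private static final long HEARTBEAT_INTERVAL_MS = 5 * 60 * 1000;
    private long lastBeatSentMs = 0;

    /** Called whenever the app is about to transmit anyway. */
    public byte[] attachHeartbeatIfDue(byte[] payload) {
        long now = System.currentTimeMillis();
        if (now - lastBeatSentMs >= HEARTBEAT_INTERVAL_MS) {
            lastBeatSentMs = now;
            // Append a one-byte "still alive" marker; the server treats
            // any traffic carrying it as a heartbeat for this device.
            byte[] withBeat = new byte[payload.length + 1];
            System.arraycopy(payload, 0, withBeat, 0, payload.length);
            withBeat[payload.length] = 0x01; // hypothetical marker byte
            return withBeat;
        }
        return payload;
    }
}
```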
I suggest you do actual tests to see if there is truly an impact on your devices.
P.S. I'm not saying the tutorial is wrong. It isn't. But it is addressing a broader and more general problem than what you have.
Related
Referencing blog posts like this one and SO questions like this one, I am going to assume that this is general behavior (and not a bug on my side). The common answer seems to be something to the effect of: "Change the BLE firmware so that it actively disconnects."
The question that is not well addressed is how Android apps handle what must be a very common occurrence: the connection is lost unexpectedly due to "range", i.e. radio signal strength.
Is there a way for an app to be notified "immediately" on the loss of connection?
It does seem unrealistic that all apps just sit there for something like the 20 seconds mentioned as a core OS timeout value. Is that what we should all be doing, even though my equivalent app on iOS detects the loss of connection in less than 1 s?
Example 1
One common type of BLE device is the "Find my keys" type. Many of them have a feature to alert the user when you leave the "keys" unintentionally. I assume that this uses the connection going down as an indicator of you walking too far away. Right?
Example 2
Your app is supposed to be notified of value changes from a characteristic on the device. This would be any kind of sensor data where some threshold is being crossed, for example. I can think of plenty of examples where you'd want to know immediately that your "sensor" is out of range.
Known Workarounds
I have seen one workaround amounting to constantly monitoring the RSSI of the BLE device, but that seems like it would eat up a lot of battery. Similarly, any failure to write to a characteristic (that normally succeeds) could also be used, again with battery life paying the price.
Something approaching a definitive answer to this question would seem like a good resource.
In the latest versions of Android the default timeout has been lowered to 5 seconds, which is of course much better.
I guess most peripherals send their own Connection Update request, in which they can set a different timeout value, which solves that issue (except for the short window between connection setup and the moment the connection update has gone through).
So called "out of range" detector apps are according to me pretty useless and hard to make good because BLE devices will probably sometimes temporarily disconnect anyway, even though they are in range.
For your actual question, whether it's possible to get notified "immediately" on loss of the connection: the basic answer is no, because the Bluetooth controller doesn't tell the main CPU that packets were lost (that's the whole purpose of having a timeout: you're allowed to lose packets). You can of course poll the RSSI, or set up a flow where the peripheral sends a notification at a constant interval and your app treats the link as dead if no notification arrives within some window. But in that case it's smarter to just set your own timeout from the peripheral, using the connection parameter update procedure.
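As a rough sketch of the RSSI-polling approach on the Android side (readRemoteRssi() and onReadRemoteRssi() are real BluetoothGatt APIs; the interval and the surrounding structure are my own assumptions):

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import android.os.Handler;
import android.os.Looper;

public class RssiPoller extends BluetoothGattCallback {
    private static final long POLL_INTERVAL_MS = 2000; // assumed interval
    private final Handler handler = new Handler(Looper.getMainLooper());
    private BluetoothGatt gatt;

    private final Runnable pollRunnable = new Runnable() {
        @Override public void run() {
            gatt.readRemoteRssi(); // result arrives in onReadRemoteRssi()
            handler.postDelayed(this, POLL_INTERVAL_MS);
        }
    };

    public void start(BluetoothGatt gatt) {
        this.gatt = gatt;
        handler.post(pollRunnable);
    }

    public void stop() {
        handler.removeCallbacks(pollRunnable);
    }

    @Override
    public void onReadRemoteRssi(BluetoothGatt gatt, int rssi, int status) {
        if (status != BluetoothGatt.GATT_SUCCESS) {
            // Read failed: treat it as a hint that the link may be gone.
        }
        // Otherwise, compare rssi against your own "out of range" threshold.
    }
}
```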
I have to exchange data between two Bluetooth devices, one of which will be an Android device. For simplicity's sake, you can assume the other device is a generic Linux device running BlueZ, producing data similar to what a fitness tracker would produce.
The scenario seems a straightforward use case for Bluetooth Low Energy. The problem I am currently running into comes from the fact that communication has to be reliable (reliable in the way TCP is reliable). This means:
no losses
no corruption of data
order needs to be preserved
no duplicates
no phantom packets
While losses are prevented at the link-layer level, ordering, for instance, does not seem to be explicitly guaranteed when working with Low Energy (using indications would probably achieve this).
Not having done a lot of work with Bluetooth, I am currently quite overwhelmed by the number of options, while at the same time no option seems to fit the bill nicely.
Is there a "best-practice" for setting up reliable communication between two bluetooth devices? A Bluetooth Low Energy solution would be preferable, but is not mandatory.
Once your Bluetooth connection is set up, it's reliable, so you don't have to worry about data loss or corruption.
So the things you're worried about can easily be handled on your side. You'll get proper connection and disconnection callbacks by setting up a BroadcastReceiver for your BluetoothAdapter.
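For example, a minimal sketch of such a receiver using the ACL connect/disconnect broadcasts (the actions are real Android APIs; the wiring around them is just an illustration):

```java
import android.bluetooth.BluetoothDevice;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

public final class AclStateReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        // Tells you which peer changed state.
        BluetoothDevice device =
                intent.getParcelableExtra(BluetoothDevice.EXTRA_DEVICE);
        if (BluetoothDevice.ACTION_ACL_CONNECTED.equals(intent.getAction())) {
            // Link is up: safe to (re)start the transfer.
        } else if (BluetoothDevice.ACTION_ACL_DISCONNECTED.equals(intent.getAction())) {
            // Link dropped: reconnect and resend once it is back.
        }
    }

    /** Call once, e.g. from an Activity's or Service's onCreate(). */
    public static AclStateReceiver register(Context context) {
        IntentFilter filter = new IntentFilter();
        filter.addAction(BluetoothDevice.ACTION_ACL_CONNECTED);
        filter.addAction(BluetoothDevice.ACTION_ACL_DISCONNECTED);
        AclStateReceiver receiver = new AclStateReceiver();
        context.registerReceiver(receiver, filter);
        return receiver;
    }
}
```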
In case of a disconnection you may have to restart the connection procedure, and once it's established properly you can resend the data.
I don't know your exact purpose, but one thing I should mention: I would not recommend Bluetooth communication if you're holding the connection for a long time. Some devices disconnect automatically after a while if there's no continuous transmission.
Android has Bluetooth support, but it only allows you to send or receive data via streams. There is a very good sample project from Google: https://github.com/googlesamples/android-BluetoothChat . The only drawback of this sample is that it uses a Handler to notify about Bluetooth events. I changed it a bit so that it uses another thread and calls methods of an interface you set; take a look at this project: https://github.com/AlexShutov/LEDLights . This is classic Bluetooth, not BLE; hope it helps.
Android's BLE stack is as reliable as the link-layer specification, so you can use "write without response" in one direction and notifications in the other. Just make sure the peripheral side does not drop incoming writes.
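On the Android (central) side that boils down to something like the following; these are real BluetoothGatt calls, but the characteristic roles and the condensed flow are my own sketch:

```java
import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattDescriptor;
import java.util.UUID;

public class BleDuplex {
    // Standard Client Characteristic Configuration descriptor (CCCD) UUID.
    private static final UUID CCCD =
            UUID.fromString("00002902-0000-1000-8000-00805f9b34fb");

    /** Phone -> peripheral: unacknowledged write for throughput. */
    public void send(BluetoothGatt gatt, BluetoothGattCharacteristic tx,
                     byte[] data) {
        tx.setWriteType(BluetoothGattCharacteristic.WRITE_TYPE_NO_RESPONSE);
        tx.setValue(data);
        gatt.writeCharacteristic(tx);
    }

    /** Peripheral -> phone: enable notifications on the RX characteristic. */
    public void enableNotifications(BluetoothGatt gatt,
                                    BluetoothGattCharacteristic rx) {
        gatt.setCharacteristicNotification(rx, true);
        BluetoothGattDescriptor cccd = rx.getDescriptor(CCCD);
        cccd.setValue(BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE);
        gatt.writeDescriptor(cccd); // peripheral starts notifying after this
    }
}
```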
BLE uses a 24-bit CRC. For the amount of data transmitted over BLE, that CRC is quite robust and the probability of undetected corruption is very low (note that TCP's checksum is 16-bit and the Ethernet CRC is 32-bit; see http://www.evanjones.ca/tcp-and-ethernet-checksums-fail.html).
Ordering issues in wired networks are a result of packets being routed along different paths to the same destination (please see "If TCP is connection oriented why do packets follow different paths?"). They are also partially due to the use of a sliding-window acknowledgement protocol, which allows a number of packets to be in flight before being acknowledged. In BLE there is no routing, and the acknowledgement scheme is a variation of stop-and-wait ARQ (a 2-bit lazy acknowledgement), which means a new packet cannot be sent before the previous one is acknowledged. These two factors make out-of-order delivery highly unlikely.
If I receive a GCM message in a BroadcastReceiver and then do some very basic logic, how much battery will that use? I am thinking around ~30 messages per day. What about data?
EDIT: I understand that different devices have different battery sizes and CPU efficiencies, etc. I'm not asking for a precise percentage used, more just "you should worry about that" vs "you don't have to worry".
The logic involves reading a long from the DB (the last sync time) and comparing it with the current time.
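Concretely, something like this (a minimal sketch; the preference name and the staleness threshold are made up):

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.SharedPreferences;

public class GcmMessageReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        SharedPreferences prefs =
                context.getSharedPreferences("sync", Context.MODE_PRIVATE);
        long lastSync = prefs.getLong("last_sync_time", 0); // made-up key
        long now = System.currentTimeMillis();
        if (now - lastSync > 5 * 60 * 1000) { // made-up threshold
            // Data is stale: kick off a sync (not shown).
        }
        // Either way, this handler does only microseconds of CPU work.
    }
}
```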
The messages will mostly all be received in a span of ~4 minutes.
EDIT2: I guess what I'm asking is: is it worth limiting or batching the push notifications my server sends? Ideally, ignoring battery life, I would want my server to send a push notification every time the data gets updated. The phone can then decide whether or not to actually pull the updates from the server. If it decides not to pull, then at least the phone has knowingly chosen not to update the data. Whereas if I limit how often my server sends push notifications announcing new data, the phone may think its data is up to date when it actually isn't.
It really depends. GCM can be received over Wifi, or over a cellular connection. The cellular radio consumes much more power than Wifi. Also there's a "warm up" and "keep alive" after each use of the radio, so sending/receiving things in one big burst is more energy efficient than sending them spaced a few minutes apart. This Youtube video from GoogleDevelopers has some more information about how to optimize network access to minimize battery consumption.
That said, the exact amount of power used will likely depend on the handset, distance to tower/Wifi access point, network protocol used (802.11g, CDMA, GSM, etc), and other factors.
"you don't have to worry" is the answer. GCM services is one process which manage push notification for all android app. It is for sure better than many different apps which ask continuosly to a remote server for new messages. It will use some battery percentage, but less than any other way
I've been given the task to create an iBeacon client for Android. The purpose of the client is to transmit the ID of the closest beacon over WIFI/3g/4g to a server when another client requests the user's position.
Now, I wonder, what is the best way of doing this, with respect to battery consumption?
There are (as far as I can see) two approaches:
1). Whenever the ID of the closest beacon changes, upload the new ID directly to the server.
2). Store the ID of the closest beacon locally on the client's phone. Only upload it to the server when the server requests the client to do so.
Option 1) means that the radio has to be activated every time the user approaches a new iBeacon. Option 2), on the other hand, requires the phone to listen on a port in order to handle requests from the server.
EDIT: I read about the Radio State Machine here: http://developer.android.com/training/efficient-downloads/efficient-network-access.html
If I keep a ServerSocket running in the background, will this keep my phone in the Radio Low Power state? It seems to me that option 1) will put the phone into full radio power every time a beacon ID is transmitted, but after that the phone will fall back into Low Power/Sleep mode, which will save battery. Option 2), on the other hand, will prevent the phone from ever reaching Sleep mode, since I have to keep a persistent connection to the server.
I would go with option one because it is simpler and less brittle. If you are really worried about battery, just place limits on how often it can talk to the server (e.g. 10 times per hour) so it does not get out of hand.
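Such a limit is only a few lines. A minimal sketch, with the interval chosen to allow at most ~10 uploads per hour (the class and method names are made up):

```java
public class UploadThrottle {
    private static final long MIN_INTERVAL_MS = 6 * 60 * 1000; // ~10x/hour
    private long lastUploadMs = 0;

    /** Returns true if this beacon ID change should be uploaded now. */
    public synchronized boolean shouldUpload() {
        long now = System.currentTimeMillis();
        if (now - lastUploadMs >= MIN_INTERVAL_MS) {
            lastUploadMs = now;
            return true;
        }
        return false; // too soon: skip this report, keep the latest ID locally
    }
}
```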
Other things may use battery even more, so test first and make sure this really requires optimization. For example, run the app in the background for 10 hours and see how much it drains a fully charged battery versus when the app is not reporting to the server. If it is only a few percent, then it is probably efficient enough.
I'm considering using a persistent connection to a "cloud service" from an Android app. This would run all the time in a background service (or something like that).
I'm thinking of using web sockets or XMPP to keep the connection open, basically looking for a lightweight connection that won't drain the battery. I want to be able to push notifications in real time over this connection, so periodic polling is not desired. I am aware of C2DM and other commercial solutions, but am looking to roll my own. This is why a web socket (or other lightweight connection) is what I'm investigating. So if I go this route, what are some best practices I should be aware of?
I'm thinking of stuff like:
how to prevent the battery from draining,
how to handle IP address changes, etc.?
This might not be the answer you are looking for but I think you may want to rethink your architecture.
Things you can expect out of a mobile platform
Your IP address to change randomly
Your physical internet connection to be lost randomly
The OS to decide you're not doing anything useful and kill your process
The connection type changing randomly (from WIFI to 4G to 3G to edge) and thus your IP to change
Basically your app needs to be able to handle a loss of connection, because it's almost guaranteed to happen.
That being said, it is totally doable depending on your definition of real-time. If you're willing to continually check that there is still a viable connection, you could keep any delays down to the minutes range. But this will drain the battery, and there is not much you can do about it.
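For example, here is a minimal sketch of such a self-healing connection with exponential backoff; OkHttp's WebSocket API, the URL, and the delay bounds are my assumptions, not something prescribed by the question:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.WebSocket;
import okhttp3.WebSocketListener;

public class ReconnectingSocket extends WebSocketListener {
    private static final String URL = "wss://example.com/push"; // placeholder
    private final OkHttpClient client = new OkHttpClient();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private long backoffMs = 1000;

    public void connect() {
        client.newWebSocket(new Request.Builder().url(URL).build(), this);
    }

    @Override public void onOpen(WebSocket webSocket, Response response) {
        backoffMs = 1000; // link is healthy again: reset the backoff
    }

    @Override public void onFailure(WebSocket webSocket, Throwable t,
                                    Response response) {
        // Covers dropped Wifi, IP changes, server restarts, etc.
        scheduler.schedule(this::connect, backoffMs, TimeUnit.MILLISECONDS);
        backoffMs = Math.min(backoffMs * 2, 5 * 60 * 1000); // cap at 5 min
    }
}
```

The backoff is what keeps the battery cost bounded: a flaky link triggers a handful of retries, not a tight reconnect loop.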
Some things just don't go well together, namely "push notifications in real time" and "preventing battery drain". You will have to make compromises here.
I can only recommend trying some Android apps that use XMPP to get a feeling for how they handle persistent connections, IP address changes and battery consumption. If they are open source you can also read the code and learn from it. Yaxim, Project MAXS and Beem, to name a few. Maybe you should also have a look at XEP-0286: XMPP on Mobile Devices.
That said, are you sure you want to reinvent the wheel when Google offers you C2DM, which is optimized for exactly this use case? It has some delay, so it's nowhere near "real-time". But again, either you end up with a solution that aggressively tries to keep a persistent connection and drains the battery, or you have to live with some kind of delay (~0-30 min).