I've been given the task of creating an iBeacon client for Android. The purpose of the client is to transmit the ID of the closest beacon over WiFi/3G/4G to a server when another client requests the user's position.
Now, I wonder, what is the best way of doing this, with respect to battery consumption?
There are (as far as I can see) two approaches:
1). Whenever the ID of the closest beacon changes, upload the new ID directly to the server.
2). Store the ID of the closest beacon locally on the client's phone. Only upload it to the server when the server requests the client to do so.
Option 1) means that WiFi has to be activated every time the user approaches a new iBeacon. On the other hand, option 2) requires the phone to listen on a port in order to handle requests from the server.
EDIT: I read about the Radio State Machine here: http://developer.android.com/training/efficient-downloads/efficient-network-access.html
If I keep a ServerSocket running in the background, will this keep my phone in Radio Low Power? It seems to me that option 1) will put the phone into full radio power every time a beacon ID is transmitted, but after that the phone will fall back into Low Power/Sleep mode, which will save battery. Option 2), on the other hand, will prevent the phone from ever reaching Sleep mode, since I have to keep a persistent connection to the server.
I would go with option one because it is simpler and less brittle. If you are really worried about battery, just place limits on how often it can talk to the server (e.g. 10x per hour) so it does not go crazy.
Other things may use even more battery, so test first and make sure this really requires optimization. For example, run this in the background for 10 hours and see how much it drains a fully charged battery vs. when the app is not reporting to the server. If it is only a few percent, then it is probably efficient enough.
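The "limit how often it talks to the server" idea can be sketched as a sliding-window rate limiter. This is a minimal illustration, not a definitive design; the class name and the 10-per-hour cap are assumptions for the example:

```java
import java.util.ArrayDeque;

/** Caps how often the closest-beacon ID may be uploaded (e.g. 10x per hour). */
class UploadRateLimiter {
    private final int maxUploads;
    private final long windowMillis;
    private final ArrayDeque<Long> timestamps = new ArrayDeque<>();

    UploadRateLimiter(int maxUploads, long windowMillis) {
        this.maxUploads = maxUploads;
        this.windowMillis = windowMillis;
    }

    /** Returns true if an upload is allowed now (and records it). */
    synchronized boolean tryUpload(long nowMillis) {
        // Drop timestamps that have fallen out of the sliding window.
        while (!timestamps.isEmpty()
                && nowMillis - timestamps.peekFirst() >= windowMillis) {
            timestamps.pollFirst();
        }
        if (timestamps.size() >= maxUploads) {
            return false; // over the cap; cache the ID locally and retry later
        }
        timestamps.addLast(nowMillis);
        return true;
    }
}
```

When `tryUpload` returns false, the client would simply keep the latest beacon ID locally and send it once the window frees up, so the radio is not woken for every beacon change.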
Related
Referencing blog posts like this one, and SO questions like this one, I am going to assume that this is general behavior (and not a bug on my side). The common answer seems to be something to the effect of: "Change the BLE firmware so it actively disconnects."
The question which is not well addressed is how Android apps handle what must be a very common occurrence: the connection is lost unexpectedly due to "range", i.e. radio signal strength.
Is there a way for an app to be notified "immediately" on the loss of connection?
It does seem unrealistic that all apps just sit there for something like the 20 seconds mentioned as a core OS timeout value. Is that what we should all be doing, even though my equivalent app on iOS knows about the loss of connection in less than 1 s?
Example 1
One common type of BLE device is the "Find my keys" type. Many of them have a feature to alert the user when you leave the "keys" unintentionally. I assume that this uses the connection going down as an indicator of you walking too far away. Right?
Example 2
Your app is supposed to be notified of value changes from a characteristic on the device. This would be any kind of sensor data where some threshold is being crossed, for example. I can think of plenty of examples where you'd want to know immediately that your "sensor" is out of range.
Known Workarounds
I have seen one workaround amounting to constantly monitoring the RSSI to the BLE device but that seems like it would eat up a lot of battery. Similarly any failure to write to a characteristic (that normally succeeds) could also be used, again with battery life paying a price.
Something approaching a definitive answer to this question seems like a good resource.
In the latest versions of Android, the default timeout was lowered to 5 seconds, which is of course much better.
I guess most peripherals send their own Connection Update request where they can set a different timeout value, which solves that issue (except for the small time between connection setup and until the connection update has gone through).
So-called "out of range" detector apps are, in my opinion, pretty useless and hard to make good, because BLE devices will sometimes temporarily disconnect anyway, even though they are in range.
As for your actual question, whether it's possible to be notified "immediately" of the loss of the connection: the basic answer is no, because the Bluetooth controller doesn't tell the main CPU that packets were lost (that's the whole purpose of having a timeout, so that packet losses are tolerated). You can of course try to poll the RSSI, or set up a flow where the peripheral sends a notification at a constant interval and your app detects when a notification has not arrived within some amount of time. But in that case it's smarter to just set your own timeout (from the peripheral) using the connection parameter update procedure.
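The notification-based detection flow described above can be sketched as a small watchdog. Timestamps are passed in explicitly so the logic is easy to test; in a real app you would call `onNotification` from the characteristic-changed callback and poll `isPresumedDisconnected` from a scheduled task. All names here are illustrative:

```java
/** Tracks notifications from a peripheral and flags a presumed link loss
 *  when none arrive within the expected interval plus a grace margin. */
class NotificationWatchdog {
    private final long timeoutMillis;
    private volatile long lastSeenMillis;

    NotificationWatchdog(long timeoutMillis, long nowMillis) {
        this.timeoutMillis = timeoutMillis;
        this.lastSeenMillis = nowMillis;
    }

    /** Call whenever a notification arrives from the peripheral. */
    void onNotification(long nowMillis) {
        lastSeenMillis = nowMillis;
    }

    /** Poll periodically; true means the peripheral has gone silent too long. */
    boolean isPresumedDisconnected(long nowMillis) {
        return nowMillis - lastSeenMillis > timeoutMillis;
    }
}
```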
My app is a single-activity app used to generate token slips on a WiFi printer connected locally, so WiFi is always on. The screen is also always on.
I have set android:keepScreenOn="true" in my activity xml file for this.
The tablet is used exclusively for this single app, nothing more. Even so, the battery drains in around 4 hours.
Is this common? Or should I change anything to achieve better battery life?
Generally, if your device is old, this kind of behavior can be expected.
To debug your network traffic, follow this link:
The network traffic generated by an app can have a significant impact on the battery life of the device where it is running. In order to optimize that traffic, you need to both measure it and identify its source. Network requests can come directly from a user action, requests from your own app code, or from a server communicating with your app.
Link is here
I've read this tutorial about data transfer in a battery efficient way.
All the lessons are based on one simple concept: polling the server from Android is battery-inefficient. For this reason, Google Cloud Messaging is introduced in order to send messages from the server to the device only when needed.
There is only one problem: I'm trying to implement a "mobile cloud", that is, a cloud composed of mobile devices, where each device can join/leave the network with high frequency. So I need some mechanism to detect when a device is no longer reachable. Until now, in all the work that I've seen on the topic, the only solution was for the mobile device to periodically ping the main server to say "Hey, I'm still alive!". Obviously this solution is battery-killing, but so far I've not seen/found any better one.
Do you know any battery efficient solution for this problem?
There's no reason why pinging the server periodically (a heartbeat) is necessarily wasteful of the battery/inefficient. That depends upon how frequently you need to ping, and whether your ping needs to initiate its own transmission vs. piggybacking on some other transmission.
Let me explain. Battery inefficiency depends upon whether or not you are increasing the frequency or duration of the transceiver being in an active state. If the transceiver is continually active anyway, such as when it is continuously exchanging data or audio, then a heartbeat adds no additional burden. If it is not active, then there will be additional energy usage, but that depends upon the frequency of your heartbeat compared to how long a ping will keep the transceiver powered. Even then, it's probably irrelevant to your application, as I suspect "cloud" means the devices are active and connected.
Let's assume that your heartbeat is such that it will increase the duration of your transceiver being active. There are still techniques you can use to decrease this impact, such as caching your beat and sending it only when it can piggyback on another transmission. Of course, such solutions depend upon whether your heartbeat is implemented in an application, the OS, or the kernel.
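The caching/piggybacking idea can be sketched as a small sender-side policy: every outgoing payload implicitly carries the "I'm alive" signal, and a standalone beat is sent only when no payload has gone out for too long. The class name and deferral window are assumptions for the example:

```java
/** Defers a pending heartbeat so it can ride along with the next payload
 *  instead of waking the radio on its own. */
class PiggybackHeartbeat {
    private final long maxDeferMillis;
    private long lastBeatSentMillis;

    PiggybackHeartbeat(long maxDeferMillis, long nowMillis) {
        this.maxDeferMillis = maxDeferMillis;
        this.lastBeatSentMillis = nowMillis;
    }

    /** Call whenever any payload goes out: the beat piggybacks for free. */
    void onPayloadSent(long nowMillis) {
        lastBeatSentMillis = nowMillis;
    }

    /** True only when the beat can no longer wait for a free ride and
     *  must initiate its own (radio-waking) transmission. */
    boolean mustSendStandaloneBeat(long nowMillis) {
        return nowMillis - lastBeatSentMillis >= maxDeferMillis;
    }
}
```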
I suggest you do actual tests to see if there is truly an impact on your devices.
PS I'm not saying the tutorial is wrong. It isn't. But it is addressing a broader and more general problem than the one you have.
If I receive a GCM message in a BroadcastReceiver and then do some very basic logic, how much battery will that use? I am thinking around ~30 messages per day. What about data?
EDIT: I understand that different devices have different battery sizes and CPU efficiencies, etc. I'm not asking for a precise percentage used, more just "you should worry about that" vs "you don't have to worry".
The logic involves reading a long from the db (last sync time) and comparing it with current time.
The messages will mostly all be received in a span of ~4 minutes.
EDIT2: I guess what I am asking is: is it worth it to limit/batch the push notifications that my server sends? Ideally, ignoring battery life, I would want my server to send a push notification every time new data gets updated. The phone can then decide whether or not to actually pull the updates from the server. If it decides not to pull, then at least the phone knowingly decided not to update the data. Whereas if I limit the number of push notifications my server sends to tell the phone there is new data, then the phone may think it has up-to-date data when it actually doesn't.
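The client-side decision described here (read the last-sync timestamp and compare it with the current time) can be sketched as follows. In the real receiver the timestamp would come from the local db; the minimum-interval threshold is an assumed parameter:

```java
/** Decides, on receiving a push, whether the device should actually pull
 *  fresh data, based on the locally stored last-sync timestamp. */
class SyncDecider {
    private final long minSyncIntervalMillis;

    SyncDecider(long minSyncIntervalMillis) {
        this.minSyncIntervalMillis = minSyncIntervalMillis;
    }

    /** In a BroadcastReceiver, lastSyncMillis would be the long read
     *  from the db; nowMillis is the current wall-clock time. */
    boolean shouldPull(long lastSyncMillis, long nowMillis) {
        // Pull only if we haven't synced recently; otherwise the phone
        // knowingly skips this update and waits for the next push.
        return nowMillis - lastSyncMillis >= minSyncIntervalMillis;
    }
}
```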
It really depends. GCM can be received over WiFi or over a cellular connection. The cellular radio consumes much more power than WiFi. There is also a "warm up" and "keep alive" period after each use of the radio, so sending/receiving things in one big burst is more energy-efficient than sending them spaced a few minutes apart. This YouTube video from GoogleDevelopers has some more information about how to optimize network access to minimize battery consumption.
That said, the exact amount of power used will likely depend on the handset, distance to tower/Wifi access point, network protocol used (802.11g, CDMA, GSM, etc), and other factors.
"You don't have to worry" is the answer. GCM is a single service that manages push notifications for all Android apps. It is certainly better than many different apps continuously asking a remote server for new messages. It will use some battery, but less than any other approach.
I'm looking for a way to detect the disconnection of a Bluetooth device immediately after it has happened (2 seconds max), typically in a "device too far" scenario or when the device battery is dead. Currently I can detect it with a BroadcastReceiver by getting a BluetoothDevice.ACTION_ACL_DISCONNECTED, but it takes about 16 to 20 seconds to fire.
Is there any way to get notified within 2 seconds max? I used a BroadcastReceiver, but it is not fast enough to alert within 2 seconds, so is there any other approach available to get notified quickly that Bluetooth is disconnected?
I use createRfcommSocketToServiceRecord(UUID) to connect to a paired device, and I am bound to using it with a UUID.
I have visited a lot of links regarding this issue, but none matches my needs, so any help would be appreciated. Thanks.
I think the only way you can reliably sense loss of connection quickly (within two seconds) is via your own application protocol that you use over the Bluetooth connection. For example, your application protocol might implement a heartbeat that occurs every 500ms. If you don't see a heartbeat within two seconds then you could trigger your own event.
Bluetooth is a socket-based stream protocol that is designed to work over an unreliable medium (i.e. radio), and as such has to tolerate errors in (or loss of) packets. For this reason it will take significantly more than 2 seconds before your Bluetooth stack declares it has given up and disconnected the device, as you have found.
I have an application on Play which is designed to talk with an automotive ECU via Bluetooth and my strategy for sensing disconnection is exactly as I suggested in my first paragraph.
Update 20th June 14
I see in your bounty comment and also your comment below that you're asking for a code example, but it's kind of difficult for me to provide one without knowing anything about the application protocol that you're running over the socket connection. Or to put it another way, what exactly is it about my first paragraph (i.e. the heartbeat suggestion) that you do not understand or cannot create code for yourself? The concept of using a heartbeat really is quite simple. You would define a certain message type in your application protocol that represents a heartbeat message. One end of the connection sends this heartbeat message periodically, say every one second. The other end of the connection checks that this heartbeat message is received every second or so and drops the connection after a two-second time-out. It is impossible to be any more specific than that, because I can't see your existing code and I don't know what kind of messages you are currently exchanging over the socket.
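Without seeing your protocol I can only offer a generic sketch of the heartbeat idea over a stream pair (standing in for the RFCOMM socket's input/output streams). All names are illustrative, timestamps are passed in for clarity, and a real implementation would run the send and the timeout check on scheduled tasks:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.concurrent.atomic.AtomicLong;

/** Minimal heartbeat framing: one side sends a HEARTBEAT byte about once a
 *  second; the other records arrival times and treats ~2 s of silence as a
 *  dropped link. */
class StreamHeartbeat {
    static final int HEARTBEAT = 0x01;

    /** Writes one heartbeat byte; returns false on stream failure,
     *  which is itself a strong disconnect hint. */
    static boolean sendBeat(OutputStream out) {
        try {
            out.write(HEARTBEAT);
            out.flush();
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    /** Reads one byte; stamps lastBeatMillis when it is a heartbeat.
     *  Returns -1 on end of stream or stream error. */
    static int readMessage(InputStream in, AtomicLong lastBeatMillis, long nowMillis) {
        try {
            int b = in.read();
            if (b == HEARTBEAT) {
                lastBeatMillis.set(nowMillis);
            }
            return b;
        } catch (IOException e) {
            return -1;
        }
    }

    /** Run periodically; true means no beat arrived for > timeoutMillis,
     *  so the app can fire its own disconnect event. */
    static boolean linkDead(AtomicLong lastBeatMillis, long nowMillis, long timeoutMillis) {
        return nowMillis - lastBeatMillis.get() > timeoutMillis;
    }
}
```

In an app, the streams would come from `BluetoothSocket.getInputStream()`/`getOutputStream()`, and a scheduled task would call `linkDead` every few hundred milliseconds to hit the 2-second detection target.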
After nothing else worked, I found two approaches to get this done.
I check whether my Bluetooth socket has been unused (no sending or receiving) for 2 to 5 seconds; if so, I disconnect it, and when the user wants to send data to the receiver device I connect it again.
Alternatively, I try to connect the socket after 2 to 5 seconds: if it is not ready to connect, it means it is already connected; otherwise it gets connected and I refresh the previous socket references.
The first option works better for my problem.
This is a problem with classic Bluetooth, and it is more a hardware issue than a software one.
If you want to notice that the connection is broken, you need to poll (a heartbeat), something like "are you alive? are you alive?"... This is bad for the battery, so users will eventually uninstall your app.
I recommend you change to BTLE (Bluetooth Low Energy); devices like the Nexus 5 have it.
With BTLE you have the Proximity profile, which can tell you the quality of the signal, so you can estimate the distance (near, far, out of range) and therefore also tell whether the devices are disconnected.
Another nice point is that if the devices go out of range but one comes back in range, you can be notified as well, so this is really nice for apps that open doors by proximity, for example.
Check this:
https://developer.bluetooth.org/TechnologyOverview/Pages/PXP.aspx
On the other hand, Apple has invented the concept of iBeacons, devices that are distance-aware, and the good thing is that there is also an implementation of iBeacons for Android:
http://developer.radiusnetworks.com/ibeacon/android/