If I receive a GCM message in a BroadcastReceiver and then do some very basic logic, how much battery will that use? I'm expecting around 30 messages per day. What about data usage?
EDIT: I understand that different devices have different battery sizes and CPU efficiencies, etc. I'm not asking for a precise percentage used, more just "you should worry about that" vs "you don't have to worry".
The logic involves reading a long from the db (last sync time) and comparing it with current time.
The messages will mostly all be received in a span of ~4 minutes.
EDIT2: I guess what I am asking is: is it worth it to limit or batch the push notifications my server sends? Ideally, ignoring battery life, my server would send a push notification every time data gets updated. The phone can then decide whether or not to actually pull the updates from the server. If it decides not to pull, then at least the phone knowingly decided not to update the data. Whereas if I limit how often my server sends push notifications announcing new data, the phone may think its data is up to date when it actually isn't.
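For concreteness, here's a minimal sketch of the receiver logic described above. The threshold and the SharedPreferences read are hypothetical stand-ins for whatever database read the app actually performs:

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

public class GcmMessageReceiver extends BroadcastReceiver {
    // Hypothetical threshold: only re-sync if the last sync is older than this.
    private static final long SYNC_THRESHOLD_MS = 5 * 60 * 1000;

    @Override
    public void onReceive(Context context, Intent intent) {
        // Read the last sync time (a single long) from storage and compare
        // it with the current time -- the "very basic logic" in question.
        long lastSyncMs = readLastSyncTime(context);
        if (System.currentTimeMillis() - lastSyncMs > SYNC_THRESHOLD_MS) {
            // Decide to pull updates from the server (e.g. start a Service).
        }
    }

    // Hypothetical stand-in for the app's real database read.
    private long readLastSyncTime(Context context) {
        return context.getSharedPreferences("sync", Context.MODE_PRIVATE)
                .getLong("last_sync_ms", 0L);
    }
}
```

Logic this small is CPU-wise negligible; the real cost is whatever network activity it triggers.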
It really depends. GCM can be received over WiFi or over a cellular connection. The cellular radio consumes much more power than WiFi. There is also a "warm up" and "keep alive" period after each use of the radio, so sending/receiving things in one big burst is more energy efficient than sending them spaced a few minutes apart. This YouTube video from GoogleDevelopers has some more information about how to optimize network access to minimize battery consumption.
That said, the exact amount of power used will likely depend on the handset, the distance to the tower/WiFi access point, the network protocol used (802.11g, CDMA, GSM, etc.), and other factors.
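To make the "one big burst" point concrete, here's a toy sketch (all names are illustrative, not any particular API): queue outgoing payloads and flush them together on a coarse schedule, so the radio powers up once per batch instead of once per message.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchingSender {
    public interface Sender { void send(String payload); }

    private final List<String> pending = new ArrayList<>();

    // Cheap: just remembers the payload, no radio use.
    public synchronized void enqueue(String payload) {
        pending.add(payload);
    }

    // Called from one periodic job (rather than per message), so all
    // transmissions cluster into a single radio wake-up.
    public synchronized void flush(Sender sender) {
        for (String payload : pending) {
            sender.send(payload);
        }
        pending.clear();
    }
}
```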
"you don't have to worry" is the answer. GCM services is one process which manage push notification for all android app. It is for sure better than many different apps which ask continuosly to a remote server for new messages. It will use some battery percentage, but less than any other way
For clarification, this question is not a duplicate; the situation differs from other related questions.
We are working on a client-side application that will receive data from a server-side PHP-powered web application. The data is critical and must be delivered to the user as soon as possible. It doesn't matter whether the client requests data from the server or the server pushes data to the client; the only thing we need is a reliable and fast option.
There are several methods, but none of them fits our project:
Use GCM push notifications:
This is a great option, but in practice we lost several pushes, so it's not reliable; on top of that, the delay is considerable. I repeat: the situation is critical, so it must be fast.
Have the client request data from the server at a 1- or 2-second interval:
This is what we think is the best solution so far, but it is really expensive. It's reliable and fast, but the load on our servers gets extremely high, and they become useless even with our current number of clients. If the number of clients grows, we'll go down.
SMS-based push:
The other option for us is to send SMS to clients' phones and drive the application from that data. With this method, the load on our servers would be really low (just as with the GCM option). But SMS delivery on our country's mobile network is usually delayed, typically by around 10 seconds. Although this option has good reliability, it is too slow for us to use.
FM radio signal-based push:
We could use clients' FM radio receivers to get data from local broadcasting stations. This method is reliable and very fast, but the cost of the stations would kill us! And even if we could handle it (read: we can't), clients don't always keep their earphones (which act as the FM antenna) connected to their smartphones.
So, what are the alternatives? What is a reliable and reasonably fast method that doesn't put a lot of pressure on our servers?
I would probably recommend using WebSockets for the case you describe (using the OkHttp library, for example); see the following for a nice overview of its use: https://medium.com/@ssaurel/learn-to-use-websockets-on-android-with-okhttp-ba5f00aea988. A common pattern is to combine WebSockets with HTTP REST requests (for an initial catch-up query, for example). Also, you would typically only use WebSockets while the app is in the foreground and rely on push notifications otherwise.
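For reference, a minimal OkHttp WebSocket client along those lines (the URL is a placeholder; error handling and reconnection are left out):

```java
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.WebSocket;
import okhttp3.WebSocketListener;

public class LiveFeed {
    public static WebSocket open(OkHttpClient client) {
        Request request = new Request.Builder()
                .url("wss://example.com/feed") // placeholder endpoint
                .build();
        return client.newWebSocket(request, new WebSocketListener() {
            @Override
            public void onMessage(WebSocket webSocket, String text) {
                // Server pushed new data; handle it immediately.
            }

            @Override
            public void onFailure(WebSocket webSocket, Throwable t, Response response) {
                // Connection dropped; fall back to a REST catch-up query
                // and/or schedule a reconnect.
            }
        });
    }
}
```

Because the server pushes over an already-open connection, latency stays in the sub-second range without the per-request load of 1-2 second polling.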
I've read this tutorial about transferring data in a battery-efficient way.
All the lessons are based on one simple concept: polling the server in Android is battery inefficient. For this reason, Google Cloud Messaging is introduced so that messages are sent from the server to the device only when needed.
There is only one problem: I'm trying to implement a "mobile cloud", i.e. a cloud composed of mobile devices, where each device can join/leave the network with high frequency. So I need some mechanism to detect when a device is no longer reachable. Until now, in all the work I've seen on the topic, the only solution was to have the mobile device periodically ping the main server to say "Hey, I'm still alive!". Obviously this solution is battery-killing, but so far I haven't seen or found any better one.
Do you know any battery efficient solution for this problem?
There's no reason why pinging the server periodically (a heartbeat) is necessarily wasteful of the battery or inefficient. It depends upon how frequently you need to ping, and whether your ping needs to initiate its own transmission vs. piggybacking on some other transmission.
Let me explain. Battery inefficiency depends upon whether or not you are increasing the frequency or duration of the transceiver being in an active state. If the transceiver is continually active anyway, for example because it is continuously exchanging data or audio, then a heartbeat adds no additional burden. If it is not, then there will be additional energy usage, and how much depends upon the frequency of your heartbeat compared to how long a ping keeps the transceiver powered. Even then, it's probably irrelevant to your application, as I suspect "cloud" means the devices are active and connected.
Let's assume that your heartbeat is such that it will increase the duration of your transceiver being active. There are still techniques you can use to decrease this impact, such as caching your beat and sending it only when it can piggy back on another transmission. Of course, such solutions depend upon whether your heartbeat is implemented in an application, OS or kernel.
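As a rough application-level sketch of the "cache your beat" idea (all names are illustrative): record the time of any ordinary transmission, and only send an explicit ping when nothing else has gone out for a full interval.

```java
public class Heartbeat {
    private final long intervalMs;
    private volatile long lastTransmissionMs;

    public Heartbeat(long intervalMs) {
        this.intervalMs = intervalMs;
    }

    // Call whenever the app sends anything; the server can treat that
    // traffic as an implicit "still alive", so no extra ping is needed.
    public void onPayloadSent() {
        lastTransmissionMs = System.currentTimeMillis();
    }

    // Call from a periodic task; only wakes the radio when no other
    // transmission has piggybacked the heartbeat recently.
    public void maybePing(Runnable sendPing) {
        if (System.currentTimeMillis() - lastTransmissionMs >= intervalMs) {
            sendPing.run();
            onPayloadSent();
        }
    }
}
```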
I suggest you do actual tests to see if there is truly an impact on your devices.
PS I'm not saying the tutorial is wrong. It isn't. But it is addressing a broader and more general problem than the one you have.
I know that this is so, but I don't understand why.
Why not simply query the server periodically? Sure, it may drain the battery and increase internet traffic; I understand that. But how does using Google Cloud Messaging eliminate these problems?
I have found an answer, but it isn't entirely clear to me.
Can anyone give me a clear explanation?
Let's say you have 50 applications on your phone that do not use GCM. Each app developer decides it is appropriate to poll their respective backend once a minute.
Since these are all separate applications, the calls will likely not happen at the same time. The biggest hit to the battery is when the radio in an Android device has to turn back on after being off to make an API call, so multiple calls with blocks of idle time in between drain the battery faster (read this article on the radio state machine to better understand why: https://developer.android.com/training/efficient-downloads/efficient-network-access.html).
In addition, each application will be hitting a separate endpoint. Each time you make an API call, you have to go through the connect process for the given server. With batched API requests or HTTP/2, multiple calls going to the same server can be optimized by not having to redo the handshake or connect process.
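As an illustration of the connection-reuse point (OkHttp shown; the URLs are placeholders): a single shared client keeps a connection pool, so a second request to the same server can skip the TCP/TLS handshake, and on HTTP/2 requests multiplex over one connection.

```java
import java.io.IOException;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class ReuseDemo {
    public static void main(String[] args) throws IOException {
        OkHttpClient client = new OkHttpClient(); // share one instance app-wide

        Request first = new Request.Builder().url("https://api.example.com/a").build();
        try (Response r1 = client.newCall(first).execute()) {
            r1.body().string(); // consume the body so the connection returns to the pool
        }

        Request second = new Request.Builder().url("https://api.example.com/b").build();
        try (Response r2 = client.newCall(second).execute()) {
            // Typically reuses the pooled connection: no new handshake.
            r2.body().string();
        }
    }
}
```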
Now imagine all 50 applications used GCM. GCM polls an endpoint at some regular time interval on behalf of all 50 apps. Let's say GCM polls once a minute against a single server to which all the respective apps' backends send their notifications for delivery to the device. You have reduced 50 oddly timed API calls, each likely turning the radio on and off, to one API call. You will use less data for polling. You do not incur the connect cost of an HTTP call to 50 different servers. In addition, Google uses the same polling already in place for checking for OS updates, so there is no additional network overhead from using GCM (this info is based on old docs: What technology does GCM (Google Cloud Messaging) use?).
Also, see this explanation straight from the Android website in an article entitled "Minimizing the Effects of Regular Updates" (http://developer.android.com/training/efficient-downloads/regular_updates.html):
Every time your app polls your server to check if an update is required, you activate the wireless radio, drawing power unnecessarily, for up to 20 seconds on a typical 3G connection.
Google Cloud Messaging for Android (GCM) is a lightweight mechanism used to transmit data from a server to a particular app instance. Using GCM, your server can notify your app running on a particular device that there is new data available for it.
Compared to polling, where your app must regularly ping the server to query for new data, this event-driven model allows your app to create a new connection only when it knows there is data to download.
The result is a reduction in unnecessary connections, and a reduced latency for updated data within your application.
GCM is implemented using a persistent TCP/IP connection. While it's possible to implement your own push service, it's best practice to use GCM. This minimizes the number of persistent connections and allows the platform to optimize bandwidth and minimize the associated impact on battery life.
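In code, the event-driven side of this looks roughly like the following (using the GCM-era GcmListenerService; the actual fetch step is app-specific and left as a comment):

```java
import android.os.Bundle;
import com.google.android.gms.gcm.GcmListenerService;

public class MyGcmListenerService extends GcmListenerService {
    @Override
    public void onMessageReceived(String from, Bundle data) {
        // Called only when the server actually has something for us, so
        // the app opens a connection just to fetch the new data -- there
        // is no timer-driven polling anywhere in the app.
        // A hypothetical startDownload(data) would be the fetch step.
    }
}
```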
My client is building an application using a Bluetooth Low Energy dongle. The unit takes continuous measurements. They originally went with BLE for its power-saving properties: the unit will run for about 8 hours at a time while recording data, and it needs to last the full 8 hours.
They want me to set up their Android app to poll the device on 4 different channels at a frequency of up to 250/sec. To me, that defeats the Low Energy aspect. I am pretty sure that BLE was designed with notifications in mind, which push data to an app on change instead of the app asking the device for data at a given frequency.
At the moment I have the app set up with device notifications. That means the GATT server sends data and the app only receives notifications when the data changes. The problem is that two of the sensor channels require data at a rate of 250/sec, even if there is no change.
The server may or may not send at that rate. For one channel I get about 25 reads per second, and for the other maybe 4 reads per second. Since this is all set up with notifications, which only fire on change, I suspect the app is doing exactly what it is designed to do. So, since there are not enough data points, we are thinking of switching those two channels to polling instead.
Does this make sense to you? Doesn't it defeat the Low Energy aspect, or even the entire point of a Low Energy chip design?
250/sec = 1 packet per 4ms.
BLE's smallest connection interval is 7.5 ms, and all data exchanged between master and slave must happen within a connection interval. Even though multiple packets can be sent back and forth within a single connection interval, I don't know how you would make Android squeeze multiple messages into one interval. So you cannot ensure the 250 polls are equally spaced in time.
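Putting numbers on that (plain arithmetic; a quick Java sketch, nothing Android-specific):

```java
public class RateCheck {
    public static void main(String[] args) {
        double requestedRateHz = 250.0;                       // polls per second
        double packetSpacingMs = 1000.0 / requestedRateHz;    // = 4 ms between polls
        double minConnIntervalMs = 7.5;                       // smallest BLE connection interval
        double maxConnEventsPerSec = 1000.0 / minConnIntervalMs; // ~133 events/sec

        // Even at one packet per connection event, 133 < 250: the only way
        // to reach 250/sec is to pack multiple packets into each event,
        // which makes equal 4 ms spacing impossible.
        System.out.printf("spacing=%.1f ms, max events=%.0f/sec%n",
                packetSpacingMs, maxConnEventsPerSec);
    }
}
```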
I think Android simply wouldn't let you send messages at that frequency. Sending network packets is usually a blocking operation, so your app will be blocked by the operating system while tx is busy. In the end, what you might get is merely tens of messages per second.
I've been given the task of creating an iBeacon client for Android. The purpose of the client is to transmit the ID of the closest beacon over WiFi/3G/4G to a server when another client requests the user's position.
Now, I wonder, what is the best way of doing this, with respect to battery consumption?
There are (as far as I can see) two approaches:
1). Whenever the ID of the closest beacon changes, upload the new ID directly to the server.
2). Store the ID of the closest beacon locally on the client's phone. Only upload it to the server when the server requests the client to do so.
Option 1) means that WiFi has to be activated every time the user approaches a new iBeacon. On the other hand, option 2) requires the phone to listen on a port in order to handle requests from the server.
EDIT: I read about the Radio State Machine here: http://developer.android.com/training/efficient-downloads/efficient-network-access.html
If I keep a ServerSocket running in the background, will this keep my phone in Radio Low Power? It seems to me that option 1) will put the phone into full radio power every time a beacon ID is transmitted, but after that the phone will fall back into Low Power/Sleep mode, which will save battery. Option 2), on the other hand, will prevent the phone from ever reaching Sleep mode, since I have to keep a persistent connection to the server.
I would go with option one because it is simpler and less brittle. If you are really worried about battery, just place limits on how often it can talk to the server (e.g. 10x per hour) so it doesn't get crazy.
Other things may use even more battery, so test first and make sure this really requires optimization. For example, run this in the background for 10 hours and see how much it drains a fully charged battery vs. when the app is not reporting to the server. If it is only a few percent, then it is probably efficient enough.
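A simple sketch of that rate limit (names illustrative): keep the timestamps of recent reports and refuse new ones once the hourly budget is spent.

```java
import java.util.ArrayDeque;

public class BeaconReporter {
    private static final int MAX_REPORTS_PER_HOUR = 10; // suggested cap
    private static final long WINDOW_MS = 60L * 60L * 1000L;

    private final ArrayDeque<Long> recentReports = new ArrayDeque<>();

    // Returns true if the caller may upload the closest-beacon ID now.
    public synchronized boolean shouldReport() {
        long now = System.currentTimeMillis();
        // Drop timestamps that have aged out of the one-hour window.
        while (!recentReports.isEmpty() && recentReports.peekFirst() < now - WINDOW_MS) {
            recentReports.pollFirst();
        }
        if (recentReports.size() >= MAX_REPORTS_PER_HOUR) {
            return false; // budget spent; skip this upload
        }
        recentReports.addLast(now);
        return true;
    }
}
```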