I made a battery bypass on an LG G4 phone so I can measure its energy consumption directly with a power monitor. However, 70-80 seconds into the OS boot-up, the system shuts down with the message 'Invalid battery'.
I've tried several solutions, such as setting the output voltage precisely to the value measured on the battery, cleaning the 4 battery pins, and making the pin connections tighter, but no luck.
The only lead I could find is in the Android docs, in the Power/Component Power section:
Fake batteries can provide signals on thermistor or fuel gauge pins that mimic temperature and state of charge readings for a normal system, and may also provide convenient leads for connecting to external power supplies. Alternatively, you can modify the system to ignore the invalid data from the missing battery.
And here is my question: How do I modify the system to ignore the invalid data from the missing battery?
There isn't a reference to any article or clear explanation on how to do this. Could anyone point me in the right direction, please? Am I missing something really straightforward?
I've seen a lot of discussions on battery usage for AltBeacon, especially when beacons stay inside a region for a long time. This post was actually very enlightening.
I am currently working on a solution that requires good sensitivity (which I define as a small detection time for a new beacon in a region).
As some beacons may be anonymous to the scanner in this particular solution (by which I mean they present unexpected MAC addresses but share the same matching byte sequence), I would like to achieve good sensitivity to new beacons while keeping the battery impact on the user balanced.
What concerns me is: if a first beacon is found and the region triggers based on the matching sequence, how could I get a notification once another beacon approaches (or leaves)?
A guess I was going to try was to keep monitoring for a generic matching sequence and, once a beacon is found for that general sequence, range it to get its address and then create a particular region for the MAC address I've captured, roughly as sketched below. The only problem with this approach is how I could prevent the first beacon from continually triggering the generic region.
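This is the shape of what I was going to try with the Android Beacon Library. It is only a sketch: it assumes the BeaconManager is already bound and that my library version has the Region(uniqueId, bluetoothAddress) constructor.

```java
import java.util.Collection;

import android.os.RemoteException;

import org.altbeacon.beacon.Beacon;
import org.altbeacon.beacon.BeaconManager;
import org.altbeacon.beacon.RangeNotifier;
import org.altbeacon.beacon.Region;

public class AnonymousBeaconTracker {
    private final BeaconManager beaconManager;
    // Null identifiers: matches every beacon the configured parser decodes.
    private final Region genericRegion = new Region("generic-matching-region", null, null, null);

    public AnonymousBeaconTracker(BeaconManager beaconManager) {
        this.beaconManager = beaconManager;
    }

    public void start() throws RemoteException {
        beaconManager.setRangeNotifier(new RangeNotifier() {
            @Override
            public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
                for (Beacon beacon : beacons) {
                    String mac = beacon.getBluetoothAddress();
                    // Per-MAC region; assumes the (uniqueId, bluetoothAddress)
                    // constructor is available in this library version.
                    Region perMacRegion = new Region("beacon-" + mac, mac);
                    try {
                        beaconManager.startMonitoringBeaconsInRegion(perMacRegion);
                    } catch (RemoteException e) {
                        // could not reach the scanning service
                    }
                }
            }
        });
        // Range on the generic sequence to discover new addresses.
        beaconManager.startRangingBeaconsInRegion(genericRegion);
    }
}
```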
And just out of curiosity: is the ScanFilter class related to those hardware filters introduced in Android 5?
Thank you,
If you need to quickly find new beacons with the same byte patterns as ones that already exist in the vicinity, you really have no choice but to keep ranging.
In such a situation, there is no distinction between ranging and monitoring in terms of battery consumption: both require constant Bluetooth scans and decoding of every beacon in the vicinity. Scan filters (yes, the hardware filters introduced in Android 5) will not help, because you expect the byte patterns to be the same, and there is no such thing as a "does not match" scan filter that could be used to find only new MAC addresses.
You may need to accept the battery drain of constant scans and just try to limit how long they last, if your use case allows; keeping scan sessions to 30 minutes or less might be acceptable.
You could possibly save some battery by writing your own BLE scan parsing code tailored to this use case: look at the MAC address first, and only do further processing and parsing if the address has never been seen before. This will not reduce the battery used by the constant scan itself, but it cuts the CPU time spent parsing packets, which might save 10-30% depending on the number of beacons in the vicinity.
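As a rough sketch of that idea, assuming the Android 5+ BLE scanning API (parseBeaconPacket is a placeholder for your own parsing routine):

```java
import java.util.HashSet;
import java.util.Set;

import android.bluetooth.le.ScanCallback;
import android.bluetooth.le.ScanResult;

public class NewAddressScanCallback extends ScanCallback {
    // MAC addresses whose packets we have already parsed once.
    private final Set<String> seenAddresses = new HashSet<>();

    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        String address = result.getDevice().getAddress();
        if (!seenAddresses.add(address)) {
            return; // already seen: skip the expensive packet parse
        }
        if (result.getScanRecord() != null) {
            parseBeaconPacket(result.getScanRecord().getBytes());
        }
    }

    private void parseBeaconPacket(byte[] packet) {
        // placeholder: decode your beacon byte layout here
    }
}
```

You would pass this callback to BluetoothLeScanner.startScan() and keep it running only as long as your use case requires.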
Bottom line: you are right to be concerned about battery usage with this use case.
I'm developing an Android application to control my quadcopter from the smartphone: a periodic process sends the data acquired from the touchscreen.
The data is then received by a microcontroller, which generates PWM commands for 4 DC motors, obtaining the duty-cycle values from a control loop that uses the received commands.
Can someone suggest a precise criterion for choosing the period of the process on the smartphone? Or is only a "trial and error" approach possible, checking the reactivity of the system?
EDIT: I have successfully implemented it by setting the frequency of the smartphone task to 2*control_loop_frequency.
If you knew or could measure the impulse response of the system, it would be possible to determine an appropriate control-loop rate; however, you do not have that data, and it would be confounded in any case by external factors such as wind speed and direction. Determining the rate empirically will be faster than determining the precise characteristics.
If the control is open-loop, then you probably have to ask yourself how far off the desired course you can allow the vehicle to get before a correction is applied. That will depend on the vehicle's maximum speed (in any direction).
In the end, however, Android is not a real-time operating system, so there is no guarantee that any particular periodic update will be performed precisely; it is always going to be somewhat non-deterministic. At a guess, such a system might manage a 10 Hz update reasonably reliably, and that would probably be sufficient for adequate control and responsiveness; if the only feedback is via the human controller's hand-eye coordination, that is perhaps the limiting factor in the system response.
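As an illustration, a minimal 10 Hz sender using a ScheduledExecutorService; sendControlPacket and the command fields are placeholders for your own transport and data:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ControlSender {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    // Most recent values read from the touchscreen (written by the UI thread).
    private volatile float throttle, pitch, roll, yaw;

    public void start() {
        // 100 ms period = 10 Hz. Android offers no hard real-time guarantee,
        // so expect some jitter around this period.
        scheduler.scheduleAtFixedRate(
                () -> sendControlPacket(throttle, pitch, roll, yaw),
                0, 100, TimeUnit.MILLISECONDS);
    }

    private void sendControlPacket(float throttle, float pitch, float roll, float yaw) {
        // placeholder: serialize and transmit the command to the microcontroller
    }
}
```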
I am building an application that collects the RSSI levels of the APs around me and does some calculations based on them.
The problem is that the readings are unstable even when the device is still. For example, the signal from an access point fluctuates between -71, -68, -75, and so on.
Is there a way to stabilize the signals?
In telecommunications, received signal strength indicator (RSSI) is a measurement of the power present in a received radio signal.
I think the best you can do is accumulate the readings and average them over a certain measurement time (you will never get a 100% stable reading from an access point, because of all kinds of environmental factors).
Source: http://en.wikipedia.org/wiki/Received_signal_strength_indication
"The problem is that the readings are unstable even if the device is still. For example, signals from an access point fluctuates between -71, -68,-75 and so on...."
This is the nature of wireless signals. Even if the device is still, the environment is "moving", so the signal suffers from small-scale fading, mostly due to scatterers in the environment. It is therefore normal to see these fluctuations in the device's measurements.
The accuracy of each Android device's Wi-Fi measurements is another story.
Moreover, keep in mind that the values returned are in dBm, which is power in mW expressed on a logarithmic scale. So, regarding the averaging Thealon proposed, you have to be careful: you cannot simply average dBm values.
One solution would be to convert the dBm values back to mW, average those, and then convert the result back to dBm. The conversion is: mW = 10^(dBm/10).
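For example, a small helper along those lines (a sketch, not tied to any particular Android API):

```java
public final class RssiAverager {
    // Average dBm readings correctly: convert to mW, average, convert back.
    public static double averageDbm(double[] readingsDbm) {
        double sumMw = 0;
        for (double dbm : readingsDbm) {
            sumMw += Math.pow(10, dbm / 10.0); // dBm -> mW
        }
        double meanMw = sumMw / readingsDbm.length;
        return 10 * Math.log10(meanMw);        // mW -> dBm
    }
}
```

With the readings from the question, averageDbm(new double[]{-71, -68, -75}) comes out to about -70.5 dBm.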
When getting the RSSI, Google recommends:
Use calculateSignalLevel(int, int) to convert this number into an absolute signal level which can be displayed to a user.
Official Docs
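For example, calculateSignalLevel is a static method on WifiManager; the choice of 5 levels here is arbitrary:

```java
import android.net.wifi.WifiManager;

public class SignalBars {
    // Map a raw RSSI such as -71 dBm onto a 0..4 "bars" scale for display.
    public static int toBars(int rssiDbm) {
        return WifiManager.calculateSignalLevel(rssiDbm, 5);
    }
}
```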
How would I verify/track device location to within 5' accuracy? I've heard of people using cell tower/GPS combinations.
As far as I know, the only way to get a 5-foot accuracy figure is to use GPS, and even then it isn't always that accurate, depending on how good a fix on the satellites (clear view of the sky) you have.
Cell tower / Wi-Fi triangulation methods only serve to speed up positioning and will seldom (if ever) be more accurate than satellite positioning methods.
GPS is the way to go; cell towers won't cut it. In Android (and, I believe, iOS) the system will provide you with an accuracy reading in addition to the actual location. You could use this value to decide whether the fix you've received should be uploaded to your server. Keep in mind that using the GPS is very battery-intensive, and there's no guarantee of how good the accuracy will be; depending on conditions, you may not be able to achieve 5' precision.
As @CommonsWare points out, 5' is really unrealistic anyway, although you can get close.
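A sketch of that filtering idea with the classic LocationListener API; the 5 m threshold and uploadToServer are placeholders:

```java
import android.location.Location;
import android.location.LocationListener;
import android.os.Bundle;

public class AccurateFixListener implements LocationListener {
    @Override
    public void onLocationChanged(Location location) {
        // getAccuracy() is the estimated horizontal error radius in meters;
        // 5 feet is about 1.5 m, which consumer GPS rarely achieves.
        if (location.hasAccuracy() && location.getAccuracy() <= 5.0f) {
            uploadToServer(location);
        }
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onProviderDisabled(String provider) {}

    private void uploadToServer(Location location) {
        // placeholder: send the fix to your backend
    }
}
```

Register it with LocationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, ...) and adjust the threshold to whatever error your application can tolerate.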
As CommonsWare says, you will not get much better than 10 meters of accuracy on a consumer-grade device. Even under open sky, the characteristics of the atmosphere change minute by minute, and that's enough to change the GPS readings.
However, it is theoretically possible to increase accuracy if you could get all of the following:
1. There are stationary GPS receiver stations with fixed, known locations that measure the current deviation of the GPS signals. You would need to have one of those close to you and have access to its data in real time.
2. You would need low-level access to your phone's GPS receiver to read the unprocessed data received from the satellites. This differs from device to device and, as far as I know, no supplier provides this access.
3. Finally, you would need to do all the calculations required to determine your location, applying the deviations obtained in point 1 above.
Good luck.
The only way you can get this type of accuracy is with WAAS. As far as I know, there are no Android handsets that can receive WAAS corrections. If this is for a specific, controlled situation, you could use a Bluetooth GPS receiver with WAAS, and only in WAAS-supported locations. If this is for a wider deployment, however, I think you are out of luck.
I've got a problem. I'm developing an Android application that scans for wireless access points/routers. I've been testing a couple of devices, and I'm getting scan rates of 2, 1, 0.5, 0.1, etc. scans per second.
My goal is to reach 10 scans per second, because a router can send beacons 10 times a second, and we need this for our application.
Is there a way to make this possible? Perhaps hack a ROM and replace the Wi-Fi drivers? I've been looking into this, but I can't find anything about this frequency inside the driver.
The driver used is the BCM4329 driver. I can't find any datasheets for the BCM4329, so it's kind of hard to figure this out.
Thanks in advance.
flitjes
I'm not familiar with driver development, but I know it's one of the hardest things in computer science, so unless you have good knowledge of Linux kernel development I would forget about it.
Moreover, you still need to scan the 12 Wi-Fi channels to be sure you are detecting all access points. An access point broadcasts a beacon every 100 ms, so dwelling 100 ms on each of the 12 channels takes 1.2 seconds; spend less time than that and you risk missing access points.
You don't need to change anything in the device driver; Android already lets you scan for access points. See the documentation, and the sketch below.
Although requesting that many scans will probably not be very good for the battery life and the responsiveness of your app...
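A minimal sketch of that documented API (permission checks omitted; startScan and the broadcast flow are the standard WifiManager mechanism, though rate limits vary by Android version):

```java
import java.util.List;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.net.wifi.ScanResult;
import android.net.wifi.WifiManager;
import android.util.Log;

public class WifiScanner {
    public static void scanOnce(Context context) {
        final Context appContext = context.getApplicationContext();
        final WifiManager wifiManager =
                (WifiManager) appContext.getSystemService(Context.WIFI_SERVICE);

        BroadcastReceiver receiver = new BroadcastReceiver() {
            @Override
            public void onReceive(Context c, Intent intent) {
                // Fired when a scan completes; read the access points seen.
                List<ScanResult> results = wifiManager.getScanResults();
                for (ScanResult r : results) {
                    Log.d("WifiScanner", r.BSSID + " " + r.level + " dBm");
                }
                appContext.unregisterReceiver(this);
            }
        };
        appContext.registerReceiver(receiver,
                new IntentFilter(WifiManager.SCAN_RESULTS_AVAILABLE_ACTION));
        wifiManager.startScan();
    }
}
```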
Your assumption that the beacon rate is 10 per second is incorrect: it is really an AP configuration parameter, although 10 per second is the default in most. Besides that, APs do not send beacons simultaneously; if that happens, it's called a collision, and a back-off algorithm is used for retransmission. In addition, even scanning 10 times per second doesn't guarantee you will capture all beacons, as pointed out in the previous answers.
If you use the BCM4339 driver, you cannot set the scan rate in the driver or through the Android API; it is fixed in the BCM4339 firmware. A scan is defined by which channels are visited and how much time is spent on each channel, per the 802.11 spec (this is part of the MAC and PHY). In this case you just need to capture beacons, so you should use a passive scan with a fixed channel and the MaxChannelTime you want.
You would have to ask Broadcom for special firmware to solve this problem.
IEEE Std 802.11-2012, page 978, section 10.1.4.2 (Passive scanning): "If the ScanType parameter indicates a passive scan, the STA shall listen to each channel scanned for no longer than a maximum duration defined by the MaxChannelTime parameter."