Bluetooth Low Energy RSSI changes periodically on Android devices

I noticed that the received signal strength of Bluetooth Low Energy on Android devices varies in cycles.
The graph below shows the RSSI values of one BLE beacon over two minutes. The receiving Android device and the beacon were both static, one meter apart, and I made sure there was as little interference as possible. The Android device was a Nexus 5, but I saw the same phenomenon on other Android devices, all running API 21. I have not been able to test it on iOS yet.
RSSI Graph
You can see that there are three major RSSI levels repeating every 15 seconds, like low -> middle -> high -> low -> middle -> high, and so on.
My guess is that the cause lies on the Android side; I am not sure whether it is a hardware or a software reason.
Why is the RSSI cyclic over time? Can someone explain?

After reading a lot about the topic, I may have come to an answer.
Bluetooth Low Energy beacons use three different channels for advertising, which is their adaptation of frequency hopping to avoid interference with other 2.4 GHz signals. The hopping happens much more slowly than in classic Bluetooth (1600 hops/s) - according to my measurements, roughly every 5 seconds.
More here:
http://www.argenox.com/bluetooth-low-energy-ble-v4-0-development/library/a-ble-advertising-primer/
The received signal strength obviously depends on the frequency, so when the advertising moves to another channel, the RSSI changes. How to deal with that is a different question.
UPDATE:
After following up on this issue, I have to update my remarks:
It is very likely that the three levels, each lasting about 5 s, are not directly caused by the beacon's slow frequency hopping, but by the Android device scanning each channel separately and switching to the next one after roughly that interval.
A way to overcome this behavior is to start and stop the scan in a loop, so that each scan lasts clearly less than 5 s. The device seems to always begin scanning on the same channel, and the scan is restarted before it can switch to a different one. With the restarts, the pattern is no longer detectable - at the cost that the channel is effectively "fixed" and may suffer interference on that frequency.
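A minimal sketch of that restart loop (my own illustration using the stock android.bluetooth.le API; the 4-second interval is an assumption chosen to stay below the observed ~5 s channel dwell time):

// Sketch: restart the BLE scan before the radio can rotate to the next advertising channel.
// Drop these members into an Activity or Service; assumes API 21+ and that the required
// Bluetooth/location permissions have already been granted.
private static final long RESTART_INTERVAL_MS = 4000; // comfortably under the ~5 s dwell time

private final Handler handler = new Handler(Looper.getMainLooper());
private final BluetoothLeScanner scanner =
        BluetoothAdapter.getDefaultAdapter().getBluetoothLeScanner();

private final ScanCallback scanCallback = new ScanCallback() {
    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        Log.d("BLE", result.getDevice().getAddress() + " RSSI=" + result.getRssi());
    }
};

private final Runnable restartScan = new Runnable() {
    @Override
    public void run() {
        scanner.stopScan(scanCallback);   // stop before the channel switch happens
        scanner.startScan(scanCallback);  // restarting appears to begin on the same channel again
        handler.postDelayed(this, RESTART_INTERVAL_MS);
    }
};

void startRestartingScan() {
    scanner.startScan(scanCallback);
    handler.postDelayed(restartScan, RESTART_INTERVAL_MS);
}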
Thanks to Airsource Ltd for bringing me back to this question.

As per Android AOSP (see "Definition of scan interval and scan window in Android source code"), the scan interval in any scanning mode is 5000 ms.
I would assume that your graph was generated by an application that used continuous scanning - i.e. a scan window of 5000 ms, which is effectively continuous.
The scanner rotates between channels 37, 38 and 39 after every scan interval, which accounts for the differences you observe. Channels 37, 38 and 39 are not contiguous in the BLE spectrum - channel 37 is at 2402 MHz whereas channel 39 is at 2480 MHz. The difference in wavelength means that the multipath fading (interference from reflections) is different on each channel (see http://www.cl.cam.ac.uk/~rmf25/papers/BLE.pdf). You say that the devices were static, so provided that nothing else was moving, the interference will also be static.
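For a sense of scale, here is a quick back-of-the-envelope calculation (my own illustration, plain Java) of the wavelengths on the two outermost advertising channels; a few millimetres of difference is enough to shift the multipath fading pattern at a fixed position:

// Wavelength lambda = c / f for the outer BLE advertising channels.
public class AdvChannelWavelength {
    public static void main(String[] args) {
        double c = 299_792_458.0;           // speed of light, m/s
        double f37 = 2402e6, f39 = 2480e6;  // channel 37 and 39 centre frequencies, Hz
        System.out.printf("ch37: %.1f mm%n", c / f37 * 1000); // ~124.8 mm
        System.out.printf("ch39: %.1f mm%n", c / f39 * 1000); // ~120.9 mm
        // The standing-wave pattern from reflections shifts by a comparable amount,
        // so a static receiver sees a different RSSI on each channel.
    }
}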
On iOS, the scan interval (foreground) is reportedly 40 ms, which means that you should not experience this precise effect.

Related

What does ScanPeriod and BetweenScanPeriod mean in AltBeacon Library?

Do the "setForegroundScanPeriod" and "setForegroundBetweenScanPeriod" in the AltBeacon library match to the Scan Window and Scan Interval of the BLE standard? Also, does this refer to a scan event for each of the 3 advertising channels or will a scan of the 3 channels occur in a scan window?
I have a beacon advertising on a single channel (CH39) every 400ms and want to set my scan window and scan interval appropriately to ensure a maximum packet reception rate with those two options that Android let you control.
I am also open to try other suggestions for that matter.
Thanks,
The setForegroundScanPeriod and setForegroundBetweenScanPeriod settings of the Android Beacon Library are a high-level concept designed to control:
battery usage
The BLE scan will be stopped during the foregroundBetweenScanPeriod, allowing you to duty cycle the scanning to save battery. This is typically set to 0 for the foreground and a much higher value (say 5-15 minutes) for the equivalent background setting. The Background setting applies when the app is not in the foreground or the screen is off, and the Foreground setting applies when the app is visible on an illuminated screen.
beacon search intervals
By default, the foregroundScanPeriod is set to 1100 ms. This means that the library will look for beacons for 1100 ms, keep track of the distinct beacons it detects in that period, then report them to the application using the library at the end of that interval (i.e. every 1100 ms). This is similar to what iOS does with its CoreLocation API at a rate of 1000 ms. The reason it defaults to 1100 instead of 1000 is that many early Android devices with BLE support could not detect more than one distinct advertisement per scan, so scanning had to be stopped and restarted every cycle to detect the same beacon again. The default cycle was set to slightly more than 1000 ms to avoid being closely synchronized with beacons advertising at a 1 Hz rate.
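For reference, these library-level periods are configured on the BeaconManager; a minimal sketch using the default values discussed above (called from an Activity or Application context):

// Sketch: configuring the Android Beacon Library's high-level scan periods.
BeaconManager beaconManager = BeaconManager.getInstanceForApplication(this);
beaconManager.setForegroundScanPeriod(1100L);      // length of each scan/report cycle, in ms
beaconManager.setForegroundBetweenScanPeriod(0L);  // no pause between foreground cycles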
The library's settings are not the same thing as the ScanWindow and ScanInterval of the BLE Standard. The library's settings are a much higher level concept.
The BLE standard concepts of ScanInterval and ScanWindow, as you state, control how quickly a device doing a BLE scan rotates between listening on each of the distinct BLE advertising channels. Unfortunately, Android APIs give you no direct control over these intervals -- they are baked into the firmware by the Android manufacturer. Further, Android gives you no APIs to determine what these are set to, or even which advertising channel the receiver was set to when an advertising packet was detected (something that has unfortunate impacts on RSSI measurements, see below.) The limited scan settings provided by Android are visible here. The SCAN_MODE_LOW_POWER vs. SCAN_MODE_LOW_LATENCY may affect these intervals (they do in the open source Android code), but again, Android manufacturers may adjust this at will. The Android Beacon Library generally uses SCAN_MODE_LOW_LATENCY except in specific background mode states.
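The only lever the raw Android API exposes is the scan mode; a minimal sketch (stock android.bluetooth.le API; scan filters omitted for brevity):

// Sketch: requesting the most aggressive scan mode Android offers. Whether this
// actually changes the firmware's ScanInterval/ScanWindow is up to the manufacturer.
ScanSettings settings = new ScanSettings.Builder()
        .setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY)
        .build();
ScanCallback scanCallback = new ScanCallback() {
    @Override
    public void onScanResult(int callbackType, ScanResult result) {
        Log.d("BLE", "RSSI=" + result.getRssi());
    }
};
BluetoothLeScanner scanner = BluetoothAdapter.getDefaultAdapter().getBluetoothLeScanner();
scanner.startScan(null, settings, scanCallback); // null = no ScanFilters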
In general, these BLE-standard settings baked into the Android firmware will be different from the library's settings mentioned above. (The BLE spec allows the ScanInterval to range from 2.5 ms to 10,240 ms.) The periods, however, vary quite a bit between different Android models. You can see the open source Android definitions in this answer, which set the ScanInterval to 5000 ms for SCAN_MODE_LOW_LATENCY, but keep in mind that each manufacturer may adjust the constants to their own liking.
Because the RSSI of an advertisement detection varies by a small but significant amount between advertising channels, you can usually derive the hard-coded ScanInterval of an Android device by plotting the RSSI of advertisements detected from an advertiser that uses all advertising channels. Plotting RSSI vs. time will show a stair-step pattern, where the width of each step is equal to the ScanInterval. On Samsung devices, the ScanInterval is close to the maximum allowed by the spec, at about 10 seconds. My anecdotal testing suggests the baked-in settings on devices from other manufacturers are typically shorter.
The inability to control the channel hopping rate on Android means that 2/3 of your advertisements will not be detected, and on Samsung devices, you will typically go 20 seconds without any detections.

Hundreds of BLE devices advertising simultaneously

If an environment has many BLE devices (200 to 500) advertising within range of a mobile device that is scanning for BLE devices, will the mobile device be able to see all of the advertising BLE devices? Does anyone know of actual testing that has been done on this scenario and can provide results of the testing?
There is no simple answer, I am afraid. Assuming all advertising devices are within range (i.e. good RSSI), there are many factors affecting reception, including the duty cycle of the scanner and the broadcast interval of the beacons; you could also choose to use certain channels for groups of advertisers, etc. Say the scanner is running flat out (100% duty cycle) and you start with sporadic broadcasts on all channels, gradually increasing the rate at which adverts are broadcast. At some point, you will hit a sweet spot - increasing the advertising rate beyond it will result in more collisions and poorer detection performance. (It's like being in a room full of people, all talking over each other, all repeating the same utterance over and over again.) The question is not so much "how many advertisers can one have?" but rather "how reliably can the advertisers be heard within a certain timeframe?", e.g. under which conditions can one detect an advertiser with 95% probability within 5 seconds.
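To make that trade-off concrete, here is a rough back-of-the-envelope model (my own simplification, not from the answer: a pure-ALOHA collision model on a single shared channel, with assumed packet airtime and advertising interval, ignoring the three-channel rotation and the scanner's duty cycle):

// Rough estimate: probability of hearing a given advertiser at least once within a
// time window, assuming independent advertisers colliding pure-ALOHA style.
public class AdvCollisionEstimate {
    public static void main(String[] args) {
        double packetMs = 0.4;    // assumed airtime of a short advertising PDU at 1 Mbit/s
        double intervalMs = 500;  // assumed advertising interval of every device
        double windowMs = 5000;   // detection window of interest
        for (int n : new int[] {100, 200, 500}) {
            // Pure ALOHA: a packet survives if no other device transmits within +/- packetMs.
            double pPacket = Math.exp(-2.0 * (n - 1) * packetMs / intervalMs);
            double attempts = windowMs / intervalMs;
            double pWindow = 1.0 - Math.pow(1.0 - pPacket, attempts);
            System.out.printf("n=%d: per-packet %.2f, within %.0f s: %.3f%n",
                    n, pPacket, windowMs / 1000, pWindow);
        }
    }
}

Real-world detection rates will be lower than this idealized model suggests, because the scanner listens on only one channel at a time and usually with a duty cycle well below 100%.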
If you can deploy multiple scanners (especially if they are spatially separated), you will gain by collating their output. In one particular experiment I carried out with two scanners, 4% of adverts were received only by scanner #1, 2% only by scanner #2, and 4% were not received at all. So scanner #1 alone received 94% and scanner #2 alone 92%, but together they received 96% of the adverts.
Here is some related work I found Bluetooth beacon density maximum.

Optimal configuration for background scan - Android / AltBeacon

My goal is to provide a new means of communication to merchants. These merchants will enter their ads on a platform and the beacons will take care of "spreading" them.
The mobile application will therefore scan for the beacons in the background (the most frequent case) and retrieve the merchants' ads based on the IDs of the discovered beacons. So I need a very regular scan so that no ads are missed.
I have already done a big part of the development; however, I do not know how to configure the scan periods.
What optimal configuration would you advise me for this case ?
Currently the application uses this configuration in the background: setBackgroundScanPeriod(2000L); setBackgroundBetweenScanPeriod(0L);
The foreground setting is the default setting. So I scan for two seconds and then start again immediately.
Thank you in advance and sorry for my english.
The default settings of the Android Beacon Library are already optimized for fast background detections on Android 5+ devices when using the BackgroundPowerSaver.
When no beacons have yet been detected, the library will do a constant low-power scan for beacons when in the background. This yields detections within 5 seconds on tested Nexus or Pixel devices.
This relies on hardware scan filters, which do not work on Android 4.3 and 4.4 devices and will not work if beacons are already in the vicinity. For those cases, background scanning falls back to cycles of 10 seconds of scanning every 300 seconds (five minutes).
While you are welcome to increase the on/off cycle from 10/300 to something more frequent, the 2000/0 setting you suggest will drain the battery on users' devices noticeably, so I recommend against it.
The defaults are designed to give optimal performance for most use cases.
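For completeness, a minimal sketch of the recommended setup that keeps the library defaults (Android Beacon Library API; the commented-out lines only show where you would override the background cycle if you really had to):

// Sketch: rely on the library defaults plus the BackgroundPowerSaver, which
// switches between the foreground and background scan periods automatically.
public class MyApplication extends Application {
    private BackgroundPowerSaver backgroundPowerSaver;

    @Override
    public void onCreate() {
        super.onCreate();
        // Simply constructing the power saver enables automatic foreground/background switching.
        backgroundPowerSaver = new BackgroundPowerSaver(this);

        // Only if a faster (and more battery-hungry) background cycle is truly needed:
        // BeaconManager beaconManager = BeaconManager.getInstanceForApplication(this);
        // beaconManager.setBackgroundScanPeriod(10000L);          // ms of scanning per cycle
        // beaconManager.setBackgroundBetweenScanPeriod(300000L);  // ms between cycles
    }
}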
It entirely depends on the capability of the device and, importantly, how much battery drain you are willing to accept. The faster the polling, the faster the device's battery drains. You also have to keep in mind that the iBeacons have their own advertising interval, which drains their batteries as well.
For example, I have set some iBeacons to an interval of 900 ms and they are still at 100% battery after one week. So it seems you can max out the iBeacons, but as for the Android application, you have to see how the battery drain develops with higher polling rates.

iBeacon Accuracy While Android Device In Motion

I am testing out a positioning system using iBeacon and Altbeacon. I have found that my triangulation results are actually pretty accurate, but sometimes it takes upwards of 5 seconds to see the proper results.
For example, say I am currently standing at Point A. Altbeacon plus my triangulation places me very close to Point A. However, when I move 5 meters away to Point B, I remain around Point A for about 6 seconds and then all of a sudden I snap into place right near Point B. Is this an issue with Altbeacon, or possibly with the communication between my iBeacons and my Android tablet?
Note: I am using a Kindle Fire 10, running FireOS 5.1.1 on top of Android. The Bluetooth iBeacon technology is BLE, and broadcasts at around 1Hz.
The time lag you describe may be caused by averaging intervals applied to the signal measurement. You do not say which scanning framework you are using, or whether you are using raw RSSI or a distance estimate as the input to your algorithm. The Android Beacon Library by default uses a 20 second averaging interval (configurable) for its distance estimates. Other frameworks use similar averaging.
Reducing the averaging interval will lessen the lag, but increase the noise as an input to your algorithm.
EDIT: To reduce the distance estimate sampling interval to 3 seconds from the default 20 seconds, call:
RunningAverageRssiFilter.setSampleExpirationMilliseconds(3000L);
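In context, that call is typically made once at startup, together with selecting the running-average filter (a minimal sketch using the Android Beacon Library API):

// Sketch: shorten the RSSI averaging window used for distance estimates.
BeaconManager.setRssiFilterImplClass(RunningAverageRssiFilter.class); // the library's default filter
RunningAverageRssiFilter.setSampleExpirationMilliseconds(3000L);      // 3 s instead of the 20 s default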
I have previously tried what you are trying to do. There were a lot of issues that made it impossible to get correct triangulation results.
Theoretically it should work, but
Practically, you will face a lot of challenges. Bluetooth beacons use the 2.4 GHz band, and almost all of them have non-directional antennas, which means you risk measuring not the direct signal from the source but a reflection of it from the beacon's surroundings.
Another factor is noise from other sources or other Bluetooth beacons in your environment.
Depending on the Android phone model, the Bluetooth receiver antenna is not necessarily mounted in the same place in the phone, which means that how you hold the phone changes the RSSI reading.
Holding the phone in your hand or near the human body can also give different readings, or no reading at all, since the human body contains water, which attenuates or blocks the Bluetooth signal.
So even if you improve the latency of the Bluetooth beacon scanning in software, these challenges will still make it almost impossible to get the right results.
I have seen a new directional Bluetooth beacon; I have not tested it yet, but it sounds like it solves some of the issues mentioned above.
What #davidgyoung wrote is correct, but it does not change the realities of the real-world scenario.
By the way, I have worked with Altbeacon - a very nice and respected tool - and I used both RSSI and distance estimates with different types of Bluetooth beacons and different phones, and it did not help much; Altbeacon is not the problem.
Regarding the university project I mentioned in my comments, we ended up using Bluetooth beacons in a different way, to help visually impaired people find directions to a target, and we published a scientific paper on it.
Finally, for inspiration on what you are doing and what I mentioned in my answer, see this video showing a triangulation experiment; the provider of the video is, by the way, also a user on Stack Overflow.
Note: my answer here focuses on the context of triangulation and the challenges that make it less than a sweet solution.

Android WiFi network latency on Galaxy S2

I am trying to estimate the network delay between two Android devices connected over WiFi in order to synchronize their clocks, but the network delay varies a lot, from 2 ms to 1024 ms. Sometimes I get delay values between 2-10 ms for 100 consecutive readings, but sometimes the values range from 2 ms to 1024 ms over 100 consecutive readings, e.g. 2, 100, 570, 640, 2, 5, 150.
I am using socket timestamps to determine the exact receive and send times of each packet. My setup uses one laptop as a WiFi access point and two mobile phones. There is not much load on the network. My question is why the delay varies little some of the time and so much at other times.
How can I make it vary less? Am I missing some configuration on Android? Please give me some possible reasons for this kind of behavior.
Unlike wired links, wireless links are affected from a lot of different factors. You can get stable results only in a sterile environment without electromagnetic or mechanical interferences.
Most latency fluctuations are a direct result of RF collisions. WiFi networks implement the CSMA/CA protocol to deal with the collisions. In general it detects whether there is any activity in the air and postpones the transmission if it's noisy.
You can try to minimize the external influences, but even so this doesn't guarantee anything:
Perform a WiFi scan and see which channels are the noisiest. Choose the least occupied one for your link. Remember, there are overlaps between channels, so moving to a different channel won't necessarily remove all the noise. See here about channel overlap and how to choose a channel: http://en.wikipedia.org/wiki/List_of_WLAN_channels
Remove mechanical obstacles from your environment.
Increase the transmission (Tx) power of your devices from their SW configuration.
Check the Quality of Service (QoS) configuration of your devices, some tweaking there might yield improvements.
I solved this problem by using WIFI_MODE_FULL_HIGH_PERF.
After acquiring this lock, I observed consistent network delays.
http://developer.android.com/reference/android/net/wifi/WifiManager.html#WIFI_MODE_FULL_HIGH_PERF
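A minimal sketch of acquiring that lock (stock WifiManager API; the lock tag is an arbitrary string):

// Sketch: hold a high-performance Wi-Fi lock while measuring latency, so the
// radio does not drop into power-save mode between packets.
WifiManager wifiManager =
        (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
WifiManager.WifiLock wifiLock =
        wifiManager.createWifiLock(WifiManager.WIFI_MODE_FULL_HIGH_PERF, "latency-test");
wifiLock.acquire();
try {
    // ... run the latency measurements ...
} finally {
    wifiLock.release();
}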
