WiFi readings unstable in Android

I am building an application that collects the RSSI levels of the APs around me and does some calculations based on that.
The problem is that the readings are unstable even when the device is still. For example, the signal from an access point fluctuates between -71, -68, -75 and so on.
Is there a way to stabilize the readings?

In telecommunications, received signal strength indicator (RSSI) is a measurement of the power present in a received radio signal.
I think the best you can do is accumulate the readings over a fixed measurement window and then average them (you will never get a 100% stable reading from an access point because of all kinds of environmental factors).
source:
http://en.wikipedia.org/wiki/Received_signal_strength_indication

"The problem is that the readings are unstable even if the device is still. For example, signals from an access point fluctuates between -71, -68,-75 and so on...."
This is the nature of the wireless signal. Even if the device is still, the environment is "moving", so the signal suffers from small-scale fading, mostly due to environmental scatterers. So it is normal to see these fluctuations in the device measurements.
The accuracy of each Android device's Wi-Fi measurement is another story.
Moreover, keep in mind that the values returned are in dBm, which is power in milliwatts expressed on a logarithmic scale. So, while averaging as Thealon proposed is sensible, be careful: you cannot average dBm values directly.
One solution is to convert the dBm values back to milliwatts, average those, and convert the result back to dBm. Like below:
Convert dBm to mW: mW = 10^(dBm/10)
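As a sketch of that averaging approach (assuming the readings are the plain integer dBm values Android reports, e.g. `ScanResult.level`):

```python
import math

def average_rssi_dbm(readings_dbm):
    """Average RSSI readings correctly: convert dBm -> mW,
    average in the linear domain, then convert back to dBm."""
    if not readings_dbm:
        raise ValueError("no readings to average")
    milliwatts = [10 ** (dbm / 10) for dbm in readings_dbm]
    mean_mw = sum(milliwatts) / len(milliwatts)
    return 10 * math.log10(mean_mw)
```

Note that for the fluctuating readings from the question, `average_rssi_dbm([-71, -68, -75])` gives about -70.5 dBm, slightly higher than the naive arithmetic mean of -71.3, because the linear average is dominated by the strongest sample.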

When getting the RSSI, Google recommends:
Use calculateSignalLevel(int, int) to convert this number into an absolute signal level which can be displayed to a user.
Official Docs

Related

Simulate beacon by iPhone and Android has big difference RSSI at the same distance

I wrote an app that detects beacons and calculates distance from RSSI.
When I use an iPhone or an Android device to simulate a beacon, there is a big difference in RSSI between them.
For example, with an iPhone 3 m from the BLE scanner the RSSI is -65, but at the same place the RSSI from Android devices is -80 or lower.
Does anyone know how to adjust the difference of RSSI?
As you have discovered, every phone model has a slightly different Bluetooth transmission power. Similarly, every phone model has a slightly different receiver sensitivity. Changing which phone model is used for transmission and reception will affect the RSSI you receive.
You cannot perfectly correct for these differences for two main reasons:
There are lots of other factors that affect RSSI and have nothing to do with the phone model (signal reflections, attenuation by the air or obstructions, the effect of a phone case, a nearby hand holding the phone, or other parts of the human body).
There are limited data on the differences between phones in terms of Bluetooth transmitter power and receiver sensitivity.
With those caveats in mind, Google did create a limited dataset of correction factors as part of its coronavirus contact tracing effort with Apple.
You can read how this works here
And see the full correction data set here
The basic idea is:
Take two values from the CSV data set above for your test setup:
Use "tx" column value for the transmitting phone model as its "TX_Power" in the equation below. Use "RSSI_correction" column value for the receiving phone.
Then plug them into this equation:
Attenuation = TX_power - (RSSI_measured + RSSI_correction)
The resulting Attenuation value is the corrected signal loss (measured in dB and usable similar to RSSI) between your transmitter and receiver and can be used as an input to distance formulas.
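The correction step above can be sketched as follows; the values in the usage note are hypothetical placeholders, not entries from the real CSV data set:

```python
def corrected_attenuation(tx_power, rssi_measured, rssi_correction):
    """Attenuation = TX_power - (RSSI_measured + RSSI_correction),
    per the calibration scheme described above. All inputs in dB/dBm;
    tx_power comes from the "tx" column for the transmitting model,
    rssi_correction from the "RSSI_correction" column for the receiver."""
    return tx_power - (rssi_measured + rssi_correction)
```

For example, with a hypothetical tx value of -22, a measured RSSI of -80, and a correction of 5, the corrected attenuation is -22 - (-80 + 5) = 53 dB.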

How to ignore invalid data from missing battery?

I made a battery bypass on an LG G4 mobile phone to measure the direct energy consumption with a power monitor. However, 70-80 seconds into the OS boot-up, the system shuts down with the message 'Invalid battery'.
I've tried several solutions, such as setting a precise output voltage as measured on the battery, cleaning the 4 battery pins, and making a tighter pin connection, but no luck.
The only hope I could find is within the android docs - Power/Component Power section:
Fake batteries can provide signals on thermistor or fuel gauge pins that mimic temperature and state of charge readings for a normal system, and may also provide convenient leads for connecting to external power supplies. Alternatively, you can modify the system to ignore the invalid data from the missing battery.
And here is my question: How do I modify the system to ignore the invalid data from the missing battery?
There isn't a reference to any article or clear explanation on how to do this. Could anyone point me in the right direction, please? Am I missing something really straightforward?

Android differentiation between GPS systems: Navstar, Glonass, BeiDou

How can I determine which GPS system a received signal comes from?
How can I differentiate between the Navstar, Glonass and BeiDou GPS systems when I receive a signal?
As far as I know, there is a way to differentiate based on the PRN number from GpsStatus: if the value is greater than or equal to 200, the system is BeiDou; if it is in the range 65 to 88, it is Glonass. (The Navstar PRN range is 1-32, maybe.)
What is the upper limit of the BeiDou PRN range?
Is there another way to determine which GPS system a signal was received from?
Thanks in advance.
BeiDou in its final configuration will consist of 35 satellites, 5 geostationary and 30 MEO (source). Some practical measurements (though not on Android) are here, showing values in the range 201–210. I have personally seen PRN 211 reported on a co-worker's Android phone.
Apparently the PRNs start at 200 or 201, hence the upper boundary is likely to be 234 or 235. Currently only 16 satellites are up, 2 of which are not operational, hence you will never encounter most of these numbers until more satellites are launched. 201–205 seems to be the range for the geostationary satellites, thus you might not encounter any of these unless you are within their coverage range.
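The PRN ranges discussed here can be sketched as a small classifier; the boundaries (especially BeiDou's upper limit of 235) are the de-facto ranges described in this answer, not an official standard:

```python
def gnss_system_from_prn(prn):
    """Classify a satellite by its PRN number, using the de-facto
    ranges discussed above: 1-32 Navstar (GPS), 65-88 Glonass,
    200-235 BeiDou. Anything else is reported as unknown."""
    if 1 <= prn <= 32:
        return "Navstar (GPS)"
    if 65 <= prn <= 88:
        return "Glonass"
    if 200 <= prn <= 235:
        return "BeiDou"
    return "unknown"
```

For instance, the PRN 211 mentioned above classifies as BeiDou, while PRN 70 classifies as Glonass.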
Edit: NMEA seems to have created a de-facto standard for satellite IDs, which pretty much matches what has been seen on Android. I have tried to put together a list of all ranges currently in use here.

Device Location Verification

How would I verify/ track device location within a 5' accuracy? I've heard of people using cell towers/ gps combinations.
As far as I know, the only way to get a 5 feet accuracy figure is to use GPS, then it still isn't always that accurate depending on how good a fix of the satellites (clear view to the sky) you have.
Cell tower / Wifi triangulation methods only serve to speed up positioning and will seldom (if ever) be more accurate than satellite positioning methods.
GPS is the way to go. Cell towers won't cut it. In Android (and I believe iOS) the system will provide you with an accuracy reading in addition to the actual location. You could use this value to determine whether the value you've received should be uploaded to your server. Keep in mind using the GPS is very battery intensive and there's no guarantee of how good the accuracy will be. Depending on conditions you may not be able to achieve 5' precision.
As CommonsWare points out, 5' is really unrealistic anyway, although you can get close.
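A minimal sketch of the gating idea above: only accept a fix for upload when the platform's reported accuracy radius is tight enough. The 10-metre default threshold is an arbitrary illustration, not a recommendation:

```python
def should_upload(accuracy_m, max_accuracy_m=10.0):
    """Accept a location fix only if its reported horizontal
    accuracy radius (metres, as from Location.getAccuracy()) is
    within the chosen threshold. None means accuracy is unknown."""
    return accuracy_m is not None and accuracy_m <= max_accuracy_m
```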
As CommonsWare says, you will not get much better than 10 metres accuracy on a consumer-grade device. Even under an open sky, the characteristics of the atmosphere change minute by minute, and that's enough to change the GPS readings.
However, it's theoretically possible to increase accuracy if you could get all of the following:
1. There are stationary GPS receiver stations with fixed, known locations which measure the current GPS signal deviation. You would need to have one of those close to you and have access to its data in real time.
2. You would need low-level access to your phone's GPS receiver to read the unprocessed data received from the satellites. This differs from device to device and, as far as I know, no supplier provides this access.
3. Finally, you would need to do all the calculations required to determine your location, applying the deviations obtained in point 1 above.
Good luck.
The only way you can get this type of accuracy is with WAAS. As far as I know, there are no Android handsets that can receive WAAS corrections. If this is for a specific controlled situation, you could use a bluetooth gps receiver with WAAS, and only in WAAS supported locations. However, if this was for a wider deployment, then I think you are out of luck.

Android - cdma snr or ber

I would like to determine via code whether the voice channel on an Android device is experiencing noise.
There is a list of functions at http://developer.android.com/reference/android/telephony/SignalStrength.html
I see a function getGsmBitErrorRate() and that will be useful when I switch to GSM.
I see the function getEvdoSnr() but I think that is only for the "data channel."
The only other CDMA function is getCdmaEcio(). How can one derive SNR from Ec/Io?
Is there some other way of deriving this value?
Review: I'm looking for a function that returns something like a bit error rate OR a signal-to-noise ratio. I found one link that implies that Ec/Io is exactly the same thing as SNR.
But other pages have indicated that Ec/Io is a measure of how much signal is available.
Ec refers to the pilot signal energy. Io refers to the noise energy. Thus, Ec/Io is the "pilot-to-noise ratio", instead of the "signal-to-noise ratio" (i.e. SNR). While, strictly speaking, the PNR is not the same as the SNR, as a practical matter you should be able to use the PNR as a proxy for the SNR. The whole point of the pilot is to give the receiver information about the signal and allow it to estimate channel effects.
This web page looks useful- Ec/Io.
Edit: I forgot that there is a parameter set by the network provider that specifies what the power of the pilot should be as compared to the actual signal. For instance, they could be the same power, in which case Ec/Io would be equal to the SNR, or the pilot could be half the power of the signal, in which case the SNR would always be 3 dB higher than Ec/Io. The pilot to signal ratio can be any of a number of different values and is, as mentioned earlier, set by the network. I don't recall what the parameter is called.
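The relationship described above can be sketched numerically. The pilot's share of the signal power (`pilot_fraction` below) is the network-configured parameter mentioned in the edit; it is not exposed by the Android API, so it is an assumed input here:

```python
import math

def snr_from_ecio(ecio_db, pilot_fraction):
    """Estimate SNR (dB) from Ec/Io (dB), given the pilot's share of
    the signal power (pilot_fraction = P_pilot / P_signal).
    SNR = Ec/Io - 10*log10(pilot_fraction): with the pilot at half
    the signal power, SNR sits ~3 dB above Ec/Io; with equal powers,
    SNR equals Ec/Io, matching the answer above."""
    offset_db = -10 * math.log10(pilot_fraction)
    return ecio_db + offset_db
```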
