I would like to determine, via code, whether the voice channel on an Android device is experiencing noise.
There is a list of functions at http://developer.android.com/reference/android/telephony/SignalStrength.html
I see a function getGsmBitErrorRate() and that will be useful when I switch to GSM.
I see the function getEvdoSnr() but I think that is only for the "data channel."
The only other CDMA-related function is getCdmaEcio(). How can one derive SNR from Ec/Io?
Is there some other way of deriving this value?
To review: I'm looking for a function that returns something like a bit error rate or a signal-to-noise ratio. I found one link that implies that Ec/Io is exactly the same thing as SNR.
But other pages have indicated that Ec/Io is a measure of how much signal is available.
Ec refers to the pilot signal energy. Io refers to the noise energy. Thus, Ec/Io is the "pilot-to-noise ratio", instead of the "signal-to-noise ratio" (i.e. SNR). While, strictly speaking, the PNR is not the same as the SNR, as a practical matter you should be able to use the PNR as a proxy for the SNR. The whole point of the pilot is to give the receiver information about the signal and allow it to estimate channel effects.
This web page looks useful: Ec/Io.
Edit: I forgot that there is a parameter set by the network provider that specifies what the power of the pilot should be as compared to the actual signal. For instance, they could be the same power, in which case Ec/Io would be equal to the SNR, or the pilot could be half the power of the signal, in which case the SNR would always be 3 dB higher than Ec/Io. The pilot to signal ratio can be any of a number of different values and is, as mentioned earlier, set by the network. I don't recall what the parameter is called.
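If you want to experiment with this, here is a rough sketch (using SignalStrength.getCdmaEcio(), which reports Ec/Io in dB*10, plus a hypothetical pilot-to-traffic offset that you would have to obtain from the network operator):

    import android.telephony.PhoneStateListener;
    import android.telephony.SignalStrength;

    public class EcIoListener extends PhoneStateListener {
        // Hypothetical offset (dB) of traffic-channel power above pilot power.
        // This is set by the network operator; 0 dB assumes they are equal.
        private static final double PILOT_TO_TRAFFIC_OFFSET_DB = 0.0;

        @Override
        public void onSignalStrengthsChanged(SignalStrength signalStrength) {
            // getCdmaEcio() reports Ec/Io in dB*10 (e.g. -90 means -9.0 dB)
            double ecIoDb = signalStrength.getCdmaEcio() / 10.0;
            // Very rough SNR estimate: shift Ec/Io by the assumed pilot offset
            double approxSnrDb = ecIoDb + PILOT_TO_TRAFFIC_OFFSET_DB;
            // Use approxSnrDb only as a proxy for voice-channel SNR
        }
    }

Register the listener with TelephonyManager.listen() using the LISTEN_SIGNAL_STRENGTHS flag to receive these callbacks.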
I wrote an app that detects beacons and calculates distance from RSSI.
When I use an iPhone or an Android phone to simulate a beacon, there is a big difference in RSSI.
For example, if I put the iPhone 3 m from the BLE scanner, the RSSI is -65; but at the same spot, the RSSI of the Android device will be -80 or even lower.
Does anyone know how to adjust for this difference in RSSI?
As you have discovered, every phone model has a slightly different Bluetooth transmission power. Similarly, every phone model has a slightly different receiver sensitivity. Changing which phone model is used for transmission and reception will affect the RSSI you receive.
You cannot perfectly correct for these differences for two main reasons:
There are lots of other factors that come into play and affect RSSI that have nothing to do with the phone model (signal reflections, attenuation by the air or obstructions, the effect of a phone case, a nearby hand holding the phone or other human body parts).
There are limited data on the differences between phones in terms of Bluetooth transmitter power and receiver sensitivity.
With those caveats in mind, Google did create a limited dataset of correction factors as part of its coronavirus contact tracing effort with Apple.
You can read how this works here.
And see the full correction data set here.
The basic idea is:
Take two values from the CSV data set above for your test setup:
Use "tx" column value for the transmitting phone model as its "TX_Power" in the equation below. Use "RSSI_correction" column value for the receiving phone.
Then plug them into this equation:
Attenuation = TX_power - (RSSI_measured + RSSI_correction)
The resulting Attenuation value is the corrected signal loss (measured in dB and usable similarly to RSSI) between your transmitter and receiver, and can be used as an input to distance formulas.
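As a minimal sketch of that calculation (the tx and RSSI_correction numbers below are placeholders; look up the real values for your two phone models in the CSV):

    public class BeaconAttenuation {
        /**
         * Corrected signal loss between transmitter and receiver, in dB.
         * txPower:        "tx" column value for the transmitting phone model
         * rssiMeasured:   raw RSSI observed by the receiving phone (dBm)
         * rssiCorrection: "RSSI_correction" column value for the receiving phone model
         */
        static double attenuation(double txPower, double rssiMeasured, double rssiCorrection) {
            return txPower - (rssiMeasured + rssiCorrection);
        }

        public static void main(String[] args) {
            // Placeholder values for illustration only
            double tx = -25.0;
            double rssiCorrection = 5.0;
            double rssiMeasured = -80.0;
            System.out.println("Attenuation (dB): " + attenuation(tx, rssiMeasured, rssiCorrection));
        }
    }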
I've seen a lot of discussions on battery usage for AltBeacon, especially when beacons stay inside a region for a long time. This post was actually very clarifying.
I am currently working on a solution that requires good sensitivity (which I define as a short detection time for a new beacon in a region).
As some beacons may be anonymous to the scanner in this particular solution (which I define as presenting unexpected MAC addresses while sharing the same matching byte sequence), I would like to achieve good sensitivity to new beacons but also a balanced battery impact on the user.
What concerns me is this: if a first beacon is found and the region triggers based on the matching sequence, how could I get a notification once another beacon approaches (or leaves)?
A guess I was going to try was to keep monitoring for a generic matching sequence and, once a beacon is found for that general sequence, range it to get its address and then create a particular region for the MAC I've captured. The only problem with this approach is how I could prevent the first beacon from continually triggering the generic region.
And just out of curiosity: is the ScanFilter class related to the hardware filters introduced in Android 5?
Thank you,
If you need to quickly find new beacons with the same byte patterns as ones that already exist in the vicinity, you really have no choice but to keep ranging.
In such a situation, there is no distinction between ranging and monitoring in terms of battery consumption. Both will require constant Bluetooth scans and decoding of all beacons in the vicinity. Scan filters (yes, these are the hardware filters introduced in Android 5) will not help, because you expect the byte patterns to be the same. There is no such thing as a "does not match" scan filter that could be used to find only new MAC addresses.
You may need to accept the battery drain of constant scans and just try to limit how long they last, if your use case allows. Short scans of 30 minutes or less might be acceptable.
You could possibly save some battery by writing your own BLE scanning parsing code tailored to this use case. You could first look for unique MAC addresses, and only do further processing and parsing if the MAC address has never been seen before. This will not reduce battery usage from the constant scan, but it would cut down on battery usage from CPU expended on parsing packets. This might save 10-30% depending on the number of beacons in the vicinity.
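As a rough sketch of that idea (using the standard android.bluetooth.le scanning API; the parsing step is left as a hypothetical placeholder), you could keep a set of MAC addresses you have already processed and skip the expensive work for repeats:

    import android.bluetooth.le.ScanCallback;
    import android.bluetooth.le.ScanResult;
    import java.util.Collections;
    import java.util.HashSet;
    import java.util.Set;

    public class NewBeaconScanCallback extends ScanCallback {
        // MAC addresses we have already parsed at least once
        private final Set<String> seenAddresses =
                Collections.synchronizedSet(new HashSet<String>());

        @Override
        public void onScanResult(int callbackType, ScanResult result) {
            String address = result.getDevice().getAddress();
            // Only spend CPU on packets from devices we have not seen before
            if (seenAddresses.add(address)) {
                byte[] scanRecord = result.getScanRecord() == null
                        ? null : result.getScanRecord().getBytes();
                handleNewBeacon(address, scanRecord);
            }
        }

        private void handleNewBeacon(String address, byte[] scanRecord) {
            // Hypothetical placeholder: parse the record, check for your
            // matching byte sequence, and notify the app of the new beacon.
        }
    }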
Bottom line: you are right to be concerned about battery usage with this use case.
I'm developing an Android application to control my quadcopter from the smartphone: I have a periodic process that sends the data acquired from the touchscreen.
The data is then received by a microcontroller, which generates a PWM command for the 4 DC motors, obtaining the duty-cycle values with a control loop that uses the received commands.
Can someone suggest a precise criterion for choosing the period of the process on the smartphone? Or is only a "trial and error" approach possible, checking the reactivity of the system?
EDIT: I have successfully implemented it by simply setting the frequency of the smartphone task to 2*control_loop_frequency.
If you knew or could measure the impulse response of the system it would be possible to determine an appropriate control loop rate; however you do not have that data and it will be confounded in any case by external factors such as wind speed and direction. Determining the rate empirically will be faster than determining the precise characteristics.
If the control is open-loop, then you probably have to ask yourself how far off the desired course you can allow the vehicle to get before a correction is applied. That will depend on the vehicle's maximum speed (in any direction).
In the end, however, Android is not a real-time operating system, so there are no guarantees that any particular periodic update will be performed precisely; it's always going to be somewhat non-deterministic. At a guess, I would imagine that such a system might manage a 10 Hz update reasonably reliably, and that would probably be sufficient for adequate control and responsiveness. If the only feedback is via the human controller's hand-eye coordination, that is perhaps the limiting factor in the system's response.
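As a minimal sketch of the approach mentioned in the question's edit (the control-loop frequency is assumed, and sendControlData() is a hypothetical method that transmits the latest touchscreen values), a fixed-rate task could look like this:

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class ControlSender {
        private static final double CONTROL_LOOP_HZ = 50.0;   // assumed microcontroller loop rate
        private static final long PERIOD_MS =
                Math.round(1000.0 / (2.0 * CONTROL_LOOP_HZ)); // send at 2x the control loop rate

        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        public void start() {
            // Android is not an RTOS, so this period is best effort, not a hard guarantee
            scheduler.scheduleAtFixedRate(() -> sendControlData(), 0, PERIOD_MS, TimeUnit.MILLISECONDS);
        }

        public void stop() {
            scheduler.shutdownNow();
        }

        private void sendControlData() {
            // Hypothetical placeholder: read the latest touchscreen values and
            // transmit them to the microcontroller (e.g. over Bluetooth or Wi-Fi)
        }
    }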
I am building an application that collects the RSSI levels of the APs around me and does some calculations based on them.
The problem is that the readings are unstable even when the device is still. For example, the signal from an access point fluctuates between -71, -68, -75 and so on.
Is there a way to stabilize the signals?
In telecommunications, received signal strength indicator (RSSI) is a measurement of the power present in a received radio signal.
I think the best you can do is add the readings together and divide by their count after a given amount of measurement time, i.e. average them (since you will never get 100% accurate values from an access point because of all kinds of factors).
source:
http://en.wikipedia.org/wiki/Received_signal_strength_indication
"The problem is that the readings are unstable even if the device is still. For example, signals from an access point fluctuates between -71, -68,-75 and so on...."
This is the nature of the wireless signal. Even if the device is still, the environment is "moving", so the signal suffers from small-scale fading, mostly due to scatterers in the environment. It is therefore normal to see these fluctuations in the device's measurements.
The accuracy of each Android device's Wi-Fi measurement is another story.
Moreover, keep in mind that the values returned are in dBm, which means they are milliwatts expressed on a log scale. So, regarding the averaging Thealon proposed, you have to be careful, because you cannot directly average dBm values.
One solution would be to convert the dBm values back to mW and then perform the averaging, like below.
Convert dBm to mWatt: mWatt = 10^(dBm/10)
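A minimal sketch of that averaging (assuming the readings are available as a plain list of dBm values; names are illustrative):

    import java.util.Arrays;
    import java.util.List;

    public class RssiAveraging {
        /**
         * Averages RSSI readings correctly: convert dBm to mW, average in the
         * linear domain, then convert the mean back to dBm.
         */
        static double averageDbm(List<Double> dbmReadings) {
            double sumMilliwatts = 0.0;
            for (double dbm : dbmReadings) {
                sumMilliwatts += Math.pow(10.0, dbm / 10.0);   // dBm -> mW
            }
            double meanMilliwatts = sumMilliwatts / dbmReadings.size();
            return 10.0 * Math.log10(meanMilliwatts);          // mW -> dBm
        }

        public static void main(String[] args) {
            // Example readings from the question
            System.out.println(averageDbm(Arrays.asList(-71.0, -68.0, -75.0)));
        }
    }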
When getting the RSSI, Google recommends the following:
Use calculateSignalLevel(int, int) to convert this number into an absolute signal level which can be displayed to a user.
Official Docs
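For display purposes, a small sketch of that recommendation (the number of levels here is an arbitrary choice):

    import android.net.wifi.WifiManager;

    public class SignalLevelExample {
        // Maps a raw RSSI value (dBm) onto a few buckets, which hides
        // minor fluctuations when showing signal strength to the user
        static int toDisplayLevel(int rssiDbm) {
            int numLevels = 5;  // arbitrary for this example
            return WifiManager.calculateSignalLevel(rssiDbm, numLevels);
        }
    }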
I "play" with google/glm/mmap in order to find the coordinates of a cell. I note that the returned values can changed (queries on several days).
Any explanation ?
It appears that the coordinates returned are a weighted average of locations where cell phones reported connecting to that base station. That is, they are not the coordinate of the base station antenna. This is actually better for geolocation purposes since you want to know where the cell phone is, not where the base station is. One of the parameters returned with the coordinates is a measure of the "spread" of the reports --- which gives one some idea of how accurate the geo-location is likely to be. It appears that as reports continue to come in to mmap, the data base is updated and so changes are possible over time, usually rather minor.
Note that, in the case of CDMA at least, the true location of the base station antenna can be obtained using the getBaseStationLatitude() and getBaseStationLongitude() methods on the CellLocation, e.g. in the onCellLocationChanged() callback. This is not supported by all carriers: U.S. Cellular does provide this information, but Verizon Wireless sadly does not. Femtocells do provide the information using their built-in GPS. Again, depending on your application, knowing the base station location may not be what is required.
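A hedged sketch of reading those methods (assuming a CDMA device and a carrier that reports the values; per the CdmaCellLocation documentation the units are 0.25 seconds of arc, so they need converting to degrees):

    import android.telephony.CellLocation;
    import android.telephony.PhoneStateListener;
    import android.telephony.cdma.CdmaCellLocation;

    public class BaseStationListener extends PhoneStateListener {
        @Override
        public void onCellLocationChanged(CellLocation location) {
            if (location instanceof CdmaCellLocation) {
                CdmaCellLocation cdma = (CdmaCellLocation) location;
                int latQuarterSeconds = cdma.getBaseStationLatitude();
                int lonQuarterSeconds = cdma.getBaseStationLongitude();
                // Integer.MAX_VALUE means the carrier does not report the value
                if (latQuarterSeconds != Integer.MAX_VALUE
                        && lonQuarterSeconds != Integer.MAX_VALUE) {
                    // 3600 * 4 quarter-seconds of arc per degree
                    double latDegrees = latQuarterSeconds / (3600.0 * 4.0);
                    double lonDegrees = lonQuarterSeconds / (3600.0 * 4.0);
                    // latDegrees/lonDegrees now hold the reported antenna position
                }
            }
        }
    }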