How to get RSCP, SINR and EcNo values in Android

I need to find RSCP, SINR and EcNo. So far I am able to calculate RSRP and RSRQ values from android.telephony.SignalStrength.
My questions are:
When I try to get SignalStrength#getTdScdmaDbm() via reflection, it returns Integer.MAX_VALUE (if I debug at line 300, its value is 0).
According to AOSP RIL (Radio Interface Layer)
The Received Signal Code Power in dBm multiplied by -1. Range:
25 to 120; INT_MAX: 0x7FFFFFFF denotes an invalid value. Reference:
3GPP TS 25.123, section 9.1.1.1
Is there any other way to calculate RSCP?
EcNo = RSCP / RSSI, where RSCP is unknown, so I cannot calculate Ec/No.
SINR = 1 / (1 / (12 · RSRQ) - x), where x = RE / RB; both the number of Resource Elements (RE) and Resource Blocks (RB) are unknown.
Do I need to write native code to find/calculate these values, or is there any other way to achieve this?
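For what it's worth, here is a minimal reflection sketch in Java of the getTdScdmaDbm() approach with the INT_MAX guard from the RIL comment above. This is an assumption, not a guaranteed recipe: getTdScdmaDbm() is a hidden method, so it may be missing, blocked, or only ever report the invalid marker on a given device/ROM.

import android.telephony.SignalStrength;
import java.lang.reflect.Method;

public class RscpReader {
    // Returns the TD-SCDMA RSCP in dBm, or null if the radio reports no valid value.
    public static Integer tryGetTdScdmaRscp(SignalStrength signalStrength) {
        try {
            Method m = SignalStrength.class.getMethod("getTdScdmaDbm");
            int value = (Integer) m.invoke(signalStrength);
            // 0x7FFFFFFF (Integer.MAX_VALUE) is the RIL marker for "invalid".
            if (value == Integer.MAX_VALUE) {
                return null;
            }
            return value;
        } catch (Exception e) {
            return null; // method missing or blocked on this ROM
        }
    }
}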

Related

BLE Heart Rate Sensor Value Interpretation

I have an Android App where I get Heart Rate Measurements from a Polar H10 Device.
I'm totally lost on how to interpret the heart rate. Various links to the bluetooth.com site are resulting in 404 errors unfortunately.
The characteristic's value is, for example:
[16, 59, 83, 4]
From what I understood, the second byte (59) is the heart rate in BPM. But this does not seem to be decimal, as the value goes up to 127 and then continues with -127, -126, -125, ... It is not hex either.
I tried (in kotlin)
characteristic.value[1].toUInt()
characteristic.value[1].toInt()
characteristic.value[1].toShort()
characteristic.value[1].toULong()
characteristic.value[1].toDouble()
All values freak out as soon as the -127 appears.
Do I have to convert the 59 to binary (59 = 111011) and read it from there? Please give me some insight.
### Edit (12th April 2021) ###
What I do to get those values is a BluetoothDevice.connectGatt().
Then hold the GATT.
In order to get heart rate values I look for
Service 0x180d and its
characteristic 0x2a37 and its only
descriptor 0x2902.
Then I enable notifications by setting 0x01 on the descriptor. I then get ongoing events in the GattClientCallback.onCharacteristicChanged() callback. I will add a screenshot below with all data.
From what I understood, the response should be 6 bytes long instead of 4, right? What am I doing wrong?
In the picture you see the characteristic at the very top. It is linked to the service 180d, and the characteristic holds the value with 4 bytes at the bottom.
See Heart Rate Value in BLE for the links to the documents. As in that answer, here's the decode:
Byte 0 - Flags: 16 (0001 0000)
Bits are numbered from LSB (0) to MSB (7).
Bit 0 - Heart Rate Value Format: 0 => UINT8 beats per minute
Bit 1-2 - Sensor Contact Status: 00 => Not supported or detected
Bit 3 - Energy Expended Status: 0 => Not Present
Bit 4 - RR-Interval: 1 => One or more values are present
So the heart rate is a UINT8 at offset 1, and the next two bytes are an RR interval.
To read this in Kotlin:
characteristic.getIntValue(FORMAT_UINT8, 1)
This returns a heart rate of 59 bpm.
And ignore the other two bytes unless you want the RR.
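To make the flag handling concrete, here is a minimal Java sketch (my own illustration, not taken from any particular library) that follows the decode above: bit 0 of the flags selects UINT8 vs UINT16 for the heart rate field, and masking with 0xFF avoids Java's signed-byte pitfall.

// data is the raw characteristic value, e.g. {16, 59, 83, 4}.
public static int parseHeartRate(byte[] data) {
    int flags = data[0] & 0xFF;            // e.g. 16 = 0001 0000
    boolean uint16 = (flags & 0x01) != 0;  // bit 0: Heart Rate Value Format
    if (uint16) {
        // UINT16, little-endian, at offsets 1..2
        return (data[1] & 0xFF) | ((data[2] & 0xFF) << 8);
    }
    return data[1] & 0xFF;                 // UINT8 at offset 1
}

For the example value {16, 59, 83, 4} this returns 59 bpm; the trailing bytes 83 and 4 form the RR interval (a little-endian UINT16 in units of 1/1024 s).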
It seems I found a way, by retrieving the value as follows:
val hearRateDecimal = characteristic.getIntValue(BluetoothGattCharacteristic.FORMAT_UINT8, 1)
Two things are important:
first - the format of UINT8 (although I don't know when to use UINT8 and when UINT16; actually I thought I needed to use UINT16, as the first byte is actually 16 - see the question above)
second - the offset parameter 1
What I now get is an Integer even beyond 127 -> 127, 128, 129, 130, ...
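The jump from 127 to -127 in the earlier attempts is simply the language treating the byte as signed two's complement; masking with 0xFF (which is, in effect, what reading it as FORMAT_UINT8 does) recovers the unsigned 0-255 range. A tiny Java illustration:

byte raw = (byte) 0x81;      // bit pattern 1000 0001
int asSigned = raw;          // -127: bytes are signed in Java/Kotlin
int asUnsigned = raw & 0xFF; // 129: mask to read it as unsigned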

GSM RSSI and LTE RSSI and RSRP

I am looking to create an app that gets information about the phone's connection to the cellular network.
My understanding is that RSSI is a measure of cellular signal with GSM and RSRP is a good measure for LTE.
To keep it consistent, is it possible to get a RSSI measure for LTE?
I am confused about which classes to use to get some of this information. At the moment, I am using the phone state listener, which gives me a SignalStrength object. Using this object, I can call the toString() method, which provides the following information when I split it. I am a little confused about what some of this means.
String ssignal = signalStrength.toString();
String[] parts = ssignal.split(" ");
The parts[] array will then contain these elements:
parts[0] = "Signalstrength:" _ignore this, it's just the title_
parts[1] = GsmSignalStrength
parts[2] = GsmBitErrorRate
parts[3] = CdmaDbm
parts[4] = CdmaEcio
parts[5] = EvdoDbm
parts[6] = EvdoEcio
parts[7] = EvdoSnr
parts[8] = LteSignalStrength
parts[9] = LteRsrp
parts[10] = LteRsrq
parts[11] = LteRssnr
parts[12] = LteCqi
parts[13] = gsm|lte|cdma
parts[14] = _not really sure what this number is_
What is parts[8] providing? RSSI?
Also, when you look at the signal strength in the Android settings, it gives you the RSSI for GSM. When connected to LTE, is it giving us the RSRP or RSSI? It seems it's providing RSRP.
My understanding is that parts[1] provides the RSSI when connected on GSM. However, I am unsure about, and interested in, parts[2] (what is the rate measured against? what unit of time?), parts[8] (what does it measure exactly?), and parts[10] and parts[11] (what unit are they measured in, and what is the range?).
I understand this thread is all over the place. Hopefully it makes a little bit of sense and someone can clear something up.
Cheers guys!
To put it simply, RSSI and RSRP are signal level measurements for GSM and LTE, respectively. They are not exactly the same, because GSM and LTE are very different technologies. However, they both indicate the same type of information. RSRP holds no meaning in GSM and RSSI means something different in LTE.
This question may be worth reading:
How to get LTE signal strength in Android?
Most of what you are looking for, I was able to find here: https://developer.android.com/reference/packages.html
GsmSignalStrength - GSM Signal Strength, valid values are (0-31, 99) as defined in TS 27.007 8.5
GsmBitErrorRate - GSM bit error rate (0-7, 99) as defined in TS 27.007 8.5
CdmaDbm - CDMA RSSI value in dBm
CdmaEcio - CDMA Ec/Io value in dB*10
EvdoDbm - EVDO RSSI value in dBm
EvdoEcio - EVDO Ec/Io value in dB*10
EvdoSnr - Signal to noise ratio. Valid values are 0-8. 8 is the highest.
I could not locate the following, but here is what I suspect:
LteSignalStrength - LTE Signal Strength in ASU (0-31, 99)
LteRsrp - LTE RSRP value in dBm
LteRssnr - LTE SINR value in dB
LteCqi - LTE CQI (no units)
gsm|lte|cdma - Network type
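Note that the toString() field order above is not a stable API; it varies across Android versions and OEM builds. A less fragile sketch using the public CellInfo API follows (it assumes the ACCESS_FINE_LOCATION permission is granted; getRsrp()/getRsrq() need API 26+ and getRssi() needs API 29+):

import android.telephony.CellInfo;
import android.telephony.CellInfoLte;
import android.telephony.CellSignalStrengthLte;
import android.telephony.TelephonyManager;
import java.util.List;

public static void logLteSignal(TelephonyManager tm) {
    List<CellInfo> cells = tm.getAllCellInfo(); // needs location permission
    if (cells == null) return;
    for (CellInfo info : cells) {
        if (info instanceof CellInfoLte && info.isRegistered()) {
            CellSignalStrengthLte lte = ((CellInfoLte) info).getCellSignalStrength();
            // RSRP/RSRQ are the LTE "signal level" measures; RSSI is the
            // wideband received power over the whole channel.
            android.util.Log.d("Signal", "RSRP=" + lte.getRsrp() + " dBm"
                    + " RSRQ=" + lte.getRsrq() + " dB"
                    + " RSSI=" + lte.getRssi() + " dBm"); // getRssi(): API 29+
        }
    }
}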

Android Beacon Library Eddystone Telemetry. Temperature

Android Vers. > 4.3
Standard Android Beacon Library
Estimote Beacons.
Eddystone-UID package
Telemetry package.
I'm trying to read the temp sensor transmission from the Telemetry package of an Eddystone-UID package transmission. I can successfully read the beacon.getExtraDataFields().get(2) data for the temperature transmission, as per the Eddystone [Telemetry] examples in the Android Beacon Library. This data prints as a 4- or 5-digit number depending on the temp.
I'm informed by the same that the beacon's temp sensor transmits an 8:8 fixed-point number. Reading beacon.getExtraDataFields().get(2) and then dividing by 256, I get the temperature reading in Celsius. However, as soon as the temp crosses 0 degC into the negative, I get large discrepancies. Research among the forums seems to indicate that it's to do with signed 8:8 fixed-point math and its conversion to decimal. Although I understand the 8:8 fixed-point notation concept, I can't seem to find a reference on how to read a negative fixed-point value and convert it to negative degC using the Android Beacon Library methods.
[Note: Estimote's Android SDK and their beacon app had the same problem ... they fixed this by updating their SDK ... I'm using Android Library and not Estimote SDK]
Guidance will be most appreciated.
The code below is used to convert the encoded Eddystone telemetry temperature field to degrees celsius. This is taken from the Locate Android app, which also uses the Android Beacon Library. This code has been tested with Eddystone beacons from Radius Networks.
// The upper byte of the 16-bit temperature field is the integer part.
long unsignedTemp = (beacon.getExtraDataFields().get(2) >> 8);
// Values above 128 represent negative temperatures (two's complement of the
// upper byte); otherwise add the fractional lower byte divided by 256.
double temperature = unsignedTemp > 128 ?
        unsignedTemp - 256 :
        unsignedTemp + (beacon.getExtraDataFields().get(2) & 0xff) / 256.0;
You can try this conversion formula with the beacons you have on hand. If you find that it doesn't work, also try the Locate app to make sure you see the same thing. If that is the case, it may be that the encoded value is not fully compliant with the Eddystone spec.
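An equivalent way to view the conversion: the Eddystone-TLM temperature is a signed 8.8 fixed-point value, so you can reinterpret the low 16 bits of the extra data field as a signed 16-bit integer and divide by 256. A minimal sketch (my own, not taken from the library):

public static double decodeTlmTemperature(long rawField) {
    short fixedPoint = (short) (rawField & 0xFFFF); // reinterpret as signed 16-bit
    return fixedPoint / 256.0;                      // 8.8 fixed point -> degrees Celsius
}

For example, a raw field of 0xFF00 decodes to -1.0 degC, and 0x1480 decodes to 20.5 degC.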
I use this method with davidgyoung's conversion formula to retrieve the temperature from a beacon, while casting to a float with two decimals:
public static float getTemperatureFromBeacon(Beacon beacon) {
    long unsignedTemp = (beacon.getExtraDataFields().get(2) >> 8);
    double temperatureDouble = unsignedTemp > 128 ?
            unsignedTemp - 256 :
            unsignedTemp + (beacon.getExtraDataFields().get(2) & 0xff) / 256.0;
    // Round to two decimal places.
    float temperature = (float) Math.round(temperatureDouble * 100) / 100;
    return temperature;
}

How to convert WiFi level (i.e. -45, -88) into percentage?

How to convert WiFi level (i.e. -45, -88) into a percentage?
I want to convert the WiFi level to a percentage. I get the WiFi level via level (in dBm format).
I have googled a lot but have not found a proper answer.
The problem with this is that it is very dependent on the receiving antenna. Some antennas register no usable signal at -90 dBm, some already at -80 dBm. You will have a hard time finding 0% (100% strictly being 0 dBm).
I have created a Wifi scanner application where I use -100dBm as 0% and 0dBm as 100%, in Java it turns into something like this (MIN_DBM being -100):
public int getPowerPercentage(int power) {
    int i = 0;
    if (power <= MIN_DBM) {
        i = 0;
    } else {
        // Linear mapping with MIN_DBM = -100: -100 dBm -> 0%, 0 dBm -> 100%.
        i = 100 + power;
    }
    return i;
}
This is what Microsoft does for dBm <> percent conversion:
https://stackoverflow.com/a/15798024/2096041
Basically, -100 .. -50 dBm maps linearly to 0 .. 100 %.
Like MS, I would prefer to stay on the safe side and not map the full -100 .. 0 dBm range to 0 .. 100 %, as some answers here suggest.
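In Java, that Microsoft-style mapping might look like this (a sketch of the linked answer's formula):

public static int dbmToPercent(int dbm) {
    if (dbm <= -100) return 0;  // at or below -100 dBm: 0%
    if (dbm >= -50) return 100; // at or above -50 dBm: 100%
    return 2 * (dbm + 100);     // linear in between
}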
The WifiManager class has a function calculateSignalLevel, but as it states here, it results in an error if numLevels is greater than 45. A possible workaround could be something like this:
double percentage = WifiManager.calculateSignalLevel(rssi, 40) * 2.5;
but of course, this will be in steps of 2.5 percent - I don't know your use case, but maybe this is sufficient.
As others have stated, calculating percentages is problematic, and there's no simple precise solution for that.
You could derive the percentage from the signal-to-noise ratio, rather than the signal intensity alone, if this information is available. This is probably the desired metric.
An android.net.wifi.ScanResult does not publish the necessary information (as of Dec 2012), but you might be able to get this information through other means.
Signal = Noise => unusable signal, so you could set 0 dB SnR = 0%. Also, you could set 10 dB SnR to 90% (90% of the signal power is not drowned out in noise), and 100% = no noise at all. More generally,
p = 100% * (1 - 10^(-SnR / 10 dB))
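In code, the same formula (a sketch; the SnR value would have to come from a source other than ScanResult, as noted above):

// 0 dB -> 0%, 10 dB -> 90%, large SnR -> approaches 100%
public static double snrToPercent(double snrDb) {
    return 100.0 * (1.0 - Math.pow(10.0, -snrDb / 10.0));
}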

Polar Wearlink Bluetooth packet

I am looking at the code of a project called MyTracks:
http://code.google.com/r/jrgert-polar-bluetooth/source/browse/MyTracks/src/com/google/android/apps/mytracks/services/sensors/PolarMessageParser.java?r=ebc01faf49550bc9801633ff38bb3b8ddd6f5698
Now I am having problems with the method isValid(byte[] buffer). I don't understand what exactly he is checking here. We want to know if the first byte in the array is the header containing 0xFE. I don't quite understand the following lines:
boolean goodHdr = ((buffer[0] & 0xFF) == 0xFE);
boolean goodChk = ((buffer[2] & 0xFF) == (0xFF - (buffer[1] & 0xFF)));
return goodHdr && goodChk;
Any ideas?
Ewoks is correct, refer to this blog post:
http://ww.telent.net/2012/5/3/listening_to_a_polar_bluetooth_hrm_in_linux
"Digging into src/com/google/android/apps/mytracks/services/sensors/PolarMessageParser.java we find a helpful comment revealing that, notwithstanding Polar's ridiculous stance on giving out development info (they don't, is the summary) the Wearlink packet format is actually quite simple.
Polar Bluetooth Wearlink packet example
Hdr - Len - Chk - Seq - Status - HeartRate - RRInterval_16-bits
FE - 08 - F7 - 06 - F1 - 48 - 03 64
where
Hdr always = 254 (0xFE),
Chk = 255 - Len
Seq range 0 to 15
Status = Upper nibble may be battery voltage
bit 0 is Beat Detection flag."
& 0xFF simply converts the signed byte to an unsigned int for the comparison.
The first line checks whether the received buffer starts with 0xFE, as it should for this Polar Wearlink.
The second line checks that the length byte is correct as well, because by specification its value is 255 minus the value written in the size byte.
Together this is a super simple verification that the messages are correct (a more complicated implementation would include a CRC or other verification method). Cheers
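Putting the quoted format together, a minimal Java sketch of validating one Wearlink packet and reading the heart rate byte, with offsets taken from the example above:

// Sketch: validate one Polar Wearlink packet and read the heart rate.
// Example packet: FE 08 F7 06 F1 48 03 64
//                 Hdr Len Chk Seq Status HR RR_hi RR_lo
public static int parsePolarHeartRate(byte[] buf) {
    boolean goodHdr = (buf[0] & 0xFF) == 0xFE;                     // Hdr always 0xFE
    boolean goodChk = (buf[2] & 0xFF) == (0xFF - (buf[1] & 0xFF)); // Chk = 255 - Len
    if (!goodHdr || !goodChk) return -1;                           // invalid packet
    return buf[5] & 0xFF;                                          // HeartRate as unsigned byte
}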
