I've seen numerous questions/answers showing how to get temperature information from an Android device - using this approach:
int zoneNumber = 0; // usually 0 or 1
String temperatureFileLocation = "/sys/devices/virtual/thermal/thermal_zone" + zoneNumber + "/temp";
File temperatureFile = new File(temperatureFileLocation);
Scanner scanner = new Scanner(temperatureFile);
double temperatureC = scanner.nextFloat(); // degrees C
...
scanner.close(); // in a finally block
I wasn't really sure what each zone is for (i.e., in which part of the device the sensor is located), but I just discovered that there is also a file that describes the type of each zone - for example:
String zoneTypeFileLocation = "/sys/devices/virtual/thermal/thermal_zone" + zoneNumber + "/type"; // NB - that's "/type", not "/temp"!
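For context, here is a minimal sketch (assuming the standard sysfs layout; not from the original question) that walks every zone and pairs each type with its current reading. Note that many drivers report millidegrees Celsius, so a division by 1000 may be needed depending on the device:

File thermalDir = new File("/sys/devices/virtual/thermal");
File[] zones = thermalDir.listFiles((dir, name) -> name.startsWith("thermal_zone"));
if (zones != null) {
    for (File zone : zones) {
        try (Scanner typeScanner = new Scanner(new File(zone, "type"));
             Scanner tempScanner = new Scanner(new File(zone, "temp"))) {
            String type = typeScanner.next();      // e.g. "mtktscpu"
            long rawTemp = tempScanner.nextLong(); // units vary by driver
            Log.d("Thermal", zone.getName() + " (" + type + "): " + rawTemp);
        } catch (FileNotFoundException e) {
            // some zones may not be readable without root
        }
    }
}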
Now, when using Scanner to read in what type each zone is, I get values back such as this:
mtktswmt
mtktscpu
mtktspmic
mtktspa
mtktsabb
mtktsbattery
tsen_max
sec-fuelguage
Can anyone explain what locations/components all these zone names are actually referring to?
(Ideally, I would like to obtain the temperature of the device's NFC hardware.)
I guess those are the hardware thermal sensors of the phone. They report the temperature of the given zones while the device is running, for example during benchmarks. Specifically:
mtktswmt is the WiFi chip temperature zone.
mtktscpu is the CPU temperature zone.
mtktspmic is the multi-IO and regulator chip temperature zone.
mtktspa is the thermal sensor MD1.
mtktsabb is the processor temperature zone.
mtktsbattery is the battery temperature zone.
tsen_max is the maximum temperature sensor capacity (I don't know for sure).
sec-fuelguage is the fuel gauge chip.
The mtkt prefix is just the name of the maker - in this case, MediaTek.
That's pretty hardcore hardware stuff, actually used by the makers of the Android phone (I guess). Even the data above was found by searching the Google Android Open Source Project, where the values appear in kernel drivers. So it's pretty low-level hardware to be playing with.
For a supported API that gives you the results you want, try HardwarePropertiesManager.
I hope it helps.
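For illustration, a minimal sketch of that API (API 24+; not from the original answer). Note that getDeviceTemperatures() is restricted: it throws a SecurityException unless the caller is a device/profile owner or has system privileges:

HardwarePropertiesManager hpm =
        (HardwarePropertiesManager) context.getSystemService(Context.HARDWARE_PROPERTIES_SERVICE);
// current CPU temperatures in degrees Celsius, one entry per sensor
float[] cpuTemps = hpm.getDeviceTemperatures(
        HardwarePropertiesManager.DEVICE_TEMPERATURE_CPU,
        HardwarePropertiesManager.TEMPERATURE_CURRENT);
for (float t : cpuTemps) {
    Log.d("Thermal", "CPU temperature: " + t + " C");
}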
Related
I am currently developing a driver for an ambient light sensor on a Qualcomm Snapdragon 888 platform.
When I interact with the device through the Android sensor framework I only get the lux value: in the onSensorChanged callback, the values array of the SensorEvent has size 1, even though the sensor pushes several data items along with the lux measurement.
Looking at the proprietary vendor implementation of the HAL, I can clearly see that multiple data items are pushed to the HAL event message queue, but only the lux value is forwarded by the framework.
I guess that somewhere in AOSP the additional information (raw data, in my case) is discarded or ignored, and I can't find where in the codebase this is done.
To summarize, I would like to know which location has to be patched in order to keep this information and be able to use it at the application level.
While doing some research I came across this question where users were using some additional info forwarded by the sensor framework for the light sensor: Reading Android RGB light sensor - Galaxy S5
Thank you!
It took me some time, but I managed to get it working by applying the following patches:
In frameworks/base/core/java/android/hardware/Sensor.java, update the entry of your sensor in the sSensorReportingModes array with the desired data length:
private static final int[] sSensorReportingModes = {
    ...
    2, // SENSOR_TYPE_LIGHT
    ...
};
In hardware/interfaces/sensors/1.0/default/convert.cpp, update the following methods accordingly:
void convertFromSensorEvent(const sensors_event_t &src, Event *dst) {
    ...
    case SensorType::LIGHT:
        dst->u.data[0] = src.data[0];
        dst->u.data[1] = src.data[1];
        break;
    ...
}

void convertToSensorEvent(const Event &src, sensors_event_t *dst) {
    ...
    case SensorType::LIGHT:
        dst->data[0] = src.u.data[0];
        dst->data[1] = src.u.data[1];
        break;
    ...
}
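For completeness, a minimal app-side sketch (not part of the original patches) showing how the extra value surfaces once the framework forwards two values for SENSOR_TYPE_LIGHT; what values[1] actually contains depends on the vendor HAL:

SensorEventListener listener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float lux = event.values[0];
        float raw = event.values[1]; // only present on the patched build
        Log.d("LightSensor", "lux=" + lux + " raw=" + raw);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};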
I have spent a lot of time trying to find my mistake while working with the PIC16F688. The PIC has been flashed successfully using PICkit2. I'm using the PIC to convert an analog pressure sensor signal to digital output and send the data via Bluetooth, but the Bluetooth side is not receiving a stable stream of readings. Each reading consists of a 4-character decimal number between 0 and 1023.
The problem is that the receiver never settles on a specific number; instead, the 4 digits arrive scrambled, seemingly at random.
I think my mistake is in the configuration of the internal oscillator.
I'm attaching my code. It is written for a FlexiForce sensor circuit that outputs an analog voltage of up to 5 V, and the PIC's job is to convert it to digital as mentioned above.
It might also be that my wiring is not correct - please help me out with this one.
Also, what configuration do I need to choose under "edit project" in the mikroC PRO software?
I used the "Bluetooth terminal" app to view the data arriving asynchronously over Bluetooth.
Thank you.
char temp[5] = "0000"; // writable buffer (modifying a string literal via char* is undefined behavior)
unsigned int adc_value;
char uart_rd;
int i;

void main()
{
    OSCCON = 0x77;      // internal oscillator, 8 MHz
    ANSEL = 0b00000100; // AN2 analog, everything else digital
    CMCON0 = 0x07;      // comparators off
    TRISA = 0b00001100; // RA2, RA3 as inputs
    UART1_Init(9600);
    Delay_ms(100);

    while (1)
    {
        adc_value = ADC_Read(2);
        temp[0] = adc_value/1000 + 48;     // thousands digit -> ASCII
        temp[1] = (adc_value/100)%10 + 48; // hundreds digit
        temp[2] = (adc_value/10)%10 + 48;  // tens digit
        temp[3] = adc_value%10 + 48;       // units digit
        for (i = 0; i < 4; i++)
            UART1_Write(temp[i]);
        UART1_Write(13);                   // carriage return
        Delay_ms(1000);
    }
}
You can use the itoa function to convert the ADC integer value to characters for sending over UART. If there is an error in the calculation, you won't get the correct value. Below is a code snippet for your reference:
while (1)
{
    adc_value = ADC_Read(2);
    itoa(adc_value, temp, 10);        // itoa as provided by the toolchain; yields a NUL-terminated string
    for (i = 0; temp[i] != '\0'; i++) // write only the digits actually produced
        UART1_Write(temp[i]);
    UART1_Write(13);
    Delay_ms(1000);
}
Also check that the baud rate configured at both ends is the same. If the baud rates don't match, you will get random values at the Bluetooth terminal where you are reading them.
What I would suggest: if you have a logic analyser, hook it up. If you don't, recalculate your oscillator speed against the datasheet; it could be that the internal oscillator is simply not accurate enough. What also works is to write a function that waits a known time (by copy-pasting a lot of NOPs) and using it to blink a LED; then start a stopwatch and count, say, 100 blinks. This is what I used to do before I had a logic analyser (they are quite cheap on eBay).
I've got a simple iOS app which displays the proximity of the Bluetooth LE beacons it detects, using expressions such as "immediate", "near", etc., and I need to write something similar on Android.
I've followed the tutorial at Android Developers and I'm able to list detected devices; now I want to estimate the distance/proximity, and this is where it becomes a problem. According to this SO thread it's just a handful of mathematical calculations. However, they require me to provide a txPower value.
According to this tutorial by Dave Smith (and cross-referencing with this Bluetooth SIG statement), it should be broadcast by the beacon devices as an "AD structure" of type 0x0A. So what I do is parse the AD structures and look for the payload of the one that matches the type.
Problem: I've got 4 beacons - 2 estimotes and 2 appflares. The estimotes don't broadcast the txPower at all and the appflares broadcast theirs as 0.
Is there anything I'm missing here? The iOS app seems to be handling it all without any problem, but using the iOS SDK it does it behind the scenes so I'm not sure how to produce the exact same or similar behaviour. Is there any other way I could solve my problem?
In case you'd like to take a look at the code I'm using to parse the AD structures, it's taken from the aforementioned Dave Smith's github and can be found here. The only change I made to that class was adding the following method:
public byte[] getData() {
    return mData;
}
And this is how I handle the callback from the scans:
// Prepare the callback for the BLE device scan
this.leScanCallback = new BluetoothAdapter.LeScanCallback() {
    @Override
    public void onLeScan(final BluetoothDevice device, int rssi, byte[] scanRecord) {
        if (!deviceList.contains(device)) {
            MyService.this.deviceList.add(device);
            Log.e("Test", "Device: " + device.getName());
            List<AdRecord> adRecords = AdRecord.parseScanRecord(scanRecord);
            for (AdRecord adRecord : adRecords) {
                if (adRecord.getType() == AdRecord.TYPE_TRANSMITPOWER) {
                    Log.e("Test", "size of payload: " + adRecord.getData().length);
                    Log.e("Test", "payload: " + Byte.toString(adRecord.getData()[0]));
                }
            }
        }
    }
};
And what I see in the console is:
04-01 11:33:35.864: E/Test(15061): Device: estimote
04-01 11:33:36.304: E/Test(15061): Device: estimote
04-01 11:33:36.475: E/Test(15061): Device: n86
04-01 11:33:36.475: E/Test(15061): size of payload: 1
04-01 11:33:36.475: E/Test(15061): payload: 0
04-01 11:33:36.525: E/Test(15061): Device: f79
04-01 11:33:36.525: E/Test(15061): size of payload: 1
04-01 11:33:36.525: E/Test(15061): payload: 0
The txPower mentioned by @davidgyoung is given by the formula:
RSSI = -10 * n * log10(d) + A
where
d = distance
A = txPower (the RSSI at 1 m)
n = signal propagation constant
RSSI = received signal strength in dBm
In free space n = 2, but it will vary based on local geometry - for example, a wall will attenuate the signal by roughly 3 dB and will affect n accordingly.
If you want the highest possible accuracy, it may be worthwhile to experimentally determine these values for your particular system.
Reference: see the paper Evaluation of the Reliability of RSSI for Indoor Localization by Qian Dong and Waltenegus Dargie for a more detailed explanation of the derivation and calibration.
double getDistance(int rssi, int txPower) {
/*
* RSSI = TxPower - 10 * n * lg(d)
* n = 2 (in free space)
*
* d = 10 ^ ((TxPower - RSSI) / (10 * n))
*/
return Math.pow(10d, ((double) txPower - rssi) / (10 * 2));
}
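As a worked example: with txPower = -59 dBm (a typical calibrated RSSI at 1 m) and a measured RSSI of -69 dBm, getDistance(-69, -59) returns 10^((-59 - (-69)) / 20) = 10^0.5 ≈ 3.16 m.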
It is unclear whether your inability to read the "txPower" or "measuredPower" calibration constant is due to the AdRecord class or due to the information being missing from the advertisements you are trying to parse. It doesn't look to me like that class will parse a standard iBeacon advertisement. Either way, there is a solution:
SOLUTION 1: If your beacons send a standard iBeacon advertisement that includes the calibration constant, you can parse it out using code in the open source Android iBeacon Library's IBeacon class here.
SOLUTION 2: If your beacons DO NOT send a standard iBeacon advertisement or do not include a calibration constant:
You must hard-code a calibration constant in your app for each device type you might use. All you really need from the advertisement to estimate distance is the RSSI measurement. The whole point of embedding a calibration constant in the transmission is to allow a wide variety of beacons with quite different transmitter output power to work with the same distance-estimating algorithm.
The calibration constant, as defined by Apple, basically says what the RSSI should be if your device is exactly one meter away from the beacon. If the signal is stronger (less negative RSSI), then the device is less than one meter away. If the signal is weaker (more negative RSSI), then the device is over one meter away. You can use a formula to make a numerical estimate of distance. See here.
If you aren't dealing with advertisements that contain a "txPower" or "measuredPower" calibration constant, then you can hard-code a lookup table in your app that stores the known calibration constants for various transmitters. You will first need to measure the average RSSI of each transmitter at one meter away. You'll then need some kind of key to look up these calibration constants in the table. (Perhaps you can use some part of the string from the AD structure, or the MAC address?) So your table might look like this:
HashMap<String, Integer> txPowerLookupTable = new HashMap<String, Integer>();
txPowerLookupTable.put("a5:09:37:78:c3:22", -65);
txPowerLookupTable.put("d2:32:33:5c:87:09", -78);
Then after parsing an advertisement, you can look up the calibration constant in your onLeScan method like this:
String macAddress = device.getAddress();
Integer txPower = txPowerLookupTable.get(macAddress);
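This can then be fed straight into a distance estimate such as the getDistance() method from the earlier answer (a sketch; the -65 fallback is an arbitrary assumed default for unknown beacons):

double distanceMeters = getDistance(rssi, txPower != null ? txPower : -65);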
Use the getAccuracy() method in the library; it gives you the estimated distance to the beacon.
Rather than an answer, I'm looking for an idea here.
I'd like to measure the scheduling latency of sensor sampling in Android. In particular I want to measure the time from the sensor interrupt request to when the bottom half, which is in charge of the data read, is executed.
The bottom half already contains, besides the data read, a timestamping instruction. Indeed, samples are collected by applications (Java or native, it makes no difference) as a tuple [measurement, timestamp].
The timestamp follows the clock source clock_gettime(CLOCK_MONOTONIC, &t);
So, assuming that the bottom half is not preempted, this timestamp gives an indication of the task scheduling instant. What is missing is a direct or indirect way to find out the corresponding IRQ instant.
Safely assume that we can request any sampling rate from the sensor. The driver skeleton is the following (Galaxy S3 gyroscope):
err = request_threaded_irq(data->client->irq, NULL,
                           lsm330dlc_gyro_interrupt_thread,
                           IRQF_TRIGGER_RISING | IRQF_ONESHOT,
                           "lsm330dlc_gyro", data);

static irqreturn_t lsm330dlc_gyro_interrupt_thread(int irq,
                                                   void *lsm330dlc_gyro_data_p)
{
    struct lsm330dlc_gyro_data *data = lsm330dlc_gyro_data_p;
    ...
    res = lsm330dlc_gyro_read_values(data->client,
                                     &data->xyz_data, data->entries);
    ...
    input_report_rel(data->input_dev, REL_RX, gyro_adjusted[0]);
    input_report_rel(data->input_dev, REL_RY, gyro_adjusted[1]);
    input_report_rel(data->input_dev, REL_RZ, gyro_adjusted[2]);
    input_sync(data->input_dev);
    ...
}
The key constraint is that I need to (well, I only have enough resources to) perform this measurement from user space, on a commercial device, without touching and recompiling the kernel, and hopefully with a limited impact on the measurement accuracy. I don't know if such an experiment is even possible under this constraint, and so far I couldn't figure out any reasonable method.
I might also consider recompiling the kernel if that makes the experiment straightforward.
Thanks.
First, it's not possible to perform this measurement without touching the kernel.
Second, I didn't see any bottom half configured in your ISR code.
Third, if a bottom half is scheduled and the kernel can be recompiled, you can sample the jiffies value in the ISR and sample it again in the bottom half; take the difference between the two samples and subtract that offset from the timestamp that is exported to user space.
I am currently working on an Android application where I have to log all the sensor values. I get the sensor event timestamp from event.timestamp and convert this value into a Unix timestamp.
long currTimeRelativeToBootMs = SystemClock.uptimeMillis();
long currTimeAbsoluteMs = System.currentTimeMillis();
mStartTimeAbsoluteS = ((double)(currTimeAbsoluteMs - currTimeRelativeToBootMs))/(double)1000.0;
...
//timestampRelativeInNs = event.timestamp
double temp = mStartTimeAbsoluteS+((double)timestampRelativeInNs)/1000000000.0;
My application works fine on my HTC phone (Android 2.x) but it did not work on the new Google Nexus 7.
I compared the event.timestamp values from the different devices: I started the devices at approximately the same time but got rather different values. The one from the Nexus 7 is longer by 4 digits:
SensorEvent-Timestamp(HTC): 175120992123000
SensorEvent-Timestamp(Nex): 1355418999245703000
What could be the reason for this issue? How can I fix it?
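For reference (not from the original thread): the Nexus 7 value, roughly 1.36e18, looks like Unix epoch nanoseconds (it corresponds to December 2012), while the HTC value looks like nanoseconds since boot; the timebase of event.timestamp was not standardized across devices in those Android versions. A minimal sketch of a runtime heuristic, assuming the timestamp uses one of those two timebases and the event is recent:

// Pick the reference clock that event.timestamp is closest to.
long tsNs = event.timestamp;
long epochNowNs = System.currentTimeMillis() * 1000000L;
long bootNowNs = SystemClock.elapsedRealtimeNanos(); // API 17+

long unixTimestampMs;
if (Math.abs(epochNowNs - tsNs) < Math.abs(bootNowNs - tsNs)) {
    // already Unix epoch nanoseconds (as on this Nexus 7)
    unixTimestampMs = tsNs / 1000000L;
} else {
    // nanoseconds since boot: shift by the current boot-to-epoch offset
    unixTimestampMs = System.currentTimeMillis() - (bootNowNs - tsNs) / 1000000L;
}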