I managed to get the heart rate sensor working, with some exceptions.
I start the measurement on click, and as soon as there is at least one reading with accuracy >= LOW I stop the measurement (by unregistering the listener).
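A minimal sketch of that sequence (simplified from my code; it assumes the standard SensorManager APIs and the BODY_SENSORS permission):

// Sketch of the start/stop sequence described above.
final SensorManager sensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor heartRateSensor = sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE);

SensorEventListener listener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // Stop as soon as one reading reaches at least LOW accuracy.
        if (event.accuracy >= SensorManager.SENSOR_STATUS_ACCURACY_LOW) {
            float bpm = event.values[0];
            sensorManager.unregisterListener(this);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};

// Started on click:
sensorManager.registerListener(listener, heartRateSensor,
        SensorManager.SENSOR_DELAY_NORMAL);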
This sequence works once or twice, and then it simply stops returning any value with accuracy >= LOW. I wait a few minutes before giving up and closing the application. After reopening the app it usually works for the first one or two measurements and then stops with the same issue (though other times it takes many re-launches of the app before it starts giving results at all).
Another issue: if I start the measurement when the watch is not on my wrist/skin but, let's say, held in the air, then even if I put it back on my wrist and wait a few minutes it never starts giving any valid/accurate value.
Has anybody observed similar behaviour? Is there any specific sequence that needs to be followed for the sensor to provide more reliable data? Could some calibration be done to make the sensor more sensitive to my skin type, or something similar?
I'm currently developing an app that uses the Activity Recognition API.
I have a main activity and an IntentService that is invoked via requestActivityUpdates() and returns data for the detected activity using the broadcast API.
According to the documentation, requestActivityUpdates' first parameter, detectionIntervalMillis, stands for "the desired time between activity detections. Larger values will result in fewer activity detections while improving battery life. A value of 0 will result in activity detections at the fastest possible rate.".
I'm using values as low as 100 ms and I still get updates only once every 3 minutes on average. What could be the problem here? I have already tested on 2 different fully charged devices, so I believe this has nothing to do with power-saving configurations.
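For context, the registration looks roughly like this (simplified; MyDetectionService stands in for my actual IntentService):

// Registration via the Google Play services ActivityRecognitionClient.
// MyDetectionService is a placeholder name for my IntentService.
Intent intent = new Intent(this, MyDetectionService.class);
PendingIntent pendingIntent = PendingIntent.getService(
        this, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);

ActivityRecognitionClient client = ActivityRecognition.getClient(this);
client.requestActivityUpdates(100, pendingIntent); // detectionIntervalMillis = 100 ms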
edit: I took the devices for a walk and the update interval drops to ~8 s. Still far more than expected. :/
I'm playing around with the AltBeacon library and these parameters. My goal is to get the didRangeBeaconsInRegion() callback as fast as possible.
I understand that ranging uses a running average to calculate the distance and make the callback. I'm not interested in the distance, only the RSSI. That said, if the RSSI is off by 1, that's OK.
My current code uses:

RangedBeacon.setSampleExpirationMilliseconds(1000); // keep only 1 s of RSSI samples
try {
    mBeaconManager.setForegroundScanPeriod(700l);       // scan for 700 ms at a time...
    mBeaconManager.setForegroundBetweenScanPeriod(0l);  // ...with no pause between scans
    mBeaconManager.updateScanPeriods();
} catch (RemoteException ex) {
    ...
}
My app is in foreground all the time. Running on Nexus 5X.
I notice that the smaller the value for setSampleExpirationMilliseconds(), the more frequently I get the didRangeBeaconsInRegion() callback, which is good. setForegroundBetweenScanPeriod is set to 0, which means the service scans continuously.
In my venue, I have about 30 beacons deployed. With the above setup, I get callbacks every second, each time with a different set of beacons.
The problem is that even when I stand right next to a beacon, that beacon is not heard every second or less. When I get the callback, it's usually for other, far-away beacons. Sometimes it takes a good 30 seconds before I hear the beacon I'm standing next to again.
I know that the beacons we set up chirp every 20 ms, so during that 700 ms scan I should see them.
I notice that if I raise setForegroundScanPeriod to 5000 (hoping a longer scan period would catch the nearby beacons), I get fewer callbacks, roughly 10 seconds apart. So a smaller value means faster callbacks.
My questions:
Why don't I get all the beacons in the callback (they all chirp every 20 ms)? How is the callback triggered -- when it has enough info, or on some kind of interval? What controls it?
Is there any relationship between setSampleExpirationMilliseconds, setForegroundScanPeriod, and setForegroundBetweenScanPeriod? How can I make them work well together?
My app requires hearing a nearby beacon (3 ft or less) in under a second; what is the best way to set up the parameters to achieve this?
Thanks for reading such a long question. Appreciate any insights.
@davidgyoung maybe you could shed some light?
The Android Beacon Library isn't designed to give you a callback for every beacon packet detected, but rather to give you regular callbacks at some configured interval to let you know beacons are still around. By default, this interval is 1100 ms in the foreground, which is configured by
setForegroundScanPeriod(1100l);
setForegroundBetweenScanPeriod(0l);
As soon as the scan period ends, the beacons detected during that scan period are returned in a list via the didRangeBeaconsInRegion callback.
You can get faster callbacks by setting a shorter scan period; to get callbacks every 500 ms, use setForegroundScanPeriod(500l). The disadvantage is that the library stops and restarts BLE scans at the end of each scan period. (Stopping and restarting is necessary for some Android phone models that can only detect one packet per unique Bluetooth MAC address in a single scan cycle.) But whenever you stop and restart scanning, you will miss any packets being transmitted at that exact moment -- it's akin to shutting off the Bluetooth radio in the middle of a packet. The shorter the scan period, the higher the percentage of missed packet detections.
This may be OK for your use case, provided the beacon is transmitting every 20 ms -- with a 500 ms scan period, you have plenty of samples to ensure a detection.
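Applied to your snippet, that would look like this (the same calls you are already making, just with a smaller period):

// Shorter scan cycles for faster ranging callbacks; wrap in try/catch for
// RemoteException as in your existing code.
mBeaconManager.setForegroundScanPeriod(500l);       // 500 ms scan cycles
mBeaconManager.setForegroundBetweenScanPeriod(0l);  // no gap between cycles
mBeaconManager.updateScanPeriods();                 // apply the change if already scanning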
The setSampleExpirationMilliseconds parameter is largely unrelated to what you are trying to do. It is used for distance estimates with the default RunningAverageRssiFilter, and it decides how long RSSI measurements are averaged for distance-estimating purposes. By default, the library retains 20 seconds' worth of RSSI samples, which affects the results of the getDistance() method on Beacon.
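Since you only care about RSSI rather than distance, you can read it directly in the callback and skip the distance machinery entirely. A sketch using the RangeNotifier API:

// Sketch: log the raw RSSI of each beacon seen in the last scan cycle,
// without calling getDistance() at all.
mBeaconManager.setRangeNotifier(new RangeNotifier() {
    @Override
    public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
        for (Beacon beacon : beacons) {
            Log.d("Ranging", beacon.getBluetoothAddress()
                    + " rssi=" + beacon.getRssi());
        }
    }
});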
I have an application which calls sensorManager.registerListener() for the accelerometer, gyroscope, and magnetometer; all three are registered on the same handler at SENSOR_DELAY_NORMAL. This works fine with no issues the vast majority of the time.
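Roughly, the registration is (simplified; 'listener' and 'mHandler' are the shared listener and handler):

// The three registrations described above, all at SENSOR_DELAY_NORMAL.
sensorManager.registerListener(listener,
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_NORMAL, mHandler);
sensorManager.registerListener(listener,
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
        SensorManager.SENSOR_DELAY_NORMAL, mHandler);
sensorManager.registerListener(listener,
        sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_NORMAL, mHandler);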
However, when looking at some logs I noticed the accelerometer would seemingly at random change its reporting interval from roughly 179 ms (which seems to be the average for SENSOR_DELAY_NORMAL on my phone) to about 20 ms.
After a fair amount of digging and testing, I found the cause to be the phone being shaken rapidly. When this happens, all the other sensors maintain their ~179 ms rate, but the accelerometer alone increases its rate to ~20 ms. After some period of time the rate eventually drops from ~20 ms back to the set rate of ~179 ms.
I'm not sure how long it takes to return to the ~179 ms rate; I've tried uninstalling and reinstalling the application, and if enough time has not passed the accelerometer will still be firing events at ~20 ms.
I wanted to see if I could resolve the issue by un-registering and re-registering the listener at the correct rate when this happens, but the accelerometer keeps firing at ~20 ms regardless of what I reset it to. I did find that I can unregister the accelerometer listener entirely, and that works, but it doesn't solve my problem.
Anybody know why the accelerometer listener would change its rate at which it's firing, and how I might be able to resolve this?
The delay that you request from Android is only a suggested delay. The Android system and other applications can change it. Source
The reason this can happen is quite simple:
There are only a limited number of physical sensors on the device: one accelerometer, one magnetometer, one gyroscope.
Say your application registers for all events from the accelerometer every 100 ms.
Another application requests all events from the accelerometer every 10 ms.
Since there is only one sensor serving two different needs, Android drives the accelerometer at the lowest of all the delays requested across all apps, and then reports every event at that rate to everyone.
In this case, it is up to the application developer to disregard events that arrive in excess of what is required, as in the sketch below.
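A minimal sketch of that throttling (the 200 ms target interval is just an example):

// Sketch: drop sensor events that arrive faster than the rate we actually want,
// e.g. because another app has requested a faster rate from the same sensor.
private static final long DESIRED_INTERVAL_NS = 200_000_000L; // 200 ms in nanoseconds
private long lastEventTimestampNs = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.timestamp - lastEventTimestampNs < DESIRED_INTERVAL_NS) {
        return; // excess event; ignore it
    }
    lastEventTimestampNs = event.timestamp;
    // ...process event.values as usual...
}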
I've run into an issue trying to create an application that keeps track of proximity sensor values. The intent of the application is to record the instantaneous proximity value at regular intervals (say every 30 seconds).
Following the API documentation, the standard listener is created and attached for that sensor type. However, the value is ALWAYS reported as 5.0 (5 cm, the max value of the sensor), even when I cover the sensor with my hand.
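The setup follows the standard pattern from the docs (simplified):

// Sketch: standard proximity listener registration.
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor proximity = sm.getDefaultSensor(Sensor.TYPE_PROXIMITY);
sm.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float distanceCm = event.values[0]; // always 5.0 in my case
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, proximity, SensorManager.SENSOR_DELAY_NORMAL);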
However, if I start a DIFFERENT application (that also monitors Proximity sensor values) both applications start to correctly report proximity. When I close the other application (Android Sensor Box) my application reverts to just reporting 5.0 all the time.
I have debugged the application with a breakpoint in onSensorChanged to double-check the SensorEvent object being passed in, and its value is always 5.0.
EDIT #2: It doesn't appear to be limited to the proximity sensor. The gyroscope behaves in the same manner, while the accelerometer and magnetometer appear to show correct, up-to-date values. All four are referenced and accessed in the same fashion.
Any ideas?
Well, it turns out that I will once again be answering my own question.
The way the application was structured (and there was a good reason for this) meant that a listener was registered and the first value reported by the sensor(s) was taken as the reading, without continuously accepting new readouts. This works fine for accelerometer readings, but it turns out that some sensors report their DEFAULT value in this first event (such as FAR for the proximity sensor, or 0.0/0.0/0.0 for the gyroscope). Only after 2-3 onSensorChanged events do correct values start to be reported.
I am assuming this has something to do with power saving and certain sensors needing time to become 'ready' to report data. I have no idea why the accepted practice is to fire off a sensorChanged event with incorrect values, but that's what appears to be happening.
Nonetheless, the fix for me was to discard the first 2-3 readings (they come in at millisecond intervals anyway) and simply use the fourth one as the more reliable reading.
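A minimal sketch of that workaround (the threshold of 3 is just what worked for me; 'sensorManager' is assumed to be a field):

// Sketch: ignore the first few onSensorChanged events, which may carry
// the sensor's default values (e.g. FAR / 0.0) rather than real readings.
private int eventCount = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    eventCount++;
    if (eventCount <= 3) {
        return; // discard warm-up readings
    }
    float reading = event.values[0];         // fourth event onward is reliable
    sensorManager.unregisterListener(this);  // we only wanted one snapshot
}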
I'm writing an app that implements the SensorEventListener interface to listen for changes to the barometer, which I record in a logfile. Before I start logging, I prepend a system time in milliseconds (let's call this Millisecond Timestamp 1, or MT1), and after the logging is finished, I append another system timestamp in milliseconds (Millisecond Timestamp 2, or MT2).
The SensorEvent has its own timestamp (which I will call the Nanosecond Timestamp, or NT), which I also log, between MT1 and MT2.
The problem is this: if the phone goes to sleep during logging, SensorEvents no longer seem to arrive at the rate I set (for example, SENSOR_DELAY_FASTEST). Furthermore, even though the SensorEvent timestamp is supposed to represent nanoseconds of uptime since the phone was rebooted, there are "missing" nanoseconds: the gap between MT2 and MT1 is often twice or more the gap between NTN (where N is the number of samples) and NT1.
I've been able to sort of resolve this issue by using a PowerManager.WakeLock, but that turns my app into a huge power hog and seems like a really clumsy hack. Is there any other way to work around this problem?
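For reference, the wake-lock workaround looks roughly like this (a sketch; the tag string is arbitrary):

// Sketch: a partial wake lock keeps the CPU (and sensor event delivery) running
// while the screen is off. Requires the WAKE_LOCK permission in the manifest.
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
PowerManager.WakeLock wakeLock =
        pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "myapp:barometerLogging");
wakeLock.acquire();   // before registering the sensor listener
// ... logging happens here ...
wakeLock.release();   // as soon as logging finishes, to limit battery drain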
Sensors are not guaranteed to work if the device goes to sleep, or even if the screen merely turns off (before the CPU has necessarily powered down). The behavior is undocumented and definitely seems to vary by device.
Either settle for being "a huge power hog" or redesign your app to not require sensor readings except when the screen is on.
Sensors in Android are definitely designed to be used actively by foreground apps, not for long-term logging or monitoring purposes.