Is the SmartWatch 2's gyroscope not as responsive?

I have developed a few apps for the SmartWatch 1 that take advantage of the watch's gyroscope. I finally got a SmartWatch 2 to develop on, but I notice that the gyroscope is way less responsive. For instance on the SmartWatch 1 it seems that every movement no matter how slight is recorded. However on the SmartWatch 2 the readings seem to run on a 100 millisecond timer. Here is how I interact with the sensor:
private final AccessorySensorEventListener mListener = new AccessorySensorEventListener() {
    @Override
    public void onSensorEvent(AccessorySensorEvent mySensorEvent) {
        sensorEvent = mySensorEvent;
        float[] values = sensorEvent.getSensorValues();
        currentX = values[0];
        currentY = values[1];
        currentZ = values[2];
    }
};
I have also tried different variations for registering my sensor:
mSensor.registerInterruptListener(mListener);
-- and --
mSensor.registerListener(mListener, Sensor.SensorRates.SENSOR_DELAY_FASTEST, Sensor.SensorInterruptMode.SENSOR_INTERRUPT_DISABLED);
-- and --
mSensor.registerFixedRateListener(mListener, Sensor.SensorRates.SENSOR_DELAY_FASTEST);
All of these seem to give the same exact effect. Am I doing this wrong for the SmartWatch 2, or is the gyroscope in the SmartWatch 2 really just less responsive?

There isn't a gyroscope in the SmartWatch 2, but there is an accelerometer. The accelerometer is limited to a 10Hz sampling rate. This is why you are only seeing 10 samples per second. I don't know of any way to increase the rate any higher.
There is additional info in this post: Sony Smartwatch SW2 - accelerometer output rate
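If you want to verify the effective sampling rate on your own hardware, here is a minimal sketch (reusing only the AccessorySensorEventListener interface from the question; the log tag is arbitrary) that counts events per second:
private long mWindowStart = 0; // start of the current one-second window
private int mEventCount = 0;   // events seen in the current window

private final AccessorySensorEventListener mRateProbe = new AccessorySensorEventListener() {
    @Override
    public void onSensorEvent(AccessorySensorEvent event) {
        long now = System.currentTimeMillis();
        if (mWindowStart == 0) mWindowStart = now;
        mEventCount++;
        if (now - mWindowStart >= 1000) {
            // On a SmartWatch 2 this should report roughly 10 events per second.
            Log.d("SensorRateProbe", "events per second: " + mEventCount);
            mEventCount = 0;
            mWindowStart = now;
        }
    }
};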

Related

onSensorChanged in android continuously getting triggered in emulator

I am using the ACCELEROMETER sensor and have registered a listener for it via:
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mAcceleratorSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
mSensorManager.registerListener(this, mAcceleratorSensor , SensorManager.SENSOR_DELAY_NORMAL);
This is what my onSensorChanged looks like:
@Override
public final void onSensorChanged(SensorEvent event) {
    int sensorType = event.sensor.getType();
    switch (sensorType) {
        case Sensor.TYPE_ACCELEROMETER:
            float valueX = event.values[0];
            float valueY = event.values[1];
            float valueZ = event.values[2];
            Log.d(TAG, "Sensor Changed value:" + valueX + ":" + valueY + ":" + valueZ);
            break;
    }
}
However, this is what I see in my logs:
01-12 02:01:18.063 19691-19691/com.taxis.locationupdates2 D/LocationActivity: Sensor Changed value:2.0:2.0:2.0
01-12 02:01:18.129 19691-19691/com.taxis.locationupdates2 D/LocationActivity: Sensor Changed value:2.0:2.0:2.0
It keeps calling onSensorChanged in an infinite loop even though there is no change in the values. I haven't tested it on a real device yet. Are there any settings to control this?
That method is very poorly named.
From the Android docs for onSensorChanged:
Called when there is a new sensor event. Note that "on changed" is
somewhat of a misnomer, as this will also be called if we have a new
reading from a sensor with the exact same sensor values (but a newer
timestamp).
The sensor documentation has more details:
http://developer.android.com/guide/topics/sensors/sensors_overview.html
The default data delay is suitable for monitoring typical screen orientation changes and uses a delay of 200,000 microseconds. You can specify other data delays, such as SENSOR_DELAY_GAME (20,000 microsecond delay), SENSOR_DELAY_UI (60,000 microsecond delay), or SENSOR_DELAY_FASTEST (0 microsecond delay). As of Android 3.0 (API Level 11) you can also specify the delay as an absolute value (in microseconds).
The delay that you specify is only a suggested delay. The Android system and other applications can alter this delay. As a best practice, you should specify the largest delay that you can because the system typically uses a smaller delay than the one you specify (that is, you should choose the slowest sampling rate that still meets the needs of your application). Using a larger delay imposes a lower load on the processor and therefore uses less power.
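For example, to ask for a slower rate than the default using an absolute delay in microseconds (API 11+; still only a hint to the system), registration might look like this:
// Request one event roughly every 500,000 us (2 Hz); per the best practice
// above, pick the largest delay your app can tolerate. The system may still
// deliver events at a different rate.
mSensorManager.registerListener(this, mAcceleratorSensor, 500000);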

ConsumerIrManager.transmit broken in Lollipop?

I upgraded my Samsung Galaxy S4 from latest KitKat to Lollipop (5.0.1) yesterday and my IR remote control app that I have used for months stopped working.
Since I was using a late copy of KitKat, the ConsumerIrManager.transmit() function was sending the number of pulses computed by the code below. It worked very nicely.
private void irSend(int freqHz, int[] pulseTrainInMicroS) {
    int[] pulseCounts = new int[pulseTrainInMicroS.length];
    for (int i = 0; i < pulseTrainInMicroS.length; i++) {
        // Convert each duration (us) to a carrier pulse count; widen to long
        // first so the multiplication cannot overflow int for long durations.
        long iValue = (long) pulseTrainInMicroS[i] * freqHz / 1000000;
        pulseCounts[i] = (int) iValue;
    }
    m_IRService.transmit(freqHz, pulseCounts);
}
When it stopped working yesterday, I began looking closely at it.
I noticed that the transmitted waveform bears no relationship to the requested pulse train. Even the code below doesn't work correctly:
private void TestSend() {
    int[] pulseCounts = {100, 100, 100};
    m_IRService.transmit(38000, pulseCounts);
}
The resulting waveforms had many problems and so were entirely useless:
- the waveforms were entirely wrong
- the frequency was wrong and the pulse spacing was not regular
- they were not repeatable
Looking at the demodulated waveform:
If my 100, 100, 100 were correctly rendered, I should have seen two pulses 2.6 ms long (before 4.4.3(?), 100 us). Instead I received (see attached) "[demodulated] not repeatable 1.BMP" and "[demodulated] not repeatable 2.BMP". Note that the waveform isn't 2 pulses... in fact, it's not even repeatable.
As for the captures below, the signal goes low when the IR is detected.
We should have seen two pulses going low for 2.6 ms, with 2.6 ms between them (see green line below).
I also tried shorter pulses using 50, 50, 50 and observed that the first pulse isn't correct either (see below).
Looking at the modulated waveform:
The frequency was not correct; instead, it was about 18 kHz and irregular.
I'm quite experienced with this and have formal education in electronics.
It seems to me there's a bug in ConsumerIrManager.transmit()...
Curiously, the "WatchOn" application that comes with the phone still works.
Thank you for any insights you can give.
Test equipment:
Tektronix TDS-2014B, 100 MHz, used in peak-detect mode.
As @IvanTellez says, a change was made in Android with respect to this functionality. Strangely, when I had it outputting simple IR signals (for troubleshooting purposes), the function behaved as shown above (erratically, wrong carrier frequency, etc.). When I eventually returned to normal types of IR signals, it worked correctly.
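For anyone hitting the same thing: from Android 4.4.3 onward, transmit() interprets the pattern as durations in microseconds rather than carrier pulse counts, so the conversion loop from the question can be dropped. A minimal sketch, assuming you only target 4.4.3 and later:
private void irSendMicroseconds(int freqHz, int[] pulseTrainInMicroS) {
    // On Android 4.4.3+ the pattern is already expected in microseconds,
    // so it can be passed through unchanged.
    m_IRService.transmit(freqHz, pulseTrainInMicroS);
}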

Estimating beacon proximity/distance based on RSSI - Bluetooth LE

I've got a simple iOS app which displays the proximity of the Bluetooth LE beacons it detects using such expressions as "immediate", "near" etc. and I need to write something similar on Android.
I've followed the tutorial on the Android developer site and I'm able to list detected devices; now I want to estimate the distance/proximity, and this is where it becomes a problem. According to this SO thread it's just a handful of mathematical calculations. However, they require me to provide a txPower value.
According to this tutorial by Dave Smith (and cross-referencing with this Bluetooth SIG statement), it should be broadcast by the beacon devices as an "AD structure" of type 0x0A. So what I do is parse the AD structures and look for the payload of the one that matches the type.
Problem: I've got 4 beacons - 2 estimotes and 2 appflares. The estimotes don't broadcast the txPower at all and the appflares broadcast theirs as 0.
Is there anything I'm missing here? The iOS app seems to handle it all without any problem, but since the iOS SDK does it behind the scenes, I'm not sure how to reproduce the same or similar behaviour. Is there any other way I could solve my problem?
In case you'd like to take a look at the code I'm using to parse the AD structures, it's taken from the aforementioned Dave Smith's GitHub and can be found here. The only change I made to that class was to add the following method:
public byte[] getData() {
    return mData;
}
And this is how I handle the callback from the scans:
// Prepare the callback for BLE device scan
this.leScanCallback = new BluetoothAdapter.LeScanCallback() {
    @Override
    public void onLeScan(final BluetoothDevice device, int rssi, byte[] scanRecord) {
        if (!deviceList.contains(device)) {
            MyService.this.deviceList.add(device);
            Log.e("Test", "Device: " + device.getName());
            List<AdRecord> adRecords = AdRecord.parseScanRecord(scanRecord);
            for (AdRecord adRecord : adRecords) {
                if (adRecord.getType() == AdRecord.TYPE_TRANSMITPOWER) {
                    Log.e("Test", "size of payload: " + adRecord.getData().length);
                    Log.e("Test", "payload: " + Byte.toString(adRecord.getData()[0]));
                }
            }
        }
    }
};
And what I see in the console is:
04-01 11:33:35.864: E/Test(15061): Device: estimote
04-01 11:33:36.304: E/Test(15061): Device: estimote
04-01 11:33:36.475: E/Test(15061): Device: n86
04-01 11:33:36.475: E/Test(15061): size of payload: 1
04-01 11:33:36.475: E/Test(15061): payload: 0
04-01 11:33:36.525: E/Test(15061): Device: f79
04-01 11:33:36.525: E/Test(15061): size of payload: 1
04-01 11:33:36.525: E/Test(15061): payload: 0
The txPower mentioned by @davidgyoung is given by the formula:
RSSI = -10 * n * log10(d) + A
where
d = distance
A = txPower (the RSSI measured at 1 m)
n = signal propagation constant
RSSI = measured signal strength in dBm
In free space n = 2, but it will vary based on local geometry – for example, a wall will reduce RSSI by ~3dBm and will affect n accordingly.
If you want the highest possible accuracy, it may be worthwhile to experimentally determine these values for your particular system.
Reference: see the paper Evaluation of the Reliability of RSSI for Indoor Localization by Qian Dong and Waltenegus Dargie for a more detailed explanation of the derivation and calibration.
double getDistance(int rssi, int txPower) {
    /*
     * RSSI = TxPower - 10 * n * lg(d)
     * n = 2 (in free space)
     *
     * d = 10 ^ ((TxPower - RSSI) / (10 * n))
     */
    return Math.pow(10d, ((double) txPower - rssi) / (10 * 2));
}
It is unclear whether your inability to read the "txPower" or "measuredPower" calibration constant is due to the AdRecord class or due to the information being missing from the advertisements you are trying to parse. It doesn't look to me like that class will parse a standard iBeacon advertisement. Either way, there is a solution:
SOLUTION 1: If your beacons send a standard iBeacon advertisement that includes the calibration constant, you can parse it out using code in the open source Android iBeacon Library's IBeacon class here.
SOLUTION 2: If your beacons DO NOT send a standard iBeacon advertisement or do not include a calibration constant:
You must hard-code a calibration constant in your app for each device type you might use. All you really need from the advertisement to estimate distance is the RSSI measurement. The whole point of embedding a calibration constant in the transmission is to allow a wide variety of beacons with quite different transmitter output power to work with the same distance-estimating algorithm.
The calibration constant, as defined by Apple, basically says what the RSSI should be if your device is exactly one meter away from the beacon. If the signal is stronger (less negative RSSI), then the device is less than one meter away. If the signal is weaker (more negative RSSI), then the device is over one meter away. You can use a formula to make a numerical estimate of distance. See here.
If you aren't dealing with advertisements that contain a "txPower" or "measuredPower" calibration constant, then you can hard-code a lookup table in your app that stores the known calibration constants for various transmitters. You will first need to measure the average RSSI of each transmitter at one meter away. You'll then need some kind of key to look up these calibration constants in the table. (Perhaps you can use some part of the string from the AD structure, or the MAC address?) So your table might look like this:
HashMap<String, Integer> txPowerLookupTable = new HashMap<String, Integer>();
txPowerLookupTable.put("a5:09:37:78:c3:22", -65);
txPowerLookupTable.put("d2:32:33:5c:87:09", -78);
Then after parsing an advertisement, you can look up the calibration constant in your onLeScan method like this:
String macAddress = device.getAddress();
Integer txPower = txPowerLookupTable.get(macAddress);
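You can then feed that constant into the getDistance() formula from the answer above; a sketch, reusing the rssi parameter from onLeScan:
if (txPower != null) {
    double distanceMeters = getDistance(rssi, txPower);
    Log.d("Beacon", "Estimated distance: " + distanceMeters + " m");
}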
Use the getAccuracy() method in the library; it gives you the estimated distance to the beacon.

How to convert WiFi level (e.g. -45, -88) into a percentage?

How can I convert a WiFi level (e.g. -45, -88) into a percentage?
I want to convert the WiFi level into a percentage. I get the WiFi level using level (in dBm format).
I have Googled a lot but have not found a proper answer.
The problem with this is that it is very dependent on the receiving antenna. Some antennas register no usable signal at -90 dBm, some already at -80. You will have a hard time finding 0% (100% strictly being 0 dBm).
I have created a WiFi scanner application where I use -100 dBm as 0% and 0 dBm as 100%; in Java it turns into something like this (MIN_DBM being -100):
public int getPowerPercentage(int power) {
    int i;
    if (power <= MIN_DBM) {
        i = 0;
    } else if (power >= 0) {
        i = 100; // clamp: anything at or above 0 dBm counts as full strength
    } else {
        i = 100 + power; // linear in between, e.g. -60 dBm -> 40 %
    }
    return i;
}
This is what Microsoft does for dBm <> percent conversion:
https://stackoverflow.com/a/15798024/2096041
Basically, -100 .. -50 dBm maps linearly to 0 .. 100 %.
Like MS, I would prefer to stay on the safe side and not require 0 dBm for 100%, as some answers here suggest.
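A minimal sketch of that Microsoft-style mapping (assuming the -100 and -50 dBm clamp points described above):
public static int dbmToPercent(int dbm) {
    if (dbm <= -100) return 0;   // at or below -100 dBm: no usable signal
    if (dbm >= -50) return 100;  // at or above -50 dBm: full strength
    return 2 * (dbm + 100);      // linear in between
}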
The WifiManager class has a function calculateSignalLevel, but as it states here, it results in an error if numLevels is greater than 45. Possible workaround could be something like this:
double percentage = WifiManager.calculateSignalLevel(rssi, 40) * 2.5;
but of course this will be in steps of 2.5 percent; I don't know your use case, but maybe this is sufficient.
As others have stated, calculating percentages is problematic, and there's no simple precise solution for that.
You could derive the percentage from the signal-to-noise ratio, rather than the signal intensity alone, if this information is available. This is probably the desired metric.
An android.net.wifi.ScanResult does not publish the necessary information (as of Dec 2012), but you might be able to get this information through other means.
Signal = Noise => unusable signal, so you could set 0 dB SnR = 0%. You could also set 10 dB SnR to 90% (90% of the signal power is not drowned out in noise), and 100% = no noise at all. More generally:
p = 100% * (1 - 10^(-SnR / 10 dB))
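In Java, the same formula might look like this (a sketch; snrDb is assumed to be a non-negative signal-to-noise ratio in dB):
// 0 dB SnR -> 0 %, 10 dB SnR -> 90 %, approaching 100 % as noise vanishes.
public static double snrToPercent(double snrDb) {
    return 100.0 * (1.0 - Math.pow(10, -snrDb / 10.0));
}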

How to get x, y, z values from the Android accelerometer sensor at a regular frequency, for instance every 20 ms, 40 ms or 60 ms

I'm working on an Android project and have run into the following situation:
We need the accelerometer values at a regular frequency, such as every 20 ms, 40 ms or 60 ms.
We are using SENSOR_DELAY_GAME right now, but we found that different devices use different intervals for this parameter. For instance, the G2 uses 40 ms, the G7 uses 60 ms and the Nexus S uses 20 ms.
I tried setting a timer and using Thread.sleep(), but because of Java's garbage collection pauses they cannot make the system deliver values at a regular frequency.
This is very annoying, so if anyone knows of a proper method in the Android SDK that lets me get the accelerometer values at a regular frequency, that would be very helpful!
Thanks a lot!
I've done this by simply throwing away values that arrive sooner than I want them. It's not ideal from a battery consumption standpoint, as I make the sensor feed me more often than I need, but at least I can then control that the values come in at a regular interval.
Something like:
static final int ACCEL_SENSOR_DELAY = 100; // the number of millisecs to wait before accepting another reading from the accelerometer sensor
long lastAccelSensorChange = 0;            // the last time an accelerometer reading was processed

@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    if (sensorEvent.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) return;
    long now = System.currentTimeMillis();
    if (now - ACCEL_SENSOR_DELAY > lastAccelSensorChange) {
        lastAccelSensorChange = now;
        mCompassValues = sensorEvent.values.clone();
        // ... do your stuff
    }
}
I have built some code that allows you to get the exact frequency on any device.
You can download the project here and get some explanations.
In the code you can try the different rates. For example, the normal mode on my Galaxy S2 is 5 Hz.
Use registerListener by setting the sampling period as below:
boolean registerListener (SensorEventListener listener, Sensor sensor, int samplingPeriodUs)
Source
Word of caution: The samplingPeriodUs argument is only a hint to the system. Test it before using this.
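For example (a sketch; sensorManager, accelerometer and listener are assumed to be set up already):
// 20,000 us = 20 ms (50 Hz). Because samplingPeriodUs is only a hint,
// check the timestamps of incoming events to verify the actual rate.
sensorManager.registerListener(listener, accelerometer, 20000);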
