I am building an app that counts steps using the accelerometer, for Android devices (Android version < 4.4).
To do this I calculate the acceleration magnitude (G) from x, y and z. If the value of G is larger than STEP_THRESHOLD, then a step has occurred:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
            return;
        }
        final float x = event.values[0];
        final float y = event.values[1];
        final float z = event.values[2];
        // Squared magnitude of the acceleration, normalized to units of g
        final float g = Math.abs((x * x + y * y + z * z)
                / (SensorManager.GRAVITY_EARTH * SensorManager.GRAVITY_EARTH));
        if (g > StepUtil.GRAVITY_THRESHOLD) {
            /*
             * check step times and other checks
             */
            ...
            stepCounter++;
        }
    }
}
The problem is:
since accelerometer hardware varies from device to device, there is no single value of STEP_THRESHOLD that works everywhere.
It seems that the STEP_THRESHOLD value should be determined dynamically for each device!
Is there any criterion for adjusting the STEP_THRESHOLD value based on accelerometer accuracy?
Any help will be appreciated.
since accelerometer hardware varies in each device, there is not a certain value for STEP_THRESHOLD
That said, what your app is apparently missing is simple calibration, which would allow your users to set the threshold level on their own device. Tell them to take a couple of steps, record the sensor values, and take the mean/median (whatever works better for you), and you have your STEP_THRESHOLD. You can even send this value back to your server to build a kind of database, and come up with a more or less universal starting value of STEP_AVERAGE.
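A minimal sketch of what such a calibration pass could look like. The class name, the peak values, and the 0.8 safety factor are all illustrative assumptions, not part of the original app:

```java
// Derive a per-device STEP_THRESHOLD from the peak acceleration magnitudes
// (in units of g) recorded while the user walked a few calibration steps.
public class StepCalibrator {

    public static float calibrateThreshold(float[] peakMagnitudes) {
        float sum = 0f;
        for (float peak : peakMagnitudes) {
            sum += peak;
        }
        float average = sum / peakMagnitudes.length;
        // Trigger slightly below the average peak so that typical steps
        // on this particular device are still detected.
        return average * 0.8f;
    }

    public static void main(String[] args) {
        // Peaks collected during a short calibration walk (made-up values).
        float[] peaks = {1.3f, 1.5f, 1.4f, 1.6f};
        System.out.println(StepCalibrator.calibrateThreshold(peaks));
    }
}
```

The 0.8 factor is a tuning knob; you would pick it (or a median instead of a mean) based on how noisy the calibration walk looks on real devices.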
When I tilt my phone rapidly to the left or right to move the player in my app, the player wiggles for a short time. How can I remove this effect and make the motion as smooth as it is, for example, in Doodle Jump? I'm adding the raw accelerometer sensor event values to the player coordinates. Is there an algorithm for this, or should I use the sensor differently?
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != SENSOR)
        return;
    if (this.isRunning()) {
        spieler.setMoveSpeed(-event.values[0]);
    }
}
You need to use a low-pass filter, which is designed for exactly the problem you are facing: it smooths out the accelerometer values.
Example code:
float rawX = event.values[0];
float rawY = event.values[1];
float rawZ = event.values[2];

// Apply low-pass filter (mGravity is a float[3] field holding the
// previous filtered values; ALPHA in (0, 1) controls the smoothing)
mGravity[0] = lowPass(rawX, mGravity[0]);
mGravity[1] = lowPass(rawY, mGravity[1]);
mGravity[2] = lowPass(rawZ, mGravity[2]);

private float lowPass(float current, float last) {
    return last + ALPHA * (current - last);
}
For more information on the low-pass filter, check out
http://developer.android.com/reference/android/hardware/SensorEvent.html
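To see why this kills the wiggling, here is a self-contained sketch of the same exponential low-pass filter run over a noisy input. The class name, the ALPHA value, and the sample values are illustrative assumptions:

```java
// Demonstrates how an exponential low-pass filter flattens a spike in a
// raw sensor stream: a single outlier of 10 is damped to 2.5.
public class LowPassDemo {
    // Smaller ALPHA = smoother output, but more lag behind the real motion.
    static final float ALPHA = 0.25f;

    static float lowPass(float current, float last) {
        return last + ALPHA * (current - last);
    }

    public static void main(String[] args) {
        float smoothed = 0f;
        // A noisy spike in the raw sensor stream...
        float[] raw = {0f, 0f, 10f, 0f, 0f};
        for (float r : raw) {
            smoothed = lowPass(r, smoothed);
            System.out.println(smoothed);
        }
        // ...is damped, and the filtered value decays back toward zero.
    }
}
```

Choosing ALPHA is a trade-off: in a game you want it large enough that the player still responds quickly to tilts, but small enough to suppress the hand tremor that causes the wiggle.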
In Android, I am using the accelerometer and magnetic field sensor to calculate spatial positioning, as shown in the code below. The getRotationMatrix method generates values in real-world units, with azimuth, pitch and roll. Azimuth and roll give values in the range 0 to 180 or 0 to -180. Pitch, however, gives values from 0 to 90 or 0 to -90. That's a problem for my app because I need to determine unique locations regardless of how the device is oriented. With roll, you can have two orientations with the same value.
I need to apply a matrix transformation that remaps the sensor values to values ranging from 0 to 360 degrees (actually, 360 itself wouldn't be valid since it's the same as 0; anything close to 360 would come out as a number like 359.99999...).
I am not a mathematician and don't know how to use matrices, let alone use them in Android, but I am aware that this is what is required to get the 0 to 360 degree conversion. It would be nice if the matrix also took care of azimuth and roll so that they too produce values from 0 to 360, but if that isn't possible, that's fine, since unique positions can still be derived from their sensor values. Any suggestions on how to create this matrix transformation?
@Override
public void onSensorChanged(SensorEvent event)
{
    try
    {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            accelerometerValues = event.values;
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            magneticFieldValues = event.values;

        if ((accelerometerValues != null) && (magneticFieldValues != null))
        {
            float[] R = new float[9];
            float[] I = new float[9];
            boolean success = SensorManager.getRotationMatrix(R, I,
                    accelerometerValues, magneticFieldValues);

            if (success)
            {
                float[] values = new float[3];
                SensorManager.getOrientation(R, values);

                // Convert from radians to degrees if preferred.
                values[0] = (float) Math.toDegrees(values[0]); // Azimuth
                values[1] = (float) Math.toDegrees(values[1]); // Pitch
                values[2] = (float) Math.toDegrees(values[2]); // Roll
            }
        }
    }
    catch (Exception ex)
    {
        // Ignored; events may arrive before both value arrays are populated.
    }
}
EDIT:
The raw event values for the pitch do not give unique values as you rotate the device through 360 degrees, so I highly doubt any matrix transformation is going to produce the results I am after. Maybe I am using the wrong sensors.
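For the narrower part of the requirement, remapping an angle reported in [-180, 180] onto [0, 360) needs no matrix at all, just an arithmetic wrap. A sketch (the class and method names are mine; note this does not resolve the pitch ambiguity described in the EDIT, which is inherent to how getOrientation reports pitch):

```java
// Wrap angles from the [-180, 180] range reported by getOrientation
// (after Math.toDegrees) onto the [0, 360) range.
public class AngleUtil {

    public static float wrapTo360(float degrees) {
        // The double modulo handles both negative and >360 inputs.
        return ((degrees % 360f) + 360f) % 360f;
    }

    public static void main(String[] args) {
        System.out.println(wrapTo360(-90f)); // a roll of -90 maps to 270
        System.out.println(wrapTo360(45f));  // positive angles are unchanged
    }
}
```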
I'm trying to build a simple Augmented Reality app, so I started working with sensor data.
According to this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), calculating the orientation using Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really work.
I'm not able to get "good" values. The azimuth values don't make any sense at all: if I just move the phone upwards, the value changes drastically, and even if I just rotate the phone, the values don't represent the phone's orientation.
Does anybody have an idea how to improve the quality of the values, starting from the given example?
In what kind of orientation do you use this sample app? From what is written in this code, the only orientation supported is portrait, or flat on the table (it depends on the device). What do you mean by "good"?
It is normal that the values are not "good" when rotating the device. The device coordinate system is assumed to be working in portrait, or flat: the Y axis vertical along the screen pointing up, the Z axis pointing out of the screen from its center, and the X axis perpendicular to the Y axis going right along the screen. Rotating the device does not rotate the device coordinate system with it, so you have to remap it.
But if you want the heading of the device in portrait orientation, here is a piece of code that works well for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a rotation matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You'll get the heading (or azimuth) in:
orientationVals[0]
Tíbó's answer is good, but if you log the roll value, you will get irregular numbers.
(Roll is important for AR browsers.)
This is due to:

SensorManager.remapCoordinateSystem(mRotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z,
        mRotationMatrix);

You have to use different matrices for the input and output of the remap. The following code works for me and gives a correct roll value:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a rotation matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        // Note: separate input and output matrices for the remap.
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll: "
                + orientationVals[2]);
    }
}
Probably late to the party. Anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat, event.values);
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees [-180, +180]
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
Hope it helps
Have you tried the combined (sensor-fusion) type Sensor.TYPE_ROTATION_VECTOR? It may give better results.
Go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for 'rotation_vector'.
Here's a Kotlin approach with all the necessary arrays included (for some reason the previous answers leave out the array sizes, which matter).
// This is determined from the deprecated Sensor.TYPE_ORIENTATION
var lastOrientation: FloatArray = FloatArray(3)
var lastHeading: Float = 0f
var currentHeading: Float = 0f

// This is from the non-deprecated Sensor.TYPE_ROTATION_VECTOR
var lastVectorOrientation: FloatArray = FloatArray(5)
var lastVectorHeading: Float = 0f
var currentVectorHeading: Float = 0f

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor?.type) {
        null -> return
        Sensor.TYPE_ORIENTATION -> {
            lastOrientation = event.values
            lastHeading = currentHeading
            currentHeading = abs(event.values[0].roundToInt().toFloat())
        }
        Sensor.TYPE_ROTATION_VECTOR -> {
            lastVectorOrientation = event.values
            lastVectorHeading = currentVectorHeading
            val tempRotationMatrix = FloatArray(9)
            val tempOrientationMatrix = FloatArray(3)
            getRotationMatrixFromVector(tempRotationMatrix, event.values)
            remapCoordinateSystem(tempRotationMatrix, AXIS_X, AXIS_Z, tempRotationMatrix)
            getOrientation(tempRotationMatrix, tempOrientationMatrix)
            currentVectorHeading = Math.toDegrees(tempOrientationMatrix[0].toDouble()).toFloat()
            if (currentVectorHeading < 0) {
                // Wrap negative headings into [0, 360): 360 + (negative heading)
                currentVectorHeading += 360f
            }
        }
        else -> return
    }
}
I've also included the deprecated Sensor.TYPE_ORIENTATION for anybody wanting to see the difference between the two approaches. There is a several degree difference when using the deprecated method vs the updated approach.
I am developing an application in which I need to retrieve the angle between the device and the vertical axis (the axis pointing toward the center of the Earth).
So far, none of the documentation and tutorials I have found were conclusive.
Could you please explain how I can do this, or provide a link to a clear tutorial that would help me solve this problem?
First, I created a SensorEventListener implementation:
private SensorEventListener sensorEventListener = new SensorEventListener() {
    /** The side that is currently up */
    //private Side currentSide = null;
    //private Side oldSide = null;
    private float azimuth;
    private float pitch;
    private float roll;

    public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    public void onSensorChanged(SensorEvent event) {
        azimuth = event.values[0]; // azimuth
        pitch = event.values[1];   // pitch
        roll = event.values[2];    // roll
        // code to deal with orientation changes;
        // pitch is the angle between the vertical axis and the device's
        // y axis (the one from the center of the device to its top)
    }
};
Then I register this listener with an orientation sensor:
SensorManager sensorManager =
        (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
Sensor sensor;
List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ORIENTATION);
if (sensors.size() > 0) {
    sensor = sensors.get(0);
    sensorManager.registerListener(sensorEventListener, sensor,
            SensorManager.SENSOR_DELAY_NORMAL);
} else {
    // notify the user that there's no orientation sensor
}