The result of the gyroscope sensor on Android - android

I just did a project reading data from the gyroscope sensor. If I lay my phone flat on the table, the gyroscope results are:
Roll (X): 5.326322E-7
Pitch (Y): 5.326322E-7
Yaw (Z): 5.326322E-7
Logically the results should be 0, because the phone is lying still on the table. Can anybody help me? My code is below. Thank you very much in advance for any response :).
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        float rollX = event.values[0];
        float pitchY = event.values[1];
        float yawZ = event.values[2];
        koordinatrollX.setText("Orientation X (Roll): " + Float.toString(rollX));
        koordinatpitchY.setText("Orientation Y (Pitch): " + Float.toString(pitchY));
        koordinatyawZ.setText("Orientation Z (Yaw): " + Float.toString(yawZ));
    }
}

E-7 means ×10^-7, so the value is very close to 0. You can't expect the hardware to be perfectly calibrated or the table to be perfectly flat. You could let the user recalibrate based on the surface the phone is resting on, but the result you are getting is already very close to zero.
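Also note that TYPE_GYROSCOPE reports angular velocity in rad/s, so a stationary phone should read roughly 0 plus noise and bias. If you do want a user-triggered recalibration, a minimal sketch could look like the following (my illustration, not the poster's code; the gyroBias field is made up, and averaging many samples would be more robust than the single sample shown):
private final float[] gyroBias = new float[3];
private boolean calibrated = false;

// Call this (e.g. from a "Calibrate" button) while the phone lies still.
private void calibrate(SensorEvent event) {
    System.arraycopy(event.values, 0, gyroBias, 0, 3);
    calibrated = true;
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        float rollX  = event.values[0] - (calibrated ? gyroBias[0] : 0f);
        float pitchY = event.values[1] - (calibrated ? gyroBias[1] : 0f);
        float yawZ   = event.values[2] - (calibrated ? gyroBias[2] : 0f);
        // rollX, pitchY, yawZ now read ~0.0 while the phone rests on the
        // surface used for calibration
    }
}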

Related

Can anyone tell me how I can get a toast when the mobile falls down?

I am making an Android project to detect when a mobile falls down. Can anyone tell me which sensor I should use in my app? I know the accelerometer is used for this kind of purpose, but the accelerometer also reacts when I shake the phone in my hand, and I want to show the toast only when the mobile actually falls down.
Here is my code:
int count = 1;
private boolean init;
private Sensor mySensor;
private SensorManager SM;
private float x1, x2, x3;
private static final float ERROR = (float) 7.0;
private static final float SHAKE_THRESHOLD = 15.00f; // m/s^2
private static final int MIN_TIME_BETWEEN_SHAKES_MILLISECS = 1000;
private long mLastShakeTime;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        long curTime = System.currentTimeMillis();
        if ((curTime - mLastShakeTime) > MIN_TIME_BETWEEN_SHAKES_MILLISECS) {
            float x = event.values[0];
            float y = event.values[1];
            float z = event.values[2];
            double acceleration = Math.sqrt(Math.pow(x, 2) + Math.pow(y, 2)
                    + Math.pow(z, 2)) - SensorManager.GRAVITY_EARTH;
            Log.d("mySensor", "Acceleration is " + acceleration + " m/s^2");
            if (acceleration > SHAKE_THRESHOLD) {
                mLastShakeTime = curTime;
                Toast.makeText(getApplicationContext(), "FALL DETECTED",
                        Toast.LENGTH_LONG).show();
            }
        }
    }
}
There is no "Fall Sensor", your guess in using the accelerometer is right. Record and measure the accelerometer data when it falls and deduct a model from there.
You have to use the accelerometer, but it also picks up very small movements.
The usual approach is to look at the change between consecutive readings: if it is very large, the phone has travelled a long distance quickly (say it fell out of a hand) or received a sudden shock.
For this we need a little physics: any object falling freely under gravity accelerates at 9.8 m/s^2 downward (on Earth). Note, though, that the accelerometer measures proper acceleration, so it reads about 9.8 m/s^2 while the phone sits still and close to 0 m/s^2 while the phone is actually in free fall. So:
1) watch the magnitude of the accelerometer readings, and
2) if it drops near zero for a moment (optionally followed by an impact spike), show the toast; see the sketch below.
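A minimal sketch of that idea (my code, not the answerer's; both thresholds are illustrative and need tuning on real hardware):
private static final float FREE_FALL_THRESHOLD = 2.0f; // m/s^2, illustrative
private static final float IMPACT_THRESHOLD = 25.0f;   // m/s^2, illustrative
private boolean inFreeFall = false;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
        return;
    }
    float x = event.values[0];
    float y = event.values[1];
    float z = event.values[2];
    double magnitude = Math.sqrt(x * x + y * y + z * z);
    if (magnitude < FREE_FALL_THRESHOLD) {
        inFreeFall = true;  // the sensor reads ~0 m/s^2 while the phone falls
    } else if (inFreeFall && magnitude > IMPACT_THRESHOLD) {
        inFreeFall = false; // free fall followed by an impact spike
        Toast.makeText(getApplicationContext(), "FALL DETECTED",
                Toast.LENGTH_LONG).show();
    }
}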
Hi all, I got this done myself. A small change got me past the problem: I just took two variables, set their values negative, and used them as the threshold values for the acceleration.
Thanks, everyone.

Step detector using the Android accelerometer

I am building an app that counts steps using the accelerometer on Android devices (Android version < 4.4).
To do this I compute the acceleration magnitude, normalized by gravity (G), from x, y and z. If the value of G is larger than STEP_THRESHOLD, a step has occurred:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
            return;
        }
        final float x = event.values[0];
        final float y = event.values[1];
        final float z = event.values[2];
        final float g = Math.abs((x * x + y * y + z * z)
                / (SensorManager.GRAVITY_EARTH * SensorManager.GRAVITY_EARTH));
        if (g > StepUtil.GRAVITY_THRESHOLD) {
            /*
             * check step times and other checks
             */
            // ...
            stepCounter++;
        }
    }
}
The problem is: since accelerometer hardware varies from device to device, there is no single correct value for STEP_THRESHOLD.
It seems that the STEP_THRESHOLD value should be dynamic for each device!
Is there any criterion for changing the STEP_THRESHOLD value based on accelerometer accuracy?
Any help will be appreciated.
"since accelerometer hardware varies in each device, there is not a certain value for STEP_THRESHOLD"
That said, what your app is apparently missing is a simple calibration step, which would allow your users to set the threshold level for their own device. Tell them to take a couple of steps, record the sensor values, take the mean/median (whatever works better for you), and you have your STEP_THRESHOLD. You could even send this value back to your server to build up a database and arrive at a more or less universal starting value, STEP_AVERAGE.
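A minimal calibration sketch along those lines (my illustration; StepUtil.GRAVITY_THRESHOLD and stepCounter come from the question, everything else is assumed, including that the threshold is writable):
private final List<Float> calibrationPeaks = new ArrayList<>();
private boolean calibrating = true;

// Feed this from onSensorChanged() with the accelerometer values.
private void onAccelerometerSample(float x, float y, float z) {
    float g = Math.abs((x * x + y * y + z * z)
            / (SensorManager.GRAVITY_EARTH * SensorManager.GRAVITY_EARTH));
    if (calibrating) {
        if (g > 1.1f) {              // crude peak pick; tune as needed
            calibrationPeaks.add(g);
        }
    } else if (g > StepUtil.GRAVITY_THRESHOLD) {
        stepCounter++;
    }
}

// Call this when the user finishes the calibration walk. Scaling the mean
// down slightly also catches step peaks that fall below the mean.
private void finishCalibration() {
    float sum = 0f;
    for (float peak : calibrationPeaks) {
        sum += peak;
    }
    if (!calibrationPeaks.isEmpty()) {
        StepUtil.GRAVITY_THRESHOLD = 0.9f * (sum / calibrationPeaks.size());
    }
    calibrating = false;
}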

Android: Problems calculating the Orientation of the Device

I'm trying to build a simple augmented-reality app, so I started working with the sensor data.
According to this thread (Android compass example) and this example (http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html), the calculation of the orientation using Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really work.
I'm not able to get "good" values. The azimuth values don't make any sense at all: if I just tilt the phone upwards, the value changes dramatically, and even if I just rotate the phone, the values don't represent the phone's orientation.
Does anybody have an idea how to improve the quality of the values in the given example?
In what orientation are you using this sample app? From what is written in this code, the only orientation supported is portrait or flat on the table, depending on the device. What do you mean by "good"?
It is normal that the values are not "good" when rotating the device. The device coordinate system is defined for the device held in its default orientation: the Y axis runs vertically along the screen pointing up, the Z axis points out of the screen from its center, and the X axis is perpendicular to the Y axis, pointing right along the screen. Rotating the device does not rotate the device coordinate system with it; you have to remap it.
But if you want the heading of the device in portrait orientation, here is a piece of code that works well for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix,
                event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You'll get the heading (or azimuth) in:
orientationVals[0]
The answer from Tíbó is good, but if you log the roll value you will see erratic numbers. (Roll is important for AR browsers.)
This is due to:
SensorManager.remapCoordinateSystem(mRotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z,
        mRotationMatrix);
You have to use different matrices for the input and the output of the remap. The following code works for me with a correct roll value:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll: "
                + orientationVals[2]);
    }
}
Probably late to the party. Anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat,
                event.values);
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees [-180, +180]
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
Hope it helps
Have you tried the combined (sensor-fusion) type Sensor.TYPE_ROTATION_VECTOR? It may give better results:
Go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for 'rotation_vector'.
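For reference, registering for that sensor could look like this minimal sketch (assuming an Activity that implements SensorEventListener):
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor rotationVector = sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
if (rotationVector != null) { // the fused sensor is not available on every device
    sm.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_GAME);
}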
Here's a Kotlin approach with all the necessary matrices included (for some reason the previous answers leave out the array sizes, which matter)
// This is determined from the deprecated Sensor.TYPE_ORIENTATION
var lastOrientation: FloatArray = FloatArray(3)
var lastHeading: Float = 0f
var currentHeading: Float = 0f

// This is from the non-deprecated Sensor.TYPE_ROTATION_VECTOR
var lastVectorOrientation: FloatArray = FloatArray(5)
var lastVectorHeading: Float = 0f
var currentVectorHeading: Float = 0f

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor?.type) {
        null -> return
        Sensor.TYPE_ORIENTATION -> {
            lastOrientation = event.values
            lastHeading = currentHeading
            currentHeading = abs(event.values[0].roundToInt().toFloat())
        }
        Sensor.TYPE_ROTATION_VECTOR -> {
            lastVectorOrientation = event.values
            lastVectorHeading = currentVectorHeading
            val tempRotationMatrix = FloatArray(9)
            val tempOrientationMatrix = FloatArray(3)
            getRotationMatrixFromVector(tempRotationMatrix, event.values)
            remapCoordinateSystem(tempRotationMatrix, AXIS_X, AXIS_Z, tempRotationMatrix)
            getOrientation(tempRotationMatrix, tempOrientationMatrix)
            currentVectorHeading = Math.toDegrees(tempOrientationMatrix[0].toDouble()).toFloat()
            if (currentVectorHeading < 0) {
                // heading = 360 - abs(negative heading), i.e. 360 + (-heading)
                currentVectorHeading += 360f
            }
        }
        else -> return
    }
}
I've also included the deprecated Sensor.TYPE_ORIENTATION for anybody wanting to see the difference between the two approaches: there is a difference of several degrees between the deprecated method and the updated approach.

Problem with SensorManager.getOrientation()

In the reference pages of android.view.onSensorChanged() the axes of the device are described as:
"The X axis refers to the screen's horizontal axis (the small edge in portrait mode, the long edge in landscape mode) and points to the right. The Y axis refers to the screen's vertical axis and points towards the top of the screen (the origin is in the lower-left corner). The Z axis points toward the sky when the device is lying on its back on a table."
And in android.hardware.SensorManager.getOrientation() it is mentioned that the method returns the device's azimuth, pitch and roll, which are positive in the counter-clockwise direction. But when I call the function from my code and print the values, the azimuth and pitch are positive in the clockwise direction and negative in the counter-clockwise direction. The rotation matrix obtained from android.hardware.SensorManager.getRotationMatrix() meets the requirements if what I claim is correct, i.e. the azimuth and pitch are positive in the clockwise direction and roll is positive in the counter-clockwise direction. Please correct me if I am wrong. I'm using an HTC Wildfire S, which runs Android 2.3.3.
I'm adding the code segment I use for obtaining the orientation values:
sensormanager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gsensor = sensormanager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Sensor msensor = sensormanager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
sensormanager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
sensormanager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
-------------------------------------------------------------------------------
final float rad2deg = (float) (180.0f / Math.PI);
int type = event.sensor.getType();
float[] data;
if (type == Sensor.TYPE_ACCELEROMETER) {
    data = gravityValues;
} else if (type == Sensor.TYPE_MAGNETIC_FIELD) {
    data = geomagneticValues;
} else {
    return;
}
for (int i = 0; i < 3; i++)
    data[i] = event.values[i];
SensorManager.getRotationMatrix(rValues, null, gravityValues, geomagneticValues);
SensorManager.getOrientation(rValues, mOrientation);
result.setText("Compass yaw: " + (int) (mOrientation[0] * rad2deg)
        + " pitch: " + (int) (mOrientation[1] * rad2deg)
        + " roll: " + (int) (mOrientation[2] * rad2deg));
Please let me know if I missed mentioning any required information. Thanks in advance.
The documentation is correct: the Z axis in the getOrientation() call points down toward the earth, so a rotation that is counter-clockwise about that axis looks clockwise in the usual sense (viewed from above).
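So if you want a conventional 0-360° compass heading that increases clockwise (east = 90°), you can use the values as they come; a small sketch using mOrientation from the question's code:
// getOrientation() returns azimuth in radians in the range [-π, π];
// because Z points down, it already increases clockwise when the
// device is viewed from above.
float azimuthDeg = (float) Math.toDegrees(mOrientation[0]); // -180..180
float compassHeading = (azimuthDeg + 360f) % 360f;          // 0..360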

Convert values from Sensor.TYPE_ORIENTATION to Euler angles?

I have to write a compass app for Android. The only thing the user sees on the screen is a cube with one red wall, which has to point north. This part is not important. What is important is that I need to rotate the cube according to the rotation of the device itself, so that the red wall keeps pointing north no matter how the phone is held. My code is simple and straightforward:
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                direction = event.values[2];
                break;
            case Sensor.TYPE_ORIENTATION:
                if (direction < 0) {
                    angleX = event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                } else {
                    angleX = -event.values[1];
                    angleY = -event.values[2];
                    angleZ = event.values[0];
                }
                break;
        }
    }
}
I have added this extra direction variable, which simply stores whether the phone's display is pointing downwards or upwards. I don't know if I need it, but it seems to fix some bugs. I am using the SensorSimulator for Android, but whenever the pitch slider goes into the [-90, 90] interval, the other variables get mixed up, as if they were given a 180° offset. And I can't detect when I am in this interval, because the pitch range is only -90 to 90: I can move the slider from left to right and always stay inside it.
This was all just to show how far my code has advanced. I am not saying how this problem should be solved, because I would probably only steer myself into a dead end. You see, I have been trying to write this app for three days now, and you can imagine how pissed my boss is. I have read all sorts of tutorials and tried every formula I could find or think of. So please help me: all I need to know is how to rotate my cube, whose rotation angles are Euler angles in degrees.
Here's some code I wrote to do something pretty similar, really only caring about the rotation of the device in the roll direction. Hope it helps! It just uses the accelerometer values to determine the pitch and roll; there is no need to get the orientation of the view.
public void onSensorChanged(SensorEvent event) {
    float x = -1 * event.values[0] / SensorManager.GRAVITY_EARTH;
    float y = -1 * event.values[1] / SensorManager.GRAVITY_EARTH;
    float z = -1 * event.values[2] / SensorManager.GRAVITY_EARTH;

    float signedRawRoll = (float) (Math.atan2(x, y) * 180 / Math.PI);
    float unsignedRawRoll = Math.abs(signedRawRoll);
    float rollSign = signedRawRoll / unsignedRawRoll;
    float rawPitch = Math.abs(z * 180);

    // Use a basic low-pass filter to keep only the gravity component of the
    // accelerometer values for the X and Y axes; adjust the filter weight
    // based on pitch, as roll is harder to define as pitch approaches 180.
    float filterWeight = rawPitch > 165 ? 0.85f : 0.7f;
    float newUnsignedRoll = filterWeight * Math.abs(this.roll)
            + (1 - filterWeight) * unsignedRawRoll;
    this.roll = rollSign * newUnsignedRoll;
    if (Float.isInfinite(this.roll) || Float.isNaN(this.roll)) {
        this.roll = 0;
    }
    this.pitch = filterWeight * this.pitch + (1 - filterWeight) * rawPitch;

    for (IAngleListener listener : listeners) {
        listener.deviceRollAndPitch(this.roll, this.pitch);
    }
}
