I'm trying to get the device acceleration in Unity to move an object:
public class ExampleClass : MonoBehaviour {
    public float speed = 10.0F;

    void Update() {
        Vector3 dir = Vector3.zero;

        // We assume the device is held parallel to the ground
        // and the Home button is in the right hand.

        // Remap device acceleration axes to game coordinates:
        // 1) XY plane of the device is mapped onto the XZ plane
        // 2) rotated 90 degrees around the Y axis
        dir.x = -Input.acceleration.y;
        dir.z = Input.acceleration.x;

        // Clamp acceleration vector to the unit sphere
        if (dir.sqrMagnitude > 1)
            dir.Normalize();

        // Make it move 10 meters per second instead of 10 meters per frame
        dir *= Time.deltaTime;

        // Move object
        transform.Translate(dir * speed);
    }
}
But when I run the game on my device, the object moves and stops depending on the orientation of the device, not its acceleration.
I also tried to print the Input.acceleration readings
GUI.Button(new Rect(10, 10, 150, 80), Input.acceleration.x + "\n" + Input.acceleration.y + "\n" + Input.acceleration.z);
and I noticed that the three values change only when I rotate the device, and they always stay between -1 and 1.
I know that the accelerometer is used for measuring acceleration, not rotation, and that the sensor that measures rotation is the gyroscope.
Why is this happening? How can I read acceleration instead of rotation?
Most devices have a gyroscope these days, so try Input.gyro.userAcceleration.
Note that on most Android devices the gyroscope is turned off by default, and you need to set Input.gyro.enabled to true.
Related
I'm developing an app similar to a camera app, with a line in the center of the camera preview that shows the device's tilt relative to the horizon.
I'm using only the accelerometer, since my use case is a semi-stationary device (the user may move and tilt it and the line will update, but they should only expect a fair reading if the device is somewhat still).
This is what I've got so far:
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            float[] acc = event.values.clone(); // [accX, accY, accZ]
            // Magnitude of the acceleration vector (~gravity while the device is still)
            double gravityNormalized = Math.sqrt(acc[0] * acc[0] + acc[1] * acc[1] + acc[2] * acc[2]);
            double[] accNormalized = new double[3];
            accNormalized[0] = acc[0] / gravityNormalized;
            accNormalized[1] = acc[1] / gravityNormalized;
            accNormalized[2] = acc[2] / gravityNormalized;
            double[] tiltDegrees = new double[3];
            tiltDegrees[0] = Math.toDegrees(Math.asin(accNormalized[0]));
            tiltDegrees[1] = Math.toDegrees(Math.asin(accNormalized[1]));
            tiltDegrees[2] = Math.toDegrees(Math.asin(accNormalized[2]));
            Log.d(TAG, "tiltDegrees: " + Arrays.toString(tiltDegrees)
                    + ", accNormalized: " + Arrays.toString(accNormalized)
                    + ", acc: " + Arrays.toString(acc));
            ((ImageView) findViewById(R.id.levelLine)).setRotation((int) tiltDegrees[0]);
            break;
    }
}
It seems to be working fairly well, as long as the rotation is moderate. When I near 90 degrees of device rotation (portrait orientation vs. the horizon), the line stops rotating and is thus no longer aligned with the horizon, but slightly tilted (I'd say by about 5-10 degrees). Also, if I continue to rotate the device past 90 degrees, the line tilts further and further away from the horizon and becomes vertical when the device is rotated 135/-135 degrees.
Am I doing something wrong here? Is there a better way of getting the device tilt?
Further along in development I have the need to accurately get the tilt "forwards/backwards" (i.e. how many degrees the phone is tilted forwards/backwards while in portrait orientation). I'm thinking I could use the same approach I'm doing now but on tiltDegrees[1] instead of tiltDegrees[0]?
I somehow find it a bit strange that there is a TYPE_ROTATION_VECTOR that relates to the world-frame (north/south/east/west) but no device-frame TYPE_DEVICE_VECTOR or something similar to easily get the rotation of the device along its X, Y and Z axis (e.g. in relation to a perfect portrait mode).
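For what it's worth, the saturation near 90° is inherent to taking asin() of a single normalized axis: asin only covers ±90° and its slope blows up near the poles. A common alternative (my own suggestion, not from the question) is atan2 over two axes, which spans the full ±180° range:

```java
public class HorizonTilt {

    // Roll relative to upright portrait, from raw accelerometer X and Y.
    // atan2 uses the ratio of two axes, so it spans -180..180 degrees
    // instead of saturating at +-90 like asin of a single axis.
    static double rollDegrees(double accX, double accY) {
        return Math.toDegrees(Math.atan2(accX, accY));
    }

    public static void main(String[] args) {
        System.out.println(rollDegrees(0.0, 9.81));   // upright portrait: ~0
        System.out.println(rollDegrees(9.81, 0.0));   // on its left side: ~90
        System.out.println(rollDegrees(0.0, -9.81));  // upside down: ~180
    }
}
```

The same idea should work for the forwards/backwards tilt by feeding in the Z axis instead, e.g. atan2(accZ, accY); the exact axis pairing and signs depend on your chosen conventions, so treat this as a sketch.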
TL;DR
How come the accelerometer values I get from Sensor.TYPE_ACCELEROMETER are slightly offset? I don't mean by gravity, but by some small error that varies from axis to axis and phone to phone.
Can I calibrate the accelerometer? Or is there a standard way of compensating for these errors?
I'm developing an app that has a need for as precise acceleration measurements as possible (mainly vertical acceleration, i.e. same direction as gravity).
I've been doing A LOT of testing, and it turns out that the raw values I get from Sensor.TYPE_ACCELEROMETER are off. If I let the phone rest on a perfectly horizontal surface with the screen up, the accelerometer shows a Z-value of 9.0, where it should be about 9.81. Likewise, if I put the phone in portrait or landscape mode, the X and Y accelerometer values show about 9.6 instead of 9.81.
This of course affects my vertical acceleration, as I'm using SensorManager.getRotationMatrixFromVector() to calculate the vertical acceleration, resulting in a value that is off by a different amount depending on the rotation of the device.
Now, before anyone jumps the gun and mentions that I should try Sensor.TYPE_LINEAR_ACCELERATION instead, I must point out that I'm actually doing that as well, in parallel with TYPE_ACCELEROMETER. Using the gravity sensor I then calculate the vertical acceleration (as described in this answer). The funny thing is that I get EXACTLY the same result as the method that uses the raw accelerometer, SensorManager.getRotationMatrixFromVector() and matrix multiplication (and finally subtracting gravity).
The only way I'm able to get almost exactly zero vertical acceleration for a stationary phone in any rotation is to take the raw accelerometer values, add an offset (from earlier observations, i.e. X+0.21, Y+0.21 and Z+0.81) and then perform the rotation matrix math to get the world coordinate system accelerations. Note that it's not just the calculated vertical acceleration that is wrong; the raw values from Sensor.TYPE_ACCELEROMETER themselves are off, which I would think rules out other error sources like the gyroscope, etc.
I have tested this on two different phones (Samsung Galaxy S5 and Sony Xperia Z3 Compact), and both show these accelerometer deviations, though of course not the same values on both phones.
How come the values of Sensor.TYPE_ACCELEROMETER are off, and is there a better way of "calibrating" the accelerometer than simply observing how much they deviate from gravity and adding the difference before using them?
You should calibrate the gains, offsets, and axis alignment of the 3 accelerometer axes.
Unfortunately it's not possible to cover the whole topic in depth here.
I'll write a short introduction describing the basic concept, and then post a link to the code of a simple clinometer that implements the calibration.
The calibration routine can be done with 7 measurements (take the mean of a good number of samples) in different orthogonal positions of your choice, in order to capture all the ±0 and ±g values of your accelerometer axes. For example:
STEP 1 = Lay flat
STEP 2 = Rotate 180°
STEP 3 = Lay on the left side
STEP 4 = Rotate 180°
STEP 5 = Lay vertical
STEP 6 = Rotate 180° upside-down
STEP 7 = Lay face down
Then you can use the 7 measurements mean[][] to calculate offsets and gains:
calibrationOffset[0] = (mean[0][2] + mean[0][3]) / 2;
calibrationOffset[1] = (mean[1][4] + mean[1][5]) / 2;
calibrationOffset[2] = (mean[2][0] + mean[2][6]) / 2;
calibrationGain[0] = (mean[0][2] - mean[0][3]) / (STANDARD_GRAVITY * 2);
calibrationGain[1] = (mean[1][4] - mean[1][5]) / (STANDARD_GRAVITY * 2);
calibrationGain[2] = (mean[2][0] - mean[2][6]) / (STANDARD_GRAVITY * 2);
using the values of mean[axis][step], where STANDARD_GRAVITY = 9.81.
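As a sanity check (my own addition, not part of the original answer): feeding ideal, error-free readings through these formulas should produce zero offsets and unit gains, which is a quick way to convince yourself the step/axis pairing is right:

```java
// Sanity check of the 7-step offset/gain formulas, using ideal readings.
public class CalibrationCheck {
    static final double STANDARD_GRAVITY = 9.81;

    // mean[axis][step], step numbering as in the 7-step procedure above.
    // Returns { offsetX, offsetY, offsetZ, gainX, gainY, gainZ }.
    static double[] calibrate(double[][] mean) {
        double[] r = new double[6];
        r[0] = (mean[0][2] + mean[0][3]) / 2;
        r[1] = (mean[1][4] + mean[1][5]) / 2;
        r[2] = (mean[2][0] + mean[2][6]) / 2;
        r[3] = (mean[0][2] - mean[0][3]) / (STANDARD_GRAVITY * 2);
        r[4] = (mean[1][4] - mean[1][5]) / (STANDARD_GRAVITY * 2);
        r[5] = (mean[2][0] - mean[2][6]) / (STANDARD_GRAVITY * 2);
        return r;
    }

    public static void main(String[] args) {
        double g = STANDARD_GRAVITY;
        double[][] mean = new double[3][7];   // all other entries stay 0
        mean[0][2] = g;  mean[0][3] = -g;     // steps 3/4: left side / rotated
        mean[1][4] = g;  mean[1][5] = -g;     // steps 5/6: vertical / upside-down
        mean[2][0] = g;  mean[2][6] = -g;     // steps 1/7: flat / face down
        double[] r = calibrate(mean);
        System.out.printf(java.util.Locale.US,
                "offsets: %.2f %.2f %.2f, gains: %.2f %.2f %.2f%n",
                r[0], r[1], r[2], r[3], r[4], r[5]);
        // prints: offsets: 0.00 0.00 0.00, gains: 1.00 1.00 1.00
    }
}
```

Real readings then yield non-trivial offsets and gains, which the correction loop below divides out.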
Then apply the gain and offset corrections to the measurements:
for (int i = 0; i < 7; i++) {
mean[0][i] = (mean[0][i] - calibrationOffset[0]) / calibrationGain[0];
mean[1][i] = (mean[1][i] - calibrationOffset[1]) / calibrationGain[1];
mean[2][i] = (mean[2][i] - calibrationOffset[2]) / calibrationGain[2];
}
and finally calculate the correction angles:
for (int i = 0; i < 7; i++) {
angle[0][i] = (float) (Math.toDegrees(Math.asin(mean[0][i]
/ Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i]))));
angle[1][i] = (float) (Math.toDegrees(Math.asin(mean[1][i]
/ Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i]))));
angle[2][i] = (float) (Math.toDegrees(Math.asin(mean[2][i]
/ Math.sqrt(mean[0][i] * mean[0][i] + mean[1][i] * mean[1][i] + mean[2][i] * mean[2][i]))));
}
calibrationAngle[2] = (angle[0][0] + angle[0][1])/2; // angle 0 = X axis
calibrationAngle[1] = -(angle[1][0] + angle[1][1])/2; // angle 1 = Y axis
calibrationAngle[0] = -(angle[1][3] - angle[1][2])/2; // angle 2 = Z axis
You can find a simple but complete implementation of 3-axis calibration in this open-source Clinometer app: https://github.com/BasicAirData/Clinometer.
The APK and the Google Play Store link are also there if you want to try it.
You can find the calibration routine in CalibrationActivity.java;
The calibration parameters are applied in ClinometerActivity.java.
Furthermore, there is a very good technical article that covers 3-axis calibration in depth here: https://www.digikey.it/it/articles/using-an-accelerometer-for-inclination-sensing.
From my Android device I can read an array of linear acceleration values (in the device's coordinate system) and an array of absolute orientation values (in Earth's coordinate system). What I need is to obtain the linear acceleration values in the latter coord. system.
How can I convert them?
EDIT after Ali's reply in the comments:
All right, so if I understand correctly: when I measure the linear acceleration, the position of the phone does not matter at all, because the readings are given in the Earth's coordinate system. Right?
But I just did a test where I put the phone in different positions and got acceleration on different axes. There are 3 pairs of pictures - the first one of each pair shows how I held the device (sorry for my Paint "master skills") and the second shows the readings from the linear acceleration sensor:
device put on left side
device lying on back
device standing
And now - why does the acceleration in the third case occur along the Z axis (not Y), if the device's position doesn't matter?
I finally managed to solve it! So, to get the acceleration vector in the Earth's coordinate system you need to:
get the rotation matrix (a float[16] so it can later be used by the android.opengl.Matrix class) from SensorManager.getRotationMatrix() (using the Sensor.TYPE_GRAVITY and Sensor.TYPE_MAGNETIC_FIELD sensor values as parameters),
use android.opengl.Matrix.invertM() on the rotation matrix to invert it (not transpose!),
use the Sensor.TYPE_LINEAR_ACCELERATION sensor to get the linear acceleration vector (in the device's coordinate system),
use android.opengl.Matrix.multiplyMV() to multiply the inverted rotation matrix by the linear acceleration vector.
And there you have it! I hope I will save some precious time for others.
Thanks to Edward Falk and Ali for the hints!
Based on @alex's answer, here is the code snippet:
private float[] gravityValues = null;
private float[] magneticValues = null;

@Override
public void onSensorChanged(SensorEvent event) {
    if ((gravityValues != null) && (magneticValues != null)
            && (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)) {
        float[] deviceRelativeAcceleration = new float[4];
        deviceRelativeAcceleration[0] = event.values[0];
        deviceRelativeAcceleration[1] = event.values[1];
        deviceRelativeAcceleration[2] = event.values[2];
        deviceRelativeAcceleration[3] = 0;

        // Change the device-relative acceleration values to Earth-relative values
        // X axis -> East
        // Y axis -> North Pole
        // Z axis -> Sky
        float[] R = new float[16], I = new float[16], earthAcc = new float[16];
        SensorManager.getRotationMatrix(R, I, gravityValues, magneticValues);
        float[] inv = new float[16];
        android.opengl.Matrix.invertM(inv, 0, R, 0);
        android.opengl.Matrix.multiplyMV(earthAcc, 0, inv, 0, deviceRelativeAcceleration, 0);
        Log.d("Acceleration", "Values: (" + earthAcc[0] + ", " + earthAcc[1] + ", " + earthAcc[2] + ")");
    } else if (event.sensor.getType() == Sensor.TYPE_GRAVITY) {
        gravityValues = event.values.clone(); // clone: the system reuses event.values
    } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
        magneticValues = event.values.clone();
    }
}
According to the documentation you get the linear acceleration in the phone's coordinate system.
You can transform any vector from the phone's coordinate system to the Earth's coordinate system by multiplying it with the rotation matrix. You can get the rotation matrix from getRotationMatrix().
(Perhaps there already is a function doing this multiplication for you but I don't do Android programming and I am not familiar with its API.)
A nice tutorial on the rotation matrix is the Direction Cosine Matrix IMU: Theory manuscript. Good luck!
OK, first of all, if you're trying to do actual inertial navigation on Android, you've got your work cut out for you. The cheap little sensors used in smartphones are just not precise enough. That said, there has been some interesting work done on inertial navigation over small distances, such as inside a building. There are probably papers on the subject you can dig up. Google "Motion Interface Developers Conference" and you might find something useful; that's a conference that Invensense put on a couple of months ago.
Second, no, linear acceleration is in device coordinates, not world coordinates. You'll have to convert it yourself, which means knowing the device's 3-D orientation.
What you want to do is use a version of Android that supports the virtual sensors TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. You'll need a device with gyros to get reasonably accurate and precise readings.
Internally, the system combines gyros, accelerometers, and magnetometers in order to come up with true values for the device orientation. This effectively splits the accelerometer device into its gravity and acceleration components.
So what you want to do is set up sensor listeners for TYPE_GRAVITY, TYPE_LINEAR_ACCELERATION, and TYPE_MAGNETIC_FIELD. Use the gravity and magnetometer data as inputs to SensorManager.getRotationMatrix() in order to get the rotation matrix that will transform world coordinates into device coordinates or vice versa. In this case, you'll want the "versa" part: convert the linear acceleration input to world coordinates by multiplying it by the transpose of the orientation matrix.
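To illustrate that last step with a plain-Java sketch (my own, not Android API code): with the row-major 3x3 layout that SensorManager.getRotationMatrix() fills into a float[9], multiplying by the transpose just swaps the two indices:

```java
// Sketch: v' = R^T * v for a row-major 3x3 matrix R,
// the layout getRotationMatrix() uses for a float[9].
public class FrameTransform {

    static float[] multiplyTranspose(float[] R, float[] v) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                out[i] += R[j * 3 + i] * v[j];   // R^T[i][j] == R[j][i]
        return out;
    }

    public static void main(String[] args) {
        // 90-degree rotation about the Z axis, row-major
        float[] R = { 0, -1, 0,
                      1,  0, 0,
                      0,  0, 1 };
        float[] w = multiplyTranspose(R, new float[] { 1, 0, 0 });
        System.out.println(w[0] + " " + w[1] + " " + w[2]);
        // prints: 0.0 -1.0 0.0
    }
}
```

For a pure rotation matrix the transpose equals the inverse, which is why this and the invertM() approach shown elsewhere in this thread end up doing the same thing.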
Recently I have done some research into using both the accelerometer and gyroscope to track a smartphone without the help of GPS (see this post):
Indoor Positioning System based on Gyroscope and Accelerometer
For that purpose I need my orientation (pitch, roll, etc.), so here is what I have done so far:
public void onSensorChanged(SensorEvent arg0) {
    if (arg0.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        accel[0] = arg0.values[0];
        accel[1] = arg0.values[1];
        accel[2] = arg0.values[2];
        pitch = Math.toDegrees(Math.atan2(accel[1],
                Math.sqrt(Math.pow(accel[2], 2) + Math.pow(accel[0], 2))));
        tv2.setText("Pitch: " + pitch + "\n" + "Roll: " + roll);
    } else if (arg0.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        if (timestamp != 0) {
            final float dT = (arg0.timestamp - timestamp) * NS2S;
            angle[0] += arg0.values[0] * dT;
            filtered_angle[0] = (0.98f) * (filtered_angle[0] + arg0.values[0] * dT)
                    + (0.02f) * (pitch);
        }
        timestamp = arg0.timestamp;
    }
}
Here I'm computing the angle (just for testing) from my accelerometer (pitch), and from integrating the gyroscope X axis over time, filtering it with a complementary filter:
filtered_angle[0] = (0.98f) * (filtered_angle[0] + gyro_x * dT) + (0.02f) * (pitch)
with dT being more or less 0.009 seconds.
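As a quick illustration of what this filter does (a standalone simulation with made-up numbers, not the question's sensor data): a constant gyro bias makes pure integration drift without bound, while the complementary filter stays pinned near the accelerometer's reference angle:

```java
import java.util.Locale;

public class ComplementaryDemo {
    public static void main(String[] args) {
        double dT = 0.009;        // ~9 ms between samples, as in the question
        double truePitch = 30.0;  // device held still at 30 degrees (made up)
        double gyroBias = 0.5;    // deg/s of gyro bias (made up)
        double integrated = 0.0, filtered = 0.0;
        for (int i = 0; i < 20000; i++) {          // ~3 minutes of samples
            double gyroRate = gyroBias;            // no real rotation, only bias
            integrated += gyroRate * dT;           // pure integration: drifts
            filtered = 0.98 * (filtered + gyroRate * dT) + 0.02 * truePitch;
        }
        System.out.printf(Locale.US, "integrated=%.1f filtered=%.1f%n",
                integrated, filtered);
        // prints: integrated=90.0 filtered=30.2
    }
}
```

Note the filter only works cleanly when both terms share a unit: Android's TYPE_GYROSCOPE reports rad/s, while the pitch in the question's code is in degrees, so one of the two must be converted before mixing them.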
But I don't know why my angles are not really accurate... When the device lies flat on the table (screen facing up):
Pitch (angle from accel) = 1.5 (average)
Integrated gyro = 0 and growing (normal, it's drifting)
Filtered gyro angle = 1.2
and when I lift the phone by 90° (so the screen faces the wall in front of me):
Pitch (angle from accel) = 86 (MAXIMUM)
Integrated gyro = it has drifted away, as expected
Filtered gyro angle = 83 (MAXIMUM)
So the angles never reach 90? Even if I try to lift the phone a bit more...
Why doesn't it go all the way to 90°? Are my calculations wrong, or is the quality of the sensor poor?
Another thing I'm wondering about: with Android I don't "read out" the sensor values; I'm notified when they change. The problem is that, as you can see in the code, the accelerometer and gyroscope share the same callback... so when I compute the filtered angle, I'm using a pitch from an accelerometer measurement taken about 0.009 seconds earlier, no? Could that be the source of my problem?
Thank you!
I can only repeat myself.
You get position by integrating the linear acceleration twice, but the error is horrible. It is useless in practice. In other words, you are trying to solve the impossible.
What you actually can do is to track just the orientation.
Roll, pitch and yaw are evil; do not use them. Check the video I already recommended, at 38:25.
Here is an excellent tutorial on how to track orientation with gyros and accelerometers.
Similar questions that you might find helpful:
track small movements of iphone with no GPS
What is the real world accuracy of phone accelerometers when used for positioning?
how to calculate phone's movement in the vertical direction from rest?
iOS: Movement Precision in 3D Space
How to use Accelerometer to measure distance for Android Application Development
Distance moved by Accelerometer
How can I find distance traveled with a gyroscope and accelerometer?
I wrote a tutorial on the use of the complementary filter for orientation tracking with gyroscope and accelerometer: http://www.pieter-jan.com/node/11 - maybe it can help you.
I tested your code and found that the scale factor is probably not consistent.
Converting the pitch to radians (0-π) gives a better result.
In my test, the filtered result is ~90 degrees.
pitch = (float) Math.toDegrees(Math.atan2(accel[1], Math.sqrt(Math.pow(accel[2], 2) + Math.pow(accel[0], 2))));
pitch = pitch*PI/180.f;
filtered_angle = weight * (filtered_angle + event.values[0] * dT) + (1.0f-weight)* (pitch);
I tried this, and it will give you an angle of 90:
filtered_angle = (filtered_angle / 83) * 90;
I am working on a project which includes an Android application that is used for controlling/steering.
Speed: when you tilt the phone forwards/backwards (pitch), it simulates giving gas and braking.
Direction: when you tilt the phone left/right (roll), it simulates steering to the left and right.
I have already written some code which seemed to work fine. But when I took a closer look, I found that some values are acting weird.
When I tilt the phone forward/backward to control the speed, it works perfectly; I get the expected speed and direction values. But when I tilt the phone left/right to control the direction, it seems to corrupt some values: tilting left/right doesn't only change the direction value (roll), it also affects the speed value (pitch).
For extra information:
Programming for Android 2.2
Device is an Google Nexus One
Holding the device in portrait
The most relevant code I use to read the sensor values is as follows:
public void onSensorChanged(SensorEvent sensorEvent) {
    synchronized (this) {
        if (sensorEvent.sensor.getType() == Sensor.TYPE_ORIENTATION) {
            float azimuth = sensorEvent.values[0]; // azimuth: rotation around the z-axis
            float pitch = sensorEvent.values[1];   // pitch: rotation around the x-axis
            float roll = sensorEvent.values[2];    // roll: rotation around the y-axis

            System.out.println("pitch: " + pitch);
            System.out.println("roll: " + roll);
            System.out.println("--------------------");

            // Convert the sensor values to the actual speed and direction values
            float speed = (pitch * 2.222f) + 200;
            float direction = roll * -2.222f;
        }
    }
}
So when I run the code and look at the printed values, tilting the device left/right seems to affect the pitch value as well. How come? And how can I get the pure pitch value while rolling, so that tilting the phone left/right doesn't affect/corrupt it?
You could read up on Gimbal lock. That's bitten me before.
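One common workaround (my suggestion, not from the answer above; note also that TYPE_ORIENTATION is deprecated in later Android versions) is to compute pitch and roll yourself from the raw gravity vector with atan2, so each value depends only on the axes you choose. The axis conventions below are illustrative, so adjust signs to your rig:

```java
public class TiltFromGravity {

    // For a still device, TYPE_ACCELEROMETER reports the gravity reaction.
    // Forward/backward tilt in portrait: Y against the X/Z plane.
    static double pitchDeg(double gx, double gy, double gz) {
        return Math.toDegrees(Math.atan2(gy, Math.sqrt(gx * gx + gz * gz)));
    }

    // Left/right tilt: X against Z.
    static double rollDeg(double gx, double gz) {
        return Math.toDegrees(Math.atan2(gx, gz));
    }

    public static void main(String[] args) {
        // flat on the table: both ~0
        System.out.println(pitchDeg(0, 0, 9.81) + " " + rollDeg(0, 9.81));
        // upright portrait: pitch ~90
        System.out.println(pitchDeg(0, 9.81, 0));
    }
}
```

With these formulas, rolling the device only shifts weight between gx and gz while gy stays put, so the pitch value is not dragged along by the roll the way the Euler angles from TYPE_ORIENTATION can be.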