Calculate 3D rotation between two vectors of acceleration in Android

I am creating an app that will measure a vehicle's acceleration along each axis using the accelerometer in an Android smartphone. I need to somehow rotate the phone's measurement coordinate system into the vehicle's coordinate system, so the driver can put the phone in a holder even though the phone will then have a different orientation than the vehicle.
I started with a calibration process where I ask the user to hold the phone so it matches the vehicle's coordinate system, and I save the TYPE_GRAVITY sensor's X, Y, and Z gravity values. Then I ask the user to put the phone in the holder and again save the TYPE_GRAVITY sensor's X, Y, and Z gravity values.
Now I need to find some relation between those two vectors that I can use to correct (rotate) the TYPE_LINEAR_ACCELERATION X, Y, Z data so it matches the vehicle's coordinate system.

Using just the gravity sensor, you lack one rotational axis to compute what you want.
Imagine your phone held vertically: the accelerometer shows the direction toward the bottom of the phone. If you now rotate it around the vertical axis, you still get the same reading from the accelerometer. This means you cannot recover the rotation around the vertical axis this way.
One solution would be to use a gyroscope. That gives you the full rotation of the phone, but it increases the hardware requirements of your app.
A better solution, IMHO, would be to get rid of the calibration process entirely. Cars move mostly straight ahead and only occasionally produce sideways acceleration, so you could scan your readings over a few seconds and find the 'main' (forward/backward) axis and the 'side' axis, as in the sketch below.
Besides, your calibration process still depends on how precisely the user places the phone, so it may not work as well as you expect.
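
A minimal sketch of that 'main axis' idea, under the assumption that pitch and roll have already been removed (for example with the gravity calibration described in the question), so the linear-acceleration samples are expressed in a leveled frame. The class and method names are illustrative, not from the answer, and the sign of the resulting axis (forward vs. backward) still has to be resolved separately, e.g. from GPS speed changes.

```java
// Illustrative only: estimate the vehicle's drive/brake axis from a few seconds of
// leveled linear-acceleration samples (pitch/roll assumed already removed, e.g. via
// the gravity calibration from the question). Class and method names are hypothetical.
public class ForwardAxisEstimator {

    // Running 2x2 second-moment matrix of horizontal acceleration in the leveled frame.
    private double sxx, sxy, syy;
    private int n;

    /** Feed one leveled linear-acceleration sample (gravity removed, z pointing up). */
    public void addSample(double ax, double ay) {
        sxx += ax * ax;
        sxy += ax * ay;
        syy += ay * ay;
        n++;
    }

    /**
     * Returns a unit vector {x, y} along the horizontal axis with the largest
     * acceleration variance (the presumed forward/backward axis), or null if there
     * are not enough samples yet.
     */
    public double[] dominantAxis() {
        if (n < 100) return null;                       // arbitrary minimum sample count
        double a = sxx / n, b = sxy / n, c = syy / n;
        // Largest eigenvalue of the symmetric 2x2 matrix [[a, b], [b, c]].
        double lambda = 0.5 * (a + c) + Math.sqrt(0.25 * (a - c) * (a - c) + b * b);
        double vx = b, vy = lambda - a;                 // corresponding eigenvector
        double len = Math.hypot(vx, vy);
        if (len < 1e-9) return new double[] { 1, 0 };   // degenerate case: b ~ 0 and a >= c
        return new double[] { vx / len, vy / len };
    }
}
```

The 2x2 eigenvector is computed in closed form to avoid a matrix library; the axis with the largest acceleration variance over a stretch of normal driving is taken as the drive/brake axis.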

Related

Tracking phone movement Coordinates with accelerometer

I'm new to Android and was playing around with the accelerometer to get data. I am able to get the acceleration values across all 3 axes using the SensorEvent.
Now I was trying to create a small example where I want to track the motion of the phone (in it's own coordinate system). Suppose I touch a point on the screen, I want it to be the (0,0,0) of my system.
Now if I move my screen to the left by 5 cms in a gentle way, I want the screen reading to say my position is (-5,0,0). Same goes for y axis and z axis.
I know that the accelerometer gives only acceleration, but can I use it to determine the position of the phone w.r.t. the original reference point?
Any pointers would be much appreciated.
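
For illustration only, since this question has no answer on this page: the naive approach is to integrate the linear acceleration twice, as sketched below with a hypothetical listener class. As an answer further down notes, the error of such double integration grows very quickly, so it is not usable for tracking more than a short moment.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import java.util.Arrays;

// Hypothetical listener that double-integrates TYPE_LINEAR_ACCELERATION.
// Drift makes the result unusable after a very short time; this only shows
// why the naive approach does not work well in practice.
public class NaivePositionTracker implements SensorEventListener {

    private final float[] velocity = new float[3]; // m/s, device axes
    private final float[] position = new float[3]; // m, relative to the reset point
    private long lastTimestampNs = 0;

    /** Call when the user touches the screen, defining the (0,0,0) origin. */
    public void reset() {
        Arrays.fill(velocity, 0f);
        Arrays.fill(position, 0f);
        lastTimestampNs = 0;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
            for (int i = 0; i < 3; i++) {
                velocity[i] += event.values[i] * dt; // acceleration -> velocity
                position[i] += velocity[i] * dt;     // velocity -> position
            }
        }
        lastTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    public float[] currentPosition() { return position.clone(); }
}
```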

Differentiate between braking and accelerating on Android

I am trying to write an app that needs to be able to tell whether a vehicle is accelerating or braking. For this I used the accelerometer in the phone. At the moment I have managed to isolate and subtract the Earth's gravitational pull from my readings, and I am using a noise threshold to ignore minor changes (smaller than 10^-2).
To compute the acceleration relative to the ground I am using the formula sqrt(Lx^2 + Ly^2 + Lz^2), where Lx is the linear acceleration along the x axis in m/s^2.
My problem is: how do I differentiate between braking and accelerating, since my final acceleration value will always be greater than 0? I also need this to work even if the vehicle accelerates -> maintains speed -> brakes/accelerates some more. Is my reasoning wrong? Have I made some false assumptions? Would another way be better? The phone I am using for development also has a gravity sensor and a gyroscope.
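
A small illustrative sketch (this page does not contain an answer to this question): the magnitude sqrt(Lx^2 + Ly^2 + Lz^2) is always non-negative, so it cannot carry the sign of the acceleration. Projecting the linear-acceleration vector onto a known 'forward' unit vector keeps the sign; the forward vector is an assumed input here, e.g. from a calibration step, and the class and method names are hypothetical.

```java
// Illustrative sketch: the magnitude formula above is always >= 0, so the sign of the
// acceleration has to come from a projection onto a known forward direction instead.
// The 'forward' unit vector is an assumed input (e.g. from a calibration step).
public final class LongitudinalAcceleration {

    private LongitudinalAcceleration() { }

    /**
     * @param linear  linear acceleration {Lx, Ly, Lz} in m/s^2 (gravity already removed)
     * @param forward unit vector along the vehicle's direction of travel, in the same
     *                coordinate system as {@code linear}
     * @return signed longitudinal acceleration: positive = speeding up, negative = braking
     */
    public static float along(float[] linear, float[] forward) {
        return linear[0] * forward[0] + linear[1] * forward[1] + linear[2] * forward[2];
    }
}
```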

Determining which direction phone is rotating with just accelerometer

So right now I'm grabbing the accelerometer data and converting it to a fairly rough estimate of the angle at which the phone is being held. For now I'm focused only on the yaw axis.
My area of interest is between 0 and 45 degrees on the yaw axis, so I made a limited queue of the past 5 to 10 readings and compared the numbers to determine whether it's going up or down. That kind of works, but it is slow and not as precise or reliable as I'd like.
Is there a way to determine which direction the phone is rotating with just the accelerometer (and the magnetic field sensor, I guess), without keeping a history of past readings? I'm really new to sensor manipulation and Android in general. Any help understanding would be great.
It's not clear exactly what you're looking for here, position or velocity. Generally speaking, you don't want to get a position measurement by integrating the accelerometer data; there is a lot of error associated with that calculation.
If you literally want the "direction your phone is rotating," rather than angular position, you can get that directly from the gyroscope sensor, which provides rotational velocities. That lets you get the direction of rotation from the velocity without storing data. Be aware that not every phone has a gyroscope sensor, though newer ones generally do.
If you want the absolute orientation of the phone (its angular position), you can use the Rotation Vector sensor. This is a combined sensor that automatically fuses data from several of the physical sensors in one go and provides additional accuracy. From this you can get roll-pitch-yaw with a single measurement: first get your data from the Rotation Vector sensor, then pass it to getRotationMatrixFromVector, and feed the output of that into getOrientation (see the same page as the previous link), which will give you roll-pitch-yaw values. You might need to rotate the axes around a bit to get the angles measured positive in the direction you want; a sketch follows below.
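
A minimal sketch of that flow, assuming a listener already registered for TYPE_ROTATION_VECTOR; the class name is mine.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Hypothetical listener showing the Rotation Vector -> rotation matrix -> orientation flow.
public class OrientationListener implements SensorEventListener {

    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll in radians

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuthDeg = (float) Math.toDegrees(orientation[0]); // yaw, about -z
        float pitchDeg   = (float) Math.toDegrees(orientation[1]); // about x
        float rollDeg    = (float) Math.toDegrees(orientation[2]); // about y
        // Compare successive yaw values (or read TYPE_GYROSCOPE values[2] directly)
        // to tell which way the phone is currently rotating.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Register it with something like sensorManager.registerListener(listener, sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR), SensorManager.SENSOR_DELAY_GAME).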

Android Convert device coordinate system to "user" coordinate system

My question is similar to Changing sensor coordinate system in android
I want to be able to compare a user's movements with each other regardless of device orientation, so that when the user holds out the phone in portrait orientation and bends his arm, the acceleration readings are the same as when he holds out the phone in landscape and then bends his arm in the same direction.
This is what I call the "user" coordinate system. It is different from the world coordinate system, since it should not matter which compass direction the user is facing. It is different from device coordinates, since it should not matter how the user holds the device.
It is acceptable in my application to do a calibration step before each movement so the base/resting orientation matrices can be determined. Is it perhaps just a matter of multiplying the matrix of the first movement with the inverse of the second (and then with the new values)?
The answer in the question mentioned seems about right, but I need a more concrete explanation; actual code samples would be ideal.
Note that remapCoordinateSystem won't suffice, as it only accepts right angles. I need to be able to work with small deviations, since the device is strapped to a wrist, which might not always be at right angles with the arm.
I'm currently working on this issue, and I think this might help:
convert the device coordinate system to the world coordinate system
We can assume that most of the time people use the phone standing, walking, or sitting, which means the user coordinate system shares the same z axis (gravity) with the world coordinate system, and there is a fixed angular difference between the y axis of the user coordinates (the front of the user's face) and the y axis of the world coordinates (the north direction). That angular difference can be obtained via the TYPE_MAGNETIC_FIELD sensor. Thus we can transform from world coordinates to user coordinates, as sketched below.
What about the user using the phone lying in bed? I think for that kind of case a pre-calibration is needed to define the y axis of the user coordinates, like a movement to tell the phone which direction the front of the user's face is.
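
A sketch of that transformation, assuming the heading offset between the user's facing direction and magnetic north has already been obtained in a calibration step. The class, method, and parameter names are mine, and the sign convention of the heading rotation may need flipping depending on how the offset is defined.

```java
import android.hardware.SensorManager;

// Hypothetical helper: device frame -> world frame via getRotationMatrix(), then a
// rotation about the shared vertical axis by the user's heading, giving the "user" frame.
public final class UserFrame {

    private UserFrame() { }

    /**
     * @param gravity     latest TYPE_GRAVITY (or low-pass filtered accelerometer) values
     * @param geomagnetic latest TYPE_MAGNETIC_FIELD values
     * @param deviceVec   a vector in device coordinates, e.g. TYPE_LINEAR_ACCELERATION values
     * @param headingRad  angle between the user's facing direction and magnetic north,
     *                    obtained during a calibration step (assumption, not from the answer)
     * @return the same vector in user coordinates, or null if the rotation matrix failed
     */
    public static float[] toUserFrame(float[] gravity, float[] geomagnetic,
                                      float[] deviceVec, float headingRad) {
        float[] r = new float[9];
        if (!SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) return null;

        // Device -> world (x = east, y = north, z = up); r is row-major.
        float wx = r[0] * deviceVec[0] + r[1] * deviceVec[1] + r[2] * deviceVec[2];
        float wy = r[3] * deviceVec[0] + r[4] * deviceVec[1] + r[5] * deviceVec[2];
        float wz = r[6] * deviceVec[0] + r[7] * deviceVec[1] + r[8] * deviceVec[2];

        // World -> user: rotate about the shared vertical (z) axis by the heading offset.
        float cos = (float) Math.cos(headingRad);
        float sin = (float) Math.sin(headingRad);
        return new float[] {
                cos * wx - sin * wy, // user "right"
                sin * wx + cos * wy, // user "forward"
                wz                   // up is unchanged
        };
    }
}
```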

Android Sensors

I have a very basic question about Sensors:
Do magnetic sensors return readings w.r.t. the phone's initial orientation or w.r.t. world coordinates?
What about accelerometers? Do they return values w.r.t. their previous readings, or is each value an independent acceleration relative to the world coordinate system?
I know that gyros return readings relative to the phone's initial orientation. So, how do I convert the yaw, pitch, and roll readings from a gyro into the azimuth, pitch, and roll readings from the magnetic sensor of a smartphone (I'm using an HTC Hero)?
Thanks!
As mentioned, the gyroscope measures angular velocity. The third value returned (values[2]) is the angular velocity around the z axis. You can use this value together with the initial heading from the magnetometer to calculate the current heading: Theta(i+1) = Theta(i) + W_gyro * deltaT (see the sketch below).
You can get the initial heading from the 'Orientation' measurement (values[0]). That measurement depends only on the magnetometer (you can put a magnet or a second phone close to the smartphone and watch the output go crazy).
The second and third values of the 'Orientation' measurement depend on the readings of the accelerometer. Since the accelerometer measures gravity, it is possible to calculate the pitch and roll angles from the accelerometer readings along the Y and X axes.
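
A minimal sketch of that update rule, assuming a listener registered for both the (deprecated) orientation sensor and the gyroscope. The class name is mine, and the sign of the gyro term assumes the phone is lying flat, face up.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Hypothetical listener: initial heading from the (deprecated) orientation sensor,
// then Theta(i+1) = Theta(i) + W_gyro * deltaT using the gyroscope's z angular velocity.
public class HeadingTracker implements SensorEventListener {

    private float headingDeg = Float.NaN; // current heading estimate, degrees
    private long lastGyroTimestampNs = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ORIENTATION: // values[0] is the magnetometer-based azimuth
                if (Float.isNaN(headingDeg)) {
                    headingDeg = event.values[0]; // take it once as the initial heading
                }
                break;
            case Sensor.TYPE_GYROSCOPE:
                if (!Float.isNaN(headingDeg) && lastGyroTimestampNs != 0) {
                    float dt = (event.timestamp - lastGyroTimestampNs) * 1e-9f; // ns -> s
                    // values[2] is the angular velocity around z in rad/s; azimuth grows
                    // clockwise while positive z rotation is counter-clockwise, hence the minus.
                    headingDeg -= (float) Math.toDegrees(event.values[2]) * dt;
                    headingDeg = ((headingDeg % 360f) + 360f) % 360f; // wrap to [0, 360)
                }
                lastGyroTimestampNs = event.timestamp;
                break;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    public float currentHeadingDegrees() { return headingDeg; }
}
```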
Hope this helps
Ariel
Android sensors (up to Froyo) provide the application with "raw" data; there is a bare minimum of "cooking" (i.e. processing) involved. The accelerometer and compass devices provide absolute acceleration and magnetic data respectively. The gyroscope provides relative angular velocity.
Gyroscopes do NOT provide data relative to any specific state/position. What you need to understand is that gyroscope data is angular velocity, i.e. simply how fast the phone is rotating (on Android, in radians per second). So while you hold the phone still, the gyro reads (0,0,0); as you rotate it, you get how fast it is rotating; and once you hold it still again, the gyro reading goes back to (0,0,0).
Theoretically the gyro can be used to calibrate the compass, but doing so would require a lot of experimentation on your part. The ideal place to fiddle around would be the sensor HAL.
NOTE: You would need to turn on all the sensor hardware even if only compass data is required, because you would be cross-referencing the gyro/accelerometer data for that. This means larger power consumption and very poor battery life; with all the sensors turned on continuously, the battery of a standard Android phone can be drained in 4-5 hours.
You can read more about Android sensors here.
