Android accelerometer mapping movement to custom coordinate system

If I have a custom coordinate system (X - left/right, Y - forward/backward, Z - up/down) that is represented on my PC screen inside my Unreal project, how would I map the accelerometer values so that when I move my phone toward the PC screen (regardless of the phone's orientation), my Y value goes up, and likewise for the other axes?
I got something similar working with rotation by taking a "reference" rotation quaternion, inverting it and multiplying it by the current rotation quaternion, but I'm stuck on how to transform movement.
An example of my problem: if I move my phone up with the screen pointing at the sky, my Z value increases, which is what I want. But when I point my phone's screen at my PC screen and move the phone forward, the Z value again goes up, when in this case I would want my Y value to increase.
There is a similar question, Acceleration from device's coordinate system into absolute coordinate system, but that doesn't really solve my problem, since I don't want Y (and so on) to depend on the direction of north.

Clarification of question intent
It sounds like what you want is the acceleration of your device with respect to your laptop. As you correctly mentioned, the similar question Acceleration from device's coordinate system into absolute coordinate system maps the local accelerometer data of a device with respect to a global frame of reference (FoR) (the Cartesian "flat" Earth FoR to be specific, as opposed to the ultra-realistic spherical Earth FoR).
What you know
From your device, you know the local Phone FoR, and from the link above, you can also find the behavior of your device with respect to a flat Earth FoR via a rotation matrix, which I'll call R_EP, for Rotation in Earth FoR from Phone FoR. In order to represent the acceleration of your device with respect to your laptop, you will need to know how your laptop is oriented and positioned with respect to either your phone's FoR (A), the flat Earth FoR (B), or some other FoR that is known to both your laptop and your phone (which I'll ignore, because the method is identical to B).
What you'll need
In the first case, A, this will allow you to construct a rotation matrix which I'll call R_LP for Rotation in Laptop FoR from Phone FoR - and that would be super convenient because that's your answer. But alas, life isn't fun without a little bit of a challenge.
In the second case, B, this will allow you to construct a rotation matrix which I'll call R_LE for Rotation in Laptop FoR from Earth FoR. Because the Hamilton product is associative (but NOT commutative: Are quaternions generally multiplied in an order opposite to matrices?), you can find the acceleration of your phone with respect to your laptop by daisy-chaining the rotations, like so:
a_P]L = R_LE * R_EP * a_P]P
Where the ] means "in the frame of", and a_P is acceleration of the Phone. So a_P]L is the acceleration of the Phone in the Laptop FoR, and a_P]P is the acceleration of the Phone in the Phone's FoR.
NOTE When "daisy-chaining" rotation matrices, it's important that they follow a specific order. Always make sure that the rotation matrices are multiplied in the correct order, see Sections 2.6 and 3.1.4 in [1] for more information.
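As a minimal Kotlin sketch of that chain (a sketch only: rLE, rEP and accelPhone are assumed to already exist as row-major 3x3 rotation matrices and a phone-frame acceleration vector, and the helper names are mine, not Android APIs):
fun multiply3x3(a: FloatArray, b: FloatArray): FloatArray {
    val out = FloatArray(9)
    for (r in 0..2) for (c in 0..2) {
        out[3 * r + c] = a[3 * r] * b[c] + a[3 * r + 1] * b[3 + c] + a[3 * r + 2] * b[6 + c]
    }
    return out
}

fun multiplyVec3(m: FloatArray, v: FloatArray) = floatArrayOf(
    m[0] * v[0] + m[1] * v[1] + m[2] * v[2],
    m[3] * v[0] + m[4] * v[1] + m[5] * v[2],
    m[6] * v[0] + m[7] * v[1] + m[8] * v[2]
)

// a_P]L = R_LE * R_EP * a_P]P -- the rightmost factor is applied first.
// rLE: Rotation in Laptop FoR from Earth FoR (from your calibration)
// rEP: Rotation in Earth FoR from Phone FoR (e.g. from getRotationMatrixFromVector)
val rLP = multiply3x3(rLE, rEP)
val accelLaptop = multiplyVec3(rLP, accelPhone)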
Hint
To define your laptop's FoR (orientation and position) with respect to the global "flat" Earth FoR, you can place your phone on your laptop and set the current orientation and position as your laptop's FoR. This will let you construct R_LE.
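A sketch of that calibration step, under the same assumptions as above: while the phone rests on the laptop, the Laptop FoR coincides with the Phone FoR, so the R_EP captured at that instant is also R_EL, and R_LE is simply its transpose (the inverse of a rotation matrix is its transpose):
// Capture R_EP once while the phone is lying on the laptop, then keep its
// transpose as R_LE. rotationVectorValues is assumed to hold the latest
// ROTATION_VECTOR sensor reading.
fun transpose3x3(m: FloatArray) = floatArrayOf(
    m[0], m[3], m[6],
    m[1], m[4], m[7],
    m[2], m[5], m[8]
)

val rEPAtCalibration = FloatArray(9)
SensorManager.getRotationMatrixFromVector(rEPAtCalibration, rotationVectorValues)
val rLE = transpose3x3(rEPAtCalibration)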
Misconceptions
A rotation quaternion, q, is NEITHER the orientation NOR attitude of one frame of reference relative to another. Instead, it represents a "midpoint" vector normal to the rotation plane about which vectors from one frame of reference are rotated to the other. This is why defining quaternions to rotate from a GLOBAL frame to a local frame (or vice-versa) is incredibly important. The ENU to NED rotation is a perfect example, where the rotation quaternion is [0; sqrt(2)/2; sqrt(2)/2; 0], a "midpoint" between the two abscissa (X) axes (in both the global and local frames of reference). If you do the "right hand rule" with your three fingers pointing along the ENU orientation, and rapidly switch back and forth from the NED orientation, you'll see that the rotation from both FoR's is simply a rotation about [1; 1; 0] in the Global FoR.
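A quick way to sanity-check that ENU to NED example: a 180-degree rotation about a unit axis n sends a vector v to 2(v.n)n - v. With n = [1; 1; 0]/sqrt(2) (the "midpoint" axis above):
[1; 0; 0] -> [0; 1; 0]   (East axis lands on the North slot)
[0; 1; 0] -> [1; 0; 0]   (North axis lands on the East slot)
[0; 0; 1] -> [0; 0; -1]  (Up flips to Down)
which is exactly the ENU to NED axis swap.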
References
I cannot recommend the following open-source reference highly enough:
[1] "Quaternion kinematics for the error-state Kalman filter" by Joan SolĂ . https://hal.archives-ouvertes.fr/hal-01122406v5
For a "playground" to experiment with, and gain a "hands-on" understanding of quaternions:
[2] Visualizing quaternions, An explorable video series. Lessons by Grant Sanderson. Technology by Ben Eater https://eater.net/quaternions

Related

Implementing an accurate compass with Android [duplicate]

I'm using the Android gravity and magnetic field sensors to calculate orientation via SensorManager.getRotationMatrix and SensorManager.getOrientation. This gives me the azimuth, pitch and roll numbers. The results look sensible when the device is lying flat on a table.
However, I've disabled switching between portrait and landscape in the manifest, so that getWindowManager().getDefaultDisplay().getRotation() is always zero. When I rotate the device by 90 degrees so that it's standing vertically, I run into trouble. Sometimes the numbers seem quite wrong, and I've realised that this relates to gimbal lock. However, other apps don't seem to have this problem. For example, I've compared my app against two free sensor test apps (Sensor Tester (Dicotomica) and Sensor Monitoring (R's Software)). My app agrees with these apps when the device is flat, but as I rotate the device into the vertical position there can be significant differences. The two apps seem to agree with each other, so how do they get around this problem?
When the device is not flat, you have to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR); before calling getOrientation.
The azimuth returned by getOrientation is obtained by orthogonally projecting the device's unit Y axis onto the world East-North plane and then calculating the angle between the resulting projection vector and the North axis.
Now, we normally think of direction as the direction the back camera is pointing, that is, the direction of -Z, where Z is the device axis pointing out of the screen. When the device is flat, we do not think about direction and accept whatever is given. But when it is not flat, we expect the direction to be that of -Z. However, getOrientation calculates the direction of the Y axis, so we need to swap the Y and Z axes before calling getOrientation. That is exactly what remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) does: it keeps the X axis intact and maps Z to Y.
So how do you know when to remap and when not to? You can check:
float inclination = (float) Math.acos(rotationMatrix[8]);
if (inclination < TWENTY_FIVE_DEGREE_IN_RADIAN
        || inclination > ONE_FIFTY_FIVE_DEGREE_IN_RADIAN) {
    // device is (close to) flat: just call getOrientation
} else {
    // device is tilted: call remapCoordinateSystem first, then getOrientation
}
The inclination above is the angle between the device screen and the world East-North plane. It shows how much the device is tilting.
I think the best way of defining your orientation angles when the device isn't flat is to use a more appropriate angular co-ordinate system than the standard Euler angles that you get from SensorManager.getOrientation(...). I suggest the one that I describe here on math.stackexchange.com. I've also put some code that implements it in an answer here. Apart from a good definition of azimuth, it also has a definition of the pitch angle, which is exactly the angle given by Math.acos(rotationMatrix[8]) that is mentioned in another answer here.
You can get full details from the two links that I've given in the first paragraph. However, in summary, your rotation matrix R from SensorManager.getRotationMatrix(...) is

R = | Ex  Ey  Ez |
    | Nx  Ny  Nz |
    | Gx  Gy  Gz |

where (Ex, Ey, Ez), (Nx, Ny, Nz) and (Gx, Gy, Gz) are the vectors pointing due East, due North, and in the direction of Gravity, each expressed in device coordinates. Then the azimuth angle that you want (the direction of -Z projected onto the East-North plane, measured from North, per the discussion above) is given by

azimuth = atan2(-Ez, -Nz)
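In code, with the row-major array that SensorManager hands back (rotationMatrix[2] = Ez, rotationMatrix[5] = Nz, rotationMatrix[8] = Gz), this becomes the following Kotlin sketch, derived from the definitions above rather than copied from the linked answer:
import kotlin.math.acos
import kotlin.math.atan2

// Pitch: the angle between the device's Z axis and the vertical.
val pitch = acos(rotationMatrix[8])
// Azimuth of the -Z (back camera) direction, measured from North.
val azimuth = atan2(-rotationMatrix[2], -rotationMatrix[5])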
I'll add an example that worked for me, using Hoan's answer, and also highlight some new resources that are available now and may help when looking at apps that need to use the gyro sensors.
Firstly, there is a Google codelab available with a sample app - I found the completed sample app very useful for understanding a device's behaviour.
The sample app displays the device's azimuth, pitch and roll values on screen (screenshot omitted).
The codelab and the final app version are available at these links (at the time of writing):
https://developer.android.com/codelabs/advanced-android-training-sensor-orientation#0
https://github.com/google-developer-training/android-advanced/tree/master/TiltSpot
In particular this allows you to experiment as follows (it's easier to do than describe...):
Put your device down flat on a table and experiment with rotating it on the table while flat, then lifting the top and bottom (i.e. lift the top of the display, where your front camera typically is, from the table while leaving the bottom, where your home button typically is if you have one, on the table). Similarly, lift the left and right edges of the device. Observe the azimuth, pitch and roll values in the app.
Put your device against a wardrobe door in portrait mode and again experiment with moving it around. Open the door to see the effect also. Rotate to landscape to see the difference.
The key thing I think you may see from the above is that everything is very simple and works well when flat and everything is complex and interrelated when not flat.
Looking specifically at a use case I had to display the pitch and roll when vertical in portrait mode, the following worked for me:
val sensorEvent = event ?: return
when (sensorEvent.sensor.type) {
    Sensor.TYPE_ROTATION_VECTOR -> {
        // Build the rotation matrix from the rotation vector
        val rotMatrix = FloatArray(9)
        val rotationMatrixAdjusted = FloatArray(9)
        val orientation = FloatArray(3)
        SensorManager.getRotationMatrixFromVector(rotMatrix, sensorEvent.values)
        // Remap for a device held vertically in portrait mode
        SensorManager.remapCoordinateSystem(rotMatrix,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
            rotationMatrixAdjusted)
        SensorManager.getOrientation(rotationMatrixAdjusted, orientation)
        val rollAngleRadians = orientation[1]
        val pitchAngleRadians = orientation[2]
        // Report results back to the listener
        thisListener.onRollPitchEvent(rollAngleRadians, pitchAngleRadians)
    }
}

Unity device movement

I'm trying to move a game object when I raise/lower (shake) my phone, but I don't know how to get the device's movement. I already know about Input.acceleration, but that just gives me the device's rotation, and I want its actual movement.
Is this possible and how would I go about doing it?
The accelerometer reads the sum of changes in movement (acceleration) plus the constant force of gravity. You can use it for directional movement in two ways:
Detect changes of the gravity angle - when the device is not moving (or is moving at a constant speed) and is parallel to the ground, the accelerometer will read Earth's gravity, i.e. new Vector3(0, -9.81, 0). When the device is tilted (so not parallel to the ground), the vector's length will still be 9.81, but it will be rotated a bit, like new Vector3(3.03, -9.32, 0) (this example is a rotation around one axis by 18 degrees (pi/10)). Using this will yield this kind of controls: https://www.youtube.com/watch?v=3EU3ip4k0uE
Detect peaks of acceleration. When the device is still or moving at a constant speed, the length of the acceleration vector will be equal to 9.81; when it changes speed or starts moving, this number will change. You can detect these changes and interpret them as a one-time momentary movement (like pressing an arrow); a sketch follows below.
There are alternatives that don't use the accelerometer; for example, you can detect a printed marker with Vuforia: https://www.youtube.com/watch?v=8RnlBEU5tkI - and interpret the relative position to the marker as an action, in a similar fashion to how you'd detect an acceleration change in #1.
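The question is about Unity, but to keep one language on this page, here is the peak-detection idea sketched in Kotlin against Android's raw accelerometer; the 2.0f threshold and the onShake callback are my assumptions, not part of any API:
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs
import kotlin.math.sqrt

class ShakeDetector(private val onShake: () -> Unit) : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        // Length of the measured acceleration vector (gravity included)
        val magnitude = sqrt(x * x + y * y + z * z)
        // At rest (or at constant speed) this is ~9.81; a large deviation is a peak
        if (abs(magnitude - SensorManager.GRAVITY_EARTH) > 2.0f) {
            onShake()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}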

Vertical movement sensor

I am working on an android app that requires the detection of vertical motion. When moving the tablet upward, the Gyroscope, Accelerometer, and Linear Acceleration sensors give a corresponding value indicating upward or downward motion.
The problem I have is that these sensors will also read an upward/downward motion when you tilt the tablet towards the user or away from the user. For example, the x value in the gyroscope represents the vertical plane. But when you tilt the device forwards, the x value will change.
When I make this motion, the same sensor that reads vertical motion reads a value for this.
The same goes for the rest of the sensors. I have tried to use orientation coupled with the gyro to make the conditional statement, if the pitch is not changing, but the x variable is going up/down, then we have vertical motion. The problem with this is that if the user moves it up but tilted slightly, it will no longer work. I also tried making it so if there is a change in tilt, then there is no vertical motion. But it iterates so quickly that there may be a change in tilt for 1/100 of a second, but for the next there isn't.
Is there any way I can read only vertical changes and not changes in the devices pitch?
Here is what I want to detect: a purely vertical translation of the device (the original post illustrated this with an image).
edit:
"Please come up with a mathematically sound definition of what you consider 'moving upwards.'"
This was my initial question, how can I write a function to define when the tablet is moving upwards or downwards? I consider a vertical translation moving upwards. Now how do I detect this? I simply do not know where to begin, thank you.
Ok, even though this question is fairly old, I see a lot of confusion in the present answer and comments, so in case anyone finds this, I intend to clear a few things up.
The Gyroscope
First of all, the gyroscope does not measure vertical motion as per your definition (a translatory motion). It measures rotation around each of the device's axes (in Android's convention: x to the right of the screen, y to the top of the screen, z out of the screen; the original answer included a figure). Thus, tilting your device forwards and backwards indeed rotates it around the x axis, and therefore you will see non-zero values in the x value of your gyroscope sensor.
the x value in the gyroscope represents the vertical plane.
I'm not sure what is meant by "the vertical plane", however the x value certainly does not represent the plane itself nor the orientation of the device within the plane.
The x value of the gyroscope sensor represents the current angular velocity of the device around the x axis (i.e. the rate of change of rotation).
But when you tilt the device forwards, the x value will change. When I make this motion, the same sensor that reads vertical motion reads a value for this.
Not quite sure what you're referring to here. "The same sensor that reads vertical motion" I assume is the gyroscope, but as previously said, it does not read vertical motion. It does exactly what it says on the tin.
The device coordinate system
This is more in response to user Ali's answer than the original question, but it remains relevant in either case.
The individual outputs of the linear acceleration sensor (or any other sensor for that matter) are expressed in the coordinate system of the device, as described above. This means that if you rotate the device slightly, the outputs will no longer be parallel to any world axis they coincided with before. As such, you will either have to enforce that the device is in a particular orientation for your application, or take the new orientation into account.
The ROTATION_VECTOR sensor, combined with quaternion math or the getRotationMatrixFromVector() method, is one way to translate your measurements from device coordinates to world coordinates. There are other ways to achieve the same goal, but once achieved, the way you hold your device won't matter for measuring vertical motion.
In either case, the axis you're looking for is the y axis, not the z axis.
(If by any chance you meant "along device y axis" as "vertical", then just ignore all the orientation stuff and just use the linear acceleration sensor)
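For concreteness, a minimal Kotlin sketch of that device-to-world translation, assuming rotationVectorValues and linearAccel hold the latest ROTATION_VECTOR and LINEAR_ACCELERATION readings:
// Rotate a device-frame linear acceleration into Android's world frame
// (x East, y North, z toward the sky): v_world = R * v_device.
val r = FloatArray(9)
SensorManager.getRotationMatrixFromVector(r, rotationVectorValues)
val worldAccel = FloatArray(3)
for (i in 0..2) {
    worldAccel[i] = r[3 * i] * linearAccel[0] +
            r[3 * i + 1] * linearAccel[1] +
            r[3 * i + 2] * linearAccel[2]
}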
Noise
You mentioned some problems regarding noise and update rates in the question, so I'll just mention it here. The simplest and one of the more common ways to get nice, consistent data from something that varies very often is to use a low-pass filter. What type of filter is best depends on the application, but I find that an exponential moving average filter is viable in most cases; see the sketch below.
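A minimal sketch of such a filter in Kotlin; the alpha of 0.2f is an arbitrary starting value to tune, not something from this answer:
// Exponential moving average: out = alpha * sample + (1 - alpha) * previous.
// Smaller alpha = smoother output, but more lag.
fun ema(previous: FloatArray, sample: FloatArray, alpha: Float = 0.2f): FloatArray =
    FloatArray(sample.size) { i -> alpha * sample[i] + (1 - alpha) * previous[i] }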
Finishing thoughts
Note that if you take proper care of the orientation, your transformed linear acceleration output will be a good approximation of vertical motion (well, change in motion) without filtering any noise.
Also, if you want to measure vertical "motion", as in velocity, you need to integrate the accelerometer output. For various reasons, this doesn't really turn out too well in most cases, although it is less severe in the case of velocity rather than trying to measure position.
OK, I suspect it is only a partial answer.
If you want to detect vertical movement, you only need linear acceleration, the device orientation doesn't matter. See
iOS - How to tell if device is raised/dropped (CoreMotion)
or
how to calculate phone's movement in the vertical direction from rest?
For some reason you are concerned with the device orientation as well, and I have no idea why. I suspect that you want to detect something else. So please tell us more and then I will improve my answer.
UPDATE
I read the post on coremotion, and you mentioned that higher z lower x and y means vertical motion, can you elaborate?
I will write in pseudo code. You measured the (x, y, z) linear acceleration vector. Compute
rel_z = z/sqrt(x^2+y^2+z^2+1.0e-6)
If rel_z > 0.9 then the acceleration towards the z direction dominates (vertical motion). Note that the constant 0.9 is arbitrary and may require tweaking (should be a positive number less than 1). The 1.0e-6 is there to avoid accidental division by zero.
You may have to add another constraint that z is sufficiently large. I don't know your device, whether it measures gravity as 1 or 9.81. I assume it measures it as 1.
So all in all:
if (rel_z > 0.9 && abs(z) > 0.1) {
    // we have vertical movement
}
Again, the constant 0.1 is arbitrary and may require tweaking. It should be positive.
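Putting the pseudocode together as a Kotlin sketch (the 0.9 and 0.1 thresholds are the answer's arbitrary constants and assume gravity is reported as 1; scale the 0.1 by ~9.81 if your sensor reports m/s^2):
import kotlin.math.abs
import kotlin.math.sqrt

fun isVerticalMovement(x: Float, y: Float, z: Float): Boolean {
    // Fraction of the acceleration that points along z (1e-6 avoids division by zero)
    val relZ = z / sqrt(x * x + y * y + z * z + 1e-6f)
    return relZ > 0.9f && abs(z) > 0.1f
}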
UPDATE 2
I do not want this because rotating it towards me is not moving it upwards
It is moving upwards: The center of mass is moving upwards. My code has the correct behavior.
Please come up with a mathematically sound definition of what you consider "moving upwards."

Changing sensor coordinate system in android

Android provides sensor data in device coordinate system no matter how it is oriented. Is there any way to have sensor data in 'gravity' coordinate system? I mean no matter how the device is oriented I want accelerometer data and orientation in coordinate system where y-axis points toward the sky, x-axis toward the east and z-axis towards south pole.
I took a look at remapCoordinateSystem, but it seems to be limited to only swapping axes. I guess for orientation I will have to do some low-level rotation matrix transformation (or is there any better solution?). But how about acceleration data? Is there any way to have the data in relation to a coordinate system that is fixed (a sort of world coordinate system)?
The reason I need this is that I'm trying to do some simple motion gestures when the phone is in the pocket, and it would be easier for me to have all data in a coordinate system related to the user rather than the device coordinate system (which will have a slightly different orientation in different users' pockets).
Well you basically get the North orientation when starting - for this you use the accelerometer and the magnetic field sensor to compute orientation (or the deprecated Orientation sensor).
Once you have it you can compute a rotation matrix (Direction Cosine Matrix) from those azimuth, pitch and roll angles. Multiplying your acceleration vector by that matrix will transform your device-frame movements into Earth-frame ones.
As your device will change its orientation as time goes by, you'll need to update the matrix. To do so, retrieve the gyroscope's data and update your Direction Cosine Matrix for each new value. You could also recompute the true orientation just like the first time, but it's less accurate.
My solution involves DCM, but you could also use quaternions, it's just a matter of choice. Feel free to ask more if needed. I hope this is what you wanted to know !
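A minimal Kotlin sketch of the first step (the gyroscope update is omitted); gravity and geomagnetic are assumed to hold the latest readings from the accelerometer and magnetic field sensors, and deviceAccel the acceleration to transform:
// Build the device-to-world rotation matrix, then express the device-frame
// acceleration in the Earth frame (x East, y North, z toward the sky).
val rotation = FloatArray(9)
val inclination = FloatArray(9)
if (SensorManager.getRotationMatrix(rotation, inclination, gravity, geomagnetic)) {
    val earthAccel = FloatArray(3)
    for (i in 0..2) {
        earthAccel[i] = rotation[3 * i] * deviceAccel[0] +
                rotation[3 * i + 1] * deviceAccel[1] +
                rotation[3 * i + 2] * deviceAccel[2]
    }
}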

Compute relative orientation given azimuth, pitch, and roll in android?

When I listen to the orientation event in an Android app, I get a SensorEvent, which contains 3 floats - azimuth, pitch, and roll in relation to the real-world's axes.
Now say I am building an app like Labyrinth, but I don't want to force the user to be over the phone and hold the phone such that the xy plane is parallel to the ground. Instead I want to be able to allow the user to hold the phone as they wish, laying down or, perhaps, sitting down and holding the phone at an angle. In other words, I need to calibrate the phone in accordance with the user's preference.
How can I do that?
Also note that I believe that my answer has to do with getRotationMatrix and getOrientation, but I am not sure how!
Please help! I've been stuck at this for hours.
For a Labyrinth style app, you probably care more for the acceleration (gravity) vector than the axes orientation. This vector, in Phone coordinate system, is given by the combination of the three accelerometers measurements, rather than the rotation angles. Specifically, only the x and y readings should affect the ball's motion.
If you do actually need the orientation, then the 3 angular readings represent the 3 Euler angles. However, I suspect you probably don't really need the angles themselves, but rather the rotation matrix R, which is returned by the getRotationMatrix() API. Once you have this matrix, it is basically the calibration that you are looking for. When you want to transform a vector in world coordinates to your device coordinates, you should multiply it by the inverse of this matrix (where, in this special case, inv(R) = transpose(R)).
So, following the example I found in the documentation, if you want to transform the world gravity vector g ([0 0 g]) to the device coordinates, multiply it by inv(R):
g = inv(R) * g
(note that this should give you the same result as reading the accelerometers)
Possible APIs to use here: the invertM() and multiplyMV() methods of the android.opengl.Matrix class; see the sketch below.
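A short Kotlin sketch with those APIs, assuming rotation is a 16-element array filled by SensorManager.getRotationMatrix (android.opengl.Matrix works on 4x4 matrices and 4-component vectors, hence the padding w component):
import android.opengl.Matrix

val inverse = FloatArray(16)
val gravityWorld = floatArrayOf(0f, 0f, 9.81f, 0f)  // world gravity [0 0 g], w = 0
val gravityDevice = FloatArray(4)
if (Matrix.invertM(inverse, 0, rotation, 0)) {
    // g_device = inv(R) * g_world; should match the raw accelerometer readings
    Matrix.multiplyMV(gravityDevice, 0, inverse, 0, gravityWorld, 0)
}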
I don't know of any Android-specific APIs for this, but all you want to do is decrease the azimuth by a certain amount, right? So you move the "origin" from (0,0,0) to whatever they want. In pseudocode:
myGetRotationMatrix:
return getRotationMatrix() - origin
