From what I understand, Android uses world coordinates for the rotation matrix that I would use to get the orientation of the device. However, I'm looking to get the device's orientation relative to the device itself, similar to how attitude is represented in iOS.
In other words, the roll axis would be a line passing through the top and bottom of the device, the pitch axis would pass through the sides of the device, and the yaw axis would pass vertically through the device.
I would like to know if Android provides any methods that would let me get these orientation values, or if there is a way I could do this manually.
Any help would be greatly appreciated.
I eventually figured out a solution that suited my needs: I simply used the getAngleChange() method from the SensorManager class, with a calibration matrix as the previous matrix.
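For reference, the math behind this approach can be sketched in plain Java (no Android APIs): compute the change matrix R_current * transpose(R_calibration) and read the Euler angles from it, using the same angle conventions as SensorManager.getOrientation(). This is an illustration of the idea, not the SDK's exact implementation, and the class name is made up:

```java
public class AngleChangeSketch {
    // Multiply two row-major 3x3 matrices: out = a * b
    static float[] mul(float[] a, float[] b) {
        float[] out = new float[9];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                out[3 * row + col] = a[3 * row] * b[col]
                                   + a[3 * row + 1] * b[3 + col]
                                   + a[3 * row + 2] * b[6 + col];
        return out;
    }

    // For a rotation matrix, the transpose is the inverse
    static float[] transpose(float[] m) {
        return new float[] { m[0], m[3], m[6], m[1], m[4], m[7], m[2], m[5], m[8] };
    }

    // Azimuth/pitch/roll using getOrientation's conventions
    static float[] angles(float[] r) {
        return new float[] {
            (float) Math.atan2(r[1], r[4]),   // azimuth
            (float) Math.asin(-r[7]),         // pitch
            (float) Math.atan2(-r[6], r[8])   // roll
        };
    }

    // Angle change of the current matrix relative to a stored calibration matrix
    public static float[] angleChange(float[] current, float[] calibration) {
        return angles(mul(current, transpose(calibration)));
    }
}
```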
Related
I want to use my phone as a wheel in an Android game. To do so I have to save the current orientation of my phone and get the angles relative to this saved orientation, in device coordinates.
For example, if I rotate the device around the z axis (see image above), I want to get that angle with respect to the orientation I saved before.
From libGDX I only get the azimuth, pitch, and roll angles relative to the world coordinate system (if I understood this correctly):
Any idea how I can calculate those relative angles?
How to calculate the relative orientation:
1) You can convert them to quaternions using this.
2) Take the conjugate of the quaternion of the initial orientation using this.
3) Multiply the two using this.
4) You now have the relative orientation in the form of a quaternion. To use it, you may either convert it to axis-angle form using this, or transform a vector by this orientation.
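Steps 1-4 above can be sketched in plain Java, with quaternions stored as {w, x, y, z}; the class and method names are made up for the sketch:

```java
public class RelativeOrientation {
    // Conjugate of a unit quaternion is its inverse
    static float[] conjugate(float[] q) {
        return new float[] { q[0], -q[1], -q[2], -q[3] };
    }

    // Hamilton product: out = a * b
    static float[] multiply(float[] a, float[] b) {
        return new float[] {
            a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
            a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
            a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
            a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
        };
    }

    // Relative orientation: the rotation taking 'initial' to 'current'
    public static float[] relative(float[] current, float[] initial) {
        return multiply(current, conjugate(initial));
    }

    // Rotation angle in radians, from the w component (axis-angle form)
    public static float angle(float[] q) {
        return 2f * (float) Math.acos(Math.min(1f, Math.abs(q[0])));
    }
}
```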
Consider whether you really need relative orientation at all.
If you need a simple wheel, the accelerometer might be enough. More on this here: Compute relative orientation given azimuth, pitch, and roll in android?
Hope this helps. Good luck.
I would like to develop a personal app, and for this I need to detect my car's rotation.
In a previous thread I got an answer about which sensors are good for that, so that part is okay.
Now I would like to ask you to please summarize the essential mathematical relationships that are needed.
What i would like to see in my app:
- The car's rotation in degrees
- The car's actual speed (in general this app will be used in slow-speed situations, around 3-5 km/h)
I think the harder part of this is detecting the rotation in real time. It would be good if the app could work when I place the phone in a car holder in either landscape or portrait mode.
So please summarize which equations, formulas, and relationships are needed to calculate the car's rotation. And please tell me your recommendation for which motion/position sensor is best for this purpose (gravity, accelerometer, gyro, ...).
First I thought I would use Android 2.2 for better compatibility with my phones, but 2.3.3 is okay for me too. In that case I can use TYPE_ROTATION_VECTOR, which looks like a good option, but I don't really know whether it would be useful for me or not.
I don't want full source code; I would like to develop it myself. But I need to know where I can start, how deep the required math knowledge is, and which areas of math are needed. As for the sensor question: I'm a bit confused, since there are many sensors that could possibly work for me.
Thanks,
There is no deep math that you need. You should use TYPE_MAGNETIC_FIELD and TYPE_GRAVITY if the latter is available. Otherwise use TYPE_ACCELEROMETER, but then you need to filter the accelerometer values with a Kalman filter or a low-pass filter. You should use the direction of the back camera as the reference. This direction is the azimuth returned by calling getOrientation, but before calling getOrientation you need to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) to get the correct azimuth. Then, as long as the device is not lying flat, it does not matter what the device orientation is (landscape or portrait). Just make sure that the phone screen is facing the opposite direction of the car's direction of travel.
Now declare two class members, startDirection and endDirection. In the beginning, startDirection and endDirection have the same value. If the azimuth then changes by more than, say, 3 degrees (there is always a little fluctuation), set endDirection to this value, and keep updating it until, say, 20 consecutive returned azimuths have the same value (you will have to experiment with this number). This means the car has stopped turning; now calculate the difference between startDirection and endDirection, which gives you the degrees of rotation. Then set startDirection and endDirection to this new azimuth and wait for the next turn.
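One detail worth sketching: the difference between startDirection and endDirection should be wrapped, so a turn across the 0/360 boundary is not reported as a near-360-degree rotation. A minimal helper in plain Java, with hypothetical names:

```java
public class TurnTracker {
    // Signed difference endDirection - startDirection in degrees, wrapped to (-180, 180].
    // Positive means a turn one way, negative the other.
    public static float degreesTurned(float startDirection, float endDirection) {
        float d = (endDirection - startDirection) % 360f;
        if (d > 180f) d -= 360f;
        if (d <= -180f) d += 360f;
        return d;
    }
}
```

For example, going from an azimuth of 350 degrees to 10 degrees is a 20-degree turn, not a 340-degree one.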
I am developing an app on Android for which I need to calculate the movement of the device in the vertical direction.
For example, the device is at rest (point A); the user picks it up in his hand (point B). There is now a height change between point A and point B. How would I calculate that?
I have already gone through the articles about sensors and accelerometers, but I couldn't really find anything to help me with that. Anyone have any ideas?
If you integrate the acceleration twice you get position, but the error is horrible; it is useless in practice. Here is an explanation why (Google Tech Talk), at 23:20. I highly recommend this video.
Now, if you do not need anything accurate, that is a different story. The linear acceleration is available after sensor fusion, as described in the video. See Sensor.TYPE_LINEAR_ACCELERATION at SensorEvent. I would first try a high-pass filter to detect a sudden increase in the linear acceleration along the vertical axis.
I have no idea whether it is good for your application.
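The high-pass idea above can be sketched in plain Java (no Android APIs). The class name, smoothing factor, and threshold below are hypothetical; real values would need tuning against TYPE_LINEAR_ACCELERATION readings on an actual device:

```java
public class VerticalLiftDetector {
    private float lowPass = 0f;      // running low-frequency estimate of the signal
    private final float alpha;       // smoothing factor in (0, 1); closer to 1 = slower filter
    private final float threshold;   // m/s^2 above which we flag a sudden upward movement

    public VerticalLiftDetector(float alpha, float threshold) {
        this.alpha = alpha;
        this.threshold = threshold;
    }

    // Feed the vertical component of each linear-acceleration event.
    // Returns true when the high-pass residue exceeds the threshold.
    public boolean onVerticalAcceleration(float az) {
        lowPass = alpha * lowPass + (1f - alpha) * az;
        float highPass = az - lowPass; // removes slow drift, keeps sudden changes
        return highPass > threshold;
    }
}
```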
You can actually establish (only) the vertical position without measuring acceleration over time. This is accomplished by measuring the angle between the direction to the center of the earth, and the direction to the magnetic north pole.
This angle only changes (significantly) when the altitude (height) of the phone changes. What you do is use the accelerometer and magnetometer to get two float[3] arrays, treat these as vectors, make them unit vectors; the angle between any two unit vectors is arccos(A · M).
Note that's the dot product, i.e. Math.acos(A[0]*B[0] + A[1]*B[1] + A[2]*B[2]). Any change in this angle corresponds to a change in height. Also note that this will have to be calibrated to real units, and the ratio of change in angle to change in height will differ at various latitudes. But this is a method of getting an absolute value for height, though of course the angle also becomes skewed when the phone is undergoing acceleration, or when there are nearby magnets :)
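The formula above as a self-contained Java helper, with normalization and clamping added so that acos never receives a value outside [-1, 1]; the class name is hypothetical:

```java
public class TiltAngle {
    // Angle in radians between two 3-vectors
    // (e.g. raw accelerometer and magnetometer readings).
    public static double angleBetween(float[] a, float[] b) {
        double na = Math.sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
        double nb = Math.sqrt(b[0]*b[0] + b[1]*b[1] + b[2]*b[2]);
        double dot = (a[0]*b[0] + a[1]*b[1] + a[2]*b[2]) / (na * nb);
        // Clamp against floating-point drift before acos
        return Math.acos(Math.max(-1.0, Math.min(1.0, dot)));
    }
}
```

Normalizing inside the helper means the raw sensor arrays can be passed in directly, without making unit vectors first.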
You can correlate it with the magnetic field sensor reading, in microtesla.
You can use distance = the double integral of acceleration over time (approximated in practice by a double summation of the sampled values) = the integral of (speed + a constant).
When I listen to the orientation event in an Android app, I get a SensorEvent, which contains 3 floats: azimuth, pitch, and roll, in relation to the real-world axes.
Now say I am building an app like Labyrinth, but I don't want to force the user to be over the phone, holding it so that the xy plane is parallel to the ground. Instead I want to allow the user to hold the phone as they wish: laying down, or perhaps sitting and holding the phone at an angle. In other words, I need to calibrate the phone in accordance with the user's preference.
How can I do that?
Also note that I believe that my answer has to do with getRotationMatrix and getOrientation, but I am not sure how!
Please help! I've been stuck at this for hours.
For a Labyrinth-style app, you probably care more about the acceleration (gravity) vector than the axes' orientation. This vector, in the phone's coordinate system, is given by the three accelerometer measurements rather than the rotation angles. Specifically, only the x and y readings should affect the ball's motion.
If you do actually need the orientation, then the 3 angular readings represent the 3 Euler angles. However, I suspect you probably don't really need the angles themselves, but rather the rotation matrix R, which is returned by the getRotationMatrix() API. Once you have this matrix, it is essentially the calibration you are looking for. When you want to transform a vector in world coordinates to device coordinates, you should multiply it by the inverse of this matrix (where, in this special case, inv(R) = transpose(R)).
So, following the example I found in the documentation, if you want to transform the world gravity vector g ([0 0 g]) to the device coordinates, multiply it by inv(R):
g = inv(R) * g
(note that this should give you the same result as reading the accelerometers)
Possible APIs to use here: the invertM() and multiplyMV() methods of the android.opengl.Matrix class.
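As a plain-Java illustration of g = inv(R) * g (without the android.opengl.Matrix calls): since the inverse of a rotation matrix is its transpose, the transform reduces to nine multiplications on the row-major 3x3 matrix from getRotationMatrix(). The class and method names are made up for the sketch:

```java
public class GravityInDevice {
    // g_device = transpose(R) * g_world, where R is the row-major 3x3
    // rotation matrix. No general matrix inversion is needed, because
    // for a rotation matrix the transpose equals the inverse.
    public static float[] worldToDevice(float[] r, float[] g) {
        return new float[] {
            r[0]*g[0] + r[3]*g[1] + r[6]*g[2],
            r[1]*g[0] + r[4]*g[1] + r[7]*g[2],
            r[2]*g[0] + r[5]*g[1] + r[8]*g[2]
        };
    }
}
```

With the identity matrix this returns the world gravity vector unchanged, matching the remark that the result should agree with the accelerometer readings.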
I don't know of any Android-specific APIs, but all you want to do is decrease the azimuth by a certain amount, right? So you move the "origin" from (0,0,0) to wherever the user wants it. In pseudocode (noting that rotations compose by multiplication with the inverse, not by subtraction):
myGetRotationMatrix:
    return inverse(originMatrix) * getRotationMatrix()
I've been playing around with the Android accelerometer of late, using the Android SDK and the Adobe AIR for Android SDK on my Motorola Droid. What I've noticed is that the accelerometer works just fine, but I was wondering if it is possible to compensate in some fashion so that I don't have to use the device in a horizontal position. In other words, I want to use the accelerometer to control my visual display, but initialize it (or modify it in some way) so that I don't have to hold it perfectly flat (it's not much fun having to lean over the phone).
Does anyone know how I can hold the device comfortably in my hand, say at 45 degrees, and still utilize the accelerometer to provide forward/backward readings? Is this possible? Are any examples of this available?
You'll need some simple matrix multiplication math for that. "Calibrate" the rotation by taking the current matrix when you start the app and invert it, then multiply all subsequent matrices with it - that will give you the delta to the starting position.
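A minimal sketch of that calibration in plain Java, using the row-major 3x3 matrix layout of getRotationMatrix() and the transpose-as-inverse property of rotation matrices; the class name is hypothetical:

```java
public class TiltCalibration {
    private final float[] startInverse; // transpose of the matrix captured at app start

    public TiltCalibration(float[] startMatrix) {
        // For a rotation matrix, inverting is just transposing.
        startInverse = new float[] {
            startMatrix[0], startMatrix[3], startMatrix[6],
            startMatrix[1], startMatrix[4], startMatrix[7],
            startMatrix[2], startMatrix[5], startMatrix[8]
        };
    }

    // Delta from the starting pose: delta = current * inverse(start).
    // Holding the phone exactly as at startup yields the identity matrix.
    public float[] delta(float[] current) {
        float[] out = new float[9];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                out[3 * row + col] = current[3 * row] * startInverse[col]
                                   + current[3 * row + 1] * startInverse[3 + col]
                                   + current[3 * row + 2] * startInverse[6 + col];
        return out;
    }
}
```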
I had written an application long, long ago which dealt with relative rotations. I've forgotten what the code did, but from what I can see, it seems like:
1) I get the initial rotation matrix using SensorManager.getRotationMatrix(R, I, gravity.clone(), geomagnetic.clone()); (gravity and geomagnetic are the respective acceleration and geomagnetic arrays. I don't know why I used clones, but there must have been some reason.)
2) Now at any stage, get the current rotation matrix and call it R.
3) Calculate the inverse of the initial matrix and call it initialInverse.
4) Multiply R by initialInverse (or the other way round; you'll have to figure it out).
5) Get your final orientation using SensorManager.getOrientation(delta, values).
I'm sorry, but I've totally forgotten what the above code does. I think I remember reading the words "Euler transform" somewhere when I wrote this app, so that's what it might be doing. Unfortunately I cannot give you the complete code, since I'll probably release this app on the market. However, if you need more information, please let me know and I'll look into the code and get back to you.
I am working on a project of a similar nature, where the accelerometer function is not restricted by the position. My way of handling it is very simple: initialize the accelerometer with the current reading as the default. In other words, you have a button that you press once you have the phone in the proper position; upon pressing the button, the current readings of the accelerometer (measures of g) become your reference (zero values), and you act on changes above or below those readings... Hope this helps anyone... cheers
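The zero-reference approach described above can be sketched as follows; the class and method names are made up:

```java
public class AccelerometerZero {
    private float[] reference = new float[3]; // captured when the user presses the calibrate button

    // Store the current accelerometer reading as the neutral pose.
    public void calibrate(float[] reading) {
        reference = reading.clone();
    }

    // Reading relative to the stored reference; ~0 on each axis
    // means the phone is back at the neutral pose.
    public float[] relative(float[] reading) {
        return new float[] {
            reading[0] - reference[0],
            reading[1] - reference[1],
            reading[2] - reference[2]
        };
    }
}
```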