Using the Android Accelerometer without having the device flat - android

I've been playing around with the Android accelerometer lately, using the Android SDK and the Adobe AIR for Android SDK on my Motorola Droid. What I've noticed is that the accelerometer works just fine, but I was wondering if it is possible to compensate in some fashion so that I don't have to use the device in a horizontal position. In other words, I want to use the accelerometer to control my visual display, but initialize it (or modify it in some way) so that I don't have to hold it perfectly flat (not much fun having to lean over the phone).
Does anyone know how I can hold the device comfortably in my hand, say at 45 degrees, and still utilize the accelerometer to provide forward/backward readings? Is this possible? Any examples of this available?

You'll need some simple matrix multiplication math for that. "Calibrate" the rotation by taking the current matrix when you start the app and invert it, then multiply all subsequent matrices with it - that will give you the delta to the starting position.
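A minimal sketch of that idea, assuming float[16] (4x4) matrices so the android.opengl.Matrix helpers can be used (rotationMatrix here stands for the matrix filled by SensorManager.getRotationMatrix() for the latest readings):

// Sketch only: calibrate once (e.g. at app start), then compute deltas.
float[] startInverse = new float[16];
float[] delta = new float[16];

android.opengl.Matrix.invertM(startInverse, 0, rotationMatrix, 0);   // "calibrate"

// Later, for every new rotationMatrix:
android.opengl.Matrix.multiplyMM(delta, 0, startInverse, 0, rotationMatrix, 0);
// 'delta' now holds the rotation relative to the starting pose. Swap the two
// operands if the angles come out mirrored: SensorManager matrices are
// row-major while opengl.Matrix assumes column-major storage.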

I wrote an application a long time ago that dealt with relative rotations. I've forgotten exactly what the code does, but from what I can see, it goes something like this:
1) Get the initial rotation matrix using SensorManager.getRotationMatrix(R, I, gravity.clone(), geomagnetic.clone()); (gravity and geomagnetic are the respective accelerometer and geomagnetic readings. I don't know why I used clones, but there must be some reason.)
2) Now, at any stage, get the current rotation matrix and call it R.
3) Calculate the inverse of the initial matrix and call it initialInverse.
4) Multiply R by initialInverse (or the other way round; you'll have to figure that out) and call the result delta.
5) Get your final orientation using SensorManager.getOrientation(delta, values).
I'm sorry, but I've totally forgotten what the above code does. I think I remember reading the words Euler transform somewhere when I wrote this app, so that's what it might be doing. Unfortunately I cannot give you the complete code since I'll probably release this app on the market. However, if you need more information, please let me know - I'll look into the code and get back to you.
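For reference, a rough sketch of steps 1-5 (not the app's actual code) might look like this, assuming float[16] matrices so the android.opengl.Matrix helpers apply, with SensorManager from android.hardware and Matrix from android.opengl, and with gravity and geomagnetic holding the latest accelerometer and magnetic-field readings:

float[] initialR = new float[16];
float[] I = new float[16];
float[] initialInverse = new float[16];
float[] currentR = new float[16];
float[] delta = new float[16];
float[] values = new float[3];

// Step 1: initial rotation matrix, captured once.
SensorManager.getRotationMatrix(initialR, I, gravity.clone(), geomagnetic.clone());
// Step 3: its inverse (for a pure rotation this is just the transpose).
Matrix.invertM(initialInverse, 0, initialR, 0);

// Steps 2, 4 and 5, repeated for each new pair of readings:
SensorManager.getRotationMatrix(currentR, I, gravity, geomagnetic);
Matrix.multiplyMM(delta, 0, initialInverse, 0, currentR, 0);   // or swap the operands
SensorManager.getOrientation(delta, values);                   // azimuth/pitch/roll vs. the start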

I am working on a project of a similar nature where the accelerometer's function is not restricted by the device's position. My way of handling it is very simple: initialize the accelerometer with the current reading as the default. In other words, have a button that you press once the phone is in the desired position; upon pressing it, the current accelerometer readings (measures of g) become your reference (zero values), and you react when subsequent readings go above or below them... Hope this helps anyone... cheers
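A sketch of that button-based zeroing, with illustrative names (assuming the usual SensorEventListener plumbing around it):

// Latest raw accelerometer reading and the "zero" captured on the button press.
private final float[] latest = new float[3];
private final float[] baseline = new float[3];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        System.arraycopy(event.values, 0, latest, 0, 3);
    }
}

// Call this from the button's click handler.
public void calibrate() {
    System.arraycopy(latest, 0, baseline, 0, 3);
}

// Drive the display with these offsets instead of the raw values.
public float[] relativeReading() {
    return new float[] {
            latest[0] - baseline[0],
            latest[1] - baseline[1],
            latest[2] - baseline[2] };
}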

Related

Finding a device's relative orientation in Android

From what I understand, Android uses world coordinates for the rotation matrix that I would use to get the orientation of the device. However, I'm looking to get the device's orientation relative to the device itself, similar to how attitude is represented in iOS.
In other words, the axis used for roll would be a line passing through the top and bottom of the device, the pitch axis would pass through the sides of the device, and the yaw axis would pass vertically through the device.
I would like to know if Android provides any methods that would allow me to get these orientation values, or if there is a way I'd be able to do this manually.
Any help would be greatly appreciated.
I eventually figured out a solution that suited my needs. I simply used the getAngleChange() method from the SensorManager class and used a calibration matrix as the previous matrix.
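For reference, the core of that approach is roughly the following, where calibrationR is a rotation matrix captured once in the pose treated as "zero" and currentR is the latest matrix from SensorManager.getRotationMatrix():

float[] angleChange = new float[3];
SensorManager.getAngleChange(angleChange, currentR, calibrationR);
// angleChange[0] = rotation about z (yaw), [1] = about x (pitch), [2] = about y (roll),
// all in radians and relative to the calibration pose.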

Get device motion direction using accelerometer

I'm trying to detect the direction of motion of an Android device using the accelerometer. I've removed the gravity components from the accelerometer output values and derived the angles from what is left.
So I am able to get the direction of motion, but I also momentarily get arbitrary angles, for example when I stop moving the device.
Any suggestions how I could filter those angles out?
EDIT: I've somewhat been able to solve this by taking the mean of the current and past values.
Another problem that persists is that initially for a few moments, the accelerometer reports values in the opposite direction of motion.
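(A hypothetical sketch of the smoothing mentioned in the edit, written as an exponential moving average of the gravity-free acceleration; alpha closer to 1 trusts new samples more, closer to 0 smooths, and lags, more:)

private static final float ALPHA = 0.2f;
private final float[] smoothed = new float[3];

private void smooth(float[] linearAcceleration) {
    for (int i = 0; i < 3; i++) {
        smoothed[i] = smoothed[i] + ALPHA * (linearAcceleration[i] - smoothed[i]);
    }
}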
That's a typical problem with accelerometers... Initially there's no real solution because of inertia, etc., but you can try using some kind of integrating controller.
Another possible solution is detecting abrupt changes in the acceleration and interpreting them as changes of direction. For example, if you have the acceleration on the X axis (Ax):
while (true) {
    Ax = readAx();                 // latest X-axis acceleration
    if (changeSign(Ax)) {          // sign went from + to - or from - to +
        // Do whatever you need. For example, if the sign changed and stays
        // changed, the phone is now moving in the other direction; if the
        // acceleration is close to 0, the device has stopped.
    } else {
        // The device keeps moving in the same direction.
    }
}
Feel free to be creative. There are many ways to approach this, depending on your goal.
I hope it helps.

Android turn detection math

I would like to develop a personal app; for this I need to detect my car's rotation.
In a previous thread I got an answer about which sensors are good for this, so that part is okay.
Now I would like to ask you to please summarize the essential mathematical relationships.
What I would like to see in my app:
- The car's rotation in degrees
- The car's actual speed (in general this app will be used in slow-speed situations, like 3-5 km/h)
I think the harder part of this is detecting the rotation in real time. It would be good if the app could work when I place the phone in a car holder in either landscape or portrait mode.
So please summarize which equations, formulas, and relationships are needed to calculate the car's rotation. And please tell me your recommendation as to which motion/position sensors are best for this purpose (gravity, accelerometer, gyro, ...).
At first I thought I would use Android 2.2 for better compatibility with my phones, but 2.3.3 is okay for me too. In that case I can use TYPE_ROTATION_VECTOR, which looks like a good fit, but I don't really know whether it would be useful for me or not.
I don't want full source code; I would like to develop it myself, but I need to know where to start and what math is needed. As for the sensor question: I'm a bit confused, since there are many sensors that might work for me.
Thanks,
There is no deep math that you need. You should use TYPE_MAGNETIC_FIELD and TYPE_GRAVITY if the latter is available; otherwise use TYPE_ACCELEROMETER, but then you need to filter the accelerometer values with a Kalman filter or a low-pass filter. You should use the direction of the back camera as the reference. This direction is the azimuth returned by calling getOrientation, but before calling getOrientation you need to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) to get the correct azimuth. Then, as long as the device is not lying flat, it does not matter what the device orientation is (landscape or portrait). Just make sure that the phone screen faces the direction opposite to the car's direction of travel.
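A sketch of the azimuth part described above, with gravity and geomagnetic being the latest TYPE_GRAVITY / TYPE_MAGNETIC_FIELD readings:

float[] inR = new float[9];
float[] outR = new float[9];
float[] orientation = new float[3];

if (SensorManager.getRotationMatrix(inR, null, gravity, geomagnetic)) {
    // Remap so the azimuth is measured along the back camera's axis.
    SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
    SensorManager.getOrientation(outR, orientation);
    float azimuthDegrees = (float) Math.toDegrees(orientation[0]);   // feed this to the turn logic
}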
Now declare two class members, startDirection and endDirection. In the beginning, startDirection and endDirection have the same value. If the azimuth then changes by more than, say, 3 degrees (there is always a little fluctuation), set endDirection to this value and keep updating it until, say, 20 consecutive returned azimuths have the same value (you will have to experiment with this number). This means the car has stopped turning; then calculate the difference between startDirection and endDirection, which gives you the degrees of rotation. Now set startDirection and endDirection to this new azimuth and wait for the next turn.
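Roughly, that bookkeeping could look like the sketch below (the 3-degree threshold and the count of 20 stable readings are the values to experiment with; wrap-around at +/-180 degrees is ignored here):

private static final float TURN_THRESHOLD_DEG = 3f;
private static final int STABLE_READINGS_NEEDED = 20;

private float startDirection;
private float endDirection;
private int stableReadings;

private void onNewAzimuth(float azimuthDegrees) {
    if (Math.abs(azimuthDegrees - endDirection) > TURN_THRESHOLD_DEG) {
        endDirection = azimuthDegrees;      // still turning
        stableReadings = 0;
    } else if (++stableReadings == STABLE_READINGS_NEEDED) {
        float degreesTurned = endDirection - startDirection;   // the completed turn
        startDirection = endDirection;      // reset for the next turn
        // ... report degreesTurned ...
    }
}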

Getting magnetic field values in global coordinates

For an Android application, I need to get magnetic field measurements along the axes of the global (world) coordinate system. Here is how I'm going (guessing) to implement this. Please correct me if necessary. Also, please note that the question is about the algorithmic part of the task, and not about the Android sensor APIs - I have experience with the latter.
The first step is to obtain TYPE_MAGNETIC_FIELD sensor data (M) and TYPE_ACCELEROMETER sensor data (G). The second is what Android's documentation says to use, but I'm not sure whether it shouldn't be TYPE_GRAVITY instead (again as G), because the accelerometer does not seem to provide pure gravity.
The next step is to get the rotation matrices via getRotationMatrix(R, I, G, M), where R and I are the rotation and inclination matrices respectively.
And now comes the most questionable part: in order to convert the M vector into the world's coordinate system, I suppose I should multiply [R * I] * M.
I'm not sure this is a correct way for transforming magnetic field reading into another basis. Also, I don't know if remapCoordinateSystem should be used in addition or as replacement for something above.
If there exists some source code which does this already, I'd appreciate a link, but I don't want to use big general-purpose libraries (for example, for augmented reality support) for this specific task, because I'd like to keep it as simple as possible.
P.S.
I decided to add some information to the original post for clarity.
Let us suppose a device rests on a table and continuously reads data from its magnetic sensor. Each measurement contains 3 values, representing the magnetic field along the X, Y, Z axes of the device's local coordinate system. I take it that I can neglect environmental field fluctuations (smoothed by a low-pass filter), so these 3 values should remain almost the same for as long as the device stays in place. If we rotate the device around any axis, the values change, because we change the local coordinate system; but the field itself has not actually changed. So I want to translate the local X, Y, Z field measurements into X', Y', Z' such that they keep their respective values regardless of device rotation, provided that the device is not moved from its location (only rotated).
I've implemented the algorithm described above and got regular and noticeable changes in the values X', Y', Z' obtained through the suggested transformations, so there is something wrong with it.
P.P.S.
By chance I found an exact duplicate of my question here on SO - How can I get the magnetic field vector, independent of the device rotation? - but unfortunately the answer contains my own suggestions, and the OP of that question confirms that they do not work.
The coordinates of M with respect to the world coordinate system are just the product R * M.
The rotation matrix R is mathematically the change-of-basis matrix from the device coordinate system to the world coordinate system.
Let X, Y, Z be the device coordinate basis and W_1, W_2, W_3 be the world coordinate basis; then
M = m_1 X + m_2 Y + m_3 Z
and also
M = c_1 W_1 + c_2 W_2 + c_3 W_3,
where (c_1, c_2, c_3)^T = R * (m_1, m_2, m_3)^T.
A low-pass filter is only used to filter out accelerations in the X, Y directions. remapCoordinateSystem is used to change the order of the basis, i.e. changing from W_1, W_2, W_3 to W_1, W_3, W_2.
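A sketch of that change of basis in code, rotating the device-frame magnetic vector into world coordinates with the 3x3 R from SensorManager.getRotationMatrix() (gravity and geomagnetic are the latest accelerometer and magnetic-field readings):

float[] R = new float[9];
float[] world = new float[3];
if (SensorManager.getRotationMatrix(R, null, gravity, geomagnetic)) {
    // world = R * M, with R stored row-major and M = geomagnetic = (m_1, m_2, m_3)
    for (int i = 0; i < 3; i++) {
        world[i] = R[3 * i] * geomagnetic[0]
                 + R[3 * i + 1] * geomagnetic[1]
                 + R[3 * i + 2] * geomagnetic[2];
    }
    // world[0], world[1], world[2] are c_1, c_2, c_3 in the notation above.
}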
The magnetometer sensor on your device returns a 3-vector in device coordinates. You can use getRotationMatrix() to get a matrix that could be used to convert that device-coordinates vector to world coordinates. You could also learn about quaternions and use TYPE_ROTATION_VECTOR directly. However, there's no quaternion library in Android (that I know of), and that's a discussion beyond the scope of this question.
However, none of this will do you any good because the device orientation information is based in part on the value from the magnetometers. In other words, the device will always tell you that the magnetic vector is facing exactly North.
Now, what you can do is get magnetic dip. This is one of the outputs from getRotationMatrix(), although you'll have to convert a matrix to an angle for it to be useful. That too, is beyond the scope of this question.
Finally, your last option is to build a table which is level and which has an arrow on it pointing true north. (You'll have to align it by the stars at night or something.) Then, place your device flat on the table with the top of the device facing north. In this case, device coordinates will be the same as world coordinates and the magnetometer sensor will produce the values you want.
Your comments indicate that you're interested in local variations. There's simply no way to get true north with your Android device alone. Theoretically, you could build a table as I described, and then walk around holding the device in strictly the same orientation as before, keeping an eye on the table for reference. I doubt you could pull it off, though.
You could try using gyros in your app to help you keep the device oriented exactly the same way at all times, but the gyros in any Android device you use are likely to drift too much for this to work.
Or perhaps we still don't understand what you're trying to do. Bottom line, though, is that you simply cannot get a global coordinate system with an Android device alone -- whatever you get will always be aligned with the local magnetic field at that exact spot.

Which Android sensor will give me data if I move the phone like this...?

I am getting started with sensors. I want to know which sensor in Android will help me, and how, if I want to know when my phone moves like this: Anim Video .
Can anyone please help me?
You need the acceleration sensor and the orientation sensor as well to accurately recognize the move described in your video.
So I would recommend collecting the data from the mentioned sensors and describing the movement (acceleration and orientation should have separate 3D functions) with a few mathematical functions/graphs. Then, within a certain accuracy, check whether a repeated move matches the expected behaviour on all axes.
You will need this for reference on the axis:
http://developer.android.com/reference/android/hardware/SensorEvent.html
It looks like no change (or only a very small one) should occur on the Y-axis for the move in your video.
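A minimal sketch of collecting those two streams for later comparison (TYPE_ORIENTATION is deprecated on newer APIs, where the same angles come from getRotationMatrix()/getOrientation(); listener is your SensorEventListener):

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
sm.registerListener(listener, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_GAME);
sm.registerListener(listener, sm.getDefaultSensor(Sensor.TYPE_ORIENTATION),
        SensorManager.SENSOR_DELAY_GAME);
// In listener.onSensorChanged(), store (event.timestamp, event.values.clone())
// per sensor, then compare the recorded curves per axis against the expected move.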
