Why do Android rotation sensor events change when the phone hasn't rotated?

I'm using TYPE_GAME_ROTATION_VECTOR and converting it to a matrix with getRotationMatrixFromVector. I then multiply my view by this rotation matrix.
Basically I want to change what is shown on the phone based on whether I'm looking left, up, down, right, etc., i.e. a basic VR-type app.
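Roughly what I'm doing (a minimal sketch; the listener registration and the actual rendering are elided):

// Sketch of the pipeline described above; registering a listener for
// TYPE_GAME_ROTATION_VECTOR is assumed.
private final float[] rotationMatrix = new float[16];

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GAME_ROTATION_VECTOR) {
        // Convert the rotation vector into a 4x4 (OpenGL-style) matrix.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        // ... multiply the view/model matrix by rotationMatrix here ...
    }
}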
However, I'm finding it extremely odd that when I leave my phone flat and motionless on my table, there are about 15-20 seconds during which the rotation matrix "settles down" and stops rotating.
Why do these initial erroneous rotation readings occur?
And is there a way around this? If not, what is the proper way to calculate the rotation of the phone? Am I supposed to use the GYROSCOPE sensor instead?
Any guidance will be greatly appreciated. Thanks!

Related

How do I put a stable marker (a solid circle) on an image so it stays at the same position in the next camera frame?

I am thinking of putting markers on images taken from the camera output, similar to what the Google PhotoScan application does. As far as I can see, the PhotoScan app overlays four solid circles on the image, then has you move the center hollow circle toward each of the four solid circles, capturing four images that are stitched together to create one high-quality image.
Screenshots for reference: the solid dots are always there, even on a uniformly colored background, and even if you move the camera around and back to the initial position they display at the same position.
I am very curious how they are able to stabilize those four solid circles. Are they using an optical flow algorithm? Or motion sensors? I tested the application on a white or uniformly colored background and those dots stay stable.
I implemented this functionality using an optical flow algorithm (the Lucas–Kanade method in OpenCV), but my markers are not stable on a uniformly colored or white background (basically, if the Lucas–Kanade algorithm cannot find the feature, it shifts the point). Here is the screenshot of my implementation:
You are close. Using a single sensor alone, either the gyroscope or the compass, will not work. By combining the results of several, we can achieve your requirement.
Ingredient 1: Accelerometer
Accelerometers in mobile phones are used to detect the orientation of the phone. An accelerometer measures the linear acceleration of movement.
Ingredient 2: Gyroscope
The gyroscope, or gyro for short, adds a dimension to the information supplied by the accelerometer by tracking rotation or twist. In practice, an accelerometer will measure the directional movement of a device, but it will not be able to resolve the device's lateral orientation or tilt during that movement accurately unless a gyro is there to fill in that information.
Ingredient 3: Digital compass
The digital compass, usually based on a sensor called a magnetometer, provides mobile phones with a simple orientation in relation to the Earth's magnetic field. As a result, your phone always knows which way is north, so it can auto-rotate your digital maps depending on your physical orientation.
With an accelerometer you can either get a really "noisy" output that is responsive, or a "clean" output that is sluggish. But when you combine the 3-axis accelerometer with a 3-axis gyro, you get an output that is both clean and responsive at the same time.
Coming back to your question: either the Lucas–Kanade result in OpenCV is delayed, causing the glitch, or the sensors on your device are not giving accurate results.
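For what it's worth, a minimal sketch of this accelerometer + gyro combination is the classic complementary filter (the 0.98 coefficient and the single-axis form are illustrative choices, not from the answer above):

// Complementary filter: trust the integrated gyro rate short-term
// (responsive), the accelerometer tilt long-term (drift-free).
private static final float ALPHA = 0.98f; // illustrative blend factor
private float fusedAngle = 0f;            // fused tilt angle in radians

void fuse(float gyroRateRadPerSec, float dtSec, float ax, float az) {
    // Tilt implied by the gravity direction, from the accelerometer alone.
    float accelAngle = (float) Math.atan2(ax, az);
    fusedAngle = ALPHA * (fusedAngle + gyroRateRadPerSec * dtSec)
               + (1 - ALPHA) * accelAngle;
}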
It's more of a CV problem.
I really appreciate @Jeek Axio's answer. You can use multiple sensors on an Android device as 'prime' factors in a CV problem.
However, with state-of-the-art CV methods it is possible to solve this tracking problem with very good accuracy.
You may use the EKLT or PointTrack methods to track the feature points.
There's also a full-featured toolbox called FTK.

What is the best sensor to detect movement?

I'm working on an Android application to detect human movements like FRONT, BACK, RIGHT, and LEFT.
Suppose your phone is in front of your face: if you move it to the left, the X axis should give negative values, and if you move it up, the Y axis should give positive values. What is the best sensor for this job?
I think the accelerometer isn't good for this job.
You should use the rotation vector or the gyroscope, if your device has them.
You can't know whether the person holding the device is walking (in any direction), because these sensors only report values relative to the device. You would need position data for that.
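A minimal sketch of the gyroscope option (the 0.5 rad/s threshold and the axis-to-direction mapping are illustrative assumptions):

// Gyroscope events report angular speed in rad/s around each device axis.
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
        float rateX = event.values[0]; // rotation around the device's X axis
        float rateY = event.values[1]; // rotation around the device's Y axis
        if (Math.abs(rateY) > 0.5f) {
            // Device is being rotated left/right (the sign gives the direction).
        } else if (Math.abs(rateX) > 0.5f) {
            // Device is being tilted up/down.
        }
    }
}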

Making an app with an arrow always pointing to the Earth's ground

Which sensor is needed to make an app where an arrow on the display always points to the ground?
My app is always in landscape. I've tried using the orientation sensor, but it only works if I'm holding my smartphone in portrait mode. The more I move my device toward landscape, the more unstable the values become, and the arrow no longer points to the ground.
To be more specific, in portrait mode I can use the y-axis (roll) to find the angle, but the more I rotate my device toward landscape, the less this works.
Maybe it's the wrong sensor, or is it a question of some trigonometry functions?
Any ideas? Please help me.
I've found the solution.
The problem described is also known as gimbal lock.
See here and here.
For me the solution can be found here, in this short passage:
Using the camera (Y axis along the camera's axis) for an augmented reality application where the rotation angles are needed:
remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR);
This way I can do what I want, because the gimbal lock now lies in a position where it doesn't get in my way.
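A minimal sketch of that remapping in context (accelValues and magValues are assumed to hold the latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings):

float[] inR = new float[9];
float[] outR = new float[9];
float[] orientation = new float[3];
if (SensorManager.getRotationMatrix(inR, null, accelValues, magValues)) {
    // Remap so the Y axis runs along the camera axis, which moves the
    // gimbal-lock singularity away from the landscape working range.
    SensorManager.remapCoordinateSystem(inR,
            SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
    SensorManager.getOrientation(outR, orientation);
    float roll = orientation[2]; // radians; drives the arrow's rotation
}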

Get device motion direction using accelerometer

I'm trying to detect the direction of motion of an Android device using the accelerometer. I've removed the gravity components from the accelerometer output values and deduced the angles.
So I am able to get the direction of motion, but I also momentarily get arbitrary angles, for example when I stop moving the device.
Any suggestions on how I could filter those angles out?
EDIT: I've somewhat been able to solve this by taking the mean of the current and past values.
Another problem that persists is that initially, for a few moments, the accelerometer reports values in the opposite direction of motion.
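For reference, a sketch of the gravity removal and averaging described above (the standard low-pass split; ALPHA and the simple two-value mean are illustrative choices):

// Low-pass filter isolates gravity; subtracting it leaves the linear
// acceleration. ALPHA is an illustrative smoothing factor.
private static final float ALPHA = 0.8f;
private final float[] gravity = new float[3];
private final float[] linear = new float[3];

void onAccelerometer(float[] values) {
    for (int i = 0; i < 3; i++) {
        gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];
        // Mean of the previous and current estimates damps the spikes
        // that appear when the device stops moving.
        linear[i] = (linear[i] + (values[i] - gravity[i])) / 2f;
    }
}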
That's typical trouble with accelerometers. There is no clean solution for the initial readings because of inertia, etc., but you can try using some kind of integral controller.
Another possible solution is detecting abrupt changes in the acceleration and interpreting them as changes of direction. For example, suppose you have the acceleration on the X axis (Ax):
float prevAx = 0f;
while (true) {
    float ax = readAx(); // latest X-axis acceleration (assumed helper)
    if (prevAx * ax < 0) { // sign changed: from + to - or - to +
        // The device is being moved in the other direction.
        // If instead |ax| stays close to 0, the device has stopped.
    } else {
        // The device keeps moving in the same direction.
    }
    prevAx = ax;
}
Feel free to be creative. There are many ways to approach a solution, depending on your goal.
I hope it helps.

Android turn detection math

I would like to develop a personal app, and for this I need to detect my car's rotation.
In a previous thread I already got an answer about which sensors are suitable for this.
Now I would like to ask you to summarize the essential mathematical relationships.
What I would like to see in my app:
- The car's rotation in degrees
- The car's actual speed (in general this app will be used in slow-speed situations, around 3-5 km/h)
I think the harder part of this is detecting the rotation in real time. It would be good if the app could work when I place the phone in a car holder in landscape or portrait mode.
So please summarize which equations, formulas, and relationships are needed to calculate the car's rotation, and please tell me your recommendation for which motion/position sensor is best for this purpose (gravity, accelerometer, gyro, ...).
At first I thought I would use Android 2.2 for better compatibility with my phones, but 2.3.3 is okay for me too. In that case I can use TYPE_ROTATION_VECTOR, which looks like a good thing, but I don't really know whether it would be useful for me or not.
I don't want full source code; I would like to develop it myself, but I need to know where to start and what math is needed. As for the sensor question: I'm a bit confused, as there are many sensors that could possibly work for me.
Thanks,
There is no deep math that you need. You should use TYPE_MAGNETIC_FIELD and TYPE_GRAVITY if available; otherwise use TYPE_ACCELEROMETER, but then you need to filter the accelerometer values using a Kalman filter or a low-pass filter. Use the direction of the back camera as the reference. This direction is the azimuth returned by getOrientation, but before calling getOrientation you need to call remapCoordinateSystem(inR, AXIS_X, AXIS_Z, outR) to get the correct azimuth. Then, as long as the device is not lying flat, it does not matter what the device orientation is (landscape or portrait). Just make sure the phone screen faces the opposite direction of the car's direction of travel.
Now declare two class members, startDirection and endDirection. In the beginning, startDirection and endDirection have the same value. Whenever the azimuth changes by more than, say, 3 degrees (there is always a little fluctuation), update endDirection to this value, and continue updating until, say, 20 consecutive returned azimuths have the same value (you will have to experiment with this number). That means the car has stopped turning; then calculate the difference between startDirection and endDirection, which gives you the degrees of rotation. Now set startDirection and endDirection to this new azimuth and wait for the next turn.
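A minimal sketch of that bookkeeping (the azimuth is assumed to come from getOrientation after the remap described above; the 3-degree and 20-sample thresholds are the answer's illustrative values):

private float startDirection;                       // azimuth when the turn began
private float endDirection;                         // latest turning azimuth
private int stableCount = 0;                        // consecutive steady samples
private static final float TURN_THRESHOLD_DEG = 3f; // ignore small fluctuations
private static final int STABLE_SAMPLES = 20;       // tune experimentally

void onNewAzimuth(float azimuthDeg) {
    // Note: 0/360-degree wrap-around is not handled in this sketch.
    if (Math.abs(azimuthDeg - endDirection) > TURN_THRESHOLD_DEG) {
        endDirection = azimuthDeg; // still turning
        stableCount = 0;
    } else if (++stableCount >= STABLE_SAMPLES) {
        float degreesTurned = endDirection - startDirection; // turn finished
        // ... report degreesTurned, then reset for the next turn ...
        startDirection = endDirection;
        stableCount = 0;
    }
}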
