Why do z-axis rotations reflect on y-axis in Nexus S? - android

I'm trying to understand the gyroscope sensor output from the Nexus S. When you rotate about the z-axis (the axis perpendicular to the screen), the z-axis correctly records the angular velocity as a result of the turn. However, the y-axis also reports spikes in angular velocity, despite no change in the orientation of the device relative to the y-axis. So if I turn around when holding the phone at a particular orientation, it appears that I have not only turned around, but also tilted the phone left/right (which I haven't).
Why is this? Any ideas how to compensate or correct for this?
Please note I'm talking about the raw sensor output here.

First of all, raw sensor data always contains some bias that has to be filtered out. Furthermore, it is almost impossible to fully suppress rotation about the other axes by hand: a hand-held turn about the z-axis almost always includes some real rotation about x and y as well. If you suspect your Nexus has a sensor bug, or you want to know exactly what's going on, I suggest building your own test apparatus that constrains the phone to a single axis of rotation. As inspiration, a picture of mine :-)
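To illustrate the bias point above, here is a minimal plain-Java sketch (no Android dependencies; the class and method names are my own, not part of any API): estimate the constant gyroscope bias by averaging readings taken while the device is known to be at rest, then subtract that bias from every subsequent sample.

```java
// Sketch: estimate a constant gyro bias from readings captured at rest,
// then subtract it from raw samples. Names are illustrative only.
public class GyroBias {

    /** Average per-axis bias over restSamples[i] = {x, y, z} captured at rest. */
    public static double[] estimateBias(double[][] restSamples) {
        double[] bias = new double[3];
        for (double[] s : restSamples) {
            for (int axis = 0; axis < 3; axis++) {
                bias[axis] += s[axis];
            }
        }
        for (int axis = 0; axis < 3; axis++) {
            bias[axis] /= restSamples.length;
        }
        return bias;
    }

    /** Remove the estimated bias from one raw gyroscope reading {x, y, z}. */
    public static double[] correct(double[] raw, double[] bias) {
        return new double[] {raw[0] - bias[0], raw[1] - bias[1], raw[2] - bias[2]};
    }
}
```

This only removes a constant offset; real gyro bias drifts slowly with temperature and time, so production code typically re-estimates it whenever the device is detected to be stationary.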

Related

Is there a way to know the exact orientation of the phone?

Usually there's a compass (magnetometer) and an accelerometer on a typical smartphone these days. Using the information from these sensors, can we recreate the whole orientation in which a photo was taken?
I mean, if you open the compass app on your phone, it states the direction you are facing (link), how far the phone is tilted forward or back (link), and how far it is tilted sideways (link). Does that cover all 3 degrees of freedom (I guess)?
Is that enough information to recreate the orientation of the phone?
Also, if you think this isn't the right place to ask such a question, can you comment on where I should ask it?
By recording sensor values, it is possible to restore the direction the phone was facing when a photo was taken. Some fundamental XR apps (e.g. Google Street View) actually do exactly this.
Device rotation
There is a handy helper function, SensorManager.getRotationMatrix, for exactly this purpose. Given magnetic-field and gravity sensor values, it gives you the device rotation matrix (a complete set of orthogonal 3D basis vectors), which is enough to reproduce the phone's orientation afterwards.
Camera orientation
In some cases the camera's up direction may differ from the device's up direction due to screen-orientation changes (portrait or landscape, locked or auto-rotate). So if you handle raw camera input, you may need to record the screen orientation too. See Display.getRotation.
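On Android you would simply call SensorManager.getRotationMatrix(R, null, gravity, geomagnetic). For readers curious what that call actually computes, below is a pure-Java sketch of the underlying math (mirroring the AOSP implementation's structure; the class and method names here are illustrative, not Android API): build an orthonormal east/north/up basis from the gravity and geomagnetic vectors.

```java
// Pure-Java sketch of the math behind SensorManager.getRotationMatrix:
// construct an orthonormal basis (east, north, up) from gravity g and
// the geomagnetic field m. Names are illustrative, not Android API.
public class RotationMatrix {

    /** Returns a row-major 3x3 rotation matrix {east; north; up},
     *  or null if the inputs are degenerate (free fall, parallel vectors). */
    public static double[] fromGravityAndMagnetic(double[] g, double[] m) {
        double[] h = cross(m, g);                   // east = magnetic x gravity
        double hNorm = norm(h), gNorm = norm(g);
        if (hNorm < 1e-6 || gNorm < 1e-6) return null;
        scale(h, 1.0 / hNorm);                      // normalize east
        double[] a = g.clone();
        scale(a, 1.0 / gNorm);                      // up = normalized gravity
        double[] n = cross(a, h);                   // north = up x east
        return new double[] {h[0], h[1], h[2],
                             n[0], n[1], n[2],
                             a[0], a[1], a[2]};
    }

    static double[] cross(double[] u, double[] v) {
        return new double[] {u[1] * v[2] - u[2] * v[1],
                             u[2] * v[0] - u[0] * v[2],
                             u[0] * v[1] - u[1] * v[0]};
    }

    static double norm(double[] v) {
        return Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    }

    static void scale(double[] v, double s) {
        for (int i = 0; i < 3; i++) v[i] *= s;
    }
}
```

For a phone lying flat with its y-axis pointing toward magnetic north, this yields the identity matrix, which is the reference orientation the Android documentation describes.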

Measuring exact rotation angle of device in android

I need to get exact angle of my android device from each axis.
First, I should say I searched the web and Stack Overflow and found pages like these:
How to measure the tilt of the phone in XY plane using accelerometer in Android
Get Android rotation angle in x-axis
But they were not helpful; the approaches I tried from them give wrong values.
If the device's axes are like this:
I need to get these rotation angles:
I don't want to use any hardware sensor except accelerometer and I work with API 17.
I think this is possible, judging by racing games, but I don't know how to do it.
I would appreciate any help.
Thanks
There is no way to get all the rotations using TYPE_ACCELEROMETER alone; you also need TYPE_MAGNETIC_FIELD. If you do not care how sensitively the values respond to quick movement, then you can use TYPE_GRAVITY or TYPE_ACCELEROMETER together with TYPE_MAGNETIC_FIELD.
For rotation with respect to the z-axis you can follow the first link in your question.
For rotation with respect to the x- and y-axes you can use pitch and roll, together with which way the screen is facing.
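To make the accelerometer-only limitation concrete, here is a plain-Java sketch (illustrative names, no Android dependencies): with the device stationary, the accelerometer measures only gravity, so you can recover the tilt of each device axis relative to the horizontal plane, but rotation about the vertical (azimuth) is invisible to it, which is exactly why TYPE_MAGNETIC_FIELD is also needed.

```java
// Sketch: device tilt from a stationary accelerometer reading {ax, ay, az}.
// Gravity gives tilt relative to horizontal, but NOT rotation about the
// vertical axis (azimuth). Names are illustrative only.
public class AccelTilt {

    /** Angle, in degrees, between the device x-axis and the horizontal plane. */
    public static double tiltX(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(ax, Math.sqrt(ay * ay + az * az)));
    }

    /** Angle, in degrees, between the device y-axis and the horizontal plane. */
    public static double tiltY(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(ay, Math.sqrt(ax * ax + az * az)));
    }
}
```

Note that spinning the phone flat on a table changes none of these readings, demonstrating the missing degree of freedom.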

Android API - disable right landscape?

I am busy writing an app that lets me use my phone (Galaxy S2 2.2.3) as a steering wheel. Just a nerdy weekend project really.
I have gotten everything working regarding calculating the orientation of the device using SensorManager.getOrientation(), with one snag: the rotation around the axis coming out of the screen and back of the phone rises from 0 to 90 degrees and then falls back down to 0 degrees in the same manner, instead of proceeding on to 180.
This had me really confused until I read something suggesting the API might be flipping orientation internally (the screen itself doesn't flip), so the phone's coordinate system flips between left and right landscape. (It's worth noting that I have the activity locked in landscape mode in the manifest.) This would explain the weird behaviour in terms of orientation.
Does anyone know how to stop this happening, or have I gone wrong completely in my understanding?

Sensor coordinate system in Android doesn't change, does it?

I've read in many places, such as One Screen Turn Deserves Another, that: "The sensor coordinate system used by the API for the natural orientation of the device does not change as the device moves, and is the same as the OpenGL coordinate system."
Now, I get the same reading as this image:
What I don't understand is: if the coordinate system doesn't change when I rotate the phone (always with the screen facing the user), then gravity should always act along the Y axis. It should only move to another axis if I put the phone in a position where the screen no longer faces the user, like resting flat on a table, where gravity should act along the Z axis.
What is wrong with my understanding?
Thanks! Guillermo.
The axes are swapped when the device's screen orientation changes. Per the article you cited:
However, the Android sensor APIs define the sensor coordinate space to be relative to the top and side of the device — not the short and long sides. When the system reorients the screen in response to holding the phone sideways, the sensor coordinate system no longer lines up with the screen’s coordinate system, and you get unexpected rotations in the display of your app.
To access the un-swapped values if you'd like, use indices 3, 4, and 5 in values[]; otherwise, some of the suggestions mentioned in that same article work quite nicely!
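One of that article's suggestions is to remap values from the device's natural (canonical) frame into the current screen frame using the display rotation. Below is a plain-Java sketch along those lines (the class and method names are mine; on Android the rotation would come from Display.getRotation() as Surface.ROTATION_0/90/180/270):

```java
// Sketch: remap a sensor vector {x, y, z} from the device's natural
// (canonical) coordinate frame into the current screen frame, given the
// display rotation in degrees. Names are illustrative only.
public class SensorRemap {

    public static double[] canonicalToScreen(int rotationDegrees, double[] v) {
        switch (rotationDegrees) {
            case 90:  return new double[] {-v[1],  v[0], v[2]};
            case 180: return new double[] {-v[0], -v[1], v[2]};
            case 270: return new double[] { v[1], -v[0], v[2]};
            default:  return new double[] { v[0],  v[1], v[2]};  // natural orientation
        }
    }
}
```

The z component never changes, since screen rotation is always about the axis perpendicular to the display.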
Quite an old question, but I find it still relevant. As of today, the Sensor Overview page, in its Sensor Coordinate System section, still says:
The most important point to understand about this coordinate system is that the axes are not swapped when the device's screen orientation changes—that is, the sensor's coordinate system never changes as the device moves. This behavior is the same as the behavior of the OpenGL coordinate system.
To me this wording is still confusing; of course, that might be because I'm not a native English speaker.
My understanding is that in Android (as in iOS) the coordinate system assumed by the sensors is fixed to the device. That is, the coordinate system is stuck to the device body, and its axes rotate along with the device.
So, for a phone whose natural orientation is portrait, the Y-axis points upward when the phone is held vertically in portrait in front of the user. See the image below, from the same Android guide:
Then, when the user rotates the phone to landscape-left orientation (so with the home button on the right side), the Y-axis points to the left. See the image below, from a MATLAB tutorial (although the screen is no longer really user-facing there):
Then there's the frequently cited post from Android dev blog, One Screen Turn Deserves Another that says:
The sensor coordinate system used by the API for the natural orientation of the device does not change as the device moves, and is the same as the OpenGL coordinate system
which to me sounds like exactly the opposite of my reasoning above. But then again, in the following "So What's the Problem?" section, you do see that when the phone is rotated to landscape left, the Y-axis points to the left, as in my reasoning.

OnTouch locations tablets vs. phone

I made a particle system that was designed for a tablet: particles follow finger movement, etc. I implemented the gyroscope so that when you tilt the tablet in any direction, all the particles fall that way. In the manifest I locked it down to landscape view.
Then I loaded it onto a Samsung Intercept. When I moved that screen around, nothing went in the correct direction at all. What I did to fix the situation is:
if (width < 800) {
    // My tablet's width is 800 px; the phone's is much less
    this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
} else {
    this.setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
This fixes my problem, but I'm unsure whether it will hold for all phones. My Samsung Intercept is only for development, and it's a POS IMO. Is it the phone, or is this just how it works?
Some devices have a natural orientation of portrait and some have a natural orientation of landscape. This will affect the default sensor coordinate system for the device. Take a look at this post from the Android Developers blog for more details and solutions: http://android-developers.blogspot.com/2010/09/one-screen-turn-deserves-another.html
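Rather than guessing from the screen width, the natural orientation can be deduced robustly by combining the current display rotation (Display.getRotation()) with whether the screen is currently portrait or landscape (Configuration.orientation), in the spirit of that blog post. A plain-Java sketch of the logic (the class and method names are illustrative):

```java
// Sketch: deduce a device's NATURAL orientation from the current display
// rotation and the current screen shape. Names are illustrative only.
public class NaturalOrientation {

    /** rotationDegrees: 0/90/180/270, as reported by Display.getRotation();
     *  currentlyLandscape: whether Configuration.orientation is landscape.
     *  Returns true if the device's natural orientation is landscape. */
    public static boolean isNaturallyLandscape(int rotationDegrees,
                                               boolean currentlyLandscape) {
        // At 0 or 180 degrees, the screen shape matches the natural shape;
        // at 90 or 270 degrees, it is the opposite of the natural shape.
        boolean unrotated = (rotationDegrees == 0 || rotationDegrees == 180);
        return unrotated == currentlyLandscape;
    }
}
```

A portrait-natural phone held sideways reports rotation 90 with a landscape screen, so it is still classified as naturally portrait; a landscape-natural tablet held sideways reports rotation 90 with a portrait screen and is still classified as naturally landscape.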
