I need to get the exact angle of my Android device around each axis.
First I should say I searched the web and Stack Overflow and saw pages like these:
How to measure the tilt of the phone in XY plane using accelerometer in Android
Get Android rotation angle in x-axis
But they were not helpful; I tried some of them, but they gave wrong values.
If the device's axes are like this:
I need to get these rotation angles:
I don't want to use any hardware sensor except the accelerometer, and I'm working with API 17.
I think this is possible, based on racing games, but I don't know how to do it.
I would appreciate any help.
Thanks
There is no way you can get all the rotations using TYPE_ACCELEROMETER alone; you also need TYPE_MAGNETIC_FIELD. If you do not care how sensitively the values respond to quick movement, you can use either TYPE_GRAVITY or TYPE_ACCELEROMETER, together with TYPE_MAGNETIC_FIELD.
For rotation with respect to the z-axis you can follow the first link in your question.
For rotation with respect to the x and y axes you can use pitch and roll, together with the direction the screen is facing.
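As a rough sketch of combining TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix() and getOrientation() (a minimal example only, with listener registration and any smoothing/filtering omitted):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: derive azimuth/pitch/roll from accelerometer + magnetic field readings.
// Assumes both sensors have been registered for this listener; no smoothing applied.
public class OrientationTracker implements SensorEventListener {
    private final float[] accel = new float[3];
    private final float[] magnet = new float[3];
    private final float[] rotation = new float[9];
    private final float[] angles = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, accel, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, magnet, 0, 3);
        }
        // Build a rotation matrix from the two readings, then extract the angles.
        if (SensorManager.getRotationMatrix(rotation, null, accel, magnet)) {
            SensorManager.getOrientation(rotation, angles);
            float azimuth = (float) Math.toDegrees(angles[0]); // rotation around z
            float pitch   = (float) Math.toDegrees(angles[1]); // rotation around x
            float roll    = (float) Math.toDegrees(angles[2]); // rotation around y
            // use azimuth/pitch/roll here
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```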
I'm using Project Tango to get the depth data, and then I send the 3D point cloud to a native class to process the data using the PCL library. I have a couple of questions which I think will help me understand more about rotation, translation, and coordinate systems in both PCL and Tango, and will also help me solve the problem in the project I'm working on.
I'm working on an obstacle detection system that should work in real time, and I'm using the default coordinate system in Tango. Is this right? Shouldn't I use the area description coordinate frame?
In my current scenario, I'm taking the 3D cloud data, which is in the camera frame of reference, and processing it without changing the frame of reference. I have three related questions here:
Is it alright to work directly on the data while it is in the camera frame?
Since I use PCL for processing, what is the coordinate system in PCL? Does x point to the right, y to the bottom, and z forward from the camera, as in Tango's camera coordinate system?
Is transforming the cloud to the start-of-service frame the same as transforming to the origin (world frame)?
In my project's scenario, users hold the device rotated by some angle X around the x-axis. I have read in other questions that pitch/roll/yaw are not a reliable way to get the rotation of the device and that I have to use the pose data provided by Tango; is that correct? How can I determine the right rotation angle of the device so that I can rotate the cloud and make sure the surface normal of the floor is parallel to the Y axis? (Please have a look at the pictures to get an idea of what I mean.)
How can I use the pose data to translate and rotate the cloud data in PCL?
Note:
I have a related question to this one, which shows the results of my 3D point cloud processing code and its output:
Related question: how to detect the floor plane?
Thank you
I will post some answers to my questions.
1 and 2.1 - To get better results regarding the floor plane normal angle, the coordinate frame that worked best for me is the OpenGL world coordinate frame.
2.2 - Please refer to the answer in this link: answer link (PCL forum)
For the rotation and translation, I used the pose data from the Tango device; I have read that it is more reliable than getting the rotation of the device itself.
Thus, I get the pose data at the same time as the point cloud, then I send both to the native code and perform the transformation using the PCL library.
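As a rough illustration of that flow on the Java side (a sketch only: mTango is assumed to be an already-connected Tango instance, onXyzIjAvailable is the point cloud callback from Tango.OnTangoUpdateListener, and processCloud() is a hypothetical JNI entry point into the native PCL code; exception handling omitted):

```java
import com.google.atap.tangoservice.*;

// Sketch: fetch the device pose at the point cloud's timestamp and hand both to native code.
// processCloud() is a hypothetical JNI method; the actual PCL transform happens natively.
@Override
public void onXyzIjAvailable(TangoXyzIjData xyzIj) {
    TangoCoordinateFramePair framePair = new TangoCoordinateFramePair(
            TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
            TangoPoseData.COORDINATE_FRAME_DEVICE);
    // Device pose at the exact timestamp of this depth frame.
    TangoPoseData pose = mTango.getPoseAtTime(xyzIj.timestamp, framePair);

    float[] points = new float[xyzIj.xyzCount * 3];
    xyzIj.xyz.get(points);

    // Pass the cloud plus translation (x, y, z) and rotation quaternion (x, y, z, w)
    // down to the native layer for the PCL-side transformation.
    processCloud(points, pose.translation, pose.rotation);
}

private native void processCloud(float[] xyz, double[] translation, double[] rotation);
```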
Hope this will be helpful for someone.
I'm trying to create an app which requires me to find the compass bearing.
The users will hold their device so that the screen is always facing them but will be able to rotate it 360 degrees.
I can successfully determine the bearing for one orientation - portrait or landscape - but I need to find a method which will determine the bearing regardless of rotation.
Can I do this?
Thanks.
You should use data from the gyroscope, which can fully determine the angles of the device.
You can find this link helpful:
http://www.touchqode.com/misc/20101025_jsug/20101025_touchqode_sensors.pdf
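If you go the gyroscope route, here is a minimal sketch of naively integrating its angular velocity into per-axis angles (this drifts over time, so a real app would fuse it with accelerometer/magnetometer data):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Sketch: naive integration of TYPE_GYROSCOPE angular velocity (rad/s) into angles.
// Drifts over time; listener registration is omitted.
public class GyroIntegrator implements SensorEventListener {
    private static final float NS2S = 1.0f / 1000000000.0f; // nanoseconds to seconds
    private final float[] angleRad = new float[3];          // accumulated rotation around x, y, z
    private long lastTimestamp = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
        if (lastTimestamp != 0) {
            float dt = (event.timestamp - lastTimestamp) * NS2S;
            for (int i = 0; i < 3; i++) {
                angleRad[i] += event.values[i] * dt; // integrate angular velocity
            }
        }
        lastTimestamp = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```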
I read in many places, like One Screen Turn Deserves Another, that: "The sensor coordinate system used by the API for the natural orientation of the device does not change as the device moves, and is the same as the OpenGL coordinate system."
Now, I get the same reading as this image:
What I don't understand is this: if the coordinate system doesn't change when I rotate the phone (always with the screen facing the user), the gravity force should always be applied along the Y axis. It should only change axes if I put the phone in a position where the screen is no longer facing the user, like resting on a table, where gravity should be applied along the Z axis.
What is wrong with my understanding?
Thanks! Guillermo.
The axes are swapped when the device's screen orientation changes. Per the article you cited:
However, the Android sensor APIs define the sensor coordinate space to be relative to the top and side of the device — not the short and long sides. When the system reorients the screen in response to holding the phone sideways, the sensor coordinate system no longer lines up with the screen’s coordinate system, and you get unexpected rotations in the display of your app.
To access the un-swapped values if you'd like, use indices 3, 4 and 5 in values[], otherwise some of the suggestions mentioned in that same article work quite nicely!
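For example, a hedged sketch of the remapping approach that article suggests, based on the current display rotation (it assumes inR already comes from SensorManager.getRotationMatrix() and the Display from getWindowManager().getDefaultDisplay()):

```java
import android.hardware.SensorManager;
import android.view.Display;
import android.view.Surface;

// Sketch: remap a rotation matrix obtained from SensorManager.getRotationMatrix()
// so that it lines up with the current screen orientation.
public final class RotationRemap {
    public static void remapForDisplay(Display display, float[] inR, float[] outR) {
        switch (display.getRotation()) {
            case Surface.ROTATION_90:
                SensorManager.remapCoordinateSystem(inR,
                        SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
                break;
            case Surface.ROTATION_180:
                SensorManager.remapCoordinateSystem(inR,
                        SensorManager.AXIS_MINUS_X, SensorManager.AXIS_MINUS_Y, outR);
                break;
            case Surface.ROTATION_270:
                SensorManager.remapCoordinateSystem(inR,
                        SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, outR);
                break;
            default: // Surface.ROTATION_0: already aligned, no remap needed
                System.arraycopy(inR, 0, outR, 0, inR.length);
        }
    }
}
```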
Quite an old question, but I find it still relevant. As of today, the Sensor Overview page, in its Sensor Coordinate System section, still says:
The most important point to understand about this coordinate system is that the axes are not swapped when the device's screen orientation changes—that is, the sensor's coordinate system never changes as the device moves. This behavior is the same as the behavior of the OpenGL coordinate system.
To me this wording is still confusing; of course, it might be because I'm not a native English speaker.
My understanding is that in Android (as in iOS) the coordinate system assumed by the sensors is fixed to the device. That is, the coordinate system is attached to the device and its axes rotate along with it.
So, for a phone whose natural orientation is portrait, the Y-axis points upward when the phone is held vertically in portrait in front of the user. See the image below, from the same Android guide:
Then, when the user rotates the phone to landscape-left orientation (so with the home button on the right side), the Y-axis points to the left. See the image below, from a MATLAB tutorial (although the screen is not really facing the user anymore):
Then there's the frequently cited post from the Android dev blog, One Screen Turn Deserves Another, which says:
The sensor coordinate system used by the API for the natural orientation of the device does not change as the device moves, and is the same as the OpenGL coordinate system
which to me sounds like exactly the opposite of my previous reasoning. But then again, in the following So What's the Problem? section, you do see that when the phone is rotated to landscape-left, the Y-axis points to the left, as in my previous reasoning.
I'm trying to understand the gyroscope sensor output from the Nexus S. When you rotate about the z-axis (the axis perpendicular to the screen), the z-axis correctly records the angular velocity as a result of the turn. However, the y-axis also reports spikes in angular velocity, despite no change in the orientation of the device relative to the y-axis. So if I turn around when holding the phone at a particular orientation, it appears that I have not only turned around, but also tilted the phone left/right (which I haven't).
Why is this? Any ideas how to compensate or correct for this?
Please note I'm talking about the raw sensor output here.
First of all, raw sensor data always contains some bias that has to be filtered out. Furthermore, it is almost impossible to completely suppress rotation around the other axes when turning the device by hand. If you suspect your Nexus has a sensor bug, or you want to know exactly what's going on, I suggest building your own test apparatus. As inspiration, a picture of mine :-)
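As for the bias mentioned above, a rough software-only sketch of estimating and removing it (it assumes the device is held still during calibration; the sample count is arbitrary):

```java
// Sketch: estimate a constant gyroscope bias by averaging raw readings while the
// device is held still, then subtract it from later values.
public class GyroBiasFilter {
    private static final int CALIBRATION_SAMPLES = 200; // arbitrary sample count
    private final float[] biasSum = new float[3];
    private final float[] bias = new float[3];
    private int samples = 0;

    /** Feed raw gyroscope values (rad/s); returns bias-corrected values once calibrated. */
    public float[] filter(float[] raw) {
        if (samples < CALIBRATION_SAMPLES) {
            for (int i = 0; i < 3; i++) biasSum[i] += raw[i];
            if (++samples == CALIBRATION_SAMPLES) {
                for (int i = 0; i < 3; i++) bias[i] = biasSum[i] / CALIBRATION_SAMPLES;
            }
            return raw.clone(); // still calibrating, pass values through
        }
        float[] corrected = new float[3];
        for (int i = 0; i < 3; i++) corrected[i] = raw[i] - bias[i];
        return corrected;
    }
}
```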
I'm working on an Android program for my research.
It needs to find out the angle of the device while the user is using the mobile phone.
The angle here means the rotation angle of the entire phone, not of the widgets in the layout.
I've tried keyword "angle" on http://developer.android.com/reference/packages.html
And I found the method "onSensorChanged" in the public interface SensorListener.
But the description there is too hard for me to understand :Q
Is it the function I want?
Essentially, you would do some inverse trig on the readings of the three accelerometer axes to figure out the angle of "gravity" from its vector components along the three measured axes.
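A minimal sketch of that inverse-trig approach, assuming a raw TYPE_ACCELEROMETER reading with no low-pass filtering (so the output will be noisy):

```java
// Sketch: estimate tilt from the gravity vector reported by a raw accelerometer
// reading (x, y, z in m/s^2).
public final class TiltMath {
    /** Returns { pitch, roll } in degrees for an accelerometer reading (x, y, z). */
    public static float[] tiltFromAccelerometer(float[] values) {
        float x = values[0], y = values[1], z = values[2];
        float pitch = (float) Math.toDegrees(Math.atan2(y, Math.sqrt(x * x + z * z)));
        float roll  = (float) Math.toDegrees(Math.atan2(x, Math.sqrt(y * y + z * z)));
        return new float[] { pitch, roll };
    }
}
```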
See ApiDemos in the sample applications package of the SDK, particularly Sensors.java
http://developer.android.com/resources/samples/get.html
You may just want to build that demo application, install it, and experiment with the screen showing the accelerometer data, then extract that part into a separate project and modify it towards your needs.