I am developing an AR application on Android and would like to get horizontal and vertical values, much like a spirit level, regardless of the device's roll orientation. For example, if a user holds their device in portrait mode and spins the phone, I would like the horizon drawn on the phone to match the natural horizon. I have played with the roll value returned from the SensorManager, but it seems to take pitch into account (i.e. once the device is in landscape mode, what should be pitch affects roll).
Also, when reading pitch, I would like the horizon to move up and down regardless of roll. At the moment, when the device has rolled to 90 degrees, any pitch change moves the horizon horizontally rather than vertically.
Any pointers?
Thanks in advance.
Paul
Yeah, I think your best bet, which I understand has its flaws, is using the accelerometer to determine the direction of the ground.
Use something like this....
mSensorManager = (SensorManager)getSystemService(SENSOR_SERVICE);
accelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
in your onCreate method, then put this
mSensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
in your onResume
and this to handle the updates
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float xAcceleration = event.values[0];
        float yAcceleration = event.values[1];
        float zAcceleration = event.values[2];
        // ... work out the ground direction from these values ...
    }
}
Then just use those acceleration values to determine the direction of the ground. :-)
I find that approach a lot more fluid; I hope that helps. :-)
For more info, check out:
http://developer.android.com/reference/android/hardware/SensorEvent.html#values
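As a rough sketch of that last step (my own addition, not part of the original answer): you can turn the gravity direction into spirit-level style angles with atan2, which stays stable whether the device is in portrait or landscape:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Rotation of the horizon in the screen plane: 0 when upright in
        // portrait, roughly +/-90 in landscape, independent of pitch.
        float horizonRollDeg = (float) Math.toDegrees(Math.atan2(x, y));
        // Tilt of the screen away from vertical, independent of roll.
        float pitchDeg = (float) Math.toDegrees(
                Math.atan2(z, Math.sqrt(x * x + y * y)));
    }
}
Remember to unregister the listener in onPause() so the sensor does not keep draining the battery.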
I am doing a demo in which I need to detect when the device is moved towards top-left, top-right, top-center, bottom-right, bottom-left, bottom-center, center-right or center-left, so I am looking at the accelerometer and gyroscope sensors. Am I going down the right track or not?
Do you have any demo or any link?
Yes, the accelerometer would be a good option to detect the motion of the phone.
You need to override the onSensorChanged event in your activity in order to read the accelerometer values.
public void onSensorChanged(SensorEvent sensorEvent) {
    Sensor mySensor = sensorEvent.sensor;
    if (mySensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float x = sensorEvent.values[0];
        float y = sensorEvent.values[1];
        float z = sensorEvent.values[2];
    }
}
Get started from here
Sensors Overview
Using the Accelerometer on Android
SensorManager
Also check out this page for a collection of Android sensor libraries.
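As a hedged sketch of how those values could then be bucketed into the regions mentioned in the question (the threshold and the sign interpretation are my own assumptions; verify them on your device):
// Classify the tilt into a 3x3 grid such as "top-left" or "center-right".
// Here "top" means the top edge is lifted (y grows positive) and "left"
// means the left edge is lifted (x grows negative); invert as needed.
private static final float TILT_THRESHOLD = 3.0f; // m/s^2, arbitrary

private String classifyTilt(float x, float y) {
    String vertical = y > TILT_THRESHOLD ? "top"
            : (y < -TILT_THRESHOLD ? "bottom" : "center");
    String horizontal = x < -TILT_THRESHOLD ? "left"
            : (x > TILT_THRESHOLD ? "right" : "center");
    if (vertical.equals("center") && horizontal.equals("center")) return "center";
    return vertical + "-" + horizontal;
}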
I need something very simple, but I could not find a suitable example to learn from. My sole purpose is the following:
As the device is placed flat (on its back) on the desk, it should show 0 (or close to 0) for the X and Y axes. When I lift it from the top part (where the speaker is) while the bottom part (where the microphone is) stays put, it should show me how many degrees the phone is tilted. Described mathematically: show, in degrees, the angle between the back of the phone and the table, for one of the axes. When I lift the bottom part (and the top part stays put), show negative degrees.
The same goes for the other axis - rotating the phone around its long sides.
I tried assembling an app from different examples, using the gyroscope, accelerometer or rotation vector sensors, but could not come up with something that works properly.
Can someone give me an example of the onSensorChanged function (as all the work goes on in here) and just tell me which sensor is used, so I know what to register?
There are a few examples and tutorials on the web, but be careful: Sensor.TYPE_ORIENTATION is deprecated. You need to calculate rotations by listening to two sensors instead, Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD.
The tricky part, after registering to receive notifications from these sensors, is to figure out how to handle the data received from them. The key part is the following:
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values.clone();      // clone: the event's array may be reused
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = event.values.clone();
    if (mGravity != null && mGeomagnetic != null) {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            azimuth = orientation[0]; // orientation contains: azimuth, pitch and roll
            pitch = orientation[1];
            roll = orientation[2];
        }
    }
}
This is how you should be calculating the azimuth, pitch and roll values of your device in the onSensorChanged(SensorEvent event) callback. Keep in mind that "All three angles above are in radians and positive in the counter-clockwise direction". You can simply convert them to degrees with Math.toDegrees().
As Louis CAD pointed out in the comments, it is a good idea to move the initialization of the I, R and orientation arrays out of the onSensorChanged callback, since it is called frequently; creating them there and leaving them behind for the GC is bad for your app's performance. I left them there for the sake of simplicity.
Based on how your device is rotated you might need to remap the coordinates to get the result you want. You can read more about remapCoordinateSystem, and also about getRotationMatrix and getOrientation, in the Android documentation.
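One commonly cited remapping, for a device held upright with the camera pointing at the horizon (as in an AR app), maps the device Z axis onto the new Y axis. A minimal sketch, assuming R already comes from getRotationMatrix():
// Remap for a device held upright, then read the orientation
// from the remapped matrix instead of the original one.
float[] remappedR = new float[9];
SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X,
        SensorManager.AXIS_Z, remappedR);
float[] orientation = new float[3];
SensorManager.getOrientation(remappedR, orientation);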
Example code:
http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html
Have a look at this simple CameraLevel app.
https://github.com/konstantinvoronov/camerahorizon_overlay
Instead of using the magnetic field and accelerometer sensors, camerahorizon_overlay uses only the accelerometer and works well.
Hi, I am creating an application in which the user holds the phone upright and then rotates it around the y axis (similar to taking a panorama).
I need to detect the angle of rotation. In iOS this was fairly simple with the gyroscope sensor, but I am not finding the same luck with Android. If anyone could point me in the right direction that would be great.
Assuming your Y axis points to the center of the earth, the value you are looking for is called azimuth.
To monitor its change you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
mngr = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or other
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
And within the listener, once you have computed the rotation matrix R (see the sketch after this snippet), call:
float[] values = new float[3];
SensorManager.getOrientation(R, values);
float current_azimuth_val = values[0]; // <----------
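For completeness, here is a rough sketch of how R could be built inside that listener from the two sensors; the field names lastAccel and lastMag are my own placeholders, not from the original answer:
private float[] lastAccel, lastMag; // latest readings from each sensor

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        lastAccel = event.values.clone();
    else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        lastMag = event.values.clone();

    if (lastAccel != null && lastMag != null) {
        float[] R = new float[9];
        if (SensorManager.getRotationMatrix(R, null, lastAccel, lastMag)) {
            float[] values = new float[3];
            SensorManager.getOrientation(R, values);
            float current_azimuth_val = values[0]; // radians
        }
    }
}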
Note that the quality and latency of the data you obtain are highly hardware dependent.
There are various sensors available that can be managed through a SensorManager. Of course, since every manufacturer decides whether or not to put a particular sensor on the hardware platform for a given model, you have to check whether the one you need exists. Some devices have a gyro like iOS devices do; on others the same job can be done with the accelerometer and magnetometer sensors instead.
You can get started here: http://developer.android.com/guide/topics/sensors/sensors_overview.html
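A small sketch of that availability check (the fallback strategy is just an assumption to illustrate the idea):
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
boolean hasGyro  = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE) != null;
boolean hasAccel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) != null;
boolean hasMag   = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) != null;
// Prefer the gyroscope if present; otherwise fall back to accelerometer + magnetometer.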
Currently, I'm trying to rotate a 3D cube using orientation sensor values obtained via the getRotation() method. Some unexpected behavior appears when the Android device is rotated beyond certain bounds. For instance, if I make the device 'stand up', the value of the 'roll' just goes crazy.
Also, I'm experiencing a phenomenon similar to so-called gimbal lock. The only difference is that I'm experiencing the problem even before applying the sensor values to the 3D rotation. When I try to change the 'pitch' value by rotating the device around only the 'pitch' axis, the 'yaw' value also changes along with the pitch rotation. It seems completely unreasonable to me.
Could somebody help me? I've been stuck on this problem for a month.
This is a common problem with yaw, pitch and roll. You cannot get rid of it as long as you are using yaw, pitch and roll (Euler angles). This video explains why.
I use rotation matrices instead of Euler angles in my motion sensing application. For an introduction to rotation matrices I recommend:
Direction Cosine Matrix IMU: Theory
Rotation matrices work like a charm.
Quaternions are also very popular and said to be the most stable.
[This answer was copied from here.]
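One way to apply that advice on Android (my own sketch, not from the answer above) is to skip Euler angles entirely and take a rotation matrix straight from the rotation vector sensor, then hand it to your 3D code:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        float[] rotationMatrix = new float[16]; // 4x4, ready for OpenGL
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        // Multiply rotationMatrix into your model-view matrix directly
        // instead of converting to yaw/pitch/roll first.
    }
}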
Using quaternions to compute yaw, pitch and roll won't do much to solve the problem. Gimbal lock remains: near a pitch of +/-90, slight changes or noise in the underlying quaternion can drive yaw and roll (actually yaw-roll at the north pole) to go crazy.
However, if you use yaw, pitch and roll values to perform a rotation of a 3D object, it shouldn't exhibit any odd behavior near the gimbal lock position. It's just that an ambiguity in yaw and roll arises, and large variations in yaw and roll do not imply that the actual orientation is going crazy -- only that the orientation is insensitive to large yaw-roll changes near a pitch of 90.
BUT, also note that phones and HTML5 browsers do not properly implement yaw, pitch and roll per the Android conventions. Here is a good blog post for reference:
http://www.sensorplatforms.com/understanding-orientation-conventions-mobile-platforms/
Here is a basic example; it gives you the acceleration vector, which at rest is just the gravity vector. Note that you can change the sensor type and the sampling rate; more details here.
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        double total = Math.sqrt(x * x + y * y + z * z); // magnitude, ~9.81 at rest
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}, sensor, SensorManager.SENSOR_DELAY_FASTEST);
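Note that the raw accelerometer also picks up hand motion, not just gravity. If you specifically want the gravity vector, a common trick is a simple low-pass filter (the smoothing factor below is an arbitrary assumption), or you can use Sensor.TYPE_GRAVITY where available:
// Sketch: isolate gravity from the raw accelerometer with a low-pass filter.
private static final float ALPHA = 0.8f; // smoothing factor, tune as needed
private final float[] gravity = new float[3];

private void updateGravity(float[] values) {
    for (int i = 0; i < 3; i++) {
        gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * values[i];
    }
}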
Well, if you are running on a phone:
Quaternions are the best, and you should use them.
With a rotation matrix or Euler angles you can easily run into the phenomenon called gimbal lock, which happens frequently under violent user motion.
Gimbal lock is the loss of one degree of freedom in a three-dimensional, three-gimbal mechanism that occurs when the axes of two of the three gimbals are driven into a parallel configuration, "locking" the system into rotation in a degenerate two-dimensional space.
Rotation matrices and Euler angles are fine for slow-moving robot motion.
For details on concatenating quaternions and converting points into a new coordinate system,
you can refer to the Wikipedia article:
https://en.wikipedia.org/wiki/Quaternion
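On Android you don't even have to build the quaternion yourself; a minimal sketch using the rotation vector sensor (my own example, not from the Wikipedia article):
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
        float[] q = new float[4]; // quaternion stored as [w, x, y, z]
        SensorManager.getQuaternionFromVector(q, event.values);
        // Concatenate with other quaternions or convert to a matrix as needed;
        // no gimbal lock along the way.
    }
}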
I am working on a project which includes an Android application that is used for controlling/steering:
Speed: When you tilt the phone forwards/backwards (pitch) it simulates accelerating and braking.
Direction: When you tilt the phone left/right (roll) it simulates steering to the left and right.
I have already written some code which seemed to work fine. But when I took a closer look, I found that some values are acting weird.
When I tilt the phone forwards/backwards to control the speed it works perfectly: I get the expected speed and direction values. But when I tilt the phone to the left/right to control the direction, it seems to corrupt some values. Tilting to the left/right doesn't only change the direction value (roll), it also affects the speed value (pitch).
For extra information:
Programming for Android 2.2
Device is a Google Nexus One
Holding the device in portrait
The most relevant code I use to read the sensor values is as follows:
public void onSensorChanged(SensorEvent sensorEvent)
{
    synchronized (this)
    {
        if (sensorEvent.sensor.getType() == Sensor.TYPE_ORIENTATION)
        {
            float azimuth = sensorEvent.values[0]; // azimuth: rotation around the z-axis
            float pitch = sensorEvent.values[1];   // pitch: rotation around the x-axis
            float roll = sensorEvent.values[2];    // roll: rotation around the y-axis
            System.out.println("pitch: " + pitch);
            System.out.println("roll: " + roll);
            System.out.println("--------------------");
            // Convert the sensor values to the actual speed and direction values
            float speed = (pitch * 2.222f) + 200;
            float direction = roll * -2.222f;
        }
    }
}
So when I run the code and look at the printed values, tilting the device left/right seems to affect the pitch value as well. How come? And how can I get the pure pitch value when rolling, so that tilting the phone to the left/right doesn't affect or corrupt the pitch value?
You could read up on Gimbal lock. That's bitten me before.
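If you want to sidestep the issue entirely, one option (a sketch assuming the phone is held roughly in portrait, not taken from the answer above) is to derive the two tilt angles directly from the gravity vector, which keeps the speed and direction values independent for moderate tilts:
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Forward/backward tilt (for speed): angle of the screen away from vertical.
        double speedTiltDeg = Math.toDegrees(Math.atan2(z, Math.sqrt(x * x + y * y)));
        // Left/right tilt (for direction): rotation of gravity in the screen plane.
        double directionTiltDeg = Math.toDegrees(Math.atan2(x, y));
    }
}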