rotate 3D Object (Spatial) with compass - android

I've generated a 3D overlay using jMonkey in my Android app. Everything works fine - my ninja model is walking in a loop. Awesome!
Now I want to rotate the camera according to the direction of the phone. I thought the compass was the best way, BUT unfortunately I'm having a lot of problems. So here I go.
I've created a method that is invoked in the activity:
public void rotate(float x, float y, float z) {
    Log.d(TAG, "simpleUpdate: new rotation: " + y);
    newX = x;
    newY = y;
    newZ = z;
    newPosition = true;
}
In the 'simpleUpdate' method I've managed it this way:
if (newPosition && ninja != null) {
    Log.d(TAG, "simpleUpdate: rotation: " + newY);
    ninja.rotate((float) Math.toRadians(newX), (float) Math.toRadians(newY), (float) Math.toRadians(newZ));
    newPosition = false;
}
In my activity I'm checking whether the phone has moved:
if (lastAzimuth != (int) azimuthInDegress) {
    lastAzimuth = (int) azimuthInDegress; // I cast to int so the distortion isn't such a big problem
    if ((com.fixus.towerdefense.model.SuperimposeJME) app != null) {
        ((com.fixus.towerdefense.model.SuperimposeJME) app).rotate(0f, azimuthInDegress, 0f);
    }
}
At the moment I want to rotate it only around the Y axis.
Now the main problem is that the rotation is more like a jump than a rotation. When I move my phone a bit and get a 6-degree difference (I can see this in my log), the model rotates by what looks like 90 degrees and then turns back. This has nothing to do with the rotation or the change read from my compass.
Any ideas?
UPDATE
I think I got it. The rotate method rotates from the current state by the value I pass in, so it ends up being the old Y rotation + the new value. So now I'm passing the difference between the current value and the old value, and it looks almost fine. Is this the right way?
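Something along these lines, reusing the fields from the snippets above, is what that delta approach looks like (just a sketch, not the exact code):

// Spatial.rotate() is relative to the model's current orientation,
// so pass only the change in azimuth since the last reading.
float delta = azimuthInDegress - lastAzimuth;
lastAzimuth = (int) azimuthInDegress;
if ((com.fixus.towerdefense.model.SuperimposeJME) app != null) {
    ((com.fixus.towerdefense.model.SuperimposeJME) app).rotate(0f, delta, 0f);
}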

Related

Compass - Track number of full 360 degree rotations

Suppose a person is using this compass, and beginning from 90 degrees they start rotating either clockwise or counterclockwise. What's the best way to keep count of how many full 360 degree rotations they complete? Assuming they'll be rotating either only clockwise or only counterclockwise from beginning to end.
I kept coming up with solutions where if the beginning bearing is, for example, 90 degrees I keep checking the next bearing when the sensor data changes, and if it's consistently moving in one direction I know they're rotating. And if they keep rotating in that direction and make it back to 90 degrees, that counts as one rotation. My way seems very convoluted and inefficient and I'm having a hard time coming up with a better way.
In this scenario, I'd be expecting multiple full rotations.
I'd appreciate any help. Thank you!
I found this related answer and am trying to put together a code sample for that. If someone has already done something similar, please post it!
@Override
public void onSensorChanged(SensorEvent event)
{
    switch (event.sensor.getType())
    {
        case Sensor.TYPE_GRAVITY:
        {
            mValuesAccelerometer = lowPass(event.values.clone(), mValuesAccelerometer);
            break;
        }
        case Sensor.TYPE_MAGNETIC_FIELD:
        {
            mValuesMagneticField = lowPass(event.values.clone(), mValuesMagneticField);
            break;
        }
    }
    boolean success = SensorManager.getRotationMatrix(
            mMatrixR,
            mMatrixI,
            mValuesAccelerometer,
            mValuesMagneticField);
    if (success)
    {
        SensorManager.getOrientation(mMatrixR, mMatrixValues);
        float azimuth = toDegrees(mMatrixValues[0]);
        float pitch = toDegrees(mMatrixValues[1]);
        float roll = toDegrees(mMatrixValues[2]);
        if (azimuth < 0.0d)
        {
            // The bearing in degrees
            azimuth += 360.0d;
        }
    }
}
If you're sure that they'll be moving in only one direction, you can optimize your code by using checkpoints at fixed degrees instead of continuously monitoring whether they're still moving in the right direction.
Here's a rough algorithm to do that:
// You noted 90 degrees as the starting point.
// Checkpoint 1 will be 180; keep it as a boolean.
// Once the bearing reaches 180, wait for the next checkpoint, which is 270.
// If the bearing gets back to 180 before reaching 270, set the 180 flag to false - it means they turned back.
// If they make it to 270, then wait for 0 degrees and do the same.
// If they make it back to 90 like that, you've got a rotation, and hopefully
// a bit of complexity is reduced since you're only checking 4 checkpoints.
I don't have any code handy at the moment.
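A minimal sketch of that checkpoint idea in Java (field names are placeholders; clockwise only, starting from 90 degrees, with the azimuth assumed to be in the 0-360 range):

// Hypothetical checkpoint-based rotation counter (clockwise only, start bearing 90).
private final int[] checkpoints = {180, 270, 0, 90}; // visited in this order after starting at 90
private int nextCheckpoint = 0;
private int rotationCount = 0;

void onAzimuth(float azimuthDegrees) {
    // Consider a checkpoint "hit" when the bearing is within a small window of it.
    if (Math.abs(angleDiff(azimuthDegrees, checkpoints[nextCheckpoint])) < 10) {
        nextCheckpoint++;
        if (nextCheckpoint == checkpoints.length) { // back at 90: one full turn
            rotationCount++;
            nextCheckpoint = 0;
        }
    }
}

// Smallest signed difference between two bearings, in degrees.
private float angleDiff(float a, float b) {
    float d = (a - b) % 360;
    if (d > 180) d -= 360;
    if (d <= -180) d += 360;
    return d;
}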
This is a tracking problem with a reading that wraps around. You need to keep track of the last reading and hope the user doesn't do more than half a turn between readings... (because of the Nyquist theorem).
Here is the basic pseudocode:
var totalChange = 0;
var lastAzimuth = -1000;

function CountTurns(az)
{
    if (az > 180) az -= 360;   // do this if your azimuth is always positive, i.e. 0-360
    if (lastAzimuth == -1000)
    {
        lastAzimuth = az;      // first reading: initialize
    }
    diff = az - lastAzimuth;
    if (diff > 180)
        diff -= 360;
    if (diff < -180)
        diff += 360;
    lastAzimuth = az;
    totalChange += diff;
    return totalChange / 360;  // number of full turns (signed)
}
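The same thing in Java, as a rough sketch (the field names are mine):

// Tracks cumulative rotation across the 0/360 wrap-around.
private float totalChange = 0f;
private Float lastAzimuth = null;   // null until the first reading arrives

// Returns the number of completed full turns (negative = counterclockwise).
int countTurns(float azimuthDegrees) {
    float az = azimuthDegrees > 180 ? azimuthDegrees - 360 : azimuthDegrees; // map 0-360 to -180..180
    if (lastAzimuth == null) {
        lastAzimuth = az;
    }
    float diff = az - lastAzimuth;
    if (diff > 180) diff -= 360;    // unwrap the jump across the +/-180 boundary
    if (diff < -180) diff += 360;
    lastAzimuth = az;
    totalChange += diff;
    return (int) (totalChange / 360f);
}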
Create 3 integers:

int rotationCount = 0;
int currentDegrees = 0;
int previousDegrees = 89;

I'm not a Java programmer, so I don't know how you handle the onSensorChanged event, but basically perform a check within a while loop:
// currentDegrees is assumed to be refreshed from the sensor on each pass
while (currentDegrees + 90 < 360)
{
    if (currentDegrees + 90 == 0)
    {
        if (previousDegrees == 359)
        {
            rotationCount = rotationCount + 1;   // wrapped around one way: count a turn
        }
    }
    else if (currentDegrees + 90 == 359)
    {
        if (previousDegrees == 0)
        {
            rotationCount = rotationCount - 1;   // wrapped the other way: uncount a turn
        }
    }
    previousDegrees = currentDegrees + 90;
}

Sorry about the syntax; this is just an example of how to do it.
Visualize what I will say and you'll definitely hit your goal in no time.
You don't need to think in terms of the full 360 degrees; you can take half of that and use the sign differences to your advantage.
Take a look at this figure:
We have a circle that is divided into two sides (left and right).
The left side takes negative degrees up to 180 (the West side).
The right side takes positive degrees up to 180 (the East side).
The current position will always be 0 (North), and 180 is South.
IF the compass goes positive (meaning it turns to the right), then add +1 on each step.
IF the compass goes negative (meaning it turns to the left), then subtract 1 on each step.
IF the compass hits or is at 0, then it's the current position (North).
IF the compass hits or is at 90, then it's East.
IF the compass hits or is at 180, then it's South.
IF the compass hits or is at -90, then it's West.
This works out so that whenever the person rotates East, the counter adds +1 until the reading reaches 180; there the sign flips from positive to negative, and it counts down on each step until it reaches 0 again. That would be one full 360-degree rotation.
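A rough sketch of that idea, assuming the azimuth is reported in the (-180, 180] range with 0 = North, and that the user keeps turning clockwise without reversing:

// One clockwise turn = reach South (where the sign flips at +/-180),
// then come back around to North.
private boolean crossedSouth = false;
private int rotations = 0;

void onAzimuth(float az) {
    if (!crossedSouth && az > 170) {
        crossedSouth = true;          // near +180: about to flip into the negative (West) half
    } else if (crossedSouth && Math.abs(az) < 10) {
        rotations++;                  // back near 0 (North) after the flip: one full turn
        crossedSouth = false;
    }
}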

Move an image based on phone movement (ROTATION_VECTOR)

I'm very new to working with the sensors on an Android device. I have the following code (which I obtained from various tutorials).
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        float[] orientationVals = new float[3];
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);
        txtAzimuth.setText("Azimuth : " + d.format(orientationVals[0]) + '°');
        txtPitch.setText("Pitch : " + d.format(orientationVals[1]) + '°');
        txtRoll.setText("Roll : " + d.format(orientationVals[2]) + '°');
    }
}
Okay, so I've got these values. How do I use them to move an image (a marker of sorts) based on the movement of the phone? For example, if I move the phone to the right, the image should move to the left, and when the phone is later moved left, the image should move right.
I've done a lot of googling, and what I've found is mostly OpenGL stuff.
Like this:
OpenGl world using camera to find object
This question asks what I want, with the exception of the object moving randomly. I want the object's position to be decided by me.
I don't want to use OpenGL for this as it would be overkill for my app. Could I somehow map the values produced by the sensors to moving the image?
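One possible direction (just a hedged sketch, not from the original answers): treat the change in azimuth as a horizontal offset and apply it to the ImageView, inverting the sign so the image moves opposite to the phone. imageView, lastAzimuthDeg and PIXELS_PER_DEGREE are placeholder names you would define yourself.

// Shift the marker opposite to the phone's rotation about the vertical axis.
float azimuthDeg = orientationVals[0];                 // from the snippet above
float deltaDeg = azimuthDeg - lastAzimuthDeg;
if (deltaDeg > 180) deltaDeg -= 360;                   // unwrap the 0/360 jump
if (deltaDeg < -180) deltaDeg += 360;
lastAzimuthDeg = azimuthDeg;
imageView.setTranslationX(imageView.getTranslationX() - deltaDeg * PIXELS_PER_DEGREE);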

Change MapView orientation

I want to change the OSMdroid MapView orientation to face the direction the user is going (calculated with Location.bearingTo between the previous and current user location on each onLocationChanged, and converted into normal degrees instead of the -180/180° East of True North degrees).
This direction is correct: I'm rotating an arrow image towards this direction and it points the right way without fail.
However, when I want to orientate the MapView to this userDirection using the setMapOrientation method (documented here), it isn't working as I want it to. When I orientate the map towards the user's direction, the arrow image should always point north, right? Because this is what I want to achieve: to make it seem like the arrow is always pointing forward (like a GPS tracker: your location on GPS is always represented by an icon going forward; my arrow is pointing in all kinds of directions because the map orientation is wrong).
I'm guessing the osmdroid MapView orientation expects another sort of degree value, but I've tried converting back to East of True North degrees and it didn't work. Or my logic is completely wrong and it is working correctly.
How do I set the orientation for the MapView so that it is always facing the user's current direction, so that the arrow is always pointing forward (and not going backwards, right or left, ...)?
I think what you are referring to is the "True North" orientation of the map using the compass True North. For this you need the device compass or a sensor listener to get the heading; after getting the heading you need to set it on the MapView. Here is a snippet which is very helpful:
private void compassHeadingUp() {
    if (enableCompassHeadUp) {
        mSensorManager.registerListener(mySensorEventListener,
                SensorManager.SENSOR_ORIENTATION,
                SensorManager.SENSOR_DELAY_FASTEST);
    } else {
        mSensorManager.unregisterListener(mySensorEventListener);
        mDirection = 0;
    }
}

public SensorListener mySensorEventListener = new SensorListener() {
    @Override
    public void onAccuracyChanged(int arg0, int arg1) {
    }

    @Override
    public void onSensorChanged(int sensor, float[] values) {
        synchronized (this) {
            float mHeading = values[0];
            if (Math.abs(mDirection - mHeading) > Constance.ROTATION_SENSITIVITY) {
                mMapView.setMapOrientation(-mHeading);
                mDirection = mHeading;
            }
            Matrix matrix = new Matrix();
            mCompassImageView.setScaleType(ScaleType.MATRIX);
            matrix.postRotate((float) -mHeading,
                    mCompassImageView.getDrawable().getBounds().width() / 2,
                    mCompassImageView.getDrawable().getBounds().height() / 2);
            // Set your arrow ImageView to the matrix
            mCompassImageView.setImageMatrix(matrix);
        }
    }
};
I have solved this issue by inverting the degrees like this:
float bearing = location.getBearing();
float t = (360 - bearing);   // invert the bearing
if (t < 0) {
    t += 360;
}
if (t > 360) {
    t -= 360;
}
// Help smooth everything out by snapping to the nearest 5 degrees
t = (int) t;
t = t / 5;
t = (int) t;
t = t * 5;
mapOSM.setMapOrientation(t);

Android SensorManager.getOrientation before event triggered

I am working on an app for a tablet where the tablet will be in a stand which rotates about a given point. That means very little movement from the tablet when the app is first opened. The app will display various things depending on the rotation of the tablet in the stand. What I cannot seem to find, no matter how hard I search, is how to get the initial orientation in degrees or radians, comparable to the value of OrientationEventListener.onOrientationChanged(int orientation). I have an OrientationEventListener configured, and my idea was that during the default Activity's onCreate method I would manually call the OrientationEventListener with an updated value.
Demo code that doesn't run by itself, but illustrates the point:
// No problems at this point in the code; the OrientationManager class,
// which extends OrientationEventListener, properly receives onOrientationChanged
// notices.
OrientationEventListener orientationEventListener;
orientationEventListener = new OrientationManager(
        this,
        SensorManager.SENSOR_DELAY_NORMAL
);
orientationEventListener.enable();

// Here's where the problem is... need to get the current orientation
// and "initialize" the OrientationEventListener.
float[] coords = new float[3];
float[] rotation = new float[16];
SensorManager.getOrientation(rotation, coords);

// Test for correct coords values (logs "0.0, -0.0, -0.0")
Log.d("DEBUG", Float.toString(coords[0]) + ", "
        + Float.toString(coords[1]) + ", "
        + Float.toString(coords[2])
);

// Goal is to be able to call orientationEventListener.onOrientationChanged(??)
// with the value that would be sent if the event was called naturally.
Now, have I submitted a question that's already been asked and answered? I've looked at so many SO posts and haven't gotten anywhere.
Take a look at the getOrientationUsingGetRotationMatrix() method from DeviceOrientationService (Link):
private void getOrientationUsingGetRotationMatrix() {
    if (mGravityVector == null || mMagneticFieldVector == null) {
        return;
    }

    // Get the rotation matrix.
    // The rotation matrix that transforms from the body frame to the earth frame.
    float[] deviceRotationMatrix = new float[9];
    if (!SensorManager.getRotationMatrix(
            deviceRotationMatrix, null, mGravityVector, mMagneticFieldVector)) {
        return;
    }

    // Convert rotation matrix to rotation angles.
    // Assuming that the rotations are applied in the order listed at
    // http://developer.android.com/reference/android/hardware/SensorEvent.html#values
    // the rotations are applied about the same axes and in the same order as required by the
    // API. The only conversions are sign changes as follows.
    // The angles are in radians.
    float[] rotationAngles = new float[3];
    SensorManager.getOrientation(deviceRotationMatrix, rotationAngles);
    double alpha = Math.toDegrees(-rotationAngles[0]);
    while (alpha < 0.0) { alpha += 360.0; } // [0, 360)
    double beta = Math.toDegrees(-rotationAngles[1]);
    while (beta < -180.0) { beta += 360.0; } // [-180, 180)
    double gamma = Math.toDegrees(rotationAngles[2]);
    while (gamma < -90.0) { gamma += 360.0; } // [-90, 90)

    maybeSendChange(alpha, beta, gamma);
}
Check that rotation indeed holds valid values (in the demo code above it is never populated, so getOrientation only returns zeros) and that getRotationMatrix(...) is returning true.
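As a rough sketch of that approach (assuming mGravity and mGeomagnetic are fields you cache from a SensorEventListener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD):

// Compute an initial reading once both sensor values have arrived,
// instead of calling getOrientation() on a matrix that was never filled in.
float[] rotation = new float[9];
float[] coords = new float[3];
if (mGravity != null && mGeomagnetic != null
        && SensorManager.getRotationMatrix(rotation, null, mGravity, mGeomagnetic)) {
    SensorManager.getOrientation(rotation, coords);
    int azimuthDeg = (int) Math.toDegrees(coords[0]);  // coords[0] is the azimuth in radians
    if (azimuthDeg < 0) azimuthDeg += 360;
    // Feed whichever angle your OrientationManager actually expects:
    orientationEventListener.onOrientationChanged(azimuthDeg);
}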

How to detect left and right tilt of an android device mounted with an accelerometer?

Let's say you have the acceleration readings in all 3 dimensions, i.e. X, Y and Z. How do you infer from the readings whether the phone was tilted left or right? The readings are generated every 20 ms.
I actually want the logic of inferring the tilt from the readings. The tilt needs to be smooth.
A tilt can be detected in a number of different ways. You can take into account 1 axis, 2 axes, or all 3, depending on how accurate you want it to be and how much you feel like fighting with the maths.
If you use only one axis, it is quite simple. Think of the phone lying completely horizontal, and you move it like this:
Using just one axis, let's say the x axis, will be enough, since you can accurately detect a change on that axis; even a small movement produces a change in the reading.
But if your application only reads that axis and the user holds the phone almost vertical, the difference on the x axis will be really small even when rotating the phone through a big angle.
Anyway, for applications that only need coarse resolution, a single axis can be used.
Referring to basic trigonometry, the projection of the gravity vector on the x-axis produces an output acceleration equal to the sine of the angle between the accelerometer x-axis and the horizon.
This means that, having the values of one axis (these are acceleration values), you can calculate the angle at which the device is held.
In other words, the value given to you by the sensor equals 9.8 * sin(angle), so doing the maths you can get the actual angle.
But don't worry, you don't even have to do this. Since the values are more or less proportional, as shown in the table from the article linked below, you can work directly with the sensor value without caring much about which angle it represents, as long as you don't need great accuracy; a change in that value means a proportional change in the angle, so with a few tests you will find out how big the change needs to be to matter to you.
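If you do want the angle itself, here is a small sketch of that conversion (ax is my placeholder for the raw x-axis accelerometer reading):

// Recover the tilt angle about the x axis from the raw reading (in m/s^2).
float clamped = Math.max(-SensorManager.GRAVITY_EARTH,
        Math.min(SensorManager.GRAVITY_EARTH, ax));     // guard against |ax| > g from shaking
double tiltDegrees = Math.toDegrees(Math.asin(clamped / SensorManager.GRAVITY_EARTH));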
So, if you take the value over time and compare the readings to each other, you can figure out how big the rotation was. To do that (a sketch follows after these steps):
1. Consider just one axis; this will be the X axis.
2. Write a function that returns the difference in the sensor value for that axis between one call and the next.
3. Decide on a maximum time and a minimum sensor difference that you will consider a valid movement (e.g. a big rotation is good, but only if it is fast enough, and a fast movement is good only if the change in angle is big enough).
4. If you detect two measurements that satisfy those conditions, take note that half a tilt is done (in a boolean, for instance) and start measuring again, but now the new reference value is the value that was considered half a tilt.
5. If the last difference was positive, you now need a negative difference, and if it was negative, you now need a positive one; this is the phone coming back. So compare the new reference value with the new values coming from the sensor and see whether one satisfies what you decided in point 3.
6. If you find a valid value (satisfying the value-difference and time conditions), you have a tilt. But if you don't get a good value and the time runs out, reset everything: let your reference value be the last one, reset the timers, set the half-tilt-done boolean back to false, and keep measuring.
I hope this is good enough for you. For sure you can find libraries or code snippets to help you out with this, but I think it is good, as you say, to know the logic of inferring the tilt from the readings.
The pictures were taken from this article, which I recommend reading if you want to improve the accuracy and consider 2 or 3 axes for the tilt.
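Here is a rough sketch of that single-axis logic; the threshold values are made-up placeholders you would tune yourself:

// Single-axis tilt detector following the steps above.
private static final float MIN_DELTA = 3.0f;       // minimum change in the x reading (placeholder)
private static final long MAX_INTERVAL_MS = 500;   // maximum time allowed per half tilt (placeholder)

private float referenceX;           // reading we compare against
private long referenceTime;
private boolean halfTiltDone = false;
private float halfTiltSign = 0f;    // sign of the first half's movement

// Call with each new x-axis value; returns true when a full tilt (out and back) is detected.
boolean onXReading(float x, long nowMs) {
    float delta = x - referenceX;
    boolean fastEnough = (nowMs - referenceTime) <= MAX_INTERVAL_MS;

    if (!halfTiltDone) {
        if (Math.abs(delta) >= MIN_DELTA && fastEnough) {
            halfTiltDone = true;            // step 4: half a tilt done
            halfTiltSign = Math.signum(delta);
            referenceX = x;                 // the half-tilt value becomes the new reference
            referenceTime = nowMs;
        } else if (!fastEnough) {
            referenceX = x;                 // too slow: reset the reference (step 6)
            referenceTime = nowMs;
        }
        return false;
    }

    // Step 5: now we need movement of the opposite sign (coming back).
    if (Math.abs(delta) >= MIN_DELTA && Math.signum(delta) == -halfTiltSign && fastEnough) {
        halfTiltDone = false;
        referenceX = x;
        referenceTime = nowMs;
        return true;                        // full tilt detected
    }
    if (!fastEnough) {                      // step 6: time ran out, reset everything
        halfTiltDone = false;
        referenceX = x;
        referenceTime = nowMs;
    }
    return false;
}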
The commonsware Sensor Monitor app does a pretty good job with this. It converts the sensor readouts to X, Y, Z values on each sensor reading, so it's pretty easy from there to determine which way the device is moving.
https://github.com/commonsguy/cw-omnibus/tree/master/Sensor/Monitor
Another item worth noting (from the Commonsware book): there are four standard delay periods, defined as constants on the SensorManager class:
SENSOR_DELAY_NORMAL, which is what most apps would use for broad changes, such as detecting a screen rotating from portrait to landscape
SENSOR_DELAY_UI, for non-game cases where you want to update the UI continuously based upon sensor readings
SENSOR_DELAY_GAME, which is faster (less delay) than SENSOR_DELAY_UI, to try to drive a higher frame rate
SENSOR_DELAY_FASTEST, which is the "firehose" of sensor readings, without delay
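For example, a minimal sketch of registering an accelerometer listener at one of these rates (context and listener stand for your own Context and SensorEventListener):

// Register for accelerometer updates at the UI-oriented rate.
SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
if (accel != null) {
    sm.registerListener(listener, accel, SensorManager.SENSOR_DELAY_UI);
}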
You can use the accelerometer and magnetic field sensor to accomplish this. You can call this method from your onSensorChanged method to detect whether the phone was tilted upwards. This currently only works if the phone is held horizontally. Check the actual blog post for a more complete solution:
http://www.ahotbrew.com/how-to-detect-forward-and-backward-tilt/
public boolean isTiltUpward()
{
    if (mGravity != null && mGeomagnetic != null)
    {
        float R[] = new float[9];
        float I[] = new float[9];
        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success)
        {
            float orientation[] = new float[3];
            SensorManager.getOrientation(R, orientation);
            /*
             * If the roll is positive, you're in reverse landscape (landscape right),
             * and if the roll is negative you're in landscape (landscape left).
             *
             * Similarly, you can use the pitch to differentiate between portrait and reverse portrait.
             * If the pitch is positive, you're in reverse portrait, and if the pitch is negative you're in portrait.
             *
             * orientation -> azimuth, pitch and roll
             */
            pitch = orientation[1];
            roll = orientation[2];
            inclineGravity = mGravity.clone();
            double norm_Of_g = Math.sqrt(inclineGravity[0] * inclineGravity[0]
                    + inclineGravity[1] * inclineGravity[1]
                    + inclineGravity[2] * inclineGravity[2]);

            // Normalize the accelerometer vector
            inclineGravity[0] = (float) (inclineGravity[0] / norm_Of_g);
            inclineGravity[1] = (float) (inclineGravity[1] / norm_Of_g);
            inclineGravity[2] = (float) (inclineGravity[2] / norm_Of_g);

            // Checks if device is flat on ground or not
            int inclination = (int) Math.round(Math.toDegrees(Math.acos(inclineGravity[2])));

            /*
             * Float obj1 = new Float("10.2");
             * Float obj2 = new Float("10.20");
             * int retval = obj1.compareTo(obj2);
             *
             * if (retval > 0) {
             *     System.out.println("obj1 is greater than obj2");
             * } else if (retval < 0) {
             *     System.out.println("obj1 is less than obj2");
             * } else {
             *     System.out.println("obj1 is equal to obj2");
             * }
             */
            Float objPitch = new Float(pitch);
            Float objZero = new Float(0.0);
            Float objZeroPointTwo = new Float(0.2);
            Float objZeroPointTwoNegative = new Float(-0.2);

            int objPitchZeroResult = objPitch.compareTo(objZero);
            int objPitchZeroPointTwoResult = objZeroPointTwo.compareTo(objPitch);
            int objPitchZeroPointTwoNegativeResult = objPitch.compareTo(objZeroPointTwoNegative);

            if (roll < 0
                    && ((objPitchZeroResult > 0 && objPitchZeroPointTwoResult > 0)
                        || (objPitchZeroResult < 0 && objPitchZeroPointTwoNegativeResult > 0))
                    && (inclination > 30 && inclination < 40))
            {
                return true;
            }
            else
            {
                return false;
            }
        }
    }
    return false;
}
Is this what you're looking for?
public class AccelerometerHandler implements SensorEventListener
{
    float accelX;
    float accelY;
    float accelZ;

    public AccelerometerHandler(Context paramContext)
    {
        // "sensor" is Context.SENSOR_SERVICE; sensor type 1 is Sensor.TYPE_ACCELEROMETER,
        // and the rate 1 is SensorManager.SENSOR_DELAY_GAME
        SensorManager localSensorManager = (SensorManager) paramContext.getSystemService("sensor");
        if (localSensorManager.getSensorList(1).size() != 0)
            localSensorManager.registerListener(this, (Sensor) localSensorManager.getSensorList(1).get(0), 1);
    }

    public float getAccelX()
    {
        return this.accelX;
    }

    public float getAccelY()
    {
        return this.accelY;
    }

    public float getAccelZ()
    {
        return this.accelZ;
    }

    public void onAccuracyChanged(Sensor paramSensor, int paramInt)
    {
    }

    public void onSensorChanged(SensorEvent paramSensorEvent)
    {
        this.accelX = paramSensorEvent.values[0];
        this.accelY = paramSensorEvent.values[1];
        this.accelZ = paramSensorEvent.values[2];
    }
}
