Android camera affects sensors (Accelerometer & Magnetic Field) while phone faces user

For an application I'm making, I need a camera and a compass. The application is locked to landscape mode in the manifest.
First I implemented the compass. As suggested in the Android Developers documentation, I used two sensors - Accelerometer and Magnetic Field. This is how I've done it:
I have my activity implement SensorEventListener. In onCreate() I initialize my sensorManager using:
sManager = (SensorManager) getSystemService(SENSOR_SERVICE);
I register my listeners in onResume() like so:
sManager.registerListener(this, sManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),SensorManager.SENSOR_DELAY_NORMAL);
sManager.registerListener(this, sManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),SensorManager.SENSOR_DELAY_NORMAL);
and of course unregister them in onPause().
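For reference, a minimal sketch of the matching onPause() (assuming the same sManager field as above):
@Override
protected void onPause() {
    super.onPause();
    // stop sensor updates while the activity is in the background
    sManager.unregisterListener(this);
}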
I don't use onAccuracyChanged(). This is what I do in onSensorChanged():
@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mags = event.values.clone();
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = event.values.clone();
            break;
    }
    if (mags != null && accels != null) {
        gravity = new float[9];
        magnetic = new float[9];
        SensorManager.getRotationMatrix(gravity, magnetic, accels, mags);
        float[] outGravity = new float[9];
        float inclination = (float) Math.acos(gravity[8]);
        if (inclination < Math.toRadians(25)
                || inclination > Math.toRadians(155)) {
            // device is close to flat. Remap for landscape.
            SensorManager.remapCoordinateSystem(gravity, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outGravity);
            SensorManager.getOrientation(outGravity, values);
        } else {
            // device is not flat. Remap for landscape and perpendicular.
            SensorManager.remapCoordinateSystem(gravity, SensorManager.AXIS_X, SensorManager.AXIS_Z, outGravity);
            SensorManager.getOrientation(outGravity, values);
        }
        azimuth = Math.round(Math.toDegrees(values[0]));
    }
}
As you can see, I differentiate between when the phone is lying flat on the table and when the user holds it up (as you would when taking a picture). When I use this code alone, everything works more or less as expected: I get correct azimuth values both when the phone is lying on the table and when it is held perpendicular to the table (about a 5-10 degree difference, but I can live with that).
The problem starts when adding the camera preview to the application.
I have my activity implement SurfaceHolder.Callback. I initialize my camera in onCreate():
SurfaceView cameraView = (SurfaceView) findViewById(R.id.camera_view);
surfaceHolder = cameraView.getHolder();
surfaceHolder.addCallback(this);
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.HONEYCOMB) {
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
This is how I implement the interface:
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
    camera = Camera.open();
    camera.setDisplayOrientation(0);
}

@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
    if (isCameraOn) {
        camera.stopPreview();
        isCameraOn = false;
    }
    if (camera != null) {
        try {
            camera.setPreviewDisplay(surfaceHolder);
            camera.startPreview();
            isCameraOn = true;
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
    camera.stopPreview();
    camera.release();
    camera = null;
}
When I add the camera code to my project and show the camera preview on the phone's screen, my sensors suddenly stop working properly when the phone is held perpendicular. If the phone is lying flat on the table, the azimuth values I'm getting are correct. When the phone is held perpendicular to the table, my azimuth values are off by about 40 degrees (though stable).
I've tried looking for a solution (both by myself and online), but so far my efforts have been in vain. I would love to get some direction on how to tackle this problem. Thanks!

First, the TYPE_MAGNETIC_FIELD sensor is not available on all devices.
You can use the TYPE_ACCELEROMETER sensor alone to accomplish your requirement.
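As a side note, getDefaultSensor() returns null when a sensor is missing, so you can check for the magnetometer up front (a minimal sketch, assuming the same mSensorManager as below):
if (mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD) == null) {
    // no magnetometer on this device; fall back to the accelerometer-only approach below
}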
Retrieve the accelerometer sensor:
Sensor accelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Copy the values when the sensor change event fires:
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values.clone(); // copy, because the framework reuses the event.values array
}
Then you can use the function below to get the angles for all axes.
public int[] getDeviceAngles() {
    float[] g = mGravity.clone();
    double normOfG = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2]);
    // Normalize the accelerometer vector
    g[0] = (float) (g[0] / normOfG);
    g[1] = (float) (g[1] / normOfG);
    g[2] = (float) (g[2] / normOfG);
    int x = (int) Math.round(Math.toDegrees(Math.atan2(g[1], g[0])));
    int pitch = (int) Math.round(Math.toDegrees(Math.atan2(g[1], g[2])));
    int rollValue = (int) Math.round(Math.toDegrees(Math.atan2(g[2], g[0])));
    int pitchValue = pitch * -1;
    int[] values = new int[3];
    values[0] = x;
    values[1] = pitchValue;
    values[2] = rollValue;
    // values contains: azimuth, pitch and roll
    return values;
}
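For example, usage might look like this (a sketch; it assumes mGravity has already been filled in onSensorChanged()):
if (mGravity != null) {
    int[] angles = getDeviceAngles();
    int azimuth = angles[0];
    int pitch = angles[1];
    int roll = angles[2];
}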

Related

Get phone angle (pitch)

I really hope someone is able to help me:
I am working on an Android app that displays the angle at which the phone is held.
I am using this code:
protected void onCreate(Bundle savedInstanceState) {
    mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    mAccelerometer = mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    mMagnetometer = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
}

protected void onResume() {
    super.onResume();
    mSensorManager.registerListener(this, mAccelerometer, 100000);
    mSensorManager.registerListener(this, mMagnetometer, 100000);
}
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = lowPass(event.values.clone(), mGravity);
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = lowPass(event.values.clone(), mGeomagnetic);
    if (mGravity != null && mGeomagnetic != null) {
        boolean success = SensorManager.getRotationMatrix(Ra, Ia, mGravity, mGeomagnetic);
        if (success) {
            SensorManager.getOrientation(Ra, orientation);
            float tempxGy = (orientation[0]); // azimuth
            float tempyGy = (orientation[1]); // pitch
            float tempzGy = (orientation[2]); // roll
        }
    }
}
I'm further processing the values I get here, but that's not the problem.
When the phone is flat on the table the values look like this:
roll -0.018
pitch 0.024
yaw 2.51
That looks fine to me (also if I convert the values to degrees).
Now imagine that I lift the phone off the table: the bottom edge (where the micro USB port is) stays on the table and I raise the side with the camera.
The pitch values get more and more negative until the pitch reaches -1.53 (-90 when converted to degrees). As soon as this value is reached, the values increase again, and once the phone lies on the table with the display facing down, the pitch is back to 0.
The problem is that I have to differentiate between the two positions that produce the same value (for example, a pitch of -1 with the display facing up and -1 with it facing down).
How can I do that? The roll isn't a problem, as the roll covers a 180 degree range and doesn't have this 90 degree problem.
I really hope you can help me!
So
if (mGravity[2] < 0) orientation[1] = (float) (Math.PI + orientation[1]);
is enough to get values the way I need them. But now I have a new problem: I would like to remap the coordinate system so that I get 0 degrees when the phone is lying on the table and +90 degrees when the phone is standing on its micro USB port.
I tried
SensorManager.remapCoordinateSystem(Ra, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, Ri);
but now the roll and pitch are swapped. What would I have to do to simply get positive pitch values as long as the display is facing up, and negative values when the display is facing down?
Is the remap the wrong idea? Could I do this more simply some other way?
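For illustration, here is how the sign flip from the edit above might sit inside onSensorChanged() (a sketch reusing the question's Ra, orientation and mGravity fields; it does not answer the remapping question):
if (success) {
    SensorManager.getOrientation(Ra, orientation);
    // mGravity[2] < 0 means the display is facing down, so shift the pitch
    // so that face-up and face-down positions no longer share the same value
    if (mGravity[2] < 0) {
        orientation[1] = (float) (Math.PI + orientation[1]);
    }
    double pitchDegrees = Math.toDegrees(orientation[1]);
}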

Angle of phone rotated along X axis

I am working on an Android app in which I want to scroll a large image horizontally. I used the accelerometer (Sensor.TYPE_ACCELEROMETER) and magnetic field (Sensor.TYPE_MAGNETIC_FIELD) data to get the angle of rotation. Because this data arrives very frequently and is riddled with noise, I am not able to implement a smooth motion effect.
@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mags = event.values.clone();
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = event.values.clone();
            break;
    }
    if (mags != null && accels != null) {
        gravity = new float[16];
        boolean success = SensorManager.getRotationMatrix(gravity, null, accels, mags);
        if (success) {
            float[] outGravity = new float[16];
            SensorManager.remapCoordinateSystem(gravity, SensorManager.AXIS_X, SensorManager.AXIS_Z, outGravity);
            SensorManager.getOrientation(outGravity, values);
            rollingAverage[0] = roll(rollingAverage[0], values[0]);
            rollingAverage[1] = roll(rollingAverage[1], values[1]);
            rollingAverage[2] = roll(rollingAverage[2], values[2]);
            azimuth = Math.toDegrees(values[0]);
            pitch = Math.toDegrees(values[1]);
            roll = Math.toDegrees(values[2]);
            mags = null;
            accels = null;
            double diffRoll = lastRoll - roll;
            double diffPitch = lastPitch - pitch;
            long curTime = System.currentTimeMillis();
            if (Math.abs(diffRoll) >= 2) {
                if (diffRoll > 0)
                    imageView.panLeft();
                else
                    imageView.panRight();
                lastRoll = roll;
            }
        }
    }
}
Any ideas on achieving this using other methods?
You have to implement sensor fusion based on a Kalman filter or other filters. You can use open-source libraries if needed; refer to this Bitbucket repository. If you want to do it yourself, read the tutorial.
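Short of full sensor fusion, a simple exponential low-pass filter over the raw sensor values already removes a lot of the jitter. A minimal sketch of such a filter (this is only a smoothing step, not a Kalman filter):
// alpha controls the smoothing: smaller values give smoother but laggier output
private static final float ALPHA = 0.15f;

private float[] lowPass(float[] input, float[] output) {
    if (output == null) return input.clone();
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
You would feed event.values.clone() through lowPass() before calling getRotationMatrix(), as the "Get phone angle (pitch)" question above already does with its own lowPass() helper.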

Measuring car rotation with an Android phone

I have found many threads about how to handle the device's rotation and orientation with the motion and position sensors.
I would like to create an app that I will use in my car; first I would like to measure the car's turning angle.
So I put my phone into a holder in the car and, for example, when I turn left with the car, I would like to see the car's turning angle on the phone.
Is this possible with the magnetometer and accelerometer?
I'm posting code that I think is okay as a first step (let's say I hold my phone in portrait mode, not landscape, for now):
private static SensorManager sensorService;
// magnetic
private Sensor mSensor;
// accelerometer
private Sensor gSensor;
private float[] mValuesMagnet = new float[3];
private float[] mValuesAccel = new float[3];
private float[] mValuesOrientation = new float[3];
private float[] mRotationMatrix = new float[9];

@Override
public void onCreate(Bundle savedInstanceState) {
    nf.setMaximumFractionDigits(2);
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    sensorService = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    this.mSensor = sensorService.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    this.gSensor = sensorService.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    sensorService.registerListener(this, gSensor, SensorManager.SENSOR_DELAY_NORMAL);
    sensorService.registerListener(this, mSensor, SensorManager.SENSOR_DELAY_NORMAL);
}

@Override
public void onAccuracyChanged(Sensor arg0, int arg1) {
}

@Override
public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            System.arraycopy(event.values, 0, mValuesAccel, 0, 3);
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            System.arraycopy(event.values, 0, mValuesMagnet, 0, 3);
            break;
    }
    SensorManager.getRotationMatrix(mRotationMatrix, null, mValuesAccel, mValuesMagnet);
    SensorManager.getOrientation(mRotationMatrix, mValuesOrientation);
    // double azimuth = Math.toDegrees(mValuesOrientation[0]); // azimuth, rotation around the Z axis
    // double pitch = Math.toDegrees(mValuesOrientation[1]);   // pitch, rotation around the X axis
    double roll = Math.toDegrees(mValuesOrientation[2]);        // roll, rotation around the Y axis
    // normalize
    // azimuth = azimuth > 0 ? azimuth : azimuth + 360;
    roll = roll > 0 ? roll : roll + 360;
    String txt = "roll= " + Math.round(roll);
    ((EditText) findViewById(R.id.szog)).setText(txt);
}
Questions:
- How accurate will this app be in a car? (What can I do to make it more accurate?)
- What should I do when I hold my phone in landscape mode? Is the roll from getOrientation() still okay?
Please note that this is a very first try, so there is still much to do!
But first I want to see how I can achieve that.
Thanks!
If you had your Android device set up like a compass, then there would be an arrow that always pointed to magnetic north. So by measuring the change in the direction of that arrow, you could measure the rotation of your car.
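To make "measuring the change in the direction of that arrow" concrete, here is a small sketch (the helper name is made up) of computing a signed turning angle from two successive azimuth readings, wrapping correctly across the 0/360 boundary:
// returns the signed heading change in degrees; positive means the car turned clockwise
public static double headingDelta(double previousAzimuth, double currentAzimuth) {
    double delta = currentAzimuth - previousAzimuth;
    if (delta > 180) delta -= 360;   // e.g. 350 -> 10 degrees is a +20 degree turn, not -340
    if (delta < -180) delta += 360;
    return delta;
}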
To get a better indication of the orientation of your Android device, use
Sensor.TYPE_GRAVITY and Sensor.TYPE_MAGNETIC_FIELD, instead of Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD. This is better because Sensor.TYPE_GRAVITY tries to filter out the effects due to the movement of the device.
If the Android device is lying flat on a surface, then the direction of magnetic north is defined by the azimuth that comes out of SensorManager.getOrientation(...). When it's standing up or when it's on its side then it's a bit more complicated. However, I suggest starting with the device lying flat first because that's the easiest case, and then you can progress to more difficult cases.
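A minimal sketch of swapping in TYPE_GRAVITY, reusing the sensorService and gSensor fields from the question's code (the gravity vector can be passed to getRotationMatrix() wherever the accelerometer values were used; note that TYPE_GRAVITY may be missing on older devices):
Sensor gravitySensor = sensorService.getDefaultSensor(Sensor.TYPE_GRAVITY);
if (gravitySensor != null) {
    sensorService.registerListener(this, gravitySensor, SensorManager.SENSOR_DELAY_NORMAL);
} else {
    // no gravity sensor available; fall back to the raw accelerometer
    sensorService.registerListener(this, gSensor, SensorManager.SENSOR_DELAY_NORMAL);
}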

Android: Get physical device orientation in a portrait-only app

My app is locked to portrait orientation only; however, in one fragment I have a camera preview where I would like to rotate captured images based on the device orientation. I believe that because my app is portrait-only, the following code always logs zero.
Display display = ((WindowManager)getActivity().getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
int rotation = display.getRotation();
Log.i(TAG, "Rotation: " + rotation );
Is it possible to get the actual orientation of the device while locking the app to portrait?
I am targeting Android 4.0+, so I'm not concerned if the solution doesn't work on older devices.
You could implement a SensorEventListener, then look at the roll in onSensorChanged():
@Override
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            mAccelerometerValues = event.values;
        }
        if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            mGeomageneticValues = event.values;
        }
        if ((mAccelerometerValues != null) && (mGeomageneticValues != null)) {
            boolean success = SensorManager.getRotationMatrix(R, I, mAccelerometerValues, mGeomageneticValues);
            if (success) {
                SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
                SensorManager.getOrientation(outR, orientation);
                mYaw = orientation[0] * MathX.toDegreesF;
                mPitch = orientation[1] * MathX.toDegreesF;
                mRoll = orientation[2] * MathX.toDegreesF;
                String sText = String.format("a:%1.4f\np:%1.4f\nr:%1.4f", mYaw, mPitch, mRoll);
            }
        }
    }
}
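As an alternative not covered in this answer, Android's built-in OrientationEventListener reports the physical device angle regardless of the Activity's locked orientation; a minimal sketch:
OrientationEventListener orientationListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int orientation) {
        if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) return; // device is flat
        // snap the raw 0-359 degree angle to the nearest 90-degree step for rotating captured images
        int rotation = ((orientation + 45) / 90 * 90) % 360;
    }
};
orientationListener.enable();  // e.g. in onResume(); call disable() in onPause()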

Sensors TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD - they work on phones, but don't on tablets

In my 2D game I have the following code, which is responsible for controlling a game entity (a flying plane). It all seems to work fine on phones, but unfortunately I've received reports that on Android tablets the steering is completely unreliable (the axes are messed up, or it doesn't work at all). Unfortunately I don't have a tablet of my own, so I cannot investigate this more closely. So... what's wrong with the following code? (For clarity I've included only the code related to sensors.)
// ...
private float[] accelerometerValues;
private float[] magneticFieldValues;
private float[] R;
private float[] I;
private float[] outR;
private float[] sensorValues;
private Sensor accelerometer;
private Sensor magneticField;
// ...
// ... sensor initialization
sensorManager = (SensorManager)activity.getSystemService(Context.SENSOR_SERVICE);
if(sensorManager == null)
return;
accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
// ...
// ... onPause() sensor is being unregistered
public void onResume() {
    if (!sensorManager.registerListener(sensorListener, accelerometer, SensorManager.SENSOR_DELAY_GAME) ||
        !sensorManager.registerListener(sensorListener, magneticField, SensorManager.SENSOR_DELAY_GAME))
        // ...
// ...
sensorListener = new SensorEventListener() {
    public void onAccuracyChanged(Sensor arg0, int arg1) {
    }

    public void onSensorChanged(android.hardware.SensorEvent event) {
        synchronized (InputMgr.this) {
            switch (event.sensor.getType()) {
                case Sensor.TYPE_ACCELEROMETER:
                    System.arraycopy(event.values, 0, accelerometerValues, 0, 3);
                    break;
                case Sensor.TYPE_MAGNETIC_FIELD:
                    System.arraycopy(event.values, 0, magneticFieldValues, 0, 3);
                    break;
            }
        }
    }
};
// ...
// used somewhere in the game
public void getSensorValues(float values[]) {
    synchronized (InputMgr.this) {
        SensorManager.getRotationMatrix(R, I, accelerometerValues, magneticFieldValues);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_Y, SensorManager.AXIS_X, outR);
        SensorManager.getOrientation(outR, sensorValues);
        System.arraycopy(sensorValues, 0, values, 0, sensorValues.length);
    }
}
This won't cheer you up much, but I have a similar problem. I can see a TYPE_MAGNETIC_FIELD sensor and I can add an event listener to it, but I never get any data from it. Other sensors work fine. This is on a Galaxy Tab 7.
onSensorChanged(SensorEvent event) never fires for case Sensor.TYPE_MAGNETIC_FIELD:, so I get no data from the magnetic field sensor.
If you solve it, let us know =)
On devices whose default orientation is landscape (which includes most tablets), the sensor values are kind of 'wrong' (I don't know why), so you need to detect those devices and remap your rotation matrix.
To check whether the matrix needs to be remapped, you can use this code:
public boolean needToRemapOrientationMatrix;
// compute once (e.g. in onCreate() of your Activity):
Display display = ((WindowManager)getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
int orientation;
if(display.getWidth() < display.getHeight()) orientation = Configuration.ORIENTATION_PORTRAIT;
else if(display.getWidth() > display.getHeight()) orientation = Configuration.ORIENTATION_LANDSCAPE;
else orientation = Configuration.ORIENTATION_SQUARE;
int rotation = display.getRotation();
needToRemapOrientationMatrix =
(orientation==Configuration.ORIENTATION_LANDSCAPE && (rotation==Surface.ROTATION_0 || rotation==Surface.ROTATION_180)) ||
(orientation==Configuration.ORIENTATION_PORTRAIT && (rotation==Surface.ROTATION_90 || rotation==Surface.ROTATION_270));
And when you read the sensor values, remap the matrix if needed:
public void getSensorValues(float values[]) {
    synchronized (InputMgr.this) {
        SensorManager.getRotationMatrix(R, I, accelerometerValues, magneticFieldValues);
        if (needToRemapOrientationMatrix)
            SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, R);
        SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_Y, SensorManager.AXIS_X, outR);
        SensorManager.getOrientation(outR, sensorValues);
        System.arraycopy(sensorValues, 0, values, 0, sensorValues.length);
    }
}
This worked for me, I hope it helps.
