How can I use the Accelerometer for detecting jump in LibGDX? - android

I'm making an app for Android and want to use the accelerometer to detect a jump. Does anyone know how that works?
And which value will change, and by how much?

Here are some of the methods:
// Check accelerometer availability on device
boolean available = Gdx.input.isPeripheralAvailable(Peripheral.Accelerometer);
// Get current orientation
// Orientation.Landscape or Orientation.Portrait
Orientation nativeOrientation = Gdx.input.getNativeOrientation();
// Reading values
float accelX = Gdx.input.getAccelerometerX();
float accelY = Gdx.input.getAccelerometerY();
float accelZ = Gdx.input.getAccelerometerZ();
You will have to write your own function that combines the above values and checks them against a threshold or range of values to detect a jump.
Reference: libgdx accelerometer
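For illustration, here is a minimal sketch of such a check. The axis and the 15 m/s² threshold are assumptions and will need tuning per device; with the phone held in portrait, gravity shows up on the Y axis at roughly +9.8 m/s²:
// Called every frame, e.g. from render(). Treats a spike in acceleration
// along the device's Y axis (portrait assumed) as a "jump".
private static final float JUMP_THRESHOLD = 15f; // m/s^2, tune experimentally
private boolean jumping = false;

public void checkJump() {
    if (!Gdx.input.isPeripheralAvailable(Peripheral.Accelerometer)) return;

    float accelY = Gdx.input.getAccelerometerY();
    // At rest the reading is roughly +9.8 (gravity); a jump produces a short spike above it.
    if (!jumping && Math.abs(accelY) > JUMP_THRESHOLD) {
        jumping = true;
        // trigger your jump logic here
    } else if (Math.abs(accelY) < 10f) {
        jumping = false; // reading is back near gravity, allow the next jump
    }
}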

Related

Constantly rotating device - how to receive valid orientation data?

I am currently implementing a speedometer by receiving orientation data from my phone. I am using
float[] R = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(R, I, gravity, geomagnetic);
float[] orientation = new float[3];
SensorManager.getOrientation(R, orientation);
float azimuth = orientation[0];              // rotation around the vertical axis, in radians
double azimuthD = Math.toDegrees(azimuth);
if (azimuthD < 0) azimuthD = 360 + azimuthD; // normalize to [0, 360)
With this I am able to receive the rotation data from my phone, such as the azimuth.
Anyway, this works fine while the device is placed on a table or something. But when rotating around a certain point (in my case the device is fixed on a wheel and rotating at a certain speed), the values are far from accurate. I believe that, since I am using the gravity and geomagnetic sensors, there could be a conflict with forces that influence these sensors while rotating. As the wheel turns, the rotation changes relative to a point, but the local device rotation stays the same.
How can I access the orientation of the device while it's turning without running into a lot of noisy data?
I read a bit about the Sensor.TYPE_ROTATION_VECTOR property, but couldn't quite figure out how it works. I also read about the possibility to remap the coordinate system, but how is that supposed to help, since my phone is never exactly vertical to the floor, more like at an angle of 5°-10°?
I would appreciate any help.
Cheers,
viehlieb
I guess I found my answer.
The solution was to throw away all the code I posted above and use the gyroscope instead.
The gyroscope values measure the angular velocity of the device's rotation, in the device's own coordinate system. In my case the relevant value was the rotation around the z-axis.
Values are in radians per second, which can be converted to m/s by multiplying by the wheel's circumference. So the trick was in the onSensorChanged method:
if (sensorEvent.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
    gyroscope = sensorEvent.values;           // angular velocity in rad/s, device coordinates
    double rotZ = gyroscope[2];               // rotation around the z-axis
    double degrees = Math.toDegrees(rotZ);    // degrees per second
    // degrees / 360 = revolutions per second; * circumference (2.23 m) = m/s; * 3.6 = km/h
    float speed = (float) (degrees / 360.0 * 2.23 * 3.6);
}
If you'd like smoother values, you can store them in an array and calculate the average. Remember to clear the array every 20th call (or so) to onSensorChanged. With SENSOR_DELAY_GAME registered there is sufficient data to build the average over.
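A minimal sketch of that averaging (the window size of 20 is the suggestion from above; instead of clearing, this simply overwrites the oldest entry):
// Rolling average over the last 20 z-axis gyroscope readings.
private final float[] window = new float[20];
private int index = 0;

private float average(float newRotZ) {
    window[index] = newRotZ;
    index = (index + 1) % window.length;  // overwrite the oldest value
    float sum = 0f;
    for (float v : window) sum += v;
    return sum / window.length;
}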

Detect if device has taken a turn using location service android

I want to detect if the user has taken a turn on the road while driving, using the sensors on an Android phone. How do I code this? I am collecting data live from all the sensors (accelerometer, location, rotation, geomagnetic) and storing it on the SD card. Now I just want to know whether the user has taken a turn and in which direction.
I assume the registration of the sensor is done properly. You can detect the direction by using the orientation sensor (deprecated) as follows:
@Override
public void onSensorChanged(SensorEvent event) {
    float azimuth_angle = event.values[0];
    int precision = 2;
    if (prevAzimuth - azimuth_angle < -precision)
        Log.v("->", "RIGHT");
    else if (prevAzimuth - azimuth_angle > precision)
        Log.v("<-", "LEFT");
    prevAzimuth = azimuth_angle;
}
Note: prevAzimuth is declared as a field. You can change the precision value to whatever you want. We need it because we do not want output after every trivial change in azimuth angle; however, too large a precision gives imprecise results. For me, 2 is optimal.
If you are tracking location coordinates, you can also track the change in angle between successive locations:
angle = arctan((Y2 - Y1) / (X2 - X1)) * 180 / PI
See this answer for calculating x and y.
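In code, a sketch of that idea using Math.atan2 (which avoids division by zero when X2 == X1 and resolves the quadrant); the coordinates are assumed to already be projected to a flat x/y plane as described in the linked answer, and the turn threshold is an assumption to tune:
// Bearing (in degrees, 0..360) of the segment from point 1 to point 2.
static double bearingDegrees(double x1, double y1, double x2, double y2) {
    double angle = Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
    return (angle + 360.0) % 360.0;
}

// A turn is a significant change in bearing between consecutive segments.
static boolean isTurn(double previousBearing, double currentBearing, double thresholdDeg) {
    double diff = Math.abs(currentBearing - previousBearing) % 360.0;
    if (diff > 180.0) diff = 360.0 - diff;  // shortest angular distance
    return diff > thresholdDeg;
}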
Note that the decision to use sensor values is based on the unrealistic assumption that the device is never rotated with respect to the vehicle.

Air mouse for Android, which sensor?

I'm working on an app that uses the smartphone to control the pointer on a desktop. I saw many apps of this type on the store, so I know it isn't impossible! I'm doing this for university; I'm not looking for the perfect app, but I can't figure out which sensor I should use... I tried the rotation vector sensor and it's good for pitch (the pointer's y values) but not so much for azimuth (the x axis)... I didn't filter the values. Do you think I have to change sensor, or is it just a matter of filtering the data? Any tips?
EDIT: The gyro mouse in Ali's video is really smooth and accurate and he says it's very simple to do, but not for me! I take the rotation matrix, get the angle change and then map it to pixels, but it's very inaccurate (the azimuth)! I'll post the code to show clearly what I'm doing:
// get values from the Rotation Vector sensor
SensorManager.getRotationMatrixFromVector(rotationMatrix, sensorValues);
SensorManager.getAngleChange(orientationValues, rotationMatrix, prevMatrix);
prevMatrix = rotationMatrix.clone();   // keep a copy for the next event, not a reference

azimuth += orientationValues[0];
pitch += orientationValues[1];

// getAngleChange returns radians, so scale up to pixels (factor 100 chosen empirically)
float dX = (azimuth - lastX) * 100.f;
float dY = (pitch - lastY) * 100.f;
pixel.x += dX;
pixel.y += dY;

// store values for the next event
lastX = azimuth;
lastY = pitch;
What you are looking for is most likely the gyro mouse. Basically, you just accumulate the change in angles along each axis. A code snippet is also given in the linked video.
UPDATE: Yes, you need sensor fusion. Unfortunately, I am not familiar with the Android API or the Android sensors, so I can only guess. I would try SensorManager.getRotationMatrix() to get the rotation matrices, then determine the change of angle with SensorManager.getAngleChange(), and accumulate the change along each axis.
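A minimal sketch of that approach, assuming a listener is registered for Sensor.TYPE_ROTATION_VECTOR (field names here are placeholders, not part of any API):
// Fields: float[] rotationMatrix = new float[9], prevMatrix = new float[9],
//         angleChange = new float[3]; double azimuth, pitch; boolean first = true;
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;

    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    if (!first) {
        // Change of azimuth/pitch/roll (radians) since the previous event.
        SensorManager.getAngleChange(angleChange, rotationMatrix, prevMatrix);
        azimuth += angleChange[0];  // accumulate for the pointer's x movement
        pitch   += angleChange[1];  // accumulate for the pointer's y movement
    }
    System.arraycopy(rotationMatrix, 0, prevMatrix, 0, 9);
    first = false;
}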

What sensor can be used to detect rotation when upright?

Hi, I am creating an application in which the user holds the phone upright and then rotates it around the y-axis (similar to taking a panorama).
I need to detect the angle of rotation. On iOS this was fairly simple with the gyroscope sensor, but I am not having the same luck with Android. If anyone could point me in the right direction, that would be great.
Assuming your Y axis points to the center of earth, the value you are looking for is called azimuth.
To monitor its change you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
SensorManager mngr = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
Sensor magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or another rate
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
And within the listener, call:
float[] values = new float[3];
SensorManager.getOrientation(R, values);  // R is the rotation matrix from SensorManager.getRotationMatrix()
float current_azimuth_val = values[0];    // <---------- azimuth in radians
Note that the quality and latency of the data you will obtain are highly hardware dependent.
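Putting the registration and the orientation calculation together, a listener might look roughly like this (a sketch; the field names gravity and geomagnetic are assumptions, not API names):
private final float[] gravity = new float[3];
private final float[] geomagnetic = new float[3];

private final SensorEventListener sensorListener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            System.arraycopy(event.values, 0, gravity, 0, 3);
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            System.arraycopy(event.values, 0, geomagnetic, 0, 3);

        float[] R = new float[9];
        if (SensorManager.getRotationMatrix(R, null, gravity, geomagnetic)) {
            float[] values = new float[3];
            SensorManager.getOrientation(R, values);
            float azimuth = values[0]; // radians, rotation around the vertical axis
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};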
There are various sensors available that can be managed through a SensorManager. Of course, since every manufacturer decides whether or not to put a particular sensor on the hardware platform for their model, you have to check whether one exists. Some devices have a gyroscope like iOS devices; on others the same can be achieved with the accelerometer and magnetometer instead.
You can get started here: http://developer.android.com/guide/topics/sensors/sensors_overview.html

(LIBGDX) How can I incorporate, step by step, the accelerometer and use the Z axis on an ANDROID to move a car?

I am trying to have the tilt of the Z axis of an Android phone power the movement of a LibGDX Box2D car. I already know how to make the car move on my computer, but how do I implement that on an Android device? The orientation is LANDSCAPE. Thanks so much!
I know it has something to do with getRotation or something.... :)
You can get the accelerometer readings like this:
float accelX = Gdx.input.getAccelerometerX();
float accelY = Gdx.input.getAccelerometerY();
float accelZ = Gdx.input.getAccelerometerZ();
Wiki entry for more info:
https://github.com/libgdx/libgdx/wiki/Accelerometer
Then just apply a force to the body with it:
body.applyForceToCenter(<your_horizontal_accel_reading>, <vertical_if_needed_0_if_not>, true);
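For example, a minimal sketch of doing that every frame; the choice of getAccelerometerY() for the tilt axis and the force scale are assumptions (libGDX reports the values in the device's native orientation, so verify which axis matches your landscape tilt on real hardware):
// Call from render(): tilt the phone to drive the car.
public void updateCar(Body carBody) {
    if (Gdx.input.isPeripheralAvailable(Peripheral.Accelerometer)) {
        float tilt = Gdx.input.getAccelerometerY(); // roughly -10..10 m/s^2
        float forceScale = 50f;                     // tune for your car's mass
        carBody.applyForceToCenter(tilt * forceScale, 0f, true);
    }
}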
