How to calculate rotation angle using Android Sensors?

I am writing an OpenGL application in which I have to rotate the camera when the Android device is rotated/tilted along the Z axis.
I tried SensorManager.getOrientation(R, orientVals) using the magnetic field and accelerometer sensors, but the values fluctuate a lot.
A gyroscope is also available on my device.
Since I am animating (rotating) the camera, I need smooth rotation values.
Please guide me in this regard.

See How do you calculate the rate of rotation using the accelerometer values in Android for a particular axis for how to read Android's software-derived sensors, which combine data from the accelerometer, magnetometer, and (if available) gyroscope.
To smooth the values, use a low-pass filter or (better but more complicated) a Kalman filter. I suspect that Android's software-derived sensors, such as the rotation vector sensor, already use a Kalman filter to combine data from the different sensors. (One could search the source code...)
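For illustration, here is a minimal exponential low-pass filter sketch in Java (the class name and ALPHA value are illustrative, not from the original post); you could feed it the output of getOrientation before animating the camera:

// Minimal exponential low-pass filter for smoothing sensor values.
public class LowPassFilter {
    private static final float ALPHA = 0.15f; // smaller = smoother but laggier; tune per device
    private float[] smoothed;

    // Blend each new reading with the previous smoothed value.
    public float[] filter(float[] input) {
        if (smoothed == null) {
            smoothed = input.clone();
            return smoothed;
        }
        for (int i = 0; i < input.length; i++) {
            smoothed[i] += ALPHA * (input[i] - smoothed[i]);
        }
        return smoothed;
    }
    // Note: naive blending misbehaves near the +/-pi wrap-around of the
    // azimuth angle; filtering the rotation vector/quaternion avoids this.
}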

Related

Kalman filter sensor fusion for FALL detection: Accelerometer + Gyroscope

I am trying to understand the process of sensor fusion, and along with it, Kalman filtering.
My goal is to detect a fall of a device using the accelerometer and gyroscope.
Most papers, such as this one, describe how to overcome drift from the gyroscope and noise from the accelerometer. Eventually the sensor fusion provides better measurements of roll, pitch, and yaw, not better acceleration.
Is it possible to get better acceleration results by sensor fusion, and in turn use them for fall detection? Better roll, pitch, and yaw alone are not enough to detect a fall.
However, this source recommends smoothing the accelerometer (Ax, Ay, Az) and gyroscope (Gx, Gy, Gz) readings individually with a Kalman filter, then using a classification algorithm such as k-NN or clustering to detect a fall with supervised learning.
The classification part is not my problem; the question is whether I should fuse the sensors (3D accelerometer and 3D gyroscope) or smooth each sensor separately, given my goal of detecting a fall.
Several clarifications:
A Kalman filter is typically used to perform sensor fusion for position and orientation estimation, usually combining an IMU (accelerometer and gyroscope) with some non-drifting absolute measurement (computer vision, GPS).
A complementary filter is typically used to get a good orientation estimate by combining the accelerometer (noisy but non-drifting) with the gyroscope (accurate but drifting). You can think of the orientation estimate as driven primarily by the gyroscope but corrected by the accelerometer, as sketched below.
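To make the complementary-filter idea concrete, here is a minimal single-axis sketch in Java (the 0.98 weight and the axis/sign conventions are assumptions for illustration, not from the original post):

// Minimal complementary-filter sketch for one axis (pitch), assuming
// gyroRateX in rad/s around X and accelerometer values in m/s^2.
public class ComplementaryFilter {
    private static final float GYRO_WEIGHT = 0.98f; // trust the gyro short-term
    private float pitch; // radians

    public float update(float gyroRateX, float ay, float az, float dtSeconds) {
        // Integrate the gyro (accurate short-term, but drifts)...
        float gyroPitch = pitch + gyroRateX * dtSeconds;
        // ...and correct with the accelerometer's gravity direction
        // (noisy, but does not drift).
        float accelPitch = (float) Math.atan2(ay, az);
        pitch = GYRO_WEIGHT * gyroPitch + (1 - GYRO_WEIGHT) * accelPitch;
        return pitch;
    }
}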
For fall detection using an IMU, I believe acceleration is very important. There is no known way to "correct" the acceleration reading, and thinking of it this way is likely the wrong approach. My suggestion is to use the accelerations as one of the inputs to your system and collect a bunch of data simulating fall situations; you might be surprised how many viable signals are there.
I don't think you need a Kalman filter to detect a fall. A simple accelerometer alone can detect the fall of a device. If you apply a low-pass filter to smooth the accelerometer and check whether the total acceleration stays close to zero for more than a certain duration (in free fall the device accelerates at about -g, i.e. -9.8 m/s², so the accelerometer reads near zero), you can detect a fall; a sketch of this follows below.
The issue with the above approach is that if the device is rotating fast, the acceleration won't be close to zero. For a robust solution, you can implement a simple complementary filter (search for Mahony) rather than a Kalman filter for this application.
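A minimal sketch of the low-pass + duration approach described above (all thresholds and names are illustrative and would need tuning against real fall data):

import android.hardware.SensorManager;

// Flags free fall when the smoothed accelerometer magnitude stays near
// zero for longer than a minimum duration.
public class FreeFallDetector {
    private static final float ALPHA = 0.2f;               // low-pass weight
    private static final float FREE_FALL_THRESHOLD = 2.0f; // m/s^2
    private static final long MIN_DURATION_NS = 300_000_000L; // 300 ms

    private float smoothedMagnitude = SensorManager.GRAVITY_EARTH;
    private long freeFallStartNs = -1;

    // Call from onSensorChanged with TYPE_ACCELEROMETER values.
    public boolean update(float ax, float ay, float az, long timestampNs) {
        float magnitude = (float) Math.sqrt(ax * ax + ay * ay + az * az);
        smoothedMagnitude += ALPHA * (magnitude - smoothedMagnitude);

        if (smoothedMagnitude < FREE_FALL_THRESHOLD) {
            if (freeFallStartNs < 0) freeFallStartNs = timestampNs;
            return timestampNs - freeFallStartNs > MIN_DURATION_NS;
        }
        freeFallStartNs = -1;
        return false;
    }
}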

Accelerometer and Magnetometer sensor fusion to get Gyroscopic Data

I am making a Raspberry Pi robot with an FPV (first-person view) camera mounted on a pan/tilt servo. I want to make it VR compatible by connecting it to my phone. But my phone doesn't have a gyroscope sensor to detect horizontal movement; it only has a magnetometer and accelerometer. How can I combine data from the accelerometer and magnetometer to make a virtual gyroscope that can move with my camera? I am a noob at all of this.
You should have a rotation vector sensor that is already fusing the two. You will not get better results than it.
Note that this will not be as high quality as a proper gyroscope and will show artifacts when the robot moves.
If you're still interested in how to do this yourself, you can get roll and pitch from the accelerometer, then get yaw from the magnetometer, as sketched below. It is best to find a library for 3D math and do this with quaternions or matrices; this looks like a use case where gimbal lock will bite you easily, so Euler angles will be problematic.
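For reference, a minimal sketch of the manual approach (the aerospace-style axis and sign conventions here are an assumption and differ from Android's; in practice SensorManager.getRotationMatrix handles this for you):

// Roll and pitch from the accelerometer, tilt-compensated yaw from the
// magnetometer. Conventions follow common IMU app notes, not Android's.
public class TiltCompensatedCompass {
    // accel in m/s^2, mag in any consistent units; returns
    // {yaw, pitch, roll} in radians.
    public static float[] orientation(float[] accel, float[] mag) {
        float roll = (float) Math.atan2(accel[1], accel[2]);
        float sinR = (float) Math.sin(roll), cosR = (float) Math.cos(roll);
        float pitch = (float) Math.atan2(-accel[0],
                accel[1] * sinR + accel[2] * cosR);
        float sinP = (float) Math.sin(pitch), cosP = (float) Math.cos(pitch);

        // Tilt-compensate the magnetometer before taking the heading.
        float bx = mag[0] * cosP + mag[1] * sinP * sinR + mag[2] * sinP * cosR;
        float by = mag[1] * cosR - mag[2] * sinR;
        float yaw = (float) Math.atan2(-by, bx);

        return new float[] { yaw, pitch, roll };
    }
}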
I guess you want to use this for VR? Don't try to move the servos to compensate for head movement directly; you'll only build a motion sickness generator. Look at how timewarp works: you move the servos in the general direction the person is looking and render the video reprojected onto a sphere, so you get almost zero lag.

What is the difference between using accelerometer & gyroscope and accelerometer & magnetometer to find the orientation?

Using the accelerometer & gyroscope to find orientation requires integrating the gyroscope and using a filter, so the program becomes very complicated. I tried this method, but I can't control the time interval at which sensor data arrives, so I can't figure out how to do it.
Using the accelerometer & magnetometer seems easier.
So, what is the difference between these two methods?
Thank you
The accelerometer and gyroscope measure acceleration and orientation along the linear and rotational axes, while the magnetometer and accelerometer measure orientation along the linear axes and the magnetic field produced by the Earth. I went through a module that has a built-in filter on the sensor shield, and the code also worked well. It is a 3D accelerometer, 3D gyroscope, and 3D magnetometer:
https://www.controleverything.com/content/Accelorometer?sku=LSM9DS0_I2CS
You can use the code in the language you prefer:
https://github.com/ControlEverythingCommunity/LSM9DS0
Thanks.
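Regarding the time-interval difficulty raised in the question: on Android each SensorEvent carries its own timestamp in nanoseconds, so the gyroscope integration can compute dt per event instead of relying on a fixed sampling interval. A minimal sketch (assuming TYPE_GYROSCOPE values in rad/s; naive per-axis integration is an approximation that breaks down for large combined rotations, where quaternions are preferable):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Integrates gyroscope rates using each event's own timestamp delta.
public class GyroIntegrator implements SensorEventListener {
    private static final float NS_TO_S = 1.0f / 1_000_000_000.0f;
    private long lastTimestamp = 0;
    private final float[] angle = new float[3]; // integrated radians per axis

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
        if (lastTimestamp != 0) {
            float dt = (event.timestamp - lastTimestamp) * NS_TO_S;
            for (int i = 0; i < 3; i++) {
                angle[i] += event.values[i] * dt; // rad/s * s
            }
        }
        lastTimestamp = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}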

Calculating Android device moving speed (using accelerometer and gyroscope)

I need to create an app that calculates the device's velocity, with x/y/z speed.
My idea is to use the device's accelerometer and gyroscope, like this pipeline.
I wanted to know whether the accelerometer and gyroscope are the right sensor choice for this (in the pipeline).
What rotation table (matrix) should I use for this? One possible approach is sketched below.
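One common pipeline (an assumption here, since the linked pipeline image is not reproduced) is to rotate device-frame linear acceleration into the world frame using the rotation vector sensor, then integrate once to get velocity. Be aware that integration drift grows quickly, so this is only usable over short intervals without an external correction such as GPS. A minimal sketch:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Rotates linear acceleration into world coordinates and integrates it.
public class VelocityEstimator implements SensorEventListener {
    private static final float NS_TO_S = 1.0f / 1_000_000_000.0f;
    private final float[] rotationMatrix = new float[9];
    private final float[] velocity = new float[3]; // world-frame m/s
    private long lastTimestamp = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ROTATION_VECTOR:
                SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
                break;
            case Sensor.TYPE_LINEAR_ACCELERATION:
                if (lastTimestamp != 0) {
                    float dt = (event.timestamp - lastTimestamp) * NS_TO_S;
                    // world = R * device (row-major 3x3 times column vector)
                    for (int i = 0; i < 3; i++) {
                        float worldAccel = rotationMatrix[3 * i] * event.values[0]
                                + rotationMatrix[3 * i + 1] * event.values[1]
                                + rotationMatrix[3 * i + 2] * event.values[2];
                        velocity[i] += worldAccel * dt;
                    }
                }
                lastTimestamp = event.timestamp;
                break;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}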

Android Sensors - Which of them get direct input?

The Android SDK actually offers a nice interface to access the sensors.
But, e.g., the linear acceleration sensor can, as the documentation describes, be computed from gravity and acceleration, so there is no real physical counterpart for this sensor; it is rather, let's call it, a "virtual sensor".
For the proximity sensor things are rather clear; I can't imagine it being influenced by other values.
But the GPS sensor could be influenced by the accelerometer: when the GPS signal is rather weak, I think the values are somehow estimated with support from other sensors.
So basically my question is: which sensors get direct input from physical sensors, and which are somehow altered or entirely calculated by the Android SDK?
And how do I get raw input from the sensors?
I have appended a list of the sensors available through the Sensor class. GPS, WLAN, camera, etc. are missing.
//API-Level: 3
TYPE_ACCELEROMETER
TYPE_GYROSCOPE
TYPE_LIGHT
TYPE_MAGNETIC_FIELD
TYPE_PRESSURE
TYPE_PROXIMITY
TYPE_TEMPERATURE
//API-Level: 9 (2.3)
TYPE_GRAVITY
TYPE_LINEAR_ACCELERATION // can be calculated via acc. and grav. (link above)
TYPE_ROTATION_VECTOR
I am pretty sure GPS at the moment is standalone and gives raw data output.
The orientation sensor is one that I know is not raw from a single sensor but is actually a fusion of two sensors, and in the future possibly more (gyro). As of now, the orientation is a combination of the magnetic field sensor (compass) and the accelerometer. Any modern compass will use both the magnetometer and the accelerometer to calculate its final direction and to compensate for drift, noise, and other interference. Notice that calculating the orientation with getRotationMatrix and getOrientation requires you to listen for both the magnetic field and accelerometer sensors, as in the sketch below.
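A minimal sketch of that fusion (the class name is illustrative):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Derives orientation from the accelerometer and magnetic field sensors
// via getRotationMatrix and getOrientation.
public class OrientationListener implements SensorEventListener {
    private final float[] accel = new float[3];
    private final float[] magnetic = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, accel, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, magnetic, 0, 3);
        }
        // Both raw inputs are needed to compute the fused orientation.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnetic)) {
            SensorManager.getOrientation(rotationMatrix, orientation);
            // orientation[0..2] = azimuth, pitch, roll in radians
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}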
I would say the gravity, linear acceleration, and rotation vector sensors are not actual sensors, but rather data from other sensors separated out, mostly from the accelerometer and compass.
Lastly, the pressure and temperature values each come directly from a single physical sensor.
