I want to create an Android app for vibration analysis using the embedded accelerometer. Could you suggest an easy solution/library to get a frequency spectrum out of the x, y, z accelerations?
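To make the goal concrete, here is roughly the kind of processing I have in mind; this is only a sketch with a naive DFT, and any FFT library (JTransforms, for example) would replace the transform step:

```java
// Sketch only: buffer the magnitude of (x, y, z) samples at a fixed rate,
// then compute an amplitude spectrum. O(n^2) DFT for clarity; use an FFT in practice.
public class VibrationSpectrum {

    // Magnitude of one accelerometer sample; removes dependence on orientation.
    static double magnitude(float ax, float ay, float az) {
        return Math.sqrt(ax * ax + ay * ay + az * az);
    }

    // Amplitude spectrum of a buffer of magnitudes.
    // Bin k corresponds to frequency k * sampleRate / samples.length.
    static double[] computeSpectrum(double[] samples) {
        int n = samples.length;
        double[] amplitude = new double[n / 2];
        for (int k = 0; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += samples[t] * Math.cos(angle);
                im -= samples[t] * Math.sin(angle);
            }
            amplitude[k] = 2 * Math.sqrt(re * re + im * im) / n;
        }
        return amplitude;
    }
}
```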
Is it possible to read data from the accelerometer at a very high sample rate? I need about 1-2 kHz. My sensor can read data at a 25 kHz sample rate, but on Android it reads data at only 200-400 Hz. What can I do?
The sample rate depends on the sensor hardware.
Most Android phones are not equipped with high frequency accelerometers.
You can check what your phone's sensor supports using the [getHighestDirectReportRateLevel][1] method of the Sensor object.
You can also use the Sensor object you get from SensorManager to check the exact model of the accelerometer in your phone and look up its spec sheet.
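For reference, a minimal sketch (standard SensorManager/Sensor calls; the helper name is mine) of how you might check what the default accelerometer supports:

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Build;
import android.util.Log;

// Logs the accelerometer model and the fastest rate it advertises.
// getMinDelay() is the shortest delay between samples in microseconds,
// so 1e6 / getMinDelay() is an upper bound on the sample rate you can request.
void logAccelerometerCapabilities(Context context) {
    SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    if (accel == null) {
        Log.w("AccelInfo", "No accelerometer on this device");
        return;
    }
    Log.d("AccelInfo", "Model: " + accel.getName() + " (" + accel.getVendor() + ")");
    if (accel.getMinDelay() > 0) {
        Log.d("AccelInfo", "Max rate ~" + (1_000_000f / accel.getMinDelay()) + " Hz");
    }
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        // 0 (RATE_STOP) means direct report channels are not supported at all.
        Log.d("AccelInfo", "Direct report level: " + accel.getHighestDirectReportRateLevel());
    }
}
```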
I am trying to understand the process of sensor fusion and, along with it, Kalman filtering.
My goal is to detect a fall of the device using the accelerometer and gyroscope.
Most papers, such as this one, describe how to overcome the drift due to the gyroscope and the noise due to the accelerometer. Eventually the sensor fusion provides us with better measurements of Roll, Pitch and Yaw, not better acceleration.
Is it possible to get better 'acceleration results' through sensor fusion and in turn use that for 'fall detection'? Better Roll, Pitch and Yaw alone are not enough to detect a fall.
However, this source recommends smoothing the accelerometer (Ax, Ay, Az) and gyroscope (Gx, Gy, Gz) individually with a Kalman filter, and then using a classification algorithm such as k-NN or clustering to detect a fall with supervised learning.
The classification part is not my problem; my question is whether I should fuse the sensors (3D accelerometer and 3D gyroscope) or smooth them separately, given my goal of detecting a fall.
Several clarifications
A Kalman filter is typically used to perform sensor fusion for position and orientation estimation, usually combining an IMU (accel and gyro) with some non-drifting absolute measurement (computer vision, GPS).
A complementary filter is typically used to get a good orientation estimate by combining the accel (noisy but non-drifting) and the gyro (accurate but drifting). The orientation estimate is driven primarily by the gyro, but corrected using the accel.
For fall detection using an IMU, I believe that acceleration is very important. There is no known way to "correct" the acceleration reading, and thinking of it that way is likely the wrong approach. My suggestion is to use the accelerations as one of your inputs to the system and collect a bunch of data simulating fall situations; you might be surprised how many viable signals are there.
I don't think you need a KF for fall detection. A simple accelerometer alone is able to detect a fall of the device. If you apply a low-pass filter to smooth the accelerometer signal and check whether the total acceleration stays close to zero for more than a certain duration (in free fall the device is accelerating at -g, about 9.8 m/s², so the sensor reads roughly zero), you can report a fall.
The issue with the above approach is that if the device is rotating fast, the acceleration won't be close to zero. For a more robust solution, you can implement a simple complementary filter (search for Mahony) rather than a KF for this application.
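A minimal sketch of the low-pass + near-zero-magnitude check described above; the filter constant, threshold and duration are placeholder values, not tuned on real fall data:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Flags a fall when the smoothed acceleration magnitude stays near zero
// (free fall) for longer than MIN_FALL_NANOS.
public class FreeFallDetector implements SensorEventListener {
    private static final float ALPHA = 0.8f;                 // low-pass smoothing factor
    private static final float FALL_THRESHOLD = 2.0f;        // m/s^2, "close to zero"
    private static final long MIN_FALL_NANOS = 150_000_000L; // ~150 ms below threshold

    private float filtered = 9.81f; // start at 1 g
    private long belowSince = -1;   // timestamp when the magnitude dropped below threshold

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);

        // Exponential low-pass filter to smooth out sensor noise.
        filtered = ALPHA * filtered + (1 - ALPHA) * magnitude;

        if (filtered < FALL_THRESHOLD) {
            if (belowSince < 0) belowSince = event.timestamp;
            if (event.timestamp - belowSince > MIN_FALL_NANOS) onFallDetected();
        } else {
            belowSince = -1;
        }
    }

    protected void onFallDetected() { /* notify the rest of the app */ }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```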
So, the Android Java SDK has provisions to do sensor fusion.
It can combine gyro and accelerometer readings to generate a reliable gravity vector.
This is exposed in the API with the sensor type TYPE_GRAVITY.
Now, the NDK has sensor support as well, for instance, here is an example NDK app that reads the accelerometer at 60Hz: NativeActivity sample source
However, judging from the sensor.h source code, the NDK only lets me read the following sensor types:
Accelerometer
Magnetic field
Gyroscope
Light
Proximity
So the sample code uses ASENSOR_TYPE_ACCELEROMETER. Sadly, there is no ASENSOR_TYPE_GRAVITY. Is this simply a case of NDK only doing a barebones subset of the SDK API?
How can I do sensor fusion in the Android NDK? I would like to avoid bringing the values in from Java, as I have read warnings about heavy CPU use when doing high-frequency readings in Java.
Thank you.
Not sure if you're still looking for this, but I've had the same problem recently and just thought that I would share how I solved this.
The ASENSOR_TYPE_* constants are defined as ints, so replacing ASENSOR_TYPE_ACCELEROMETER with 1 would still give you access to the accelerometer sensor. To use other sensors, take a look at
http://developer.android.com/reference/android/hardware/Sensor.html
and find the int value for the sensor you want to use, then use that value in place of ASENSOR_TYPE_ACCELEROMETER. For example, to use the gyroscope you could use 4.
Or you could define your own set of int constants using #define.
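For reference, the integer values can be read straight off the Java-side constants before hard-coding (or #define-ing) them in native code; a quick sanity-check sketch:

```java
import android.hardware.Sensor;
import android.util.Log;

// Logs the int values behind the Java Sensor.TYPE_* constants so they can be
// reused on the NDK side where no ASENSOR_TYPE_* constant exists.
void logSensorTypeInts() {
    Log.d("SensorTypes", "ACCELEROMETER = " + Sensor.TYPE_ACCELEROMETER);        // 1
    Log.d("SensorTypes", "GYROSCOPE     = " + Sensor.TYPE_GYROSCOPE);            // 4
    Log.d("SensorTypes", "GRAVITY       = " + Sensor.TYPE_GRAVITY);              // 9
    Log.d("SensorTypes", "LINEAR_ACCEL  = " + Sensor.TYPE_LINEAR_ACCELERATION);  // 10
    Log.d("SensorTypes", "ROTATION_VEC  = " + Sensor.TYPE_ROTATION_VECTOR);      // 11
}
```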
How can I use the Android accelerometer to measure the distance the phone has moved? What is the best way to build this kind of application?
example on Youtube
Android provides SensorManager for this. There are many samples available for creating this type of app. The links below may help you:
http://www.techrepublic.com/blog/app-builder/a-quick-tutorial-on-coding-androids-accelerometer/472
http://www.ibm.com/developerworks/opensource/library/os-android-sensor/
There is no way to do it reliably. For true inertial navigation, you need accelerometers as well as gyroscopes. For measuring distance on a flat surface, it could work. You will have to:
listen to sensor values at the maximum frequency the phone gives you;
integrate the values over time, twice (acceleration → velocity → position);
since your phone lies on a flat surface, ignore the vertical acceleration component.
You may also need some denoising of the accelerometer values; the chips used in phones are of miserable quality.
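A rough sketch of the integration steps above; I am using TYPE_LINEAR_ACCELERATION here (gravity already removed) as a simplification, and the estimate will drift quickly because the sensor noise is integrated twice:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Integrates acceleration twice (acceleration -> velocity -> position) for a
// phone sliding on a flat surface; only usable over short distances due to drift.
public class DistanceIntegrator implements SensorEventListener {
    private long lastTimestamp = -1; // nanoseconds
    private float velocityX, velocityY;
    private float positionX, positionY;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
        if (lastTimestamp > 0) {
            float dt = (event.timestamp - lastTimestamp) * 1e-9f; // seconds
            // The vertical (z) component is ignored: the phone stays on the surface.
            velocityX += event.values[0] * dt;
            velocityY += event.values[1] * dt;
            positionX += velocityX * dt;
            positionY += velocityY * dt;
        }
        lastTimestamp = event.timestamp;
    }

    public double distanceFromStart() {
        return Math.sqrt(positionX * positionX + positionY * positionY);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```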
I am using an LG Optimus 2X smartphone (gyroscope and accelerometer sensors) for positioning.
I want to get correct rotation angles from the gyroscope that can be used later for the body-to-earth coordinate transformation. My questions are:
How can I measure and remove the drift in the gyro sensor?
One way is to take the average of the gyro samples for some time (while the phone is static) and subtract it from each current sample, but that is not a good way.
When the phone is rotating/moving, how can I get drift-free angles?
As far as I know, either the Kalman filter or something similar is implemented in the SensorManager. Check out Sensor Fusion on Android Devices: A Revolution in Motion Processing.
You are trying to solve a problem that is already solved.
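A minimal sketch of relying on that built-in fusion instead of de-drifting the raw gyro yourself: TYPE_ROTATION_VECTOR is the fused output (availability depends on the device), and SensorManager converts it to angles:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Reads drift-corrected azimuth/pitch/roll from the fused rotation vector sensor.
public class FusedOrientationListener implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        // orientation[0..2] now hold fused, drift-corrected angles.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

You would register it with sensorManager.registerListener(listener, sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR), SensorManager.SENSOR_DELAY_GAME).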
I am the author of a compass application that integrates data from the magnetic and gyroscope sensors (steady compass). I have tested this application mostly on an LG Optimus Black (the device you can see in the video) running Android 2.2, so I am going to share my experiences:
Gyroscope readings are very accurate. This sensor is just the opposite of the accelerometer and magnetic sensors, which give readings with a lot of jitter.
The readings from the gyroscope (i.e. the angular speed) do not drift at all. You will, however, get drift in the orientation estimate if you just integrate the gyroscope readings: since you are integrating samples taken at different times, you obtain only an approximation that degrades after every integration step.
In order to avoid such drift in the orientation estimate, you must consider other input sources to correct the results coming from the gyroscope integration. The solution is to fuse the data coming from the orientation sensor (magnetic + acceleration) with the data coming from the gyroscope.
Be careful with LG phones: according to the Android API, the gyroscope should return data in rad/s, but the LG Optimus Black with Froyo gives readings in degrees/s. The update to Android 2.3 has just been released for that phone; I have to test whether the new version behaves according to the specification.
What Android version does your phone have? Have you tested any application using gyroscope? Did you get the expected results?
Basically, gyros drift over long time periods, whereas accelerometers have no drift but tend to be noisy. By combining information from both sensors using a Kalman filter you can obtain an accurate attitude. For something less complex, you could also use a complementary filter.
See this post for more info:
Combine Gyroscope and Accelerometer Data
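For completeness, a minimal single-axis sketch of the complementary filter idea: trust the integrated gyro rate over short time scales and the accelerometer-derived angle over long ones; the 0.98 weight is illustrative, not tuned:

```java
// One-axis complementary filter (e.g. roll): gyro path for fast changes,
// accelerometer path to slowly correct the drift.
public class ComplementaryFilter {
    private static final float ALPHA = 0.98f; // weight on the gyro path
    private float angle;                      // filtered angle in radians

    // gyroRate:   angular speed around the axis, rad/s (gyroscope)
    // accelAngle: angle estimated from gravity, rad, e.g. (float) Math.atan2(ay, az)
    // dt:         time since the previous update, seconds
    public float update(float gyroRate, float accelAngle, float dt) {
        angle = ALPHA * (angle + gyroRate * dt) + (1 - ALPHA) * accelAngle;
        return angle;
    }
}
```

On each matched pair of gyro and accelerometer samples you would call update(gx, (float) Math.atan2(ay, az), dt).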