Doing sensor fusion in Android NDK

So, the Android Java SDK has provisions for sensor fusion.
It can combine gyro and accelerometer readings to generate a reliable gravity vector.
This is exposed in the API with the sensor type TYPE_GRAVITY.
Now, the NDK has sensor support as well; for instance, here is an example NDK app that reads the accelerometer at 60 Hz: NativeActivity sample source
However, judging from the sensor.h source code, the NDK only exposes the following sensor types:
Accelerometer
Magnetic field
Gyroscope
Light
Proximity
So the sample code uses ASENSOR_TYPE_ACCELEROMETER. Sadly, there is no ASENSOR_TYPE_GRAVITY. Is this simply a case of the NDK only exposing a bare-bones subset of the SDK API?
How can I do sensor fusion on the Android NDK? I would like to avoid bringing in the values from Java, as I have read warnings about heavy CPU use when doing high-frequency readings in Java.
Thank you.

Not sure if you're still looking for this, but I've had the same problem recently and just thought that I would share how I solved this.
The ASENSOR_TYPE_* constants are defined as plain ints, so replacing ASENSOR_TYPE_ACCELEROMETER with 1 would still give you access to the accelerometer. To use other sensors, take a look at
http://developer.android.com/reference/android/hardware/Sensor.html
and find the int value for the sensor you want, then use it in place of ASENSOR_TYPE_ACCELEROMETER. For example, to use the gyroscope you could use 4, and 9 gets you the fused gravity vector (TYPE_GRAVITY) you were after.
Or you could define your own set of ints using #define.
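For what it's worth, these ints are the same values the SDK exposes as Sensor.TYPE_* constants, so one hedged way to double-check them before hard-coding (or #define-ing) them natively is to log them once from Java. This is a throwaway sketch; whether a fused type like gravity actually reaches the native event queue still depends on the device:

```java
import android.hardware.Sensor;
import android.util.Log;

// Throwaway helper: log the SDK's sensor type constants so the same
// integer values can be hard-coded (or #define'd) on the NDK side.
public final class SensorTypeDump {
    public static void dump() {
        Log.d("SensorTypes", "ACCELEROMETER = " + Sensor.TYPE_ACCELEROMETER);             // 1
        Log.d("SensorTypes", "MAGNETIC_FIELD = " + Sensor.TYPE_MAGNETIC_FIELD);           // 2
        Log.d("SensorTypes", "GYROSCOPE = " + Sensor.TYPE_GYROSCOPE);                     // 4
        Log.d("SensorTypes", "GRAVITY = " + Sensor.TYPE_GRAVITY);                         // 9
        Log.d("SensorTypes", "LINEAR_ACCELERATION = " + Sensor.TYPE_LINEAR_ACCELERATION); // 10
        Log.d("SensorTypes", "ROTATION_VECTOR = " + Sensor.TYPE_ROTATION_VECTOR);         // 11
    }
}
```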

Related

Composite Android sensor not present despite what documentation says

I am using an old phone running Android 4.0.4 that has the following sensors (printed using TYPE_ALL in Android Studio):
Accelerometer Sensor
Magnetic Field Sensor
Proximity Sensor
Orientation Sensor
According to the Android documentation on composite sensors, this phone should be able to output linear acceleration; however, when I ask for it, only null is returned. Other simple sensors such as the accelerometer or magnetometer work fine (these are the base sensors that are fused to obtain linear acceleration).
The code so far is pretty basic; it just prints values.
Any idea why this happens? Is this standard behavior? Aren't all sensors available to a smartphone supposed to be implemented?
Sensors specified as hardware/software behave differently according to what is available on the device. If https://developer.android.com/guide/topics/sensors/sensors_overview.html#sensors-identify says software, it is the developer's responsibility to use the Android API to fuse the right base signals into the composite signal you are looking for.
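To illustrate that responsibility (a minimal sketch of my own, not from the original answer; the class name and the filter constant ALPHA are arbitrary choices): check whether the composite sensor exists, and if not, approximate linear acceleration by low-pass filtering the raw accelerometer to isolate gravity and subtracting it.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Fallback when TYPE_LINEAR_ACCELERATION is not available: estimate
// gravity with a low-pass filter and subtract it from the raw readings.
public class LinearAccelFallback implements SensorEventListener {
    private static final float ALPHA = 0.8f;   // filter constant, tune per device
    private final float[] gravity = new float[3];
    private final float[] linear = new float[3];

    public void start(SensorManager mgr) {
        Sensor composite = mgr.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
        Sensor source = (composite != null)
                ? composite                                        // platform provides the fused sensor
                : mgr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER); // fuse by hand
        mgr.registerListener(this, source, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
            System.arraycopy(event.values, 0, linear, 0, 3);
            return;
        }
        // Isolate gravity with a low-pass filter, then remove it.
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * event.values[i];
            linear[i] = event.values[i] - gravity[i];
        }
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```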

Can I use TYPE_LINEAR_ACCELERATION on all phones?

I'm developing an app that measures a car's acceleration.
Instead of using the raw accelerometer, I have read about the linear acceleration sensor and implemented it.
I have two devices to test with. It works fine on a Sony Xperia Z1, but I don't get any values on an old Alcatel OneTouch.
Is it because of the Android version (5.1 vs 4.2), because my code is wrong, or is it a hardware limitation?
Taken from Android's documentation:
TYPE_LINEAR_ACCELERATION
Added in API level 9
int TYPE_LINEAR_ACCELERATION
A constant describing a linear acceleration sensor type.
See SensorEvent.values for more details.
Constant Value: 10 (0x0000000a)
This means the constant is available on any device running API level 9 or later.
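However, the constant being defined in the API is not the same as the sensor being present on the device. A quick runtime check (a sketch; the helper name is mine) is whether getDefaultSensor() returns null:

```java
import android.hardware.Sensor;
import android.hardware.SensorManager;

public final class SensorChecks {
    // The TYPE_LINEAR_ACCELERATION constant exists on API 9+, but the fused
    // sensor itself may still be absent: getDefaultSensor() returns null then.
    public static boolean hasLinearAcceleration(SensorManager mgr) {
        return mgr.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION) != null;
    }
}
```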
Please take into consideration Lork's response in this post. It states which type of sensor you should use in order to measure certain types of movement.
Update: I'll add the information from Lork's response as it might help future readers:
TYPE_ACCELEROMETER uses the accelerometer and only the accelerometer. It returns raw accelerometer events, with minimal or no processing at all.
TYPE_GYROSCOPE (if present) uses the gyroscope and only the gyroscope. Like above, it returns raw events (angular speed in rad/s) with no processing at all (no offset/scale compensation).
TYPE_ORIENTATION is deprecated. It returns the orientation as yaw/pitch/roll in degrees. It's not very well defined and can only be relied upon when the device has no "roll". This sensor uses a combination of the accelerometer and the magnetometer. Marginally better results can be obtained using SensorManager's helpers. This sensor is heavily "processed".
TYPE_LINEAR_ACCELERATION, TYPE_GRAVITY, TYPE_ROTATION_VECTOR are "fused" sensors which return respectively the linear acceleration, gravity and rotation vector (a quaternion). It is not defined how these are implemented. On some devices they are implemented in h/w, on some devices they use the accelerometer + the magnetometer, on some other devices they use the gyro.
On the Nexus S and Xoom, the gyroscope is currently NOT used. They behave as if there were no gyro available, like on the Nexus One or Droid. We are planning to improve this situation in a future release.
Currently, the only way to take advantage of the gyro is to use TYPE_GYROSCOPE and integrate the output by hand.
I hope this helps,
Mathias
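To make that last point concrete (my own naive sketch, not part of the quoted answer): integrating the gyroscope by hand means accumulating angular speed times the elapsed time per axis. The drift this accumulates is exactly why fusing with other sensors matters.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Naive gyroscope integration: accumulate angular speed (rad/s) over time.
// Drift grows without bound, which is why fusion with other sensors matters.
public class GyroIntegrator implements SensorEventListener {
    private static final float NS2S = 1.0f / 1_000_000_000.0f; // nanoseconds -> seconds
    private final float[] angle = new float[3]; // integrated rotation per axis, rad
    private long lastTimestamp = 0;

    @Override public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) return;
        if (lastTimestamp != 0) {
            float dt = (event.timestamp - lastTimestamp) * NS2S;
            for (int i = 0; i < 3; i++) {
                angle[i] += event.values[i] * dt; // rad/s * s = rad
            }
        }
        lastTimestamp = event.timestamp;
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```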

Accelerometer on Android

How can I use the Android accelerometer to measure the distance the phone has moved? What is the best way to build this kind of application?
Example on YouTube
Android provides SensorManager for this. There are many samples available for creating these kinds of apps. The links below may help you, and a minimal sketch of the basic pattern follows them:
http://www.techrepublic.com/blog/app-builder/a-quick-tutorial-on-coding-androids-accelerometer/472
http://www.ibm.com/developerworks/opensource/library/os-android-sensor/
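For reference, the basic pattern those tutorials build on looks roughly like this (a minimal sketch; the class name and delay choice are illustrative):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Minimal accelerometer readout via SensorManager.
public class AccelReader implements SensorEventListener {
    public void start(SensorManager mgr) {
        mgr.registerListener(this,
                mgr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_UI);
    }

    @Override public void onSensorChanged(SensorEvent event) {
        float ax = event.values[0], ay = event.values[1], az = event.values[2]; // m/s^2
        // ... use the readings here ...
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```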
There is no way to do it reliably. For true inertial navigation you need accelerometers as well as gyroscopes. For measuring distance on a flat surface it could work. You will have to:
listen to sensor values at the maximum frequency the phone gives you;
integrate the values over time, twice: acceleration to velocity, then velocity to distance;
since your phone lies on a flat surface, ignore the vertical acceleration component.
You may also need some denoising of the accelerometer values, as the chips used in phones are of miserable quality. A sketch of this integration loop follows.
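Here is a naive sketch of that recipe (my own, assuming the phone lies flat so gravity sits entirely on the z axis, which is ignored; without denoising, expect the estimate to drift within seconds):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Dead-reckoning sketch: integrate x/y acceleration twice to estimate the
// distance travelled on a flat surface. Noise makes this drift in seconds.
public class DistanceEstimator implements SensorEventListener {
    private static final float NS2S = 1.0f / 1_000_000_000.0f; // ns -> s
    private final float[] velocity = new float[2]; // m/s
    private final float[] position = new float[2]; // m
    private long lastTimestamp = 0;

    @Override public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        if (lastTimestamp != 0) {
            float dt = (event.timestamp - lastTimestamp) * NS2S;
            for (int i = 0; i < 2; i++) {            // x and y only; z carries gravity
                velocity[i] += event.values[i] * dt; // m/s^2 * s -> m/s
                position[i] += velocity[i] * dt;     // m/s * s -> m
            }
        }
        lastTimestamp = event.timestamp;
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```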

iOS/Android Device orientation (pitch, yaw, roll). Is it better with accelerometer or gyroscope?

I've been researching for a bit now and now I have to decide which road to take.
My requirements: I need to know the device's orientation relative to true heading (the geographic north pole, not the magnetic pole).
For that I must use the compass, and now I have to decide on the other component: accelerometer or gyroscope.
Since this is new to me, I've spent the last few hours reading Stack Overflow threads and Wikipedia articles, and I am still confused.
I am targeting both platforms (iOS and Android) and I am developing them with Appcelerator Titanium. With Titanium I can easily get accelerometer's values (x,y,z) and trueHeading.
Since the iPhone 3GS does not have a gyroscope, I obviously can't use it on that device. Newer iPhones and Android devices have one.
So the questions are:
Are the accelerometer's XYZ and the compass's trueHeading data enough for me to calculate device pitch, roll, and yaw? It has to be accurate.
Is it more accurate to use trueHeading from the compass together with the gyroscope's values instead of the accelerometer's?
Is it wise to combine both the accelerometer and the gyroscope with trueHeading?
If I take the first road, I don't have to write a Titanium module for fetching the gyroscope data, since Titanium only gives me accelerometer data, and I can also use this on the 3GS.
If I take the second road, I have to write two modules (iOS and Android) to fetch the gyroscope data, and I lose 3GS support.
If I take the third road, I again have to write Titanium modules, and I lose 3GS support.
First of all, if you are writing a new app and don't have a huge installed base of 3GS users, don't worry about old hardware. IMO supporting it doesn't make sense from an economic point of view and only shrinks your number of alternatives in system architecture.
What you are looking for is called sensor fusion. Doing it properly requires some heavy math like Kalman filters etc. The good news is that it exists on iPhone (Core Motion) and AFAIK on Android as well (sounds like it is in Sensor fusion implemented on Android?).
I don't know much about Appcelerator aside from the name and thus cannot say anything about an easy way to use it. Anyway, if it is not implemented on an abstraction layer, I assume Appcelerator provides you with the possibility to make native API calls, so you should be able to embed the native API (after fiddling around for some time ;-).
In general it depends on your requirements. The faster you need an exact result, the more I'd recommend real sensor fusion including all three sensors. If you are fine with a slower response time, a combination of compass and accelerometer will do.
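For the compass + accelerometer road, SensorManager's helper methods do the heavy lifting. A sketch (assuming you can already read both sensors' raw values, e.g. through a Titanium module; the class name is mine):

```java
import android.hardware.SensorManager;

// Deriving yaw/pitch/roll from raw accelerometer + magnetometer readings
// using SensorManager's helper methods (the slower, gyro-free road).
public final class OrientationHelper {
    public static float[] yawPitchRoll(float[] accel, float[] magnetic) {
        float[] rotation = new float[9];
        float[] orientation = new float[3];
        // getRotationMatrix() returns false when the readings are unusable
        // (e.g. free fall); orientation then stays zeroed.
        if (SensorManager.getRotationMatrix(rotation, null, accel, magnetic)) {
            SensorManager.getOrientation(rotation, orientation);
            // orientation[0] = azimuth (yaw), [1] = pitch, [2] = roll, in radians.
            // Azimuth is relative to magnetic north; correct it with the local
            // declination to get true heading.
        }
        return orientation;
    }
}
```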

Android: get sensor values directly from the sensor

Is there a way to root my Android phone and get sensor information directly from the sensor or system?
Best regards, Marcel
I don't really know what you mean by getting the information in a "direct" way.
There are two types of sensors in the Android SDK: sensors representing an actual hardware sensor (e.g. Sensor.TYPE_ACCELEROMETER) and virtual sensors like Sensor.TYPE_LINEAR_ACCELERATION, whose values are calculated from the data of one or more other sensors. Using the sensor types that represent real hardware allows you to read the data delivered by the sensor. The problem with all kinds of sensors in smartphones today is that there might be some preprocessing (e.g. low-pass filtering) involved, and there is little documentation on where and how this is done, as there is often no information on which sensor chip is used in the device. I recommend reading this article on Android sensors.
Hope this helps...
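As a starting point for seeing what is actually on a given device (a small sketch; the class name is mine), you can enumerate everything SensorManager reports, hardware-backed and virtual alike:

```java
import android.hardware.Sensor;
import android.hardware.SensorManager;
import java.util.List;

// Enumerate every sensor the device reports, hardware-backed or virtual.
public final class SensorInventory {
    public static void dump(SensorManager mgr) {
        List<Sensor> sensors = mgr.getSensorList(Sensor.TYPE_ALL);
        for (Sensor s : sensors) {
            System.out.println(s.getType() + " : " + s.getName()
                    + " (" + s.getVendor() + ")");
        }
    }
}
```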
In Android, you have the SensorManager class, which gives you the possibility to work with the sensors on the device (except the GPS and the camera sensor).
To understand how it works, see this:
http://developer.android.com/reference/android/hardware/SensorManager.html
If you have questions about this, feel free to post them.
