Is there a way to root my Android phone and get sensor information directly from the sensor or the system?
Kind regards, Marcel
I don't really know what you mean by getting the information in a "direct" way.
There are two types of sensors in the Android SDK: sensors representing an actual hardware sensor (e.g. Sensor.TYPE_ACCELEROMETER) and virtual sensors like Sensor.TYPE_LINEAR_ACCELERATION, whose values are calculated from the data of one or more physical sensors. Using the sensor types that represent real hardware lets you read the data delivered by the sensor. The problem with all kinds of sensors in today's smartphones is that there may be some preprocessing involved (e.g. low-pass filtering), and there is little documentation on where and how this is done, since there is often no information on which sensor is used in the device. I recommend reading this article on Android sensors.
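As a rough illustration of the difference (a minimal sketch only; the activity name and the SENSOR_DELAY_GAME rate are assumptions for the example), both kinds of sensor are read through the same SensorEventListener callback:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;
    import java.util.Arrays;

    // Hypothetical example: register a hardware-backed sensor and a virtual
    // (fused) sensor and log whatever values the system delivers.
    public class RawVsVirtualActivity extends Activity implements SensorEventListener {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

            // Hardware sensor: raw accelerometer values, gravity included.
            Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);

            // Virtual sensor: values computed from one or more physical sensors.
            Sensor linear = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
            if (linear != null) { // virtual sensors may be missing on some devices
                sm.registerListener(this, linear, SensorManager.SENSOR_DELAY_GAME);
            }
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                Log.d("Sensors", "raw accel: " + Arrays.toString(event.values));
            } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
                Log.d("Sensors", "linear accel: " + Arrays.toString(event.values));
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }

Whether any filtering happens before the values reach onSensorChanged() is up to the vendor's driver, which is exactly the part that is poorly documented.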
Hope this helps...
In Android, you have the SensorManager class, which gives you the ability to work with the sensors on the device (except for GPS and the camera).
To understand how it works, see this:
http://developer.android.com/reference/android/hardware/SensorManager.html
If you have questions about this, feel free to post them.
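For example, a small helper like this (the class name is just for illustration) lists everything SensorManager exposes on a given device; GPS is handled by LocationManager and the camera by the camera APIs, so neither shows up here:

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorManager;
    import android.util.Log;
    import java.util.List;

    public final class SensorLister {

        // Logs every sensor the device exposes through SensorManager.
        public static void logAvailableSensors(Context context) {
            SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            List<Sensor> sensors = sm.getSensorList(Sensor.TYPE_ALL);
            for (Sensor s : sensors) {
                Log.d("Sensors", s.getName() + " (vendor: " + s.getVendor() + ")");
            }
        }
    }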
This may look pretty straightforward, but I have Googled, searched Stack Overflow and gone through the Android Developer guide as well.
How can I access the Hall sensor present in the phone? I know my phone has one and that programmatic access is possible (it came with a flip cover that worked on the stock ROM). I'm trying to extend that functionality to the AEX custom ROM.
I've played around with the various sensors listed in the Sensor class (android.hardware.Sensor), but couldn't find the Hall sensor among them.
So my question: is high-level programmatic access to it possible, is it something kernel-only, or is there a third-party API?
Of course, I know I could use the proximity sensor instead - but accessing the Hall sensor should be simple, right? What obvious thing am I missing?
Isn't it just a magnetic field that you want to detect? Wouldn't a magnetometer give you the values you need?
Maybe the physical sensor isn't exposed, but a composite one is, i.e. something that will allow you to determine the value based on a collection of readings from other sensors.
In this case, the other sensors could be the Position Sensors;
for example, you could get a reading of the magnetic field strength and orientation to derive the value you need.
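As a rough sketch of that idea (the detector class is hypothetical and the 60 µT threshold is a guess that would need tuning per device; the Earth's field alone is roughly 25-65 µT), you could watch the magnetic field magnitude and treat a large jump as "cover closed":

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.util.Log;

    public class MagnetCoverDetector implements SensorEventListener {

        private static final float COVER_THRESHOLD_UT = 60f; // assumed value, tune per device

        public void start(SensorManager sm) {
            Sensor mag = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
            if (mag != null) {
                sm.registerListener(this, mag, SensorManager.SENSOR_DELAY_UI);
            }
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float x = event.values[0], y = event.values[1], z = event.values[2];
            double magnitude = Math.sqrt(x * x + y * y + z * z); // field strength in uT
            boolean coverLikelyClosed = magnitude > COVER_THRESHOLD_UT;
            Log.d("MagnetCover", "field=" + magnitude + " uT, closed=" + coverLikelyClosed);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }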
I am using an old phone running Android 4.0.4 that has the following sensors (printed using TYPE_ALL in Android Studio):
Accelerometer Sensor
Magnetic Field Sensor
Proximity Sensor
Orientation Sensor
According to the Android documentation on composite sensors, this phone should be able to output linear acceleration; however, when I ask for it, only null is returned. Other simple sensors such as the accelerometer or magnetometer work fine (i.e. the basic sensors that are fused to obtain linear acceleration).
The code so far is pretty basic, it's just printing values.
Any idea why this happens? Is this standard behavior? Aren't all sensors available to a smartphone supposed to be implemented?
Sensors that are specified as hardware/software behave differently depending on the hardware actually available. If https://developer.android.com/guide/topics/sensors/sensors_overview.html#sensors-identify lists a sensor as software, it is the developer's responsibility to use the Android API to fuse the right basic signals into the composite signal you are looking for.
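In practice the first thing to check is whether getDefaultSensor() returns anything for the composite type. A minimal sketch of that check, with an assumed fallback to the raw accelerometer (the helper class is made up for illustration):

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public final class LinearAccelerationHelper {

        // Returns true if the vendor implemented the composite sensor, false if
        // we fell back to the raw accelerometer, in which case the caller has to
        // remove gravity itself (e.g. with a low-pass filter).
        public static boolean registerLinearAcceleration(Context context, SensorEventListener listener) {
            SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            Sensor linear = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
            if (linear != null) {
                sm.registerListener(listener, linear, SensorManager.SENSOR_DELAY_GAME);
                return true;
            }
            Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            sm.registerListener(listener, accel, SensorManager.SENSOR_DELAY_GAME);
            return false;
        }
    }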
I'm working on sensor fusion with the accelerometer, gyroscope and magnetic field sensors on Android. Thanks to SensorManager, I can be notified of each new value from these sensors.
In practice, and this is the case for my Nexus 5 (I'm not sure about other Android devices), acceleration, rotation rate and magnetic field are sampled at the same time. We can verify this using event.timestamp.
On other systems (like iOS, xSens, ...), the sensor SDK provides a notification with these three vectors at the same time.
Of course, when I receive acceleration(t), I can write a few lines of code with arrays to wait for rotationRate(t) and magneticField(t). But if there is a way to access these three vectors together directly, it would be very interesting to know!
Another question related to sensor data:
Is there any guidance from the Android team to device manufacturers about providing data in chronological order?
Thank you,
Thibaud
Short answer, no, Android doesn't provide a way to get all the sensor readings as it reads them.
Furthermore, the behavior that you've observed with SensorManager (namely, that readings from different sensors happen to have the same timestamp, suggesting they were read together) should not be relied upon. No documentation guarantees this behavior (it is also likely a quirk of your particular test and update configuration), so relying on it could come back to bite you in a future update, and trying to take advantage of it is likely much more difficult to get right, or fast, than the approach I outline below.
Generally, unless all results are generated by the same sensor, it is impossible to get them all "at the same time". Furthermore, just about all the sensors are noisy so you'd already need to do some smoothing if you read them as fast as possible.
So, what you could do is sample them fairly quickly, then at specific intervals report the latest sample from all sensors (or a smoothed value that accounts for the delta between sample time and report time). This is a trivial amount of extra code, especially if you're already smoothing noisy sensor data.
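A minimal sketch of that idea (the class name and the 20 ms reporting interval are assumptions, and the smoothing step is left out for brevity): sample each sensor as it arrives, keep only the latest vector, and report all three together on a fixed schedule.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Handler;
    import android.os.Looper;
    import android.util.Log;

    public class SnapshotReporter implements SensorEventListener {

        private static final long REPORT_INTERVAL_MS = 20; // assumed 50 Hz reporting rate

        private final float[] accel = new float[3];
        private final float[] gyro = new float[3];
        private final float[] mag = new float[3];
        private final Handler handler = new Handler(Looper.getMainLooper());

        private final Runnable reportTask = new Runnable() {
            @Override
            public void run() {
                // All three "latest" vectors are reported at the same instant.
                Log.d("Fusion", "accel=" + accel[0] + "," + accel[1] + "," + accel[2]
                        + " gyro=" + gyro[0] + "," + gyro[1] + "," + gyro[2]
                        + " mag=" + mag[0] + "," + mag[1] + "," + mag[2]);
                handler.postDelayed(this, REPORT_INTERVAL_MS);
            }
        };

        public void start(SensorManager sm) {
            register(sm, Sensor.TYPE_ACCELEROMETER);
            register(sm, Sensor.TYPE_GYROSCOPE);
            register(sm, Sensor.TYPE_MAGNETIC_FIELD);
            handler.postDelayed(reportTask, REPORT_INTERVAL_MS);
        }

        private void register(SensorManager sm, int type) {
            Sensor s = sm.getDefaultSensor(type);
            if (s != null) {
                sm.registerListener(this, s, SensorManager.SENSOR_DELAY_GAME);
            }
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            switch (event.sensor.getType()) {
                case Sensor.TYPE_ACCELEROMETER:  System.arraycopy(event.values, 0, accel, 0, 3); break;
                case Sensor.TYPE_GYROSCOPE:      System.arraycopy(event.values, 0, gyro, 0, 3);  break;
                case Sensor.TYPE_MAGNETIC_FIELD: System.arraycopy(event.values, 0, mag, 0, 3);   break;
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }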
There is a workaround for this particular problem. When multiple registered listeners are present in an activity at the same time, the timestamps for those events may be misleading. But you can add multiple Fragment objects to that activity, each with its own context, and listen to each sensor in a separate fragment. With this approach the sensor reading timestamps become reliable.
Or listen in parallel threads if you know about concurrency...
I am trying to determine the benefit of using Android's linear acceleration data as opposed to simply applying a low-pass filter, as presented in Android's API reference and discussed in this other Stack Overflow question.
I am asking because I am trying to get hold of a free app that records linear acceleration (and also fulfils my other requirements: sampling rate, writing data to file, etc.). I haven't been able to find one, so I have considered just using an app that records with the standard accelerometer and then simply applying the low-pass filter to the data. Alternatively, I could write my own app to do what I need, but I don't have much experience in Android development and this would take some time.
I have explored this subject at some length and I may be able to help point you in the right direction.
As others have mentioned, only some phones have TYPE_LINEAR_ACCELERATION and TYPE_GRAVITY implemented, and they are usually the ones equipped with a gyroscope. The Droid Razr even has a gyroscope, but they never bothered to implement it or TYPE_LINEAR_ACCELERATION. I believe the GS2 has TYPE_LINEAR_ACCELERATION implemented but no gyroscope, so they must have used the magnetic sensor or some sort of low-pass filter. It can be frustrating.
On most phones with a gyroscope there is some sort of fusion between the acceleration sensor and the gyroscope (probably a complementary filter to compensate for drift, and then quaternions or Cardan angles to isolate gravity). These fusions and filters can be implemented differently, use different hardware, etc., so latency and accuracy are going to vary among devices and TYPE_LINEAR_ACCELERATION isn't always going to produce the same results.
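For reference, in its simplest one-axis form a complementary filter looks roughly like this (a sketch only; ALPHA is an assumed tuning constant, not a value taken from any vendor's implementation):

    // Trusts the gyroscope at high frequency (accurate but drifting) and the
    // accelerometer-derived tilt at low frequency (noisy but drift-free).
    public final class ComplementaryFilter {

        private static final float ALPHA = 0.98f; // assumed tuning constant

        private float angleRad = 0f; // estimated tilt around one axis

        // gyroRateRadPerSec: angular speed for this axis from TYPE_GYROSCOPE
        // accelAngleRad: tilt computed from gravity, e.g. atan2(accelY, accelZ)
        // dtSeconds: time between samples, from event.timestamp deltas
        public float update(float gyroRateRadPerSec, float accelAngleRad, float dtSeconds) {
            angleRad = ALPHA * (angleRad + gyroRateRadPerSec * dtSeconds)
                    + (1f - ALPHA) * accelAngleRad;
            return angleRad;
        }
    }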
If you do not have a phone with TYPE_LINEAR_ACCELERATION, you are stuck with TYPE_ACCELEROMETER, which cannot separate gravity (tilt) from linear acceleration.
One option is to apply the low-pass filter. This may or may not work depending on your application. I have written a free application to help developers and other interested parties explore the low-pass filter option.
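The low-pass route is essentially the idea shown in the Android SensorEvent documentation: a filtered copy of the accelerometer signal approximates gravity, and subtracting it leaves linear acceleration. A minimal sketch (ALPHA is a tuning constant that trades lag against smoothness):

    public final class GravityFilter {

        private static final float ALPHA = 0.8f; // assumed smoothing factor

        private final float[] gravity = new float[3];
        private final float[] linear = new float[3];

        // Call with event.values from a TYPE_ACCELEROMETER event.
        public float[] filter(float[] accel) {
            for (int i = 0; i < 3; i++) {
                // Low-pass: the slowly changing part of the signal is gravity.
                gravity[i] = ALPHA * gravity[i] + (1f - ALPHA) * accel[i];
                // Subtracting it leaves the linear acceleration.
                linear[i] = accel[i] - gravity[i];
            }
            return linear;
        }
    }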
Another option is to measure the tilt of the device while it is static and then apply that gravity measurement while the device is in motion. If the device isn't changing orientation often, this can be an excellent option because it is really fast and simple.
An excellent alternative sensor fusion option is to use the magnetic sensor instead of a gyroscope. This option will work on almost all devices, assuming the magnetic field isn't affected by hard or soft iron distortions.
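A sketch of that gyro-free route using the two static SensorManager helpers (the wrapper class is just for illustration; accel and mag are the latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD vectors):

    import android.hardware.SensorManager;

    public final class MagAccelOrientation {

        private final float[] rotationMatrix = new float[9];
        private final float[] inclinationMatrix = new float[9];
        private final float[] orientation = new float[3]; // azimuth, pitch, roll in radians

        public float[] compute(float[] accel, float[] mag) {
            boolean ok = SensorManager.getRotationMatrix(rotationMatrix, inclinationMatrix, accel, mag);
            if (!ok) {
                return null; // e.g. device in free fall or unusable field readings
            }
            return SensorManager.getOrientation(rotationMatrix, orientation);
        }
    }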
I have implemented all of these approaches in the open source project Acceleration Explorer.
I am a newbie in Android.
I just wondered how the gyroscope works in Android. Is it some piece of hardware mounted inside the device, or what exactly is it?
Also, I would like to know if the gyroscope and the accelerometer sensor are related in any way.
Thanks in advance!
As for your first question, Google "gyro MEMS". For example, you will find An Overview of MEMS Inertial Sensing Technology.
Your second question is answered here.
A gyroscope is a device that measures the rate of rotation (angular speed), while an accelerometer measures the change in velocity (acceleration) along the x, y and z axes.
These devices have internal mechanisms (made up of capacitors, filters, ADC components and an interface such as I2C or SPI) that carry their analog outputs into the digital world, where they are interpreted by software (sensor fusion libraries).
In the device (phone or tablet) they may or may not be present as separate modules (chips), but they are certainly connected to the MCU of the phone, either directly or via some target MCU (to which they are connected over an interface like I2C or SPI).
No, gyroscopes and accelerometers are not related in any way, but their data, when combined according to certain algorithms (which is what sensor fusion libraries are made of), gives the orientation of the device and may, for example, help flip the screen based on their output.
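To make the "rate of rotation" point concrete, here is a small sketch (the class is hypothetical, not part of any library) of how gyroscope output is typically turned into an angle by integrating it over the event timestamps:

    public final class GyroIntegrator {

        private static final float NS_TO_S = 1.0f / 1.0e9f; // event.timestamp is in nanoseconds

        private final float[] angleRad = new float[3]; // accumulated rotation per axis
        private long lastTimestampNs = 0;

        // Call with event.values and event.timestamp from a TYPE_GYROSCOPE event.
        public float[] integrate(float[] rateRadPerSec, long timestampNs) {
            if (lastTimestampNs != 0) {
                float dt = (timestampNs - lastTimestampNs) * NS_TO_S;
                for (int i = 0; i < 3; i++) {
                    angleRad[i] += rateRadPerSec[i] * dt; // angular speed * time = angle change
                }
            }
            lastTimestampNs = timestampNs;
            return angleRad;
        }
    }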
-Rp