What I am trying to do:
I would like to detect a magnet using my Android device.
What I have done so far:
I register the magnetic field sensor like this:
Sensor magnetometer = mSensorManager
.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
and in the onSensorChanged(SensorEvent event) method I read x, y, z from event.values when the sensor type is TYPE_MAGNETIC_FIELD.
Up to this point everything is fine. Then I compute the magnitude of these three values using
Math.sqrt((x*x)+(y*y)+(z*z)), which gives me one absolute value.
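For reference, a minimal sketch of the listener described above; the class name is mine, not from the original code:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class MagnitudeListener implements SensorEventListener {

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            float x = event.values[0];
            float y = event.values[1];
            float z = event.values[2];
            // Magnitude of the magnetic field vector, in microtesla (uT)
            double magnitude = Math.sqrt(x * x + y * y + z * z);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this measurement
    }
}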
But I think this only reflects the device itself, because when I bring a magnet close to my device nothing happens (neither the magnitude nor the (x, y, z) values change).
Thanks.
I'd guess your device is not measuring your magnet but the Earth's magnetic field. Independent of the orientation of your device, when you combine the values this way you always get back roughly the same value. The exception is if the returned value is always 0; then you are simply measuring nothing and it could be a hardware failure.
I am a newbie at Android development too and was interested in creating a similar app, where I get notified when a magnet is brought sufficiently close to my phone.
I used the same formula for calculating the net magnetic field, and if you refer to the Android documentation on SensorManager here, you will find the MAX and MIN values for the Earth's magnetic field.
My app uses that range to detect a magnet and it works fine. Depending upon the "proximity" you require, you can tweak the values as you wish.
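A minimal sketch of that check, assuming x, y, z come from event.values as in the question; treating anything above the Earth's documented maximum as a magnet is only a starting point to tune:

import android.hardware.SensorManager;

// Treat any field stronger than the documented maximum of the Earth's
// magnetic field as evidence that a magnet is nearby.
static boolean isMagnetNearby(float x, float y, float z) {
    float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
    return magnitude > SensorManager.MAGNETIC_FIELD_EARTH_MAX;
}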
If you don't notice any considerable difference in field values upon bringing a magnet closer, I think it is a hardware issue.
Related
As you all know, when a compass comes near a magnetic field its value gets disturbed and drifts away from its actual value. I have been working on a project in which I have to get the device rotation with 95% accuracy. I have tried a number of sensors that can be used to get the device angle, but all are affected by external factors that make them unusable in my case.
For example, when I use TYPE_MAGNETIC_FIELD and come close to anything magnetic, electric wires, or even my laptop, its value accumulates error.
When I use TYPE_ORIENTATION (I know it's deprecated), its value accumulates drift after some time, which makes it unreliable for me to use.
I have also combined some other sensors to try to make them behave like a compass without the magnetic field, but all of them either become unreliable near a magnetic field or accumulate drift after some time.
I have read some research papers from different research institutes, but I don't think anyone has ever produced a rotation angle with 95% accuracy. I want to know if there is any way, even with an AI algorithm, to achieve this. The end goal is to get a device angle that is more than 95% accurate.
I am building an Android app (for version 8 currently) that uses the magnetic compass. In the application, I have added a check to confirm that the compass is currently giving accurate results and, if it is not, the user gets prompted to perform a 'figure of 8' calibration sequence.
In order to properly test this, I would like to UNcalibrate the compass, so that I can check that the calibration code is actually working correctly. Is there any way to do this reliably?
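For context, a minimal sketch of the kind of accuracy check described above, built on the accuracy status the sensor framework reports inside your SensorEventListener; the prompt method is a placeholder for your own UI:

import android.hardware.Sensor;
import android.hardware.SensorManager;

// The framework reports a per-sensor accuracy status; low or unreliable
// readings are the usual trigger for the figure-of-8 prompt.
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    if (sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD
            && accuracy <= SensorManager.SENSOR_STATUS_ACCURACY_LOW) {
        promptFigureOfEightCalibration(); // placeholder for your own UI
    }
}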
Related: compass calibration, although that refers to a long-superseded version of Android.
If you allow the device to sit motionless for a while (with your app not reading the compass), that may do it.
You also might bring something magnetic close to it to throw the magnetometer off.
My understanding of "calibration" is as follows: the hardware (or low-level drivers) knows that the Earth's magnetic field is uniform, hence the length of the resulting magnetic vector should be constant regardless of device orientation. If there is a second magnetic field that rotates with the device, the sum of the two fields will not be constant; they detect this and mathematically subtract out the second field by analyzing how the summed field is distorted. Waving the device in a figure of 8 provides enough sample data at various orientations to accomplish this.
We have a project where we just switched our sensor type from TYPE_ROTATION_VECTOR to TYPE_GAME_ROTATION_VECTOR.
According to Google's docs:
Identical to TYPE_ROTATION_VECTOR except that it doesn't use the geomagnetic field. Therefore the Y axis doesn't point north, but instead to some other reference, that reference is allowed to drift by the same order of magnitude as the gyroscope drift around the Z axis.
My question is (being an Android noob): how can one 'calibrate' that reference to the device's current orientation, meaning the orientation the device is in before onSensorChanged comes into play?
What we need is orientation data relative to a reference frame, that reference frame being the device's initial orientation in space (so deltas are needed, not absolute rotations).
Any help is highly appreciated; I'm mainly an iOS developer and this is all of one line of code there :S
How can one 'calibrate' that reference to the device's current orientation, meaning the orientation the device is in before onSensorChanged comes into play?
You could literally do the maths. Measure the yaw/heading at the start, and compare it to your current value over time. This number wouldn't be affected by magnetic interference, unlike TYPE_ROTATION_VECTOR.
The trade-off is that your reference can drift.
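A minimal sketch of that idea inside your SensorEventListener, assuming you only care about yaw (heading); the field names are illustrative:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorManager;

// Record the heading from the first TYPE_GAME_ROTATION_VECTOR event and
// report later headings relative to it, so the start counts as zero.
private float initialAzimuth = Float.NaN;

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() != Sensor.TYPE_GAME_ROTATION_VECTOR) return;

    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    SensorManager.getOrientation(rotationMatrix, orientation);

    float azimuth = orientation[0]; // radians, against the sensor's arbitrary reference
    if (Float.isNaN(initialAzimuth)) {
        initialAzimuth = azimuth;   // treat the starting orientation as zero
    }
    float deltaFromStart = azimuth - initialAzimuth;
}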
Just answering this question because I was researching TYPE_GAME_ROTATION_VECTOR.
PS: I would guess the reason you'd switch to TYPE_GAME_ROTATION_VECTOR is to avoid magnetic interference from the surroundings. People usually play games indoors, where hard/soft iron can interfere with the compass.
TYPE_GAME_ROTATION_VECTOR docs
TYPE_ROTATION_VECTOR docs
I'm developing a tool which receives motion sensor data and sends it to a machine learning algorithm, which ultimately will deduce different types of movement.
I read the Motion sensor guide and it seems like there is some redundancy in the data you can get from the sensors. For example: the accelerometer data includes gravity, while the linear acceleration data shows acceleration with the gravity component removed.
So my question is: do I really need all the sensors to capture all forms of motion, or can I give up some of them?
EDIT: (clarifying the question)
I want to collect the minimal data that will allow me to deduce the same things. What I'm looking for is user behavior: the angle at which the user holds the phone, the way the user moves while using the phone, etc.
The answer I'm looking for should identify sets of sensors that are highly correlated, such that only some of the sensors in a set are required to deduce the same type of motion/movement/rotation/acceleration/etc.
The term "Motion" in the question have no precise meaning. So I answer more generally.
"The way one holds his phone" is nothing but the orientation of the phone.There are three sensors which individually tells the orientation of the phone.
Accelerometer sensor
Orientation sensor
Rotation Vector sensor
Among them, only the accelerometer is a physical sensor; the other two are virtual sensors (they don't have a dedicated piece of hardware; they use accelerometer data and report the orientation in different formats).
The orientation sensor is deprecated so you can't use it.
The rotation vector sensor reports the orientation encoded as a quaternion. If your code is based on quaternions, normalize the sensor output using SensorManager.getQuaternionFromVector() and continue. If your code is based on rotation matrices, obtain the rotation matrix by calling SensorManager.getRotationMatrixFromVector(), passing the sensor output, and continue. If you want the orientation angles alone, get them by calling SensorManager.getOrientation(), passing the rotation matrix obtained previously.
Using the accelerometer sensor we can find the orientation, but the recommended approach is to combine it with the magnetic field sensor output. Call SensorManager.getRotationMatrix(), passing the accelerometer output and the magnetic field sensor output, to get the rotation matrix. If your code is based on rotation matrices, just continue. If you want the orientation angles alone, get them by calling SensorManager.getOrientation(), passing the rotation matrix obtained previously. If your code is based on quaternions, call SensorManager.getQuaternionFromVector(), passing the rotation vector obtained previously.
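A minimal sketch of the accelerometer-plus-magnetometer path just described (the rotation vector path is the same from SensorManager.getOrientation() onwards); accelValues and magValues stand for the latest event.values from the two sensors:

import android.hardware.SensorManager;

float[] rotationMatrix = new float[9];
float[] orientation = new float[3];
// Combine the latest accelerometer and magnetic field readings.
if (SensorManager.getRotationMatrix(rotationMatrix, null, accelValues, magValues)) {
    // azimuth, pitch, roll in radians
    SensorManager.getOrientation(rotationMatrix, orientation);
    float azimuth = orientation[0];
    float pitch = orientation[1];
    float roll = orientation[2];
}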
"The way one moves his phone" - Here I consider four motions.
Change of position (simple translation) and rate of change of position (velocity) - no sensor detects them.
Rate of change of velocity (simple acceleration) - the accelerometer detects it, but the reading also contains the gravity component. Normally we need the acceleration without the gravity component. This can be calculated simply as explained here (see the sketch after this list). However, there is another virtual sensor called Linear Acceleration which does the job for us.
Change of orientation (rotation) - whenever the orientation changes, the accelerometer, orientation, and rotation vector sensors report it (the gyroscope also reports it, but that is explained in the next point). How to use these sensors to get the current orientation is explained in the first part of this answer.
Rate of change of orientation (angular velocity) - whenever the orientation changes, the gyroscope sensor reports it. The output is three numbers representing the angular velocity around the x, y, and z axes. The unit is radians per second.
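Regarding point 2 above, a minimal sketch of removing gravity from the raw accelerometer reading with a simple low-pass filter; the smoothing constant 0.8 is a common starting value, not a prescribed one:

// Isolate gravity with a low-pass filter, then subtract it to get
// linear acceleration from the raw accelerometer values.
private final float[] gravity = new float[3];
private final float[] linearAcceleration = new float[3];

void onAccelerometerValues(float[] values) { // values = event.values of TYPE_ACCELEROMETER
    final float alpha = 0.8f;
    for (int i = 0; i < 3; i++) {
        gravity[i] = alpha * gravity[i] + (1 - alpha) * values[i];
        linearAcceleration[i] = values[i] - gravity[i];
    }
}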
The output of the gyroscope is not accurate in the long term and the output of the accelerometer is not accurate in the short term, so combine them to get a steady output. For details see this question.
Now it is clear that at minimum the gyroscope and accelerometer are required. However, using a wider range of sensors minimizes our work.
You can't decide what you get - each sensor's data is already defined, and you get all or nothing. If you look closely, there isn't a place in the public API that would let you ask for specific fields.
To back this up, here's a quote from Google's document explaining sensor types:
An accelerometer sensor reports the acceleration of the device along the 3 sensor axes. The measured acceleration includes both the physical acceleration (change of velocity) and the gravity. The measurement is reported in the x, y and z fields of sensors_event_t.acceleration.
If you look into the Android source, the structs here are strictly defined, and the struct for acceleration contains specific fields. So even if you got 0 in the fields you don't want, you wouldn't gain anything.
But what you're referring to are two things - base sensors, which are roughly equivalent to physical sensors on the device, and composite sensors, which combine readings from various physical sensors to get more useful data.
So while you can't decide what you get for a particular sensor (like "only gravity" or "only acceleration on the Y axis"), composite sensors do give you data that you could compute yourself using only base sensors. For example, linear acceleration is a composition of data from the accelerometer and gyroscope (or magnetic sensor), after some calculations. Similarly, the step detector "sensor" uses only the accelerometer, but interprets the data automatically to just give you an event saying "yes, someone has made a step" with a single value of 1.
If you're feeding raw motion data to some algorithms, I would guess base sensors are what you're looking for. That said, I believe you can still safely register for all sensors (both base and composite ones) that combined give you all data that you need (and maybe more), without meaningful battery impact.
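For instance, a minimal sketch of registering one listener for several base sensors at once; sensorManager and listener are assumed to already exist, and the delay is just a placeholder to adapt:

import android.hardware.Sensor;
import android.hardware.SensorManager;

int[] types = {
        Sensor.TYPE_ACCELEROMETER,
        Sensor.TYPE_GYROSCOPE,
        Sensor.TYPE_MAGNETIC_FIELD
};
for (int type : types) {
    Sensor sensor = sensorManager.getDefaultSensor(type);
    if (sensor != null) { // skip sensors the device doesn't have
        sensorManager.registerListener(listener, sensor, SensorManager.SENSOR_DELAY_GAME);
    }
}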
For more detailed information on each of the sensors, refer to Sensor types on the Android website, and if you're curious, you can read a short summary of the sensor stack as well.
No, you don't need every sensor. Some of the sensors exist as a convenience to the user. Your example of the linear acceleration sensor is one - it gives you the accelerometer results with gravity taken out. You could do this yourself from the raw accelerometer data, but that takes a bit of math (you need to subtract the gravity vector across all 3 axes) and a bit of know-how (did you remember to calibrate the sensor? It may not read 9.8 at rest. For that matter, 9.8 may not be your gravity if you're not at sea level). That's a lot of work that would need to be repeated by each app, so they created a software "sensor" that sits on top of the accelerometer and provides the computed data. It would be unusual for an app to use the raw and linear accelerometers at the same time; generally it's one or the other. The step counter is another example of this - it guesses at what a step is based on the accelerometer data. You also wouldn't want calibrated and uncalibrated gyroscope data at the same time.
As for what you do need - no clue; you don't say enough about what you're trying to do. One warning though: you said you're trying to detect motion. You can't do that. You can detect accelerations and rotation. You cannot detect motion at a constant speed. If you're developing any type of app using these, it pays to use the correct terminology and think in terms of physics and how the physical accelerometer and gyroscope work; otherwise you're going to cause yourself bugs.
I need to implement a shake recognizer, and I am using the accelerometer on the device for that. However, when I check the values I get from the sensor, it appears that they vary wildly from device to device. For instance, I get a value range of 0-8 as force (after some calculations) on one device, and on another 0-4.
So it looks like they have very different ranges.
Is there anything I can do to make these ranges equal? Or are there some values I can use to somehow calculate what a fairly hard shake would be?
According to the specification, the accelerometer should measure "the acceleration force in m/s²", so it should be calibrated. One thing you could check, however, is the Sensor class's getMaximumRange() and getResolution().
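For example, a quick way to inspect those two properties (sensorManager is assumed to already exist; the values differ per device):

import android.hardware.Sensor;
import android.util.Log;

Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
if (accel != null) {
    // Both numbers vary from device to device.
    Log.d("SensorInfo", "max range = " + accel.getMaximumRange()
            + " m/s^2, resolution = " + accel.getResolution());
}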
The physical placement of the chip on the PCB, the way the PCB is secured within the device, and the construction of the device could all lead to different damping effects in response to your shaking input force.
You don't say how you're processing the sensor data; there may well be effects related to sampling and filtering performed at the driver level.
You clearly need to be flexible in your code with the range of values you expect and test on a good range of devices.
The sensor should be calibrated.
Apparently it isn't. If the gain in the different directions (that is, x, y, z) does not differ significantly, then it is enough to look for sudden changes in the squared length of the accelerometer vector, x^2+y^2+z^2 (a sketch follows below).
If the gains are also significantly different then you have no choice but to write an app for accelerometer calibration...
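A minimal sketch of that magnitude-based check; the threshold is illustrative and would need tuning per device:

// Flag a shake when the squared magnitude of the accelerometer vector
// changes abruptly between two consecutive readings.
private static final float THRESHOLD = 50f; // (m/s^2)^2, illustrative only
private float lastMagnitudeSquared = -1f;

boolean isShake(float x, float y, float z) {
    float magnitudeSquared = x * x + y * y + z * z;
    boolean shake = lastMagnitudeSquared >= 0
            && Math.abs(magnitudeSquared - lastMagnitudeSquared) > THRESHOLD;
    lastMagnitudeSquared = magnitudeSquared;
    return shake;
}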
By the way, you are not the first one to report gross inaccuracies, see for example Android: the range of z-value in the accelerometer sensor are different on different devices.