How to detect metal using the magnetic sensor in an Android phone? - android

I want to detect metal using the magnetic sensor values. I am getting values continuously, like x=30.00, y=-20.00, z=-13.00.
Now I want to know how to use these values for detecting any metal (mathematical calculations, formulas).
My code is:
sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
// get the magnetic field sensor (the one the compass uses)
myCompassSensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
// inside onSensorChanged(SensorEvent event): the three values are the
// magnetic field components along the device's x, y and z axes, in µT
float x = Math.round(event.values[0]);
float y = Math.round(event.values[1]);
float z = Math.round(event.values[2]);

To detect metal, you have to check the intensity of the magnetic field, i.e. the magnitude of the magnetic field vector.
float mag = (float) Math.sqrt(x * x + y * y + z * z); // note: ^ is XOR in Java, not power
Then you need to compare this value to the expected magnitude of the magnetic field at your location on Earth. Luckily, Android provides a class to do so: look at GeomagneticField; the reference is here: https://developer.android.com/reference/android/hardware/GeomagneticField.html
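For example, a minimal sketch of computing the expected magnitude (the helper name and location parameters are my own; you would get latitude/longitude/altitude from a location fix). Note that GeomagneticField.getFieldStrength() returns nanoteslas, while the sensor reports microteslas:
// hypothetical helper: expected magnitude (in µT) of the Earth's
// magnetic field at the given location
private float expectedFieldMagnitude(float latitude, float longitude, float altitude) {
    GeomagneticField geoField = new GeomagneticField(
            latitude, longitude, altitude, System.currentTimeMillis());
    // getFieldStrength() returns nanoteslas; divide to get microteslas
    return geoField.getFieldStrength() / 1000f;
}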
Then, if the value you read from the sensor is quite far from the expected value, there is "something" (you guessed it: metal) disturbing the Earth's magnetic field in the vicinity of your sensor. For instance, you could implement a test like the following:
if (mag > 1.4 * expectedMag || mag < 0.6 * expectedMag) {
    // there is a high probability that some metal is close to the sensor
} else {
    // everything is normal
}
You should experiment a bit with the 1.4 and 0.6 thresholds so that they fit your application. Note that this is never going to work 100% of the time, because the magnetic sensors in a smartphone are quite cheap and nasty.

You can detect magnetic fields using the Android magnetic field sensor, not metals as such. But metals that affect the magnetic field, e.g. iron and nickel, will also be detected, because ferrous metals disturb the surrounding field in much the same way as a live electric cable does.

Related

Getting device orientation (yaw, roll and pitch) in Flutter

In Android I can get the device yaw, roll and pitch using a GAME_ROTATION_VECTOR sensor.
I need to do the same thing in Flutter, but I haven't been able to find anything but the sensors package, which only gives access to accelerometer & gyroscope sensors.
What can I do? Do I need to calculate the orientation myself from the accelerometer and gyro?
Don't know if this is still relevant, as the question wasn't closed, but for those seeking the answer... there is now a Flutter Sensors package which does all the magic for you.
You'll have to subscribe to the streams to get the current values of the accelerometers and gyroscopes, of course.
For example:
@override
void initState() {
  super.initState();
  _streamSubscriptions
      .add(accelerometerEvents.listen((AccelerometerEvent event) {
    setState(() {
      _accelerometerValues = <double>[event.x, event.y, event.z];
    });
  }));
  _streamSubscriptions.add(gyroscopeEvents.listen((GyroscopeEvent event) {
    setState(() {
      _gyroscopeValues = <double>[event.x, event.y, event.z];
    });
  }));
  _streamSubscriptions
      .add(userAccelerometerEvents.listen((UserAccelerometerEvent event) {
    setState(() {
      _userAccelerometerValues = <double>[event.x, event.y, event.z];
    });
  }));
}
I would not use the Flutter sensors_plus package to get yaw, roll or pitch, even if you only wanted yaw (a.k.a. heading or azimuth), as shown in this GitHub issue: MagnetometerEvent to compass values. You only get raw magnetometer readings (microteslas), and it is challenging to convert those into roll, pitch and yaw because of the maths involved.
As I wrote in the comment there:
I would highly recommend using
https://pub.dev/packages/flutter_compass, https://pub.dev/packages/motion_sensors or writing your own plugin to
use the OS APIs to get the data you need directly, instead of using
the magnetometer readings (unit: micro Teslas), to allow you to use the
operating-system calculated heading, which could use multiple
sensors (sensor fusion) to give improved accuracy based on the
hardware, as well as compensate for the location (the magnetometer
only allows you to calculate magnetic north, not true north).
motion_sensors provides an OrientationEvent, which contains roll, pitch and yaw, in radians.
Writing your own plugin
If you need roll or pitch (more than yaw/heading/azimuth), the OS APIs provide it:
On Android, in Java (from this SO answer):
// R is the 3x3 rotation matrix, previously filled by
// SensorManager.getRotationMatrix() from accelerometer and
// magnetic field readings
float[] orientationData = new float[3];
SensorManager.getOrientation(R, orientationData);
float azimuth = orientationData[0]; // radians
float pitch = orientationData[1];
float roll = orientationData[2];
On iOS, in Objective-C (from this SO answer):
CMQuaternion quat = self.motionManager.deviceMotion.attitude.quaternion;
myRoll = radiansToDegrees(atan2(2*(quat.y*quat.w - quat.x*quat.z), 1 - 2*quat.y*quat.y - 2*quat.z*quat.z)) ;
myPitch = radiansToDegrees(atan2(2*(quat.x*quat.w + quat.y*quat.z), 1 - 2*quat.x*quat.x - 2*quat.z*quat.z));
myYaw = radiansToDegrees(asin(2*quat.x*quat.y + 2*quat.w*quat.z));

TYPE_LINEAR_ACCELERATION Sensor values changes according to smartphone brand - model

Two days ago I published a game on the Google Play Store that uses motion sensors. It is a boxing game: when the user shakes the smartphone, a score is evaluated automatically. To evaluate the score, it uses the TYPE_LINEAR_ACCELERATION sensor.
The problem is that after I published the game, some users sent me their scores. I saw that on some smartphones it is easy to get 900 points, while on others it is hard to get even 500 points. I mean that if the same user shakes different phones (with the same force), on smartphone X he gets (for example) 400 points and on smartphone Y he gets (for example) 850 points.
Why does this inequality occur?
I understand (guess) that some smartphones evaluate the score lower, while others evaluate it higher.
My implementation (roughly):
sensorManager = (SensorManager) getActivity().getSystemService(Context.SENSOR_SERVICE);
sensorManager.registerListener(this, sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION), SensorManager.SENSOR_DELAY_FASTEST);
-
@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        float[] values = event.values;
        float x = values[0]; // to get X-axis acceleration
        //.......
        //........
    }
}
I can share my application link, if that is allowed; if it is, please say so and I will share it.
Note: to explain my algorithm, to get the score I take the maximum acceleration from the sensor and keep the maximum value in a variable. After the user clicks the "show score" button, I multiply it by 40 to get the score.
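For reference, a minimal sketch of that scoring logic (the field and method names are my own, not from the actual game):
private float maxMagnitude = 0f; // highest shake magnitude seen so far

@Override
public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z);
        maxMagnitude = Math.max(maxMagnitude, magnitude);
    }
}

// called when the user presses the "show score" button
private int getScore() {
    return Math.round(maxMagnitude * 40);
}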
I do not understand your question. TYPE_LINEAR_ACCELERATION is a virtual sensor that gets its data from a combination of the accelerometer and the gyroscope.
Most smartphones do not have exactly the same accelerometer and gyroscope hardware, so obviously the data will differ slightly between devices.
You haven't described how you calculate the score.
I would take the square root over the 3 dimensions and then use the result (the magnitude) as the score:
score = getMagnitude(values);
and the function:
private float getMagnitude(float[] v) {
    // Math.sqrt() returns a double, so cast the result back to float
    return (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
}
I found my problem. I am using Sensor.TYPE_LINEAR_ACCELERATION, so it uses the gyroscope + accelerometer sensors.
What is the operating logic of the TYPE_LINEAR_ACCELERATION sensor?
It uses the accelerometer and gyroscope sensors, following a formula like:
linear acceleration = k1 × accelerometer + k2 × gyroscope
where k1 and k2 are constant values.
For the gyroscope there is (I think) no maximum value, but for the accelerometer different smartphones use different maximum ranges.
These maximum values are ±g (9.8 m/s²), ±2g, ±4g and ±8g.
Finally, different smartphones show different results because users usually reach the maximum value when they shake their smartphone.
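If that is the case, one mitigation (a sketch; untested, and the variable names are my own) is to query each device's actual range with Sensor.getMaximumRange() and normalize the measured magnitude before scoring:
Sensor linAcc = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
// the maximum value this particular device can report, in m/s^2
float maxRange = linAcc.getMaximumRange();
// normalize the measured magnitude to 0..1 so scores are comparable across devices
float normalized = Math.min(magnitude, maxRange) / maxRange;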

What sensor can be used to detect rotation when upright?

Hi, I am creating an application in which the user holds the phone upright and then rotates it around the y-axis (similar to taking a panorama).
I need to detect the angle of rotation. In iOS this was fairly simple with the gyroscope sensor, but I am not having the same luck with Android. If anyone could point me in the right direction, that would be great.
Assuming your Y axis points to the center of the Earth, the value you are looking for is called the azimuth.
To monitor its change you will need to register a listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD events:
mngr = (SensorManager)getSystemService(Context.SENSOR_SERVICE);
accelerometer = mngr.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
magneticField = mngr.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
int rate = SensorManager.SENSOR_DELAY_GAME; // or other
mngr.registerListener(sensorListener, accelerometer, rate);
mngr.registerListener(sensorListener, magneticField, rate);
And within the listener, call:
// R must first be filled by SensorManager.getRotationMatrix(), using the
// latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings
float[] values = new float[3];
SensorManager.getOrientation(R, values);
float current_azimuth_val = values[0]; // <----------
Note that the quality, and latency, of the data you will obtain is highly hardware dependent.
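Putting it together, a minimal listener sketch (the caching of the two sensors' latest readings is my own glue code, not from the original answer):
private float[] accelReadings;
private float[] magneticReadings;

private final SensorEventListener sensorListener = new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // cache the latest reading from each sensor
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
            accelReadings = event.values.clone();
        else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
            magneticReadings = event.values.clone();
        if (accelReadings == null || magneticReadings == null) return;

        float[] R = new float[9];
        // getRotationMatrix() returns false when the readings are unusable
        if (SensorManager.getRotationMatrix(R, null, accelReadings, magneticReadings)) {
            float[] values = new float[3];
            SensorManager.getOrientation(R, values);
            float azimuth = values[0]; // radians, relative to magnetic north
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
};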
There are various sensors available that can be managed through a SensorManager. Of course, since every manufacturer decides whether or not to put a particular sensor on the hardware platform of each model, you have to check whether one exists. Some devices have a gyro like iOS; on others the same job can be done with the accelerometer and magnetometer sensors instead.
You can get started here: http://developer.android.com/guide/topics/sensors/sensors_overview.html

Android sensor: getRotationMatrix() returns wrong values, why?

It's been several days since I started using this function, and I have not yet succeeded in obtaining valid results.
What I want is basically to convert the acceleration vector from the device's coordinate system to the real-world coordinate system. I know that this should be possible, because I have the acceleration in relative coordinates and I know the orientation of the device in the world system.
Reading the Android developer documentation, it seems that using getRotationMatrix() I get R = the rotation matrix.
So if I want A (the acceleration vector in the world system) from A' (the acceleration vector in the phone system), I must simply do:
A = R * A'
But I can't understand why the vector A ALWAYS has its first and second components at zero (example: +0.00, -0.00, +6.43).
My current code is similar to this:
public void onSensorChanged(SensorEvent event) {
    synchronized (this) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                accelerometervalues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                geomagneticmatrix = event.values.clone();
                break;
        }
        if (geomagneticmatrix != null && accelerometervalues != null) {
            float[] Rs = new float[16];
            float[] I = new float[16];
            SensorManager.getRotationMatrix(Rs, I, accelerometervalues, geomagneticmatrix);
            float[] resultVec = new float[4];
            float[] relativacc = new float[4];
            relativacc[0] = accelerationvalues[0];
            relativacc[1] = accelerationvalues[1];
            relativacc[2] = accelerationvalues[2];
            relativacc[3] = 0;
            Matrix.multiplyMV(resultVec, 0, Rs, 0, relativacc, 0);
            // resultVec[] is the acceleration vector relative to the world
            // coordinate system... but it doesn't WORK!!!!!
        }
    }
}
This question is very similar to this one: Transforming accelerometer's data from device's coordinates to real world coordinates, but I can't find the solution there... I have tried everything.
Please help me, I need help!!!
UPDATE:
My code is now as below; I have tried writing out the matrix product explicitly, but nothing changes:
float[] Rs = new float[9];
float[] I = new float[9];
SensorManager.getRotationMatrix(Rs, I, accelerationvalues, geomagneticmatrix);
float resultVec[] = new float[4];
resultVec[0]=Rs[0]*accelerationvalues[0]+Rs[1]*accelerationvalues[1]+Rs[2]*accelerationvalues[2];
resultVec[1]=Rs[3]*accelerationvalues[0]+Rs[4]*accelerationvalues[1]+Rs[5]*accelerationvalues[2];
resultVec[2]=Rs[6]*accelerationvalues[0]+Rs[7]*accelerationvalues[1]+Rs[8]*accelerationvalues[2];
Here is an example of the data read and the result (values separated by spaces):
Rs: Rs[0] Rs[1] ... Rs[8]
Av: accelerationvalues[0] ... accelerationvalues[2]
rV: resultVec[0] ... resultVec[2]
As you can see, the x and y components in real-world coordinates are (around) zero even if you move the phone quickly, whereas the relative acceleration vector correctly detects every movement!!!
SOLUTION
The errors in the numbers come from float multiplication, which is not the same as double multiplication.
Added to this, the rotation matrix is not constant while the phone is accelerating, even if its orientation stays the same.
So it is impossible to translate the acceleration vector to absolute coordinates during motion...
It's hard, but that's the reality.
Finally I found the answer:
The errors in the numbers come from float multiplication, which is not the same as double multiplication. Here is the solution.
Added to this, the rotation matrix is not constant if the phone is accelerating, even with the same orientation. So it is impossible to translate the acceleration vector to absolute coordinates during motion... It's hard, but that's the reality.
FYI, the orientation vector is built from magnetometer data AND the gravity vector. This causes a circular problem: converting the relative acceleration needs the orientation, which needs the magnetic field AND gravity, but we only know gravity (from the relative acceleration) when the phone is at rest... so we are back where we started.
This is confirmed in the Android Developers documentation, where it is explained that the rotation matrix gives true results only when the phone is not accelerating (they mention free fall, where in fact there is no gravity measurement) and is not in an irregular magnetic field:
The matrices returned by this function are meaningful only when the
device is not free-falling and it is not close to the magnetic north.
If the device is accelerating, or placed into a strong magnetic field,
the returned matrices may be inaccurate.
In other words: completely useless...
You can verify this yourself with a simple experiment on a table, using an app such as Android Sensor or something similar.
You must track down this arithmetic error before you worry about rotation, acceleration or anything else.
You have confirmed that
resultVec[0]=Rs[0]*accelerationvalues[0];
gives you
Rs[0]: 0.24105562
accelerationValues[0]: 6.891896
resultVec[0]: 1.1920929E-7
So once again, simplify. Try this:
Rs[0] = 0.2;
resultVec[0] = Rs[0] * 6.8
EDIT:
The last one gave resultVec[0]=1.36, so let's try this:
Rs[0] = 0.2;
accelerationValues[0] = 6.8
resultVec[0] = Rs[0] * accelerationValues[0];
If you do the sums using the printed values you have appended, I get `(0.00112, -0.0004, 10)`, which is not as small as what you have. Therefore there is an arithmetic error!
Could the problem be that you are using accelerationvalues[] in the last block, and accelerometervalues[] later?
I have developed several applications that make use of Android sensors, so I will answer one of your questions according to my experience:
But I can't understand why the vector A ALWAYS has its first and second components at zero (example: +0.00, -0.00, +6.43)
I have observed this problem with the acceleration sensor and the magnetic field sensor, too. The readings are zero for some of the axes (two, as you point out, or just one on other occasions). This problem happens when you have just enabled the sensors (registerListener()), and I assume it is related to some kind of sensor initialization.
In the case of the acceleration sensor, I have observed that just a small shake of the device makes it start giving correct readings.
The correct solution would be for the method onAccuracyChanged() to give correct information about the sensor state. It should return a status of SensorManager.SENSOR_STATUS_UNRELIABLE, but instead it permanently returns SensorManager.SENSOR_STATUS_ACCURACY_HIGH on all physical devices I have tested so far. With onAccuracyChanged() properly implemented, you could ignore bad readings or ask the user to wait while the sensor is being initialized.
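For completeness, here is a sketch of how onAccuracyChanged() could be used if the device reported its status honestly (the flag name is my own):
private boolean sensorReliable = true;

@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
    // remember whether the sensor currently reports usable data
    sensorReliable = (accuracy != SensorManager.SENSOR_STATUS_UNRELIABLE);
}

@Override
public void onSensorChanged(SensorEvent event) {
    if (!sensorReliable) return; // skip readings while the sensor initializes
    // ... process event.values as usual ...
}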

How to make an accurate compass on android

My Android application shows the direction of a particular place in the world, and therefore it needs to get the compass bearing in degrees.
This is the code I've been using to calculate the bearing:
public void getDirection() {
    mySensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    // note: Sensor.TYPE_ORIENTATION is deprecated
    List<Sensor> mySensors = mySensorManager.getSensorList(Sensor.TYPE_ORIENTATION);
    if (mySensors.size() > 0) {
        mySensorManager.registerListener(mySensorEventListener, mySensors.get(0), SensorManager.SENSOR_DELAY_UI);
    } else {
        TextView alert = (TextView) findViewById(R.id.instruct);
        alert.setText(getString(R.string.direction_not_found));
        myCompassView.setVisibility(View.INVISIBLE);
    }
}
private SensorEventListener mySensorEventListener = new SensorEventListener() {
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // TODO Auto-generated method stub
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        compassBearing = (float) event.values[0];
        float bearing = compassBearing - templeBearing;
        if (bearing < 0)
            bearing = 360 + bearing;
        myCompassView.updateDirection(bearing);
    }
};
This method usually works, but sometimes it just gets north wrong. What do I have to do to get a more accurate heading?
I have a couple of suggestions for you:
1) Your device may not be calibrated. In order to do that, move it around in a figure-of-eight motion (see this). If you don't know whether your device is calibrated or not, do some tests by pointing the device at a known cardinal direction and comparing the values. Typically, if a device is not calibrated, you will see large variations in the azimuth value for small rotations. That is what I would be worried about.
Now, don't forget that the sensor gives you the bearing to magnetic north, not true north! This difference is known as the declination of the magnetic field, and its value changes from place to place and from time to time due to changes in the Earth's magnetic field. This app can compute some of the values for you with relative accuracy. I wouldn't be too worried about this, as the declination is typically small, but you might be after good precision (where I live, the declination is currently 3º).
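If you do need true north, Android can compute the declination for you via GeomagneticField; a minimal sketch, assuming you have a recent location fix and a magneticHeading value in degrees:
GeomagneticField geoField = new GeomagneticField(
        (float) location.getLatitude(),
        (float) location.getLongitude(),
        (float) location.getAltitude(),
        System.currentTimeMillis());
// the declination is the offset (in degrees) between magnetic and true north
float trueHeading = magneticHeading + geoField.getDeclination();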
2) Stay away from metal objects or anything that generates a strong magnetic field. For example, don't run tests with your phone near a computer or any physical keyboard! This is pure poison for testing compass geolocation. Some apps can measure the intensity of the magnetic field (if the device supports it): when you get close to metal objects, you will see higher values and strong changes in direction. For fun, there are also some "metal detectors": this app recognises changes in the magnetic field and vibrates when you are close to a "metal object" or anything that magnetically interferes with the device.
3) Remember to update the bearing when you tilt your device into landscape mode (this article is a must-read!). This is because the azimuth value is based on rotation around the axis perpendicular to the plane of the phone; when you rotate the device into landscape, this value changes by +/-90º! This is not resolved by disabling landscape mode for the application: you have to handle it programmatically by analysing rotations around the other two axes (pitch and roll). This is not trivial, but there are examples around the net; one common approach is sketched below.
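The usual tool for this is SensorManager.remapCoordinateSystem(). A minimal sketch (the axis choice here assumes an activity displayed in landscape with the device rotated 90°; inR is the matrix from getRotationMatrix()):
float[] remappedR = new float[9];
// swap the axes before extracting the orientation angles
SensorManager.remapCoordinateSystem(inR,
        SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, remappedR);
float[] values = new float[3];
SensorManager.getOrientation(remappedR, values);
float azimuth = values[0]; // now consistent in landscape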
Edit: if you are interested in some code, check out Mixare; it is an open-source augmented reality framework for Android under the GPLv3. Take a look at their code regarding orientation, compass geolocation and bearing.
PS: I don't have any sort of connection with the creators of the mentioned applications.

Categories

Resources