Android MotionEvent pressure unit of measurement

I am using the getPressure(index) method of a MotionEvent instance to get the pressure applied to the screen.
I am trying to figure out how to convert that value to at least an approximation of a standard measurement unit.
In Android the pressure value is a float ranging from 0 to 1, and I need to express it in newtons in some way.
From what I understood this differs across devices, so it is not possible to get a really precise measurement, but I am fine with an approximation.
For example, what force in newtons is typical for a stylus pressing the screen at full force (the device reporting 1.0f of pressure)?

I think you can only guess, and know that the results will be affected by huge uncertainty. Solutions I see:
Put an object of an appropriate, known weight on the screen. I don't know about all screens, but if yours needs human skin to trigger the event, you can put your finger on the screen (applying no force of your own) and then place the object on your finger.
Take a stylus, and by debugging learn how much force you need to get a 0.5f result. Then take a scale (the tool that measures weight) and apply the same pressure to it with the stylus, and read the result.
In both cases, you end up with a single calibration point (e.g., 0.5f -> 10 N), and can then assume a linear dependency (knowing also that 0f -> 0 N) to fill the whole range.
With some patience you can fill in more values too; I would not actually expect the relation to be linear.
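If it helps, here is a rough Java sketch of that calibration idea, using a small table of measured points and piecewise-linear interpolation between them. The calibration values (0.5f -> 10 N, 1.0f -> 20 N) are placeholders, not real measurements; you would fill them in from your own experiments with a scale and a stylus.

import java.util.TreeMap;

public class PressureToForce {
    // Calibration table: raw pressure (0..1) -> measured force in newtons.
    // All entries below are hypothetical and must come from your own calibration.
    private final TreeMap<Float, Float> calibration = new TreeMap<>();

    public PressureToForce() {
        calibration.put(0.0f, 0.0f);   // no touch, no force
        calibration.put(0.5f, 10.0f);  // example point from the answer above
        calibration.put(1.0f, 20.0f);  // assumed full-pressure reading
    }

    // Piecewise-linear interpolation between the nearest calibration points.
    public float toNewtons(float pressure) {
        Float low = calibration.floorKey(pressure);
        Float high = calibration.ceilingKey(pressure);
        if (low == null) return calibration.firstEntry().getValue();
        if (high == null) return calibration.lastEntry().getValue();
        if (low.equals(high)) return calibration.get(low);
        float t = (pressure - low) / (high - low);
        return calibration.get(low) + t * (calibration.get(high) - calibration.get(low));
    }
}

You would call toNewtons(event.getPressure()) from your touch handler; with only one or two calibration points the output is obviously only a rough estimate.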

getPressure returns a value from 0 to 1 because the way pressure is calculated is device-dependent. Some devices derive the value from how much of the area of your finger is touching the screen. Because of that, it is probably not possible to convert it to newtons in a way that works across multiple Android devices unless you write a solution for each one.

Related

Android Real Scale Application

I have a nice idea for an Android application: I want to make a real scale, not like the others, which are fake. I have been thinking about how to do it but don't have any idea.
EDIT: What I mean by a real scale: for example, I want to calculate the weight of a coin. I put the coin on the screen and the app calculates its weight, and if possible the scale should handle weights up to 50 grams.
I hope it's clear now.
Actually it is possible, just probably not precise enough. Most touch controllers can report touch pressure, and it is available via the MotionEvent.getPressure() call. So with some luck and tedious calibration you can measure the weight of something. It is going to work better with a cheaper resistive screen.
There is nothing in the Android SDK that supports measuring the weight of a coin.
You can get the pressure (0 to 1). Calibrate it with a coin of known weight (say 2.5 g). If that coin gives you a pressure value of 0.2, then you can estimate weight with the formula p * (2.5 / 0.2), where p is the value read from getPressure().
I don't know if you can scale it...
Good Luck.
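For what it's worth, a minimal Java sketch of that single-point calibration might look like this; the constructor arguments are values you would measure yourself, and the proportionality assumption is exactly the shaky part discussed above.

public class TouchScale {
    private final float referenceGrams;    // known mass of the calibration object
    private final float referencePressure; // getPressure() reading observed for it

    public TouchScale(float referenceGrams, float referencePressure) {
        this.referenceGrams = referenceGrams;
        this.referencePressure = referencePressure;
    }

    // Assumes pressure is roughly proportional to weight, which is a big assumption.
    public float estimateGrams(float pressure) {
        return pressure * (referenceGrams / referencePressure);
    }
}

// Example: new TouchScale(2.5f, 0.2f).estimateGrams(event.getPressure());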

Android Accelerometer Profiling

I have written a simple Activity which is a SensorEventListener for Sensor.TYPE_ACCELEROMETER.
In my onSensorChanged(SensorEvent event) I just take the X, Y, Z values and write them to a file.
Appended to each X, Y, Z sample is a label specific to the activity I am performing, so each row is X, Y, Z, label.
This is how I obtain my activity profile. I would like suggestions on what operations to perform after data collection to remove noise and get the best data for an activity.
The main intent of this data collection is to build a user activity detection application using a neural network library (Neuroph for Android).
Just for fun I wrote a pedometer a few weeks ago, and it would have been able to detect the three activities that you mentioned. I'd make the following observations:

In addition to Sensor.TYPE_ACCELEROMETER, Android also has Sensor.TYPE_GRAVITY and Sensor.TYPE_LINEAR_ACCELERATION. If you log the values of all three, you notice that the values of TYPE_ACCELEROMETER are always equal to the sum of the values of TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION. The onSensorChanged(...) method first gives you TYPE_ACCELEROMETER, followed by TYPE_GRAVITY and TYPE_LINEAR_ACCELERATION, which are the results of its internal methodology for splitting the accelerometer readings into gravity and the acceleration that is not due to gravity. Given that you're interested in the acceleration due to activities rather than the acceleration due to gravity, you may find TYPE_LINEAR_ACCELERATION is better for what you need.

Whatever sensors you use, the X, Y, Z that you're measuring will depend on the orientation of the device. However, for detecting the activities that you mention, the result can't depend on e.g. whether the user is holding the device in a portrait or landscape position, or whether the device is flat or vertical, so the individual values of X, Y and Z won't be any use. Instead you'll have to look at the length of the vector, i.e. sqrt(X*X + Y*Y + Z*Z), which is independent of the device orientation.

You only need to smooth the data if you're feeding it into something which is sensitive to noise. Instead, I'd say that the data is the data, and you'll get the best results if you use mechanisms which aren't sensitive to noise and hence don't need the data to be smoothed. By definition, smoothing is discarding data. You want to design an algorithm that takes noisy data in at one end and outputs the current activity at the other end, so don't prejudge whether it's necessary to include smoothing as part of that algorithm.

Here is a graph of sqrt(X*X + Y*Y + Z*Z) from Sensor.TYPE_ACCELEROMETER which I recorded when I was building my pedometer. The graph shows the readings measured when I walked for 100 steps. The green line is sqrt(X*X + Y*Y + Z*Z), the blue line is an exponentially weighted moving average of the green line which gives me the average level of the green line, and the red line shows my algorithm counting steps. I was able to count the steps just by looking for the maxima and minima and when the green line crosses the blue line. I didn't use any smoothing or Fast Fourier Transforms. In my experience, for this sort of thing the simplest algorithms often work best, because although complex ones might work in some situations it's harder to predict how they'll behave in all situations. And robustness is a vital characteristic of any algorithm :-).
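To make that description concrete, here is a compressed sketch of the same idea: take the vector magnitude from TYPE_ACCELEROMETER, keep an exponentially weighted moving average of it, and count upward crossings of that average as steps. The smoothing factor ALPHA is an assumption and would need tuning per device; listener registration is omitted.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class StepCounter implements SensorEventListener {
    private static final float ALPHA = 0.1f; // EWMA smoothing factor (assumed value)
    private float average = 9.81f;           // start near 1 g
    private boolean wasBelow = true;
    private int steps = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        float magnitude = (float) Math.sqrt(x * x + y * y + z * z); // orientation-independent ("green line")
        average = ALPHA * magnitude + (1 - ALPHA) * average;        // moving average ("blue line")
        if (wasBelow && magnitude > average) {                      // upward crossing of the average
            steps++;
            wasBelow = false;
        } else if (magnitude < average) {
            wasBelow = true;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}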
This sounds like an interesting problem!
Have you plotted your data against time to get a feel for it, to see what kind of noise you are dealing with, and to help decide how you might pre-process your data for input to the detector?
[ASCII sketch: acceleration A on the vertical axis plotted against time on the horizontal axis]
I'd start with lines for each activity:
|Ax + Ay + Az|
|Vx + Vy + Vz| (approximate by calculating the area of trapezoids formed by your data points; see the sketch after this list)
... etc
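As a sketch of the trapezoid idea mentioned above (under the assumption that timestamps are in nanoseconds, as SensorEvent.timestamp is), integrating a sampled signal could look like this:

public class TrapezoidIntegrator {
    private double integral = 0.0;
    private double lastValue = Double.NaN;
    private long lastTimestampNs = 0;

    // Add one sample; returns the running integral so far
    // (e.g. feeding in |A| gives an approximate change in speed).
    public double add(double value, long timestampNs) {
        if (!Double.isNaN(lastValue)) {
            double dt = (timestampNs - lastTimestampNs) * 1e-9; // ns -> s
            integral += 0.5 * (value + lastValue) * dt;         // area of one trapezoid
        }
        lastValue = value;
        lastTimestampNs = timestampNs;
        return integral;
    }
}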
Maybe you can work out the orientation of the phone by attempting to detect gravity, then rotate your vectors to a 'standard' orientation (e.g. positive Z axis = up). If you can do that, then the different axes may become more meaningful. For example, walking (in pocket) would tend to have a velocity on the horizontal plane, which might be distinguished from walking (in hand) by motion in the vertical plane.
As for filters, if the data appears noisy, a simple starting point is to apply a moving average to smooth it. This is a common technique for sensor data in general:
https://en.wikipedia.org/wiki/Moving_average
Also, this post seems relevant to your question:
How to remove Gravity factor from Accelerometer readings in Android 3-axis accelerometer
Things identified by me:
The data has to be preprocessed according to your needs; in my case I just want three inputs and one output.
The data has to be subjected to smoothing (five-point smoothing, or any other technique that suits you best) so that noise gets filtered out (not completely, though). A moving average is one such technique; see the sketch after this list.
Linearized (uniformly resampled) data would be good, because you don't have any idea how the data was sampled; use interpolation to help you linearize the data.
Finally, use the FFT (Fast Fourier Transform) to extract the recipe out of the dish, that is, to extract features from your dataset!
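A rough sketch of two of those preprocessing steps (uniform resampling via linear interpolation, then a five-point moving average), assuming timestamps are sorted and strictly increasing:

public final class Preprocess {

    // Linearly interpolate (times, values) onto a uniform grid with the given step
    // (same time unit as the timestamps). Assumes at least two samples.
    public static float[] resample(long[] times, float[] values, long step) {
        int n = (int) ((times[times.length - 1] - times[0]) / step) + 1;
        float[] out = new float[n];
        int j = 0;
        for (int i = 0; i < n; i++) {
            long t = times[0] + i * step;
            while (j < times.length - 2 && times[j + 1] < t) j++;
            float frac = (float) (t - times[j]) / (times[j + 1] - times[j]);
            out[i] = values[j] + frac * (values[j + 1] - values[j]);
        }
        return out;
    }

    // Five-point moving average; the two samples at each edge are left unsmoothed.
    public static float[] smooth5(float[] in) {
        float[] out = in.clone();
        for (int i = 2; i < in.length - 2; i++) {
            out[i] = (in[i - 2] + in[i - 1] + in[i] + in[i + 1] + in[i + 2]) / 5f;
        }
        return out;
    }
}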

Vertical movement sensor

I am working on an android app that requires the detection of vertical motion. When moving the tablet upward, the Gyroscope, Accelerometer, and Linear Acceleration sensors give a corresponding value indicating upward or downward motion.
The problem I have is that these sensors will also read an upward/downward motion when you tilt the tablet towards the user or away from the user. For example, the x value in the gyroscope represents the vertical plane. But when you tilt the device forwards, the x value will change.
When I make this motion, the same sensor that reads vertical motion reads a value for this.
The same goes for the rest of the sensors. I have tried to use orientation coupled with the gyro to make the conditional statement: if the pitch is not changing but the x variable is going up/down, then we have vertical motion. The problem with this is that if the user moves the tablet up while it is tilted slightly, it no longer works. I also tried making it so that if there is a change in tilt, then there is no vertical motion. But it iterates so quickly that there may be a change in tilt for 1/100 of a second, but not for the next.
Is there any way I can read only vertical changes and not changes in the device's pitch?
Here is what I want to detect:
EDIT:
"Please come up with a mathematically sound definition of what you consider 'moving upwards.'"
This was my initial question: how can I write a function that defines when the tablet is moving upwards or downwards? I consider a vertical translation to be moving upwards. Now how do I detect this? I simply do not know where to begin. Thank you.
Ok, even though this question is fairly old, I see a lot of confusion in the present answer and comments, so in case anyone finds this, I intend to clear a few things up.
The Gyroscope
First of all, the gyroscope does not measure vertical motion as per your definition (a translatory motion). It measures rotation around each of the axes of the device coordinate system (x to the right, y toward the top of the screen, z out of the screen when the device is in its default orientation). Thus tilting your device forwards and backwards indeed rotates it around the x axis, and therefore you will see non-zero values in the x value of your gyroscope sensor.
the x value in the gyroscope represents the vertical plane.
I'm not sure what is meant by "the vertical plane", however the x value certainly does not represent the plane itself nor the orientation of the device within the plane.
The x value of the gyroscope sensor represents the current angular velocity of the device around the x axis (i.e. the rate of change of the rotation angle).
But when you tilt the device forwards, the x value will change. When I make this motion, the same sensor that reads vertical motion reads a value for this.
Not quite sure what you're referring to here. "The same sensor that reads vertical motion" I assume is the gyroscope, but as previously said, it does not read vertical motion. It does exactly what it says on the tin.
The device coordinate system
This is more in response to user Ali's answer than the original question, but it remains relevant in either case.
The individual outputs of the linear acceleration sensor (or any other sensor, for that matter) are expressed in the device coordinate system described above. This means that if you rotate the device slightly, the outputs will no longer be parallel to any world axis they coincided with before. As such, you will either have to enforce that the device is in a particular orientation for your application, or take the new orientation into account.
The ROTATION_VECTOR sensor, combined with quaternion math or the getRotationMatrixFromVector() method, is one way to translate your measurements from device coordinates to world coordinates. There are other ways to achieve the same goal, but once achieved, the way you hold your device won't matter for measuring vertical motion.
In either case, the axis you're looking for is the y axis, not the z axis.
(If by any chance you meant "along device y axis" as "vertical", then just ignore all the orientation stuff and just use the linear acceleration sensor)
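For illustration, one way the transformation described above could look in code, combining TYPE_ROTATION_VECTOR with TYPE_LINEAR_ACCELERATION (listener registration is omitted; this only shows the math once both events are available):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorManager;

public class WorldFrameAcceleration {
    private final float[] rotationMatrix = new float[9];
    private final float[] worldAccel = new float[3];

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
            float[] a = event.values; // acceleration in device coordinates
            // world = R * device; R is a row-major 3x3 matrix
            for (int i = 0; i < 3; i++) {
                worldAccel[i] = rotationMatrix[3 * i] * a[0]
                        + rotationMatrix[3 * i + 1] * a[1]
                        + rotationMatrix[3 * i + 2] * a[2];
            }
            // worldAccel is now expressed in world coordinates
            // (per the Android docs, the world Z axis points toward the sky).
        }
    }
}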
Noise
You mentioned some problems regarding noise and update rates in the question, so I'll just mention it here. The simplest and one of the more common ways to get nice, consistent data from something that varies very often is to use a low-pass filter. What type of filter is best depends on the application, but I find that an exponential moving average filter is viable in most cases.
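A minimal per-axis version of such a filter (the value of ALPHA is an assumption and needs tuning) might look like:

public class LowPassFilter {
    private static final float ALPHA = 0.2f; // smaller = smoother but more lag (assumed value)
    private final float[] filtered = new float[3];

    // Feed each incoming sensor sample through this; returns the smoothed values.
    public float[] apply(float[] input) {
        for (int i = 0; i < 3; i++) {
            filtered[i] += ALPHA * (input[i] - filtered[i]);
        }
        return filtered;
    }
}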
Finishing thoughts
Note that if you take proper care of the orientation, your transformed linear acceleration output will be a good approximation of vertical motion (well, change in motion) without filtering any noise.
Also, if you want to measure vertical "motion", as in velocity, you need to integrate the accelerometer output. For various reasons, this doesn't really turn out too well in most cases, although it is less severe in the case of velocity rather than trying to measure position.
OK, I suspect it is only a partial answer.
If you want to detect vertical movement, you only need linear acceleration, the device orientation doesn't matter. See
iOS - How to tell if device is raised/dropped (CoreMotion)
or
how to calculate phone's movement in the vertical direction from rest?
For some reason you are concerned with the device orientation as well, and I have no idea why. I suspect that you want to detect something else. So please tell us more and then I will improve my answer.
UPDATE
I read the post on coremotion, and you mentioned that higher z lower x and y means vertical motion, can you elaborate?
I will write in pseudo code. You measured the (x, y, z) linear acceleration vector. Compute
rel_z = z/sqrt(x^2+y^2+z^2+1.0e-6)
If rel_z > 0.9 then the acceleration towards the z direction dominates (vertical motion). Note that the constant 0.9 is arbitrary and may require tweaking (should be a positive number less than 1). The 1.0e-6 is there to avoid accidental division by zero.
You may have to add another constraint that z is sufficiently large. I don't know your device, whether it measures gravity as 1 or 9.81. I assume it measures it as 1.
So all in all:
if (rel_z > 0.9 && abs(z) > 0.1) { // we have vertical movement
Again, the constant 0.1 is arbitrary and may require tweaking. It should be positive.
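A direct Java version of that pseudo code, fed from TYPE_LINEAR_ACCELERATION, might look like the sketch below. Since that sensor reports m/s^2 (gravity as roughly 9.81, not 1), the second threshold is scaled accordingly; both constants remain arbitrary and need tweaking.

import android.hardware.Sensor;
import android.hardware.SensorEvent;

public class VerticalMotionDetector {
    public boolean isVerticalMotion(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return false;
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double relZ = z / Math.sqrt(x * x + y * y + z * z + 1.0e-6);
        // 0.9 and 0.1 are the arbitrary thresholds from the answer; 0.1 is scaled to m/s^2.
        return relZ > 0.9 && Math.abs(z) > 0.1 * 9.81;
    }
}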
UPDATE 2
I do not want this because rotating it towards me is not moving it upwards
It is moving upwards: The center of mass is moving upwards. My code has the correct behavior.
Please come up with a mathematically sound definition of what you consider "moving upwards."

android find pressure on screen

I would like to roughly estimate the amount of pressure a finger applies to the capacitive screen on Android. My idea is to get the area covered by the finger when it touches the screen (maybe with some extra parameters to make it more accurate, but that's the main idea).
So, is there any way to find the area covered (for example, to get the number of pixels covered)?
There is only MotionEvent.getPressure() (which you probably already found). I doubt that there is anything that reports how many pixels are covered by the finger.
I do not really know, but you have access to the following method:
MotionEvent event;
float press = event.getPressure();
press will be between 0 and 1, from 0 = no pressure to 1 = normal pressure; however, it can be more than 1...
Your approach sounds like reinventing the wheel (NIH): why not use something that already exists? Or maybe it doesn't cover your needs!
You can use MotionEvent.getSize() to get the normalized value (from 0 to 1) of the area of screen being pressed. This value is correlated with the number of pixels pressed.
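As a small illustration of both calls, a custom view could log the two normalized values like this (both are device-dependent, so treat them as relative indicators rather than physical units):

import android.content.Context;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

public class TouchAreaView extends View {
    public TouchAreaView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        float pressure = event.getPressure(); // roughly 0..1, can exceed 1 on some devices
        float size = event.getSize();         // normalized approximation of the touch area
        Log.d("Touch", "pressure=" + pressure + " size=" + size);
        return true;
    }
}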

how to calculate phone's movement in the vertical direction from rest?

I am developing an app for Android for which I need to know how to calculate the movement of the device in the vertical direction.
For example, the device is at rest (point A), and the user picks it up in his hand (point B); now there is a height change between point A and point B. How would I calculate that?
I have already gone through the articles about sensors and accelerometers, but I couldn't really find anything to help me with this. Does anyone have any ideas?
If you integrate the acceleration twice you get position but the error is horrible. It is useless in practice. Here is an explanation why (Google Tech Talk) at 23:20. I highly recommend this video.
Now, you do not need anything accurate and that is a different story. The linear acceleration is available after sensor fusion, as described in the video. See Sensor.TYPE_LINEAR_ACCELERATION at SensorEvent. I would first try a high-pass filter to detect sudden increase in the linear acceleration along the vertical axis.
I have no idea whether it is good for your application.
You can actually establish (only) the vertical position without measuring acceleration over time. This is accomplished by measuring the angle between the direction to the center of the earth, and the direction to the magnetic north pole.
This only changes (significantly) when the altitude (height) of the phone changes. What you do is use the accelerometer and magnetometer to get two float[3] arrays, treat these as vectors, make them unit vectors, and then the angle between any two unit vectors is arccos(A·M).
Note that that is the dot product, i.e. Math.acos(A[0]*M[0] + A[1]*M[1] + A[2]*M[2]). Any change in this angle corresponds to a change in height. Also note that this will have to be calibrated to real units, and the ratio of change in angle to height will be different at various longitudes. But this is a method of getting an absolute value for height, though of course the angle also becomes skewed when undergoing acceleration, or when there are nearby magnets :)
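A small sketch of that angle computation (normalization is folded into the division by the vector norms; calibrating the angle to an actual height is left open, as in the answer):

public final class TiltAngle {
    // Angle in radians between the accelerometer and magnetometer vectors (float[3] each).
    public static double angleBetween(float[] accel, float[] magnetic) {
        double dot = 0, normA = 0, normM = 0;
        for (int i = 0; i < 3; i++) {
            dot += accel[i] * magnetic[i];
            normA += accel[i] * accel[i];
            normM += magnetic[i] * magnetic[i];
        }
        double cos = dot / (Math.sqrt(normA) * Math.sqrt(normM) + 1e-9);
        return Math.acos(Math.max(-1.0, Math.min(1.0, cos))); // clamp for numerical safety
    }
}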
You can correlate it with the magnetic field sensor, whose readings are in microtesla.
You can use distance = the double integral of acceleration over time (approximated in practice by a running summation of the samples) = the integral of speed plus a constant.
