I'm new to Android and was playing around with the accelerometer to get data. I am able to get the acceleration values across all 3 axes using the SensorEvent.
Now I am trying to create a small example where I track the motion of the phone (in its own coordinate system). Suppose I touch a point on the screen; I want that to be the (0,0,0) of my system.
Now if I gently move the phone 5 cm to the left, I want the reading to say my position is (-5,0,0). The same goes for the y and z axes.
I know that the accelerometer gives only acceleration, but can I use it to determine the position of the phone w.r.t. the original reference point?
Any pointers would be much appreciated.
I am creating an app that will measure the acceleration of a vehicle on each axis using the accelerometer in an Android smartphone. I need to somehow rotate the phone's measurement coordinate system into the vehicle's coordinate system, so that the driver can put the phone in a holder, where it will sit at some arbitrary rotation relative to the vehicle.
I started with a calibration process where I tell the user to hold the phone so that it matches the vehicle's coordinate system, and I save the TYPE_GRAVITY sensor's X, Y, and Z values. Then I tell the user to put the phone in the holder and again save the TYPE_GRAVITY sensor's X, Y, and Z values.
Now I need to find some relation between those two vectors so I can use it to correct (rotate) the TYPE_LINEAR_ACCELERATION X, Y, Z data to match the vehicle's coordinate system.
Using just the gravity sensor, you'll lack one rotational axis needed to compute what you want.
Imagine your phone held vertically: the accelerometer shows you the direction toward the bottom of the phone. If you now rotate it around the vertical axis, you'll still get the same reading. This means you can't recover rotation around the vertical axis this way.
One solution would be to use a gyroscope. This gives you the full rotation of the phone, but it raises the hardware requirements of your app.
But the better solution, IMHO, would be to get rid of the calibration process entirely. Cars mostly move straight ahead and only occasionally accelerate sideways, so you could scan your readings for a few seconds and identify a 'main' (forward/braking) axis and a 'side' one, as sketched below.
Besides, your calibration process still depends on how precisely the user places the phone, so it may not work as well as you expect.
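A minimal sketch of that axis-finding idea (the helper class and the 2-D simplification are my own assumptions, not a standard API): collect a few seconds of horizontal linear-acceleration samples and take the principal axis of their covariance as the 'main' axis; the perpendicular horizontal direction is then the 'side' axis.

```java
import java.util.List;

// Hypothetical sketch: estimate the vehicle's "main" axis from a window of
// horizontal linear-acceleration samples (gravity already removed) by taking
// the principal eigenvector of their 2x2 covariance matrix.
final class MainAxisEstimator {
    /** Returns the angle (radians) of the dominant acceleration axis in the x/y plane. */
    static double estimateMainAxis(List<float[]> samplesXY) {
        double sxx = 0, sxy = 0, syy = 0;
        for (float[] s : samplesXY) {
            sxx += s[0] * s[0];
            sxy += s[0] * s[1];
            syy += s[1] * s[1];
        }
        // Closed form for the principal axis of a symmetric 2x2 matrix.
        return 0.5 * Math.atan2(2 * sxy, sxx - syy);
    }
}
```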
Say I have a character walking around in 2D space. Let's say my phone is flat on a table in landscape orientation.
If I tilt the phone away from me, the character should start moving up. If I tilt it towards me she should start moving down. Same goes for right and left.
The reason I ask here is because I found Google's explanation rather confusing.
http://developer.android.com/reference/android/hardware/SensorManager.html#getRotationMatrix%28float%5b%5d,%20float%5b%5d,%20float%5b%5d,%20float%5b%5d%29
This link implies that x and y are relative to compass coordinates. I can't imagine that's how the accelerometer works; I just want to measure tilt relative to the phone's own axes.
For example, if the phone tilts away from me, I feel it should be easy to say "the phone is tilted ___ radians positively about the y axis." Then I should just be able to use trig to calculate the acceleration of my character.
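Something like this sketch is what I imagine (my own helper; I haven't verified the sign conventions):

```java
// Rough sketch: with the phone nearly static, the accelerometer reading is
// approximately the gravity reaction vector, so tilt can be recovered with atan2.
final class Tilt {
    /** Angle (radians) of the device x axis above the horizontal plane. */
    static double xTilt(float ax, float ay, float az) {
        return Math.atan2(ax, Math.sqrt(ay * ay + az * az));
    }

    /** Angle (radians) of the device y axis above the horizontal plane. */
    static double yTilt(float ax, float ay, float az) {
        return Math.atan2(ay, Math.sqrt(ax * ax + az * az));
    }
}
```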
I guess my real question is: how do I read the accelerometer and determine the angle at which the phone is tilted about a given axis? This image details how I currently think the axes are laid out on the phone.
I'm sure this has been asked before, so a link to a good source of question solving is awesome as well.
So, right now I'm grabbing the accelerometer data and converting it to a fairly rough estimate of the angle at which the phone is being held. For now I'm focused on just the yaw axis.
My area of interest is between 0 and 45 degrees on the yaw axis, so I made a bounded queue of the past 5 to 10 readings and compared the numbers to determine whether the angle is going up or down. That kind of works, but it is slow and not as precise or reliable as I'd like.
Is there a way to determine which direction the phone is rotating using just the accelerometer (and the magnetic field sensor, I guess) without keeping a history of past readings? I'm really new to sensor manipulation and to Android in general. Any help understanding this would be great.
It's not clear exactly what you're looking for here: position or velocity. Generally speaking, you don't want to obtain a position measurement by integrating the accelerometer data; there's a lot of error associated with that calculation.
If you literally want the "direction your phone is rotating," rather than angular position, you can actually get that directly from the gyroscope sensor, which provides rotational velocities. That would let you get the direction it's rotating from the velocity without storing data. You should be aware that not every phone has a gyroscope sensor, but it does seem like the newer ones do.
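A minimal sketch of reading that direction, assuming only the sign of the rate matters (the listener class name is mine):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// TYPE_GYROSCOPE reports angular velocity (rad/s) around each device axis,
// so the sign of a single reading already gives the rotation direction.
public class SpinDirectionListener implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
            float yawRate = event.values[2]; // rotation around the device z axis
            boolean counterClockwise = yawRate > 0; // right-hand rule
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}
```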
If you want the absolute orientation of the phone (its angular position), you can use the rotation vector sensor (TYPE_ROTATION_VECTOR). This is a fused sensor that automatically combines data from several of the physical sensors in one go and provides additional accuracy; from it, you can get roll-pitch-yaw with a single measurement. Basically, you first get your data from the rotation vector sensor, then pass it to getRotationMatrixFromVector. You can feed the output of that into getOrientation (see the same page as the previous link), which will spit out roll-pitch-yaw measurements for you. You might need to rotate the axes around a bit to get the angles measured positive in the direction you want.
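That pipeline looks roughly like this (the helper class name is mine; the SensorManager calls are the ones linked above):

```java
import android.hardware.SensorManager;

// Rotation vector -> rotation matrix -> azimuth/pitch/roll.
// Pass in event.values from a Sensor.TYPE_ROTATION_VECTOR event.
final class Orientation {
    /** Returns {azimuth, pitch, roll} in radians. */
    static float[] fromRotationVector(float[] rotationVector) {
        float[] rotationMatrix = new float[9];
        SensorManager.getRotationMatrixFromVector(rotationMatrix, rotationVector);
        float[] angles = new float[3];
        SensorManager.getOrientation(rotationMatrix, angles);
        return angles;
    }
}
```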
I'm trying to write a small Android game where the phone is placed on a table.
On the screen there is a ball, whose movement the user controls by moving the phone.
Throughout the game the user won't lift the phone from the table.
At the beginning, the ball is placed in the middle of the screen. Pushing the phone away from the user should move the ball toward the top of the screen, and from the ball's current position, moving the phone back toward the user and to the right should move the ball accordingly.
I read the Android Motion Sensors Guide carefully, but I couldn't even work out which sensor (or sensors) I should use.
I would love to get any directions.
First of all, TYPE_LINEAR_ACCELERATION, TYPE_ROTATION_VECTOR, and TYPE_GRAVITY are not physical sensors; they are produced by sensor fusion.
Secondly, from Android 4+ these fused sensors make use of the device's gyroscope, so they WON'T work if the device doesn't have a gyroscope.
So if you want to make a generic app for all phones, prefer using only the accelerometer (TYPE_ACCELEROMETER).
Now, in your case, since the user won't lift the phone from the table, you can easily subtract the gravity component from the accelerometer readings. See http://developer.android.com/reference/android/hardware/SensorEvent.html under the section Sensor.TYPE_ACCELEROMETER (code is given there too).
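That docs snippet is essentially a low-pass filter; from memory it boils down to this (check the linked page for the exact version):

```java
// Inside your SensorEventListener: isolate gravity with a low-pass filter,
// then subtract it from the raw values to get linear acceleration.
private final float[] gravity = new float[3];
private final float[] linearAcceleration = new float[3];

public void onSensorChanged(SensorEvent event) {
    final float alpha = 0.8f; // smoothing factor, roughly t / (t + dT)
    gravity[0] = alpha * gravity[0] + (1 - alpha) * event.values[0];
    gravity[1] = alpha * gravity[1] + (1 - alpha) * event.values[1];
    gravity[2] = alpha * gravity[2] + (1 - alpha) * event.values[2];
    linearAcceleration[0] = event.values[0] - gravity[0];
    linearAcceleration[1] = event.values[1] - gravity[1];
    linearAcceleration[2] = event.values[2] - gravity[2];
}
```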
Then see How can I find distance traveled with a gyroscope and accelerometer? for finding the linear displacement; the first answer there states there is NO need for the gyroscope. (Or you can just google for finding displacement/linear velocity from accelerometer readings.)
Hope this gives you a good idea of where to start.
It's really difficult to do this kind of linear position sensing with the sensors that smartphones have. Acceleration is the second derivative of position with respect to time, so in theory you could integrate the accelerometer data twice in time to obtain absolute position. In practice, noise makes that calculation inaccurate.
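To make the failure mode concrete, here is the naive double integration the paragraph describes (my own helper, not a library API). Even a small constant bias b in the acceleration grows as ½·b·t² in position, which is why the estimate drifts within seconds:

```java
// Naive dead reckoning: integrate acceleration twice to get position.
// Noise and bias in the accelerometer make 'position' drift quadratically.
final class DeadReckoning1D {
    private double velocity; // m/s
    private double position; // m

    /** Integrate one sample: accel in m/s^2, dt in seconds. */
    void step(double accel, double dt) {
        velocity += accel * dt;    // first integration:  a -> v
        position += velocity * dt; // second integration: v -> x
    }

    double position() { return position; }
}
```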
On the other hand, if you're really talking about putting the camera face down on the table, maybe you can come up with some clever way of using OpenCV's optical flow features to detect motion with the camera, almost like an optical mouse. I don't know how you would light the surface (the flash would probably burn through the battery), but it might be possible to work something out.
My question is similar to Changing sensor coordinate system in android
I want to be able to compare a user's movements with each other regardless of device orientation. So that when the users holds out the phone in portrait orientation and bends his arm, acceleration readings are the same as when he holds out his phone in landscape and then bends his arm in the same direction.
This is what I call the "user" coordinate system. It is different from the world coordinate system, since it should not matter which compass direction the user is facing. It is different from device coordinates, since it should not matter how the user holds the device.
It is acceptable in my application to do a calibration step before each movement so the base/resting orientation matrices can be determined. Is it perhaps just a matter of multiplying the rotation matrix of the first resting orientation by the inverse of the second (and then applying that to the new values)?
The answer in the question mentioned seems about right, but I need a more concrete explanation, actual code samples would be ideal.
Note that remapCoordinateSystem won't suffice: it only accepts right angles. I need to handle small deviations, since the device is strapped to a wrist, which might not always be at a right angle to the arm.
I'm currently working on this issue, and I think this might help:
convert device coordinate system to world coordinate system
We can assume that most of the time people use the phone while standing, walking, or sitting, which means the user coordinate system shares its z axis (gravity) with the world coordinate system, and there is a fixed angular difference between the user's y axis (the direction the user's face points) and the world's y axis (north). That angular difference can be obtained via the TYPE_MAGNETIC_FIELD sensor. Thus we can transform from world coordinates to user coordinates.
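A sketch of that transform (the helper names are mine; getRotationMatrix is the real SensorManager call): rotate a device-frame vector into the world frame, then undo the fixed yaw offset between north and the user's facing direction around the shared z axis.

```java
import android.hardware.SensorManager;

// Device frame -> world frame -> user frame.
final class UserFrame {
    /**
     * @param gravity   latest TYPE_GRAVITY (or filtered accelerometer) values
     * @param geomag    latest TYPE_MAGNETIC_FIELD values
     * @param deviceVec device-frame vector, e.g. TYPE_LINEAR_ACCELERATION values
     * @param userYaw   calibrated angle (radians) from north to the user's facing
     * @return the vector expressed in the user's coordinate system, or null
     */
    static float[] toUserFrame(float[] gravity, float[] geomag,
                               float[] deviceVec, double userYaw) {
        float[] r = new float[9];
        if (!SensorManager.getRotationMatrix(r, null, gravity, geomag)) {
            return null; // free fall or unusable magnetic readings
        }
        // Device -> world: w = R * d (R is row-major, rows are world axes).
        float wx = r[0] * deviceVec[0] + r[1] * deviceVec[1] + r[2] * deviceVec[2];
        float wy = r[3] * deviceVec[0] + r[4] * deviceVec[1] + r[5] * deviceVec[2];
        float wz = r[6] * deviceVec[0] + r[7] * deviceVec[1] + r[8] * deviceVec[2];
        // World -> user: rotate about the shared z (gravity) axis by -userYaw.
        double c = Math.cos(-userYaw), s = Math.sin(-userYaw);
        return new float[] {
            (float) (c * wx - s * wy),
            (float) (s * wx + c * wy),
            wz
        };
    }
}
```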
What about the user using the phone while lying in bed? For that kind of case, I think a pre-calibration step is needed to define the y axis of the user coordinate system, e.g. a gesture that tells the phone which direction the user's face points.