I'm trying to write a small Android game where the phone is placed on a table.
On the screen there is a ball, whose movement the user controls by sliding the phone around.
Throughout the game the user never lifts the phone from the table.
At the beginning, the ball is placed in the middle of the screen.
Pushing the phone away from the user should move the ball toward the top of the screen. From the ball's current position, moving the phone back toward the user and to the right should move the ball accordingly.
I read the Android Motion Sensors Guide carefully, but I still couldn't figure out which sensor or sensors I should use.
I would love to get some direction.
First of all, TYPE_LINEAR_ACCELERATION, TYPE_ROTATION_VECTOR, and TYPE_GRAVITY are not physical sensors; they are synthesized through sensor fusion.
Secondly, from Android 4.0 onward these fused sensors make use of the device's gyroscope, so they WON'T work on a device that doesn't have a gyroscope.
So if you want to make a generic app that runs on all phones, prefer using only the accelerometer (TYPE_ACCELEROMETER).
Now for your case: since the user won't lift the phone from the table, you can easily subtract the gravity component from the accelerometer readings. See http://developer.android.com/reference/android/hardware/SensorEvent.html under the section Sensor.TYPE_ACCELEROMETER (sample code is given there too).
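For reference, here is a minimal sketch of that approach, modeled on the low-pass filter shown in the SensorEvent docs (the 0.8 filter constant comes from that example; in a real game you would register this listener for TYPE_ACCELEROMETER and tune the constant):

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    public class BallMotionListener implements SensorEventListener {
        private final float[] gravity = new float[3];
        private final float[] linearAcceleration = new float[3];

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Low-pass filter: isolate the slowly-changing gravity component.
            final float alpha = 0.8f;
            for (int i = 0; i < 3; i++) {
                gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i];
                // What remains is the push the user gives the phone.
                linearAcceleration[i] = event.values[i] - gravity[i];
            }
            // linearAcceleration[0] and [1] can now drive the ball along x and y.
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }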
Now see How can I find distance traveled with a gyroscope and accelerometer? to find the linear displacement; the first answer there states there is NO use for the gyroscope. (Or you can just google for finding displacement/linear velocity from accelerometer readings.)
Hope this gives you a good idea of where to start.
It's really difficult to do this type of linear position sensing using the types of sensors that smartphones have. Acceleration is the second derivative of position with respect to time. So in theory, you could use the accelerometer data, integrating twice in time to achieve the absolute position. In practice, noise makes that calculation inaccurate.
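To make the "integrate twice" idea concrete, here is a deliberately naive one-dimensional sketch (a hypothetical helper, not production code); a constant sensor bias b shows up as b*t^2/2 in the position, which is exactly why this diverges within seconds on real hardware:

    // Naive dead reckoning: integrate acceleration twice to get position.
    // Any bias b in 'accel' grows as b*t^2/2 in 'position', so the estimate
    // diverges quickly on real, noisy sensor data.
    public class DeadReckoning1D {
        private float velocity = 0f; // m/s
        private float position = 0f; // m

        /** accel in m/s^2, dt = seconds since the last sample. */
        public void step(float accel, float dt) {
            velocity += accel * dt;    // first integration
            position += velocity * dt; // second integration
        }

        public float getPosition() { return position; }
    }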
On the other hand, if you're really talking about the phone lying camera-down on the table, maybe you can come up with some clever way of using OpenCV's optical flow features to detect motion through the camera, almost like an optical mouse. I don't know how you would light the surface (the flash would probably burn through the battery), but it might be possible to work something out.
Good day.
I want to make an app that closely follows the device's movement.
For example: if the device moves backward, a Unity 3D object should move backward; if forward, the object moves forward in the 3D environment; and the same for left and right.
Everything I have googled only brought me to the accelerometer, which I assumed I don't need, since the accelerometer detects tilt rather than actual movement. So I wanted to ask: is it even possible to detect the movement of the device in Unity? If yes, what classes should I look at to achieve what I want?
Thanks in advance.
Jow Blow's comment is correct: the accelerometer gives acceleration (as the name suggests). Tilt can be found from the direction of the measured acceleration, but other movements will be seen as well. If you are standing still and start moving forward, your accelerometer value will show it (since you accelerated).
No sensor on your phone will measure whether you are moving forward at constant speed. If you know a bit of physics, it is obvious why the accelerometer can't do it: constant speed means the resultant of forces is null, so there is no force to detect.
If you need to detect constant speed and plan to move over a fairly large distance, however, you could try using the GPS position; but if you need high precision, you won't get it with a standard device.
Forget about easily detecting motion at constant speed over a short distance. A workaround is to detect an acceleration and estimate the direction and speed resulting from it (speed = acceleration * time). This method won't be very precise, since any error adds up, but it could be sufficient; a rough sketch follows.
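Here is what that estimate might look like, written in plain Java against accelerometer-style input (in Unity you would feed it Input.acceleration from C# instead); the damping factor is my own guess at keeping integration errors from piling up, not a calibrated value:

    // Rough velocity estimate: v += a * dt during an acceleration burst.
    // The decay factor nudges the estimate back toward zero so that
    // integration errors don't accumulate forever.
    public class VelocityEstimator {
        private final float[] velocity = new float[3]; // m/s

        /** linearAccel in m/s^2 (gravity removed), dt in seconds. */
        public void step(float[] linearAccel, float dt) {
            for (int i = 0; i < 3; i++) {
                velocity[i] += linearAccel[i] * dt; // speed = acceleration * time
                velocity[i] *= 0.98f;               // crude drift damping (tuning guess)
            }
        }

        public float[] getVelocity() { return velocity; }
    }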
My goal is to have a simple stroke rate detector displayed on my Android watch (Sony Smartwatch); for this I need to detect when the watch changes from moving forwards to moving backwards.
I have code working that gets the event values (x, y, z) in the onSensorChanged event (and displays them on the watch), but I am struggling to make sense of them.
I understand the values report acceleration along the given axis, and I understand that z includes gravity. But if these values report just acceleration, I am not clear how to know when there is a change of direction. I presume a positive number indicates acceleration, zero means constant speed, and a negative number means deceleration; is that correct? And if so, how can I detect when the watch has changed direction from going forwards to going backwards?
Thanks in advance.
Android Wear is no different than "conventional" Android when it comes to detecting motion. Basically, you need to consider exactly what the accelerometers are recording: raw acceleration (with a lot of noise). To determine motion from that, you need to look at trends over time, and probably integrate the smoothed accelerometer data. It's not trivial.
You'll probably want to use the TYPE_LINEAR_ACCELERATION sensor, because it filters out gravity (which isn't relevant to your use case). And because the watch will probably experience some rotation during a rowing stroke (which you want to factor out), you may need to combine TYPE_ROTATION_VECTOR with the acceleration vector to accurately determine direction changes.
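As a naive starting point, here is one way the smoothing/integration idea could look (register it for TYPE_LINEAR_ACCELERATION; the choice of the y axis as the stroke direction and all the constants are assumptions you would need to tune, and watch rotation is ignored entirely):

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    public class StrokeDetector implements SensorEventListener {
        private float smoothed = 0f;   // low-pass filtered acceleration
        private float velocity = 0f;   // integrated velocity estimate
        private long lastTimestamp = 0;
        private int directionChanges = 0;

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (lastTimestamp != 0) {
                float dt = (event.timestamp - lastTimestamp) * 1e-9f; // ns -> s
                // Smooth the noisy raw signal before integrating.
                smoothed = 0.9f * smoothed + 0.1f * event.values[1];
                float previous = velocity;
                velocity += smoothed * dt;
                velocity *= 0.95f; // damp accumulated drift (tuning guess)
                // A sign flip in the velocity estimate marks a direction change.
                if (previous * velocity < 0) {
                    directionChanges++;
                }
            }
            lastTimestamp = event.timestamp;
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }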
A couple of other SO questions which may help point you in the right direction:
how to calculate phone's movement in the vertical direction from rest?
How can I get the direction of movement using an accelerometer?
I've not attempted this specific problem, of course, but I've done enough other work with Android motion sensors to know that you have a good programming challenge ahead of you. Enjoy it!
I'm developing an app that uses Android sensors to help vehicles navigate in an indoor location. As part of my evaluation of different sensors, I wanted to try the rotation vector sensors. For various reasons, magnetic field readings are not very useful in my location, so I wanted to try the Game Rotation Vector sensor (sensor fusion, available from API level 18). The description states that it is identical to the regular Rotation Vector sensor, except that no magnetic field information is used to correct for gyroscope drift around the vertical axis.
When looking for information about the Rotation Vector sensors, I came across an example from Google, where they show the Rotation Vector sensor using a 3d cube. It works pretty well, except for being very sensitive to local magnetic fields (and me being far north, even worse, since the horizontal component is very small here).
Since long-term drift can be compensated by other reference data (map information), I wanted to use the Game Rotation Vector sensor for my app. However, when I changed all references from TYPE_ROTATION_VECTOR to TYPE_GAME_ROTATION_VECTOR in the example code, the cube no longer reacted to rotations around the vertical axis (e.g. me spinning my chair while holding the device in front of me). Tilting the device in the other two directions moved the cube. I also noticed the cube was a lot more "laggy" this time around, reacting very slowly to any movement.
Is this the way the Game Rotation Vector sensor is supposed to work (i.e. ignoring any rotation around the Z axis)? It would kind of make sense, since a gamer playing in the back seat shouldn't be affected by the vehicle turning, but at the same time it differs from the description provided by Google (my first link). From the description I was under the impression that it would drift slowly, not ignore the rotation altogether.
I would be deeply grateful for any input on this issue.
Best Regards,
John
Ok, just in case anyone happens to find this, here are my findings:
The Game Rotation Vector sensor does detect rotation around the vertical axis. It is quite accurate in most situations.
However, it has a couple of issues. First, while lying still it shows accelerating horizontal drift (whereas a gyroscope-based orientation drifts only linearly). For my device, the Game Rotation Vector started out well, but the drift accelerated and finally exceeded 400 degrees over the course of an hour.
Secondly, and even more disturbing, it does not seem to ignore magnetic fields, contrary to the official description (linked in the question). I tried driving around the parking lot with my device fixed to the passenger seat, and the Game Rotation Vector fell far behind (it was more than 180 degrees off after one full rotation over 40 seconds), while integrated gyroscope data was accurate to within a few degrees. It also showed changes in rotation while the gyroscope was hovering around zero, suggesting that it was in fact compensating for a change in (what I presume to be) the magnetic field.
I still don't know why it behaved strangely in the test app I linked to before, but I have since decided to use a complementary filter to combine accelerometer and gyro data instead.
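For anyone curious, the complementary filter has a very simple shape. Here is a minimal single-axis sketch (the axis pairing in atan2 and the ALPHA value are illustrative assumptions that depend on your device conventions and need tuning); note that, as discussed in other answers here, this stabilizes pitch/roll but cannot fix heading drift:

    // Complementary filter for pitch (the same pattern works for roll):
    // trust the gyro for fast changes, and the accelerometer's gravity
    // reading as a slow absolute reference. ALPHA near 1 means "mostly gyro".
    public class ComplementaryFilter {
        private static final float ALPHA = 0.98f;
        private float pitchRad = 0f;

        /**
         * gyroRateX: angular rate around x in rad/s (TYPE_GYROSCOPE);
         * ay, az: accelerometer readings in m/s^2 (TYPE_ACCELEROMETER);
         * dt: seconds since the last update.
         */
        public void update(float gyroRateX, float ay, float az, float dt) {
            // Gravity gives an absolute (but noisy) pitch reference.
            float accelPitch = (float) Math.atan2(ay, az);
            // Integrate the gyro, then pull slowly toward the accel reference.
            pitchRad = ALPHA * (pitchRad + gyroRateX * dt) + (1 - ALPHA) * accelPitch;
        }

        public float getPitchRad() { return pitchRad; }
    }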
I am trying to use the phone as a gun, aiming with the gyroscope. I calibrate with the phone or tablet in a certain orientation; that orientation shoots straight ahead. Then, depending on the direction the phone is turned (left/right/up/down), the gun shoots in that direction.
I am using the gyroscope, and all of this works, except that after shooting for about 30 seconds the gyroscope slowly starts drifting left or right. So when I go back to the orientation I calibrated with, it doesn't shoot straight anymore. Does anyone have any experience writing a complementary or Kalman filter to fuse gyro and accelerometer data for better results in Unity 3D?
I've found this online: http://www.x-io.co.uk/open-source-ahrs-with-x-imu/. It seems to do exactly what I want, but I must be using it wrong: I sometimes get better and sometimes worse results with it. Does anybody have any experience with it?
In the first place, gyro/accelerometer fusion will stabilize your pitch/roll angles, since gravity indicates which direction is down. However, you cannot correct left/right drift this way, because the actual heading is unknown. Proper heading stabilization cannot be achieved with a gyro and accelerometer alone: it requires additional information.
The example you provide (Madgwick's MARG/IMU filter) is a filter that can integrate magnetometer data (a "north" reference), but it has two requirements for good results:
The magnetometer has been properly calibrated.
There are no magnetic field disturbances. This is generally not true if you are indoors, or if you are moving close to power lines or metallic structures.
An alternative is using a video signal to get optical flow information, or detecting when the phone is resting in a fixed position and using those moments to compensate the gyro biases, as sketched below.
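The bias-compensation idea can be sketched roughly like this (the rest threshold and sample count are illustrative guesses; feed it raw TYPE_GYROSCOPE values):

    // Sketch of "compensate gyro bias while at rest": if the gyro magnitude
    // stays tiny for a stretch of samples, take the average reading as the
    // current bias and subtract it from subsequent readings.
    public class GyroBiasTracker {
        private static final float REST_THRESHOLD = 0.02f; // rad/s, a guess
        private static final int REST_SAMPLES = 200;

        private final float[] bias = new float[3];
        private final float[] sum = new float[3];
        private int stillCount = 0;

        public void onGyro(float[] rates) {
            float mag = (float) Math.sqrt(rates[0] * rates[0]
                    + rates[1] * rates[1] + rates[2] * rates[2]);
            if (mag < REST_THRESHOLD) {
                for (int i = 0; i < 3; i++) sum[i] += rates[i];
                if (++stillCount >= REST_SAMPLES) {
                    for (int i = 0; i < 3; i++) {
                        bias[i] = sum[i] / stillCount; // refresh bias estimate
                        sum[i] = 0f;
                    }
                    stillCount = 0;
                }
            } else {
                // Movement detected: restart the rest window.
                stillCount = 0;
                for (int i = 0; i < 3; i++) sum[i] = 0f;
            }
        }

        public float corrected(float[] rates, int axis) {
            return rates[axis] - bias[axis];
        }
    }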
I have a question regarding inertial navigation with a mobile device.
I am using an Android tablet for development, but I think the question applies to all types of hardware (even with better sensors).
The most basic question when developing an inertial system is how to determine the direction of the carrier's movement.
Even if we assume that the magnetometer readings are 100% accurate (which they obviously are not!), there is still the question of the device's orientation relative to the user.
A simple example: suppose the user is walking north but holds the device with its Y axis pointing north-east (see this picture of the device axes: http://developer.android.com/reference/android/hardware/SensorEvent.html).
Then the magnetometer will point towards north-east.
How can we tell which way the user is actually heading?
(The same is true if we use both the magnetometer and the gyro to determine heading.)
A possible solution would be to use the accelerometer's Y-axis and X-axis readings, something along the lines of computing arctan(a_y / a_x).
(For example, if the user holds the device perfectly straight, then the X axis will show nothing...)
But since the accelerometer's readings are not stable, it is not so easy...
Does anyone know of an algorithm that actually works? I am sure this is a well known problem, but I can't seem to find references to solutions...
Thanks in advance!
Ariel
See this answer for an idea: by obtaining the acceleration values relative to the Earth's frame, you can then use the atan2 function to compute the actual direction of motion.
You mention that the user holds the tablet, and I assume fairly steadily (unlike a case I am working on, where the user moves the phone constantly). Still, the user may change the orientation of the device for some reason, and this may influence the readings you obtain.
Thus, in the event of an orientation change, you should call remapCoordinateSystem() accordingly to fix the readings you obtain.
NOTE: You can also use the getOrientation() method; its first output field is the heading (azimuth).
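Putting those pieces together, the standard recipe looks roughly like this (a sketch: accelValues and magnetValues stand for your latest TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings, and the remap axes shown are just one example; pick the ones matching how your user holds the device):

    import android.hardware.SensorManager;

    public class HeadingHelper {
        /**
         * accelValues / magnetValues: latest readings from TYPE_ACCELEROMETER
         * and TYPE_MAGNETIC_FIELD. Returns the heading in degrees, or NaN
         * if the rotation matrix could not be computed.
         */
        public static float headingDegrees(float[] accelValues, float[] magnetValues) {
            float[] rotation = new float[9];
            float[] remapped = new float[9];
            float[] orientation = new float[3];
            if (!SensorManager.getRotationMatrix(rotation, null,
                    accelValues, magnetValues)) {
                return Float.NaN;
            }
            // Example remap for a device held upright in front of the user.
            SensorManager.remapCoordinateSystem(rotation,
                    SensorManager.AXIS_X, SensorManager.AXIS_Z, remapped);
            SensorManager.getOrientation(remapped, orientation);
            return (float) Math.toDegrees(orientation[0]); // orientation[0] = azimuth
        }
    }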
The really right answer is to leave the device sitting there for a while, detect the rotation of the earth, and then compute true north from that. Unfortunately, the gyros in a mobile phone aren't accurate enough for that....