I am trying to make a game in which the rate a game event occurs (such as climbing, rowing, or spinning a wheel) is proportional to the rate at which the user rotates/spins an Android device, using either world coordinates or device coordinates. I would like to use the accelerometer and gyroscope (or any sensor present on current-generation smartphones).
I have looked at calculating distances using accelerometer and Indoor Positioning System based on Gyroscope and Accelerometer. Apparently accurate position/velocity detection is impossible using only these two sensors due to noise.
So my question is: is it still possible to measure the rate of spinning or rotation of a current-generation smartphone? If so, how, and which types of rotation can be measured?
I do not care about the exact displacement or velocity of the device, just some fairly accurate measurement of the rate of some repetitive motion to drive game events.
Related
I'm trying to figure out in what direction (roughly) a phone has moved over a period of a couple of seconds. Is it possible?
I'm using the gyroscope to track what direction the phone is facing. This seems to be working fine as a couple of tests confirm that the application can track the phone's orientation relatively accurately.
What I want to do..
I get the accelerometer reading (x, y, z) and the timestamp.
Using the equation v = v0 + a*t
I can calculate the change in velocity of the device. Then, using the phone's orientation (calculated from the gyroscope), I can rotate those vectors to match the initial orientation of the device. Then, using the velocity, I can estimate the direction of motion.
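For reference, here is a minimal sketch of that integration idea. It substitutes TYPE_LINEAR_ACCELERATION (gravity already removed) and TYPE_ROTATION_VECTOR for the hand-rolled gyroscope orientation, so it is an approximation of the approach rather than the exact pipeline; class and variable names are illustrative.

```java
// Sketch: rotate each linear-acceleration sample into the world frame using the
// rotation vector, then integrate v = v0 + a*t per sample. Register this listener
// for both TYPE_ROTATION_VECTOR and TYPE_LINEAR_ACCELERATION.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class VelocityEstimator implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    private final float[] velocity = new float[3];   // world-frame velocity (m/s)
    private long lastTimestampNs = 0;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            // (on a few devices you may need to truncate event.values to 4 elements first)
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        } else if (event.sensor.getType() == Sensor.TYPE_LINEAR_ACCELERATION) {
            if (lastTimestampNs != 0) {
                float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // ns -> s
                float ax = event.values[0], ay = event.values[1], az = event.values[2];
                // World-frame acceleration = R * device-frame acceleration.
                float wx = rotationMatrix[0] * ax + rotationMatrix[1] * ay + rotationMatrix[2] * az;
                float wy = rotationMatrix[3] * ax + rotationMatrix[4] * ay + rotationMatrix[5] * az;
                float wz = rotationMatrix[6] * ax + rotationMatrix[7] * ay + rotationMatrix[8] * az;
                velocity[0] += wx * dt;               // v = v0 + a*t, applied per sample
                velocity[1] += wy * dt;
                velocity[2] += wz * dt;
            }
            lastTimestampNs = event.timestamp;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Even with this, the integrated velocity drifts within seconds because of sensor noise.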
However, the problem I'm facing is with the accelerometer, as I'm unsure about the data I'm receiving from it. I want to know whether this 'raw data' is modified in any way before it reaches the application level: is it the average acceleration over a short period of time, or is it the instantaneous acceleration?
Documentation:
https://source.android.com/devices/sensors/sensor-stack
https://developer.android.com/guide/topics/sensors/sensors_overview :
'You can access sensors available on the device and acquire raw sensor data by using the Android sensor framework.'
https://developer.android.com/reference/android/hardware/SensorManager
I am making a Raspberry Pi robot with an FPV (First Person View) camera mounted on a pan/tilt servo. I want to make it VR compatible by connecting it to my phone. But my phone doesn't have a gyroscope sensor to detect horizontal movements; it only has a magnetometer and an accelerometer. How can I combine data from the accelerometer and magnetometer to make a virtual gyroscope that can move with my camera? I am a noob at all of this.
You should have a rotation vector sensor that is already fusing the two. You will not get better results than it does.
Note that this will not be as high quality as a proper gyroscope and will have artifacts if the robot moves.
If you're still interested in how to make this yourself, you can get roll and pitch information from the accelerometer, then get the yaw information from the magnetometer. It's best if you find a library for 3D maths and do this with quaternions or matrices. This seems like a use case where you will hit gimbal lock easily, so Euler angles will be problematic.
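For example, here is a rough sketch of that accelerometer + magnetometer route using the helpers SensorManager already provides (no smoothing included, so expect noisy output; class and field names are illustrative):

```java
// Sketch: derive azimuth (yaw), pitch and roll from accelerometer + magnetometer
// only, via SensorManager.getRotationMatrix / getOrientation. Register this
// listener for TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class AccelMagOrientation implements SensorEventListener {
    private final float[] gravity = new float[3];
    private final float[] geomagnetic = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)
    private boolean hasGravity = false;
    private boolean hasGeomagnetic = false;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, gravity, 0, 3);
            hasGravity = true;
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, geomagnetic, 0, 3);
            hasGeomagnetic = true;
        }
        if (hasGravity && hasGeomagnetic
                && SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
            SensorManager.getOrientation(rotationMatrix, orientation);
            // orientation[0] = azimuth (yaw), orientation[1] = pitch, orientation[2] = roll,
            // all in radians; low-pass filter these before driving the servos.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```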
I guess you want to use this for VR? Don't try to move the servos to compensate for head movement directly; you'll only make a motion-sickness generator. Look at how timewarp works: you move the servos in the general direction the person is looking and render the video reprojected onto a sphere. This way you have almost zero lag.
I need to create an app that calculates the device's velocity, with x/y/z speed.
My idea is to use the device's accelerometer and gyroscope, like this pipeline.
I wanted to know whether the accelerometer and gyroscope are the right sensor choice for this (in the pipeline).
What rotation table should I use for this?
Is there any method of calculating linear distance using accelerometer and gyroscope sensor data, as the double integral of acceleration seems to give a lot of drift?
Note: image-processing techniques using the camera / GPS seem to be heavy on the battery.
Since you wish to calculate linear distance, you should not read from raw accelerometer data. In API 9, Android introduced Sensor.TYPE_LINEAR_ACCELERATION, which is simply the raw accelerometer values with the gravity component subtracted.
The drift can build up very quickly if there is even a single degree of error in your own calculations for extracting the linear component from the raw data. Check out this experience from Google: SensorFusion.
The question is very similar to Calculating distance using Linear acceleration android
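For completeness, a minimal sketch of reading Sensor.TYPE_LINEAR_ACCELERATION instead of the raw accelerometer (the class name is just an example):

```java
// Sketch: register for TYPE_LINEAR_ACCELERATION, which already has the gravity
// component removed, instead of subtracting gravity from the raw accelerometer.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class LinearAccelerationReader implements SensorEventListener {

    public void start(Context context) {
        SensorManager sensorManager =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor linearAccel = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
        sensorManager.registerListener(this, linearAccel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Values are in m/s^2 with gravity already subtracted; they are still noisy,
        // so double integration to distance will drift, as noted above.
        float ax = event.values[0];
        float ay = event.values[1];
        float az = event.values[2];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```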
I have a very basic question about Sensors:
Do magnetic sensors return readings w.r.t. the phone's initial orientation or w.r.t. world coordinates?
What about accelerometers? Do they return values w.r.t. their previous readings, or is each value an independent acceleration relative to the world coordinate system?
I know that gyros return readings relative to the phone's initial orientation. So, how do I convert the yaw, pitch, and roll readings from a gyro into the azimuth, pitch, and roll readings from the magnetic sensor of a smartphone (I'm using an HTC Hero)?
Thanks!
As mentioned, the gyroscope measures angular velocity. The third value returned (values[2]) is the angular velocity around the z axis. You can use this value together with the initial value from the magnetometer to calculate the current heading: Theta(i+1) = Theta(i) + Wgyro * deltaT
You can get the initial heading from the 'Orientation' measurement (values[0]). This measurement depends only on the magnetometer (you can put a magnet or a second phone close to the smartphone and watch the output go crazy).
The second and third values of the 'Orientation' depend on the readings of the accelerometer. Since the accelerometer measures gravity, it is possible to calculate the pitch and roll angles from the accelerometer readings on the Y and X axes.
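A small sketch of that integration, assuming gyroscope readings in rad/s and an initial azimuth already obtained from the orientation reading (values[0], converted to radians if needed); the class and method names are illustrative:

```java
// Sketch: integrate the gyroscope's z-axis angular velocity on top of an initial
// heading taken from the (magnetometer-based) orientation reading:
// Theta(i+1) = Theta(i) + Wgyro * deltaT
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class HeadingTracker implements SensorEventListener {
    private float headingRad;              // current heading estimate (radians)
    private boolean initialized = false;
    private long lastGyroTimestampNs = 0;

    // Call once with the azimuth (values[0]) from the orientation reading, in radians.
    public void setInitialHeading(float azimuthRad) {
        headingRad = azimuthRad;
        initialized = true;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (!initialized || event.sensor.getType() != Sensor.TYPE_GYROSCOPE) {
            return;
        }
        if (lastGyroTimestampNs != 0) {
            float deltaT = (event.timestamp - lastGyroTimestampNs) * 1e-9f; // ns -> s
            float wGyroZ = event.values[2];  // angular velocity around z (rad/s)
            headingRad += wGyroZ * deltaT;   // Theta(i+1) = Theta(i) + Wgyro * deltaT
        }
        lastGyroTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```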
Hope this helps
Ariel
Android sensors (up to FroYo) provide the application with "raw" data. There is a bare minimum of "cooking" (i.e. processing) involved.
The accelerometer and compass devices provide absolute acceleration and magnetic data respectively. The gyroscope provides relative angular velocity; gyroscopes do NOT provide relative data w.r.t. any specific state/position.
What you need to understand is that gyroscope data is angular velocity. Angular velocity is simply how fast the phone is rotating (reported in radians per second on Android). So when you hold the phone still, the gyro reads (0,0,0), and as you rotate it, you get how fast it is rotating. This continues until you hold it still again, when the gyro reading goes back to (0,0,0).
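As a quick illustration (the class name is just an example), reading that rotation rate directly looks like this:

```java
// Sketch: the gyroscope reports angular velocity directly, so "how fast the phone
// is rotating" is just the magnitude of the reading (converted to deg/s here).
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class SpinRateListener implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_GYROSCOPE) {
            return;
        }
        float wx = event.values[0], wy = event.values[1], wz = event.values[2]; // rad/s
        double rateDegPerSec = Math.toDegrees(Math.sqrt(wx * wx + wy * wy + wz * wz));
        // ~0 while the phone is held still; grows as it is rotated faster.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```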
Theoretically the gyro can be used to "calibrate" the compass, but to do so would require a lot of experimentation on your part. The ideal place to fiddle around would be the sensor HAL.
NOTE: You would need to turn ON all the sensor hardware even if ONLY compass data is required, as you will be cross-referencing the gyro/accel data for that. This means larger power consumption and extremely poor battery life. With all the sensors turned on continuously, the battery of a standard Android phone can drain in 4-5 hours.
You can read more about Android Sensors here.