I want to take a photo only when the device is at an inclination of 180 degrees +/- 5 degrees; at any other angle the camera should be blocked.
In short, I want to force the photo to be taken horizontally to the floor (or close to it).
What is the easiest way? Should I use the gyroscope? The problem is that the angle must be checked continuously to make sure it stays within range.
Use the orientation sensor. The framework notifies you whenever the sensor value changes, so as long as you keep the sensor registered, you will always have the current device orientation.
Retrieve the SensorManager by calling context.getSystemService(Context.SENSOR_SERVICE). Also look at Android Orientation sensor for rotating the 3D cube
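Since the orientation sensor is deprecated on newer API levels, a common alternative is to derive the tilt directly from the accelerometer's gravity vector. A minimal sketch (the class and method names here are made up for illustration; the three floats are assumed to come from a TYPE_ACCELEROMETER event's values array):

```java
public class TiltGate {
    static final double TOLERANCE_DEG = 5.0;

    // Tilt of the device's z axis away from straight up, in degrees:
    // 0 = screen facing the sky, 180 = screen facing the floor.
    static double tiltDegrees(float ax, float ay, float az) {
        double horizontal = Math.sqrt(ax * ax + ay * ay);
        return Math.toDegrees(Math.atan2(horizontal, az));
    }

    // Allow the shutter only when the phone is flat within the tolerance
    // (either face up or face down).
    static boolean isCameraAllowed(float ax, float ay, float az) {
        double tilt = tiltDegrees(ax, ay, az);
        return tilt <= TOLERANCE_DEG || tilt >= 180.0 - TOLERANCE_DEG;
    }
}
```

You would call isCameraAllowed from onSensorChanged and enable/disable the shutter button accordingly, which also handles the "checked all the time" requirement, since the callback fires on every new reading.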
I am able to get the orientation of my Android phone using TYPE_ROTATION_VECTOR and can extract the three rotation angles in degrees. However, how do I set it so that the starting orientation of the phone is all zeros? That way, regardless of the starting orientation, all measured orientations will be relative to the starting orientation -- not fixed to magnetic north and the ground.
I found this page:
http://www.outware.com.au/insights/which-direction-am-i-facing-using-the-sensors-on-your-android-phone-to-record-where-you-are-facing/
and this got it working. Use TYPE_ROTATION_VECTOR with a math class from org.apache. Save your starting sensor reading; it will be a quaternion stored as a 4-element array. Then, for each future reading, apply the starting reading and then get Android's rotation matrix.
I am trying to measure the angle of my phone using the accelerometer. I refer to the method demonstrated here. But the problem is that when I rotate my phone at high speed, it skips readings. For example, if I rotate my phone from 0 degrees to 90 degrees within a second, I get readings in logcat such as: 0, 50, 90. What I really want is to obtain all the values through which the phone has rotated, like 0, 1, 2, ..., 90. Is that possible? If yes, what am I missing? Do I need to use the gyroscope/magnetometer as well?
Basically it was the sensor delay that was causing the issue. I resolved it by setting the delay to SensorManager.SENSOR_DELAY_FASTEST when registering the listener.
I want to use my phone as a wheel in an Android game. To do so I have to save the current orientation of my phone and get the angles relative to this saved orientation, in device coordinates.
For example, if I rotate the device around the z axis (see image above), I want to get that angle with respect to the orientation I saved before.
From libGDX I only get the azimuth, pitch, and roll angles relative to the world coordinate system (if I understood this right).
Any idea how I can calculate those relative angles?
How to calculate relative orientation.
1) You can convert them to quaternions using this.
2) Take conjugate of the quaternion of initial orientation using this.
3) Multiply both using this.
4) You now have relative orientation in form of a quaternion. To use it, you may either want to convert it to axis angle form using this, or transform a vector by this orientation.
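The four steps above can be sketched with plain arrays, with no org.apache dependency, since the math is standard. This assumes unit quaternions in (x, y, z, w) order, as Android's rotation vector reports them; the class and method names are made up:

```java
public class RelQuat {
    // 2) Conjugate: negate the vector part.
    static float[] conjugate(float[] q) {
        return new float[] { -q[0], -q[1], -q[2], q[3] };
    }

    // 3) Hamilton product a * b.
    static float[] multiply(float[] a, float[] b) {
        float ax = a[0], ay = a[1], az = a[2], aw = a[3];
        float bx = b[0], by = b[1], bz = b[2], bw = b[3];
        return new float[] {
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw,
            aw * bw - ax * bx - ay * by - az * bz
        };
    }

    // Relative orientation = current * conjugate(initial).
    static float[] relative(float[] initial, float[] current) {
        return multiply(current, conjugate(initial));
    }

    // 4) Rotation angle of a unit quaternion, in degrees (axis-angle form).
    static double angleDegrees(float[] q) {
        return Math.toDegrees(2.0 * Math.acos(Math.min(1f, Math.abs(q[3]))));
    }
}
```

For the wheel use case, the angle of the relative quaternion (or its projection onto the z axis) would be your steering input.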
Consider whether you really need relative orientation.
If you need a simple wheel, accelerometer might be enough. More on this here: Compute relative orientation given azimuth, pitch, and roll in android?
Hope this helps. Good luck.
Android provides sensor data in the device coordinate system no matter how the device is oriented. Is there any way to get sensor data in a 'gravity' coordinate system? I mean, no matter how the device is oriented, I want accelerometer data and orientation in a coordinate system where the y-axis points toward the sky, the x-axis toward the east, and the z-axis toward the south pole.
I took a look at remapCoordinateSystem, but it seems to be limited to just swapping axes. I guess for orientation I will have to do some low-level rotation matrix transformation (or is there a better solution?). But what about acceleration data? Is there any way to get the data relative to a fixed coordinate system (a sort of world coordinate system)?
The reason I need this is that I'm trying to detect some simple motion gestures while the phone is in a pocket, and it would be easier for me to have all data in a coordinate system related to the user rather than to the device (which will be oriented a little differently in different users' pockets).
Well, you basically get the North orientation at startup: for this you use the accelerometer and the magnetic field sensor to compute the orientation (or the deprecated orientation sensor).
Once you have it you can compute a rotation matrix (Direction Cosine Matrix) from those azimuth, pitch and roll angles. Multiplying your acceleration vector by that matrix will transform your device-frame movements into Earth-frame ones.
As your device changes its orientation over time, you'll need to keep the matrix up to date. To do so, retrieve the gyroscope's data and update your Direction Cosine Matrix with each new value. You could also recompute the orientation from scratch just like the first time, but that's less accurate.
My solution involves a DCM, but you could also use quaternions; it's just a matter of choice. Feel free to ask more if needed. I hope this is what you wanted to know!
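A minimal sketch of the DCM transform described above, assuming the standard ZYX (yaw-pitch-roll) Euler convention; note that Android's SensorManager.getRotationMatrix uses its own axis conventions, so the matrix layout would need adapting for real sensor output:

```java
public class EarthFrame {
    // Direction Cosine Matrix R = Rz(yaw) * Ry(pitch) * Rx(roll),
    // with all angles in radians.
    static double[][] dcm(double yaw, double pitch, double roll) {
        double cy = Math.cos(yaw), sy = Math.sin(yaw);
        double cp = Math.cos(pitch), sp = Math.sin(pitch);
        double cr = Math.cos(roll), sr = Math.sin(roll);
        return new double[][] {
            { cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr },
            { sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr },
            { -sp,     cp * sr,                cp * cr                }
        };
    }

    // Multiply the device-frame acceleration vector by the DCM to get
    // the Earth-frame acceleration vector.
    static double[] toEarth(double[][] r, double[] v) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++)
            out[i] = r[i][0] * v[0] + r[i][1] * v[1] + r[i][2] * v[2];
        return out;
    }
}
```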
I want to know the values of the X, Y, and Z axes for any position/movement of the device so I can use them in my further work. From what I've found there are two options: the orientation sensor (which gives values in degrees, as azimuth, pitch, and roll) and the accelerometer (which gives values roughly between -10 and 10 for x, y, and z).
As per my understanding, both would serve my requirement, and I can't see the difference between them. Can anyone explain them in detail with respect to my aim? Which sensor should I use?
There are differences between the two:
The accelerometer detects acceleration in space. The reason it always detects an acceleration of 9.8 m/s^2 downwards is that gravity is equivalent to an acceleration in space.
The orientation sensor detects whether your device's axes are rotated away from the real-world frame; it reports tilt and the heading in degrees from magnetic North. Please note that this sensor is deprecated, and Google recommends using the accelerometer and magnetometer to calculate orientation.
You'll need the accelerometer to detect movement. So you should use that one, since your aim is to know about movement.
The orientation sensor gives information about the device's position compared to a reference plane. So you could use it to see if the device is tilted, upside down, or something like that.
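One simple way to use the accelerometer for movement detection, as suggested above, is to watch how far the acceleration magnitude deviates from gravity: a device at rest measures only gravity, so the magnitude stays near 9.81 m/s^2 in any orientation. A sketch (the class name and threshold value are arbitrary assumptions):

```java
public class MovementDetector {
    static final double G = 9.81;         // standard gravity, m/s^2
    static final double THRESHOLD = 1.5;  // illustrative threshold, m/s^2

    // True when the magnitude of the measured acceleration departs
    // noticeably from gravity, i.e. the device is likely being moved.
    static boolean isMoving(float ax, float ay, float az) {
        double magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
        return Math.abs(magnitude - G) > THRESHOLD;
    }
}
```

Because it uses only the magnitude, this check is independent of how the device is oriented, which is exactly why the accelerometer alone cannot tell you orientation but can tell you about movement.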