" 'onSensorChanged' overrides nothing " Fault - android

I'm trying to read values from SensorManager. I'm working with the gyroscope sensor and want to examine its values, so I copied the sample code from the Android documentation:
https://developer.android.com/guide/topics/sensors/sensors_motion#sensors-motion-gyro
However, I get a compile error at override fun onSensorChanged(event: SensorEvent?):
'onSensorChanged' overrides nothing
class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
    }

    // Create a constant to convert nanoseconds to seconds.
    private val NS2S = 1.0f / 1000000000.0f
    private val deltaRotationVector = FloatArray(4) { 0f }
    private var timestamp: Float = 0f

    override fun onSensorChanged(event: SensorEvent?) {
        // This timestep's delta rotation to be multiplied by the current rotation
        // after computing it from the gyro sample data.
        if (timestamp != 0f && event != null) {
            val dT = (event.timestamp - timestamp) * NS2S
            // Axis of the rotation sample, not normalized yet.
            var axisX: Float = event.values[0]
            var axisY: Float = event.values[1]
            var axisZ: Float = event.values[2]

            // Calculate the angular speed of the sample
            val omegaMagnitude: Float = sqrt(axisX * axisX + axisY * axisY + axisZ * axisZ)

            // Normalize the rotation vector if it's big enough to get the axis
            // (that is, EPSILON should represent your maximum allowable margin of error)
            if (omegaMagnitude > EPSILON) {
                axisX /= omegaMagnitude
                axisY /= omegaMagnitude
                axisZ /= omegaMagnitude
            }

            // Integrate around this axis with the angular speed by the timestep
            // in order to get a delta rotation from this sample over the timestep
            // We will convert this axis-angle representation of the delta rotation
            // into a quaternion before turning it into the rotation matrix.
            val thetaOverTwo: Float = omegaMagnitude * dT / 2.0f
            val sinThetaOverTwo: Float = sin(thetaOverTwo).toFloat()
            val cosThetaOverTwo: Float = cos(thetaOverTwo).toFloat()
            deltaRotationVector[0] = sinThetaOverTwo * axisX
            deltaRotationVector[1] = sinThetaOverTwo * axisY
            deltaRotationVector[2] = sinThetaOverTwo * axisZ
            deltaRotationVector[3] = cosThetaOverTwo
            Log.d("DENEME", "onSensorChanged: " + axisX)
            Log.d("DENEME", "onSensorChanged: " + axisY)
            Log.d("DENEME", "onSensorChanged: " + axisZ)
        }
        timestamp = event?.timestamp?.toFloat() ?: 0f
        val deltaRotationMatrix = FloatArray(9) { 0f }
        SensorManager.getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector)
        // User code should concatenate the delta rotation we computed with the current rotation
        // in order to get the updated rotation.
        // rotationCurrent = rotationCurrent * deltaRotationMatrix;
    }

    fun onClickDevam(view: View) { // click event button to check values
        val sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
        val sensor: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)
    }
}

The override keyword in Kotlin indicates that the class is overriding a function from a superclass or interface. The Android documentation seems to be missing a pretty important step: having your activity class implement the SensorEventListener interface.
To do this, change your MainActivity declaration to something like this:
class MainActivity : AppCompatActivity(), SensorEventListener {
SensorEventListener contains the onSensorChanged function you're talking about. It will also require you to override an additional function, onAccuracyChanged, so you'll need to do this as well (but if you don't really care about accuracy changes you can leave the function's body empty - you just need to override it to satisfy the interface).
Android Studio has a handy shortcut for automatically overriding functions from interfaces which you may find useful: Ctrl+O
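A minimal sketch of what the fixed class can look like (the listener registration in onResume/onPause is an assumption based on the usual SensorManager pattern, not part of the original snippet):

class MainActivity : AppCompatActivity(), SensorEventListener {

    private lateinit var sensorManager: SensorManager
    private var gyroscope: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
        gyroscope = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)
    }

    override fun onResume() {
        super.onResume()
        // Register so that onSensorChanged actually receives gyroscope events.
        gyroscope?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent?) {
        // Gyroscope handling code from the question goes here.
    }

    // Required by SensorEventListener; can be left empty if accuracy changes don't matter.
    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}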

Related

How to implement the "fast inverse square root" with Kotlin?

fun invSqrt(x: Float): Float {
    var x = x
    val xhalf = 0.5F * x
    var i = java.lang.Float.floatToIntBits(x)
    i = 0x5f3759df - (i shr 1)
    x = java.lang.Float.intBitsToFloat(i)
    x *= 1.5F - xhalf * x * x
    return x
}
Is there any shorter or faster way to do this with Kotlin?
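One possible shortening (a sketch, not from the original post) is to use the Kotlin standard library bit-conversion extensions instead of going through java.lang.Float:

// Sketch of a more Kotlin-idiomatic version using Float.toRawBits()/Float.fromBits();
// the algorithm itself is unchanged.
fun invSqrt(x: Float): Float {
    val y = Float.fromBits(0x5f3759df - (x.toRawBits() shr 1))
    return y * (1.5f - 0.5f * x * y * y)
}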

pitch returned by getOrientation function is wrong

The old Sensor.TYPE_ORIENTATION sensor returned a pitch between -180° and 180°. This was a nice API which included filtering and worked great. Sadly, Sensor.TYPE_ORIENTATION was deprecated and is not available on modern phones.
The blessed replacement for Sensor.TYPE_ORIENTATION is a complicated combination of the TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD sensors with the SensorManager.getRotationMatrix() and SensorManager.getOrientation() functions. You're on your own when it comes to filtering. (As an aside, I used iirj; the trivial low-pass filters I found on Stack Overflow did not work as well as whatever Sensor.TYPE_ORIENTATION did.)
The documentation for getOrientation claims that it returns a pitch between -π and π. This can't be true, since the implementation is values[1] = (float) Math.asin(-R[7]); and asin only returns values between -π/2 and π/2.
Is there any way to get the full 360° of pitch and roll from the rotation matrix?
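For reference, the replacement path described above looks roughly like this (a minimal sketch inside a SensorEventListener; filtering intentionally omitted):

// Combine accelerometer and magnetometer readings, then derive azimuth/pitch/roll.
private val accel = FloatArray(3)
private val magnet = FloatArray(3)

override fun onSensorChanged(event: SensorEvent) {
    when (event.sensor.type) {
        Sensor.TYPE_ACCELEROMETER -> System.arraycopy(event.values, 0, accel, 0, 3)
        Sensor.TYPE_MAGNETIC_FIELD -> System.arraycopy(event.values, 0, magnet, 0, 3)
    }
    val r = FloatArray(9)
    if (SensorManager.getRotationMatrix(r, null, accel, magnet)) {
        val orientation = FloatArray(3)
        SensorManager.getOrientation(r, orientation) // azimuth, pitch, roll in radians
    }
}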
This is a known issue that Google won't fix. I created my own getOrientation function, based on Gregory G. Slabaugh's paper:
// Based on pseudo code from http://www.close-range.com/docs/Computing_Euler_angles_from_a_rotation_matrix.pdf
object EulerAngleHelper {
    private const val R11 = 0
    private const val R12 = 1
    private const val R13 = 2
    private const val R21 = 3
    private const val R22 = 4
    private const val R23 = 5
    private const val R31 = 6
    private const val R32 = 7
    private const val R33 = 8
    private const val AZIMUTH = 0
    private const val PITCH = 1
    private const val ROLL = 2
    private const val PHI_Z = AZIMUTH
    private const val PSI_X = PITCH
    private const val THETA_Y = ROLL

    fun getOrientation(r: DoubleArray, values: DoubleArray): DoubleArray {
        when {
            r[R31] < -0.98 -> {
                values[PHI_Z] = 0.0 // Anything; can set to 0
                values[THETA_Y] = Math.PI / 2
                values[PSI_X] = values[PHI_Z] + atan2(r[R12], r[R13])
            }
            r[R31] > 0.98 -> {
                values[PHI_Z] = 0.0 // Anything; can set to 0
                values[THETA_Y] = -Math.PI / 2
                values[PSI_X] = values[PHI_Z] + atan2(-r[R12], -r[R13])
            }
            else -> {
                values[THETA_Y] = -asin(r[R31])
                val cosTheta = cos(values[THETA_Y])
                values[PSI_X] = atan2(r[R32] / cosTheta, r[R33] / cosTheta)
                values[PHI_Z] = atan2(r[R21] / cosTheta, r[R11] / cosTheta)
            }
        }
        return values
    }
}
I've only tested the pitch and roll.
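A hypothetical usage sketch (not from the original answer), assuming you are inside an onSensorChanged callback for a rotation vector sensor: convert the matrix from SensorManager to doubles and read out the pitch.

// Illustrative only: feed a 3x3 rotation matrix from SensorManager into the helper.
val rotationMatrix = FloatArray(9)
SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
val angles = EulerAngleHelper.getOrientation(
    DoubleArray(9) { rotationMatrix[it].toDouble() },
    DoubleArray(3)
)
val pitchDegrees = Math.toDegrees(angles[1]) // pitch in degrees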

MPAndroidChart - Piechart - custom label lines

I'm trying to draw the label lines as in the picture using MPAndroidChart with a pie chart. I can't figure out how to:
1. decouple the lines from the chart
2. draw that little circle at the beginning of the line.
Thank you.
This is by no means easy to achieve. To decouple the lines from the chart, you can use valueLinePart1OffsetPercentage and play with line part lengths. But to get the chart to draw dots at the end of lines, you need a custom renderer. Here's one:
class CustomPieChartRenderer(pieChart: PieChart, val circleRadius: Float)
    : PieChartRenderer(pieChart, pieChart.animator, pieChart.viewPortHandler) {

    override fun drawValues(c: Canvas) {
        super.drawValues(c)
        val center = mChart.centerCircleBox
        val radius = mChart.radius
        var rotationAngle = mChart.rotationAngle
        val drawAngles = mChart.drawAngles
        val absoluteAngles = mChart.absoluteAngles
        val phaseX = mAnimator.phaseX
        val phaseY = mAnimator.phaseY
        val roundedRadius = (radius - radius * mChart.holeRadius / 100f) / 2f
        val holeRadiusPercent = mChart.holeRadius / 100f
        var labelRadiusOffset = radius / 10f * 3.6f
        if (mChart.isDrawHoleEnabled) {
            labelRadiusOffset = (radius - radius * holeRadiusPercent) / 2f
            if (!mChart.isDrawSlicesUnderHoleEnabled && mChart.isDrawRoundedSlicesEnabled) {
                rotationAngle += roundedRadius * 360 / (Math.PI * 2 * radius).toFloat()
            }
        }
        val labelRadius = radius - labelRadiusOffset
        val dataSets = mChart.data.dataSets
        var angle: Float
        var xIndex = 0
        c.save()
        for (i in dataSets.indices) {
            val dataSet = dataSets[i]
            val sliceSpace = getSliceSpace(dataSet)
            for (j in 0 until dataSet.entryCount) {
                angle = if (xIndex == 0) 0f else absoluteAngles[xIndex - 1] * phaseX
                val sliceAngle = drawAngles[xIndex]
                val sliceSpaceMiddleAngle = sliceSpace / (Utils.FDEG2RAD * labelRadius)
                angle += (sliceAngle - sliceSpaceMiddleAngle / 2f) / 2f
                if (dataSet.valueLineColor != ColorTemplate.COLOR_NONE) {
                    val transformedAngle = rotationAngle + angle * phaseY
                    val sliceXBase = cos(transformedAngle * Utils.FDEG2RAD.toDouble()).toFloat()
                    val sliceYBase = sin(transformedAngle * Utils.FDEG2RAD.toDouble()).toFloat()
                    val valueLinePart1OffsetPercentage = dataSet.valueLinePart1OffsetPercentage / 100f
                    val line1Radius = if (mChart.isDrawHoleEnabled) {
                        (radius - radius * holeRadiusPercent) * valueLinePart1OffsetPercentage + radius * holeRadiusPercent
                    } else {
                        radius * valueLinePart1OffsetPercentage
                    }
                    val px = line1Radius * sliceXBase + center.x
                    val py = line1Radius * sliceYBase + center.y
                    if (dataSet.isUsingSliceColorAsValueLineColor) {
                        mRenderPaint.color = dataSet.getColor(j)
                    }
                    c.drawCircle(px, py, circleRadius, mRenderPaint)
                }
                xIndex++
            }
        }
        MPPointF.recycleInstance(center)
        c.restore()
    }
}
This custom renderer extends the default pie chart renderer. I basically just copied the code from the PieChartRenderer.drawValues method, converted it to Kotlin, and removed everything that wasn't needed. I only kept the logic needed to determine the position of the points at the end of the lines.
I tried to reproduce the image you showed:
val chart: PieChart = view.findViewById(R.id.pie_chart)
chart.setExtraOffsets(40f, 0f, 40f, 0f)

// Custom renderer used to add dots at the end of value lines.
chart.renderer = CustomPieChartRenderer(chart, 10f)

val dataSet = PieDataSet(listOf(
    PieEntry(40f),
    PieEntry(10f),
    PieEntry(10f),
    PieEntry(15f),
    PieEntry(10f),
    PieEntry(5f),
    PieEntry(5f),
    PieEntry(5f)
), "Pie chart")

// Chart colors
val colors = listOf(
    Color.parseColor("#4777c0"),
    Color.parseColor("#a374c6"),
    Color.parseColor("#4fb3e8"),
    Color.parseColor("#99cf43"),
    Color.parseColor("#fdc135"),
    Color.parseColor("#fd9a47"),
    Color.parseColor("#eb6e7a"),
    Color.parseColor("#6785c2"))
dataSet.colors = colors
dataSet.setValueTextColors(colors)

// Value lines
dataSet.valueLinePart1Length = 0.6f
dataSet.valueLinePart2Length = 0.3f
dataSet.valueLineWidth = 2f
dataSet.valueLinePart1OffsetPercentage = 115f // Line starts outside of chart
dataSet.isUsingSliceColorAsValueLineColor = true

// Value text appearance
dataSet.yValuePosition = PieDataSet.ValuePosition.OUTSIDE_SLICE
dataSet.valueTextSize = 16f
dataSet.valueTypeface = Typeface.DEFAULT_BOLD

// Value formatting
dataSet.valueFormatter = object : ValueFormatter() {
    private val formatter = NumberFormat.getPercentInstance()
    override fun getFormattedValue(value: Float) =
        formatter.format(value / 100f)
}
chart.setUsePercentValues(true)
dataSet.selectionShift = 3f

// Hole
chart.isDrawHoleEnabled = true
chart.holeRadius = 50f

// Center text
chart.setDrawCenterText(true)
chart.setCenterTextSize(20f)
chart.setCenterTextTypeface(Typeface.DEFAULT_BOLD)
chart.setCenterTextColor(Color.parseColor("#222222"))
chart.centerText = "Center\ntext"

// Disable legend & description
chart.legend.isEnabled = false
chart.description = null

chart.data = PieData(dataSet)
Again, not very straightforward. I hope you like Kotlin! You can move most of that configuration code to a subclass if you need it often. Here's the result:
I'm not an MPAndroidChart expert. In fact, I've used it only once, and that was 2 years ago. But if you do your research, you can find a solution most of the time. Luckily, MPAndroidChart is very customizable.
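For instance, a sketch of the subclass approach suggested above might look like this (the class name and which settings to bake in are illustrative, not from the original answer):

// One possible way to package the shared configuration once.
class LabeledDotPieChart @JvmOverloads constructor(
    context: Context, attrs: AttributeSet? = null
) : PieChart(context, attrs) {
    init {
        setExtraOffsets(40f, 0f, 40f, 0f)
        renderer = CustomPieChartRenderer(this, 10f)
        isDrawHoleEnabled = true
        holeRadius = 50f
        setUsePercentValues(true)
        legend.isEnabled = false
        description = null
    }
}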

How to calculate 2D rotation from 3D matrix obtained from sensor TYPE_ROTATION_VECTOR (to lock one axis rotation)

I want to build an application that uses phone sensor fusion to rotate a 3D object in OpenGL, but with the Z axis locked, so essentially I want to apply a 2D rotation to my model.
In order to build the 2D rotation from the 3D matrix I get from SensorManager.getRotationMatrixFromVector(), I build a rotation matrix for each axis as explained in the picture:
I want to apply the 2D rotation matrix R = Ry*Rx, but this does not work. Applying R = Rz*Ry works as expected, so my guess is that the Rx values are not correct.
To build the Rz, Ry, Rx matrices I looked up the values used by SensorManager.getOrientation() to calculate the angles:
values[0] = (float) Math.atan2(R[1], R[4]);
values[1] = (float) Math.asin(-R[7]);
values[2] = (float) Math.atan2(-R[6], R[8]);
So here is how I build matrices for each axis:
private val degConst = 180 / Math.PI
private var mTempRotationMatrix = MatrixCalculations.createUnit(3)

override fun onSensorChanged(event: SensorEvent?) {
    val sensor = event?.sensor ?: return
    when (sensor.type) {
        Sensor.TYPE_ROTATION_VECTOR -> {
            SensorManager.getRotationMatrixFromVector(mTempRotationMatrix, event.values)
            val zSinAlpha = mTempRotationMatrix[1]
            val zCosAlpha = mTempRotationMatrix[4]
            val ySinAlpha = -mTempRotationMatrix[6]
            val yCosAlpha = mTempRotationMatrix[8]
            val xSinAlpha = -mTempRotationMatrix[7]
            val xCosAlpha = mTempRotationMatrix[4]
            val rx = MatrixCalculations.createUnit(3)
            val ry = MatrixCalculations.createUnit(3)
            val rz = MatrixCalculations.createUnit(3)
            val sina = xSinAlpha
            val cosa = xCosAlpha
            val sinb = ySinAlpha
            val cosb = yCosAlpha
            val siny = zSinAlpha
            val cosy = zCosAlpha
            rx[4] = cosa
            rx[5] = -sina
            rx[7] = sina
            rx[8] = cosa
            ry[0] = cosb
            ry[2] = sinb
            ry[6] = -sinb
            ry[8] = cosb
            rz[0] = cosy
            rz[1] = -siny
            rz[3] = siny
            rz[4] = cosy
            val ryx = MatrixCalculations.multiply(ry, rx)
            mTempRotationMatrix = ryx
            MatrixCalculations.copy(mTempRotationMatrix, mRenderer.rotationMatrix)
            LOG.info("product: [" + mRenderer.rotationMatrix.joinToString(" ") + "]")
            val orientation = FloatArray(3)
            SensorManager.getOrientation(mTempRotationMatrix, orientation)
            LOG.info("yaw: " + orientation[0] * degConst + "\n\tpitch: " + orientation[1] * degConst + "\n\troll: " + orientation[2] * degConst)
        }
    }
}
The question is: what am I doing wrong, and which values should I use for the Rx matrix? Is my math for this problem broken? It would also be interesting to know how the value of event.values[3] is used to build the rotation matrix in SensorManager.getRotationMatrixFromVector().
In theory R = Ry*Rx should give me the correct rotation, but that is not the case.

Solving for calibration quaternion

I'm writing an Android app that requires the rotation vector. I'd like to use TYPE_ROTATION_VECTOR, but on some of my test devices the magnetometer doesn't perform well, to say the least. Instead, TYPE_GAME_ROTATION_VECTOR provides much smoother data (but I can't get direction relative to the Earth). What I ended up doing is running both virtual sensors while my data is loading. I now have an average quaternion for both; call them R (TYPE_ROTATION_VECTOR) and Rg (TYPE_GAME_ROTATION_VECTOR).
Once calibration is over I only run TYPE_GAME_ROTATION_VECTOR, but I would like to correct it for North. What I think I can do is something like R = Rg * C, where C is my calibration and Rg is the new TYPE_GAME_ROTATION_VECTOR data after a low-pass filter. What I tried:
1. R = Rg * C
2. R * R' = Rg * C * R'
3. U = Rg * C * R' // Here U is the unit quaternion
4. C * R' = Rg' // This is because quaternion multiplication is associative
// Rg * (C * R') = U from line 3 therefore (C * R') must be
// equal to the conjugate of Rg
5. C = Rg' * R'' // I found this online somewhere (I hope this is right)
6. C = Rg' * R // R'' is just R
Now that I have C, I can take new values (after the low-pass filter) from the TYPE_GAME_ROTATION_VECTOR, multiply them by C, and get the actual rotation quaternion R, which should be similar to the one the TYPE_ROTATION_VECTOR would have provided, with a steady North.
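In code, the idea boils down to something like this sketch (illustrative names only; it assumes a small Quaternion helper with mult() as in the snippet further below and a conjugated() function that returns (w, -x, -y, -z)):

// Averaged quaternions rAverage (rotation vector) and rgAverage (game rotation
// vector) are assumed to have been collected during calibration.
val c = rgAverage.conjugated().mult(rAverage)   // step 6: C = Rg' * R
// Later, for each new (low-pass filtered) game rotation sample:
val estimatedR = newGameQuat.mult(c)            // step 1 applied: R = Rg * C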
This gets me pretty close, but it doesn't quite work. I'm testing using a very simple AR-like app that shows an item (whose position is determined by the device orientation) floating on the screen. If I leave out the calibration, the character shows up and tracks perfectly, but it doesn't show up North of me (I have it fixed at (0, 1, 0) for now). If I take the rotation vector, get the quaternion, and multiply by the calibration constant, the tracking gets thrown off:
- Rotating the device about the Y axis shifts the item correctly horizontally, but it also adds a vertical component: rotating in the positive direction (using the right-hand rule) moves my item up (negative Y on the screen).
- Rotating the device about the X axis shifts the item correctly vertically, but it also adds a horizontal component: rotating in the positive direction (using the right-hand rule) moves my item right (positive X on the screen).
- Rotating the device about the Z axis works.
Sorry for the long description, I just want to make sure all the details are there. Summary of the question: I want to get a rotation matrix that is roughly North-aligned while avoiding the magnetometer. I'm trying to do this by taking the average difference between TYPE_ROTATION_VECTOR and TYPE_GAME_ROTATION_VECTOR and using that to "calibrate" future values from TYPE_GAME_ROTATION_VECTOR, but it doesn't work. Does anyone know what the issue might be with how I'm calculating the calibration (or any other part of this)?
Some additional info:
private float[] values = null;

public void onSensorChanged(SensorEvent event) {
    values = lowPass(event.values.clone(), values);
    Quaternion rawQuaternion = Quaternion.fromRotationVector(values);
    Quaternion calibratedQuaternion = rawQuaternion.mult(calibration);
    float[] rotationMatrix = calibratedQuaternion.getRotationMatrix();
    float[] pos = new float[] { 0f, 1f, 0f, 1f };
    Matrix.multiplyMV(pos, 0, rotationMatrix, 0, pos, 0);
    Matrix.multiplyMV(pos, 0, matrixMVP, 0, pos, 0);
    // Screen position should be found at pos[0], -pos[1] on a [-1,1] scale
}

Quaternion fromRotationVector(float[] r) {
    float[] Q = new float[4];
    SensorManager.getQuaternionFromVector(Q, r);
    return new Quaternion(Q);
}

Quaternion mult(Quaternion q) {
    Quaternion qu = new Quaternion();
    qu.w = w*q.w - x*q.x - y*q.y - z*q.z;
    qu.x = w*q.x + x*q.w + y*q.z - z*q.y;
    qu.y = w*q.y + y*q.w + z*q.x - x*q.z;
    qu.z = w*q.z + z*q.w + x*q.y - y*q.x;
    return qu;
}

float[] getRotationMatrix() {
    float[] M = new float[16];
    float[] V = new float[] { x, y, z, w };
    SensorManager.getRotationMatrixFromVector(M, V);
    return M;
}
I had the same issue, did some research, and realized where the problem is. Basically, by only looking at a stationary orientation of the IMU, you align only one axis of the coordinate system: the vertical axis in the direction of gravity. That's why your rotations around the Z axis work fine.
To complete your static calibration, you have to include a planar motion and find the principal vector of that motion, which becomes, say, your X axis. The Y axis then follows from the right-hand rule.
Simply rotate the IMU around the global X axis and look at the gyroscope output. The principal component of the gyroscope signal should point along the X axis. After finding the Z axis in the first step and the X axis in the second step, you can find the Y axis as the cross product of the two. Using these axes, create the rotation matrix or the quaternion for the transformation, as in the sketch below.
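A rough sketch of that axis-completion step (all names are illustrative, not from any library; it assumes the gravity-aligned Z axis and the motion-derived X axis have already been estimated):

// Given an estimated Z axis (gravity direction) and X axis (principal axis of the
// planar motion), complete a right-handed frame and return its axes as matrix rows.
fun completeFrame(zAxis: FloatArray, xAxis: FloatArray): Array<FloatArray> {
    fun cross(a: FloatArray, b: FloatArray) = floatArrayOf(
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0]
    )
    fun normalize(v: FloatArray): FloatArray {
        val n = kotlin.math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])
        return floatArrayOf(v[0] / n, v[1] / n, v[2] / n)
    }
    val z = normalize(zAxis)
    // Re-orthogonalize X against Z, then Y = Z x X (right-hand rule).
    val xOrtho = normalize(cross(cross(z, normalize(xAxis)), z))
    val y = cross(z, xOrtho)
    // Rows are the calibrated axes expressed in sensor coordinates.
    return arrayOf(xOrtho, y, z)
}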
Here's what I ended up doing (there are some changes coming soon, and once they're done I'll publish it on jcenter as a library). What this tries to solve is being able to run the Game Rotation Vector sensor (which has much less drift than the Rotation Vector sensor) while still pointing roughly North. The answer is in Kotlin:
class RotationMatrixLiveData(context: Context) : LiveData<FloatArray>(), SensorEventListener {

    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)
    private val gameRotationSensor =
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2)
            sensorManager.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR)
        else null

    private var isActive = false
    private var isCalibrating = false
    private var rotationValues: FloatArray? = null

    var calibrationCount = 0
    var calibrationQuaternion: FloatArray? = null
    var calibrationGameCount = 0
    var calibrationGameQuat: FloatArray? = null
    var calibration: Quaternion? = null

    var rotationQuaternionValues = FloatArray(4)
    var gameQuaternionValues = FloatArray(4)
    private val rotationVectorQuaternion = Quaternion()

    init {
        value = floatArrayOf(
            1f, 0f, 0f, 0f,
            0f, 1f, 0f, 0f,
            0f, 0f, 1f, 0f,
            0f, 0f, 0f, 1f)
    }

    /**
     * Starts calibrating the rotation matrix (if the game rotation vector sensor
     * is available).
     */
    fun beginCalibration() {
        gameRotationSensor?.let {
            isCalibrating = true
            calibration = null
            calibrationQuaternion = null
            calibrationCount = 0
            calibrationGameQuat = null
            calibrationGameCount = 0
            sensorManager.registerListener(this, rotationSensor, SensorManager.SENSOR_DELAY_FASTEST)
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_FASTEST)
        }
    }

    /**
     * Stops calibrating the rotation matrix.
     */
    fun stopCalibration() {
        isCalibrating = false
        if (!isActive) {
            // Not active, just turn off everything
            sensorManager.unregisterListener(this)
        } else if (gameRotationSensor != null) {
            // Active and has both sensors, turn off rotation and leave the game rotation running
            sensorManager.unregisterListener(this, rotationSensor)
        }
    }

    override fun onActive() {
        super.onActive()
        isActive = true
        val sensor = gameRotationSensor ?: rotationSensor
        sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_FASTEST)
    }

    override fun onInactive() {
        super.onInactive()
        isActive = false
        if (!isCalibrating) {
            sensorManager.unregisterListener(this)
        }
    }

    //
    // SensorEventListener
    //

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}

    override fun onSensorChanged(event: SensorEvent) {
        if (isCalibrating) {
            if (event.sensor.type == Sensor.TYPE_ROTATION_VECTOR) {
                SensorManager.getQuaternionFromVector(rotationQuaternionValues, event.values)
                calibrationQuaternion?.let { quat ->
                    for (i in 0..3) {
                        rotationQuaternionValues[i] += quat[i]
                    }
                }
                calibrationQuaternion = rotationQuaternionValues
                calibrationCount++
            } else if (event.sensor.type == Sensor.TYPE_GAME_ROTATION_VECTOR) {
                SensorManager.getQuaternionFromVector(gameQuaternionValues, event.values)
                calibrationGameQuat?.let { quat ->
                    for (i in 0..3) {
                        gameQuaternionValues[i] += quat[i]
                    }
                }
                calibrationGameQuat = gameQuaternionValues
                calibrationGameCount++
            }
        } else if (gameRotationSensor == null || event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) {
            // Only calculate rotation if there is no game rotation sensor or if the event is a game
            // rotation
            val calibrationQ = calibrationQuaternion
            val calibrationQg = calibrationGameQuat
            if (calibrationQ != null && calibrationQg != null) {
                for (i in 0..3) {
                    calibrationQ[i] /= calibrationCount.toFloat()
                    calibrationQg[i] /= calibrationGameCount.toFloat()
                }
                calibration = (Quaternion(calibrationQg).apply { conjugate() } *
                        Quaternion(calibrationQ)).apply {
                    x = 0f
                    y = 0f
                    normalize()
                }
            }
            calibrationQuaternion = null
            calibrationGameQuat = null
            // Run values through low-pass filter
            val values = lowPass(event.values, rotationValues)
            rotationValues = values
            rotationVectorQuaternion.setFromRotationVector(values)
            // Calibrate if available
            calibration?.let { rotationVectorQuaternion.preMult(it) }
            // Generate rotation matrix
            value = rotationVectorQuaternion.getRotationMatrix(value)
        }
    }
}
For the quaternion class I'm using:
class Quaternion(val values: FloatArray = floatArrayOf(1f, 0f, 0f, 0f)) {

    companion object {
        fun fromRotationVector(rv: FloatArray): Quaternion {
            val Q = FloatArray(4)
            SensorManager.getQuaternionFromVector(Q, rv)
            return Quaternion(Q)
        }
    }

    private val buffer = FloatArray(4)

    var w: Float
        get() = values[0]
        set(value) { values[0] = value }
    var x: Float
        get() = values[1]
        set(value) { values[1] = value }
    var y: Float
        get() = values[2]
        set(value) { values[2] = value }
    var z: Float
        get() = values[3]
        set(value) { values[3] = value }

    fun setFromRotationVector(rv: FloatArray) {
        SensorManager.getQuaternionFromVector(values, rv)
    }

    fun conjugate() {
        x = -x
        y = -y
        z = -z
    }

    fun getRotationMatrix(R: FloatArray? = null): FloatArray {
        val matrix = R ?: FloatArray(16)
        for (i in 0..3) {
            buffer[i] = values[(i + 1) % 4]
        }
        SensorManager.getRotationMatrixFromVector(matrix, buffer)
        return matrix
    }

    fun magnitude(): Float {
        var mag = 0f
        for (i in 0..3) {
            mag += values[i] * values[i]
        }
        return Math.sqrt(mag.toDouble()).toFloat()
    }

    fun normalize() {
        val mag = magnitude()
        x /= mag
        y /= mag
        z /= mag
        w /= mag
    }

    fun preMult(left: Quaternion) {
        buffer[0] = left.w*this.w - left.x*this.x - left.y*this.y - left.z*this.z
        buffer[1] = left.w*this.x + left.x*this.w + left.y*this.z - left.z*this.y
        buffer[2] = left.w*this.y + left.y*this.w + left.z*this.x - left.x*this.z
        buffer[3] = left.w*this.z + left.z*this.w + left.x*this.y - left.y*this.x
        for (i in 0..3) {
            values[i] = buffer[i]
        }
    }

    operator fun times(q: Quaternion): Quaternion {
        val qu = Quaternion()
        qu.w = w*q.w - x*q.x - y*q.y - z*q.z
        qu.x = w*q.x + x*q.w + y*q.z - z*q.y
        qu.y = w*q.y + y*q.w + z*q.x - x*q.z
        qu.z = w*q.z + z*q.w + x*q.y - y*q.x
        return qu
    }

    operator fun times(v: FloatArray): FloatArray {
        val conj = Quaternion(values.clone()).apply { conjugate() }
        return multiplyQV(multiplyQV(values, v), conj.values)
    }

    override fun toString(): String {
        return "(${w.toString(5)}(w), ${x.toString(5)}, ${y.toString(5)}, ${z.toString(5)}) |${magnitude().toString(5)}|"
    }

    private fun multiplyQV(q: FloatArray, r: FloatArray): FloatArray {
        val result = FloatArray(4)
        result[0] = r[0]*q[0] - r[1]*q[1] - r[2]*q[2] - r[3]*q[3]
        result[1] = r[0]*q[1] + r[1]*q[0] - r[2]*q[3] + r[3]*q[2]
        result[2] = r[0]*q[2] + r[1]*q[3] + r[2]*q[0] - r[3]*q[1]
        result[3] = r[0]*q[3] - r[1]*q[2] + r[2]*q[1] + r[3]*q[0]
        return result
    }
}
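A hypothetical usage sketch of the LiveData class above (the activity name, calibration duration, and observer body are illustrative, not part of the original answer):

// Illustrative usage only; assumes RotationMatrixLiveData and Quaternion from above.
class ArActivity : AppCompatActivity() {

    private lateinit var rotationMatrix: RotationMatrixLiveData

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        rotationMatrix = RotationMatrixLiveData(this)

        // Average both sensors for a few seconds to estimate the calibration,
        // then fall back to the game rotation vector alone.
        rotationMatrix.beginCalibration()
        Handler(Looper.getMainLooper()).postDelayed({ rotationMatrix.stopCalibration() }, 5_000)

        // Each emitted value is a 4x4 rotation matrix (as produced by
        // SensorManager.getRotationMatrixFromVector) ready for OpenGL.
        rotationMatrix.observe(this, Observer { matrix ->
            // e.g. hand the matrix to the renderer
        })
    }
}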
