Step Counting Algorithm - android

I am currently working on a pedometer application for an Intel device with the LSM9DS0 sensor. My requirement is to develop a pedometer using the application code Intel published in the Native Application Guide.
This code is available in section 6.1.4 of the PDF shared below:
http://download.intel.com/support/edison/sb/edison_nag_331192003.pdf
As a matter of fact, I found some similar code, but the Android version:
public StepDetector() {
int h = 480; // TODO: remove this constant
mYOffset = h * 0.5f;
mScale[0] = - (h * 0.5f * (1.0f / (SensorManager.STANDARD_GRAVITY * 2)));
mScale[1] = - (h * 0.5f * (1.0f / (SensorManager.MAGNETIC_FIELD_EARTH_MAX)));
}
public void setSensitivity(float sensitivity) {
mLimit = sensitivity; // 1.97 2.96 4.44 6.66 10.00 15.00 22.50 33.75 50.62
}
public void addStepListener(StepListener sl) {
mStepListeners.add(sl);
}
//public void onSensorChanged(int sensor, float[] values) {
public void onSensorChanged(SensorEvent event) {
Sensor sensor = event.sensor;
synchronized (this) {
if (sensor.getType() == Sensor.TYPE_ORIENTATION) {
}
else {
int j = (sensor.getType() == Sensor.TYPE_ACCELEROMETER) ? 1 : 0;
if (j == 1) {
float vSum = 0;
for (int i=0 ; i<3 ; i++) {
final float v = mYOffset + event.values[i] * mScale[j];
vSum += v;
}
int k = 0;
float v = vSum / 3;
float direction = (v > mLastValues[k] ? 1 : (v < mLastValues[k] ? -1 : 0));
if (direction == - mLastDirections[k]) {
// Direction changed
int extType = (direction > 0 ? 0 : 1); // minimum or maximum?
mLastExtremes[extType][k] = mLastValues[k];
float diff = Math.abs(mLastExtremes[extType][k] - mLastExtremes[1 - extType][k]);
if (diff > mLimit) {
boolean isAlmostAsLargeAsPrevious = diff > (mLastDiff[k]*2/3);
boolean isPreviousLargeEnough = mLastDiff[k] > (diff/3);
boolean isNotContra = (mLastMatch != 1 - extType);
if (isAlmostAsLargeAsPrevious && isPreviousLargeEnough && isNotContra) {
Log.i(TAG, "step");
for (StepListener stepListener : mStepListeners) {
stepListener.onStep();
}
mLastMatch = extType;
}
else {
mLastMatch = -1;
}
}
mLastDiff[k] = diff;
}
mLastDirections[k] = direction;
mLastValues[k] = v;
}
}
}
}
The complete Android source code is available on GitHub:
https://github.com/bagilevi/android-pedometer
My problem is that I am not able to produce a proper design document from this code. I need clarification on the following:
How are the scale values calculated?
It looks like an arbitrary constant from nowhere is used here:
int h = 480; // TODO: remove this constant
mScale[0] = - (h * 0.5f * (1.0f / (SensorManager.STANDARD_GRAVITY * 2)));
mScale[1] = - (h * 0.5f * (1.0f / (SensorManager.MAGNETIC_FIELD_EARTH_MAX)));
How is the mLimit value calculated?
if (diff > mLimit)
The Intel code says
v = vSum / 2300;
but the Android code says
v = vSum / 3;
The calculation of the vector sum is similar in both codes. Which equation is actually followed here?
Please, someone help me get these things straight.

I found out that the mechanism is just a trial-and-error algorithm.
These constant values may be subject to change based on the environment, the device, the device packaging, the sensitivity of the accelerometer, etc.
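As a rough illustration (my own sketch, not taken from the Intel guide or the GitHub project), the h = 480 constant appears to be a leftover from code that graphed the sensor values on a 480-pixel-tall screen, which would make mScale[0] a "pixels per m/s²" factor. Under that assumption, converting mLimit back into acceleration units shows roughly what the threshold means physically:
// My own illustration (not from the Intel guide): express mLimit in m/s^2, assuming
// mScale[0] was meant to map acceleration onto half of a 480-pixel-tall graph.
static float limitInMetersPerSecondSquared(float limit) {
    float h = 480f;
    float scale = h * 0.5f * (1.0f / (SensorManager.STANDARD_GRAVITY * 2)); // ~12.24 "pixels" per m/s^2
    return limit / scale; // e.g. limit = 10 -> ~0.82 m/s^2 peak-to-trough of the axis-averaged signal
}
So a larger mLimit simply demands a bigger swing in the scaled, axis-averaged acceleration before a peak/trough pair is counted as a step.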

Related

ARCore OpenGL ES Moving Object using onScroll

I'm attempting to add the ability to move objects around with single-finger scrolling (see Google AR Stickers for an example). I'm using native ARCore/OpenGL, based originally on the ARCore examples. When you slide one finger, I want to move the object in 3D space along the X and Z axes.
I can easily create the movement using translation; however, it performs the movement based on the original camera orientation. If you physically move the phone/camera a few lateral steps, the finger movements no longer match what a user would expect.
So I changed my code and mapped the swiped distanceX and distanceY onto the X and Z coordinates, depending on the angle of change from the original camera position to the current camera position.
The issue that I'm running into is determining the angle at which the camera has been moved. I've been looking at the value from the Camera View matrix:
camera.getViewMatrix(viewmtx, 0);
But the X, Y, and Z coordinates always say 0. I'm assuming this is because it's always making the camera the origin? Does anyone know of a way to calculate the angle of rotation of a camera from a 3D object using the ARCore/OpenGL ES libraries? The red angle in the illustration below (from top-down perspective) is what I'm trying to get. Sorry for the crude drawing:
Here's my code for your reference:
// Handle Gestures - Single touch for Translation and Placement
mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener()
{
@Override
public boolean onSingleTapUp(MotionEvent e)
{
onSingleTap(e);
return true;
}
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY)
{
if (e2.getPointerCount() == 1 && ...)
{
double angle = findCameraAngleFromOrigin();
double speed = 0.005d;
if (angle / 90d < 1) //Quadrant 1
{
double transX = -(distanceY * (angle / 90d)) + (distanceX * ((90d - angle) / 90d));
double transY = (distanceY * ((90d - angle) / 90d)) + (distanceX * (angle / 90d));
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * -speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * -speed)));
}
else if (angle / 90d < 2) //Quadrant 2
{
angle -= 90d;
double transX = (distanceX * (angle / 90d)) + (distanceY * ((90d - angle) / 90d));
double transY = (-distanceX * ((90d - angle) / 90d)) + (distanceY * (angle / 90d));
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * speed)));
}
else if (angle / 90d < 3) //Quadrant 3
{
angle -= 180d;
double transX = (distanceY * (angle / 90d)) + (-distanceX * ((90d - angle) / 90d));
double transY = (-distanceY * ((90d - angle) / 90d)) + (-distanceX * (angle / 90d));
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * -speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * -speed)));
}
else //Quadrant 4
{
angle -= 270d;
double transX = (-distanceX * (angle / 90d)) + (-distanceY * ((90d - angle) / 90d));
double transY = (distanceX * ((90d - angle) / 90d)) + (-distanceY * (angle / 90d));
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * speed)));
}
return true;
}
return false;
}
}
EDIT: Update to code
I wasn't able to figure out how to find the distance from the camera points, but I was able to at least find the difference in the phone's rotation, which is closer. I accomplished it with the code below. I'm still not happy with the results, so I'll post updates later when I find a more effective solution.
private double getDegree(double value1, double value2)
{
double firstAngle = value1 * 90;
double secondAngle = value2 * 90;
if (secondAngle >= 0 && firstAngle >= 0)
{
Log.d(TAG, "FIRST QUADRANT");
return firstAngle; // first quadrant
}
else if (secondAngle < 0 && firstAngle >= 0)
{
Log.d(TAG, "SECOND QUADRANT");
return 90 + (90 - firstAngle); //second quadrant
}
else if (secondAngle < 0 && firstAngle < 0)
{
Log.d(TAG, "THIRD QUADRANT");
return 180 - firstAngle; //third quadrant
}
else
{
Log.d(TAG, "FOURTH QUADRANT");
return 270 + (90 + firstAngle); //fourth quadrant
}
}
private double findCameraAngleFromOrigin()
{
double angle = getDegree(mCurrentCameraMatrix[2], mCurrentCameraMatrix[0]) - getDegree(mOriginCameraMatrix[2], mOriginCameraMatrix[0]);
if (angle < 0)
return angle + 360;
return angle;
}
@Override
public void onDrawFrame(GL10 gl)
{
...
//When creating a new object
Anchor anchor = hit.createAnchor();
mAnchors.add(anchor);
camera.getDisplayOrientedPose().toMatrix( mOriginCameraMatrix, 0);
//During each draw frame
camera.getDisplayOrientedPose().toMatrix( mCurrentCameraMatrix, 0);
int ac = 0;
for (Anchor anchor : mAnchors)
{
if (anchor.getTrackingState() != TrackingState.TRACKING)
{
continue;
}
// Get the current pose of an Anchor in world space. The Anchor pose is updated
// during calls to session.update() as ARCore refines its estimate of the world.
anchor.getPose().toMatrix(mAnchorMatrix, 0);
// Update and draw the model
if (mModelSet)
{
if (mScaleFactors.size() <= ac)
{
mScaleFactors.add(1.0f);
}
if (mRotationThetas.size() <= ac)
{
mRotationThetas.add(0.0f);
}
if (mTranslationX.size() <= ac)
{
mTranslationX.add(viewmtx[3]);
}
if (mTranslationZ.size() <= ac)
{
mTranslationZ.add(viewmtx[11]);
}
translateMatrix(mTranslationX.get(ac), 0, mTranslationZ.get(ac));
rotateYAxisMatrix(mRotationThetas.get(ac));
ObjectRenderer virtualObject = mVirtualObjects.get(mAnchorReferences.get(ac));
virtualObject.updateModelMatrix(mAnchorMatrix, mScaleFactors.get(ac));
virtualObject.draw(viewmtx, projmtx, lightIntensity);
}
ac++;
}
}
This is kind of hard to track, so I'll probably just post the entire class once I feel more comfortable with the implementation and code cleanup.
public class HelloArActivity extends AppCompatActivity implements GLSurfaceView.Renderer {
private static final String TAG = HelloArActivity.class.getSimpleName();
// Rendering. The Renderers are created here, and initialized when the GL surface is created.
private GLSurfaceView surfaceView;
private boolean installRequested;
private Session session;
private GestureDetector gestureDetector;
private Snackbar messageSnackbar;
private DisplayRotationHelper displayRotationHelper;
private final BackgroundRenderer backgroundRenderer = new BackgroundRenderer();
private final ObjectRenderer virtualObject = new ObjectRenderer();
private final ObjectRenderer virtualObjectShadow = new ObjectRenderer();
private final PlaneRenderer planeRenderer = new PlaneRenderer();
private final PointCloudRenderer pointCloud = new PointCloudRenderer();
private int mCurrent = -1;
private final List<ObjectRenderer> mVirtualObjects = new ArrayList<ObjectRenderer>();
// Temporary matrix allocated here to reduce number of allocations for each frame.
private final float[] anchorMatrix = new float[16];
//Rotation, Moving, & Scaling
private final List<Float> mRotationThetas = new ArrayList<Float>();
private GestureDetector mGestureDetector;
private ScaleGestureDetector mScaleDetector;
private RotationGestureDetector mRotationDetector;
private final List<Float> mScaleFactors = new ArrayList<Float>();
private final List<Float> mTranslationX = new ArrayList<Float>();
private final List<Float> mTranslationZ = new ArrayList<Float>();
private final float[] mOriginCameraMatrix = new float[16];
private final float[] mCurrentCameraMatrix = new float[16];
private boolean mModelSet = false;
// Tap handling and UI.
private final ArrayBlockingQueue<MotionEvent> mQueuedSingleTaps = new ArrayBlockingQueue<>(16);
private final ArrayList<Anchor> mAnchors = new ArrayList<>();
// Tap handling and UI.
private final ArrayBlockingQueue<MotionEvent> queuedSingleTaps = new ArrayBlockingQueue<>(16);
private final List<Integer> mAnchorReferences = new ArrayList<Integer>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
surfaceView = findViewById(R.id.surfaceview);
displayRotationHelper = new DisplayRotationHelper(/*context=*/ this);
//Handle Gestures - Multitouch for Scaling and Rotation
mScaleDetector = new ScaleGestureDetector(this, new ScaleGestureDetector.SimpleOnScaleGestureListener() {
@Override
public boolean onScale(ScaleGestureDetector detector) {
if (mScaleFactors.size() > 0) {
mScaleFactors.set(mScaleFactors.size() - 1, Math.max(0.1f, Math.min(detector.getScaleFactor() * mScaleFactors.get(mScaleFactors.size() - 1), 5.0f)));
return true;
}
return false;
}
});
mRotationDetector = new RotationGestureDetector(this, new RotationGestureDetector.OnRotationGestureListener() {
@Override
public void OnRotation(RotationGestureDetector rotationDetector) {
if (mRotationThetas.size() > 0) {
mRotationThetas.set(mRotationThetas.size() - 1, (mRotationThetas.get(mRotationThetas.size() - 1) + (rotationDetector.getAngle() * -0.001f)));
}
}
});
// Set up tap listener.
gestureDetector =
new GestureDetector(
this,
new GestureDetector.SimpleOnGestureListener() {
@Override
public boolean onSingleTapUp(MotionEvent e) {
onSingleTap(e);
return true;
}
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
if (e2.getPointerCount() == 1 && mTranslationX.size() > 0 && mTranslationZ.size() > 0) {
double angle = findCameraAngleFromOrigin();
double speed = 0.001d;
if (angle / 90d < 1) //Quadrant 1
{
double transX = -(distanceY * (angle / 90d)) + (distanceX * ((90d - angle) / 90d));
double transY = (distanceY * ((90d - angle) / 90d)) + (distanceX * (angle / 90d));
// showSnackbarMessage("ANGLE: " + angle + ", distanceX: " + distanceX + ", distanceY: " + distanceY, false);
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * -speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * -speed)));
} else if (angle / 90d < 2) //Quadrant 2
{
angle -= 90d;
double transX = (distanceX * (angle / 90d)) + (distanceY * ((90d - angle) / 90d));
double transY = (-distanceX * ((90d - angle) / 90d)) + (distanceY * (angle / 90d));
// showSnackbarMessage("ANGLE: " + angle + ", distanceX: " + distanceX + ", distanceY: " + distanceY, false);
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * speed)));
} else if (angle / 90d < 3) //Quadrant 3
{
angle -= 180d;
double transX = (distanceY * (angle / 90d)) + (-distanceX * ((90d - angle) / 90d));
double transY = (-distanceY * ((90d - angle) / 90d)) + (-distanceX * (angle / 90d));
// showSnackbarMessage("ANGLE: " + angle + ", distanceX: " + distanceX + ", distanceY: " + distanceY, false);
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * -speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * -speed)));
} else //Quadrant 4
{
angle -= 270d;
double transX = (-distanceX * (angle / 90d)) + (-distanceY * ((90d - angle) / 90d));
double transY = (distanceX * ((90d - angle) / 90d)) + (-distanceY * (angle / 90d));
// showSnackbarMessage("ANGLE: " + angle + ", distanceX: " + distanceX + ", distanceY: " + distanceY, false);
mTranslationX.set(mTranslationX.size() - 1, (float) (mTranslationX.get(mTranslationX.size() - 1) + (transX * speed)));
mTranslationZ.set(mTranslationZ.size() - 1, (float) (mTranslationZ.get(mTranslationZ.size() - 1) + (transY * speed)));
}
return true;
}
return false;
}
@Override
public boolean onDown(MotionEvent e) {
return true;
}
});
surfaceView.setOnTouchListener(
new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
boolean retVal = mScaleDetector.onTouchEvent(event);
if (retVal)
mRotationDetector.onTouchEvent(event);
retVal = gestureDetector.onTouchEvent(event) || retVal;
return retVal || gestureDetector.onTouchEvent(event);
//return gestureDetector.onTouchEvent(event);
}
});
// Set up renderer.
surfaceView.setPreserveEGLContextOnPause(true);
surfaceView.setEGLContextClientVersion(2);
surfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0); // Alpha used for plane blending.
surfaceView.setRenderer(this);
surfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
installRequested = false;
}
private double findCameraAngleFromOrigin() {
double angle = getDegree(mCurrentCameraMatrix[2], mCurrentCameraMatrix[0]) -
getDegree(mOriginCameraMatrix[2], mOriginCameraMatrix[0]);
if (angle < 0)
return angle + 360;
return angle;
}
private double getDegree(double value1, double value2) {
double firstAngle = value1 * 90;
double secondAngle = value2 * 90;
if (secondAngle >= 0 && firstAngle >= 0) {
return firstAngle; // first quadrant
} else if (secondAngle < 0 && firstAngle >= 0) {
return 90 + (90 - firstAngle); //second quadrant
} else if (secondAngle < 0 && firstAngle < 0) {
return 180 - firstAngle; //third quadrant
} else {
return 270 + (90 + firstAngle); //fourth quadrant
}
}
@Override
protected void onResume() {
super.onResume();
if (session == null) {
Exception exception = null;
String message = null;
try {
switch (ArCoreApk.getInstance().requestInstall(this, !installRequested)) {
case INSTALL_REQUESTED:
installRequested = true;
return;
case INSTALLED:
break;
}
// ARCore requires camera permissions to operate. If we did not yet obtain runtime
// permission on Android M and above, now is a good time to ask the user for it.
if (!CameraPermissionHelper.hasCameraPermission(this)) {
CameraPermissionHelper.requestCameraPermission(this);
return;
}
session = new Session(/* context= */ this);
} catch (UnavailableArcoreNotInstalledException
| UnavailableUserDeclinedInstallationException e) {
message = "Please install ARCore";
exception = e;
} catch (UnavailableApkTooOldException e) {
message = "Please update ARCore";
exception = e;
} catch (UnavailableSdkTooOldException e) {
message = "Please update this app";
exception = e;
} catch (Exception e) {
message = "This device does not support AR";
exception = e;
}
if (message != null) {
showSnackbarMessage(message, true);
Log.e(TAG, "Exception creating session", exception);
return;
}
// Create default config and check if supported.
Config config = new Config(session);
if (!session.isSupported(config)) {
showSnackbarMessage("This device does not support AR", true);
}
session.configure(config);
}
showLoadingMessage();
// Note that order matters - see the note in onPause(), the reverse applies here.
session.resume();
surfaceView.onResume();
displayRotationHelper.onResume();
}
@Override
public void onPause() {
super.onPause();
if (session != null) {
// Note that the order matters - GLSurfaceView is paused first so that it does not try
// to query the session. If Session is paused before GLSurfaceView, GLSurfaceView may
// still call session.update() and get a SessionPausedException.
displayRotationHelper.onPause();
surfaceView.onPause();
session.pause();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] results) {
if (!CameraPermissionHelper.hasCameraPermission(this)) {
Toast.makeText(this, "Camera permission is needed to run this application", Toast.LENGTH_LONG)
.show();
if (!CameraPermissionHelper.shouldShowRequestPermissionRationale(this)) {
// Permission denied with checking "Do not ask again".
CameraPermissionHelper.launchPermissionSettings(this);
}
finish();
}
}
@Override
public void onWindowFocusChanged(boolean hasFocus) {
super.onWindowFocusChanged(hasFocus);
if (hasFocus) {
// Standard Android full-screen functionality.
getWindow()
.getDecorView()
.setSystemUiVisibility(
View.SYSTEM_UI_FLAG_LAYOUT_STABLE
| View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
| View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
| View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
| View.SYSTEM_UI_FLAG_FULLSCREEN
| View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
}
}
private void onSingleTap(MotionEvent e) {
// Queue tap if there is space. Tap is lost if queue is full.
queuedSingleTaps.offer(e);
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
// Create the texture and pass it to ARCore session to be filled during update().
backgroundRenderer.createOnGlThread(/*context=*/ this);
// Prepare the other rendering objects.
try {
virtualObject.createOnGlThread(/*context=*/ this, "andy.obj", "andy.png");
virtualObject.setMaterialProperties(0.0f, 3.5f, 1.0f, 6.0f);
virtualObjectShadow.createOnGlThread(/*context=*/ this, "andy_shadow.obj", "andy_shadow.png");
virtualObjectShadow.setBlendMode(BlendMode.Shadow);
virtualObjectShadow.setMaterialProperties(1.0f, 0.0f, 0.0f, 1.0f);
} catch (IOException e) {
Log.e(TAG, "Failed to read obj file");
}
try {
planeRenderer.createOnGlThread(/*context=*/ this, "trigrid.png");
} catch (IOException e) {
Log.e(TAG, "Failed to read plane texture");
}
pointCloud.createOnGlThread(/*context=*/ this);
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
displayRotationHelper.onSurfaceChanged(width, height);
GLES20.glViewport(0, 0, width, height);
mVirtualObjects.add(virtualObject);
}
@Override
public void onDrawFrame(GL10 gl) {
// Clear screen to notify driver it should not load any pixels from previous frame.
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
if (session == null) {
return;
}
// Notify ARCore session that the view size changed so that the perspective matrix and
// the video background can be properly adjusted.
displayRotationHelper.updateSessionIfNeeded(session);
try {
session.setCameraTextureName(backgroundRenderer.getTextureId());
// Obtain the current frame from ARSession. When the configuration is set to
// UpdateMode.BLOCKING (it is by default), this will throttle the rendering to the
// camera framerate.
Frame frame = session.update();
Camera camera = frame.getCamera();
// Handle taps. Handling only one tap per frame, as taps are usually low frequency
// compared to frame rate.
MotionEvent tap = queuedSingleTaps.poll();
if (tap != null && camera.getTrackingState() == TrackingState.TRACKING) {
for (HitResult hit : frame.hitTest(tap)) {
// Check if any plane was hit, and if it was hit inside the plane polygon
Trackable trackable = hit.getTrackable();
// Creates an anchor if a plane or an oriented point was hit.
if ((trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose()))
|| (trackable instanceof Point
&& ((Point) trackable).getOrientationMode()
== OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
// Hits are sorted by depth. Consider only closest hit on a plane or oriented point.
// Cap the number of objects created. This avoids overloading both the
// rendering system and ARCore.
if (mAnchors.size() >= 20)
{
//Alert the user that the maximum has been reached, must call from
//Handler as this is a UI action being done on a worker thread.
new Handler(Looper.getMainLooper())
{
@Override
public void handleMessage(Message message)
{
Toast.makeText(HelloArActivity.this,
"You've reached the maximum!", Toast.LENGTH_LONG).show();
}
};
// Alternatively, you can start detaching, however, a revision to the
// mAnchorReferences and mScalingFactors should be made!
// mAnchors.get(0).detach();
// mAnchors.remove(0);
}
else
{
// Adding an Anchor tells ARCore that it should track this position in
// space. This anchor is created on the Plane to place the 3d model
// in the correct position relative both to the world and to the plane.
mScaleFactors.add(1.0f);
Anchor anchor = hit.createAnchor();
mAnchors.add(anchor);
mAnchorReferences.add(mCurrent);
camera.getDisplayOrientedPose().toMatrix(mOriginCameraMatrix, 0);
}
break;
}
}
}
// Draw background.
backgroundRenderer.draw(frame);
// If not tracking, don't draw 3d objects.
if (camera.getTrackingState() == TrackingState.PAUSED) {
return;
}
// Get projection matrix.
float[] projmtx = new float[16];
camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);
// Get camera matrix and draw.
float[] viewmtx = new float[16];
camera.getViewMatrix(viewmtx, 0);
// Compute lighting from average intensity of the image.
final float lightIntensity = frame.getLightEstimate().getPixelIntensity();
// Visualize tracked points.
PointCloud pointCloud = frame.acquirePointCloud();
this.pointCloud.update(pointCloud);
this.pointCloud.draw(viewmtx, projmtx);
// Application is responsible for releasing the point cloud resources after
// using it.
pointCloud.release();
// Check if we detected at least one plane. If so, hide the loading message.
if (messageSnackbar != null) {
for (Plane plane : session.getAllTrackables(Plane.class)) {
if (plane.getType() == com.google.ar.core.Plane.Type.HORIZONTAL_UPWARD_FACING
&& plane.getTrackingState() == TrackingState.TRACKING) {
hideLoadingMessage();
break;
}
}
}
camera.getDisplayOrientedPose().toMatrix(mCurrentCameraMatrix, 0);
// Visualize planes.
planeRenderer.drawPlanes(
session.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
// Visualize anchors created by touch.
int ac = 0;
for (Anchor anchor : mAnchors) {
if (anchor.getTrackingState() != TrackingState.TRACKING) {
continue;
}
// Get the current pose of an Anchor in world space. The Anchor pose is updated
// during calls to session.update() as ARCore refines its estimate of the world.
anchor.getPose().toMatrix(anchorMatrix, 0);
if (mScaleFactors.size() <= ac) {
mScaleFactors.add(1.0f);
}
if (mRotationThetas.size() <= ac) {
mRotationThetas.add(0.0f);
}
if (mTranslationX.size() <= ac) {
mTranslationX.add(viewmtx[3]);
}
if (mTranslationZ.size() <= ac) {
mTranslationZ.add(viewmtx[11]);
}
translateMatrix(mTranslationX.get(ac), 0, mTranslationZ.get(ac));
rotateYAxisMatrix(mRotationThetas.get(ac));
// Update and draw the model and its shadow.
ObjectRenderer vitualObject = mVirtualObjects.get(mAnchorReferences.get(ac));
vitualObject.updateModelMatrix(anchorMatrix, mScaleFactors.get(ac));
vitualObject.draw(viewmtx, projmtx, lightIntensity);
}
} catch (Throwable t) {
// Avoid crashing the application due to unhandled exceptions.
Log.e(TAG, "Exception on the OpenGL thread", t);
}
}
private void rotateYAxisMatrix(float rotationTheta) {
if (rotationTheta != 0.0f) {
anchorMatrix[0] = (float) Math.cos(rotationTheta);
anchorMatrix[2] = (float) Math.sin(rotationTheta);
anchorMatrix[5] = 1;
anchorMatrix[8] = -(float) Math.sin(rotationTheta);
anchorMatrix[10] = (float) Math.cos(rotationTheta);
anchorMatrix[15] = 1;
}
}
private void translateMatrix(float xDistance, float yDistance, float zDistance) {
Matrix.translateM(anchorMatrix, 0, xDistance, yDistance, zDistance);
}
}
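As an aside, here is a minimal sketch under my own assumptions (not code from the post): since Pose.toMatrix() fills a column-major 4x4 matrix, the heading can also be read directly from the camera's forward vector with atan2, which avoids the manual quadrant mapping in getDegree():
private static double yawDegrees(float[] m) {
    // Column-major 4x4 from Pose.toMatrix(): elements 8..10 hold the camera's local +Z axis in world space.
    // The camera looks along its local -Z, so negate that column to get the forward vector.
    double forwardX = -m[8];
    double forwardZ = -m[10];
    return Math.toDegrees(Math.atan2(forwardX, forwardZ)); // heading around the world Y axis
}
private double cameraYawFromOrigin() {
    double angle = yawDegrees(mCurrentCameraMatrix) - yawDegrees(mOriginCameraMatrix);
    // Normalize to [0, 360) so the existing quadrant checks in onScroll keep working.
    return (angle % 360 + 360) % 360;
}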

Android SurfaceView onTouchEvent not getting called

I'm developing a game using a SurfaceView which listens to touch events. The onTouchEvent method in the SurfaceView works fine on many devices, but on some devices it sometimes doesn't get called (the Moto X Style is one of them), and my app also stops responding.
I guess this might be due to overloading of the main thread, which causes onTouchEvent to starve.
Could some Android experts give me some tips to reduce the load on the main thread if it's getting overloaded, or point out some other reason that may cause this?
The code is quite complex, but I'm still posting some of it in case you want to go through it.
GameLoopThread
public class GameLoopThread extends Thread{
private GameView view;
// desired fps
private final static int MAX_FPS = 120;
// maximum number of frames to be skipped
private final static int MAX_FRAME_SKIPS = 5;
// the frame period
private final static int FRAME_PERIOD = 1000 / MAX_FPS;
private boolean running = false;
public GameLoopThread(GameView view){
this.view = view;
}
public void setRunning(boolean running){
this.running = running;
}
public boolean isRunning() {
return running;
}
@Override
public void run() {
Canvas canvas;
long beginTime; // the time when the cycle begun
long timeDiff; // the time it took for the cycle to execute
int sleepTime; // ms to sleep (<0 if we're behind)
int framesSkipped; // number of frames being skipped
while (running) {
canvas = null;
// try locking the canvas for exclusive pixel editing
// in the surface
try {
canvas = view.getHolder().lockCanvas();
synchronized (view.getHolder()) {
beginTime = System.nanoTime();
framesSkipped = 0; // resetting the frames skipped
// update game state
// render state to the screen
// draws the canvas on the panel
this.view.draw(canvas);
// calculate how long did the cycle take
timeDiff = System.nanoTime() - beginTime;
// calculate sleep time
sleepTime = (int)(FRAME_PERIOD - timeDiff/1000000);
if (sleepTime > 0) {
// if sleepTime > 0 we're OK
try {
// send the thread to sleep for a short period
// very useful for battery saving
Thread.sleep(sleepTime);
} catch (InterruptedException e) {}
}
while (sleepTime < 0 && framesSkipped < MAX_FRAME_SKIPS) {
// update without rendering
// add frame period to check if in next frame
sleepTime += FRAME_PERIOD;
framesSkipped++;
}
}
}
finally {
// in case of an exception the surface is not left in
// an inconsistent state
view.getHolder().unlockCanvasAndPost(canvas);
} // end finally
}
}
}
GameView
public class GameView extends SurfaceView {
ArrayList<Bitmap> circles = new ArrayList<>();
int color;
public static boolean isGameOver;
public GameLoopThread gameLoopThread;
Circle circle; // Code for Circle class is provided below
public static int score = 0;
public static int stars = 0;
final Handler handler = new Handler();
int remainingTime;
boolean oneTimeFlag;
Bitmap replay;
Bitmap home;
Bitmap star;
int highScore;
boolean isLeaving;
public GameView(Context context, ArrayList<Bitmap> circles, int color) {
super(context);
this.circles = circles;
this.color = color;
oneTimeFlag = true;
gameLoopThread = new GameLoopThread(GameView.this);
getHolder().addCallback(new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder holder) {
if (!gameLoopThread.isRunning()) {
gameLoopThread.setRunning(true);
gameLoopThread.start();
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
gameLoopThread.setRunning(false);
gameLoopThread = new GameLoopThread(GameView.this);
}
});
initializeCircles();
if(!gameLoopThread.isRunning()) {
gameLoopThread.setRunning(true);
gameLoopThread.start();
}
}
public void initializeCircles() {
ArrayList<String> numbers = new ArrayList<>();
for(int i=0;i<10;i++)
numbers.add(i+"");
Random random = new Random();
int position = random.nextInt(4);
numbers.remove(color + "");
int p1 = position;
int r1 = Integer.valueOf(numbers.get(random.nextInt(9)));
numbers.remove(r1+"");
int r2 = Integer.valueOf(numbers.get(random.nextInt(8)));
numbers.remove(r2 + "");
int r3 = Integer.valueOf(numbers.get(random.nextInt(7)));
ArrayList<Bitmap> bitmaps = new ArrayList<>();
if(position == 0) {
bitmaps.add(circles.get(color));
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(r3));
}
else if(position == 1) {
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(color));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(r3));
}
else if(position == 2) {
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(color));
bitmaps.add(circles.get(r3));
}
else {
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(r3));
bitmaps.add(circles.get(color));
}
numbers = new ArrayList<>();
for(int i=0;i<10;i++)
numbers.add(i+"");
position = random.nextInt(4);
numbers.remove(color + "");
r1 = Integer.valueOf(numbers.get(random.nextInt(9)));
numbers.remove(r1 + "");
r2 = Integer.valueOf(numbers.get(random.nextInt(8)));
numbers.remove(r2 + "");
r3 = Integer.valueOf(numbers.get(random.nextInt(7)));
if(position == 0) {
bitmaps.add(circles.get(color));
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(r3));
}
else if(position == 1) {
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(color));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(r3));
}
else if(position == 2) {
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(color));
bitmaps.add(circles.get(r3));
}
else {
bitmaps.add(circles.get(r1));
bitmaps.add(circles.get(r2));
bitmaps.add(circles.get(r3));
bitmaps.add(circles.get(color));
}
circle = new Circle(this, bitmaps, circles, p1, position, color, getContext());
}
@Override
public void draw(Canvas canvas) {
if(canvas != null) {
super.draw(canvas);
canvas.drawColor(Color.WHITE);
if(!isGameOver && timer != null)
stopTimerTask();
try {
circle.draw(canvas);
} catch (GameOverException e) {
isGameOver = true;
if(isLeaving)
gameOver(canvas);
else if(GameActivity.counter > 0) {
gameOver(canvas);
GameActivity.counter++;
} else {
if (oneTimeFlag) {
int size1 = 200 * GameActivity.SCREEN_HEIGHT / 1280;
int size2 = 125 * GameActivity.SCREEN_HEIGHT / 1280;
float ratio = (float) GameActivity.SCREEN_HEIGHT / 1280;
replay = GameActivity.decodeSampledBitmapFromResource(getResources(), R.drawable.replay, size1, size1);
home = GameActivity.decodeSampledBitmapFromResource(getResources(), R.drawable.home, size2, size2);
continueButton = GameActivity.decodeSampledBitmapFromResource(getContext().getResources(), R.drawable.button, (int) (540 * ratio), (int) (100 * ratio));
star = GameActivity.decodeSampledBitmapFromResource(getContext().getResources(), R.drawable.star1, (int) (220 * ratio), (int) (220 * ratio));
int w = (int) ((float) GameActivity.SCREEN_WIDTH * 0.9);
oneTimeFlag = false;
}
if (askPurchaseScreen == 2) {
gameOver(canvas);
} else {
canvas.drawColor(Circle.endColor);
}
}
}
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
float x = event.getX();
float y = event.getY();
circle.onTouch(x, y);
return true;
}
}
Circle
public class Circle {
int x;
int y1;
int y2;
public static float speedY1 = 12.5f*(float)GameActivity.SCREEN_HEIGHT/1280;
public static float speedY2 = 12.5f*(float)GameActivity.SCREEN_HEIGHT/1280;
ArrayList<Bitmap> bitmaps;
GameView gameView;
int p1; // Position of required circle in slot 1
int p2; // Position of required circle in slot 2
int color;
int tempColor;
int width;
Context context;
// Centers of required circle
float centerX1;
float centerX2;
float centerY1;
float centerY2;
ArrayList<Bitmap> circles = new ArrayList<>();
boolean touchedFirst;
boolean touchedSecond;
int count1 = 1; // Slot 1 circle radius animation
int count2 = 1; // Slot 2 circle radius animation
float tempSpeedY1;
float tempSpeedY2;
boolean stopY1;
boolean stopY2;
int barCounter = 1;
int loopCount = 0;
int endGameCount = 0; // Count to move circle upwards
double limit;
float endRadiusSpeed;
int endSlot; // Where you died
int endRadiusCount = 0; // Count to increase circle radius
int barEndCounter = 1;
final Handler handler = new Handler();
boolean exception;
public static int endColor;
public Circle(GameView gameView, ArrayList<Bitmap> bitmaps, ArrayList<Bitmap> circles, int p1, int p2, int color, Context context) {
this.gameView = gameView;
this.bitmaps = bitmaps;
this.circles = circles;
this.p1 = p1;
this.p2 = p2;
this.color = color;
this.context = context;
width = GameActivity.SCREEN_WIDTH / 4 - 10;
x = 10;
y1 = 0;
y2 = -(GameActivity.SCREEN_HEIGHT + width) / 2;
centerX1 = x + p1 * (10 + width) + width / 2;
centerY1 = y1 + width / 2;
centerX2 = x + p2 * (10 + width) + width / 2;
centerY2 = y2 + width / 2;
}
public void update() throws GameOverException {
y1+= speedY1;
y2+= speedY2;
centerY1+= speedY1;
centerY2+= speedY2;
float ratio = (float)GameActivity.SCREEN_HEIGHT/1280;
limit = width/(20*ratio);
if(y1 >= gameView.getHeight()) {
loopCount++;
if(touchedFirst)
touchedFirst = false;
else {
speedY1 = speedY2 = -(12.5f * ratio);
endColor = bitmaps.get(p1).getPixel(width/2, width/2);
endGameCount += 1;
endSlot = 1;
}
if(endGameCount == 0) {
if (stopY1) {
tempSpeedY1 = speedY1;
speedY1 = 0;
ArrayList<Integer> numbers = new ArrayList<>();
for (int i = 0; i < 10; i++) {
if (i != color)
numbers.add(i);
}
tempColor = numbers.get(new Random().nextInt(9));
}
y1 = -(gameView.getWidth() / 4 - 10);
count1 = 1;
setBitmaps(1);
}
}
else if(y2 >= gameView.getHeight()) {
loopCount++;
if(touchedSecond)
touchedSecond = false;
else {
speedY1 = speedY2 = -(12.5f * ratio);
endColor = bitmaps.get(p2 + 4
).getPixel(width/2, width/2);
endGameCount += 1;
endSlot = 2;
}
if(endGameCount == 0) {
if (stopY2) {
tempSpeedY2 = speedY2;
speedY2 = 0;
}
y2 = -(gameView.getWidth() / 4 - 10);
count2 = 1;
setBitmaps(2);
}
}
}
public void setBitmaps(int slot) {
ArrayList<String> numbers = new ArrayList<>();
for(int i=0;i<10;i++)
numbers.add(i+"");
Random random = new Random();
int position = random.nextInt(4);
numbers.remove(color + "");
int r1 = Integer.valueOf(numbers.get(random.nextInt(9)));
numbers.remove(r1+"");
int r2 = Integer.valueOf(numbers.get(random.nextInt(8)));
numbers.remove(r2 + "");
int r3 = Integer.valueOf(numbers.get(random.nextInt(7)));
if(position == 0) {
bitmaps.set((slot - 1)*4, circles.get(color));
bitmaps.set((slot - 1)*4 + 1, circles.get(r1));
bitmaps.set((slot - 1)*4 + 2, circles.get(r2));
bitmaps.set((slot - 1)*4 + 3, circles.get(r3));
}
else if(position == 1) {
bitmaps.set((slot - 1)*4, circles.get(r1));
bitmaps.set((slot - 1)*4 + 1, circles.get(color));
bitmaps.set((slot - 1)*4 + 2, circles.get(r2));
bitmaps.set((slot - 1)*4 + 3, circles.get(r3));
}
else if(position == 2) {
bitmaps.set((slot - 1)*4, circles.get(r1));
bitmaps.set((slot - 1)*4 + 1, circles.get(r2));
bitmaps.set((slot - 1)*4 + 2, circles.get(color));
bitmaps.set((slot - 1)*4 + 3, circles.get(r3));
} else {
bitmaps.set((slot - 1)*4,circles.get(r1));
bitmaps.set((slot - 1)*4 + 1,circles.get(r2));
bitmaps.set((slot - 1)*4 + 2,circles.get(r3));
bitmaps.set((slot - 1)*4 + 3,circles.get(color));
}
if(slot == 1) {
p1 = position;
centerX1 = x+position*(10 + width) + width/2;
centerY1 = y1 + width/2;
}
else if(slot == 2) {
p2 = position;
centerX2 = x+position*(10 + width) + width/2;
centerY2 = y2 + width/2;
}
}
public void onTouch(float X, float Y) {
int radius = (gameView.getWidth() / 4 - 10) / 2;
if(endGameCount == 0) {
if ((X >= centerX1 - radius) && (X <= centerX1 + radius) && (Y >= centerY1 - radius) && (Y <= centerY1 + radius)) {
GameView.score++;
touchedFirst = true;
centerX1 = centerY1 = -1;
if(p1 == (timerCount - 1) && timer != null && starSlot == 1) {
GameView.stars++;
starCollected = true;
timerCount = 0;
stopTimerTask(0);
}
} else if ((X >= centerX2 - radius) && (X <= centerX2 + radius) && (Y >= centerY2 - radius) && (Y <= centerY2 + radius)) {
GameView.score++;
touchedSecond = true;
centerX2 = centerY2 = -1;
if(p2 == (timerCount - 1) && timer != null && starSlot == 2) {
GameView.stars++;
starCollected = true;
timerCount = 0;
stopTimerTask(0);
}
} else {
endSlot = 0;
if ((Y >= centerY1 - radius) && (Y <= centerY1 + radius)) {
endSlot = 1;
if (X >= 10 && X <= 10 + 2 * radius) {
p1 = 0;
centerX1 = 10 + radius;
} else if (X >= 20 + 2 * radius && X <= 20 + 4 * radius) {
p1 = 1;
centerX1 = 20 + 3 * radius;
} else if (X >= 30 + 4 * radius && X <= 30 + 6 * radius) {
p1 = 2;
centerX1 = 30 + 5 * radius;
} else if (X >= 40 + 6 * radius && X <= 40 + 8 * radius) {
p1 = 3;
centerX1 = 40 + 2 * radius;
} else
endSlot = 0;
} else if ((Y >= centerY2 - radius) && (Y <= centerY2 + radius)) {
endSlot = 2;
if (X >= 10 && X <= 10 + 2 * radius) {
p2 = 0;
centerX2 = 10 + radius;
} else if (X >= 20 + 2 * radius && X <= 20 + 4 * radius) {
p2 = 1;
centerX2 = 20 + 3 * radius;
} else if (X >= 30 + 4 * radius && X <= 30 + 6 * radius) {
p2 = 2;
centerX2 = 30 + 5 * radius;
} else if (X >= 40 + 6 * radius && X <= 40 + 8 * radius) {
p2 = 3;
centerX2 = 40 + 2 * radius;
} else
endSlot = 0;
}
if (endSlot != 0) {
speedY1 = speedY2 = 0;
limit = endGameCount = 6;
if (endSlot == 1) {
endColor= bitmaps.get(p1).getPixel(width/2, width/2);
} else {
endColor = bitmaps.get(p2 + 4).getPixel(width/2, width/2);
}
}
}
if (GameView.score % 5 == 0 && GameView.score <= 110 && barCounter == 1) {
float ratio = (float)GameActivity.SCREEN_HEIGHT/1280;
speedY1 += ratio*0.5;
speedY2 += ratio*0.5;
}
if (GameView.score > 0 && GameView.score % 15 == 14) {
if(isOddScore)
stopY1 = true;
else
stopY2 = true;
}
if (GameView.score > 0 && GameView.score % 15 == 0 && barCounter == 1) {
if(isOddScore)
stopY2 = true;
else
stopY1 = true;
}
if (GameView.score % 15 == 1)
barCounter = 1;
}
}
public void draw(Canvas canvas) throws GameOverException {
GameView.isGameOver = false;
if(exception)
throw new GameOverException(color);
update();
for(int i=0;i<bitmaps.size();i++) {
if(i<4) {
Rect rect = new Rect(x+i*(10 + width),y1,(x+width)*(i+1),y1+width);
if(endGameCount == Math.ceil(limit) && endSlot == 1) {
if(i == p1) {
endRadiusCount += 1;
if (endRadiusCount > 23) {
star.recycle();
loopCount = loopCount%starInterval;
Cryptography.saveFile((loopCount + "").getBytes(), context, "interval");
endGameCount = 0;
exception = true;
throw new GameOverException(color);
}
rect = new Rect(x + i * (10 + width) - endRadiusCount*(int)Math.ceil(endRadiusSpeed), y1 - endRadiusCount*(int)Math.ceil(endRadiusSpeed), (x + width) * (i + 1) + endRadiusCount*(int)Math.ceil(endRadiusSpeed), y1 + width + endRadiusCount*(int)Math.ceil(endRadiusSpeed));
canvas.drawBitmap(bitmaps.get(i), null, rect, null);
}
}
// TOUCH ANIMATION : DIMINISH CIRCLE
else if(i==p1 && touchedFirst) {
rect = new Rect(x + i * (10 + width) + 3*count1 + ((int)speedY1-15), y1 + 3*count1 + ((int)speedY1-15), (x + width) * (i + 1) - 3*count1 - ((int)speedY1-15), y1 + width - 3*count1 - ((int)speedY1-15));
canvas.drawBitmap(bitmaps.get(i), null, rect, null);
count1++;
}
else if(endSlot != 2) {
canvas.drawBitmap(bitmaps.get(i), null, rect, null);
if(timerCount > 0 && starSlot == 1) {
int size = width * 30 / 50;
int difference = (width - size) / 2;
Rect starRect = new Rect(x + (timerCount - 1) * (10 + width) + difference, y1 + difference, (x + width) * (timerCount) - difference, y1 + width - difference);
canvas.drawBitmap(star, null, starRect, null);
}
}
}
if(i >= 4) {
Rect rect = new Rect(x + (i % 4) * (10 + width), y2, (x + width) * ((i % 4) + 1), y2 + width);
if(endGameCount == Math.ceil(limit) && endSlot == 2) {
if((i%4)==p2) {
endRadiusCount += 1;
if (endRadiusCount > 23) {
star.recycle();
loopCount = loopCount%starInterval;
Cryptography.saveFile((loopCount + "").getBytes(), context, "interval");
endGameCount = 0;
exception = true;
throw new GameOverException(color);
}
rect = new Rect(x + (i % 4) * (10 + width) - endRadiusCount*(int)Math.ceil(endRadiusSpeed), y2 - endRadiusCount*(int)Math.ceil(endRadiusSpeed), (x + width) * ((i % 4) + 1) + endRadiusCount*(int)Math.ceil(endRadiusSpeed), y2 + width + endRadiusCount*(int)Math.ceil(endRadiusSpeed));
canvas.drawBitmap(bitmaps.get(i), null, rect, null);
}
}
else if((i%4)==p2 && touchedSecond) {
rect = new Rect(x + (i % 4) * (10 + width) + 3*count2 + ((int)speedY1-15), y2 + 3*count2 + ((int)speedY1-15), (x + width) * ((i % 4) + 1) - 3*count2 - ((int)speedY1-15), y2 + width - 3*count2 - ((int)speedY1-15));
canvas.drawBitmap(bitmaps.get(i), null, rect, null);
count2++;
}
else if(endSlot != 1) {
canvas.drawBitmap(bitmaps.get(i), null, rect, null);
if(timerCount > 0 && starSlot == 2) {
int size = width * 30 / 50;
int difference = (width - size) / 2;
Rect starRect = new Rect(x + (timerCount - 1) * (10 + width) + difference, y2 + difference, (x + width) * (timerCount) - difference, y2 + width - difference);
canvas.drawBitmap(star, null, starRect, null);
}
}
}
}
Rect src = new Rect(circles.get(color).getWidth()/2 - 10,circles.get(color).getHeight()/2 - 10,circles.get(color).getWidth()/2 + 10,circles.get(color).getHeight()/2 + 10);
Rect dst;
Paint paint = new Paint();
paint.setColor(Color.WHITE);
paint.setTextAlign(Paint.Align.RIGHT);
paint.setTypeface(Typeface.SANS_SERIF);
paint.setTextSize(72 * ratio);
canvas.drawText(GameView.score + " ", GameActivity.SCREEN_WIDTH, width / 2, paint);
dst = new Rect(5,5, (int) (120 * ratio - 5), (int) (120 * ratio - 5));
canvas.drawBitmap(star,null,dst,null);
paint.setTextAlign(Paint.Align.LEFT);
canvas.drawText("" + GameView.stars, 120 * ratio, width/2, paint);
}
}
Don't override draw(). That's used to render the View, not the Surface, and you generally shouldn't override that method even if you're creating a custom View:
When implementing a view, implement onDraw(android.graphics.Canvas) instead of overriding this method.
SurfaceViews have two parts, the Surface and the View. The View part is handled like any other View, but is generally just a transparent "hole" in the layout. The Surface is a separate layer that, by default, sits behind the View layer. Whatever you draw on the Surface "shows through" the transparent hole.
By overriding draw() you're drawing on the View whenever the View UI is invalidated. You're also calling draw() from the render thread, so you're drawing on the Surface, but with default Z-ordering you can't see that because the View contents are fully opaque. You will reduce your impact on the UI thread by not drawing everything in two different layers.
Unless you're deliberately drawing on the View, it's best to avoid subclassing SurfaceView entirely, and just use it as a member.
Because your draw code is synchronized, the two draw passes will not execute concurrently. That means your View layer draw call will block waiting for the Surface layer rendering to complete. Canvas rendering on a Surface is not hardware-accelerated, so if you're touching a lot of pixels it can get slow, and the UI thread will have to wait for it to run. That wouldn't be so bad, but you're holding on to the mutex while you're sleeping, which means the only opportunity for the main UI thread to run comes during the brief instant when the loop wraps around. The thread scheduler does not guarantee fairness, so it's entirely possible to starve the main UI thread this way.
If you rename your overridden draw() to something like myDraw(), things should get better. You should probably move your sleep call out of the synchronized block just on general principles, or work to eliminate it entirely. You might also want to consider using a custom View instead of a SurfaceView.
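A minimal sketch of that restructuring in GameLoopThread.run() (my own illustration of the suggestion, not code from the answer): rename the render entry point so the View system never triggers it, draw while holding the lock, and sleep outside the synchronized block so the UI thread can get in to deliver touch events:
@Override
public void run() {
    while (running) {
        long beginTime = System.nanoTime();
        Canvas canvas = null;
        try {
            canvas = view.getHolder().lockCanvas();
            if (canvas != null) {
                synchronized (view.getHolder()) {
                    view.myDraw(canvas); // game rendering moved out of View#draw() into a new method
                }
            }
        } finally {
            if (canvas != null) {
                view.getHolder().unlockCanvasAndPost(canvas);
            }
        }
        // Sleep outside the synchronized block so onTouchEvent() on the UI thread isn't starved.
        int sleepTime = (int) (FRAME_PERIOD - (System.nanoTime() - beginTime) / 1000000);
        if (sleepTime > 0) {
            try {
                Thread.sleep(sleepTime);
            } catch (InterruptedException ignored) {
            }
        }
    }
}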
On an unrelated note, you should probably avoid doing this every update:
Random random = new Random();
for the reasons noted here.
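A one-line sketch of that fix (field name is illustrative): construct the Random once and reuse it, instead of calling new Random() on every update:
// Reuse one Random for the whole object instead of "new Random()" in every update/setBitmaps call.
private final Random random = new Random();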
Successfully solved the issue. I can't believe the solution was this simple compared to how complex I thought the problem was. I just reduced the frame rate from 120 to 90 and, guess what, it worked like a charm!
Due to the high frame rate, the SurfaceView was busy doing all the drawing, and the onTouchEvent() method was starved.
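In GameLoopThread terms, the fix amounts to the following (values taken from the follow-up above; tune them for your own game):
// Lower the target frame rate so the render thread yields more often.
private final static int MAX_FPS = 90; // was 120
private final static int FRAME_PERIOD = 1000 / MAX_FPS;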

Android - Accurately get step count

For my Android app I need to get the step count more or less accurately. I say "more or less" because what I need to know is whether the steps taken in 2 seconds are more or fewer than in the previous ones.
I have tried the following two algorithms, and neither of them worked effectively. I have also tested requesting the step count from Google Fit, but it sometimes returns 215 and then 155, so I can't rely on that.
My solutions were as follows:
private SensorEventListener sensorEventListener = new SensorEventListener() {
//Listens for change in acceleration, displays and computes the steps
@Override
public void onSensorChanged(SensorEvent event) {
//Gather the values from accelerometer
float x = event.values[0];
float y = event.values[1];
float z = event.values[2];
//Fetch the current y
currentY = y;
//Measure if a step was taken
if(Math.abs(currentY - previousY) > threshold) {
numSteps++;
textViewSteps.setText(String.valueOf(numSteps));
//Vibrate
Vibrator vibrator = (Vibrator) getSystemService(VIBRATOR_SERVICE);
vibrator.vibrate(200);
}
//Display the values
//textViewX.setText(String.valueOf(x));
//textViewY.setText(String.valueOf(y));
//textViewZ.setText(String.valueOf(z));
//Store the previous y
previousY = y;
}
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
private SensorEventListener sensorEventListener = new SensorEventListener()
{
private final static String TAG = "StepDetector";
private float mLimit = 10;
private float mLastValues[] = new float[3*2];
private float mScale[] = new float[2];
private float mYOffset;
private float mLastDirections[] = new float[3*2];
private float mLastExtremes[][] = { new float[3*2], new float[3*2] };
private float mLastDiff[] = new float[3*2];
private int mLastMatch = -1;
//Since we can't declare a constructor in an anonymous class, I use an instance initializer
{
int h = 480;
mYOffset = h * 0.5f;
mScale[0] = - (h * 0.5f * (1.0f / (SensorManager.STANDARD_GRAVITY * 2)));
mScale[1] = - (h * 0.5f * (1.0f / (SensorManager.MAGNETIC_FIELD_EARTH_MAX)));
}
public void setSensitivity(float sensitivity) {
mLimit = sensitivity; // 1.97 2.96 4.44 6.66 10.00 15.00 22.50 33.75 50.62
}
//public void onSensorChanged(int sensor, float[] values) {
public void onSensorChanged(SensorEvent event) {
if(calculateKmhtime == 0)
calculateKmhtime = System.currentTimeMillis();
Sensor sensor = event.sensor;
synchronized (this) {
if (sensor.getType() == Sensor.TYPE_ORIENTATION) {
}
else {
int j = (sensor.getType() == Sensor.TYPE_ACCELEROMETER) ? 1 : 0;
if (j == 1) {
float vSum = 0;
for (int i=0 ; i<3 ; i++) {
final float v = mYOffset + event.values[i] * mScale[j];
vSum += v;
}
int k = 0;
float v = vSum / 3;
float direction = (v > mLastValues[k] ? 1 : (v < mLastValues[k] ? -1 : 0));
if (direction == - mLastDirections[k]) {
// Direction changed
int extType = (direction > 0 ? 0 : 1); // minimum or maximum?
mLastExtremes[extType][k] = mLastValues[k];
float diff = Math.abs(mLastExtremes[extType][k] - mLastExtremes[1 - extType][k]);
if (diff > mLimit) {
boolean isAlmostAsLargeAsPrevious = diff > (mLastDiff[k]*2/3);
boolean isPreviousLargeEnough = mLastDiff[k] > (diff/3);
boolean isNotContra = (mLastMatch != 1 - extType);
if (isAlmostAsLargeAsPrevious && isPreviousLargeEnough && isNotContra) {
Log.i(TAG, "step");
//We detected a step. Increase the step count
numSteps++;
textViewSteps.setText(String.valueOf(numSteps));
mLastMatch = extType;
long currentTime = System.currentTimeMillis();
//Vibrate
Vibrator vibrator = (Vibrator) getSystemService(VIBRATOR_SERVICE);
vibrator.vibrate(200);
if(currentTime - calculateKmhtime >= CALCULATE_KMH_DELAY) {
calculateKmh(stepsInSpecifiedSeconds, (currentTime - calculateKmhtime) / 1000);
calculateKmhtime = System.currentTimeMillis();
stepsInSpecifiedSeconds = 0;
}else{
stepsInSpecifiedSeconds++;
}
}
else {
mLastMatch = -1;
}
}
mLastDiff[k] = diff;
}
mLastDirections[k] = direction;
mLastValues[k] = v;
}
}
}
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
};
How can it be improved?
Thank you
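For the "steps in the last 2 seconds versus the previous 2 seconds" comparison described above, here is a minimal, hedged sketch (class and method names are my own, not from the post); it can wrap whichever step-detection callback you end up trusting:
// Hypothetical helper: compare the step count of the current 2-second window with the previous one.
// Call onStepDetected() from your step-detection code (e.g. where numSteps++ happens above).
public class StepWindowComparator {
    private static final long WINDOW_MS = 2000;
    private long windowStart = System.currentTimeMillis();
    private int currentWindowSteps = 0;
    private int previousWindowSteps = 0;
    public synchronized void onStepDetected() {
        long now = System.currentTimeMillis();
        if (now - windowStart >= WINDOW_MS) {
            previousWindowSteps = currentWindowSteps;
            currentWindowSteps = 0;
            windowStart = now;
        }
        currentWindowSteps++;
    }
    // Returns > 0 if the current window has more steps so far, < 0 if fewer, 0 if equal.
    public synchronized int compareWithPreviousWindow() {
        return Integer.compare(currentWindowSteps, previousWindowSteps);
    }
}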

Android animation equivalent for iOS "calculationMode"

I'm trying to animate some drawables in Android, I've set a path using PathEvaluator that animates along some curves along a full path.
When I set a duration (e.g. 6 seconds), it splits the duration across the number of curves I've set regardless of their length, which causes the animation to be too slow on some segments and too fast on others.
On iOS this can be fixed using
animation.calculationMode = kCAAnimationCubicPaced;
animation.timingFunction = ...;
This lets iOS smooth your entire path into mid-points and distribute the duration according to each segment's length.
Is there any way to get the same result in Android?
(besides breaking the path into discrete segments and manually assigning each segment its own duration, which is really ugly and unmaintainable).
I don't think anything can be done with ObjectAnimator, because there seems to be no function that can be called to set the relative duration of a certain fragment of the animation.
I did develop something similar to what you need a while back, but it works slightly differently - it inherits from Animation.
I've modified everything to work with your curving needs, and with the PathPoint class.
Here's an overview:
I supply the list of points to the animation in the constructor.
I calculate the length between all the points using a simple distance calculator. I then sum it all up to get the overall length of the path, and store the segment lengths in a map for future use (this is to improve efficiency during runtime).
When animating, I use the current interpolation time to figure out which 2 points I'm animating between, considering the ratio of time & the ratio of distance traveled.
I calculate the time it should take to animate between these 2 points according to the relative distance between them, compared to the overall distance.
I then interpolate separately between these 2 points using the calculation in the PathAnimator class.
Here's the code:
CurveAnimation.java:
public class CurveAnimation extends Animation
{
private static final float BEZIER_LENGTH_ACCURACY = 0.001f; // 1 must be divisible by this value. Make it smaller to improve accuracy, but this will increase runtime at the start of the animation.
private List<PathPoint> mPathPoints;
private float mOverallLength;
private Map<PathPoint, Double> mSegmentLengths = new HashMap<PathPoint, Double>(); // map between the end point and the length of the path to it.
public CurveAnimation(List<PathPoint> pathPoints)
{
mPathPoints = pathPoints;
if (mPathPoints == null || mPathPoints.size() < 2)
{
Log.e("CurveAnimation", "There must be at least 2 points on the path. There will be an exception soon!");
}
calculateOverallLength();
}
@Override
protected void applyTransformation(float interpolatedTime, Transformation t)
{
PathPoint[] startEndPart = getStartEndForTime(interpolatedTime);
PathPoint startPoint = startEndPart[0];
PathPoint endPoint = startEndPart[1];
float startTime = getStartTimeOfPoint(startPoint);
float endTime = getStartTimeOfPoint(endPoint);
float progress = (interpolatedTime - startTime) / (endTime - startTime);
float x, y;
float[] xy;
if (endPoint.mOperation == PathPoint.CURVE)
{
xy = getBezierXY(startPoint, endPoint, progress);
x = xy[0];
y = xy[1];
}
else if (endPoint.mOperation == PathPoint.LINE)
{
x = startPoint.mX + progress * (endPoint.mX - startPoint.mX);
y = startPoint.mY + progress * (endPoint.mY - startPoint.mY);
}
else
{
x = endPoint.mX;
y = endPoint.mY;
}
t.getMatrix().setTranslate(x, y);
super.applyTransformation(interpolatedTime, t);
}
private PathPoint[] getStartEndForTime(float time)
{
double length = 0;
if (time == 1)
{
return new PathPoint[] { mPathPoints.get(mPathPoints.size() - 2), mPathPoints.get(mPathPoints.size() - 1) };
}
PathPoint[] result = new PathPoint[2];
for (int i = 0; i < mPathPoints.size() - 1; i++)
{
length += calculateLengthFromIndex(i);
if (length / mOverallLength >= time)
{
result[0] = mPathPoints.get(i);
result[1] = mPathPoints.get(i + 1);
break;
}
}
return result;
}
private float getStartTimeOfPoint(PathPoint point)
{
float result = 0;
int index = 0;
while (mPathPoints.get(index) != point && index < mPathPoints.size() - 1)
{
result += (calculateLengthFromIndex(index) / mOverallLength);
index++;
}
return result;
}
private void calculateOverallLength()
{
mOverallLength = 0;
mSegmentLengths.clear();
double segmentLength;
for (int i = 0; i < mPathPoints.size() - 1; i++)
{
segmentLength = calculateLengthFromIndex(i);
mSegmentLengths.put(mPathPoints.get(i + 1), segmentLength);
mOverallLength += segmentLength;
}
}
private double calculateLengthFromIndex(int index)
{
PathPoint start = mPathPoints.get(index);
PathPoint end = mPathPoints.get(index + 1);
return calculateLength(start, end);
}
private double calculateLength(PathPoint start, PathPoint end)
{
if (mSegmentLengths.containsKey(end))
{
return mSegmentLengths.get(end);
}
else if (end.mOperation == PathPoint.LINE)
{
return calculateLength(start.mX, end.mX, start.mY, end.mY);
}
else if (end.mOperation == PathPoint.CURVE)
{
return calculateBezeirLength(start, end);
}
else
{
return 0;
}
}
private double calculateLength(float x0, float x1, float y0, float y1)
{
return Math.sqrt(((x0 - x1) * (x0 - x1)) + ((y0 - y1) * (y0 - y1)));
}
private double calculateBezeirLength(PathPoint start, PathPoint end)
{
double result = 0;
float x, y, x0, y0;
float[] xy;
x0 = start.mX;
y0 = start.mY;
for (float progress = BEZIER_LENGTH_ACCURACY; progress <= 1; progress += BEZIER_LENGTH_ACCURACY)
{
xy = getBezierXY(start, end, progress);
x = xy[0];
y = xy[1];
result += calculateLength(x, x0, y, y0);
x0 = x;
y0 = y;
}
return result;
}
private float[] getBezierXY(PathPoint start, PathPoint end, float progress)
{
float[] result = new float[2];
float oneMinusT, x, y;
oneMinusT = 1 - progress;
x = oneMinusT * oneMinusT * oneMinusT * start.mX +
3 * oneMinusT * oneMinusT * progress * end.mControl0X +
3 * oneMinusT * progress * progress * end.mControl1X +
progress * progress * progress * end.mX;
y = oneMinusT * oneMinusT * oneMinusT * start.mY +
3 * oneMinusT * oneMinusT * progress * end.mControl0Y +
3 * oneMinusT * progress * progress * end.mControl1Y +
progress * progress * progress * end.mY;
result[0] = x;
result[1] = y;
return result;
}
}
Here's a sample that shows how to activate the animation:
private void animate()
{
AnimatorPath path = new AnimatorPath();
path.moveTo(0, 0);
path.lineTo(0, 300);
path.curveTo(100, 0, 300, 900, 400, 500);
CurveAnimation animation = new CurveAnimation(path.mPoints);
animation.setDuration(5000);
animation.setInterpolator(new LinearInterpolator());
btn.startAnimation(animation);
}
Now, keep in mind that I'm currently calculating the length of the curve with an approximation, which will obviously cause some mild inaccuracies in the speed. If you feel it's not accurate enough, feel free to modify the code. Also, if you want to increase the accuracy of the curve-length estimate, try decreasing the value of BEZIER_LENGTH_ACCURACY. 1 must be evenly divisible by it (the loop samples 1/BEZIER_LENGTH_ACCURACY points along the curve), so accepted values are 0.001, 0.000025, etc.
While you might still notice some mild fluctuations in speed when using curves, it's much better than simply dividing the time equally between all segments.
I hope this helps :)
I tried using Gil's answer, but it didn't fit how I was animating.
Gil wrote an Animation class which is used to animate Views.
I was using ObjectAnimator.ofObject() to animate custom classes via their value properties, which can't be driven by a custom Animation.
So this is what I did:
I extend PathEvaluator and override its evaluate method.
I use Gil's logic to calculate the path's total length and the individual segment lengths.
Since PathEvaluator.evaluate is called for each PathPoint with t values 0..1, I normalize the interpolated time given to me so that it is incremental and doesn't reset to zero for each segment.
I ignore the start/end PathPoints given to me, so the current position can be before the start or after the end along the path, depending on the segment's duration.
I pass the calculated progress to my super (PathEvaluator) to compute the actual position.
This is the code:
public class NormalizedEvaluator extends PathEvaluator {
private static final float BEZIER_LENGTH_ACCURACY = 0.001f;
private List<PathPoint> mPathPoints;
private float mOverallLength;
private Map<PathPoint, Double> mSegmentLengths = new HashMap<PathPoint, Double>();
public NormalizedEvaluator(List<PathPoint> pathPoints) {
mPathPoints = pathPoints;
if (mPathPoints == null || mPathPoints.size() < 2) {
Log.e("CurveAnimation",
"There must be at least 2 points on the path. There will be an exception soon!");
}
calculateOverallLength();
}
@Override
public PathPoint evaluate(float interpolatedTime, PathPoint ignoredStartPoint,
PathPoint ignoredEndPoint) {
float index = getStartIndexOfPoint(ignoredStartPoint);
float normalizedInterpolatedTime = (interpolatedTime + index) / (mPathPoints.size() - 1);
PathPoint[] startEndPart = getStartEndForTime(normalizedInterpolatedTime);
PathPoint startPoint = startEndPart[0];
PathPoint endPoint = startEndPart[1];
float startTime = getStartTimeOfPoint(startPoint);
float endTime = getStartTimeOfPoint(endPoint);
float progress = (normalizedInterpolatedTime - startTime) / (endTime - startTime);
return super.evaluate(progress, startPoint, endPoint);
}
private PathPoint[] getStartEndForTime(float time) {
double length = 0;
if (time == 1) {
return new PathPoint[] { mPathPoints.get(mPathPoints.size() - 2),
mPathPoints.get(mPathPoints.size() - 1) };
}
PathPoint[] result = new PathPoint[2];
for (int i = 0; i < mPathPoints.size() - 1; i++) {
length += calculateLengthFromIndex(i);
if (length / mOverallLength >= time) {
result[0] = mPathPoints.get(i);
result[1] = mPathPoints.get(i + 1);
break;
}
}
return result;
}
private float getStartIndexOfPoint(PathPoint point) {
for (int ii = 0; ii < mPathPoints.size(); ii++) {
PathPoint current = mPathPoints.get(ii);
if (current == point) {
return ii;
}
}
return -1;
}
private float getStartTimeOfPoint(PathPoint point) {
float result = 0;
int index = 0;
while (mPathPoints.get(index) != point && index < mPathPoints.size() - 1) {
result += (calculateLengthFromIndex(index) / mOverallLength);
index++;
}
return result;
}
private void calculateOverallLength() {
mOverallLength = 0;
mSegmentLengths.clear();
double segmentLength;
for (int i = 0; i < mPathPoints.size() - 1; i++) {
segmentLength = calculateLengthFromIndex(i);
mSegmentLengths.put(mPathPoints.get(i + 1), segmentLength);
mOverallLength += segmentLength;
}
}
private double calculateLengthFromIndex(int index) {
PathPoint start = mPathPoints.get(index);
PathPoint end = mPathPoints.get(index + 1);
return calculateLength(start, end);
}
private double calculateLength(PathPoint start, PathPoint end) {
if (mSegmentLengths.containsKey(end)) {
return mSegmentLengths.get(end);
} else if (end.mOperation == PathPoint.LINE) {
return calculateLength(start.mX, end.mX, start.mY, end.mY);
} else if (end.mOperation == PathPoint.CURVE) {
return calculateBezeirLength(start, end);
} else {
return 0;
}
}
private double calculateLength(float x0, float x1, float y0, float y1) {
return Math.sqrt(((x0 - x1) * (x0 - x1)) + ((y0 - y1) * (y0 - y1)));
}
private double calculateBezeirLength(PathPoint start, PathPoint end) {
double result = 0;
float x, y, x0, y0;
float[] xy;
x0 = start.mX;
y0 = start.mY;
for (float progress = BEZIER_LENGTH_ACCURACY; progress <= 1; progress += BEZIER_LENGTH_ACCURACY) {
xy = getBezierXY(start, end, progress);
x = xy[0];
y = xy[1];
result += calculateLength(x, x0, y, y0);
x0 = x;
y0 = y;
}
return result;
}
private float[] getBezierXY(PathPoint start, PathPoint end, float progress) {
float[] result = new float[2];
float oneMinusT, x, y;
oneMinusT = 1 - progress;
x = oneMinusT * oneMinusT * oneMinusT * start.mX + 3 * oneMinusT * oneMinusT * progress
* end.mControl0X + 3 * oneMinusT * progress * progress * end.mControl1X + progress
* progress * progress * end.mX;
y = oneMinusT * oneMinusT * oneMinusT * start.mY + 3 * oneMinusT * oneMinusT * progress
* end.mControl0Y + 3 * oneMinusT * progress * progress * end.mControl1Y + progress
* progress * progress * end.mY;
result[0] = x;
result[1] = y;
return result;
}
}
This is the usage:
NormalizedEvaluator evaluator = new NormalizedEvaluator((List<PathPoint>) path.getPoints());
ObjectAnimator anim = ObjectAnimator.ofObject(object, "position", evaluator, path.getPoints().toArray());
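Just to be explicit: ObjectAnimator.ofObject(object, "position", ...) resolves the "position" property by reflection, so the animated object has to expose a setPosition(PathPoint) setter. A minimal sketch of such a target (the class name and body are my own illustration, not part of the original answer):
public class AnimatedSprite {
    private float mX, mY;

    // Called by ObjectAnimator for the "position" property on every animation frame
    public void setPosition(PathPoint point) {
        mX = point.mX;
        mY = point.mY;
        // trigger a redraw here (e.g. invalidate() or requestRender())
        // so the new position becomes visible
    }
}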
UPDATE: I just realized that I might have reinvented the wheel, please look at Specifying Keyframes.
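For reference, a minimal sketch of what the Keyframes approach could look like here. The fractions 0 / 0.3 / 1.0, the three-point path from the earlier sample, and the reuse of PathEvaluator and the setPosition(PathPoint) setter are my assumptions; check the Specifying Keyframes documentation for the exact usage:
PathPoint[] points = path.getPoints().toArray(new PathPoint[] {});
Keyframe kf0 = Keyframe.ofObject(0f, points[0]);
Keyframe kf1 = Keyframe.ofObject(0.3f, points[1]);
Keyframe kf2 = Keyframe.ofObject(1f, points[2]);
PropertyValuesHolder pvh = PropertyValuesHolder.ofKeyframe("position", kf0, kf1, kf2);
pvh.setEvaluator(new PathEvaluator()); // Object keyframes need a TypeEvaluator
ObjectAnimator anim = ObjectAnimator.ofPropertyValuesHolder(object, pvh);
anim.setDuration(5000);
anim.start();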
It is surprising that nothing like this is available out of the box. Anyway, if you don't want to calculate path lengths at run time, I was able to add support for assigning weights to path segments. The idea is to assign a weight to each segment and run the animation; if it looks right, you're done, otherwise just increase or decrease the weight assigned to each Path segment.
The following code is a modified version of the official Android sample you pointed to in your question:
// Set up the path we're animating along
AnimatorPath path = new AnimatorPath();
path.moveTo(0, 0).setWeight(0);
path.lineTo(0, 300).setWeight(30);// assign arbitrary weight
path.curveTo(100, 0, 300, 900, 400, 500).setWeight(70);// assign arbitrary weight
final PathPoint[] points = path.getPoints().toArray(new PathPoint[] {});
mFirstKeyframe = points[0];
final int numFrames = points.length;
final PathEvaluator pathEvaluator = new PathEvaluator();
final ValueAnimator anim = ValueAnimator.ofInt(0, 1);// dummy values
anim.setDuration(1000);
anim.setInterpolator(new LinearInterpolator());
anim.addUpdateListener(new AnimatorUpdateListener() {
@Override
public void onAnimationUpdate(ValueAnimator animation) {
float fraction = animation.getAnimatedFraction();
// Special-case optimization for the common case of only two
// keyframes
if (numFrames == 2) {
PathPoint nextPoint = pathEvaluator.evaluate(fraction,
points[0], points[1]);
setButtonLoc(nextPoint);
} else {
PathPoint prevKeyframe = mFirstKeyframe;
for (int i = 1; i < numFrames; ++i) {
PathPoint nextKeyframe = points[i];
if (fraction < nextKeyframe.getFraction()) {
final float prevFraction = prevKeyframe
.getFraction();
float intervalFraction = (fraction - prevFraction)
/ (nextKeyframe.getFraction() - prevFraction);
PathPoint nextPoint = pathEvaluator.evaluate(
intervalFraction, prevKeyframe,
nextKeyframe);
setButtonLoc(nextPoint);
break;
}
prevKeyframe = nextKeyframe;
}
}
}
});
And that's it!
Of course I modified other classes as well, but nothing big was added. For example, in PathPoint I added this:
float mWeight;
float mFraction;
public void setWeight(float weight) {
mWeight = weight;
}
public float getWeight() {
return mWeight;
}
public void setFraction(float fraction) {
mFraction = fraction;
}
public float getFraction() {
return mFraction;
}
In AnimatorPath I modified the getPoints() method like this:
public Collection<PathPoint> getPoints() {
// calculate fractions
float totalWeight = 0.0F;
for (PathPoint p : mPoints) {
totalWeight += p.getWeight();
}
float lastWeight = 0F;
for (PathPoint p : mPoints) {
p.setFraction(lastWeight = lastWeight + p.getWeight() / totalWeight);
}
return mPoints;
}
And that's pretty much it. Oh, and for better readability I added a builder pattern to AnimatorPath, so all three methods were changed like this:
public PathPoint moveTo(float x, float y) {// same for lineTo and curveTo method
PathPoint p = PathPoint.moveTo(x, y);
mPoints.add(p);
return p;
}
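As a quick worked example of how the weights map to fractions (this just traces the getPoints() logic above, using the same weights as the sample; total weight = 0 + 30 + 70 = 100):
AnimatorPath path = new AnimatorPath();
path.moveTo(0, 0).setWeight(0);                          // fraction = 0/100      = 0.0
path.lineTo(0, 300).setWeight(30);                       // fraction = (0+30)/100 = 0.3
path.curveTo(100, 0, 300, 900, 400, 500).setWeight(70);  // fraction = (30+70)/100 = 1.0
path.getPoints(); // fractions are filled in here
// With a 1000 ms animation the line segment therefore gets ~300 ms and the curve ~700 ms,
// regardless of how long the segments actually are on screen.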
NOTE: To handle interpolators that can give a fraction less than 0 or greater than 1 (e.g. AnticipateOvershootInterpolator), look at the com.nineoldandroids.animation.KeyframeSet.getValue(float fraction) method and implement that logic in onAnimationUpdate(ValueAnimator animation).
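A minimal sketch of that logic inside onAnimationUpdate — this is my assumption of how KeyframeSet.getValue behaves, namely that it extrapolates from the first or last pair of keyframes when the fraction leaves the 0..1 range:
float fraction = animation.getAnimatedFraction();
if (fraction < 0f) {
    // extrapolate backwards from the first segment
    float intervalFraction = (fraction - points[0].getFraction())
            / (points[1].getFraction() - points[0].getFraction());
    setButtonLoc(pathEvaluator.evaluate(intervalFraction, points[0], points[1]));
} else if (fraction > 1f) {
    // extrapolate forwards from the last segment
    PathPoint prev = points[numFrames - 2];
    PathPoint last = points[numFrames - 1];
    float intervalFraction = (fraction - prev.getFraction())
            / (last.getFraction() - prev.getFraction());
    setButtonLoc(pathEvaluator.evaluate(intervalFraction, prev, last));
} else {
    // in-range: fall through to the keyframe loop shown earlier
}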

Simulating multi-touch events (one finger linear swipe)

I am editing a driver file that uses the multi-touch protocol.
My goal is to bind a swipe gesture from (x1,y1) to (x2,y2) that follows a straight line.
To do this I execute this function inside the key-press detection function:
static void do_swipe(struct kp *kp, long xstart, long ystart, long xend, long yend,
    int id) {
    int sx, sy, err, e2;
    int dx = abs(xend - xstart);
    int dy = abs(yend - ystart);
    long tempx = xstart;
    long tempy = ystart;

    /* long arguments need %ld in the printk format string */
    printk("SWIPE! X1 %ld Y1 %ld X2 %ld Y2 %ld ID %d\n", xstart, ystart, xend, yend, id);

    if (xstart < xend)
        sx = 1;
    else
        sx = -1;
    if (ystart < yend)
        sy = 1;
    else
        sy = -1;
    err = dx - dy;

    while (1) {
        /* Report a touch at the current point; input_sync flushes it as one event frame */
        key_report(kp, tempx, tempy, id);
        input_sync(kp->input);
        if (tempx == xend && tempy == yend)
            break;
        e2 = 2 * err;
        /* Bresenham-style: step one axis at a time towards the end point */
        if (e2 > (-1) * dy) {
            err = err - dy;
            tempx = tempx + sx;
        } else if (e2 < dx) {
            err = err + dx;
            tempy = tempy + sy;
        }
    }
    printk("END! X %ld Y %ld\n", tempx, tempy);
}
This function is a modified Bresenham line algorithm that, instead of drawing a line, calls key_report to simulate a touch at each point along the line.
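For example, calling it with (xstart, ystart) = (0, 0) and (xend, yend) = (2, 1) reports the points (0,0), (1,0), (2,0) and finally (2,1), with an input_sync() after each one, so the touch appears to travel along the line one step at a time. key_report itself is shown below.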
static void key_report(struct kp *kp, long x, long y, int id) {
    if (x == 0 && y == 0) {
        /* (0,0) is treated as "no touch" and is not reported */
        //printk("---------- zero point --------------\n");
        ;
    } else {
        /* Multi-touch protocol type A: one anonymous contact per input_mt_sync() */
        input_report_key(kp->input, BTN_TOUCH, 1);
        input_report_abs(kp->input, ABS_MT_TRACKING_ID, id);
        input_report_abs(kp->input, ABS_MT_TOUCH_MAJOR, 20);
        input_report_abs(kp->input, ABS_MT_WIDTH_MAJOR, 20);
        input_report_abs(kp->input, ABS_MT_POSITION_X, x);
        input_report_abs(kp->input, ABS_MT_POSITION_Y, y);
        input_mt_sync(kp->input);
    }
    release = 1;
}
where kp->input is defined as:
struct input_dev *input;
Edit: I updated the code, and I'm also attaching the generic key-press handling code, because I'd like the driver to perform a swipe only once per key press, to avoid the swipe looping.
static void kp_work(struct kp *kp) {
int i, code, value;
kp_search_key(kp);
if (key_param[0] == 3) {
if (kp->key_valid[4] == 1) {
value = kp->key_value[4];
kp->flagchan4 = 0;
if (value >= 0 && value <= (9 + 40)) {
if (key_param[99] == 1 ) {
do_swipe(kp, key_param[25], key_param[26], key_param[97],
key_param[98], 12);
} else
key_report(kp, key_param[25], key_param[26], 12);
} else if (value >= 392 - 40 && value <= (392 + 40)) {
if (key_param[102] == 1) {
do_swipe(kp, key_param[23], key_param[24], key_param[100],
key_param[101], 12);
} else
key_report(kp, key_param[23], key_param[24], 12);
}
kp->flagchan4 = 0;
} else {
kp->flagchan4 = 1;
}
}
input_sync(kp->input);
}
Maybe I could set a check on kp->flagchan4...
Edit: Setting a check on flagchan4 did the trick.
