I have drawn a square, and now I want to move it along the plane following the mouse pointer. I am using OpenGL ES 1.0. I tried to get the difference in motion position with the code below:
@Override
public boolean onTouchEvent(MotionEvent e) {
    float x = e.getX();
    float y = e.getY();
    switch (e.getAction()) {
        case MotionEvent.ACTION_MOVE:
            mRenderer.dx = x - mPreviousX;
            mRenderer.dy = y - mPreviousY;
            requestRender();
            break;
    }
    mPreviousX = x;
    mPreviousY = y;
    return true;
}
and then I am translating the object with the glTranslatef function using the dx and dy values:
gl.glTranslatef(dx, dy, 0);
But for a small movement of the mouse pointer there is a large displacement in the object's position.
How can I move the object along with the mouse?
Pointer coordinates are usually in physical screen dimensions, i.e. pixels. Your OpenGL coordinates depend on the projection and modelview matrices you apply.
What you must do is back-transform the screen/viewport coordinates into your object or world space. This is done by reversing the transformation pipeline. Since matrix multiplication is a linear operation, you can not only push absolute values through it; it works just as well for differentials.
I'd tell you more, but I'd need to see your existing program structure to give you sensible advice on how to extend or change it. Please post it on http://pastebin.com or similar.
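In the meantime, here is a minimal sketch of such a back-transform for the common case of an orthographic projection. The names orthoWidth, orthoHeight, viewWidth and viewHeight are assumptions standing in for the extents you pass to glOrthof and the view's pixel size; the renderer is assumed to reset its modelview matrix each frame and call gl.glTranslatef(dx, dy, 0) with the accumulated offset.

@Override
public boolean onTouchEvent(MotionEvent e) {
    float x = e.getX();
    float y = e.getY();
    if (e.getAction() == MotionEvent.ACTION_MOVE) {
        // Scale the pixel delta by (world units per pixel) so a small finger
        // movement produces an equally small movement in world space.
        mRenderer.dx += (x - mPreviousX) * (orthoWidth / viewWidth);
        // Screen y grows downwards, world y usually grows upwards, hence the sign flip.
        mRenderer.dy += (mPreviousY - y) * (orthoHeight / viewHeight);
        requestRender();
    }
    mPreviousX = x;
    mPreviousY = y;
    return true;
}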
Related
I have an Android application that uses the native OpenCV library to track objects in my camera view. I find the position of the objects using moments:
Moments moment = moments((Mat) contours[i]);
double area = moment.m00;
object.setXPos(moment.m10 / area);
object.setYPos(moment.m01 / area);
What I am trying to implement is a way to see whether my finger's touch point is within a distance threshold of the object's position. However, Android calculates my finger position based on the touch location on the screen, whereas the object's position is calculated by moments, which I believe is causing wacky results when I calculate the distance from the touch event to the object location. Is there any way to remedy this, or am I going about this the wrong way? Thanks in advance for your help!
Other possibly useful info:
@Override
public boolean onTouchEvent(MotionEvent event) {
    double x = event.getX();
    double y = event.getY();
    switch (event.getAction()) {
        case MotionEvent.ACTION_UP: {
            // JNI function. Converts x and y to Point(x, y) and compares its
            // distance to tracked object locations.
            GetTouchedPoint(x, y);
            break;
        }
    }
    return false;
}
// Native function: p1 is the finger touch location, p2 is the object location (found by moments)
int ObjectDetector::distance(Point p1, Point p2) {
    int dx = p1.x - p2.x;
    int dy = p1.y - p2.y;
    int distance = sqrt(pow(dx, 2) + pow(dy, 2));
    return distance;
}
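One likely cause of the mismatch is that the touch coordinates are in view pixels while the moments are computed on the camera frame's pixel grid, and the preview is usually scaled to fill the view. A rough sketch of the conversion, assuming hypothetical frameWidth/frameHeight (the size of the processed Mat) and viewWidth/viewHeight (the size of the preview view) fields, could look like this:

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_UP) {
        // Scale view-space touch coordinates into the coordinate system of the
        // camera frame so they are comparable with the moment-based positions.
        double scaleX = (double) frameWidth / viewWidth;
        double scaleY = (double) frameHeight / viewHeight;
        GetTouchedPoint(event.getX() * scaleX, event.getY() * scaleY);
    }
    return false;
}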
I am working with AndEngine, the open-source game platform. I have a sprite that moves continuously on the screen and changes direction when it collides with the screen boundary. Now I want to change its direction towards the player's touch point on the screen. I can't manage this part. I use a PhysicsHandler to move the sprite with a velocity. I understand I have to implement IOnSceneTouchListener to get the touched point and set the direction on the sprite, but I have found nothing so far. Here is how my code goes:
Pilot aPilot;
PhysicsHandler mPhysicsHandler;

aPilot = new Pilot(222, 333, pilotTexures, vbom) {
    @Override
    protected void onManagedUpdate(float pSecondsElapsed) {
        /*
         * Change direction when colliding with the boundary wall of the screen.
         */
        if (this.mX < 0) {
            mPhysicsHandler.setVelocityX(AtomicEngine.DEMO_VELOCITY);
        } else if (this.mX + this.getWidth() > ResourcesManager.CAMERA_WIDTH) {
            mPhysicsHandler.setVelocityX(-AtomicEngine.DEMO_VELOCITY);
        }
        if (this.mY < 0) {
            mPhysicsHandler.setVelocityY(AtomicEngine.DEMO_VELOCITY);
        } else if (this.mY + this.getHeight() > ResourcesManager.CAMERA_HEIGHT) {
            mPhysicsHandler.setVelocityY(-AtomicEngine.DEMO_VELOCITY);
        }
        super.onManagedUpdate(pSecondsElapsed);
    }
};

/*
 * Initialize mPhysicsHandler.
 */
mPhysicsHandler = new PhysicsHandler(aPilot);
registerUpdateHandler(this.mPhysicsHandler);
mPhysicsHandler.setVelocity(AtomicEngine.DEMO_VELOCITY, AtomicEngine.DEMO_VELOCITY);
attachChild(aPilot);
aPilot.setScale(3f);
And my overridden onSceneTouchEvent method looks like this:
@Override
public boolean onSceneTouchEvent(Scene pScene, TouchEvent pSceneTouchEvent) {
    if (pSceneTouchEvent.isActionDown()) {
        // need some idea here
    } else if (pSceneTouchEvent.isActionMove()) {
    }
    return false;
}
Waiting for your expert advice.
You have to calculate the difference between the pilot's current position (e.g. by calling getSceneCenterCoordinates() on your pilot sprite you get its coordinates in the scene) and the position of the touch event. With that difference in mind, you can calculate the angle (measured on the unit circle), or use a factor that relates your max velocity to the distance, then use your PhysicsHandler and set a new velocity. The factor is used to limit the speed to a maximum speed.
So your code should look something like this (not tested, ask if it doesn't work):
@Override
public boolean onSceneTouchEvent(Scene pScene, TouchEvent pSceneTouchEvent) {
    MainActivity.this.mCamera.convertCameraSceneToSceneTouchEvent(pSceneTouchEvent); // see edit
    float touchX = pSceneTouchEvent.getX();
    float touchY = pSceneTouchEvent.getY();

    float[] pilotCoord = aPilot.getSceneCenterCoordinates();
    float pilotX = pilotCoord[0];
    float pilotY = pilotCoord[1];

    float xDiff = touchX - pilotX;
    float yDiff = touchY - pilotY; // could be wrong with the AnchorCenter branch

    // Use the max velocity divided by the distance to get the velocity factor
    // for x & y; calculating angles might also work.
    float veloFactor = MAX_VELO / (float) Math.sqrt(xDiff * xDiff + yDiff * yDiff);
    float xVelo = xDiff * veloFactor;
    float yVelo = yDiff * veloFactor;

    mPhysicsHandler.setVelocityX(xVelo);
    mPhysicsHandler.setVelocityY(yVelo);
    return true;
}
So much for the calculation that sets the velocity towards the finger. If you want some kind of (de)acceleration (like: as long as the finger is down, the pilot accelerates or decelerates towards the finger, otherwise it keeps its speed), you have to use setLinearVelocity(xVelo, yVelo) instead and set the current velocity as the velocity (to maintain speed).
Edit
The conversion of the touch event from a CameraScene to a SceneTouchEvent is only useful if you add your onSceneTouchListener to your HUD. It converts the event's x/y values, based on the current camera position over the scene, into the x/y values as they would have occurred on the scene.
Otherwise, if you add the listener directly to your Scene, you don't need to convert the touch event and that line can be deleted.
I have a background image as a drawable in my custom view. This drawable may be pinch zoomed or moved.
Currently I need a green dot that is drawn on the image to stay stationary relative to the screen. That is, it should always stay at the same position as the pin, as shown below. (Of course, the pin is simply an ImageView and does NOT move at all!)
I have successfully made it stationary relative to the screen, when the map behind is moved as follows in my custom view, MapView:
@Override
public boolean onTouchEvent(MotionEvent ev) {
    // Let the ScaleGestureDetector inspect all events.
    mScaleDetector.onTouchEvent(ev);

    final int action = ev.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN: {
            final float x = ev.getX();
            final float y = ev.getY();
            mLastTouchX = x;
            mLastTouchY = y;
            mActivePointerId = ev.getPointerId(0);
            break;
        }
        case MotionEvent.ACTION_MOVE: { // triggered as long as a finger moves
            final int pointerIndex = ev.findPointerIndex(mActivePointerId);
            final float x = ev.getX(pointerIndex);
            final float y = ev.getY(pointerIndex);

            // Only move if the ScaleGestureDetector isn't processing a gesture.
            if (!mScaleDetector.isInProgress()) {
                final float dx = x - mLastTouchX;
                final float dy = y - mLastTouchY;
                mPosX += dx;
                mPosY += dy;

                // Update the starting point if the 'Start' button is not yet pressed,
                // to ensure the screen center (i.e. the pin) is always the starting point.
                if (!isStarted) {
                    Constant.setInitialX(Constant.INITIAL_X - dx);
                    Constant.setInitialY(Constant.INITIAL_Y - dy);
                    if ((historyXSeries.size() > 0) && (historyYSeries.size() > 0)) {
                        // new initial starting point
                        historyXSeries.set(0, Constant.INITIAL_X);
                        historyYSeries.set(0, Constant.INITIAL_Y);
                    }
                }
                invalidate();
            }
            mLastTouchX = x;
            mLastTouchY = y;
            break;
        }
By doing the above, my green dot stays in place when the background image is moved.
But I have problems making it stay in place when the background image is zoomed.
Essentially, I don't really understand how canvas.scale(mScaleFactor, mScaleFactor) works, and therefore I cannot move the green dot accordingly, as I did in the simple moving case.
I think something should be added in the scale listener below; could anybody help me fill in that part?
private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        mScaleFactor *= detector.getScaleFactor();

        // Don't let the object get too small or too large.
        mScaleFactor = Math.max(1f, Math.min(mScaleFactor, 10.0f)); // 1 ~ 10

        // HOW TO MOVE THE GREEN DOT HERE??
        invalidate();
        return true;
    }
Or please at least explain how canvas.scale(mScaleFactor, mScaleFactor) works, and how may I move the green dot accordingly?
Keep in mind that the canvas is meant to scale everything according to the scale factor, so while working against the zoom is possible, it is probably not the best approach. However, if this is what you're looking for, I will help you as best I can.
I am assuming the following:
Scale factor is relative to the current zoom (the old zoom is always scale factor 1). If this is not the case, you should observe the zoom values after scaling roughly 200% two times and check whether the resulting scale factor is 4 or 3 (exponential or linear). You can achieve the results below by normalizing the scale factor to 2 for a zoom factor of 200%, for example. You'll have to remember the old scale factor in order to do so.
No rotation is performed
If this is the case, then the following can be said for a marker with respect to the zoom center.
For every horizontal pixel x away from the zoom center after the zoom, its original position can be calculated as zoom_center_x + x / scale_factor (or alternatively zoom_center_x + (marker_x - zoom_center_x) / scale_factor). In other words, if the zoom center is (50, 0) and the marker is (100, 0) with a scale factor of 2, then the x position of the marker prior to the zoom was 50 + (100 - 50) / 2, or 75. Obviously, if the marker is at the same position as the zoom center, its x position will be the same as the zoom center's. Similarly, if the scale is 1, the x position of the marker will be the same as it is now.
The same can be applied to the y axis.
While I can't know exactly how to set the position of your marker, I would expect the code to look something like:
// The scale gesture's focal point can serve as the zoom center.
float zoomCenterX = detector.getFocusX();
float zoomCenterY = detector.getFocusY();
// Set marker variable here
marker.setX(Math.round(zoomCenterX + ((double) (marker.getX() - zoomCenterX)) / mScaleFactor));
marker.setY(Math.round(zoomCenterY + ((double) (marker.getY() - zoomCenterY)) / mScaleFactor));
I hope that helps.
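As for how canvas.scale(mScaleFactor, mScaleFactor) works: every coordinate drawn after the call is multiplied by the factor (relative to the canvas origin, or a pivot if you pass one), so a point that should stay fixed on screen must either be drawn at its screen position divided by the factor, or be drawn after the canvas has been restored. Here is a rough onDraw() sketch under the same no-rotation assumption, where mBackgroundDrawable, dotScreenX, dotScreenY and mDotPaint are placeholder names:

@Override
protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);

    canvas.save();
    canvas.translate(mPosX, mPosY);
    canvas.scale(mScaleFactor, mScaleFactor); // everything below is multiplied by mScaleFactor
    mBackgroundDrawable.draw(canvas);         // the map pans and zooms as expected
    canvas.restore();                         // back to plain screen coordinates

    // Drawn after restore(), so the dot ignores pan/zoom and stays put on screen.
    canvas.drawCircle(dotScreenX, dotScreenY, 10f, mDotPaint);
}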
I'm creating a simple OpenGL 'app' to rotate a triangle. I wish, on the first touch, to save the angle the touch position corresponds to. Then, on motion, rotate the shape by the angle corresponding to the current position minus the angle of the first touch.
It was my understanding that the first step should be done in MotionEvent.ACTION_DOWN, and the second in MotionEvent.ACTION_MOVE. However, it seems as if ACTION_DOWN is being called during the motion. That is, the code below causes the shape to rotate as a finger is dragged (whereas I expected it to rotate only to the position of the initial touch):
private double mTheta;

@Override
public boolean onTouchEvent(MotionEvent e) {
    super.onTouchEvent(e);
    float x = e.getX();
    float y = e.getY();
    switch (e.getAction()) {
        case MotionEvent.ACTION_DOWN:
            x -= getWidth() / 2;
            y -= getHeight() / 2;
            mTheta = Math.atan2(-x, -y) * 180.0f / Math.PI;
            GL20Renderer.mAngle = (float) mTheta;
            requestRender();
    }
    return true;
}
Is my code wrong, or is this some weird behaviour of the emulator? (I don't currently have access to an Android device.)
(Addendum: I originally attempted to implement the above fully, with a MotionEvent.ACTION_MOVE case for calculating the new angle and rendering; the ACTION_DOWN case only saved the starting offset angle. This didn't work, in that the shape didn't rotate, because the offset angle was being re-calculated during movement, which is how I ended up at this point.)
It might have been that you forgot to put a break statement in your switch/case, so once ACTION_MOVE is done, ACTION_DOWN follows immediately after.
I needed to use getActionMasked() rather than getAction(). See the comment from WarrenFaith.
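For reference, here is a minimal sketch of what the corrected handler might look like with getActionMasked() and a separate ACTION_MOVE case, following the approach described in the addendum. mStartTheta and mStartAngle are assumed fields added to hold the state captured on the first touch.

@Override
public boolean onTouchEvent(MotionEvent e) {
    float x = e.getX() - getWidth() / 2f;
    float y = e.getY() - getHeight() / 2f;
    double theta = Math.atan2(-x, -y) * 180.0 / Math.PI;

    switch (e.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
            // Remember the angle of the first touch and the shape's angle at that moment.
            mStartTheta = theta;
            mStartAngle = GL20Renderer.mAngle;
            break;
        case MotionEvent.ACTION_MOVE:
            // Rotate by how far the finger has swept since the first touch.
            GL20Renderer.mAngle = (float) (mStartAngle + theta - mStartTheta);
            requestRender();
            break;
    }
    return true;
}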
I'm currently working on my own basic drawing app. So far my app is working well, but I noticed that my motion event doesn't seem to capture all the X and Y points that are touched.
When I move my finger across the screen, there are noticeable spaces between the circles. Only when I move my finger slowly does it capture all the points. Is there a way I can grab all the points, or is there a way I can optimize it to handle all the points?
Here's how I'm doing it:
setOnTouchListener(new View.OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        int x;
        int y;
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN: {
                x = (int) event.getX();
                y = (int) event.getY();
                // method draws circle at x and y coordinate
                MyPainter mp = new MyPainter(x, y);
                break;
            }
            case MotionEvent.ACTION_MOVE: {
                x = (int) event.getX();
                y = (int) event.getY();
                MyPainter mp = new MyPainter(x, y);
                break;
            }
        }
        return true;
    }
});
Any Suggestions or comments are appreciated. Thanks
This is just the way the hardware is designed. The hardware samples the touchscreen N times per second, and spits out the coordinates every time it senses a finger.
If you move faster than the screen samples, you'll have gaps in the reported data. The only thing you can do is generate the missing points yourself by interpolating along the line between two consecutive touches, as sketched below.
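A sketch of that interpolation, reusing the question's own MyPainter class; mLastX/mLastY are assumed fields set on ACTION_DOWN, and STEP is the desired spacing in pixels:

case MotionEvent.ACTION_MOVE: {
    float x = event.getX();
    float y = event.getY();
    float dx = x - mLastX;
    float dy = y - mLastY;
    float dist = (float) Math.sqrt(dx * dx + dy * dy);
    int steps = Math.max(1, (int) (dist / STEP));
    for (int i = 1; i <= steps; i++) {
        float t = i / (float) steps;
        // Draw intermediate circles along the segment between the two samples.
        new MyPainter((int) (mLastX + t * dx), (int) (mLastY + t * dy));
    }
    mLastX = x;
    mLastY = y;
    break;
}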
The accepted answer isn't strictly correct; it is possible to access the touch data from the gaps using the following methods:
event.getHistorySize();    // returns the number of historical data points as an int
event.getHistoricalX(int);
event.getHistoricalY(int);
Other related methods are available when there are multiple pointers (see documentation).
A good example of how this can be used in practice can be found here.
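For instance, the historical points could be consumed inside ACTION_MOVE roughly like this (MyPainter is the question's own class; the rest is standard MotionEvent API):

case MotionEvent.ACTION_MOVE: {
    // Batched samples that arrived since the last onTouch() call.
    int historySize = event.getHistorySize();
    for (int i = 0; i < historySize; i++) {
        new MyPainter((int) event.getHistoricalX(i), (int) event.getHistoricalY(i));
    }
    // Finally draw the most recent sample.
    new MyPainter((int) event.getX(), (int) event.getY());
    break;
}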