GLSurfaceView textured rect as button clicks - android

I have a simple question about GLSurfaceView and objects drawn on it. I am drawing a rect and binding a texture to it, which works great. The textured rect is drawn on a GLSurfaceView and acts as a "button"-like object, so I need to know whether the user clicked on the button or not.
I imagined it like this: if the user taps the screen and the x and y of the tap fall inside the rect of the drawn object (in my case the button), another operation should be performed (i.e. change another view or so). Is my idea correct?
Now the question: how do I handle user interactions, and how can I get the rect of the drawn object (the button) mapped onto the GLSurfaceView (e.g. Rect(120, 80)) so that I can check whether the user clicked the button or not? Or is there some other approach?
Also I am interested in the following:
I have in mind to make my application fully OpenGL ES based. I won't use Button views from Android. I am working with big textures, and I think OpenGL ES is the better way to do it, since animations via translation, rotation, and scaling are much easier to handle. Am I right?
Thanks

You can use onTouchEvent to get the x and y coordinates of the point where the user touches:
@Override
public boolean onTouchEvent(MotionEvent event) {
    float x = event.getX();
    float y = event.getY();
    return true; // consume the event so you keep receiving the rest of the gesture
}
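To answer the hit-testing part: if you keep track of the screen-space bounds of your textured quad, you can test the tap against them directly. A minimal sketch, assuming a hypothetical buttonBounds rect held in screen pixels and a hypothetical onButtonClicked() handler (note that Android touch coordinates have their origin at the top-left, while OpenGL's default origin is the bottom-left, so flip y when deriving the bounds from your GL geometry):
private final RectF buttonBounds = new RectF(120f, 80f, 320f, 160f); // left, top, right, bottom in screen px

@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_UP) {
        if (buttonBounds.contains(event.getX(), event.getY())) {
            onButtonClicked(); // e.g. change another view, start an animation...
        }
    }
    return true;
}
If the button is defined in GL world coordinates instead, either keep a parallel screen-space rect updated alongside the quad, or convert the tap into world space (e.g. with GLU.gluUnProject) and run the containment test there.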

Related

Camera rotation not rotating correctly on mobile

I'm making an FPS and I want the player to be able to rotate the camera. My code works on PC, but on mobile, if I'm rotating the camera while also touching the fire button (or anywhere else with my other finger), the camera rotates to the right (which is where my fire button is). I don't know if I can do something about it, or whether I need to cancel the release for Android and iOS and publish my game only for PC.
Part of my code:
if (CanProcessInput())
{
    // Check whether this look input is coming from a gamepad stick rather than the mouse
    bool isGamepad = Input.GetAxis(stickInputName) != 0f;
    float i = isGamepad ? Input.GetAxis(stickInputName) : Input.GetAxisRaw(mouseInputName);

    // handle inverting vertical input
    if (InvertYAxis)
        i *= -1f;

    // apply sensitivity multiplier
    i *= LookSensitivity;

    if (isGamepad)
    {
        // since mouse input is already deltaTime-dependent, only scale input with frame time if it's coming from sticks
        i *= Time.deltaTime;
    }
    else
    {
        // reduce mouse input amount to be equivalent to stick movement
        i *= 0.01f;
#if UNITY_WEBGL
        // Mouse tends to be even more sensitive in WebGL due to mouse acceleration, so reduce it even more
        i *= WebglLookSensitivityMultiplier;
#endif
    }

    return i;
}
Segment the touch input so you ignore values generated in the area of the fire button. You can do this by checking the Touch.position value:
Description
The position of the touch in screen space pixel coordinates.
Position returns the current position of a touch contact as it's dragged. If you need the original position of the touch see Touch.rawPosition.
The documentation hints at what you might be interested in, too: the original touch position. Your question body asks about horizontal motion, but your code references y values. The position coordinates are given as (x, y) pixels, with x running across the screen width, so to be clear up front, I'll use x values to answer your question about horizontal motion.
So if you know that the bottom-left corner of the screen is (0,0), and you can use Screen.width to get the width in pixels, then you can reserve the right 20% of the screen for the fire button. Anything to the left of the 80%-of-width pixel is an acceptable input for rotation; that pixel is the maximum allowable input:
private int maxAllowablePixel;

void Start()
{
    maxAllowablePixel = (int)(0.8f * Screen.width);
}
and then only process the touch as a rotation if the original touch was less than that value:
if (touch.rawPosition.x < maxAllowablePixel)
{
    DoRotation(touch.position);
}
Again, this still allows the user to put their finger down in the middle of the screen, drag it over the fire button, and rotate that much; but any touches that originate in the fire button "exclusion zone" are ignored for the purposes of doing rotations.
Also when you do the rotation, do it where the finger is now (Touch.position) and not where it started (Touch.rawPosition).
The way I always solve an issue like this is to not use the touch positions directly, but to have a "TouchReceiver" Object in the Canvas with a script that implements IDragHandler. So the class might look like this:
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

[RequireComponent(typeof(Image))]
public class TouchReceiver : MonoBehaviour, IDragHandler
{
    public void OnDrag(PointerEventData data)
    {
        Vector2 delta = data.delta;
        // do something with delta, like rotating the camera.
    }
}
Then set up your canvas like so:
Make sure to put the "TouchReceiver" behind the fire button; that way the OnDrag event will not occur when pressing the fire button, because the fire button blocks the "TouchReceiver" object itself. The "TouchReceiver" object needs to have an Image component, otherwise it won't be able to receive any touches. Of course, the image should then be made totally transparent.
With this setup it is pretty easy to move the fire button around without changing anything else. This might also come in handy when adapting to different screen resolutions.

Dragging a sprite in Android

I'm simply trying to drag a sprite around using my finger. I'm doing this by detecting the distance the finger touching the screen has moved and then moving the sprite by the same amount.
This is an OpenGL ES 2.0 app, so my rendering and logic updates take place on my GL rendering thread, while touch events are, of course, captured on the UI thread.
So, my setup is something like this:
UI Thread
case MotionEvent.ACTION_MOVE: {
    movedByX = event.getX() - oldXPosition;
    movedByY = event.getY() - oldYPosition;
    oldXPosition = event.getX();
    oldYPosition = event.getY();
    break;
}
GL Rendering thread
Rendering
@Override
public void render() {
    drawSprite(testSprite); // Draws using the sprite's internal x and y coordinates
}
logic update
@Override
public void updateLogic() {
    testSprite.x += movedByX; // Update the sprite's X position
    testSprite.y += movedByY; // Update the sprite's Y position
}
The issue
If one drags the sprite around the screen for a while and then stops, the resting point of the finger relative to the sprite is not the same as it was when the finger initially went down. Take for example a circular sprite, where the blue circle is the sprite and the red dot represents the finger/pointer.
So as you can see, it doesn't quite work as expected and I'm not sure why.
I had a similar question previously, in which I 'worked around' the problem by grabbing the X and Y in ACTION_MOVE (onTouchEvent/UI thread) as I do above, but then, in my updateLogic method, making a copy of it and working out the 'movedBy' amount there before applying it to my sprite's position.
Doing this effectively solved the problem of the finger 'wandering', but it makes the movement of the sprite very choppy, so I can't use that solution.
I think this choppiness may be because the rendering thread sometimes runs twice without the UI thread running; so even though the finger has moved, the logic is still using the stale values, because onTouchEvent hasn't yet captured the most up-to-date finger position. But I'm not 100% sure.
If I simply update my sprite's position in my UI thread (again, in ACTION_MOVE), I again get very choppy movement, but the pointer position does remain correct.
So, I need to keep the smooth movement that I get from the method outlined at the top of the question, but I need to know why the movedBy amount is causing the sprite to wander from the finger.
Other Notes
I need to move the sprite using the difference between the finger's old and current positions, not simply draw the sprite at the finger's current position, because this will eventually become part of a 'scrollable' menu system.
All of my variables are declared as private volatile float, and my onTouchEvent and updateLogic methods are synchronized.
I don't know if it's a typo, but:
@Override
public void updateLogic() {
    testSprite.x += movedByX; //Update the sprite's X position
    testSprite.x += movedByY; //Update the sprite's Y position
}
you set testSprite.x both times.
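As for why the sprite wanders in the first place: with the setup in the question, movedByX/movedByY are overwritten on every touch event, while the GL thread may run twice on one value (applying the same delta twice) or skip a value entirely. A sketch of an accumulate-and-consume alternative, using hypothetical pendingDx/pendingDy fields; the UI thread adds each delta in, and updateLogic() drains whatever has accumulated exactly once:
// shared between the UI thread and the GL thread
private final Object lock = new Object();
private float pendingDx, pendingDy;

// UI thread, inside ACTION_MOVE:
synchronized (lock) {
    pendingDx += event.getX() - oldXPosition;
    pendingDy += event.getY() - oldYPosition;
}
oldXPosition = event.getX();
oldYPosition = event.getY();

// GL thread:
public void updateLogic() {
    float dx, dy;
    synchronized (lock) {
        dx = pendingDx;
        dy = pendingDy;
        pendingDx = 0f; // consume the accumulated delta exactly once
        pendingDy = 0f;
    }
    testSprite.x += dx;
    testSprite.y += dy;
}
Every pixel of finger movement is then applied exactly once, however the two threads interleave, which removes the wandering without the choppiness of sampling positions inside updateLogic().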

Implement different clickable Area on an image-page through image map

[Figure: an image-page with letters laid out four per row (A B C D / E F G H / ...), with a popup box reading "image A is clicked"]
I am working on a book app for Android. In my book app there are many image-pages. On the first image-page there are 26 letters (A-Z), four letters in a row (as shown in the figure above).
My Problem is:
I want to develop a clickable area for each letter (A-Z). When the user clicks or touches any letter, a square box should appear, like "image A is clicked" (shown in the figure above).
On every touch or click on a letter image (A-Z) I will play the corresponding sound. That means I want to call a different listener for each different area on a single image-page.
I have no idea how to implement clickable areas on an image such that the clickable coordinates don't vary from emulator to emulator.
Please provide some sample code, an idea, or a reference link.
Thanks in advance.
I remember answering a very similar question just a few days ago. In order to implement this as a single image, you'll need to know which areas of the image mean what (i.e. given the coordinates of a touch event, which action those coordinates correspond to). Provided you have this info, you can then register an OnTouchListener with your ImageView:
imageView.setOnTouchListener(new OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        int x = (int) event.getX();
        int y = (int) event.getY();
        // now deal with coordinates (x, y)
        return true;
    }
});
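Since the letters sit in a regular grid of four per row, "dealing with the coordinates" can be a simple division rather than a hand-maintained image map. A minimal sketch under that assumption, passing the view's own width and height so the areas scale with the screen (COLS, ROWS, and letterAt are hypothetical names):
private static final int COLS = 4;
private static final int ROWS = 7; // a 4 x 7 grid covers 26 letters, last row partly empty

private char letterAt(int x, int y, int viewWidth, int viewHeight) {
    int col = Math.max(0, Math.min(COLS - 1, x * COLS / viewWidth));
    int row = Math.max(0, Math.min(ROWS - 1, y * ROWS / viewHeight));
    int index = row * COLS + col;
    if (index > 25) return 0; // touch landed on an empty cell
    return (char) ('A' + index);
}
Inside onTouch you would then call letterAt(x, y, v.getWidth(), v.getHeight()) and play the sound for the returned letter. Because everything is computed as a fraction of the view size, the clickable areas do not vary from emulator to emulator.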
However, in your case I would suggest using a layout with 26 buttons and just using an OnClickListener with each of them. You can easily set the buttons up in the shape you want, and you can set individual background drawables on them so it looks like one whole image.

Problems with touch event

We know that the touch-screen method onTouchEvent(MotionEvent event) lets you capture touch events on the screen, and that event.getX() and event.getY() give the coordinates of the touch as float values (i.e. values with a decimal point).
Watching logcat while holding a finger perfectly still on the phone screen, I noticed that not only MotionEvent.ACTION_DOWN is reported but also MotionEvent.ACTION_MOVE, with many different coordinates. I can understand this: the finger's surface touches many points, and as the device reads more coordinates it believes the finger is moving even while I hold it still.
My problem is that I would like to read the color of one pixel of a color image while holding my finger still. In particular, I need the central point of the area occupied by the finger.
First, assuming I have a Bitmap object, I have to round the touch coordinates to integers, since getPixel(int x, int y) wants integer coordinates in order to return the pixel color.
Then, how do I get the central point of the touch? Is there a function that can help me? Keep in mind that after stopping to read the value of a pixel on the image, I then start to move slowly, and on every move I still want to track the central pixel. I need this because on every move, if the pixel color under the finger changes, I want the device to vibrate or not.
If I were standing still, I could imagine storing the coordinates in a dynamic array and searching for their centroid, but while moving I do not know what to do. I also think that the centroid search, if not done very efficiently, could be slow during the move and then give wrong results. If the centroid is not feasible, any other point belonging to the contact area would be enough for me.
I hope you can help me, maybe with a few sketched lines of code.
Thanks in advance.
To get the touch position of the finger, just cast getX() and getY() to int and perform your desired tasks.
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        getPixelAt((int) event.getX(), (int) event.getY());
    }
    return true;
}
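For the vibrate-on-color-change part of the question, here is a sketch of what getPixelAt() might do, assuming this lives in a custom View (so getContext() supplies the Context), a bitmap field holding the image at the size it is displayed, and a lastColor field for the previous sample (all hypothetical). The coordinates are clamped so getPixel() is never called out of bounds, and android.permission.VIBRATE is needed in the manifest:
private Bitmap bitmap;  // assumed: the displayed image, at view size
private int lastColor;  // previous sampled color

private void getPixelAt(int x, int y) {
    // clamp to the bitmap so getPixel() never goes out of bounds
    x = Math.max(0, Math.min(bitmap.getWidth() - 1, x));
    y = Math.max(0, Math.min(bitmap.getHeight() - 1, y));
    int color = bitmap.getPixel(x, y);
    if (color != lastColor) {
        lastColor = color;
        Vibrator v = (Vibrator) getContext().getSystemService(Context.VIBRATOR_SERVICE);
        v.vibrate(50); // short pulse when the color under the finger changes
    }
}
As for the central point: the coordinates that getX()/getY() report are already the device's estimate of the center of the contact area, so a hand-rolled centroid over many sampled points is usually unnecessary; handling ACTION_MOVE the same way as ACTION_DOWN covers the slow-drag case too.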
You can put this in your main Activity class, wherever you want the touch event to be handled. Check whether you have the following code in place or not; the return statement is most important as well.
@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        mDistanceX = 0;
    }
    return super.onTouchEvent(event);
}

Is it possible to move components around the screen using the standard android apis?

I would like to produce an Android user interface which allows the user to move added components/widgets around the screen by selecting them and then dragging them around.
Is this possible using the standard Android APIs?
Yes. It depends on what you are trying to achieve.
It can be done using the standard APIs, but this functionality is not part of the standard APIs. That is, there is no widget.DragOverHere() method unless you write one.
That said, it would not be terribly complicated to do. At a minimum, you would need to write a custom subclass of View and implement two methods, onDraw(Canvas c) and onTouchEvent(MotionEvent e). A rough sketch:
class MyView extends View {
    int x, y;       // the x-y coordinates of the icon (top-left corner)
    Bitmap bitmap;  // the icon you are dragging around

    MyView(Context context) {
        super(context);
    }

    @Override
    protected void onDraw(Canvas c) {
        c.drawBitmap(bitmap, x, y, null);
    }

    @Override
    public boolean onTouchEvent(MotionEvent e) {
        switch (e.getAction()) {
            case MotionEvent.ACTION_DOWN:
                // maybe use a different bitmap to indicate 'selected'
                break;
            case MotionEvent.ACTION_MOVE:
                x = (int) e.getX();
                y = (int) e.getY();
                break;
            case MotionEvent.ACTION_UP:
                // switch back to 'unselected' bitmap
                break;
        }
        invalidate(); // redraw the view
        return true;
    }
}
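To try the sketch out, you could set it as the activity's content view (DragActivity is a hypothetical name; the bitmap field would of course need to be initialized with a real Bitmap first):
public class DragActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(new MyView(this)); // the view now receives all touch events
    }
}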
To some degree. It depends on what degree of freedom you want to give the user. One solution that comes to mind is a TableLayout with predefined cells. A user can then tap and drag components. When dragging a component, you'll just want to draw an image of it under the user's finger, but when they release, remove the component from its previous parent and add it to the new parent using the ViewGroup add/remove methods.
While you can programmatically shift Views around, it would be difficult/impossible to switch layouts on the fly as far as I can see.
The way I have done this is to have an absolute layout and then just update the object's position as it is dragged; you have to implement the dragging yourself. If the widgets have a natural order (say, a toolbar where you want to be able to drag buttons around), you can stack an absolute layout on top of the toolbar: when the drag starts, you add the widget to the absolute layout, and when it finishes, you add it back to the original layout in the new position.
