First-person shooter on Android – how to process mouse events? - android

As a hobby, and to learn whether this is even possible, I'm trying to implement a simple first-person shooter for Android. Unfortunately, I ran into a dead end when dealing with mouse event processing.
I already have an onGenericMotion() listener, which processes MotionEvent objects generated by the system framework. The problem is that MotionEvent objects generated by a mouse contain only absolute coordinates, which get "stuck" once the cursor reaches an edge or a corner of the screen. So I'm thinking of using relative mouse coordinates. While I found no feature on MotionEvent that could deliver relative movements, running
adb shell su -- getevent -lt /dev/input/event3
and examining its output revealed that the kernel generates distinct relative motion events when one tries to move the cursor, even when it is stuck in a corner of the screen. So, given that my shooter has su access, I could obtain relative movements.
And now the question: a little bit of Google-fu revealed that in many first-person shooters, the typical mouse movement is achieved by
using relative mouse coordinates and
by re-positioning the mouse cursor in the center of the screen.
So, the question really is twofold:
Is it possible to re-position the mouse cursor in the center of the screen on Android? and
If not, can the typical "first-person-shooter" mouse movement be realised by using relative mouse movement information alone?

This is now possible without root permissions, with the pointer capture API in Android 8.0+ (released August 2017). In summary:
To request pointer capture, call the requestPointerCapture() method on the view.
Once the request to capture the pointer is successful, Android calls onPointerCaptureChange(true), and starts delivering the mouse events.
Your focused view can handle the events by performing one of the following tasks:
If you're using a custom view, override onCapturedPointerEvent(MotionEvent).
Otherwise, register an OnCapturedPointerListener.
The view in your app can release the pointer capture by calling releasePointerCapture().
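The steps above boil down to feeding relative deltas into your camera's look angles. Below is a minimal, off-device sketch (plain Java, no Android classes) of the update an onCapturedPointerEvent(MotionEvent) handler could perform; in captured mode the event's getX()/getY() report relative deltas. The class name, method names, and the sensitivity value are illustrative, not part of any Android API:

```java
// Sketch of the look-angle bookkeeping behind a captured-pointer handler.
// Feed it the relative deltas (dx, dy) from each captured mouse event.
public class FpsLook {
    private float yawDeg = 0f;    // horizontal look angle
    private float pitchDeg = 0f;  // vertical look angle
    private final float sensitivity = 0.1f; // degrees per pixel of mouse travel

    public void onMouseDelta(float dx, float dy) {
        yawDeg = (yawDeg + dx * sensitivity) % 360f;
        // Clamp pitch so the camera cannot flip over, as most shooters do.
        pitchDeg = Math.max(-89f, Math.min(89f, pitchDeg - dy * sensitivity));
    }

    public float getYawDeg() { return yawDeg; }
    public float getPitchDeg() { return pitchDeg; }
}
```

Because only deltas are consumed, the cursor's absolute position never matters, so the "stuck at the screen edge" problem disappears once capture is active.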

It turns out that this completely answers my question. Never mind then.

Related

Multi-touch drawing : action_cancel when the palm touches the screen

I'm working on an Android application on a tablet to draw every point touched by the finger or the hand. It is very important for the application to track not only the fingers but also the palm (even if it doesn't draw the exact touched area but only a thin path; as long as we can see the movement of the palm, it's fine). A good example of what I want to do is the "Show pointer location" option in the Android developer menu: if the drawing could be exactly like that, it would be perfect.
I managed to code a multi-touch drawing app, but the problem I have is that every time the touched area is too wide (for example, when I touch the screen with my entire palm), the application stops drawing (even the finger drawings stop) and I get this error in the log: [ViewRootImpl] action cancel - 1, eccen:1.4225352 (the number after "eccen" changes depending on the size of my palm touch).
I'm quite new to Android. I spent a lot of time searching for a way to prevent this ACTION_CANCEL, but I couldn't find anything that works. I tried to prevent the parent views from taking control of the onTouch events, but it didn't work. So if you have any idea of how I can manage that, it would be great :)
Thanks !
(English is not my native language, so don't hesitate to ask me to reformulate if something is not clear.)

Implement something comparable to Android's pattern-lock system

Suppose that my display consists of the numbers 1-9 displayed via a 3x3 grid of Buttons, like on a num-pad, with one number placed on each button. I want to be able to detect complex movements, such as the tracing of a Z, which would yield [7 8 9 5 1 2 3], or a top-left to bottom-right swipe, which would yield [7 5 3]. I read that I could accomplish this by overriding boolean onTouchEvent(MotionEvent event), but I'm not certain that I understand how this works. From what I can tell from the Android API for MotionEvents, whenever an object on the screen is touched, whether one's finger is still tracing or lifting up, the event is captured by ACTION_UP. If that is true,
How do I get the history or stack trace, if you will? Can I do this in real time, or do I have to wait for the drag to complete?
How do I actually highlight/draw the path that their finger/stylus is tracing (in real time)? I don't want their precise trace; rather, I'd like to be able to normalize it via straight-line paths, like Android implements in its pattern-lock system.
Thanks in advance.
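The sequence-detection part of the question can be sketched in plain Java: on every onTouchEvent() sample (ACTION_DOWN and each ACTION_MOVE), map the (x, y) coordinate to a grid cell and append its number when it changes. The class name, the 300x300 px grid size in the usage note, and the num-pad numbering are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper that turns a stream of (x, y) touch samples into a
// num-pad-style sequence such as [7, 8, 9, 5, 1, 2, 3].
public class PatternTracker {
    private final float cellW, cellH;
    private final List<Integer> sequence = new ArrayList<>();

    public PatternTracker(float gridWidth, float gridHeight) {
        this.cellW = gridWidth / 3f;
        this.cellH = gridHeight / 3f;
    }

    // Num-pad layout: the top row holds 7 8 9, the bottom row holds 1 2 3.
    private int numberAt(float x, float y) {
        int col = Math.min(2, (int) (x / cellW));
        int row = Math.min(2, (int) (y / cellH));
        return (2 - row) * 3 + col + 1;
    }

    // Feed every touch sample; consecutive samples that stay inside the
    // same cell are recorded only once.
    public void addSample(float x, float y) {
        int n = numberAt(x, y);
        if (sequence.isEmpty() || sequence.get(sequence.size() - 1) != n) {
            sequence.add(n);
        }
    }

    public List<Integer> getSequence() { return sequence; }
}
```

Tracing a Z across a 300x300 grid, sample by sample, yields exactly the [7, 8, 9, 5, 1, 2, 3] sequence from the question, and since the list grows as the events arrive, the detection runs in real time rather than waiting for the drag to complete.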

Touchscreen sensitivity

I'm trying to make an Android app, only for tablets, which will draw lines as and where the user touches the screen. It is very simple, and there are a lot of apps like this. I have a doubt regarding the touch-screen technology: is it possible for the lines to be faint when the user touches the screen softly, and thicker when the user presses harder? Is it even possible to do such things on a tablet? I don't have info about the hardware and technology used in tablets, so please guide me with a valid answer, and please refer me to any blogs or docs about touch-sensing technology.
Thank you
You can use the onTouch() input event (http://developer.android.com/guide/topics/ui/ui-events.html), which is triggered when you touch the screen. Inside the view in your application, register an OnTouchListener using setOnTouchListener() with a callback to the function that will handle the event.
Inside your callback get the pressure properties:
public float pressure (added in API level 9): a normalized value that describes the pressure applied to the device by a finger or other tool. The pressure generally ranges from 0 (no pressure at all) to 1 (normal pressure), although values higher than 1 may be generated depending on the calibration of the input device.
http://developer.android.com/reference/android/view/MotionEvent.PointerCoords.html#pressure
When you touch the screen you get a MotionEvent, which has many methods; one of them gives you the pressure:
http://developer.android.com/reference/android/view/MotionEvent.html
final float getPressure()
final float getPressure(int pointerIndex)
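Once you have the pressure reading, the soft-vs-hard effect is just a mapping from pressure to stroke width (and optionally alpha). A minimal sketch in plain Java, with illustrative min/max widths; note the clamp, since the docs quoted above say real devices may report values above 1:

```java
// Sketch: map a MotionEvent pressure reading to a stroke width and alpha.
public class PressureStroke {
    private static final float MIN_WIDTH = 1f;   // px at zero pressure
    private static final float MAX_WIDTH = 12f;  // px at full pressure

    public static float strokeWidth(float pressure) {
        float p = Math.max(0f, Math.min(1f, pressure)); // clamp to [0, 1]
        return MIN_WIDTH + p * (MAX_WIDTH - MIN_WIDTH);
    }

    public static int strokeAlpha(float pressure) {
        float p = Math.max(0f, Math.min(1f, pressure));
        return Math.round(64 + p * (255 - 64)); // dull when soft, opaque when hard
    }
}
```

In the onTouch() callback you would call these with event.getPressure() and apply the results to your Paint before drawing the segment.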

Moving OpenGL objects with fingers

I'm developing an Android application with OpenGL and JNI (all OpenGL stuff is in C code).
Imagine I've drawn a cube. I want the user to be able to put a finger on the cube, rotate it, and move it around the screen.
Is there any way to do that?
How can I assign an event listener for touch and move events only when the user touches the cube?
UPDATE I want something like this:
Rotate cube with fingers
Thanks.
This is called "picking" in 3d-ville... There are a number of tutorials on the subject floating hither and yon. There's even another question (sans the JNI spin) here on StackOverflow.
Also, check out this google IO video on developing android games to see why your approach may not be faster than pure Java... It Depends.
It turns out that JNI calls are Quite Expensive, so a JNI-based renderer could end up slower than a pure-java one unless you are Very Careful. YM Will V.
I'm pretty sure you'll have to listen to all touches then change behavior based on what is being touched. I suppose you could compute your cube's bounding box on screen and then monkey with your listeners every frame (or every time it moves), but I seriously doubt that would be the most efficient course to take. Listen for all touches, react appropriately.
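The "listen to all touches, then decide" approach reduces to a hit test: keep the cube's projected screen-space bounding box up to date and check each touch against it before reacting. A minimal off-device sketch, with a hypothetical class name and hand-set bounds standing in for whatever your projection code produces:

```java
// Sketch: route a touch to the cube only if it lands inside the cube's
// current screen-space bounding box.
public class TouchPicker {
    private float left, top, right, bottom; // cube's projected bounding box

    // Update from your render loop whenever the cube moves on screen.
    public void setBounds(float l, float t, float r, float b) {
        left = l; top = t; right = r; bottom = b;
    }

    // Returns true if this touch should be handled as a cube interaction.
    public boolean hitsCube(float x, float y) {
        return x >= left && x <= right && y >= top && y <= bottom;
    }
}
```

A bounding-box test is a cheap first pass; for precise selection against the actual geometry you would follow it with the proper "picking" techniques the answer mentions, such as ray casting into the scene.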

Android - Detect Path of Complex Drag

I am recently getting into Android programming and want to make a simple game using 2D canvas drawing. I have checked out the Lunar Lander example and read up on some gestures, but it looks like I can only detect if a gesture occurred. I am looking to do a little more complicated detection on a swipe:
I want to make a simple game where a user can drag their finger through one or more objects on the screen and I want to be able to detect the objects that they went over in their path. They may start going vertically, then horizontally, then vertically again, such that at the end of a contiguous swipe they have selected 4 elements.
1) Are there APIs that expose the functionality of getting the full path of a swipe like this?
2) Since I am drawing on a Canvas, I don't think I will be able to access things like "onMouseOver" for the items in my game. I will have to instead detect if the swipe was within the bounding box of my sprites. Am I thinking about this correctly?
If I have overlooked an obvious post, I apologize. Thank you in advance!
I decided to implement the
public boolean onTouchEvent(MotionEvent event)
handler in my code for my game. Instead of getting the full path, I do a check to see which tile the user is over each time the onTouchEvent fires. I previously thought this event fired only once on the first touch, but it fires as long as you are moving along the surface of the screen, even if you haven't retouched.
