Accessing event input nodes in Android without rooting

I want to be able to inject different events into an Android device. After some searching, I found that I can do this by accessing the event input nodes in the Android OS, which are found at /dev/input/eventX. Once these are accessed, read and write operations can take place, and hence I can inject events.
The problem is that these nodes are only accessible on a rooted device. If I try to use them without rooting, the process fails, as mentioned in this article:
http://www.pocketmagic.net/programmatically-injecting-events-on-android-part-2/
I don't want to root the device, to preserve its warranty. I've searched the web for possible ways to access these nodes, but all I found was rooting.
An alternative that I think might work is compiling the application as a system application, but I couldn't find out whether this would give it access (both read and write privileges) to the event input nodes. Will this method provide these privileges?
If not, is there any alternative to rooting that would give system permissions to my application without rooting the device?
Any help is appreciated.
Thanks.
EDIT: To elaborate more, I want to inject different touch events. For example, single touch, swipe, etc.

You can inject input events on a device by executing the /system/bin/input utility that ships with Android. You can see some examples of it being used (via adb) in this question. The input utility does not appear to need any special privileges to execute.
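For illustration, typical invocations from a host machine look like this (the coordinates are arbitrary, and the trailing swipe duration in milliseconds is only accepted on newer Android builds):

adb shell input tap 300 600
adb shell input swipe 100 500 400 500 300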
To create a system application, you need access to the signing keys used when the Android OS for your device was built - you can't just modify an ordinary app to give it system privileges. Even if you could, it wouldn't give you root access (although you could probably make it part of the input user group, which the /dev/input/eventX devices also appear to grant access to).
If you want to inject touch events, you can either execute the /system/bin/input utility using the exec() method of the Java Runtime class or just use the injectMotionEvent() method in InputManager.
Below is a method taken from the Android source showing how to inject a MotionEvent - you can view the full source for more info.
/**
 * Builds a MotionEvent and injects it into the event stream.
 *
 * @param inputSource the InputDevice.SOURCE_* sending the input event
 * @param action the MotionEvent.ACTION_* for the event
 * @param when the value of SystemClock.uptimeMillis() at which the event happened
 * @param x x coordinate of event
 * @param y y coordinate of event
 * @param pressure pressure of event
 */
private void injectMotionEvent(int inputSource, int action, long when,
        float x, float y, float pressure) {
    final float DEFAULT_SIZE = 1.0f;
    final int DEFAULT_META_STATE = 0;
    final float DEFAULT_PRECISION_X = 1.0f;
    final float DEFAULT_PRECISION_Y = 1.0f;
    final int DEFAULT_DEVICE_ID = 0;
    final int DEFAULT_EDGE_FLAGS = 0;
    MotionEvent event = MotionEvent.obtain(when, when, action, x, y, pressure,
            DEFAULT_SIZE, DEFAULT_META_STATE, DEFAULT_PRECISION_X,
            DEFAULT_PRECISION_Y, DEFAULT_DEVICE_ID, DEFAULT_EDGE_FLAGS);
    event.setSource(inputSource);
    Log.i(TAG, "injectMotionEvent: " + event);
    InputManager.getInstance().injectInputEvent(event,
            InputManager.INJECT_INPUT_EVENT_MODE_WAIT_FOR_FINISH);
}
These methods only allow you to inject events into your own app windows.
If you want to inject events into windows not owned by your app, you need to declare additional permissions (READ_INPUT_STATE and INJECT_EVENTS) in your app manifest and sign your app with the Android OS signing keys. In other words, the permissions needed to inject events into other apps are never granted to ordinary apps (for obvious reasons).

Related

How to detect my device's left/right audio balance?

Is there a way to programmatically detect the current device's audio balance?
On Android 10, the setting I want to detect is under:
Settings -> Accessibility -> Audio & On-Screen Text -> Audio balance
By changing this balance, we as users can make either the left or the right side of the stereo output louder.
Can I somehow retrieve this value in my app?
This setting exists system-wide only since Android Q; the relevant constant in Settings.java is:
/**
 * Master balance (float -1.f = 100% left, 0.f = dead center, 1.f = 100% right).
 *
 * @hide
 */
public static final String MASTER_BALANCE = "master_balance";
This global setting can only be read (not tested, but something like this):
Settings.Global.getFloat(context.getContentResolver(), Settings.Global.MASTER_BALANCE, 0f)
I could not find any public API for the system-wide setting (which means that only the DevicePolicyManager could change it), but one can use MediaPlayer.setVolume(float leftVolume, float rightVolume) to control the balance for playback within one's own application.
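Putting both halves together, an untested sketch (the MASTER_BALANCE constant is hidden, so the raw key string is passed; R.raw.sample is a hypothetical resource):

// Read the hidden system-wide balance; falls back to 0 (centered) if unset.
float balance = Settings.Global.getFloat(
        context.getContentResolver(), "master_balance", 0f); // -1..1

// Per-app workaround: mirror the balance on our own playback.
MediaPlayer player = MediaPlayer.create(context, R.raw.sample); // hypothetical resource
float left = Math.min(1f, 1f - balance);   // +1 (full right) silences the left channel
float right = Math.min(1f, 1f + balance);  // -1 (full left) silences the right channel
player.setVolume(left, right);
player.start();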

Is it possible to make an Android WebView-based app less sensitive in interpreting taps as small drags?

I have a C# Xamarin Android app that hosts a reactjs app in a webview.
When using this app on a touch-screen Android device, it appears that tapping the screen is occasionally ignored.
What appears to be going on is that the tap is interpreted as a mini drag event, because there was some small directional movement during the tap.
Looking at the android logs, for failed taps, I noticed output like the following:
adb -d logcat -s CustomFrequencyManagerService
06-19 13:35:49.225 2945 9989 D CustomFrequencyManagerService: acquireDVFSLockLocked : type : DVFS_MIN_LIMIT frequency : 839000 uid : 1000 pid : 2945 pkgName : GESTURE_DETECTED#CPU_MIN#49
06-19 13:35:49.781 2945 2945 D CustomFrequencyManagerService: releaseDVFSLockLocked : Getting Lock type frm List : DVFS_MIN_LIMIT frequency : 839000 uid : 1000 pid : 2945 tag : GESTURE_DETECTED#CPU_MIN#49
Note the GESTURE_DETECTED part of the log entry.
However for successful taps, CustomFrequencyManagerService has no output in the log.
Looking at this from the reactjs app perspective:
I noticed that the failed taps emit the following events:
touchstart
touchend
While the normal successful events are:
touchstart
touchend
mousedown
blur
mouseup
click
I could potentially change the reactjs app to respond directly to touch events instead of click events, but I was wondering if there is a way (hopefully programmatic, via the Android app) to alter the sensitivity with regard to what's interpreted as a drag as opposed to a click?
By installing an IOnTouchListener on the Android.WebKit.WebView
_webView.SetOnTouchListener(new GestureIgnoreTouchListener());
I was able to see at what movement threshold a click turned into a drag.
public class GestureIgnoreTouchListener : Java.Lang.Object, Android.Views.View.IOnTouchListener
{
    float _x;
    float _y;

    public bool OnTouch(Android.Views.View v, MotionEvent e)
    {
        if (e.Action == MotionEventActions.Down)
        {
            _x = e.RawX;
            _y = e.RawY;
            return false;
        }
        if (e.Action == MotionEventActions.Up)
        {
            var diffX = e.RawX - _x;
            var diffY = e.RawY - _y;
            var distance = Math.Sqrt(Math.Pow(diffX, 2) + Math.Pow(diffY, 2));
            // Observed:
            // if distance is 10 or less, this is interpreted as a click;
            // if distance is 12 or greater, no click is emitted.
            Console.WriteLine(distance);
            return false;
        }
        return false;
    }
}
Ideally, if the distance is between 10 and 50, I would like this to be considered a click, not a drag. I could possibly create a synthetic click event in this case, but I'm hoping I can somehow influence whatever Android code is responsible for interpreting this as a drag.
There are two approaches I've seen people use for this situation. The first you already mentioned: tell React, when you're in a touch-screen environment, to use tap events instead of click events.
The second is to take into account what Android refers to as "touch slop":
From https://developer.android.com/training/gestures/movement.html:
Because finger-based touch isn't always the most precise form of interaction, detecting touch events is often based more on movement than on simple contact. To help apps distinguish between movement-based gestures (such as a swipe) and non-movement gestures (such as a single tap), Android includes the notion of "touch slop". Touch slop refers to the distance in pixels a user's touch can wander before the gesture is interpreted as a movement-based gesture. For more discussion of this topic, see Managing Touch Events in a ViewGroup.
Google provides an example of one way to deal with it in a different context here: https://developer.android.com/training/gestures/viewgroup#vc which you could probably adapt to your situation.
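If the framework threshold can't be influenced, a third option is the synthetic click the question already contemplates. Below is a rough sketch (in Java; the question's Xamarin listener maps across directly) that fires a DOM click only in the 10-50 px band where the native click appears to be suppressed. The two thresholds, the webView variable, and the elementFromPoint dispatch are all assumptions, not tested values:

// Assumed thresholds from the question's observations: under ~10 px the
// WebView already emits a native click, so only synthesize for 10-50 px
// to avoid double clicks.
final float DRAG_MIN_PX = 10f;
final float CLICK_RADIUS_PX = 50f;
webView.setOnTouchListener(new View.OnTouchListener() {
    private float downX, downY;

    @Override
    public boolean onTouch(View v, MotionEvent e) {
        switch (e.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = e.getRawX();
                downY = e.getRawY();
                break;
            case MotionEvent.ACTION_UP:
                double dist = Math.hypot(e.getRawX() - downX, e.getRawY() - downY);
                if (dist >= DRAG_MIN_PX && dist < CLICK_RADIUS_PX) {
                    // Fire a DOM click at the touch point (view px -> CSS px;
                    // ignores page zoom and scroll for simplicity).
                    float density = v.getResources().getDisplayMetrics().density;
                    int cx = (int) (e.getX() / density);
                    int cy = (int) (e.getY() / density);
                    ((WebView) v).evaluateJavascript(
                            "var el = document.elementFromPoint(" + cx + "," + cy + ");"
                            + " if (el) el.click();", null);
                }
                break;
        }
        return false; // never consume; the WebView still handles the gesture
    }
});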

How to detect whether the device is moving or not using sensors on Android

I am using the code below to detect movement of the device, i.e. I would like to know whether the device is moving or not. I have also used the Google Activity Recognition API, which provides different activity modes like WALKING, ON_FOOT, STILL, etc. without using GPS. I would like to achieve the same with sensors, but I am not able to get it accurate.
The issue with the following code is that as soon as I move the device quickly, for example picking it up from a table, I get the result that it is moving, even though it is not actually going anywhere.
// Called from onSensorChanged, using the TYPE_ACCELEROMETER sensor.
double speed = getAccelerometer(event.values);
// Then checking the speed:
if (speed > 0.9 && speed < 1.1) {
    // device is not moving
} else {
    // device is moving
}

/**
 * @return the magnitude of the acceleration vector, in units of g
 */
private double getAccelerometer(float[] values) {
    // Movement
    float x = values[0];
    float y = values[1];
    float z = values[2];
    float accelerationSquareRoot =
            (float) ((x * x + y * y + z * z) / (9.80665 * 9.80665));
    return Math.sqrt(accelerationSquareRoot);
}
Can anyone guide me on how to make this logic accurate, so that I can tell whether the device is moving or not?
The accelerometer returns acceleration data, and by Newton's laws a body whose net acceleration is zero is either not moving or moving with constant velocity (the latter is quite unlikely in your case).
Therefore, if you keep reading the same data on all three axes (or better, data within a fairly strict range) from the accelerometer over time, it means the phone is not moving; otherwise it is.
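A minimal sketch of that idea, comparing the magnitude of the acceleration vector over a sliding window; WINDOW and THRESHOLD are invented tuning values you would need to calibrate:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class MovementDetector implements SensorEventListener {
    private static final int WINDOW = 50;        // samples compared (assumed)
    private static final float THRESHOLD = 0.3f; // max spread in m/s^2 (assumed)
    private final float[] magnitudes = new float[WINDOW];
    private int index, filled;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // Compare the vector magnitude so device orientation does not matter.
        magnitudes[index] = (float) Math.sqrt(x * x + y * y + z * z);
        index = (index + 1) % WINDOW;
        if (filled < WINDOW) { filled++; return; }

        float min = Float.MAX_VALUE, max = -Float.MAX_VALUE;
        for (float m : magnitudes) {
            if (m < min) min = m;
            if (m > max) max = m;
        }
        boolean stationary = (max - min) < THRESHOLD;
        // stationary: all recent readings sat in a tight band, i.e. not moving
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}

Register an instance with SensorManager for Sensor.TYPE_ACCELEROMETER; because a whole window of samples must settle, a brief spike from picking the phone up only reports movement transiently.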
For this purpose you should use the Activity Recognition API, which will provide you with events like moving, stopped, driving, etc. Activity recognition uses sensor data, with help from the location service when it is running. For more on what it is and how to use it, see the link below:
https://developers.google.com/location-context/activity-recognition/
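A minimal sketch of that API, assuming the play-services-location dependency, the ACTIVITY_RECOGNITION runtime permission on Android 10+, and a PendingIntent wired to this receiver (registration shown in the trailing comment):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

public class ActivityReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (ActivityRecognitionResult.hasResult(intent)) {
            DetectedActivity activity = ActivityRecognitionResult
                    .extractResult(intent).getMostProbableActivity();
            // STILL with high confidence => the device is not moving.
            boolean still = activity.getType() == DetectedActivity.STILL
                    && activity.getConfidence() > 75;
        }
    }
}

// Registration elsewhere (e.g. in an Activity or Service):
// ActivityRecognition.getClient(context)
//         .requestActivityUpdates(10_000 /* interval ms */, pendingIntent);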

How to inject input events into your own application on Android?

I'm looking for a way to inject input events, such as touch or keyboard events, into my own application, from code within that app. NOT into the Android OS or other apps, because that needs the signature-level permission android.permission.INJECT_EVENTS or a rooted device, and I have to avoid both. I'm looking for code that runs on a non-rooted device without any elevated rights.
I'm making some progress with code based on the following concept:
View v = findViewById(android.R.id.content).getRootView();
v.dispatchTouchEvent(e);
I think getting the root view and dispatching the event to it should work well; if you hit any problem with it, please describe it.
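A minimal sketch of that concept, assuming it runs on the UI thread of your own Activity (the coordinates are arbitrary):

// Build and dispatch a full down/up tap into our own view hierarchy.
View root = findViewById(android.R.id.content).getRootView();

long downTime = SystemClock.uptimeMillis();
MotionEvent down = MotionEvent.obtain(downTime, downTime,
        MotionEvent.ACTION_DOWN, 250f, 400f, 0);
down.setSource(InputDevice.SOURCE_TOUCHSCREEN);
root.dispatchTouchEvent(down);
down.recycle();

MotionEvent up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
        MotionEvent.ACTION_UP, 250f, 400f, 0);
up.setSource(InputDevice.SOURCE_TOUCHSCREEN);
root.dispatchTouchEvent(up);
up.recycle();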
Or there is a trickier, hidden-API way to do this:
In View.java there is a hidden API called getViewRootImpl(), which returns the android.view.ViewRootImpl object for this Activity's window, and it has a method called enqueueInputEvent(InputEvent event). You can use reflection to get at this method.
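A rough sketch of the reflection route; note that on newer Android versions this hidden method may be blocked by the non-SDK interface restrictions, and activity and motionEvent are placeholders:

// Requires: java.lang.reflect.Method, android.view.InputEvent, android.view.View
View root = activity.findViewById(android.R.id.content).getRootView();
try {
    Method getViewRootImpl = View.class.getDeclaredMethod("getViewRootImpl");
    getViewRootImpl.setAccessible(true);
    Object viewRootImpl = getViewRootImpl.invoke(root);
    Method enqueueInputEvent = viewRootImpl.getClass()
            .getMethod("enqueueInputEvent", InputEvent.class);
    enqueueInputEvent.setAccessible(true);
    enqueueInputEvent.invoke(viewRootImpl, motionEvent); // an event you built
} catch (ReflectiveOperationException ex) {
    // Hidden API unavailable - fall back to root.dispatchTouchEvent(motionEvent)
}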
As I have just finished a project that uses a keyboard to send touch commands to the screen, I think multi-touch may be a problem when injecting these kinds of events into an Activity. When you send another finger to the screen, the event action must be ACTION_POINTER_DOWN with the pointer index shifted into it, and you must populate the properties and coordses arrays with the info for all fingers each time, even if you only need to move one finger. The finger id stored in properties[i].id identifies the finger, but the index within the array is not stable.
for (int i = 0; i < size; i++) {
    properties[i].id = fingerId;
    properties[i].toolType = MotionEvent.TOOL_TYPE_FINGER;
    coordses[i].x = currentX;
    coordses[i].y = currentY;
    coordses[i].pressure = 1;
    coordses[i].size = DEFAULT_SIZE;
}
event = MotionEvent.obtain(
        mDownTime, SystemClock.uptimeMillis(),
        ((size - 1) << MotionEvent.ACTION_POINTER_INDEX_SHIFT)
                + MotionEvent.ACTION_POINTER_DOWN,
        size,
        properties, coordses, DEFAULT_META_STATE, DEFAULT_BUTTON_STATE,
        DEFAULT_PRECISION_X, DEFAULT_PRECISION_Y, DEFAULT_DEVICE_ID,
        DEFAULT_EDGE_FLAGS, InputDevice.SOURCE_TOUCHSCREEN, DEFAULT_FLAGS);
I managed to solve this in a way that handles not only the Activity root window but also windows created for various overlays such as menus and dialogs.
The code is here: https://github.com/TeskaLabs/Remote-Access-Android/blob/master/tlra/src/main/java/com/teskalabs/tlra/android/inapp/InAppInputManager.java
In your own application you can also just use the input CLI command:
public static void tapButton(int x, int y) throws IOException, InterruptedException {
    Runtime.getRuntime().exec("input tap " + x + " " + y);
}

Android ACTION_MOVE Threshold

I'm writing an app that involves writing on the screen using one's finger, or eventually a stylus. I have that part working. On ACTION_DOWN, starts drawing; on ACTION_MOVE, adds line segments; on ACTION_UP, finishes line.
The problem is that after ACTION_DOWN, apparently the pointer needs to move more than 10 pixels away from where it started (basically a 20x20 box around the starting point) in order to begin sending ACTION_MOVE events. After leaving the box, the move events are all quite accurate. (I figured out the 10 pixel thing by testing it.) Since this is meant to be used for writing or drawing, 10 pixels is a fairly significant loss: depending on how small you're trying to write, you can lose the first letter or two. I haven't been able to find anything about it - only a couple posts on a forum or two, like http://android.modaco.com/topic/339694-touch-input-problem-not-detecting-very-small-movements/page_pid_1701028#entry1701028. It seems to be present on some devices or systems and not others. No ideas as to how to get rid of it when you have it, though.
I'm using a Galaxy Tab 10.1, with Android 3.1. I've tried several different things to try to get rid of it: I've tried setting the event's coords to something else to see if I could trick it into thinking the cursor was in a different place; I tried re-dispatching the event with the coords changed (my handler reacted to the new points, but still didn't respond to movements in the 10-pixel radius.) I've searched through source code for any references to the effect, and found none (though I think it's from a different version of Android - code for 3.1 isn't released yet, is it?) I've searched for methods of querying the current state of the pointers, so I could just have a timer catch the changes until the pointer crossed the threshold. Couldn't find any way of getting pointer coords without a corresponding movement event. Nothing worked. Does anybody know anything about this, or have any ideas or work-arounds? Thank you.
-- Update: Drag and drop events show the same threshold.
I agree in part with the post by @passsy but come to a different conclusion. Firstly, as mentioned, mTouchSlop is the value we are interested in, and it is exposed via ViewConfiguration.get(context).getScaledTouchSlop();
If you check the Android source for ViewConfiguration, the default value for TOUCH_SLOP is 8dip, but the comments mention that this value is a fallback only; the actual value is defined when the Android OS for that specific device is built (it may be more or less than this default, which appears to hold true for the Galaxy Tab devices).
More specific to the code sample, the mTouchSlop value is read from the ViewConfiguration when the View is initialised, but the value is only accessed in the onTouchEvent method. If you extend View and override this method (without calling super) then the behaviour of mTouchSlop in the View class is no longer relevant.
More telling (to us) was that, with the Android developer setting that overlays touch events on the screen enabled, a touch with a small drag does not register as a motion event: the crosshairs drawn by the Android OS do not move. From this, our conclusion is that the minimal drag distance is enforced at the OS level, and your application will never be made aware of drag events smaller than the TOUCH_SLOP value. You should also be aware that TOUCH_SLOP should not be used directly: the API deprecates the getTouchSlop method and recommends getScaledTouchSlop, which takes the device screen size and pixel density into account. A side effect of this is that the actual minimum stroke length as perceived on different devices may vary; e.g. on a Galaxy Tab 2.0 7.0" it feels like we are able to draw shorter minimum strokes with the same code base than on a Galaxy Tab 2.0 10.1".
You should also be aware that (if you find a way to alter this value) this value determines how Android distinguishes between taps and strokes. That is, if your finger moves slightly while performing a tap, it will be interpreted as a tap if it moved less than TOUCH_SLOP, but as a stroke if it moved more. Therefore setting TOUCH_SLOP to a smaller value increases the chance that a tap will be interpreted as a stroke.
Our own conclusion is that this minimum distance is not something that can be changed in practice and is something we need to live with.
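If it really can't be changed, one drawing-side mitigation (our sketch, not from the original discussion) is to remember the ACTION_DOWN point and begin the stroke there when the first ACTION_MOVE finally arrives, so the slop distance isn't lost from the drawn line:

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Path;
import android.view.MotionEvent;
import android.view.View;

public class DrawView extends View {
    private final Path path = new Path();
    private final Paint paint = new Paint();
    private float downX, downY;
    private boolean strokeStarted;

    public DrawView(Context context) {
        super(context);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(4f);
    }

    @Override
    public boolean onTouchEvent(MotionEvent e) {
        switch (e.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                // Remember where the finger landed; MOVE events will only
                // start once the slop threshold has been exceeded.
                downX = e.getX();
                downY = e.getY();
                strokeStarted = false;
                return true;
            case MotionEvent.ACTION_MOVE:
                if (!strokeStarted) {
                    path.moveTo(downX, downY); // back-fill the lost slop distance
                    strokeStarted = true;
                }
                path.lineTo(e.getX(), e.getY());
                invalidate();
                return true;
        }
        return super.onTouchEvent(e);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        canvas.drawPath(path, paint);
    }
}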
The problem is on Line 6549 in class View https://github.com/android/platform_frameworks_base/blob/master/core/java/android/view/View.java
if (!pointInView(x, y, mTouchSlop)) {...}
/**
* Utility method to determine whether the given point, in local coordinates,
* is inside the view, where the area of the view is expanded by the slop factor.
* This method is called while processing touch-move events to determine if the event
* is still within the view.
*/
private boolean pointInView(float localX, float localY, float slop) {
    return localX >= -slop && localY >= -slop && localX < ((mRight - mLeft) + slop)
            && localY < ((mBottom - mTop) + slop);
}
mTouchSlop is set in the constructor
mTouchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
You can extend View and set mTouchSlop to zero; I don't see another way to set it. There is no function like getApplicationContext().setScaledTouchSlop(int n).
Extend the View class.
Override the pointInView method, without the "@Override" annotation (the original is private, so this shadows it rather than overriding), and force slop to 0:
public boolean pointInView(float localX, float localY, float slop) {
    slop = 0;
    return localX >= -slop && localY >= -slop && localX < (getWidth() + slop)
            && localY < (getBottom() + slop);
}
