Background
I want to send Unicode characters from the PC to an Android device via adb commands, as if they were being typed on a physical keyboard: characters from various languages, for example, and not just English.
The problem
Such a thing is impossible using the commands I've found, as they seem to support only a basic set of characters (probably only ASCII):
adb shell "input keyboard text 'This goes to Android device'"
Because of this, I've decided to request it to be supported, here (please consider starring).
As a workaround, I thought I might develop an app that uses an AccessibilityService to dispatch key events as if I were typing on the device, with the PC sending those events directly to the app via adb using an Intent.
Thing is, after creating the app, I can't find which function I should use to do it.
What I've found
There are multiple things I've found:
onAccessibilityEvent - this is not for dispatching. It's only for getting events, which I don't think I will even need in this case.
getSoftKeyboardController - a function that can help with hiding the automatically shown keyboard, but that's about it...
dispatchGesture - a function that seems to be used only for dispatching touch events. It seems quite cool, but I don't see that it can handle keys.
performGlobalAction - seems promising, but sadly supports a very limited set of operations (back-key, home-key, etc...).
findFocus - I think I could use this and then dispatch a key event on whatever it returns, but I'm not sure this is a valid way to do it, as I want to dispatch the event globally (plus it might return a null object, which means it might not be reliable). Not to mention that, judging by the options I see, it doesn't let me simply insert the text at the caret.
The question
Is it possible for AccessibilityService to dispatch a key event of Unicode characters, as if I type some text?
What's the best option to use for this?
This is an unconventional solution.
You can use the UI Automator framework to send Unicode characters through ADB to a focused text field, the way the input command does with ASCII characters (but fails to do with Unicode characters).
First, implement an Android automation test that is capable of receiving broadcasts. Broadcasts direct the test to perform certain tasks. The implementation below will clear text and enter text supplied either as plain Unicode or Base64-encoded. I should note that the test can act like a background server until it is stopped.
AdbReceiver.kt
package com.example.adbreceiver
/*
* Test that runs with a broadcast receiver that accepts commands.
*
* To start the test:
* adb shell nohup am instrument -w com.example.adbreceiver.test/androidx.test.runner.AndroidJUnitRunner
*
* On Windows, the code page may need to be changed to UTF-8 by using the following command:
* chcp 65001
*
*/
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import android.util.Base64
import androidx.test.core.app.ApplicationProvider
import androidx.test.ext.junit.runners.AndroidJUnit4
import androidx.test.filters.SdkSuppress
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.uiautomator.By
import androidx.test.uiautomator.UiDevice
import org.junit.Test
import org.junit.runner.RunWith
@RunWith(AndroidJUnit4::class)
@SdkSuppress(minSdkVersion = 18)
class AdbInterface {
private var mDevice: UiDevice? = null
private var mStop = false
private val ACTION_MESSAGE = "ADB_INPUT_TEXT"
private val ACTION_MESSAGE_B64 = "ADB_INPUT_B64"
private val ACTION_CLEAR_TEXT = "ADB_CLEAR_TEXT"
private val ACTION_STOP = "ADB_STOP"
private var mReceiver: BroadcastReceiver? = null
@Test
fun adbListener() {
mDevice = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation())
if (mReceiver == null) {
val filter = IntentFilter(ACTION_MESSAGE)
filter.addAction(ACTION_MESSAGE_B64)
filter.addAction(ACTION_CLEAR_TEXT)
filter.addAction(ACTION_STOP)
mReceiver = AdbReceiver()
ApplicationProvider.getApplicationContext<Context>().registerReceiver(mReceiver, filter)
}
try {
// Keep us running to receive commands.
// Really not a good way to go, but it works for the proof of concept.
while (!mStop) {
Thread.sleep(10000)
}
} catch (e: InterruptedException) {
e.printStackTrace()
}
}
fun inputMsg(s: String?) {
mDevice?.findObject(By.focused(true))?.setText(s)
}
internal inner class AdbReceiver : BroadcastReceiver() {
override fun onReceive(context: Context, intent: Intent) {
when (intent.action) {
ACTION_MESSAGE -> {
val msg = intent.getStringExtra("msg")
inputMsg(msg)
}
ACTION_MESSAGE_B64 -> {
val data = intent.getStringExtra("msg")
val b64 = Base64.decode(data, Base64.DEFAULT)
val msg: String
try {
msg = String(b64, Charsets.UTF_8)
inputMsg(msg)
} catch (e: Exception) {
}
}
ACTION_CLEAR_TEXT -> inputMsg("")
ACTION_STOP -> {
mStop = true
ApplicationProvider.getApplicationContext<Context>()
.unregisterReceiver(mReceiver)
}
}
}
}
}
Here is a short demo running on an emulator. In the demo, the first text "你好嗎? Hello?" is entered with adb using Base-64 encoding. The second text, "你好嗎? Hello, again?" is entered as a straight Unicode string.
Here are three Windows .bat files to manage the interface. There is no reason these can't be ported to other OSes.
start.bat
Starts the instrumented test that receives the command broadcasts. This will run until it receives the "ADB_STOP" command.
rem Start AdbReceiver and disconnect.
adb shell nohup am instrument -w com.example.adbreceiver.test/androidx.test.runner.AndroidJUnitRunner
send.bat
Used for the demo to send Unicode text, but it can easily be generalized.
rem Send text entry commands to AdbReceiver. All text is input on the current focused element.
rem Change code page to UTF-8.
chcp 65001
rem Clear the field.
adb shell am broadcast -a ADB_CLEAR_TEXT
rem Input the Unicode characters encoded in Base64.
adb shell am broadcast -a ADB_INPUT_B64 --es msg 5L2g5aW95ZeOPyBIZWxsbz8=
rem Input the Unicode characters without further encoding.
adb shell am broadcast -a ADB_INPUT_TEXT --es msg '你好嗎? Hello, again?'
stop.bat
Stops the instrumented test.
rem Stop AdbReceiver.
adb shell am broadcast -a ADB_STOP
For some reason, the code only works on API 21+. It doesn't error out on earlier APIs but just silently fails.
This is just a proof of concept and the code needs more work.
Project AdbReceiver is on GitHub.
How to Run (Windows)
Start an emulator.
Bring up the project in Android Studio.
Under java -> com.example.adbreceiver (androidTest), right-click AdbInterface.
In the pop-up menu, click "Run". This will start the instrumented test.
Bring up any app on the emulator and set the cursor into a data entry field (EditText).
In a terminal window, enter
adb shell am broadcast -a ADB_INPUT_TEXT --es msg 'Hello World!'
This should enter "Hello World!" into the text field.
This can also be accomplished from the command line. See the "start.bat"
file above on how to start the test and the "stop.bat" file on how to stop it.
Notes on UI Automator and Accessibility
I took a look under the hood at how UI Automator works. As the OP guessed, UI Automator does use accessibility services on Android. An AccessibilityNodeInfo is used to set text. In the code posted above, the inputMsg() function has the line:
mDevice?.findObject(By.focused(true))?.setText(s)
findObject() is in UiDevice.java which looks like this:
public UiObject2 findObject(BySelector selector) {
AccessibilityNodeInfo node = ByMatcher.findMatch(this, selector, getWindowRoots());
return node != null ? new UiObject2(this, selector, node) : null;
}
setText() can be found in UiObject2.java and starts off like this:
public void setText(String text) {
AccessibilityNodeInfo node = getAccessibilityNodeInfo();
// Per framework convention, setText(null) means clearing it
if (text == null) {
text = "";
}
if (UiDevice.API_LEVEL_ACTUAL > Build.VERSION_CODES.KITKAT) {
// do this for API Level above 19 (exclusive)
Bundle args = new Bundle();
args.putCharSequence(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, text);
if (!node.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, args)) {
// TODO: Decide if we should throw here
Log.w(TAG, "AccessibilityNodeInfo#performAction(ACTION_SET_TEXT) failed");
}
} else {
...
So, accessibility services are integral to UI Automator.
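Given that, a plain AccessibilityService should be able to do the same thing without UI Automator. Here is a minimal, untested sketch (the class and function names are my own, not from any of the code above) that sets Unicode text on the currently focused node via ACTION_SET_TEXT, which requires API 21+:

import android.accessibilityservice.AccessibilityService
import android.os.Bundle
import android.view.accessibility.AccessibilityEvent
import android.view.accessibility.AccessibilityNodeInfo

class UnicodeInputService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}

    // Set arbitrary Unicode text on whatever node currently has input focus,
    // mirroring what UiObject2.setText() does under the hood.
    fun setTextOnFocusedNode(text: String): Boolean {
        val node = findFocus(AccessibilityNodeInfo.FOCUS_INPUT) ?: return false
        val args = Bundle().apply {
            putCharSequence(AccessibilityNodeInfo.ACTION_ARGUMENT_SET_TEXT_CHARSEQUENCE, text)
        }
        return node.performAction(AccessibilityNodeInfo.ACTION_SET_TEXT, args)
    }
}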
Here is what I found based on my research on the input command.
input keyevent KEYCODE_N
These KEYCODEs are predefined here.
Logs:
Input : injectKeyEvent: KeyEvent { action=ACTION_DOWN, keyCode=KEYCODE_N, scanCode=0, metaState=0, flags=0x0, repeatCount=0, eventTime=25861196, downTime=25861196, deviceId=-1, source=0x101 }
Input : injectKeyEvent: KeyEvent { action=ACTION_UP, keyCode=KEYCODE_N, scanCode=0, metaState=0, flags=0x0, repeatCount=0, eventTime=25861196, downTime=25861196, deviceId=-1, source=0x101 }
For more insight into the above logs, refer to this source file.
Even after switching the device's keyboard layout to another language, keyevent only produces ASCII characters.
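This limitation shows up at the SDK level too. As a rough, hypothetical illustration (not part of the original analysis), Instrumentation can only synthesize key events that the current key character map can produce, which is why plain ASCII works but arbitrary Unicode does not:

import android.view.KeyEvent
import androidx.test.platform.app.InstrumentationRegistry

fun injectAsciiViaKeyEvents() {
    val inst = InstrumentationRegistry.getInstrumentation()
    // A single key press, roughly what `input keyevent KEYCODE_N` produces.
    inst.sendKeyDownUpSync(KeyEvent.KEYCODE_N)
    // Converted to key events via the KeyCharacterMap; text the map cannot
    // produce (e.g. most non-ASCII characters) is silently ignored.
    inst.sendStringSync("Hello")
}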
Here are the resources I found for key events in accessibility services:
In the Android framework, key events are already dispatched to accessibility services. Refer here:
FLAG_REQUEST_FILTER_KEY_EVENTS: If this flag is set, the accessibility service will receive the key events before applications, allowing it to implement global shortcuts.
Note: Analysis based on Android 9.0
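For completeness, here is a hypothetical sketch (mine, not from the referenced sources) of a service opting in to that flag. Note that this is about receiving and filtering hardware key events, not injecting new ones:

import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.AccessibilityServiceInfo
import android.util.Log
import android.view.KeyEvent
import android.view.accessibility.AccessibilityEvent

// Also requires android:canRequestFilterKeyEvents="true" in the service's XML config.
class KeyFilterService : AccessibilityService() {

    override fun onServiceConnected() {
        serviceInfo = serviceInfo.apply {
            flags = flags or AccessibilityServiceInfo.FLAG_REQUEST_FILTER_KEY_EVENTS
        }
    }

    // Key events are delivered here before they reach the foreground app.
    override fun onKeyEvent(event: KeyEvent): Boolean {
        Log.d("KeyFilterService", "got key event: $event")
        return false // return true to consume the event
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}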
Not sure if I exactly understand what you are looking for.
But here are some resources you can refer to for performing actions on behalf of users with an AccessibilityService.
performAction()
performGlobalAction()
Taking Actions for User
Or in case you are looking to develop a completely new accessibility service, you can refer to this:
Developing an Accessibility Service for Android
Related
I have an applet (taken from this HelloSTK2 repo) that I've compiled and installed on a SysmoISIM-SJA2 card, and I've lightly modified it to respond to a SELECT APDU. The modification looks like this:
public void process(APDU arg0) throws ISOException {
showHello();
}
private void showHello() {
ProactiveHandler proHdlr = ProactiveHandler.getTheHandler();
proHdlr.initDisplayText((byte)0, DCS_8_BIT_DATA, welcomeMsg, (short)0,
(short)(welcomeMsg.length));
proHdlr.send();
return;
}
All I did was move the existing showHello() function to the function that handles APDUs. It's my understanding from the Javacard documentation that the process() function should run and then return a status word of 9000, or an error code if applicable.
To SELECT the file I have an Android application I've written that uses iccOpenLogicalChannel and takes the AID as an argument. Using GlobalPlatformPro I can see that the applet is installed properly on the UICC and that it is listed as SELECTABLE; however, when I run my Android application I get a STATUS_NO_SUCH_ELEMENT response, which according to the iccOpenLogicalChannelResponse source means the AID is not found on the UICC.
The code for the Android app is very simple and looks like this:
val inputView: EditText = findViewById<EditText>(R.id.AID_INPUT)
val input: String = inputView.text.toString()
val ch = mTelephonyManager.iccOpenLogicalChannel(input)
Toast.makeText(this, ch.toString(), Toast.LENGTH_LONG).show()
mTelephonyManager.iccCloseLogicalChannel(ch.channel)
and the output of listing the applets on the card looks like this (truncated):
AID: d07002ca44, State: 01, Privs: 00
Instance AID: d07002ca44900102
I've tried both d07002ca44 and d07002ca44900102 and get the same response for both AIDs.
My question then: what steps do I need to take to ensure this applet is able to be selected by my Android application?
Worth noting probably that my Android app does have carrier privileges and I'm able to send APDUs to other applications such as the USIM and ISIM applets.
Edit 10 Apr 2020
I may have misnamed what we are looking for. It may actually be the linux user name of the installed app, rather than its UID. So how should we get that programmatically?
Original question below
When we use adb shell ps, even on a non-rooted Android device, it returns process info where the UID comes in the form u0_xxxx, where x represents some arbitrary digits in hex (except for system/root processes).
For example
u0_a464 31481 894 5015336 69200 0 0 S com.example.app_uid_checker
In this example app_uid_checker is my app in user space. When trying to obtain the UID programmatically, though, I get 10464 (the decimal representation of a464), and without the u0 prefix.
I tried
package manager's getApplicationInfo()
activity manager's getAllRunningProcess()
android.os.Process.myUid()
(following suggestions in this post on SO). They all return 10464. How can I get the "full" UID? (For want of a better term; I'm not sure what the u0_a464 version of the UID should be called, as compared to the 10464 version.)
Even if we could programmatically invoke adb shell ps, I don't think it would be a good approach, as adb needs developer mode to be enabled.
You need to use geteuid(2) and getpwuid(3) to retrieve the data, as the JVM does not expose it.
extern "C" JNIEXPORT jstring JNICALL Java_com_example_GetUser_getUser(JNIEnv* env) {
uid_t uid = geteuid();
struct passwd *user;
if (uid == -1)
return NULL;
user = getpwuid(uid);
return env->NewStringUTF(user->pw_name);
}
Full working project: https://gitlab.com/hackintosh5/getuser
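On the Kotlin side, the matching declaration might look roughly like this. The class name com.example.GetUser follows from the JNI symbol above, while the library name "getuser" is only an assumption:

package com.example

class GetUser {
    // Implemented in native code (Java_com_example_GetUser_getUser above).
    external fun getUser(): String?

    companion object {
        init {
            System.loadLibrary("getuser") // assumed native library name
        }
    }
}

Calling GetUser().getUser() should then return a name such as u0_a464 for an app process.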
I am trying to run some GMock/GTest tests on Android. These all run fine, but there's no output, as GMock logs to stdout.
I've tried the following with no luck (likely because it's for the Dalvik VM, and they've done away with that in Android 5):
$ adb shell stop
$ adb shell setprop log.redirect-stdio true
$ adb shell start
When log.redirect-stdio is set to true, there is still no output from stdio to logcat.
I've also tried several custom streambuf implementations with std::cout.rdbuf to try to redirect stdout to logcat with __android_log_print, but none of these have printed anything to logcat.
Has anyone successfully managed to redirect stdout to logcat in Android 5?
I can add more details (such as streambuf implementations I've tried) if needed.
This isn't really a solution to the problem of redirecting stdout to logcat, but I'm suggesting it as a workaround in case it helps someone.
You can redirect stdout to a file instead:
freopen("/data/data/com.your.package/files/out.txt", "w", stdout);
... // Call GMock which prints to the file instead
fclose(stdout);
We can then cat the file to see the logged test results. Sadly, Android doesn't have tail, so the logging isn't nicely available in real time (unless you're good at spamming cat).
Do it the old Java way (though I am using Kotlin; can anyone suggest a cleaner version?):
documentation: System.setOut()
import android.app.Activity
import android.os.Bundle
import android.util.Log
import java.io.OutputStream
import java.io.PrintStream

private const val TAG = "MyActivity"

class LogcatOutputStream : OutputStream() {
    private val line_buffer = StringBuilder()

    override fun write(b: Int) {
        when (b) {
            '\n'.toInt() -> {
                Log.i(TAG, line_buffer.toString())
                line_buffer.setLength(0)
            }
            else -> line_buffer.append(b.toChar())
        }
    }
}

// put this somewhere in the code, like onCreate() as shown
class MainActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // some other code
        PrintStream(LogcatOutputStream()).let {
            System.setOut(it)
            System.setErr(it)
        }
        // some other code
    }
}
// result:
println("Hello World") // which is effectively System.out.println in Java
// will produce the following output in logcat:
I/MyActivity(<pid>): Hello World
// as a reminder, you can filter logcat by tag:
adb logcat MyActivity:D *:S
// to only show logs tagged with 'MyActivity' (same value as 'TAG' above)
I'm making an app at the moment for my personal use (rooted), and it requires getting the colors of certain pixels from the screen. I was trying to accomplish this through the Runtime.
Process p = Runtime.getRuntime().exec("screencap");
p.waitFor();
InputStream is = p.getInputStream();
BitmapFactory.decodeStream(is);
and I get factory returned null.
but if I dump the output to my SD card through adb -d shell screencap /sdcard/ss.dump and access it from my app
BitmapFactory.decodeFile("/sdcard/ss.dump");
all goes well.
So is there any way to pipe the stream straight into BitmapFactory within my app?
Thanks, SO, and please excuse the general laziness/shortcuts of the example code.
This might help, if it's not too far off your intended path (I think you are using Node.js / JavaScript). I spawned the adb.exe command to produce a stream, and since the program is 'jailed' on Windows it must transform the stream to account for line-ending differences. With that, I have the following working:
var fs = require('fs');
var spawn = require('child_process').spawn;
var Convert = require('./convert'); // your CR-LF -> LF transform stream (see the blog link below)

var cmd = 'adb';
var args = ['shell', 'screencap', '-p'];

exports.capture = function(filename) {
    // you'll need to map your requirement (decodeStream) instead
    // of streaming to a file
    var strm = fs.createWriteStream(filename);
    var cv = new Convert();
    cv.pipe(strm);
    var capture = spawn(cmd, args);
    capture.stdout.on('data', function(data) {
        cv.write(data);
    });
    capture.on('close', function() {
        cv.end();
    });
};
To explain the process: spawn runs the ADB command; on Windows, CR-LF sequences get inserted into what is a binary PNG stream, so the stream is chunked and piped through an fs transformation. Others on the web have described the process as adb.exe shell screencap -p | sed 's/\r$//' > output.file, and that does work. To be clear, the conversion is CR-CR-LF => LF for us Windows-jailed birds. If you don't want to implement sed, and don't want to use JavaScript regular expressions that convert binary -> strings -> binary, you may follow my path. There is probably a simpler way; I just don't know what it is...
So, Convert() is an object with methods that convert the byte stream on the fly. See the CodeWinds blog post at http://codewinds.com/blog/2013-08-20-nodejs-transform-streams.html to build your Convert.
When using screencap from inside an app you must use su, i.e. root. When you do this via adb it runs under a different user id, which has more permissions than a normal Android app.
There are several examples of how to use screencap, e.g. here.
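If it helps, here is a rough, untested sketch (my assumption of what the su route could look like, not code from the question or the linked examples) that runs screencap -p as root and decodes its output directly, skipping the temp file:

import android.graphics.Bitmap
import android.graphics.BitmapFactory

fun captureScreenAsRoot(): Bitmap? {
    // "su -c" runs the command as root; plain Runtime.exec("screencap") fails
    // because the app's own user lacks the permission that adb's shell user has.
    val process = Runtime.getRuntime().exec(arrayOf("su", "-c", "screencap -p"))
    val bitmap = process.inputStream.use { BitmapFactory.decodeStream(it) }
    process.waitFor()
    return bitmap
}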
I want to run an app (say, Settings) after rebooting the tablet. Can I use os.system, or do I have to use other methods?
import os, time

for i in range(0, 3):
    os.system("adb reboot")
    time.sleep(60)
Yes, you can use os.system to execute ADB commands. If you want to validate that the command executed successfully, take a look at the check_output(...) function, which is part of the subprocess library. This code snippet shows how I chose to implement the check_output function. For the full code, look here.
def _run_command(self, cmd):
    """
    Execute an adb command via the subprocess module. If the process exits with
    an exit status of zero, the output is encapsulated into an ADBCommandResult and
    returned. Otherwise, an ADBProcessError is raised.
    """
    try:
        output = check_output(cmd, stderr=subprocess.STDOUT)
        return ADBCommandResult(0, output)
    except CalledProcessError as e:
        raise ADBProcessError(e.cmd, e.returncode, e.output)
To launch an application you can use the command am start -n yourpackagename/.activityname. To launch the Settings app, run adb shell am start -n com.android.settings/com.android.settings.Settings. This Stack Overflow question shows in detail the options you can use to start an application via a command-line intent.
Other tips:
I created an ADB wrapper written in Python, along with a few other Python utilities, that may aid in what you are trying to accomplish. For example, instead of calling time.sleep(60) to wait for the reboot, you can use adb to poll the status of the property sys.boot_completed; once the property is set, the device has finished booting and you can launch any application. Below is a reference implementation you can use.
def wait_boot_complete(self, encryption='off'):
    """
    When data-at-rest encryption is turned on, there needs to be a waiting period
    during boot up for the user to enter the DAR password. This function will wait
    till the password has been entered and the phone has finished booting up.

    OR

    Wait for the BOOT_COMPLETED intent to be broadcast by checking the system
    property 'sys.boot_completed'. An ADBProcessError is thrown if there is an
    error communicating with the device.

    This method assumes the phone will eventually reach the boot completed state.

    A check is needed to see if the output length is zero because the property
    is not initialized with a 0 value. It is created once the intent is broadcast.
    """
    if encryption == 'on':
        decrypted = None
        target = 'trigger_restart_framework'
        print 'waiting for framework restart'
        while decrypted is None:
            status = self.adb.adb_shell(self.serial, "getprop vold.decrypt")
            if status.output.strip() == target:
                decrypted = 'true'

        # Wait for boot to complete. The boot completed intent is broadcast before
        # boot is actually completed when encryption is enabled. So 'key' off the
        # animation.
        status = self.adb.adb_shell(self.serial, "getprop init.svc.bootanim").output.strip()
        print 'wait for animation to start'
        while status == 'stopped':
            status = self.adb.adb_shell(self.serial, "getprop init.svc.bootanim").output.strip()

        status = self.adb.adb_shell(self.serial, "getprop init.svc.bootanim").output.strip()
        print 'waiting for animation to finish'
        while status == 'running':
            status = self.adb.adb_shell(self.serial, "getprop init.svc.bootanim").output.strip()
    else:
        boot = False
        while not boot:
            self.adb.adb_wait_for_device(self.serial)
            res = self.adb.adb_shell(self.serial, "getprop sys.boot_completed")
            if len(res.output.strip()) != 0 and int(res.output.strip()) == 1:
                boot = True