input tap through adb shell on FireTV stick - android

I've been playing around with the Amazon FireTV Stick over adb.
input tap X Y does not seem to work. I want to simulate mouse input from my laptop connected to the FireTV. input press seems to work just fine. Any hints on sending a tap by coordinates? The device is not rooted. I've taken a screenshot and found the coordinates I need on the image using GIMP.
The main reason I believe this is possible is that there is an app that shows a mouse cursor, and taps in various locations work just fine with it. I suspect it's not the actual Android built-in cursor, but it may well be, in which case I'll be on the hunt for a way to display it when needed and control it. If you have a suggestion about this, please point me in the right direction.
My main idea is to take a screenshot, find the coordinates of the button on the screen, and tap at those coordinates. I am not considering selecting the button with key events and using input press. Any ideas?

If you want to run an app on the device to manage it, then using a MotionEvent will allow you to simulate an action at a location.
If you want to control it via adb, then a MonkeyRunner script would be the easiest approach (see https://developer.android.com/studio/test/monkeyrunner/index.html):
#
# usage: monkeyrunner tap_xy.py
#
# Import monkeyrunner modules
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice
# Connects to the current device
device = MonkeyRunner.waitForConnection()
# Click at X,Y
y = 400
x = 100
device.touch(x, y, MonkeyDevice.DOWN_AND_UP)
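If you also want to capture the screenshot from the same script (to find the coordinates in GIMP, as the question describes), MonkeyDevice provides takeSnapshot(); a minimal addition to the script above, where shot.png is just an example file name:
# Optional: grab a screenshot and save it locally, so the coordinates to tap
# can be read off the image
snapshot = device.takeSnapshot()
snapshot.writeToFile('shot.png', 'png')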

Create a UI Automator service running on the device; using the UiDevice.click(x, y) method you can click anywhere on the screen.
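If you would rather drive UI Automator from your laptop over adb instead of writing an on-device test, there is also a third-party Python wrapper (the uiautomator package on PyPI) that exposes the same idea; a rough sketch, assuming that package is installed and a single device is attached (the wrapper is my suggestion, not something the answer above refers to):
# rough sketch using the third-party "uiautomator" PyPI package, which drives the
# UI Automator service over adb and offers a click-by-coordinates call
from uiautomator import device as d  # connects to the single device visible to adb

d.click(100, 400)  # tap at x=100, y=400, analogous to UiDevice.click(x, y)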

Related

Calabash-android select image from the device library

I am running a calabash-android test for an Android application. I need to attach images and videos to a particular section. I can reach the gallery section, but after that I cannot select an item from the device library. Is there any way to keep a copy of the video and image in my test directory and access them whenever needed? Or is there any other way to access the gallery? One more thing: I will be integrating the test on CircleCI later, and I don't know how I can manage this when it runs on CircleCI. Any help is appreciated.
Calabash only has access to your app, not anything outside of it.
You will be able to put together a solution using adb to interact with the screen, using touch or keyboard events. However, this will be tied to one specific screen size, as it's done by pixels.
adb shell input tap x y
You could put the images/videos in the gallery's directory so that they are the most recent items, then use adb touch events to select them. It's a bit hacky, but it should work.
If I've understood your question correctly, the CircleCI part should probably be split out into another question, as it isn't really tied to the previous bit.
EDIT: We ended up getting it to work by running
`adb shell input tap 200 200`
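For reference, a rough Python sketch of the "push the file so it is the most recent, then tap it" idea driven over adb; the local path, device path and coordinates below are placeholders, and as noted above the tap position is tied to one specific screen size:
# rough sketch: push a test image so it becomes the newest gallery item, then tap
# where the first thumbnail appears; all paths and coordinates are placeholders
import subprocess

def adb(*args):
    # run an adb command and raise if it fails
    subprocess.run(["adb"] + list(args), check=True)

adb("push", "test_assets/sample.jpg", "/sdcard/Pictures/sample.jpg")  # hypothetical paths
adb("shell", "input", "tap", "200", "200")  # coordinates from the EDIT above; device-specific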

How to disable showcase view in Android emulator

I'm using the Android emulator in Jenkins to run functional tests (Cucumber). Everything works fine if the emulator doesn't show the showcase view at start.
But if there is a showcase view, my tests fail because the application runs behind this view.
I've tried to send keyevents using adb to the emulator before using it:
adb shell input keyevent KEYCODE_MENU;
but it doesn't help. I've tried KEYCODE_MENU, KEYCODE_BACK and other keys, but they don't disable this view.
I guess this property should be available as a system preference in Android, but I can't find it :(
How can I disable showcase view in emulator? I have access to the emulator using adb.
UPDATE
There's no such flag that can be set in the emulator config file or passed to the emulator at startup.
I still don't have a clean solution for this, but a few workarounds exist. That's understandable, as the showcase view is just a view from the Launcher application, and the logic for it lives inside the Launcher application.
Tricky but universal: prepare a custom Android Launcher application with the showcase disabled (based on the AOSP Launcher) and pre-install it on the target emulator, replacing the default launcher.
Manual and not universal: gather a list of the emulators you use and the coordinates of the OK button on each, then send the appropriate touch coordinates when the emulator starts (as Christopher Orr proposed).
You should be able to send a tap event, for example:
adb shell input tap 700 900
That would tap at approximately the correct x,y pixel coordinate for that button on a Nexus 4.
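If you want to script this as part of the Jenkins job, here is a rough Python sketch of "wait for the emulator to finish booting, then send the tap"; sys.boot_completed is a standard Android property, and 700 900 are the Nexus 4 coordinates from the answer above:
# rough sketch: wait until the emulator reports boot completed, then tap the OK button
# of the showcase view (coordinates are device-specific, as noted above)
import subprocess
import time

def adb_shell(*args):
    result = subprocess.run(["adb", "shell"] + list(args), capture_output=True, text=True)
    return result.stdout.strip()

subprocess.run(["adb", "wait-for-device"], check=True)
while adb_shell("getprop", "sys.boot_completed") != "1":
    time.sleep(2)  # poll until Android has finished booting

adb_shell("input", "tap", "700", "900")  # Nexus 4 coordinates from above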

Kivy: how to change window size properties and the difference between click and touch

I have two questions that I cannot answer myself:
How can I change the size of my window if I do not know the exact size of the phone screen? That is, my aim is to fit all screen sizes.
Is there any difference between clicking with a mouse and touching with fingers in the code? If I write code for clicking, will it work with touch?
On mobile, your app should automatically fill the phone screen. You don't need to worry about it. On desktop, you can use the --size=WxH option to test a specific screen size, or use the screen module (-m screen:nexus7 for example - run kivy with -m screen to see the available options).
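If you prefer setting the test window size in code rather than on the command line, Kivy's Config API does the same job (this is an alternative to the --size flag mentioned above, not something the answer requires); it has to run before the rest of Kivy is imported:
# rough sketch: fix the desktop window size for testing; on a phone Kivy fills the screen anyway
from kivy.config import Config  # must be set before importing other Kivy modules

Config.set('graphics', 'width', '480')   # arbitrary example width in pixels
Config.set('graphics', 'height', '800')  # arbitrary example height in pixels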
No. All mouse/touchscreen interactions are considered touches in Kivy. So using on_touch_down/on_touch_move/on_touch_up will work regardless of the input device. The only difference is that with touchscreen you could have multi-touch - but if you write your app assuming single-touch it will work the same on both mobile and desktop.
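To illustrate the second point, a minimal sketch in which the same handler receives both mouse clicks and finger touches (the widget and app names are just placeholders):
# minimal sketch: one on_touch_down handler serves mouse clicks and touchscreen taps alike
from kivy.app import App
from kivy.uix.widget import Widget

class TouchDemo(Widget):
    def on_touch_down(self, touch):
        # touch.pos is filled in the same way for mouse and touch input
        print("pressed at", touch.pos)
        return super(TouchDemo, self).on_touch_down(touch)

class DemoApp(App):
    def build(self):
        return TouchDemo()

if __name__ == '__main__':
    DemoApp().run()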

is it possible to invoke ICS screen shot function from adb?

I need to be able to take screen dumps for testing, and with ICS there is now a screenshot function that can be invoked by pressing and holding the volume-down and power buttons.
Is there any way to script this function through adb? (As I understand it, there's no public Java API for it.) I have tried to use KeyEvent from Java to emulate the power and volume buttons, and I have tried adb keyevent and adb sendevent, without success. I suspect that the power button also generates some low-level calls that are not generated with the above methods.
So does anyone know whether it is possible to call this function from adb?
If this is not possible, does someone know where in the source code this screenshot function exists? Maybe I can figure something out by reading it.
update
The source code for capturing the screen is in "frameworks/base/services/surfaceflinger/SurfaceFlinger.cpp", in a function called screenCapture. I do not know if it is possible to call it from JNI, but I will try, because it would be great if I could take a screenshot through Java.
Otherwise, @edthethird had a solution via android.amberfog.com/?p=168 that makes it possible to take a screenshot from the command line.
Thank you for the help everyone!
In the form of adb commands, the following works on ICS devices:
adb shell /system/bin/screencap -p /sdcard/img.png
adb pull /sdcard/img.png img.png
See: http://a-more-common-hades.blogspot.com/2012/01/screenshot-command.html
Well, this has nothing to do with ICS, but in Eclipse, look at the Devices tab. There is a tiny camera icon on the far right (from right to left, the icons are "box", "line", "upside-down triangle", and "camera"). Click on this camera icon to take a screenshot of the currently selected device.
This works on any version of Android, not only ICS.
See this question:
Screenshot of the Nexus One from adb?
Basically you can pull the framebuffer using adb and convert it into a usable image yourself, or you can just use the command-line utility provided by Google. Looking around, I think you may need to tweak that utility a little to get it to work on newer versions of Android.
As the other answer points out, though, it's probably less hassle to just do this from Eclipse, unless you're trying to automate testing.
After looking into the source code, there is a utility that does exactly what I want:
frameworks/base/cmds/screencap/screencap.cpp
The program can be executed on Android by running /system/bin/screencap.
So it is possible to execute it from Java on Android via Runtime.getRuntime().exec().
A drawback is that you need a special certificate for taking screenshots.

MonkeyRunner: easy way to determine coordinates for MonkeyDevice touch command?

I've started creating a MonkeyRunner script. This is going OK, but whenever I add a MonkeyDevice::touch command, I have to determine the input coordinates by trial and error. Basically I guess at the coordinates I want to touch and check whether those coordinates result in the button touch I'm trying to test. That works, but it's a slow process. Is there any way to determine the coordinates of UI controls, perhaps from the layout XML files?
I found how to do it. Use the Pixel Perfect view within Eclipse to determine the x & y coordinates of the UI element. Here's a quick overview:
1) Eclipse must be running
2) Your Android device must be connected (either the real device via ADB, or the emulator)
3) Run the Hierarchy Viewer (in the SDK's /tools directory)
4) Select "Inspect screenshot"
The Pixel Perfect view will launch automatically. Just place the cross-hairs on the UI element. The x and y coordinates, along with the RGB values, are displayed below.
Here's the URL that got me started: http://developer.android.com/guide/developing/debugging/debugging-ui.html#pixelperfect
This post (monkeyrunner: interacting with views) may give you an idea of how to obtain a View's coordinates using AndroidViewClient.
On most Android versions you can enable Pointer location in Settings -> Developer options. Once you enable it, it is easy to find out the (x, y) coordinates.
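If the device is already being driven over adb, the same toggle can usually be flipped from the shell as well; a small sketch, with the caveat that the settings command only exists on newer Android versions (roughly 4.2 and later), so treat it as an assumption to verify on your device:
# rough sketch: toggle the "Pointer location" overlay over adb instead of the Settings UI
import subprocess

subprocess.run(["adb", "shell", "settings", "put", "system", "pointer_location", "1"], check=True)  # enable
# ...and "0" instead of "1" turns the overlay off again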
You can also use the Hierarchy Viewer tool in your Android SDK's tools folder to take screenshots of the current screen and examine the image pixel by pixel to get exact coordinates.
For devices older than Android 4.0, see the paragraph following this one. Android 4.0 and later include a Settings -> Developer options -> Pointer location toggle, which overlays a transparent ribbon across the top of the device screen with coordinates, velocities and touch-pressure readings, including swipe tracks and x/y crosshairs for the current touch location. This is a lot easier than using alternatives such as Monkey Recorder and other means.

In Android 4.2 and later, Developer options is hidden from the Settings menu and must be enabled by going to Settings -> About tablet and tapping on Build number seven times. One can only presume that Android hid Developer options because of the increasingly user-experience-affecting options it contains and the number of consumer calls/complaints to device makers from people who played with it or whose children played with it.
In older versions, which may not include a Pointer location toggle, there is an app on the Play Store (aka Android Market), Developers Tools. See the link here: https://play.google.com/store/apps/details?id=com.ggb.development
It shows up with a gear icon and the caption Dev Tools on the device, and provides functionality similar to what Dev Tools offers in an AVD (Android Virtual Device). That includes the Dev Tools -> Development settings -> No Pointer Location/Pointer Location radio toggle. Setting that toggle to Pointer Location provides exactly the same functionality that is built in with Android 4.0 and later. The same application also has a more limited pointer setting at Dev Tools -> Pointer Location, which shows the pointer location only on a blank screen.
Enjoy!
Create an xyz.py file with the code below, connect the device, and run the script in a terminal with monkeyrunner xyz.py. A recorder window mirroring your device will open on your PC; when you click any button in the recorder it shows you the coordinates, and afterwards you can export them to a file and use those coordinates.
# Import the monkeyrunner and recorder modules
from com.android.monkeyrunner import MonkeyRunner as mr
from com.android.monkeyrunner.recorder import MonkeyRecorder as recorder
# Connect to the attached device and launch the recorder window
device = mr.waitForConnection()
recorder.start(device)
I found an adb approach. Use adb shell getevent -l to get a list of events, grep for ABS_MT_POSITION (which gets the lines with touch events, in hex), and finally use awk to pull out the relevant hex values, strip their leading zeros and convert the hex to the decimal values that monkeyrunner uses. This is all done with the following:
adb shell getevent -l | grep ABS_MT_POSITION --line-buffered | awk '{a = substr($0,54,8); sub(/^0+/, "", a); b = sprintf("0x%s",a); printf("%d\n",strtonum(b))}'
This continuously prints the x and y coordinates in the terminal only when you press on the device.
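If you would rather do the filtering and hex conversion in Python than in grep/awk, here is a rough equivalent of the pipeline above (same idea: read getevent -l, keep the ABS_MT_POSITION lines, and convert the hex values to decimal):
# rough Python equivalent of the getevent/grep/awk pipeline above: print decimal
# x/y touch coordinates as they arrive from the device
import subprocess

proc = subprocess.Popen(["adb", "shell", "getevent", "-l"],
                        stdout=subprocess.PIPE, text=True)
for line in proc.stdout:
    parts = line.split()
    # lines of interest look like: /dev/input/eventN: EV_ABS ABS_MT_POSITION_X 000002f4
    if len(parts) >= 4 and parts[2].startswith("ABS_MT_POSITION"):
        print(parts[2], int(parts[3], 16))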
