How to disable showcase view in Android emulator

I'm using the Android emulator in Jenkins to run functional tests (Cucumber). Everything works fine if the emulator doesn't show the showcase view at start.
But if there is a showcase view, my tests fail, because the application runs behind this view.
I've tried to send keyevents using adb to the emulator before using it:
adb shell input keyevent KEYCODE_MENU
but it doesn't help. I've tried KEYCODE_MENU, KEYCODE_BACK and other keys, but they don't disable this view.
I guess this should be available as a system preference in Android, but I can't find it :(
How can I disable showcase view in emulator? I have access to the emulator using adb.
UPDATE
There's no such flag that can be set in the emulator config file or passed to the emulator at startup.
I still don't have a clean solution for this, but a few workarounds exist. That's understandable, as the showcase view is just a view from the Launcher application, and the logic for it lives inside the Launcher.
Tricky but universal: prepare a custom Android launcher application with the showcase disabled (based on the AOSP Launcher) and pre-install it on the target emulator, replacing the default launcher.
Manual and not universal: gather the list of emulators used and the coordinates of the OK button on each, then send the appropriate tap coordinates at emulator start (as Christopher Orr proposed below).

You should be able to send a tap event, for example:
adb shell input tap 700 900
That would tap at approximately the correct x,y pixel coordinate for that button on a Nexus 4.
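In a Jenkins job this can be scripted so the tap fires only after the emulator has finished booting. A minimal sketch, assuming a POSIX shell on the build node; the coordinates (and the 2-second poll interval) are placeholders to adjust per AVD skin:
adb wait-for-device
# poll until Android itself reports boot completion
until [ "$(adb shell getprop sys.boot_completed | tr -d '\r')" = "1" ]; do
  sleep 2
done
# dismiss the launcher's showcase overlay by tapping its OK button
adb shell input tap 700 900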

Related

Calabash-android select image from the device library

I am running calabash-android tests for an Android application. I need to attach images and videos to a particular section. I can reach the gallery section, but after that I cannot select an item from the device library. Is there any way to keep a copy of a video and an image in my test directory and access them whenever needed? Or is there another way to access the gallery? One more thing: I will be integrating the tests with CircleCI later, and I don't know how to manage this once it runs there. All help is appreciated.
Calabash only has access to your app, not anything outside of it.
You will be able to put together a solution using adb to interact with the screen, using touch or keyboard events. However, this will be tied to one specific screen size, as it's done by pixels.
adb shell input tap x y
You could put the images/videos in the gallery directory so that they are the most recent items, then use adb tap events to select them. It's a bit hacky, but it should work.
If I've understood your question correctly, the CircleCI part should probably be split out into another question, as it isn't really tied to the previous bit.
EDIT: We ended up getting it to work by running
adb shell input tap 200 200
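For completeness, the file-pushing half can be scripted the same way. A rough sketch, assuming the default camera directory and a hypothetical test.jpg; the tap coordinates depend on the device's screen size and gallery layout:
adb push test.jpg /sdcard/DCIM/Camera/test.jpg
# ask Android to index the new file so the gallery app can see it
adb shell am broadcast -a android.intent.action.MEDIA_SCANNER_SCAN_FILE -d file:///sdcard/DCIM/Camera/test.jpg
# the freshly scanned file should now be the most recent item; tap it
adb shell input tap 200 200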

How do I "shake" an Android device within the Android emulator to bring up the dev menu to debug my React Native app

I am working on a cross-platform React Native mobile app. I am writing console.log statements as I develop. I want to see these logging statements in Chrome while I'm running the Android app in the default Android emulator. According to Facebook's docs I just need to "shake the device". How do I do this in the Android emulator?
To access the in-app developer menu:
On iOS shake the device or press control + ⌘ + z in the simulator.
On Android shake the device or press the hardware menu button (available on older devices and in most of the emulators; e.g. in Genymotion you can press ⌘ + m to simulate a hardware menu button click).
Within your app in the Android Emulator press Command + M on macOS or Ctrl + M on Linux and Windows.
With a React Native app running in the emulator,
Press ctrl+m (for Linux, I suppose it's the same for Windows and ⌘+m for Mac OS X)
or run the following in terminal:
adb shell input keyevent 82
If you're using the new emulator that comes with Android Studio 2.0, the keyboard shortcut for the menu key is now Cmd+M, just like in Genymotion.
Alternatively, you can always send a menu button press using adb in a terminal:
adb shell input keyevent KEYCODE_MENU
Also note that the menu button shortcut isn't a strict requirement, it's just the default behavior provided by the ReactActivity Java class (which is used by default if you created your project with react-native init). Here's the relevant code from onKeyUp in ReactActivity.java:
if (keyCode == KeyEvent.KEYCODE_MENU) {
    mReactInstanceManager.showDevOptionsDialog();
    return true;
}
If you're adding React Native to an existing app (documentation here) and you aren't using ReactActivity, you'll need to hook the menu button up in a similar way. You can also call ReactInstanceManager.showDevOptionsDialog through any other mechanism. For example, in an app I'm working on, I added a dev-only Action Bar menu item that brings up the menu, since I find that more convenient than shaking the device when working on a physical device.
On Linux, click the three dots (...) beside the emulator to open Extended controls, check "Move" under Virtual sensors, and then try quickly changing the x, y or z values.
'Ctrl + m' works for Windows in the Android emulator to bring up the React-Native developer menu.
Couldn't find that documented anywhere.
Found my way here, guessed the rest... Good grief.
By the way: OP: You didn't mention what OS you were on.
I am on Mac OS, so when I press Command it enables the zoom option instead.
Here is my solution:
Open the emulator's configuration window via the [...] button
Go to the Settings tab -> General tab -> "Send keyboard shortcuts to" field
Change the value to "Virtual device"
After that, focus on the emulator and press Command + M, and the dev menu appears.
While developing React Native apps we spend so much time in the terminal, so I added a script to the scripts section of the package.json file:
"menu": "adb shell input keyevent 82"
Then I run $ yarn menu and the menu appears on the emulator; it forwards keycode 82 to the emulator via adb.
Not the optimal way, but I like it and felt like sharing it.
'Command + M' for OSX is working for me.
If you want to simulate a 1 second shake from terminal you can use the following command:
adb emu sensor set acceleration 100:100:100; sleep 1; adb emu sensor set acceleration 0:0:0
On Linux, Ctrl+M should work, but it may not at first. To solve the problem:
click on the (...) button (it's Extended controls)
close Extended controls
now you can open the menu with Ctrl+M
Use Command + M (Cmd + M) on a Mac. Also make sure your application is in the foreground while you try to access the debug menu, i.e. your app must be running; otherwise Cmd + M will just bring up the usual ordinary phone menu.
It might not be a direct solution, but I've created a lib that allows you to use a 3-finger touch instead of a shake to open the dev menu when in development mode:
https://github.com/pie6k/react-native-dev-menu-on-touch
You only have to wrap your app inside:
import DevMenuOnTouch from 'react-native-dev-menu-on-touch';
// or: import { DevMenuOnTouch } from 'react-native-dev-menu-on-touch'
class YourRootApp extends Component {
  render() {
    return (
      <DevMenuOnTouch>
        <YourApp />
      </DevMenuOnTouch>
    );
  }
}
It's really useful when you have to debug on a real device and you have co-workers sitting next to you.
I was trying this on a release build installed via adb install -r -d <app-release>.apk 🤦
Make sure you're running the debug build; then the menu will work via the shortcut or the CLI.
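For example, assuming a standard React Native project layout (adjust paths to your setup), a debug build can be built and installed with either of these:
# builds, installs and launches the debug variant
react-native run-android
# or, with Gradle directly from the project root:
cd android && ./gradlew installDebug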

Touch not working on android apps running on ARC

I have a laptop running Windows 8.1 with a touch-enabled screen.
I am using ARC to run APKs on the laptop.
Touch works fine in ARC Welder, but after launching the APK, touch events do not work on the app's screen; only mouse events work.
I have tried multiple APKs and every one behaves the same.
Do we have to enable something to make the touch events work?
This is a bug that has been reported here:
https://code.google.com/p/chromium/issues/detail?id=480745
So a way of getting around this, at least for map applications, is to specifically code the APK to handle the event, and you must use the ARC metadata tag
"enableSynthesizeTouchEventsOnWheel": false, which stops the runtime from trying to interpret your scroll wheel as a touch instead of the SOURCE_CLASS it actually is.
So for the input device SOURCE_CLASS_POINTER, you have to handle ACTION_SCROLL so that a mouse scroll wheel will let you perform actions. There are plenty of examples of how to do this all over Stack Overflow; just disable that metadata tag and most of the code you would normally use to make things work in Android will work here.
Note: I haven't tried "enableSynthesizeTouchEventsOnClick": false, or tested with a SOURCE_ that equals an external touchscreen, but I believe it would work.

MonkeyRunner: easy way to determine coordinates for MonkeyDevice touch command?

I've started creating a MonkeyRunner script. This is going OK, but whenever I add a MonkeyDevice::touch command, I have to determine the input coordinates by trial and error. Basically I guess at the coordinates I want to touch and see if those coordinates result in the button touch I'm trying to test. That works, but it's a slow process. Is there any way to determine the coordinates of UI controls, perhaps from the layout XML files?
I found how to do it. Use the Pixel Perfect view within Eclipse to determine the x & y coordinates of the UI element. Here's a quick overview:
1) Eclipse must be running
2) Your Android device must be connected (either the real device via ADB, or the emulator)
3) Run the hierarchy viewer (in /tools)
4) Select "Inspect screenshot"
The Pixel Perfect view will launch automatically. Just place the cross-hairs on the UI element. The x and y coordinates, along with the RGB values, are displayed below.
Here's the URL that got me started: http://developer.android.com/guide/developing/debugging/debugging-ui.html#pixelperfect
This post (monkeyrunner: interacting with views) may give you an idea of how to obtain the View's coordinates using AndroidViewClient.
In most Android versions you can enable pointer location in Settings -> Developer options. Once you enable it, it is easy to find out the (x, y) coordinates.
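If you'd rather not tap through Settings (for example on an emulator in a CI job), the same toggles can usually be flipped over adb. A sketch, assuming Android 4.2+ where the settings command exists; write 0 instead of 1 to turn them off again:
adb shell settings put system pointer_location 1
adb shell settings put system show_touches 1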
You can also use the HierarchyViewer tool in your AndroidSDK/tools folder to take screenshots of the current screen and examine that image pixel by pixel to get exact coordinates.
For devices older than Android 4.0, see the paragraph following this one. Android 4.0 and later include a Settings -> Developer options -> Pointer location toggle, which overlays a transparent ribbon across the top of the device screen with coordinates, velocities and touch-pressure readings, including swipe tracks and x/y crosshairs for the current touch location. This is a lot easier than using alternatives such as Monkey Recorder and other means.
In Android 4.2 and later, Developer options is hidden from the Settings menu and must be enabled by going to Settings -> About tablet and tapping on Build number seven times. One can only presume that Android hid Developer options because of the increasingly user-experience-affecting options it contains and the number of consumer calls/complaints to device makers from people who played with it or whose children played with it.
In older versions which may not include a Pointer location toggle, there is an app on the Play Store (aka Android Market), Developers Tools. See link here: https://play.google.com/store/apps/details?id=com.ggb.development
It shows up with a gear icon and the caption Dev Tools on a device, and provides functionality similar to what Dev Tools has in an AVD (Android Virtual Device). That includes the Dev Tools -> Development settings -> No Pointer Location/Pointer Location radio toggle. Setting that toggle to Pointer Location provides exactly the same functionality built into Android 4.0 and later. The same application also has a more limited pointer setting at Dev Tools -> Pointer Location, which limits pointer location to a blank screen only.
Enjoy!
Create an xyz.py file with the code below, connect the device, and run it in a terminal with monkeyrunner xyz.py. You will then get a recorder window for your device on the PC; when you click any button in the recorder it gives you the coordinates, after which you can export them to a file and use them.
from com.android.monkeyrunner import MonkeyRunner as mr
from com.android.monkeyrunner.recorder import MonkeyRecorder as recorder

# wait for a device (or emulator) to be connected, then open the recorder UI
device = mr.waitForConnection()
recorder.start(device)
I found an adb approach. Use adb shell getevent -l to get a stream of events, grep for ABS_MT_POSITION (which gets the lines with touch events in hex), and finally use awk to extract the relevant hex values, strip their leading zeros, and convert the hex to the decimal values that MonkeyRunner uses. This is all done with the following:
adb shell getevent -l | grep ABS_MT_POSITION --line-buffered | awk '{a = substr($0,54,8); sub(/^0+/, "", a); b = sprintf("0x%s",a); printf("%d\n",strtonum(b))}'
This continuously prints the x and y coordinates in the terminal only when you press on the device.
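On API 16 and later, another option is to have uiautomator dump the current view hierarchy, which includes the on-screen bounds of every widget and avoids pixel guesswork entirely. A sketch, assuming the tool is present on the device image:
adb shell uiautomator dump /sdcard/window_dump.xml
adb pull /sdcard/window_dump.xml
# each <node> in the XML carries a bounds="[x1,y1][x2,y2]" attribute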

How to configure which project to debug?

I want to run two different Android apps from Eclipse (in two different types of Emulators) simultaneously but step through one of them, in which I've set breakpoints.
How do I configure which of the two apps is the one being debugged? It's not possible to debug both simultaneously, is it (with breakpoints in each project)?
The "Devices" tab in the DDMS perspective has a "Bug" icon (the first one). If you highlight your running app (in whichever emulator it is) and click that button, debugging will be enabled for that app. If you have any breakpoints set in your code it will hit them and the debug view will open. That's the easiest way to debug a running application, and I find it the best approach because running the whole app in debug mode is slow. It's better to start debugging once you are close to where you actually want to step through.
You can also do this on an actual device, but you will need to set android:debuggable="true" on the <application> element in the manifest, otherwise your app won't show in the list.
