I am having a problem with running Android 7.1 on a virtual machine inside OpenNebula (QEMU/KVM) - more specifically, a problem with mouse integration.
When creating a machine, I tried setting the input two ways:
1. Mouse/USB
2. Tablet/USB
In case 1, the mouse cursor is not where it should be (it ends up further left/right/up/down, depending on how fast the mouse is moved).
In case 2, I have to hold left-click to move the cursor - this moves it, but it also acts like holding a finger down on a touch screen.
My question is: is there a way to set the correct input type, so I can use the mouse normally when working with the system?
Info: This was not a problem in Android 4.4, where the Tablet/USB option worked correctly.
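For reference, the pointing device is declared in the OpenNebula VM template with an INPUT section; a minimal fragment (attribute names as in the OpenNebula template reference, values matching case 2) looks like this:
INPUT = [
  TYPE = "tablet",
  BUS  = "usb"
]
This should end up as <input type='tablet' bus='usb'/> in the libvirt domain XML, so if the tablet still misbehaves under Android 7.1, the issue is probably in how the guest handles the absolute-coordinate device rather than in the template itself.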
Related
I've been doing development with Flutter. I was using real hardware but wanted to use an emulator. I set one up, but ran into this issue.
I usually put my apps on virtual desktops and switch between them with Ctrl+Meta+[arrow keys]. Each of them has a special purpose: usually desktop 1 is for browsing, desktop 2 is for development and desktop 4 is extra (for testing UI apps and the emulator in this case).
However, after I launch the Android emulator and switch to a different desktop, I get the issue seen below:
The red area you see stays unresponsive for other apps such as the browser, VSCode, etc. (i.e. it does not respond to clicks). That area is where the emulator sits on Desktop 4, and it also keeps showing the multitouch tool.
It's a bit annoying, so I wanted to ask whether anyone has run into this issue and found a solution.
Thanks in advance.
Environment
Kubuntu 20.04
AMD Radeon R7 240/340
Solution 1
I have, somehow in a weird way, found a workaround for this issue.
After having this issue, go to the emulator window and press the magnifying glass with a plus button icon twice.
After that, it will drop that weird multitouch state and it won't bother you even as you switch desktop.
Rebooting the computer brings the bug back, and then you have to do this again.
Solution 2
Another method that works is to simply resize the window. If your emulator has a device frame, hold down the Meta key, hold right-click, and resize the window.
Solution 3
You can also maximize the window. If there is a device frame around the emulator window, press Alt+F3, which opens the window options menu, then click "Maximize". This will get rid of it.
I'm using the Android emulator in Jenkins to run functional tests (Cucumber). Everything works fine if the emulator doesn't show the showcase view at start-up.
But if there is a showcase view, my tests fail because the application runs behind it.
I've tried sending keyevents to the emulator using adb before the tests run:
adb shell input keyevent KEYCODE_MENU;
but it doesn't help. I've tried KEYCODE_MENU, KEYCODE_BACK and other keys, but they don't disable this view.
I guess this should be available as a system preference in Android, but I can't find it :(
How can I disable the showcase view in the emulator? I have access to the emulator via adb.
UPDATE
There's no flag that can be set in the emulator config file or passed to the emulator at start-up.
I still don't have a clean solution for this, but a few workarounds exist. That's understandable, as the showcase view is just a view from the Launcher application, and the logic for it lives inside the Launcher.
Tricky but universal: prepare a custom Android launcher application with the showcase disabled (based on the AOSP Launcher) and pre-install it (replacing the default launcher) on the target emulator.
Manual, not universal: gather the list of emulators used and the coordinates of the OK button on each, and send the appropriate touch coordinates when the emulator starts (as Christopher Orr proposed).
You should be able to send a tap event, for example:
adb shell input tap 700 900
That would tap at approximately the correct x,y pixel coordinate for that button on a Nexus 4.
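If the coordinates differ between the emulator images your Jenkins job uses, a small wrapper that picks them per AVD keeps this manageable. A rough sketch - the AVD names and coordinates below are made up, and the script assumes the job exports AVD_NAME and ANDROID_SERIAL:
#!/bin/sh
# Dismiss the launcher showcase by tapping its OK button.
# Coordinates are per-AVD and purely illustrative - measure your own.
case "$AVD_NAME" in
  nexus4_api19) X=700; Y=900 ;;
  nexus7_api19) X=640; Y=1000 ;;
  *) echo "no showcase coordinates for $AVD_NAME" >&2; exit 1 ;;
esac
adb -s "$ANDROID_SERIAL" shell input tap "$X" "$Y"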
I have an application in which I need to implement image editing, including pinch zooming. I am done with pinch zooming, but I can only test it on a device, not on the emulator.
Is there any way to test pinch zooming in the Android emulator - a shortcut key or anything else?
With a mouse:
Press and hold Ctrl, press and hold the left mouse button, and while doing that move your mouse.
With a trackpad:
Press and hold Ctrl, press and hold the trackpad, and move a finger LEFT and RIGHT (not up and down).
Since ADT 17, it is possible to use a physical Android 4.0 device to send multitouch gestures to the emulator. See Android's instructions here.
Multi-Touch
The emulator supports multi-touch input, as an experimental feature in r17, using a tethered Android device running the SdkControllerMultitouch application. The application contains an activity that monitors touch inputs and sends them to the emulator. This requires an Android 4.0 or later system image.
The activity displays the content of the emulator screens to help with interactivity. It is recommended to enable "show touches" in the Developer section of the Settings on the emulator to see exactly where the touches are sent.
The SdkControllerMultitouch application source code is located in
$SDK/tools/apps/SdkControllerMultitouch/
But this is still a workaround, since you need a real device anyway. My advice would be to test your app directly on a real device, as it's more robust and performs better.
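If you go this route, the "show touches" indicator mentioned in the quoted docs can also be toggled from the command line rather than through the Settings UI. On images that ship the settings shell command (roughly Android 4.2 and later), something like this should work:
adb -e shell settings put system show_touches 1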
I know it's a late reply, but this might save someone some time.
Double-click and then hold down the second click and move the mouse up to zoom out or down to zoom in.
Hope this works!
On a Mac running the latest version of Android Studio with a vanilla Nexus 5 API 24 emulator, all you have to do is keep Cmd (⌘) pressed.
The drag points will appear.
After that just left click and drag anywhere on the screen!
I know this is old but this might still help someone.
On Mac:
To zoom in, double-click (on the trackpad).
To zoom out, use Command + Shift + click (on the trackpad).
OK, I was experimenting a bit and figured it out on the BlueStacks Android emulator.
It is Ctrl+Mouse wheel.
Works for me, hope it helps you too.
There has been some progress in this area with Android tools release 17: you can use a device to control the emulator: http://developer.android.com/sdk/tools-notes.html.
Otherwise, it looks like testing on a real device is just better for this case (and in general it's also much faster).
For Mac: use the ⌥ Option key and then use your mouse or trackpad to pinch/zoom. This works on the Android Studio emulator.
You can find all the shortcuts in the Extended Controls menu by pressing the three dots and then navigating to "Help".
Click the three dots (the emulator's preferences button)
Settings
Change the "Send keyboard shortcuts to" parameter from "Virtual device" to "Emulator controls"
Now gestures work perfectly!
The Android emulator doesn't support multi-touch, so you can't test it there. Please use a real device for multi-touch testing.
For recent developments in this regard, read this: http://tools.android.com/tips/hardware-emulation
Pinch zoom relies on multi-touch.
Your code will only work if the device supports multi-touch; otherwise it won't.
As the Android emulator doesn't support multi-touch, you can't test it there.
The newer ADT does have a feature for this, but I have never tried it.
Ctrl + a click on the left side of the view screen will zoom out.
Double-clicking on a location will zoom in.
North and South -> scroll wheel up and down
West -> Ctrl + scroll up on the left side of the screen
East -> Ctrl + scroll up on the right side of the screen
On a MacBook with Android Studio Bumblebee, I have to hold Control first, which makes the drag pointers appear on the screen. Then all I need to do is double-tap, hold the second tap, and drag to zoom in/out.
As of March 1, 2016, for Android Studio's emulator on Windows, the pinch key is the Alt key. Hold down Alt, then click and drag your mouse.
The CTRL key might still work for you, but it doesn't work for me.
You are not able to test pinch zoom in the emulator. Please use a real device to test the pinch zoom effect.
I have not tried YouWave; maybe you can test pinch in that environment.
Not sure, but give it a try.
Enjoy coding!
I need to detect two-finger touch in the emulator while running my app. I used ScaleGestureDetector in my app and need to check it in the emulator. On the iPhone simulator, pressing Option lets us use two-finger touch. Is there any way to use ScaleGestureDetector in the emulator?
http://android-developers.blogspot.com/2012/03/updated-sdk-tools-and-adt-revision-17.html
"After adding webcam support and sensor emulation, we are adding experimental support for Multi-Touch input through a tethered Android device." ....
But it looks like "This requires an Android 4.0 or later system image."
If you want to use two-finger pan you need to right-click. On a Mac with a trackpad you would need to plug in a mouse or set your secondary click to the bottom corner; then you can hold the Command key. Otherwise, hold the Command key to see the finger indicators, then click and drag to pinch/zoom.
I've started creating a MonkeyRunner script. This is going OK, but whenever I add a MonkeyDevice::touch command I have to determine the input coordinates by trial and error: I guess at the coordinates I want to touch and see whether they hit the button I'm trying to test. That works, but it's a slow process. Is there any way to determine the coordinates of UI controls, perhaps from the layout XML files?
I found how to do it. Use the Pixel Perfect view within Eclipse to determine the x & y coordinates of the UI element. Here's a quick overview:
1) Eclipse must be running
2) Your Android device must be connected (either the real device via ADB, or the emulator)
3) Run the hierarchy viewer (in /tools)
4) Select "Inspect screenshot"
The Pixel Perfect view will launch automatically. Just place the cross-hairs on the UI element. The x and y coordinates, along with the RGB values, are displayed below.
Here's the URL that got me started: http://developer.android.com/guide/developing/debugging/debugging-ui.html#pixelperfect
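Once Pixel Perfect gives you the coordinates, they plug straight into the MonkeyRunner script. A minimal sketch - the 230/660 values are placeholders for whatever Pixel Perfect reported for your button:
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice
device = MonkeyRunner.waitForConnection()
# x, y taken from the Pixel Perfect cross-hairs for the button under test
device.touch(230, 660, MonkeyDevice.DOWN_AND_UP)
# give the UI a moment to react before the next step
MonkeyRunner.sleep(1.0)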
This post (monkeyrunner: interacting with views) may give you an idea of how to obtain the View's coordinates using AndroidViewClient.
On most Android versions you can enable Pointer location in Settings -> Developer Options. Once you enable it, it is easy to find the (x, y) coordinates.
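If you'd rather not dig through the Settings UI each time, the same toggle can usually be flipped over adb - pointer_location is a standard System settings key, though availability varies a little by Android version:
adb shell settings put system pointer_location 1
adb shell settings put system pointer_location 0
(1 turns the overlay on, 0 turns it off again.)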
You can also use the Hierarchy Viewer tool in your Android SDK's tools folder to take screenshots of the current screen and examine the image pixel by pixel to get exact coordinates.
For devices older than Android 4.0, see the paragraph following this one.
Android 4.0 and later include a Settings -> Developer options -> Pointer location toggle, which overlays a transparent ribbon across the top of the device screen with coordinates, velocities and touch-pressure readings, including swipe tracks and x/y crosshairs for the current touch location. This is a lot easier than using alternatives such as Monkey Recorder and other means.
In Android 4.2 and later, Developer options is hidden from the Settings menu and must be enabled by going to Settings -> About tablet and tapping on Build number seven times. One can only presume that Android hid Developer options because of the increasingly user-experience-affecting options it contains and the number of consumer calls/complaints to device makers from people who played with it or whose children played with it.
In older versions, which may not include a Pointer location toggle, there is an app on the Play Store (formerly the Android Market), Developers Tools. See the link here: https://play.google.com/store/apps/details?id=com.ggb.development
It shows up with a gear icon and the caption Dev Tools on the device and provides similar functionality to the Dev Tools app in an AVD (Android Virtual Device). That includes the Dev Tools -> Development settings -> No Pointer Location/Pointer Location radio-button toggle. Setting that toggle to Pointer Location provides exactly the same functionality built into Android 4.0 and later. The same application also has a more limited pointer setting at Dev Tools -> Pointer Location, which limits pointer location to a blank screen.
Enjoy!
Create an xyz.py file with the code below, connect the device, and run it from a terminal with monkeyrunner xyz.py. A window mirroring your device will open on the PC; when you click any button in the recorder it shows you the coordinates, which you can then export to a file and reuse.
from com.android.monkeyrunner import MonkeyRunner as mr
from com.android.monkeyrunner.recorder import MonkeyRecorder as recorder
# Wait for a device (or emulator) to be connected over adb
device = mr.waitForConnection()
# Open the recorder window; clicks in it report their coordinates and can be exported
recorder.start(device)
I found an adb approach. Use adb shell getevent -l to get a list of events, grep for ABS_MT_POSITION (which gets the lines with touch events in hex), and finally use awk to pull out the relevant hex values, strip the leading zeros and convert the hex to the decimal values MonkeyRunner uses. This is all done with the following:
adb shell getevent -l | grep ABS_MT_POSITION --line-buffered | awk '{a = substr($0,54,8); sub(/^0+/, "", a); b = sprintf("0x%s",a); printf("%d\n",strtonum(b))}'
This continuously prints the x and y coordinates in the terminal only when you press on the device.
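Note that strtonum is a gawk extension; if your awk doesn't have it, a small Python script can do the same conversion. This is only a sketch, and it assumes the field layout printed by adb shell getevent -l (e.g. "/dev/input/event2: EV_ABS ABS_MT_POSITION_X 0000021c"):
import subprocess
# Stream labelled input events from the device and print decimal touch coordinates
proc = subprocess.Popen(["adb", "shell", "getevent", "-l"],
                        stdout=subprocess.PIPE, universal_newlines=True)
for line in proc.stdout:
    parts = line.split()
    if len(parts) >= 4 and parts[2].startswith("ABS_MT_POSITION"):
        axis = "x" if parts[2].endswith("_X") else "y"
        print("%s=%d" % (axis, int(parts[3], 16)))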