I have an Android fingerprint implementation working and I was looking to add UI tests with Espresso. One problem I can't find a solution to is how to emulate the scanning of a finger. There is an adb command
adb -e emu finger touch which should work on emulators.
Any idea on how to integrate something like that with Espresso?
From this question, sending the command to an emulator is possible:
Runtime.getRuntime().exec("adb -e emu finger touch 1")
I expect, though I can't show anything working, that faking a fingerprint scan on a real device would require some special kind of security magic.
Edit: this doesn't work from within Espresso tests.
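Running the command from the host machine instead might be workable; a rough, untested sketch of the idea (the sleep is just a crude stand-in for waiting until the fingerprint dialog is showing):
# start the instrumented tests from the host
./gradlew connectedAndroidTest &
# crude wait for the test to reach the fingerprint dialog
sleep 20
# send a fake finger (ID 1) to the emulator
adb -e emu finger touch 1
wait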
Related
I am setting up a CI server which creates Android AVDs on the fly to run automated UI tests. This works great, but since the CI gets brand new emulators with brand new images each time a job is run, I get all of the Android welcome, first-run, and "do you agree" prompts. These break my tests.
Is there any way to have the emulator auto-accept or dismiss all of these prompts?
Here are two examples: the Chrome first-run screen and the Gboard first-use legal notice.
After much digging and experimentation, I've figured out a way to work around both of the prompts mentioned above. There isn't a catch-all solution, but here it is, piece by piece.
Chrome
Before starting up Chrome for the first time, run this command with adb:
./adb shell 'echo "chrome --disable-fre --no-default-browser-check --no-first-run" > /data/local/tmp/chrome-command-line'
Basically, this writes out a file to a known location which Chrome checks on startup. All of the flags specified in the command are obeyed, and those inherently disable all of the first-run prompts. This link was very helpful.
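To double-check that the file landed where Chrome expects it (just a sanity check, not part of the original steps), you can read it back:
./adb shell 'cat /data/local/tmp/chrome-command-line'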
Keyboard (Gboard)
The Android shell has a tool called ime to manage the input methods available on the device. By default, on newer devices in English, the input method is LatinIME. This is the Latin implementation of Gboard, which is what shows the legal prompt above.
The easiest solution is to select another keyboard which doesn't have this prompt. I used the old SoftKeyboard:
./adb shell 'ime set com.example.android.softkeyboard/.SoftKeyboard'
You may obtain a list of available keyboards, like so:
./adb shell 'ime list -a -s'
Final Result
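Putting the pieces together, a CI setup step might look like the following sketch, run once after the emulator boots and before the tests start (it assumes the SoftKeyboard package is already installed on the image):
# disable Chrome's first-run experience and default-browser prompt
./adb shell 'echo "chrome --disable-fre --no-default-browser-check --no-first-run" > /data/local/tmp/chrome-command-line'
# switch to a keyboard without a first-use legal prompt
./adb shell 'ime set com.example.android.softkeyboard/.SoftKeyboard'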
I am using appium_lib to automate my tests on both iOS and Android.
Currently, I am maintaining two suites, one for Android and another for iOS.
Is there a provision in Cucumber with the appium_lib gem to maintain the cases for both iOS and Android in a single suite, and to let Appium automatically detect which device is connected and execute the test cases tagged accordingly?
For example, if Appium detects an iOS device, run the test cases tagged for iOS: cucumber -t @ios
I understand env.rb could help here, and that in hooks.rb, inside a Before hook, I could provide an if/else condition such as:
if $driver.device_is_android?
  'cucumber -t @Android'
else
  'cucumber -t @iOS'
end
Something like this? Is it possible? I am unable to find information on the internet about auto-detecting the device OS, hence I posted this as a question with my thoughts.
I understand I can read the OS version with the command adb shell getprop | grep build.version.release, and hence wanted to know if it's possible to detect the OS with one common command, apart from $driver.device_is_android?
Now, to run the Cucumber cases for iOS and Android, I need to pass individual capabilities to Appium. Is there a way to do that programmatically in the if/else condition itself?
I also ran into this dilemma, but I solved it differently.
In our case we are using Cucumber on top of Appium, so I just execute it with an argument that represents my platform, for example:
cucumber PLATFORM=iOS
Then in my env.rb I get the argument:
@platform = ENV['PLATFORM']
And then I just use it to get the capabilities I want:
if @platform == "DROID"
  Appium::Driver.new(droid_caps, true)
else
  Appium::Driver.new(ios_caps, true)
end
When you are running a test you always know which platform you are testing so it makes sense to send it as an argument.
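Combining this with the tags from the question (assuming the features are tagged @android and @ios), the two invocations would look something like:
# Android device connected: run Android-tagged features with the Android capabilities
cucumber PLATFORM=DROID -t @android
# iOS device connected: run iOS-tagged features with the iOS capabilities
cucumber PLATFORM=iOS -t @ios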
After checking some answers:
Is there a way to unlock android phone via adb, if I KNOW the pattern and
Unlock Screen Galaxy Nexus ADB
I'm trying to unlock a pattern using adb commands with this script https://github.com/mattwilson1024/android-pattern-unlock/blob/master/unlock.sh (for automation purposes).
Unfortunately the events aren't working, but I noticed that if I turn on the screen myself (without using adb shell input keyevent 26) while the script is running, the events work and the pattern unlocks.
Could someone explain to me why this is happening, and whether there is another way to unlock patterns? Maybe without using events (as I wrote before, this is for automation purposes, not for a phone that was locked).
Can you please try the following capabilities in your code?
We can use capabilities to set the unlockType and unlockKey directly.
unlockType: one of 'pin', 'password', 'pattern', 'fingerprint'
unlockKey: the pattern expressed as grid positions (the dots are numbered 1 to 9, left to right, top to bottom); to draw an 'L', for example, the key would be 1478.
Let me know if this doesn't work.
Appium version - 1.6.4
Reference - https://github.com/appium/appium-android-driver/blob/master/docs/UNLOCK.md
This feature is available in the latest Appium release.
If you, for some reason, need to use Matt Wilson's script, this is not going to help you, but if your goal is to unlock your phone with a pattern lock via your computer, try Vysor, a Chrome extension that displays your phone's screen (including the lock screen) on your computer, allowing you to enter your lock pattern using the mouse or (if you have a touch display) your finger. For me, it worked fine, as I explain in more detail here.
Is it possible to create some kind of scripting with Genymotion?
I know there are shell commands, but I also need to simulate user touches (using adb input).
The idea is to create some simple tests for my app, where the script will execute certain shell commands and adb commands as well.
Thanks
As Genymotion VMs are treated as physical devices by ADB, you can use MonkeyRunner.
This tool, provided by Google, allows you to send touch events, among other things.
You can script many inputs using Python. Look at this gist, from this StackOverflow post; it gives a good example of a complete gesture.
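If you only need simple taps and swipes (the adb input approach mentioned in the question), those commands also work directly against a Genymotion VM; a minimal sketch with placeholder coordinates:
# tap at screen coordinates (500, 800)
adb shell input tap 500 800
# swipe from (300, 1000) to (300, 200), e.g. to scroll a list
adb shell input swipe 300 1000 300 200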
I'm trying to script a tap and hold on an Android device and I haven't worked out how.
I tried playing with the input command options but couldn't find anything relating to holding.
I've also looked at MonkeyRunner and could successfully get the desired effect from a computer with the Android device connected, but couldn't run monkeyrunner on the device itself, without a computer.
Is there a way to script a tap and hold/tap down only/long tap on an Android device (I'm just using the shell for now)? If so, how?
There are several ways to inject events. You can use Robotium or MonkeyRunner for testing. Or you can try to inject events directly this way.
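If you want to stay in the on-device shell, one trick (my addition, not from the original answer): on newer Android versions, input swipe accepts an optional duration in milliseconds, and a swipe from a point to the same point behaves like a long press:
# long-press at (500, 800) by "swiping" in place for 1000 ms
input swipe 500 800 500 800 1000
# the same thing from a connected computer
adb shell input swipe 500 800 500 800 1000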