We are building an Android app that is tested with Appium. Now I would like to see the test coverage of our Appium tests.
I think this is possible because JaCoCo supports offline instrumentation (http://www.eclemma.org/jacoco/trunk/doc/offline.html).
And the documentation of the JaCoCo Gradle plugin even says:
While all tasks of type Test are automatically enhanced to provide coverage information when the java plugin has been applied, any task that implements JavaForkOptions can be enhanced by the JaCoCo plugin. That is, any task that forks Java processes can be used to generate coverage information.
see https://docs.gradle.org/current/userguide/jacoco_plugin.html
But how do I have to write the build.gradle so that our acceptance debug flavor is instrumented and the exec file is written to the smartphone when the Appium tests (or even manual test cases) are executed?
Because then I can extract the exec file and send it to SonarQube for further analysis.
Thanks
Ben
Finally I managed to get it working, and I want to share the solution with you:
Enable instrumentation for your build type and configure SonarQube accordingly, e.g.:
...
apply plugin: 'jacoco'
...
android {
    ...
    productFlavors {
        acceptance {
            applicationId packageName + ".acceptance"
        }
    }
    buildTypes {
        debug {
            testCoverageEnabled true
        }
    }
}
sonarRunner {
    sonarProperties {
        property "sonar.host.url", "..."
        property "sonar.jdbc.url", sonarDatabaseUrl
        property "sonar.jdbc.driverClassName", sonarDatabaseDriverClassName
        property "sonar.jdbc.username", sonarDatabaseUsername
        property "sonar.jdbc.password", sonarDatabasePassword
        property "sonar.sourceEncoding", "UTF-8"
        property "sonar.sources", "src/main"
        property "sonar.tests", "src/test"
        property "sonar.inclusions", "**/*.java,**/*.xml"
        property "sonar.import_unknown_files", "true"
        property "sonar.java.binaries", "build/intermediates/classes/acceptance/debug"
        property "sonar.junit.reportsPath", "build/test-results/acceptanceDebug"
        property "sonar.android.lint.report", "build/outputs/lint-results.xml"
        property "sonar.java.coveragePlugin", "jacoco"
        property "sonar.jacoco.reportPath", "build/jacoco/testAcceptanceDebugUnitTest.exec"
        // see steps below on how to get that file:
        property "sonar.jacoco.itReportPath", "build/jacoco/jacoco-it.exec"
        property "sonar.projectKey", projectKey
        property "sonar.projectName", projectName
        property "sonar.projectVersion", appVersionName
    }
}
Add the following to your AndroidManifest.xml:
<receiver
    android:name=".util.CoverageDataDumper"
    tools:ignore="ExportedReceiver">
    <intent-filter>
        <action android:name="org.example.DUMP_COVERAGE_DATA"/>
    </intent-filter>
</receiver>
CoverageDataDumper should look like this:
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import java.io.File;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class CoverageDataDumper extends BroadcastReceiver {

    private static final Logger LOG = LoggerFactory.getLogger( CoverageDataDumper.class );

    @Override
    public void onReceive( Context context, Intent intent ) {
        try {
            // The JaCoCo runtime is accessed via reflection under its EMMA-compatible name.
            // App.getContext() is an application-provided static context helper.
            Class
                .forName( "com.vladium.emma.rt.RT" )
                .getMethod( "dumpCoverageData", File.class, boolean.class, boolean.class )
                .invoke( null,
                    new File( App.getContext().getExternalFilesDir( null ) + "/coverage.ec" ),
                    true,  // merge
                    false  // stopDataCollection
                );
        }
        catch ( Exception e ) {
            LOG.error( "Error when writing coverage data", e );
        }
    }
}
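For manual test cases you should also be able to trigger the dump without Appium by sending the broadcast via adb, e.g. adb shell am broadcast -a org.example.DUMP_COVERAGE_DATA (on newer Android versions you may have to target the receiver explicitly with -n).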
Then run your Appium test cases against the acceptance flavor app (with instrumented classes). Before you call "Reset App" or "Close Application", make sure to call the following methods (just a draft, but I think you get the idea):
// intent is "org.example.DUMP_COVERAGE_DATA"
public void endTestCoverage( String intent ) {
    if ( driver instanceof AndroidDriver ) {
        ((AndroidDriver) driver).endTestCoverage( intent, "" );
    }
}

public void pullCoverageData( String outputPath ) {
    String coverageFilePath = (String) appiumDriver.getCapabilities().getCapability( "coverageFilePath" );
    if ( coverageFilePath != null ) {
        byte[] log = appiumDriver.pullFile( coverageFilePath );
        MobileAppLog.writeLog( new File( outputPath ), log );
    }
    else {
        throw new AppiumLibraryNonFatalException(
            "Tried to pull the coverage data, but the coverageFilePath wasn't specified." );
    }
}
outputPath could be for example: /sdcard/Android/data/org.example.acceptance/files/coverage.ec
Now the JaCoCo data is written to the smartphone. Next we need to download that file. You can use
appiumDriver.pullFile( logFilePath );
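For example, a minimal sketch of that download step (the helper name and the use of java.nio are my own, not part of the original code; the device path is the one CoverageDataDumper writes to, and AppiumDriver.pullFile() returns the file contents as a byte array):
// Hypothetical helper: pull the coverage file from the device and store a local copy.
public void downloadCoverageFile( AppiumDriver appiumDriver ) throws IOException {
    byte[] coverageData = appiumDriver.pullFile(
        "/sdcard/Android/data/org.example.acceptance/files/coverage.ec" );
    java.nio.file.Files.write( java.nio.file.Paths.get( "jacoco-it.exec" ), coverageData );
}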
Now you need to copy the file "jacoco-it.exec" (which should always be appended to when you pull the file) into build/jacoco/jacoco-it.exec (see the build.gradle above) and run
gradlew sonarRunner
In SonarQube, add the Integration Test Coverage widget and you should now see some values...
Unfortunately code coverage won't work if you are using Retrolambda (as we do). Retrolambda generates anonymous classes which are not part of the source files, so SonarQube cannot match them correctly and displays a much lower code coverage than there actually is. If someone finds a solution for that, I would be very happy :-)
I solved this problem by adding a broadcast receiver to the application under test. (You can add the receiver to the debug source set only, since it doesn't need to exist in the main sources.)
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.util.Log;
import java.io.File;

public class CoverageReceiver extends BroadcastReceiver {

    private static final String EXEC_FILE_PATH = "/mnt/sdcard/coverage.exec";
    private static final String TAG = "CoverageJacoco";
    private static final String BROADCAST_RECEIVED_MESSAGE = "EndJacocoBroadcast broadcast received!";
    private static final String EMMA_CLASS = "com.vladium.emma.rt.RT";
    private static final String EMMA_DUMP_METHOD = "dumpCoverageData";

    @Override
    public void onReceive(Context context, Intent intent) {
        try {
            Log.d(TAG, BROADCAST_RECEIVED_MESSAGE);
            // Dump the collected coverage data to EXEC_FILE_PATH (merge = true, stopDataCollection = false).
            Class.forName(EMMA_CLASS)
                    .getMethod(EMMA_DUMP_METHOD, File.class, boolean.class, boolean.class)
                    .invoke(null, new File(EXEC_FILE_PATH), true, false);
        } catch (Exception e) {
            Log.d(TAG, e.getMessage());
        }
    }
}
In the manifest add (you can put this in the debug source set so it won't exist in the main sources):
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android">

    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

    <application>
        <receiver android:name=".CoverageReceiver">
            <intent-filter>
                <action android:name="com.example.action" />
            </intent-filter>
        </receiver>
    </application>

</manifest>
In the build.gradle of the application I added
apply plugin: 'jacoco'

jacoco {
    toolVersion = "0.7.4+"
}

model {
    android {
        compileSdkVersion 23
        buildToolsVersion "23.0.2"
        defaultConfig {
            applicationId "com.example.app"
            minSdkVersion.apiLevel 23
            targetSdkVersion.apiLevel 23
            versionCode 12
            versionName "1.11"
        }
        buildTypes {
            debug {
                testCoverageEnabled true
            }
        }
    }
}
Build your application as debug, then install and run it.
Send a broadcast through ADB ("adb shell am broadcast -a com.example.action") to create coverage.exec.
Pull the coverage file from the device: adb pull /mnt/sdcard/coverage.exec
After you have run this, you need to create the coverage report from the file:
/**
 * This task is used to create a code coverage report via the JaCoCo tool.
 */
task jacocoTestReport(type: JacocoReport) {
    def coverageSourceDirs = [
            'src/main/java',
    ]
    group = "Reporting"
    description = "Generates Jacoco coverage reports"
    reports {
        csv.enabled false
        xml {
            enabled = true
            destination "${buildDir}/jacoco/jacoco.xml"
        }
        html {
            enabled true
            destination "${buildDir}/jacocoHtml"
        }
    }
    classDirectories = fileTree(
            dir: 'build/intermediates/classes',
            excludes: ['**/R.class',
                       '**/R$*.class',
                       '**/BuildConfig.*',
                       '**/Manifest*.*',
                       '**/*Activity*.*',
                       '**/*Fragment*.*'
            ]
    )
    sourceDirectories = files(coverageSourceDirs)
    executionData = files('build/coverage.exec')
}
This task is one way to create the coverage reports.
In coverageSourceDirs add all the locations of your application's source code, so the task knows which code to base the coverage on.
executionData is the location where you put the coverage.exec you pulled from the device.
Run the task (e.g. gradlew jacocoTestReport).
The HTML and XML files will be created (you can also enable CSV); note that they are created in the build folder of the application.
Be aware that you must run the task against the same code from which you built the debug version of your application.
Related
I tried the Android code as given in the docs.
I am unable to exclude the files.
testOptions {
    unitTests.all {
        if (name == "testDebugUnitTest") {
            kover {
                disabled = false
                binaryReportFile.set(file("$buildDir/custom/debug-report.bin"))
                // includes = ['com.example.*']
                excludes = [
                        "com.makeappssimple.abhimanyu.financemanager.android.navigation.di.NavigationManagerModule"
                ]
            }
        }
    }
}
I expect this code to exclude the com.makeappssimple.abhimanyu.financemanager.android.navigation.di.NavigationManagerModule file, but it is not working.
I also tried with wildcard names.
Kover setup:
plugins {
    id "com.android.application"
    id "kotlin-android"
    id "kotlin-kapt"
    id "dagger.hilt.android.plugin"
    id "com.google.gms.google-services"
    id "org.jetbrains.kotlinx.kover" version "0.5.0"
}

// Kover
kover {
    disabled = false // true to disable instrumentation of all test tasks in all projects
    coverageEngine.set(kotlinx.kover.api.CoverageEngine.INTELLIJ) // change instrumentation agent and reporter
    intellijEngineVersion.set('1.0.656') // change version of IntelliJ agent and reporter
    jacocoEngineVersion.set('0.8.7') // change version of JaCoCo agent and reporter
    generateReportOnCheck = true // false to not execute `koverMergedReport` task before `check` task
    disabledProjects = [] // ["project-name"] or [":project-name"] to disable coverage for project with path `:project-name` (`:` for the root project)
    instrumentAndroidPackage = false // true to instrument packages `android.*` and `com.android.*`
    runAllTestsForProjectTask = false // true to run all tests in all projects if `koverHtmlReport`, `koverXmlReport`, `koverReport`, `koverVerify` or `check` tasks executed on some project
}
P.S.: I have also raised the same issue here.
You can also try to add a filter to the Kover plugin, like this (version 0.6.0):
kover {
    instrumentation {
        excludeTasks.add("testReleaseUnitTest")
    }
    filters {
        classes {
            excludes += listOf(
                "dagger.hilt.internal.aggregatedroot.codegen.*",
                "hilt_aggregated_deps.*",
                "*ComposableSingletons*",
                "*_HiltModules*",
                "*Hilt_*",
                "*BuildConfig",
                ".*_Factory.*",
            )
        }
    }
}
Thanks to shanshin's comment, I understood the issue.
I fixed the unit test coverage report exclusion list using this code:
tasks.koverHtmlReport {
    excludes = [
            // Hilt
            "*.di.*",
            "dagger.hilt.**",
            "hilt_aggregated_deps.*",
            "<package_name>.*.*_Factory",
            // Room
            // MyRoomDatabase_AutoMigration_*_Impl, *Dao_Impl
            "<package_name>.*.*_Impl*",
            // BuildConfig
            "<package_name>.BuildConfig",
            // Moshi - Json Adapter
            "<package_name>.*.*JsonAdapter",
    ]
}
The exclusion list mentioned in the question is to exclude tests.
I have 5 different apps in my monorepo. The build.gradle for each of these apps contains a nearly identical task which, along with an upload script, is in charge of uploading matching files to our Jenkins file server.
We can call this task copyReleaseBuilds().
This code is in a file called artifactUploads.gradle, which also contains a custom plugin called UploadPlugin. Now my concern is with the System.out.println() line in the task.
import static groovy.io.FileType.FILES

apply plugin: UploadPlugin

task copyReleaseBuilds() {
    doLast {
        def dir = project.file('build/outputs/apk')
        dir.eachFileRecurse(FILES) { file ->
            if (file.getName().contains(".apk") && file.getName().contains("release")) {
                System.out.println("uploading ${file.getName()}")
                def rootDir = project.getRootDir().path
                def filePath = file.path
                def fileName = file.name.replace('app-', '')
                exec {
                    workingDir rootDir
                    // dirPath is expected to come from the 'directory' DSL in each app's build.gradle
                    commandLine "./uploadBuild.sh", filePath, "$dirPath" + fileName
                }.assertNormalExitValue()
            }
        }
    }
}
class UploadPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Create container instance for config objects
        // plugin stuff
        NamedDomainObjectContainer<Config> configContainer =
                project.container(Config)
        project.extensions.add('directory', configContainer)
        project.task('uploadConfig') << {
            def uploadConfig = project.extensions.getByName('directory')
        }
    }
}

class Config {
    String name
    String dirPath

    Config(final String name) {
        this.name = name
    }
}
In the build.gradle file for each of my apps/projects, I have the line
apply from: "${rootDir}/artifactUploads.gradle"
along with
directory {
    name {
        dirPath = "hca/"
        println "stuff"
    }
}
directory is an object that is created in my custom plugin (UploadPlugin).
Now when I build, the line println "stuff" is output to the Gradle log; however, I cannot find any console output from artifactUploads.gradle, which contains my custom plugin and the task that all my apps share.
From what I understand, since each of my apps' build.gradle has
apply from: artifactUploads.gradle, each app should now run the copyReleaseBuilds() task with the custom parameter dirPath, which is set in the DSL part of each app's build.gradle.
Is there another reason the build would not be hitting that System.out.println()? I'm very new to Gradle in general, so thanks for your help.
I'm trying to configure an Android library project to deploy multiple artifacts to a locally hosted Maven repository. I've gotten far enough that both artifacts have their own POM generated and everything gets deployed properly to the repo, with the following script:
android {
    // Publish both debug and release
    publishNonDefault true
}
uploadArchives {
    repositories.mavenDeployer {
        def majorVersion = 1
        def minorVersion = 1
        def buildVersion = project.properties.get('RELEASE', '0').toInteger()

        addFilter('release') { artifact, file ->
            file.name.contains('release')
        }
        addFilter('debug') { artifact, file ->
            file.name.contains('debug')
        }

        activePomFilters.each { filter ->
            pom(filter.name) {
                groupId = 'com.redacted'
                artifactId = 'redacted'
                packaging = 'aar'
                version = "${majorVersion}.${minorVersion}.${buildVersion}"
                if (!project.hasProperty('RELEASE')) {
                    version += "-SNAPSHOT"
                }
                if (filter.name == 'debug') {
                    artifactId += '-debug'
                }
            }
        }
    }
}
The expected delivery is:
com/
  redacted/
    redacted/
      1.1.0-SNAPSHOT/
    redacted-debug/
      1.1.0-SNAPSHOT/
Which happens as expected, but it seems to publish the artifacts with an additional suffix (which breaks the dependency discovery), and I cannot figure out where it is coming from, or how to change it. What I see is:
com/redacted/redacted/1.1.0-SNAPSHOT/
redacted-1.1.0-20150717.213849-1-release.aar
redacted-1.1.0-20150717.213849-1-release.aar.md5
redacted-1.1.0-20150717.213849-1-release.aar.sha1
redacted-1.1.0-20150717.213849-1.pom
redacted-1.1.0-20150717.213849-1.pom.md5
redacted-1.1.0-20150717.213849-1.pom.sha1
For some reason, it's appending the date, as well as a -release suffix to only the AAR-related files, but not the POM files. If I manually rename these files, everything works as expected. For example, this is what I expect to be output:
com/redacted/redacted/1.1.0-SNAPSHOT/
redacted-1.1.0-20150717.213849-1.aar
redacted-1.1.0-20150717.213849-1.aar.md5
redacted-1.1.0-20150717.213849-1.aar.sha1
redacted-1.1.0-20150717.213849-1.pom
redacted-1.1.0-20150717.213849-1.pom.md5
redacted-1.1.0-20150717.213849-1.pom.sha1
How can I change how these files are delivered?
What you are running into is this (emphasis mine):
Important: When enabling publishing of non default, the Maven publishing plugin will publish these additional variants as extra packages (with classifier). This means that this is not really compatible with publishing to a maven repository. You should either publish a single variant to a repository OR enable all config publishing for inter-project dependencies.
See the documentation: http://tools.android.com/tech-docs/new-build-system/user-guide#TOC-Library-Publication
The suffixes release and debug that you see are the classifiers introduced by enabling publishing of non-default artifacts. The <artifact> elements in build/ivy.xml, which is used as the basis for the Maven configuration, contain these classifiers.
Iterating over the artifacts in the configurations and removing the classifier does not work. Although setting the classifier is allowed, its original value is kept.
But what does work is wrapping the original artifacts. The wrapper will always return null for a classifier. This does result in the release and debug artifact having the same fully-qualified ID (= name + classifier), which results in only one artifact being published. This can be fixed by using a different name for debug artifacts:
class UnclassifiedPublishArtifact implements PublishArtifact {
    private PublishArtifact delegatee;
    private boolean isDebugArtifact;

    UnclassifiedPublishArtifact(PublishArtifact delegatee, boolean isDebugArtifact) {
        this.delegatee = delegatee
        this.isDebugArtifact = isDebugArtifact
    }

    @Override
    String getName() {
        return delegatee.name + (isDebugArtifact ? '-debug' : '')
    }

    @Override
    String getExtension() {
        return delegatee.extension
    }

    @Override
    String getType() {
        return delegatee.type
    }

    @Override
    String getClassifier() {
        return null
    }

    @Override
    File getFile() {
        return delegatee.file
    }

    @Override
    Date getDate() {
        return delegatee.date
    }

    @Override
    TaskDependency getBuildDependencies() {
        return delegatee.buildDependencies
    }
}
project.afterEvaluate {
    configurations.each { configuration ->
        def artifacts = configuration.artifacts
        if (!artifacts.isEmpty()) {
            def unclassifiedArtifacts = []
            unclassifiedArtifacts.addAll(artifacts.collect { classifiedArtifact ->
                new UnclassifiedPublishArtifact(classifiedArtifact, classifiedArtifact.classifier == 'debug')
            })
            artifacts.clear()
            artifacts.addAll(unclassifiedArtifacts)
        }
    }
}
I can't quite understand from the documentation what the consequences are for project dependencies, so you should check if these still work.
I declared this function in my Android project build.gradle:
import groovy.json.JsonSlurper

def remoteGitVersion() {
    def jsonSlurper = new JsonSlurper()
    def object = jsonSlurper.parse(new URL("https://api.github.com/repos/github/android/commits"))
    assert object instanceof List
    return object[0].sha
}
And this flavor:
android {
...
productFlavors {
internal {
def lastRemoteVersion = remoteGitVersion()
buildConfigField "String", "LAST_REMOTE_VERSION", "\"" + lastRemoteVersion + "\""
}
...
}
...
}
Now, due to Gradle's declarative nature, the remoteGitVersion function is executed every time the project is built, no matter whether the build flavor is internal or something else. So the GitHub API call quota is consumed and, after a little while, I receive a nice "forbidden" message.
How can I avoid this? Is it possible to execute the function only when the selected flavor is the right one?
Took reference from here:
In Android/Gradle how to define a task that only runs when building specific buildType/buildVariant/productFlavor (v0.10+)
To recap:
1. Wrap your flavor-specific logic into a task:
task fetchGitSha << {
    android.productFlavors.internal {
        def lastRemoteVersion = remoteGitVersion()
        buildConfigField "String", "LAST_REMOTE_VERSION", "\"" + lastRemoteVersion + "\""
    }
}
2. Make sure the task is called whenever you build your variant, and only then.
In your case, you could hook into assembleInternalDebug:
tasks.whenTaskAdded { task ->
    if (task.name == 'assembleInternalDebug') {
        task.dependsOn fetchGitSha
    }
}
3. Make sure to remove the dynamic stuff from your flavor definition:
productFlavors {
    internal {
        // no buildConfigField here
    }
}
Hope that helps.
I'm currently migrating a normal Android Studio (IntelliJ) project to a Gradle project, and I've encountered a problem: IntelliJ gives me a warning at the beginning of every file saying that my 'package name does not correspond to the file path'. E.g.
The first line of my some/prefixes/a/b/c/d/E.java is:
package a.b.c.d;
....
IntelliJ thinks the package name should be 'c.d' instead of 'a.b.c.d', because I set
sourceSets {
    main.java.srcDirs = ["some/prefixes/a/b"]
}
in the module's build.gradle.
I know I could do the change below to make IntelliJ happy:
sourceSets {
    main.java.srcDirs = ['some/prefixes']
}
But I can't do that because there are a huge number of projects under 'some/prefixes' and I definitely don't want to introduce all of them into this module.
I used to add packagePrefix="a.b" in my 'module.iml' in my original Android Studio project, and it worked well:
https://www.jetbrains.com/idea/help/configuring-content-roots.html#d2814695e312
But I don't know how to accomplish a similar fix after migrating to a Gradle project.
I ended up writing a task for Gradle.
The task adds the famous packagePrefix to the *.iml file.
This solution only works for IntelliJ; I hope someone has a better solution.
task addPackagePrefix << {
    println 'addPackagePrefix'
    // MODULE_NAME, SRC_DIR and PACKAGE_NAME are placeholders for your module name,
    // source directory URL suffix and package prefix
    def imlFile = file(MODULE_NAME + ".iml")
    if (!imlFile.exists()) {
        println 'no module found'
        return
    }
    def parsedXml = (new XmlParser()).parse(imlFile)
    if (parsedXml.component[1] && parsedXml.component[1].content) {
        parsedXml.component[1].content.findAll { Node node ->
            node.sourceFolder.findAll { Node s ->
                def url = s.attribute("url").toString()
                if (url.endsWith(SRC_DIR)) {
                    println 'Node found'
                    def attr = s.attribute('packagePrefix')
                    if (attr == null) {
                        // add prefix
                        println 'Adding package prefix'
                        s.attributes().put('packagePrefix', PACKAGE_NAME)
                        println s.toString()
                        // write the modified XML back to the .iml file
                        def writer = new StringWriter()
                        new XmlNodePrinter(new PrintWriter(writer)).print(parsedXml)
                        imlFile.text = writer.toString()
                    }
                }
            }
        }
    }
}
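You can run the task manually with gradlew addPackagePrefix; as far as I know, the .iml file may be regenerated on the next Gradle sync, so the task would have to be run again afterwards.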