Android Studio Gradle - JavaExec classpath configuration - Java Reflection - class access - android

I'm currently working on an Android project where I have to process .java files to possibly generate other .java files, which should then be compiled and packed into the .apk file.
Let's assume I have two files which will be processed by my library, FILE_A.java and FILE_B.java.
Now I need to access these files within my library via reflection, e.g. with:
Class.forName("com.test.entities.FILE_A");
Class.forName("com.test.entities.FILE_B");
The problem is that I'm not able to access the class files, I think because of a missing classpath configuration. Currently I use this task to call my .jar file:
task (mytask, type: org.gradle.api.tasks.JavaExec) {
    classpath(files('libs/myjar.jar'))
    main('com.test.TestMain')
}
preBuild.dependsOn mytask
I found some resources on the web, but none of them work.
I tried to add the following to the classpath:
sourceSets.main.runtimeClasspath (main is unknown)
android.sourceSets.main.runtimeClasspath (runtimeClasspath is unknown)
So how can I access the class files in my library?

Try this
task execute(dependsOn: ['compileReleaseJavaWithJavac'], type: JavaExec) {
    main = 'com.geniml.Main'
    classpath(files('build/intermediates/classes/release',
            "${android.getSdkDirectory().getAbsolutePath() + '/platforms/' + android.compileSdkVersion + '/android.jar'}"))
}
The Android Gradle plugin version is 1.5.
As for the build directory, you can use rootProject.getBuildDir(), but the build directory is normally just a convention, so a static path is fine.
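For reference, a minimal sketch that merges the two snippets above, replacing the hard-coded build path with the buildDir property mentioned here (the intermediate directory layout and task name are assumptions matching the AGP 1.5 era and may differ in newer plugin versions):
task mytask(type: JavaExec, dependsOn: 'compileReleaseJavaWithJavac') {
    main = 'com.test.TestMain'
    // the library jar plus the compiled app classes and android.jar on the classpath
    classpath files(
            'libs/myjar.jar',
            "${buildDir}/intermediates/classes/release",
            "${android.getSdkDirectory().absolutePath}/platforms/${android.compileSdkVersion}/android.jar")
}
With the compiled classes directory on the classpath, Class.forName("com.test.entities.FILE_A") should resolve inside the JavaExec process.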

Related

How to generate OpenAPI sources from gradle when building Android app

What I'm trying to achieve
I'm trying to generate my REST API client for Android using OpenAPI Generator from the build.gradle script. That way, I wouldn't have to run the generator command line every time the specs change. Ideally, this would be generated when I build/assemble my app, and the sources would end up in the java (generated) folder, where generated sources are then accessible from the code (this is what happens with the BuildConfig.java file for example).
What I've tried so far
Following this link from their official GitHub, here's the build.gradle file I ended up with:
apply plugin: 'com.android.application'
apply plugin: 'org.openapi.generator'
...
openApiValidate {
    inputSpec = "$rootDir/app/src/main/openapi/my-api.yaml"
    recommend = true
}
openApiGenerate {
    generatorName = "java"
    inputSpec = "$rootDir/app/src/main/openapi/my-api.yaml"
    outputDir = "$buildDir/generated/openapi"
    groupId = "$project.group"
    id = "$project.name-openapi"
    version = "$project.version"
    apiPackage = "com.example.mypackage.api"
    invokerPackage = "com.example.mypackage.invoker"
    modelPackage = "com.example.mypackage.model"
    configOptions = [
            java8      : "true",
            dateLibrary: "java8",
            library    : "retrofit2"
    ]
}
...
First, I've never managed to get the API generated with the build/assemble task, even when I tried adding:
compileJava.dependsOn tasks.openApiGenerate
or
assemble.dependsOn tasks.openApiGenerate
The only way I could generate the sources was by manually triggering the openApiGenerate task.
Then, when I do generate my sources this way, they end up in the build folder but aren't accessible from my code, and aren't visible in the java (generated) folder.
I then have to manually copy/paste the generated source files to my project sources in order to use the API.
Even though I'm able to work around these issues by adding manual procedures, it would be way more maintainable if the whole process was simply automatic. I was able to achieve a similar result with another tool, Protobuf. Indeed, my gradle task gets triggered every time I build the app, and the sources end up in the java (generated) folder, so I don't have to do any additional work. The task is much simpler though, so I assume the main work that I'm not able to replicate with OpenAPI Generator is handled by the Protobuf plugin itself.
You have to specify the path to the generated sources as a custom source set for your Gradle module, which is app in this case, as described here – https://developer.android.com/studio/build/build-variants#configure-sourcesets. That way Gradle will treat your sources as accessible from your code.
Something like this:
android {
    ...
    sourceSets {
        main {
            java.srcDirs = ['build/generated/openapi/src/main/java']
        }
    }
    ...
}
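If the generation should also run on every build instead of only when openApiGenerate is triggered manually, a task dependency can be wired up as well. A minimal Groovy DSL sketch (hooking preBuild is an assumption; the answers below hook compileKotlin or compileDebugJavaWithJavac instead):
// run the generator before the rest of the Android build
preBuild.dependsOn tasks.openApiGenerate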
I solved the issue you described like this; note that I'm using the Gradle Kotlin DSL, however.
See my build.gradle.kts:
plugins {
    // Your other plugins
    id("org.openapi.generator") version "5.3.0"
}
openApiGenerate {
    generatorName.set("kotlin")
    inputSpec.set("$rootDir/app/src/main/openapi/my-api.yaml")
    outputDir.set("$buildDir/generated/api")
    // Your other specification
}
application {
    // Your other code
    sourceSets {
        main {
            java {
                // TODO: Set this path according to what was generated for you
                srcDir("$buildDir/generated/api/src/main/kotlin")
            }
        }
    }
}
tasks.compileKotlin {
    dependsOn(tasks.openApiGenerate)
}
You need to build the application at least once for the IDE to detect the library (at least this is the case for me in IntelliJ).
Your build should automatically generate the OpenAPI classes. To refer to the generated classes in your Java project, you should add the generated class path to your source directory, as mentioned in the other answers:
https://developer.android.com/studio/build/build-variants#configure-sourcesets
As far as the task dependency goes, in Android the tasks are generated after the configuration phase, so for Gradle to recognize the task, wrap the dependency inside an afterEvaluate block like this:
afterEvaluate {
    tasks.compileDebugJavaWithJavac.dependsOn(tasks.openApiGenerate)
}
I had this issue, and this answer https://stackoverflow.com/a/55646891/14111809 led me to a more informative error:
error: incompatible types: Object cannot be converted to Annotation
@java.lang.Object()
Taking a look at the generated files that were causing this error, I noticed:
import com.squareup.moshi.Json;
After including Moshi in the app build.gradle, the build succeeded and the generated code was accessible:
implementation("com.squareup.moshi:moshi-kotlin:1.13.0")

Testing inconvenience: Android Studio JUnit vs Gradle based: testOptions ignored by Android Studio

The following was done with Android Studio 3.4, Android Gradle Plugin 3.3.2 and Gradle 4.10.3.
In the build.gradle file, I have configured some unit test options like this:
android {
    testOptions {
        unitTests.all {
            systemProperty "debug", "true"
        }
    }
}
I do have a test function that tries to read this property:
package com.demo;

import org.junit.Test;

public class SysPropTestDemo {
    @Test
    public void dumpSysProps() {
        System.out.println("sysprop(debug)=" + System.getProperty("debug"));
    }
}
When run via the command line with gradlew test --tests com.demo.SysPropTestDemo, I get the property debug set correctly to true. If I run the same test via Android Studio without setting any options, the value shown is null.
In order to get the same result from Android Studio, I explicitly have to enter some values in the "Run/Debug Configurations" panel, i.e. something like -Ddebug=true in the VM options.
Now this is a trivial example, but what I really want to do is add some path to the java.library.path property in order to be able to load a JNI library compiled within the project. (I do need to write some tests that make use of a modified SQLite lib, so not using JNI is not an option here.)
It does work when setting additional options, but I find this very inconvenient, since I can't enter a variable-based value in the configuration options (or at least, I don't know how to). To sum it up: when setting or changing values, I have to go through a bunch of config screens, where I would really prefer to have a single place in a config file.
Shouldn't Android Studio somehow make use of the values specified in the build.gradle file? If not, the docs don't make it clear that the testOptions.unitTests.all settings can only be used via gradlew invocation.
Skybow,
I feel you have two questions:
1. How to load a JNI lib for androidTest (not for 'test', the non-instrumented unit tests):
- Copy your JNI library into the corresponding folder (JNI libraries: app/src/androidTestFLAVORNAMEDebug/jniLibs).
- Load your JNI library:
static {
    try {
        System.loadLibrary("xyzjni");
    } catch (Exception e) {
        Logger.error("Exception on loading the jni library : " + e.getMessage());
    }
}
2. How to make Android Studio use your config variables defined for unitTests:
- It would have been great if there were a single text file holding all the configs (see the sketch below).
- Or if it were simply part of build.gradle.
- I don't have any details on this.
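One possible compromise along those lines, sketched here under the assumption of a test-config.properties file at the project root: load the file inside unitTests.all and forward every entry as a system property. This keeps all values in one file, but note that it still only takes effect for Gradle-driven test runs; Android Studio's built-in JUnit runner will keep ignoring it unless the run configuration delegates test execution to Gradle (where that option is available).
android {
    testOptions {
        unitTests.all {
            // hypothetical file holding all test system properties, e.g. "debug=true"
            def testProps = new Properties()
            file("${rootDir}/test-config.properties").withInputStream { testProps.load(it) }
            testProps.each { key, value -> systemProperty key, value }
        }
    }
}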

How to integrate the Reflections library in android studio using gradle with save and collect

My question is related to the Reflections library by @ronmamo on GitHub and integrating it into my Android project to dynamically access all classes that inherit from a certain interface.
I am not that familiar with Gradle or Maven, so this is a learning process for me, but I have reached a roadblock and do not know how to debug or find an answer to this one.
As @ronmamo suggests here, I want to generate an XML file on build containing all scanned metadata and let Reflections collect it later when I use it in my code:
Although scanning can be easily done on bootstrap time of your
application - and shouldn't take long, it is sometime a good idea to
integrate Reflections into your build lifecycle. With simple
Maven/Gradle/SBT/whatever configuration you can save all scanned
metadata into xml/json files just after compile time. Later on, when
your project is bootstrapping you can let Reflections collect all
those resources and re-create that metadata for you, making it
available at runtime without re-scanning the classpath - thus reducing
the bootstrapping time.
I am not sure I fully understand where exactly in the entire process this "bootstrapping" takes place (in terms of the android app lifecycle etc. or even build time?) so I am not certain where exactly to call Reflections.collect(). Currently I am calling it at some point later in my app when the user has reached a certain point in the program.
From several stackoverflow posts and the git readme files, I have come up with this for now: ([...] means removed unrelated code)
build.gradle (Module:app):
dependencies {
    [...]
    compile 'org.reflections:reflections:0.9.11'
}
build.gradle (Project: MyProject):
buildscript {
    repositories {
        jcenter()
        mavenCentral()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:2.3.3'
        classpath 'org.reflections:reflections:0.9.11'
    }
}
allprojects {
    repositories {
        jcenter()
    }
}
task runReflections {
    doLast {
        org.reflections.Reflections("f.q.n").save("${sourceSet.main.output.classesDir}/META-INF/reflections/myproject-reflections.xml")
    }
}
task clean(type: Delete) {
    delete rootProject.buildDir
}
And later on in my code (this class is reached at some point through user input, not loaded on app start):
Reflections reflections = Reflections.collect();
Set<Class<? extends MyInterface>> allClasses = reflections.getSubTypesOf(MyInterface.class);
This produces the following exception, since reflections is not instantiated and has the value null:
Attempt to invoke virtual method 'java.util.Set org.reflections.Reflections.getSubTypesOf(java.lang.Class)' on a null object reference
I understand that the generated .xml file resides on the computer where the build happens, and I am not sure whether it is also transferred to the Android device, so my guess is that this is why it fails. But at what point does my Java code have access to this file before the .apk is transferred to and run on my Android device?
I have tried googling this in many different ways and from different angles, but I cannot seem to find a solution to make Reflections work on Android. I understand the principle explained here, and it seems better to generate the information in an XML file at build time to have the class information available at runtime. But how can I set this up properly?
Thank you
There's a little bit of a chicken-or-egg problem to solve here:
- You want the Reflections API to access the classes compiled from src/main/java.
- Gradle tasks and the Reflections classes are loaded by Gradle's buildscript classloader.
- The classes in src/main/java are compiled after the buildscript classloader is defined.
You'll need to introduce another classloader that can access the compiled classes to break the cyclic dependency. This can then be passed to Reflections. Eg:
buildscript {
    dependencies {
        classpath 'org.reflections:reflections:0.9.11'
    }
}
task doReflectyStuff {
    dependsOn compileJava
    doLast {
        URL[] urls = sourceSets.main.runtimeClasspath.files.collect {
            it.toURI().toURL()
        }
        ClassLoader classLoader = new URLClassLoader(urls, null)
        def config = org.reflections.util.ConfigurationBuilder.build("com.mypackage", classLoader)
        def reflections = new org.reflections.Reflections(config)
        ...
    }
}
See here for a similar question
This is what I did: The task was to use Reflections on Android for classes provided with a dependency (i.e. inside a JAR file). This solution works for me:
top build.gradle:
dependencies {
    classpath 'org.reflections:reflections:0.9.10'
}
project build.gradle:
afterEvaluate {
    android.applicationVariants.each { variant ->
        variant.javaCompiler.doLast {
            // get JAR file that contains the classes
            def collection = project.configurations.compile*.toURI().find { URI uri -> new File(uri).name.startsWith("startOfJarFileNameHere") }
            URL[] urls = collection.collect {
                println "Collecting classes using Reflections from " + it
                it.toURL()
            }
            // collect all classes
            ClassLoader classLoader = new URLClassLoader(urls, ClassLoader.systemClassLoader)
            org.reflections.Configuration config = org.reflections.util.ConfigurationBuilder
                    .build("package.name.of.interest.here")
                    .addClassLoader(classLoader)
                    .setUrls(urls)
            org.reflections.Reflections reflections = new org.reflections.Reflections(config)
            // save as JSON file into the assets folder
            // (a) generate file for current debug or release build
            reflections.save(
                    "${variant.javaCompiler.destinationDir}/../../assets/${variant.buildType.name}/reflections/my-reflections.json",
                    new org.reflections.serializers.JsonSerializer())
            // (b) always update fall-back file for debug (used when running app from Android Studio or IntelliJ)
            reflections.save(
                    "${variant.javaCompiler.destinationDir}/../../../../src/debug/assets/reflections/my-reflections.json",
                    new org.reflections.serializers.JsonSerializer())
        }
    }
}
Java code on Android:
InputStream iStream = getAssets().open("reflections/my-reflections.json");
Configuration config = ConfigurationBuilder.build().setSerializer(new JsonSerializer());
Reflections reflections = new Reflections(config);
reflections.collect(iStream);
Set<Class<? extends MyType>> myTypes = reflections.getSubTypesOf(MyType.class);
I have been trying to use Reflections in Android for some days, and this is what I have achieved so far. I have created a task in the project's build.gradle:
task myTask(dependsOn: compileJava) {
    doLast {
        URL[] urls = sourceSets.main.runtimeClasspath.files.collect {
            it.toURI().toURL()
        }
        ClassLoader classLoader = new URLClassLoader(urls, ClassLoader.systemClassLoader)
        org.reflections.Configuration config = new ConfigurationBuilder()
                .addClassLoader(classLoader)
                .filterInputsBy(new FilterBuilder().include(FilterBuilder.prefix("com.company.project")))
                .addScanners(new SubTypesScanner(false))
        Reflections reflections = new Reflections(config)
        reflections.save("${sourceSets.main.output.classesDirs}/META-INF/reflections/mcommerce-reflections.json", new JsonSerializer())
    }
}
Later on, in a class from the project, I instantiate Reflections just as in the GitHub examples (I use Kotlin):
val reflections = Reflections.collect(
    "META-INF/reflections",
    FilterBuilder().include(".*-reflections.json"),
    JsonSerializer()
)
If myTask is run from the terminal, the build is successful, but I get the message "given scan urls are empty. set urls in the configuration". I searched for this on Google but didn't find anything helpful.
I tried different ways of configuring Reflections in the Gradle file, but when I collect the metadata I always receive a null instance.
I hope my answer is of some use for someone.
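One likely cause of the "given scan urls are empty" message is that the ConfigurationBuilder above is only given a class loader and never any URLs to scan, so Reflections has nothing to save. A possible fix (untested here) is to pass the classpath URLs as well, as the earlier answer in this thread does with setUrls:
org.reflections.Configuration config = new ConfigurationBuilder()
        .addClassLoader(classLoader)
        .setUrls(urls) // without this, Reflections has no scan URLs and saves nothing
        .filterInputsBy(new FilterBuilder().include(FilterBuilder.prefix("com.company.project")))
        .addScanners(new SubTypesScanner(false))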

Is there any way to prepend a jar to the unmanagedClasspath in sbt

I am using the android-sbt-plugin with sbt, and I would like to add an unmanaged jar to the test classpath. The reason is that android.jar contains stub functions for the org.json libraries, which results in exceptions being thrown during unit tests. This is what I am doing:
unmanagedClasspath in Test <+= (baseDirectory) map { base =>
  Attributed.blank(base/"test-libs"/"json.jar")
}
Because of the order of the jars, this file is ignored when I run the test command within sbt. If I type the following command, the order clearly shows android.jar as the first jar:
show test:unmanaged-classpath
[info] ArrayBuffer(Attributed(/home/rohit/Projects/android-sdk-linux/platforms/android-17/android.jar), Attributed(/home/rohit/Projects/barfrendz/trunk/src/buzze/test-libs/json.jar))
If I create a lib folder and let sbt pick up the json jar, the order is reversed and the tests now run, but I can no longer create an Android package due to conflicts with the org.json namespace in android.jar. Here is the exception:
[error] (Buzze/android:proguard) java.io.IOException: Can't read [/home/rohit/Projects/barfrendz/trunk/src/buzze/lib/json.jar(;;;;!META-INF/MANIFEST.MF,!**/R.class,!**/R$*.class,!**/TR.class,!**/TR$.class,!**/library.properties)] (Can't process class [org/json/CDL.class] (Unsupported version number [51.0] for class format))
Is there any way I can change the order of the jars in the classpath for the unit tests?
Instead of using <+=, use <<=, get unmanagedClasspath itself as a dependency, and then modify it as desired. The documentation has such an example with resolvers:
resolvers <<= resolvers {rs =>
  val localMaven = "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
  localMaven +: rs
}
This way, localMaven ends up first in resolvers.
According to the API docs, the unmanagedClasspath is a Task of type Classpath. Note that when you use that syntax, you are changing the Classpath, not the Task.
The API doc for the classpath is here -- it's a type, and it points to Seq[Attributed[File]], so you can manipulate it with any Seq command. I tried out the snippet here and it works:
$ cat build.sbt
unmanagedClasspath in Test <<= (unmanagedClasspath in Test, baseDirectory) map { (uc, base) =>
  Attributed.blank(base/"test-libs"/"json.jar") +: uc
}
Daniel@DANIEL-PC /c/scala/Programas/sbtTest
$ sbt
[info] Set current project to default-60c6f9 (in build file:/C:/scala/Programas/sbtTest/)
> show test:unmanaged-classpath
[info] ArrayBuffer(Attributed(C:\scala\Programas\sbtTest\test-libs\json.jar))
[success] Total time: 0 s, completed 30/08/2013 13:32:42
>
Maybe overriding the unmanagedJars instead of the unmanagedClasspath would allow you to do this:
http://www.scala-sbt.org/0.12.3/docs/Detailed-Topics/Library-Management.html

Custom Class Loading in Dalvik with Gradle (Android New Build System)

As per the introduction of Custom Class Loading in Dalvik by Fred Chung on the Android Developers Blog:
The Dalvik VM provides facilities for developers to perform custom
class loading. Instead of loading Dalvik executable (“dex”) files from
the default location, an application can load them from alternative
locations such as internal storage or over the network.
However, not many developers have the need to do custom class loading. But those who do, and who follow the instructions in that blog post, might have some problems mimicking the same behavior with Gradle, the new build system for Android introduced at Google I/O 2013.
How exactly can one adapt the new build system to perform the same intermediary steps as in the old (Ant-based) build system?
My team and I recently hit the 64K method reference limit in our app, which is the maximum number supported in a dex file. To get around this limitation, we need to partition part of the program into multiple secondary dex files and load them at runtime.
We followed the blog post mentioned in the question for the old, Ant based, build system and everything was working just fine. But we recently felt the need to move to the new build system, based on Gradle.
This answer does not intend to replace the full blog post with a complete example. Instead, it will simply explain how to use Gradle to tweak the build process and achieve the same thing. Please note that this is probably just one way of doing it and how we are currently doing it in our team. It doesn't necessarily mean it's the only way.
Our project is structured a little different and this example works as an individual Java project that will compile all the source code into .class files, assemble them into a single .dex file and to finish, package that single .dex file into a .jar file.
Let's start...
In the root build.gradle we have the following piece of code to define some defaults:
ext.androidSdkDir = System.env.ANDROID_HOME
if(androidSdkDir == null) {
    Properties localProps = new Properties()
    localProps.load(new FileInputStream(file('local.properties')))
    ext.androidSdkDir = localProps['sdk.dir']
}
ext.buildToolsVersion = '18.0.1'
ext.compileSdkVersion = 18
We need the code above because, although the example is an individual Java project, we still need to use components from the Android SDK. We will also need some of the other properties later on. So, in the build.gradle of the main project, we have this dependency:
dependencies {
    compile files("${androidSdkDir}/platforms/android-${compileSdkVersion}/android.jar")
}
We are also simplifying the source sets of this project, which might not be necessary for your project:
sourceSets {
    main {
        java.srcDirs = ['src']
    }
}
Next, we change the default configuration of the built-in jar task to simply include the classes.dex file instead of all .class files:
configure(jar) {
    include 'classes.dex'
}
Now we need a new task that will actually assemble all .class files into a single .dex file. In our case, we also need to include the Protobuf library JAR in the .dex file, so I'm including that in the example here:
task dexClasses << {
    String protobufJarPath = ''
    String cmdExt = Os.isFamily(Os.FAMILY_WINDOWS) ? '.bat' : ''
    configurations.compile.files.find {
        if(it.name.startsWith('protobuf-java')) {
            protobufJarPath = it.path
        }
    }
    exec {
        commandLine "${androidSdkDir}/build-tools/${buildToolsVersion}/dx${cmdExt}", '--dex',
                "--output=${buildDir}/classes/main/classes.dex",
                "${buildDir}/classes/main", "${protobufJarPath}"
    }
}
Also, make sure you have the following import somewhere (usually at the top, of course) on your build.gradle file:
import org.apache.tools.ant.taskdefs.condition.Os
Now we must make the jar task depend on our dexClasses task, to make sure that our task is executed before the final .jar file is assembled. We do that with a simple line of code:
jar.dependsOn(dexClasses)
And we're done... Simply invoke Gradle with the usual assemble task and your final .jar file, ${buildDir}/libs/${archivesBaseName}.jar, will contain a single classes.dex file (besides the MANIFEST.MF file). Just copy that into your app's assets folder (you can always automate that with Gradle, as we've done, but that is out of the scope of this question) and follow the rest of the blog post.
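For reference, one possible way to automate that copy, under the assumption that the consuming app module lives in an app/ folder next to this project (adjust the path to your layout):
task copyJarToAssets(type: Copy, dependsOn: jar) {
    // hypothetical destination; point this at your app module's assets folder
    from jar.archivePath
    into file("${rootDir}/app/src/main/assets")
}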
If you have any questions, just shout in the comments. I'll try to help to the best of my abilities.
The Android Studio Gradle plugin now provides native multidex support, which effectively solves the Android 65k method limit without having to manually load classes from a jar file, and thus makes Fred Chung's blog obsolete for that purpose. However, loading custom classes from a jar file at runtime in Android is still useful for the purpose of extensibility (e.g. making a plugin framework for your app), so I'll address that usage scenario below:
I have created a port of the original example app on Fred Chung's blog to Android Studio on my github page over here using the Android library plugin rather than the Java plugin. Instead of trying to modify the existing dex process to split up into two modules like in the blog, I've put the code which we want to go into the jar file into its own module, and added a custom task assembleExternalJar which dexes the necessary class files after the main assemble task has finished.
Here is relevant part of the build.gradle file for the library. If your library module has any dependencies which are not in the main project then you will probably need to modify this script to add them.
apply plugin: 'com.android.library'
// ... see github project for the full build.gradle file
// Define some tasks which are used in the build process
task copyClasses(type: Copy) { // Copy the assembled *.class files for only the current namespace into a new directory
    // get directory for current namespace (PLUGIN_NAMESPACE = 'com.example.toastlib')
    def namespacePath = PLUGIN_NAMESPACE.replaceAll("\\.","/")
    // set source and destination directories
    from "build/intermediates/classes/release/${namespacePath}/"
    into "build/intermediates/dex/${namespacePath}/"
    // exclude classes which don't have a corresponding .java entry in the source directory
    def remExt = { name -> name.lastIndexOf('.').with {it != -1 ? name[0..<it] : name} }
    eachFile {details ->
        def thisFile = new File("${projectDir}/src/main/java/${namespacePath}/", remExt(details.name)+".java")
        if (!(thisFile.exists())) {
            details.exclude()
        }
    }
}
task assembleExternalJar << {
    // Get the location of the Android SDK
    ext.androidSdkDir = System.env.ANDROID_HOME
    if(androidSdkDir == null) {
        Properties localProps = new Properties()
        localProps.load(new FileInputStream(file('local.properties')))
        ext.androidSdkDir = localProps['sdk.dir']
    }
    // Make sure no existing jar file exists as this will cause dx to fail
    new File("${buildDir}/intermediates/dex/${PLUGIN_NAMESPACE}.jar").delete();
    // Use command line dx utility to convert *.class files into classes.dex inside jar archive
    String cmdExt = Os.isFamily(Os.FAMILY_WINDOWS) ? '.bat' : ''
    exec {
        commandLine "${androidSdkDir}/build-tools/${BUILD_TOOLS_VERSION}/dx${cmdExt}", '--dex',
                "--output=${buildDir}/intermediates/dex/${PLUGIN_NAMESPACE}.jar",
                "${buildDir}/intermediates/dex/"
    }
    copyJarToOutputs.execute()
}
task copyJarToOutputs(type: Copy) {
    // Copy the built jar archive to the outputs folder
    from 'build/intermediates/dex/'
    into 'build/outputs/'
    include '*.jar'
}
// Set the dependencies of the build tasks so that assembleExternalJar does a complete build
copyClasses.dependsOn(assemble)
assembleExternalJar.dependsOn(copyClasses)
For more detailed information see the full source code for the sample app on my github.
See my answer over here. The key points are:
Use the additionalParameters property on the dynamically created dexCamelCase tasks to pass --multi-dex to dx and create multiple dex files.
Use the multidex class loader to use the multiple dex files.
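For illustration, a rough sketch of the first point, assuming an old Android Gradle plugin (around the 0.x/1.0 era) whose per-variant dex tasks expose an additionalParameters property; newer plugin versions replace this with the built-in multiDexEnabled option:
afterEvaluate {
    tasks.matching { it.name.startsWith('dex') }.each { dexTask ->
        // pass --multi-dex to dx so it emits classes.dex, classes2.dex, ...
        if (dexTask.hasProperty('additionalParameters')) {
            dexTask.additionalParameters = (dexTask.additionalParameters ?: []) + '--multi-dex'
        }
    }
}
The secondary classes2.dex and later files then need to be installed at runtime via the multidex class loader mentioned in the second point.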
