How to properly set up dynamic library loading with a header file? - android

The title was tough to get right, so let me explain my situation:
Another team develops a library. They ship a header file and a *.so file. The header file is available to us, and we can include it in our own code and use it if we wish. The *.so, however, ships with the platform we run on: we do not have access to it at build time. Because of this, we can't really use the header file either, since the linker will expect the *.so to be available at some point.
Right now what I do is create a wrapper class that loads the *.so file at runtime, then uses dlsym() to look up functions by name and maps them to function pointers (roughly as sketched below).
Is this the only option? Is there a way I can use the header file but tell the linker not to resolve the symbols at build time, and instead resolve them at runtime after we have had a chance to load the *.so file?
Note the real platform here is Android (via the NDK), but I'm hoping general Linux advice will work as well in this case, since we have POSIX APIs available.
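For reference, the wrapper is roughly the following (a minimal sketch; the library name libvendor.so and the symbol vendor_init are hypothetical stand-ins for the real API):

#include <dlfcn.h>
#include <stdio.h>

typedef int (*vendor_init_fn)(void);

int call_vendor_init(void) {
    /* Load the platform-provided library at runtime... */
    void *handle = dlopen("libvendor.so", RTLD_NOW | RTLD_LOCAL);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return -1;
    }

    /* ...then look the function up by name and call it through a pointer. */
    vendor_init_fn vendor_init = (vendor_init_fn)dlsym(handle, "vendor_init");
    if (!vendor_init) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
        dlclose(handle);
        return -1;
    }

    return vendor_init();
}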

You have a few options, in order of preference:
1. Get the libraries from the maintainer. Providing the header but not the library (or at least a stub library, like we do for libraries in the NDK) just won't work.
2. Build your own stub library. It's pretty straightforward if you have a list of symbols to expose: put int foo; void bar() {} in a C file for all the variables and functions you need to expose and build it as a shared lib (see the sketch after this list). If you have the list of symbols in a version script, you might be able to use Android's gen_stub_libs.py to do it for you.
3. Mark all the symbols with __attribute__((weak)) in the header file (also sketched below). The linker won't complain that they are missing. If they're missing at runtime, the library will still load, but each function's address will be nullptr. This is not really what you want in most cases, because if your definition of the library is wrong you turn build-time failures into runtime failures, but in some cases it can be handy: it's easier to check for a function's availability with if (foo) { foo(); } than to do the same with dlsym.
4. Add -Wl,--allow-shlib-undefined to your ldflags. This is even worse than option 3 because it affects all the libraries you link, but it wouldn't require you to meddle with the header.
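For option 2, the stub can be as small as a single C file that defines every symbol the header declares. A minimal sketch with hypothetical names (the real API obviously has its own):

/* stub.c -- fake definitions of the vendor API. This file exists only so
 * the linker has something to resolve against at build time; at runtime
 * the platform's real libvendor.so is what actually gets loaded. */
int vendor_global;                    /* stub for an exported variable */
int vendor_init(void) { return 0; }   /* stub for an exported function */
void vendor_shutdown(void) {}

/* Build it as a shared library with the NDK toolchain, e.g.:
 *   clang -shared -o libvendor.so stub.c
 */

For option 3, the declarations in the header would look roughly like this (again, hypothetical names):

/* vendor.h -- weak declarations: the linker stops insisting the symbols
 * exist at link time; at runtime they either resolve against the real
 * libvendor.so or stay null, which you can test before calling. */
__attribute__((weak)) extern int vendor_global;
__attribute__((weak)) int vendor_init(void);
__attribute__((weak)) void vendor_shutdown(void);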

On Windows this is solved by requiring an export library (.lib) to be present for the linker instead of the real dynamic library (.dll). I think you can try something similar, that is, make a fake .so containing stubs of all the methods exported from the real .so and link against it. This will hopefully make the linker happy, and at runtime the application will load the real .so.

Related

Android NDK: Resolve dynamic library version conflict

I'm making an Android app with the help of the NDK. One of the shared libraries I'm using depends on ICU, which is another library that I'm trying to explicitly include too.
The problem is that my device (like apparently many others) has an old version of ICU pre-installed. This means that whenever I try to load my shared library, the system tries to load the system version instead of my own, more recent version.
Android seems to ignore any RPATHs, which would otherwise let libraries specify where to look for their dependencies. As far as I understand, RPATH is essentially hard-coded to /vendor/lib and /system/lib on Android.
I've seen a few workarounds for the issue, but none of them seem to work:
Explicitly load all the libraries (including dependencies) in dependency order.
Although the ICU libraries appear to load fine (via their absolute paths), I still get a cannot locate symbol error when attempting to load my shared library. I've quadruple-checked (via nm and readelf) that the missing symbol does exist in my (newly-compiled) ICU .so files.
Change the SONAME of the ICU dynamic libs to something project-specific and depend on them instead.
This would possibly work, but it is something of a last resort because it would involve recompiling a lot of library code with non-trivial dependencies. It also feels more like a hack. A simple name change of the libraries doesn't work for the same reason as 1: although the ICU symbols get loaded, they are not recognised when the dependent library tries to access them.
My question is to anyone who has tried something like this before or knows their way around linking a bit more than I do: how did you get it working? Is there a way to force Android to get its proverbial hotsauce together and actually load the correct libraries, or otherwise use the symbols which I've already successfully loaded?

Generate MainDexList.txt for Pre-5.0 Multidex app with reflection support

We are using android-maven-plugin to build a multidex application targeting Jelly Bean (4.3.x) with greater than 65k methods. The approach described here helps create a MainDexList.txt file, but does not automatically include classes that will be loaded by reflection.
Are any tools or processes available that can create a MainDexList.txt file with reflection support? [The majority of the classes we are loading via reflection are named via String constants...]
We are attempting to avoid manually running the app and dealing with NoClassDefFoundError messages one at a time.
To deal with the NoClassDefFoundError, you just need to add the MainDexList.txt to each of your projects. This should solve your initial errors right away. However, since your MainDexList.txt will still be empty, you will run into further issues.
To generate the MainDexList.txt with a script instead of writing it manually, you can use this open-source script by Google, which will produce the exact class names that should be included in MainDexList.txt. Here is a link to the actual commit by Google:
https://android.googlesource.com/platform/dalvik/+/2bb6fe45bf620525ba34bd7303d7ecb597aa0689
To learn more (and also my source of information):
http://blog.osom.info/2014/10/generating-main-dex-list-file.html
Note: this unfortunately does not support reflection. However, DexClassLoader loads classes from .jar and .apk files containing a classes.dex entry, so it may be worth looking at as well.
Hope this helps!

Android app crashes after switching from .lib to .so

I am using a number of pre-built static libraries in my native Android application and everything works fine. Now I want to switch one of these static libraries to a .so. I was able to build the .so library by replacing BUILD_STATIC_LIBRARY with BUILD_SHARED_LIBRARY in its Android.mk and adding the required dependencies.
I was also able to build my application by replacing the corresponding PREBUILT_STATIC_LIBRARY with PREBUILT_SHARED_LIBRARY in its Android.mk. The resulting application, however, now fails to start. I cannot even get to the point where the debugger attaches to the application.
Besides that, what I do not understand is how the build system knows that the function should be imported from the library. My .so library should export one function, but I did not declare it as dllexport/import or anything similar. Still, there are no unresolved symbols in my application (when I remove my prebuilt library from the list, the unresolved symbol appears as expected).
The other question is that I see two .so files generated: one big file in the obj/local/$(TARGET_ARCH_ABI) folder and another, smaller one in libs/$(TARGET_ARCH_ABI). When declaring my prebuilt library I reference the second one, in the libs folder.
I did try to search stackoverflow for answers and found quite a few related posts:
loading library (.so file) in android
NDK - How to use a generated .so library in another project
How to use .so file in Android code to use the native methods
How to use libffmpeg.so in Android project?
but I do not see how these posts relate to my problem, since I can successfully build and even link my application.
You need to load the libraries in reverse dependency order in the Java code. You probably previously had something like this:
System.loadLibrary("mylib");
Now if your prebuilt library (that was previously a static library, now a shared library) is named dependencylib, you need to change the code for loading the libraries into this:
System.loadLibrary("dependencylib");
System.loadLibrary("mylib");
As for your question about how the linker can figure this out: when linking libmylib.so, it looks for all undefined symbols in all the other libraries you specified (i.e. in libdependencylib.so, libc.so and the other system libraries). As long as every undefined symbol is found somewhere, the linker is happy. Then at runtime, when libmylib.so is loaded, the same routine happens again: all undefined symbols are looked up among the symbols already loaded into the current process. On Linux, you normally don't need to mark symbols as dllexport the way you do on Windows; all non-static symbols are exported by default.
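To illustrate that last point, here is a minimal sketch with hypothetical file and symbol names:

/* dependencylib.c -> built into libdependencylib.so. No dllexport-style
 * annotation is needed: every non-static function is exported by default. */
int answer_from_dependency(void) { return 42; }

/* mylib.c -> built into libmylib.so, linked against -ldependencylib.
 * The symbol stays undefined inside libmylib.so and is resolved by the
 * dynamic linker once both libraries are loaded into the process. */
extern int answer_from_dependency(void);
int my_native_entry(void) { return answer_from_dependency(); }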
There may be two reasons why the app fails to start after the STATIC -> SHARED change:
1. The prebuilt library is not installed. With your device connected, run adb shell ls -l /data/data/your.package.name/lib/. Do you see the library there?
2. The prebuilt library is not loaded. In your main Java class, try
static {
    System.loadLibrary("prebuiltname");
    System.loadLibrary("yourlib");
}
This is a static initializer block, the safest place to load JNI libraries.
If you are on Linux you can list the exported symbols using nm -D. For example, nm -D libzip.so:
...
0000000000009dc0 T zip_unchange
0000000000009dd0 T zip_unchange_all
0000000000009e30 T zip_unchange_archive
0000000000009e60 T _zip_unchange_data
If you want to control the visibility of your functions, use __attribute__((visibility("default"))) together with the command-line flag -fvisibility=hidden. More information here.
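A small sketch of that combination (hypothetical names; compile with clang -shared -fvisibility=hidden -o libmylib.so mylib.c):

/* mylib.c -- with -fvisibility=hidden, only symbols explicitly marked
 * "default" show up in nm -D libmylib.so; everything else stays hidden. */
#define API __attribute__((visibility("default")))

API int public_function(void) {   /* exported: appears in nm -D output */
    return 42;
}

int internal_function(void) {     /* hidden by -fvisibility=hidden */
    return 7;
}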
Now I want to switch one of these static libraries to a .so. I was able to build the .so library by replacing BUILD_STATIC_LIBRARY with BUILD_SHARED_LIBRARY in its Android.mk and adding the required dependencies.
I don't think you can do it if it's a C++ library. From <doc>/CPLUSPLUS-SUPPORT.html:
Please keep in mind that the static library variant of a given C++
runtime SHALL ONLY BE LINKED INTO A SINGLE BINARY for optimal
conditions.
What this means is that if your project consists of a
single shared library, you can link against, e.g., stlport_static, and
everything will work correctly.
On the other hand, if you have two
shared libraries in your project (e.g. libfoo.so and libbar.so) which
both link against the same static runtime, each one of them will
include a copy of the runtime's code in its final binary image. This
is problematic because certain global variables used/provided
internally by the runtime are duplicated.
This is likely to result in code that doesn't work correctly, for example:
* memory allocated in one library, and freed in the other would leak
or even corrupt the heap.
* exceptions raised in libfoo.so cannot be caught in libbar.so (and may
simply crash the program).
* the buffering of cout not working properly
This problem also happens if you want to link an executable and a shared
library to the same static library.
In other words, if your project requires several shared library modules,
then use the shared library variant of your C++ runtime.
From the above, it follows that everything needs to link against the same shared C++ runtime object.

Android.mk for LibXtract

Can somebody help me write an Android.mk for LibXtract or point me in the correct direction?
Here is source for lib - https://github.com/jamiebullock/LibXtract.git
Or maybe there is a way to use Linux-generated shared objects in Android?
Especially for bigger established projects, crafting Android.mk files is quite an effort; more so if you are not familiar with the Android NDK build architecture, which you can only really understand by digging deep into the documentation and the Android NDK make files. I would suggest trying to use the existing make files by setting CC to point to your NDK toolchain and adding CFLAGS += --sysroot=$(SYSROOT), where SYSROOT=${NDK_INSTALL_DIR}/platforms/android-<level>/arch-<arch>/ (depending on the targeted Android API version and architecture). Even without knowing your library, I would bet you have a good chance of success this way. The Android NDK documentation (${NDK_INSTALL_DIR}/doc/STANDALONE-TOOLCHAIN.html) details the use of an independent toolchain and also explains how to create a standalone toolchain that does not require passing the --sysroot argument to xxx-gcc.
If you decide to use Android.mk instead, you might check existing projects -CSipSimple comes to my mind (PJSIP converted from standard form GNU make files).
Important is to create the shared objects using Android tool chains. It is possible to build them outside of your application source tree, and then just copy the shared objects into the package source libs/<architecture>/ directory.
How to integrate this with your build system depends on details that are not known here (including how smooth you want the whole integration to be, e.g. because other people work on the same project). If you are building the application from the command line, the easiest approach would be a GNU make file or shell script in the package root directory that ensures libXtract.so and your application package are up to date, by calling the libXtract make file and then ant to build and pack your Java application. If you are using ant, there should be a way to have it invoke make to take care of libXtract.so. I am not sure whether Eclipse relies entirely on ant for building an application, so I don't know if this would also be enough to enable a complete build from within Eclipse.
The answer to this question says you could use a CMake script to generate Android.mk files; I have not tried this approach.

How to integrate Scala into core Android platform?

I am interested in integrating Scala (or some other non-Java JVM language) into the Android platform. I am not referring to writing an Android application with Scala; I did that early on. I mean actually hooking into the build process that builds the Android platform source tree. I imagine this will be a matter of hooking into the makefiles and such. Does anyone have insight into this?
What I have so far:
The platform source tree from git://android.git.kernel.org/platform/manifest.git, built in its virgin form, guided by "[Download and build the Google Android][1]"
build/core/combo/scalac.mk # Configures scala compiler related variables, included by config.mk
Added definitions in build/core/definitions.mk for an all-subdir-scala-files and an all-scala-files-under
Added definition in definitions.mk to build scala files such that they are included in the package
What's left:
Include scala-library.jar
Ensure changes to -bootclasspath has not broken anything
Figure out how to handle the case where Scala classes depend on Java classes and vice versa
Major cleanup of code
Testing!
Figure out what to do (other than just posting them here) with the changes I've made
Looks like I'm almost there!!!
Some notes from the past
Latest: I have found where the Java source files are compiled! In definitions.mk, see 'define transform-java-to-classes.jar'. The latest idea is to write a transform-scala-to-classes definition and then have it store those classes in the directory that gets packaged. I will call transform-scala-to-classes right before this step in transform-java-to-classes.jar. Support for Eclipse and Cygwin will be dropped for now, as it clutters up the code with workarounds and therefore increases my chances of failure.
The build process starts out by the root Makefile running build/core/main.mk
build/core/main.mk includes build/core/config.mk which includes build/core/combo/javac.mk which sets HOST_JAVAC, TARGET_JAVAC, and COMMON_JAVAC. COMMON_JAVAC is the "Java compiler command with common arguments," by the look of it the other two variables get these values by default, unless in a special environment (openjdk or eclipse). COMMON_JAVAC is not used outside this file. The other two are only used in build/core/definitions.mk.
build/core/java_library.mk (included by config.mk) seems to only be concerned with building jars. This is out of the scope of us caring. Any interaction with jars presupposes class files which presuppose that we were already successful in building our scala files.
There are checks in main.mk regarding the version of java. We will ignore these and assume that our version of scala is compatible. Right now (in combo/scalac.mk) I am using the same --target arg used in javac.mk. This should perhaps be stored in a variable.
main.mk also includes build/core/definitions.mk, which in turn defines some useful functions. The ones we care about here are all-java-files-under and all-subdir-java-files. The latter is used in Android.mk files to find Java files; the former is used in the implementation of the latter. I will write Scala equivalents of them.
To figure out how the build process works, I am now running make with -n and others. I got this idea from the stackoverflow article "[Tool for debugging makefiles][2]". I am also investigating debugging with remake.
build/core/{config.mk, definitions.mk} gives us light as to which make files/commands are used to do what.
As a possible way of hacking in support on a per-project basis, additional code could most likely be added to the project's Android.mk file. From platform/build/core/build-system.html we read: "Android.mk is the standard name for the makefile fragments that control the building of a given module. Only the top directory should have a file named 'Makefile'." You could create a new target like "scala-build" and run that (make PackageName scala-build) before the final make. One could perhaps also hide it sneakily in a variable assignment, mitigating the need for a target to be called explicitly.
Another way (far far more hackish) is to hijack the command being used for javac. This is set in build/core/combo/javac.mk. Your project's Android.mk will have to include *.scala files in LOCAL_SRC_FILES along with the *.java files.
Guys on reddit say there's a tutorial on integrating Scala into Android with ant here.
