I'm using the Native Development Kit (NDK) in a project of mine, and I'm trying to automate the whole app build procedure with Python.
Whenever ndk-build is called, it copies the prebuilt shared libraries to libs/<abi>/, even if there are no changes in them or they already exist there. This causes problems when I call ant later on, as it detects changed files (the library timestamps are newer) and rebuilds the APK unnecessarily.
Is there a way to check whether the libraries in the libs/<abi>/ folder are missing or out of date, call ndk-build only in that case, and otherwise just proceed to the next build step?
I've tried using filecmp in Python, but since its default shallow comparison relies on file metadata and the timestamps differ between the prebuilt shared libraries and the installed ones, it doesn't work.
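Essentially the check I'm after looks like this sketch (libfoo.so and the directory names are placeholders for my actual prebuilt libraries; in my case I'd drive this from the Python script):
# compare the prebuilt library with the copy in libs/<abi> by content, not timestamps
if cmp -s jni/prebuilt/armeabi/libfoo.so libs/armeabi/libfoo.so; then
    echo "libfoo.so is up to date, skipping ndk-build"
else
    ndk-build
fi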
The OP probably doesn't need this any more, but I had the exact same problem, trying to set up a Makefile to build a project, so maybe this will be helpful to someone else in the future as well.
ndk-build is a wrapper around GNU Make that invokes a bunch of makefiles in the build/core directory of the NDK, so while it's not universally applicable*, for your personal project you can modify those makefiles to do whatever you want. I found a clean-installed-binaries target that a couple of build/install targets depended on; removing those dependencies fixed the issue with perpetual installs.
In the cases where that clean target is actually necessary, you can invoke it manually with:
ndk-build clean-installed-binaries
*Given the time to come up with a clean opt-in solution, you could submit a patch to the NDK project, and if accepted it would eventually become universally applicable.
I was able to follow the directions in this question to build a shared library of OpenSSL for Android.
E.g.
cd openssl-fips-2.0/
./config
make
make install
And
cd openssl-1.0.1c/
./config fips --with-fipsdir=/usr/local/ssl/fips-2.0/ shared
make depend
make
This generates libcrypto.so.1.0.0 and libssl.so.1.0.0 with corresponding symbolic links to them as libcrypto.so and libssl.so.
Since the NDK build system doesn't support versioned shared libraries, I had to use the symbolic links (with PREBUILT_SHARED_LIBRARY). However, with this the libraries end up on the device as libcrypto.so and libssl.so instead of libcrypto.so.1.0.0 and libssl.so.1.0.0, causing my library to fail to load because it looks for the versioned names.
The linked question mentions loading the libraries with System.load("libcrypto.so.1.0.0") instead of System.loadLibrary(), but I have not been able to get this to work even with full paths, since, as mentioned earlier, the file is copied to the device as libcrypto.so.
Anyone done this successfully?
Note: I've also tried modifying the openssl-1.0.1c config and makefiles to generate libcrypto.1.0.0.so (i.e. with the version number before the extension in the file name and soname), and this allowed me to get around the previous loading issue. However, with that I get an error when I try to turn on FIPS mode with FIPS_module_mode_set (FIPS_R_FINGERPRINT_DOES_NOT_MATCH).
I don't yet know why that is happening, but it could be due to the NDK stripping 'unneeded' symbols (see this question). I'm still looking into this, but if someone has some info on it, it would be MUCH appreciated.
Let us identify the problem correctly. It's likely not the NDK build that causes problems, and definitely not the linker, which strips away unused entries when it builds a shared lib from a static lib.
First of all, I am not sure you can deliver FIPS mode in a usual APK, without rebuilding or at least rooting Android (see for example http://gcn.com/articles/2010/12/23/android-fips-security.aspx).
System.load() has no problem loading a versioned .so as long as you a) specify the full path correctly (e.g. System.load("/data/local/tmp/libssl.so.1.0.0")) and b) actually deliver the file to that path. For the first tests, I would suggest manually uploading libcrypto.so.1.0.0 and libssl.so.1.0.0 to /sdcard/ and seeing if the FIPS fingerprint check becomes happier.
If the location on /sdcard/ causes any problem, you can try /data/local/ or /data/local/tmp/. You can also use /data/data/(your package)/files/. The latter has one advantage: it will be automatically deleted by the system when your app is uninstalled.
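For that manual test, something like the following should do (using /data/local/tmp/ as an example destination; file names as produced by the build above):
adb push libcrypto.so.1.0.0 /data/local/tmp/
adb push libssl.so.1.0.0 /data/local/tmp/
Then load them from Java with System.load("/data/local/tmp/libcrypto.so.1.0.0") and so on.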
To make a versioned .so (like libcrypto.so.1.0.0) part of your APK, copy it to the assets folder of your project. It will be the responsibility of your Java code to copy it from there to the designated location on disk. Make sure this Java code correctly handles upgrades and SD card swaps.
Can somebody help me write an Android.mk for LibXtract, or point me in the right direction?
Here is source for lib - https://github.com/jamiebullock/LibXtract.git
Or maybe there is a way to use Linux-generated shared objects in Android?
Especially for bigger established projects, crafting Android.mk files is quite an effort, all the more so if you are not familiar with the Android NDK build architecture, understanding which requires digging deep into the documentation and the NDK makefiles. I would suggest trying to use the project's existing makefiles by setting CC to point to your NDK toolchain and adding CFLAGS += --sysroot=$(SYSROOT), where SYSROOT=${NDK_INSTALL_DIR}/platforms/android-<level>/arch-<arch>/ (depending on the targeted Android API level and architecture). Even without knowing your library, I would bet you have a good chance of success this way. The Android NDK documentation (${NDK_INSTALL_DIR}/doc/STANDALONE-TOOLCHAIN.html) details the use of an independent toolchain and also explains how to create a standalone toolchain that does not require passing the --sysroot argument to xxx-gcc.
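For illustration, roughly like this (the API level, toolchain version, and host tag below are examples to adapt, not fixed values):
# option A: drive the library's existing makefiles with the NDK compiler and sysroot
export SYSROOT=${NDK_INSTALL_DIR}/platforms/android-9/arch-arm
make CC=${NDK_INSTALL_DIR}/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc CFLAGS="--sysroot=$SYSROOT"
# (or add --sysroot=$SYSROOT to CFLAGS in the library's own makefile, as described above)
# option B: create a standalone toolchain first, after which no --sysroot is needed
${NDK_INSTALL_DIR}/build/tools/make-standalone-toolchain.sh --platform=android-9 --install-dir=$HOME/android-toolchain
export PATH=$HOME/android-toolchain/bin:$PATH
make CC=arm-linux-androideabi-gcc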
If you decide to use Android.mk instead, you might check existing projects: CSipSimple comes to mind (PJSIP converted from its standard GNU make files).
The important thing is to create the shared objects using the Android toolchains. It is possible to build them outside of your application source tree and then just copy the shared objects into the package's libs/<architecture>/ directory.
Integration with your build system depends on details that are not known here (including how smooth you want the whole integration to be, e.g. because other people work on the same project). If you are building the application from the command line, the easiest approach would be a GNU makefile or shell script in the package root directory that ensures libXtract.so and your application package are up to date, by calling the libXtract makefile and then ant to build and pack your Java application. If you are using ant, there should be a way to have it call make to take care of libXtract.so. I am not sure whether Eclipse relies entirely on ant for building an application, so I don't know if this would also be enough to enable a complete build by clicking mouse buttons from within Eclipse.
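As a rough sketch of such a wrapper script (directory names and the ABI are placeholders):
#!/bin/sh
set -e
make -C libxtract                        # build libXtract.so with the NDK toolchain
cp libxtract/libXtract.so libs/armeabi/  # copy it where the packaging step expects it
ant debug                                # build and package the APK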
The answer to this question says you could use a cmake script to generate Android.mk files - I have not tried this approach.
I wish to back-port the Android RTP APIs introduced in version 3.1 (Honeycomb) to earlier versions. I downloaded the source of version 4.0 and found that these APIs have both Java and native code. In order to build the native code with the NDK, certain shared libraries are required.
According to the Android.mk file, these are libnativehelper, libcutils, libutils, and libmedia. Though the source of all of these is present in the source tree, building them was difficult. Each requires many other shared libraries. For example, libmedia requires these shared libraries: libui, libcutils, libutils, libbinder, libsonivox, libicuuc, libexpat, libcamera_client, libstagefright_foundation, libgui and libdl.
So my question is, is there some way of obtaining the original 4 shared libs? Does it involve building the entire source?
Say I need to build a piece of native code which is going to use standard Android shared libraries such as libutils, libcutils, and libmedia. I would perform the following steps:
Install AOSP repository with target version.
Add my source code to the appropriate directories under ./frameworks/base. In your case it might be easier to create a separate folder with a proper Android.mk, of course.
You might get compile errors if required functions from those standard shared libraries are not present in the previous version.
When you build the code as part of AOSP it will build required libraries and link them for you automatically.
P.S. To accomplish this, you're better off using a Linux-based build host.
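A rough sketch of the AOSP-side commands (the lunch target and -j value are illustrative):
# from the root of the AOSP checkout
source build/envsetup.sh
lunch full-eng
make libnativehelper libcutils libutils libmedia -j4
# the resulting .so files land under out/target/product/<product>/system/lib/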
Using a Cygwin terminal, build the native part, i.e. the jni folder. To build with Cygwin, go to the jni folder via its /cygdrive path and then type ndk-build. After successful completion, the shared libraries (.so files) will be created in the libs folder.
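For example (the project path is illustrative):
cd /cygdrive/c/workspace/MyProject/jni
ndk-build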
I understand your problem: you can pull the libraries from /system/lib of a device or emulator. You may need system permission for that, but you can also do it from within an installed application.
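For example, pulling the four libraries mentioned in the question with adb:
adb pull /system/lib/libnativehelper.so
adb pull /system/lib/libcutils.so
adb pull /system/lib/libutils.so
adb pull /system/lib/libmedia.so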
Otherwise, build the source code on a Linux platform. The build process is very easy, just two or three commands. The first build takes a long time; after that, builds are much quicker, as only code that has changed (according to its timestamps) is rebuilt.
Please have a look here
I have set up Eclipse to build my C/C++ files. I created a builder and set it to point at the ndk-build executable in the NDK install tree. Each time I run this it keeps rebuilding all sources. I am not passing any arguments, so why would it do this?
It doesn't look like it's currently possible to perform incremental builds using the stock android-ndk. You could, however, do it manually. It's quite an involved process, because you'll have to redo the makefiles and such. See this answer for a description of what this involves.
I am interested in integrating Scala (or some other non-Java JVM language) into the Android platform. I am not referring to writing an Android application with Scala, which I did early on, but actually hooking into the build process that builds the Android platform source tree. I imagine this will be a matter of hooking into the makefiles and such. Does anyone have insight into this?
What I have so far:
The platform source tree from git://android.git.kernel.org/platform/manifest.git built in its virgin form, guided by "Download and build the Google Android"
build/core/combo/scalac.mk # Configures scala compiler related variables, included by config.mk
Added definitions in build/core/definitions.mk for an all-subdir-scala-files and an all-scala-files-under
Added a definition in definitions.mk to build Scala files such that they are included in the package
What's left:
Include scala-library.jar
Ensure changes to -bootclasspath have not broken anything
Figure out how to handle the case where Scala classes depend on Java classes and vice versa
Major cleanup of code
Testing!
Figure out what to do (other than just posting them here) with the changes I've made
Looks like I'm almost there!!!
Some notes from the past
Latest: I have found where the Java source files are compiled! In definitions.mk, see 'define transform-java-to-classes.jar'. The latest idea is to write a transform-scala-to-classes definition and then have it store those classes in the directory that gets packaged. I will call transform-scala-to-classes right before this step in transform-java-to-classes.jar. Support for Eclipse and Cygwin will be dropped for now, as it clutters up the code with workarounds and therefore increases my chances of failure.
The build process starts out by the root Makefile running build/core/main.mk
build/core/main.mk includes build/core/config.mk which includes build/core/combo/javac.mk which sets HOST_JAVAC, TARGET_JAVAC, and COMMON_JAVAC. COMMON_JAVAC is the "Java compiler command with common arguments," by the look of it the other two variables get these values by default, unless in a special environment (openjdk or eclipse). COMMON_JAVAC is not used outside this file. The other two are only used in build/core/definitions.mk.
build/core/java_library.mk (included by config.mk) seems to only be concerned with building jars. This is out of the scope of us caring. Any interaction with jars presupposes class files which presuppose that we were already successful in building our scala files.
There are checks in main.mk regarding the version of java. We will ignore these and assume that our version of scala is compatible. Right now (in combo/scalac.mk) I am using the same --target arg used in javac.mk. This should perhaps be stored in a variable.
main.mk also includes build/core/definitions.mk, which in turn defines some useful functions. The ones we care about here are all-java-files-under and all-subdir-java-files. The latter is used in Android.mk files to find Java files; the former is used in the implementation of the latter. I will write Scala equivalents of them.
To figure out how the build process works, I am now running make with -n and other flags. I got this idea from the Stack Overflow article "Tool for debugging makefiles". I am also investigating debugging with remake.
build/core/{config.mk, definitions.mk} gives us light as to which make files/commands are used to do what.
As a possible way of hacking in support on a per-project basis, additional code could most likely be added to the project's Android.mk file. From platform/build/core/build-system.html we read: "Android.mk is the standard name for the makefile fragments that control the building of a given module. Only the top directory should have a file named "Makefile"." You could create a new target like "scala-build" and run that (make PackageName scala-build) before the final make. One could perhaps also hide it sneakily in a variable assignment, removing the need for a target to be called explicitly.
Another way (far far more hackish) is to hijack the command being used for javac. This is set in build/core/combo/javac.mk. Your project's Android.mk will have to include *.scala files in LOCAL_SRC_FILES along with the *.java files.
Guys on reddit say there's a tutorial on integrating Scala into Android with ant here.