Externalizing an Android Linux kernel module

I don't know if what I'm trying to do is even possible, and while it may be undesirable I'd like to know whether I can make this work.
I have a Linux kernel compiled for an Android tablet and I need to make some changes to one of the built-in modules. (Compiling a new kernel from source is not, in this particular case, an option for me).
I've gotten as far as compiling against my modified source and producing the .ko files I need. However, when building these modules, I get a list of warnings like the following:
WARNING: "alarm_start_range" [/modules/p3_battery.ko] undefined!
It seems as though my Makefile isn't resolving the symbols declared in this header:
#include <linux/power/p3_battery.h>
Anyway, I tried to load those modules on the device, and when I run insmod p3_battery.ko, I get a failure message (which I expected). Looking at dmesg, I see messages like the following:
p3_battery: Unknown symbol alarm_start_range (err 0)
Contrary to what the error suggests, those functions do exist in the device's kallsyms table.
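I verified that on the device with something along these lines (from an adb shell):

    grep alarm_start_range /proc/kallsyms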
I can provide more detail by supplying my full Makefile if that will help, but I wanted to offer a concise formulation of the problem to see if what I'm doing here makes any sense; a sketch of the Makefile's overall shape follows.
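The Makefile follows the usual out-of-tree kbuild pattern. The kernel source path and cross-compiler prefix below are placeholders, not my real values:

    # External-module Makefile sketch; KDIR and CROSS_COMPILE are placeholders
    obj-m := p3_battery.o

    KDIR ?= /path/to/android-kernel-source
    PWD  := $(shell pwd)

    all:
            $(MAKE) -C $(KDIR) M=$(PWD) ARCH=arm CROSS_COMPILE=arm-eabi- modules

    clean:
            $(MAKE) -C $(KDIR) M=$(PWD) clean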

Related

How to properly setup dynamic library loading with header file?

The title was tough to get right, so let me explain my situation:
Another team develops a library. They ship a header file and a *.so file. The header file is available to us, and we can include it in our own code and use it if we wish. The *.so, however, is shipped with the platform we run on. We do not have access to this *.so at build time for our software. Because of this, we can't really use the header file either, since the linker will expect the *.so to be available at some point.
Right now what I do is create a wrapper class that loads the *.so file at runtime with dlopen(), then uses dlsym() to look up functions by name and maps them to function pointers.
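Roughly, the wrapper does something like this (a minimal sketch; libvendor.so and vendor_foo are made-up names standing in for the real library and function):

    /* Sketch of the runtime-loading wrapper; names are placeholders. */
    #include <dlfcn.h>
    #include <stdio.h>

    typedef int (*vendor_foo_fn)(int);  /* must match the prototype in their header */

    int call_vendor_foo(int arg, int *result) {
        void *handle = dlopen("libvendor.so", RTLD_NOW);
        if (!handle) { fprintf(stderr, "dlopen: %s\n", dlerror()); return -1; }

        vendor_foo_fn fn = (vendor_foo_fn)dlsym(handle, "vendor_foo");
        if (!fn) { fprintf(stderr, "dlsym: %s\n", dlerror()); dlclose(handle); return -1; }

        *result = fn(arg);
        dlclose(handle);
        return 0;
    }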
Is this the only option? Is there a way I can use the header file but tell the linker to not resolve the symbols at build time, but instead try to resolve them at runtime after we have a chance to load the *.so file?
Note the real platform here is Android (via the NDK), but I'm hoping general Linux advice will work as well in this case, since we have POSIX APIs available.
You have a few options, in order of preference:
Get the libraries from the maintainer. Providing the header but not the library (or at least a stub library, like we provide for libraries in the NDK) just won't work.
Build your own stub library. It's pretty straightforward if you have a list of symbols to expose: put int foo; and void bar() {} in a C file for all the variables and functions you need to expose, and build it as a shared lib (see the sketch after this list). If you have the list of symbols in a version script, you might be able to use Android's gen_stub_libs.py to do it for you.
Mark all the symbols with __attribute__((weak)) in the header file. The linker won't complain that they are missing. If they're missing at runtime, the library will still load, but each function's address will be null. This isn't really what you want in most cases, because if your definition of the library is wrong you turn build-time failures into runtime failures, but in some cases it can be handy because it's easier to check for function availability with if (foo) { foo(); } than to do the same with dlsym (also sketched below).
Add -Wl,--allow-shlib-undefined to your ldflags. This is even worse than 3 because it affects all the libraries you link, but it wouldn't require you to meddle with the header.
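To illustrate options 2 and 3 (both sketches, with made-up symbol names). A stub library is just empty definitions compiled as a shared object:

    /* vendor_stubs.c - build with: cc -shared -fPIC -o libvendor.so vendor_stubs.c
       Symbol names here are placeholders; take the real list from the vendor header. */
    int some_exported_variable;           /* stub variable */
    void some_exported_function(void) {}  /* stub function; never actually run */

The weak-symbol variant marks the declarations so the link succeeds even if they're unresolved:

    /* In (a wrapper around) the vendor header: */
    __attribute__((weak)) void some_exported_function(void);

    /* At runtime, the address is null if the symbol wasn't found: */
    if (some_exported_function) {
        some_exported_function();
    }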
On Windows this is solved by requiring an import library (.lib) to be present for the linker instead of the real dynamic library (.dll). I think you can try something similar: make a fake .so containing stubs of all the methods exported from the real .so and link against it. This will hopefully keep the linker happy, and at the same time, at runtime, the application will load the real .so.

SDL2 compiling for Android with GLEW

This is a compilation problem, specifically with referencing shared libraries in the NDK.
I have my SDL2 + GLEW program running fine on my Mac (obviously with a different makefile/build system), and I have it running fine on Android as well (so long as I don't use GLEW). But now I need to use GLEW, and I can't find a straightforward reference for how/where/what the heck is going on with including libraries in the NDK.
Anyways, in my android-project folder I have jni/src/Android.mk (which I assume is where I should be looking?).
There's a line with LOCAL_SHARED_LIBRARIES := SDL2 SDL2_image, and I assume that's the magic variable that I should add GLEW to? But what even is that? How does that know what GLEW even means? Should it be -lGLEW? (That last question is rhetorical; I've tried all of them and nothing works.) I've even tried commenting out that line entirely and get the same build error:
jni/src/src/main.cpp:8:21: fatal error: GL/glew.h: No such file or directory
#include <GL/glew.h>
I'm pretty much totally lost... does anyone have any resources I could look into?
Also, as a note, I'd prefer an explanation of why/what is going on, as I'm sure I'll be including other stuff as I go on.
Edit:
As Drop pointed out below, this wasn't a linking problem: the compiler needed to know where to find GL/glew.h. So I added /usr/local/Cellar/glew/1.12.0/include to the LOCAL_C_INCLUDES := line (fragment below), and that works. But now there's an error compiling glew.h itself: GL/glu.h: No such file or directory.
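For concreteness, the relevant fragment of my Android.mk now looks something like this (the SDL include path is illustrative; the GLEW path is my machine's Homebrew location):

    # jni/src/Android.mk (fragment)
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/../SDL2/include \
                        /usr/local/Cellar/glew/1.12.0/include
    LOCAL_SHARED_LIBRARIES := SDL2 SDL2_image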
So now:
Is there a more general/cleaner way than hardcoding the whole /usr/local/Cellar/glew/1.12.0/include path to let the compiler know where GLEW is? Should glew.h be located somewhere more general?
Why do I only need this when compiling for Android? The Mac build doesn't need these flags...
I can't find clear documentation anywhere about this, but when compiling the Mac build with GLEW, I didn't need to also install GLU and GLUT (as GLEW's website implied), and furthermore, neither GLU nor GLUT is available as a package via brew.
I've been looking around for documentation about this, but I feel like I don't know what I don't know. Is there some insight you could give to where I'm going wrong regarding how includes and libraries and stuff are expected to be referenced across platforms?
Welp. Turns out there isn't a straightforward way to include GLEW/GLU/GLUT with the NDK? (I have a question mark because I'm still not 100% sure...)
However, I found the solution to my problem, and that was: I didn't need GLEW (or GLU, or GLUT, etc.)!
Like I said, I'm building for OS X and Android, so I'm using OpenGL 2.1 and OpenGL ES 2.
I chose to do this because I was told that OpenGL ES 2 is essentially a subset of OpenGL 2.1, so I didn't expect any issues (so long as I didn't use any functions that exist only in OpenGL 2.1).
I needed GLEW because I wanted to use framebuffers (glGenFramebuffers, etc.), which exist only as an extension in OpenGL 2.1. Once I got that working, I assumed I would need the same extension wrangling for OpenGL ES 2 on Android. Turns out I simply don't! It just works out of the box (well, once you get the NDK working, and linked with OpenGL, and blah blah blah).
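For anyone else in the same spot, this is all it takes on Android (a sketch; it assumes a current EGL context):

    /* With OpenGL ES 2.0, framebuffer objects are part of core, so no
       extension loader (GLEW) is needed; the GLES2 header is enough. */
    #include <GLES2/gl2.h>

    GLuint make_fbo(void) {  /* call with a current EGL context */
        GLuint fbo = 0;
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        return fbo;
    }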

How to tell autoconf about broken cross HOST functions?

I am trying to cross-build an autotools enabled package for an unusual embedded system with a very incomplete libc. (If it is relevant: The package is CPython 3.4.2 and the "embedded system" is the command line shell on Android 4.4.)
AFAIK there's no way that configure running on my build machine can determine which functions on the host are broken. (configure can, and does, compile and link test programs on the build machine, but it has no access to running the program on the host.) So, for example, the wcsftime() function is declared in the host's <wchar.h> header and defined in the host's libc, but the implementation is incorrect.
For this package configure builds a config.h file with a C macro HAS_WCSFTIME, which is defined if configure believes the host has a working wcsftime() and is undef'd otherwise. And the package's source code is correctly ifdefed so that if wcsftime() is missing, strftime() is used instead, with proper conversions back and forth between 7-bit ascii and UCS-4.
I can't just run configure with:
CPPFLAGS=-UHAS_WCSFTIME configure --build=... --host=... ...
because the config.h file just redefines it anyway.
The options I've come up with so far are:
add an ac_cv_broken_host_wcsftime cache variable to the configure.ac file
add ifdefs for a HAS_BROKEN_WCSFTIME macro to the sources
fix the host's libc
create a patch for config.h that flips HAS_WCSFTIME from defined to undefined, and remember to run patch every time after I run configure
I've already implemented option (4), and it is... unsatisfying. I can do (1) or (2) and contribute it back to the package developers, but then it will be months before the changes get incorporated. I'm working on option (3), but that will take years to get deployed to the majority of my users' phones and tablets.
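For reference, if option (1) were implemented with an AC_CACHE_CHECK in configure.ac, the override could then be supplied on the configure command line, since autoconf lets you preset cache variables that way (a sketch, reusing the elided triplets from above):

    ac_cv_broken_host_wcsftime=yes ./configure --build=... --host=... ...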
What's the right way to deal with this problem? (I expect it to come up a lot since I've got a lot of different packages that I want to get working, and there are dozens of broken functions in libc.)
Is there some command line option to configure that will let me control which CPP macros do and do not get defined?
Is there some command line option to configure that will let me control which CPP macros do and do not get defined?
No.
Your best bet is to talk with the package maintainers; they can help you put an acceptable patch together for their package. You can then apply that patch locally until it gets merged upstream.
As an alternative to 4), you could also patch configure itself, especially if there's a bootstrap script that is invoked to create configure. Doing actions in the bootstrap script to fix up configure or libtool, etc. is one of the ways I've solved this problem in the past.
If in (3) you mean Bionic as the libc, I'd think "never" is probably a more accurate timetable than "years" for getting working wide-character functions into it.
AFAIK there's no way that configure running on my build machine can determine which functions on the host are broken. (configure can, and does, compile and link test programs on the build machine, but it has no access to running the program on the host.)
Mostly true. Scratchbox2 will allow you to do runtime configure tests on the host, but it doesn't support Android, unfortunately.

Android NDK API list?

Is there a way/tool to enumerate all the C function prototypes (not sure if that's the right term) from the .h files in the NDK folder subdirectories (i.e. C:\android-ndk-r8b\platforms\android-9\arch-arm\usr\include) and produce something like a javadoc? The reason I'm asking is because we have a developer on our end who is trying to port his Windows code over to the android platform and before he begins to go forward with such an effort, he needs to know what API calls are supported so he can begin changing his code base to make it Android-NDK compliant. I've run across the following links in my search for an answer:
http://mobilepearls.com/labs/native-android-api/#c++
C:/android-ndk-r8b/docs/STABLE-APIS.html
C:/android-ndk-r8b/docs/CPLUSPLUS-SUPPORT.html
The right way to check whether the code is compliant is to compile it and see what breaks. Win32 API calls surely will break; there's no point even checking whether an NDK counterpart exists. The C/C++ RTL is a little trickier: some functions have counterparts, some don't. But enumerating them all and matching by hand is, frankly, a waste of time. A compiler will do the same much faster.
For starters, make the code Linux compliant. The minute differences between Android libs and desktop Linux can usually be resolved incrementally. Note that you have a moving target here: platform-14 supports many more Linux headers than platform-9.
http://www.tenouk.com/Module000linuxnm3.html
Can you use the libs instead of the .h files? The link above explains using 'nm' to dump the symbols from libraries.
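For example, against the NDK layout from the question (a sketch; it assumes the toolchain's nm for the target architecture is on your PATH):

    arm-linux-androideabi-nm -D --defined-only C:/android-ndk-r8b/platforms/android-9/arch-arm/usr/lib/libc.so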

How to integrate Scala into core Android platform?

I am interested in integrating Scala (or some other non-Java JVM language) into the Android platform. I am not referring to writing an Android application with Scala, which I did early on, but actually hooking into the build process that builds the Android platform source tree. I imagine this will be a matter of hooking into the makefiles and such. Does anyone have insight into this?
What I have so far:
The platform source tree from git://android.git.kernel.org/platform/manifest.git built in its virgin form, guided by "[Download and build the Google Android][1]"
build/core/combo/scalac.mk # Configures scala compiler related variables, included by config.mk
Added definitions in build/core/definitions.mk for an all-subdir-scala-files and an all-scala-files-under (sketched after this list)
Added definition in definitions.mk to build scala files such that they are included in the package
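The new functions are modeled directly on their Java counterparts in definitions.mk; roughly (a sketch):

    # Scala analogues of all-java-files-under / all-subdir-java-files
    define all-scala-files-under
    $(patsubst ./%,%, \
      $(shell cd $(LOCAL_PATH) ; \
              find $(1) -name "*.scala" -and -not -name ".*") \
     )
    endef

    define all-subdir-scala-files
    $(call all-scala-files-under,.)
    endef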
What's left:
Include scala-library.jar
Ensure changes to -bootclasspath has not broken anything
Figure out how to handle the case where Scala classes depend on Java classes and vice versa
Major cleanup of code
Testing!
Figure out what to do (other than just posting them here) with the changes I've made
Looks like I'm almost there!!!
Some notes from the past
Latest: I have found where the Java source files are compiled! In definitions.mk, see 'define transform-java-to-classes.jar'. The latest idea is to write a transform-scala-to-classes definition and have it store those classes in the directory that gets packaged. I will call transform-scala-to-classes right before this step in transform-java-to-classes.jar. Support for Eclipse and Cygwin will be dropped for now, as it clutters up the code with workarounds and therefore increases my chances of failure.
The build process starts out by the root Makefile running build/core/main.mk
build/core/main.mk includes build/core/config.mk, which includes build/core/combo/javac.mk, which sets HOST_JAVAC, TARGET_JAVAC, and COMMON_JAVAC. COMMON_JAVAC is the "Java compiler command with common arguments"; by the look of it, the other two variables get this value by default, unless set in a special environment (OpenJDK or Eclipse). COMMON_JAVAC is not used outside this file. The other two are only used in build/core/definitions.mk.
build/core/java_library.mk (included by config.mk) seems to be concerned only with building jars. That is outside the scope of what we care about here: any interaction with jars presupposes class files, which presupposes we were already successful in building our Scala files.
There are checks in main.mk regarding the version of Java. We will ignore these and assume that our version of Scala is compatible. Right now (in combo/scalac.mk) I am using the same --target arg used in javac.mk. This should perhaps be stored in a variable.
main.mk also includes build/core/definitions.mk, which in turn defines some useful functions. The ones we care about here are all-java-files-under and all-subdir-java-files. The latter is used in Android.mk files to find Java files; the former is used in the implementation of the latter. I will write Scala equivalents of them.
To figure out how the build process works, I am now running make with -n and others. I got this idea from the stackoverflow article "[Tool for debugging makefiles][2]". I am also investigating debugging with remake.
build/core/{config.mk, definitions.mk} shed light on which makefiles/commands are used to do what.
As a possible way of hacking in support on a per-project basis, additional code could most likely be added to the project's Android.mk file. From platform/build/core/build-system.html we read: "Android.mk is the standard name for the makefile fragments that control the building of a given module. Only the top directory should have a file named "Makefile"." You could create a new target like "scala-build" and run that (make PackageName scala-build) before the final make. One could perhaps also hide it sneakily in a variable assignment, mitigating the need for a target to be called explicitly.
Another way (far far more hackish) is to hijack the command being used for javac. This is set in build/core/combo/javac.mk. Your project's Android.mk will have to include *.scala files in LOCAL_SRC_FILES along with the *.java files.
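In that scheme, a project's Android.mk might look like this (a hypothetical fragment, using the helper sketched earlier):

    LOCAL_SRC_FILES := $(call all-subdir-java-files) \
                       $(call all-subdir-scala-files)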
Guys on reddit say there's a tutorial on integrating Scala into Android with Ant here.
