How do I build just the LatinIME package from AOSP?

I'm trying to test some changes that I made to the LatinIME package in AOSP. The problem is that the documentation only shows how to build the entire thing.
What I really need to know is how to build a single package (in this case, LatinIME) from the command line.
Edit: What isn't made clear (at least to me) is that in the repo root directory you can type make PACKAGE (e.g. `make LatinIME`), and it will build that package. I haven't tested it thoroughly, but it does appear to build all the prerequisites of the requested package as well.

I think you want the mm or mmm command. See this documentation:
Building only an individual program or module
If you use build/envsetup.sh, you can use some of the defined functions to build only a part of the tree. Use the 'mm' or 'mmm' commands to do this.
The 'mm' command makes stuff in the current directory (and sub-directories, I believe). With the 'mmm' command, you specify a directory or list of directories, and it builds those.
To install your changes, do 'make snod' from the top of tree. 'make snod' builds a new system image from current binaries.
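For reference, here is a minimal sketch of that workflow from the top of the tree (the lunch target and the module directory are only examples; adjust them to your setup):
$ source build/envsetup.sh
$ lunch full-eng
$ mmm packages/inputmethods/LatinIME    # build only this directory
$ make snod                             # rebuild the system image from current binaries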

Related

Using OpenCv contrib modules for android

Is there a way to use OpenCV contrib modules on Android? I am specifically using the text module. Is there an Android lib for these modules? I have my code working on desktop and I am trying to migrate it to Android. Any insight would be great.
I was having issues figuring this out as well, so I thought I would put a response in for the community in case others are looking for solutions to a similar problem. Compilation was done on a MacBook Retina 13".
The instructions provided online are somewhat incomplete, and there are additional steps needed to get to a final product.
To start, you follow the standard procedure as outlined online:
$ cd <opencv_directory>
$ mkdir build
$ cd <opencv_build_directory>
$ cmake -D OPENCV_EXTRA_MODULES_PATH=<opencv_contrib>/modules <opencv_source_directory>
$ make -j5
$ make install
In addition to this, you may run across an error or two. I needed to install some missing components to get past them, but this may differ for you (I researched the errors and understood that I needed these additional components):
brew install ninja
brew install doxygen
brew install ant
I also ran into an error with one module requiring the following to be declared in the source code (or via compiler flags):
#define SOLARIS_64BIT_ENABLED
Another thing you can do is remove the modules in the contrib folder you are not interested in before compilation. Just include the modules you want, and hopefully those ones build cleanly. I did this simply by removing one or two from the /modules folder (as sketched below) and then re-running the python script.
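For example, something like this (a rough sketch; the module names are just placeholders for whatever you don't need):
$ cd <opencv_contrib>/modules
$ rm -rf <module_you_dont_need> <another_module_you_dont_need>
$ cd -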
A final python script was needed to run the build. I created a directory alongside the main source tree and contrib folder.
OpenCVSource
-> opencv
-> opencv_contrib
-> android_opencv_build
The call below was made from the directory where I wanted the build to take place, so I changed to that directory first. The call was the following:
python ../opencv/platforms/android/build_sdk.py --extra_modules_path ../opencv_contrib/modules --ndk_path <your-path-to-ndk-top-level-folder> --sdk_path <your-path-to-sdk-top-level-folder> ./ ../opencv
This only builds the .so files that are necessary for using the library, but it doesn't build the .jar file that you will need to use the new binaries. In order to do that, navigate to your build folder (mine, as seen above, is android_opencv_build/OpenCV-android-sdk).
Load this project into Eclipse in the standard manner with "Import existing Android project into workspace". You really only need the /sdk project, but feel free to load the samples as well if desired. Then build the project. You may need to alter the build target to support the new Camera APIs for a successful build; in my case that meant changing the target to API level 21.
You will then find the .jar file in the /bin directory of the project. The .so files found in android_opencv_build/OpenCV-android-sdk/sdk/native/jni/ are the native libraries that you will need to include in your project's /libs folder alongside this jar, roughly as sketched below.
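A rough sketch of pulling those artifacts into an application project (the project name and the exact source paths are placeholders; use the locations described above):
$ cp <sdk-project>/bin/<opencv-library>.jar <your-app>/libs/
$ mkdir -p <your-app>/libs/armeabi-v7a
$ cp <path-to-native-so-files>/*.so <your-app>/libs/armeabi-v7a/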
Now you should have everything that you need. Since we are working with contrib modules that are not quite stable (or even if you are building for other reasons), it is possible that you will run across other errors in the build process that will need some attention. This cannot be helped, but feel free to add comments to this post and to other people's solutions to help resolve them if you have found a fix.

How to tell autoconf about broken cross HOST functions?

I am trying to cross-build an autotools enabled package for an unusual embedded system with a very incomplete libc. (If it is relevant: The package is CPython 3.4.2 and the "embedded system" is the command line shell on Android 4.4.)
AFAIK there's no way that configure running on my build machine can determine which functions on the host are broken. (configure can, and does, compile and link test programs on the build machine, but it has no access to running the program on the host.) So, for example, the wcsftime() function is declared in the host's <wchar.h> header and defined in the host's libc, but the implementation is incorrect.
For this package configure builds a config.h file with a C macro HAS_WCSFTIME, which is defined if configure believes the host has a working wcsftime() and is undef'd otherwise. And the package's source code is correctly ifdefed so that if wcsftime() is missing, strftime() is used instead, with proper conversions back and forth between 7-bit ascii and UCS-4.
I can't just run configure with:
CPPFLAGS=-UHAS_WCSFTIME configure --build=... --host=... ...
because the config.h file just redefines it anyway.
The options I've come up with so far are:
1. add an ac_cv_broken_host_wcsftime variable to the configure.ac file
2. add ifdefs for a HAS_BROKEN_WCSFTIME macro to the sources
3. fix the host's libc
4. create a patch for config.h that flips HAS_WCSFTIME from defined to undefined, and remember to run patch every time after I run configure
I've already implemented option (4) and it is ... unsatisfying. I can do (1) or (2) and contribute it back to the package developers, but then it will be months before the changes get incorporated. I'm working on option (3), but that will take years to get deployed to the majority of my users' phones and tablets.
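For what it's worth, option (4) can at least be scripted so the step isn't forgotten; a minimal sketch, assuming config.h ends up containing a line reading exactly "#define HAS_WCSFTIME 1":
$ ./configure --build=... --host=... ...
$ sed -i 's|^#define HAS_WCSFTIME 1$|/* #undef HAS_WCSFTIME */|' config.h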
What's the right way to deal with this problem? (I expect it to come up a lot since I've got a lot of different packages that I want to get working, and there are dozens of broken functions in libc.)
Is there some command line option to configure that will let me control which CPP macros do and do not get defined?
Is there some command line option to configure that will let me control which CPP macros do and do not get defined?
No.
Your best bet is to talk with the package maintainers. They can help you put an acceptable patch together for their package. You can then apply this patch locally until it gets merged, using the following method.
As an alternative to 4), you could also patch configure itself, especially if there's a bootstrap script that is invoked to create configure. Doing actions in the bootstrap script to fix up configure or libtool, etc. is one of the ways I've solved this problem in the past.
If in 3) you mean Bionic as libc, I'd think that "never" is probably a more accurate timetable than "takes years" to get wide character functions into it.
AFAIK there's no way that configure running on my build machine can determine which functions on the host are broken. (configure can, and does, compile and link test programs on the build machine, but it has no access to running the program on the host.)
Mostly true. Scratchbox2 will allow you to do runtime configure tests on the host, but it doesn't support Android, unfortunately.

Android LatinIME build

I want to make some changes to LatinIME. I got the code from git repository-
git clone https://android.googlesource.com/platform/packages/inputmethods/LatinIME
But I don't know how to build the apk file from the code. If anyone has built LatinIME from the code, can you please share instructions?
Specifically, I want to know how to build the dictionary tools (I guess I would need the NDK), how to build the native code (again I guess it would require the NDK), and finally how to build the Java code using the lib file from the native code.
I tried creating an Android app project in Eclipse (using the existing-code option) by giving the root directory as LatinIME/java. I was able to compile, but since it didn't have libjni_latinime.so, it crashed. I then got the .so file from the emulator and put it in the libs/armeabi-v7a folder. Now I get this exception:
10-15 12:54:55.289: E/AndroidRuntime(32253): FATAL EXCEPTION: InitializeBinaryDictionary
10-15 12:54:55.289: E/AndroidRuntime(32253): android.content.res.Resources$NotFoundException: File res/raw/main_en.dict from drawable resource ID #0x7f070003
I think I may have solved this...
Having encountered a similar problem in another project where resources were being unnecessarily compressed due to their file extension, I renamed the dictionaries (.dict) to .jet - an extension excluded from compression. Voila, the dictionaries are now working. I'm not sure how good a resolution that is, seeing as the dictionaries are now uncompressed, but it's a step in the right direction at least.
So far I have customised LatinIME many times for different projects, and I never faced this problem.
But I never used Eclipse to create the apks. I downloaded the whole AOSP source onto my machine and compiled the modified source with AOSP. mm creates the apk file in the out folder, and it can be installed with adb install -r latinime.apk.
Here is how to download AOSP: http://source.android.com/source/downloading.html
And here is how to compile it initially: http://source.android.com/source/initializing.html and http://xda-university.com/as-a-developer/getting-started-building-android-from-source
LatinIME can be found in <android root>/packages/inputmethods/LatinIME. Modify the code there, cd to the same path, and run mm (you need to run source build/envsetup.sh and lunch full-eng in the same terminal before doing mm), roughly as shown below.
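A sketch of the whole sequence (the lunch target is an example, and the output path depends on it; mm prints the actual location of the apk it builds):
$ cd <android root>
$ source build/envsetup.sh
$ lunch full-eng
$ cd packages/inputmethods/LatinIME
$ mm
$ adb install -r <android root>/out/target/product/<product>/system/app/LatinIME.apk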
First, some background. As also suggested by the other answer, the issue seems to be related to .dict files being compressed. For example, you can see how official Android builds solve this in the project's tests for LatinIME:
# Do not compress dictionary files to mmap dict data runtime
LOCAL_AAPT_FLAGS += -0 .dict
A quick search of the web reveals that, as of today, passing this kind of directive to aapt from Eclipse isn't trivial. You would probably end up creating a custom build.xml if you want to handle the don't-compress-dicts case properly.
One nice suggestion is this answer/question on how to instruct aapt not to compress certain files.
If you want to build this from the official git link you provided, you'll end up building the whole Android repo, which you can do by following the building/running instructions.
If using Gradle, add this:
android {
    aaptOptions {
        noCompress 'dict'
    }
}

android aapt in eclipse

I saw some info regarding this question in several threads, but none suited my situation.
I have an Android application for which I now need to customize some resources and code.
For the time being I have some problems using an Android library project, so I have an Ant build that copies the base resources and assets plus the specific ones into my Android project and then changes the package name in the manifest as needed. All my activities are declared with constant paths, not paths relative to the package name, so that's not an issue.
The problem is with the R class in the gen folder that is generated by aapt. aapt does have a parameter to use a package other than the one in the Android manifest, but it's available only if I use the Ant build file; the parameters for ADT are hard-coded in the plugin.
Has anyone found a solution to this? I mean, I can always use an Ant task to change all the R references (imports), but that looks error-prone to me. Is there any way to customize aapt, other than a wrapper script that won't work on Windows?
Again... no answer for my question; I'll start thinking you guys have something against me.
In short, I see there's no built-in solution. The things I can do:
1. Use an Ant task that regexps all the import com....R; statements to the new package - disadvantage: with even the smallest issue in the regexp I may mess something up in the code that will only show at runtime.
2. Use a wrapper script around aapt.exe or the aapt binary that adds --custom-package my.package.name, so that even if I change the manifest package the original R files are kept, for both Ant and Eclipse - disadvantages: on Windows this needs to be an executable file, as Eclipse looks for one (it took me a while to build one properly in C), while on Linux/Mac it could be a script of any kind, so we need two versions of it; and with every update to the SDK we need to install the wrapper all over again.
3. Edit the Ant tasks jar code to allow --custom-package and edit the ADT source code to enable custom parameters in the menu - this is a great feature that should be included! But the disadvantage is that until the code is accepted (if it is accepted) and added to the SDK itself, I'll need to merge my changes every time a new SDK emerges; unlike #2 this is a lot harder.
I chose option #2. For now I have an executable for Windows, and I'll check options for Linux and Mac (probably a simple bash script, roughly like the sketch below) once I'm there, and I'll create a Python installation script and save it all in SVN along with my build project.
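For reference, on Linux/Mac the wrapper could be a short shell script along these lines (a sketch only; it assumes the original aapt binary has been renamed to aapt.real next to the wrapper, and my.package.name is whatever package the R class should keep):
#!/bin/bash
# Forward everything to the real aapt, adding --custom-package only to 'package' invocations.
REAL_AAPT="$(dirname "$0")/aapt.real"
if [ "$1" = "package" ]; then
    shift
    exec "$REAL_AAPT" package --custom-package my.package.name "$@"
else
    exec "$REAL_AAPT" "$@"
fi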
Good luck to everyone....

How to integrate Scala into core Android platform?

I am interested in integrating Scala (or some other non-Java JVM language) into the Android platform. I am not referring to writing an Android application with Scala - that I did early on - but actually hooking into the build process that builds the Android platform source tree. I imagine this will be a matter of hooking into the makefiles and such. Does anyone have insight into this?
What I have so far:
The platform source tree from git://android.git.kernel.org/platform/manifest.git built in its virgin form, guided by "[Download and build the Google Android][1]"
build/core/combo/scalac.mk # Configures scala compiler related variables, included by config.mk
Added definitions in build/core/definitions.mk for an all-subdir-scala-files and an all-scala-files-under
Added definition in definitions.mk to build scala files such that they are included in the package
What's left:
Include scala-library.jar
Ensure changes to -bootclasspath have not broken anything
Figure out how to handle the case where scala classes depend on java classes and vice versa
Major cleanup of code
Testing!
Figure out what to do (other than just posting them here) with the changes I've made
Looks like I'm almost there!!!
Some notes from the past
Latest: I have found where the Java source files are compiled! In definitions.mk, see 'define transform-java-to-classes.jar'. The latest idea is to write a transform-scala-to-classes definition and then have it store those classes in the directory that gets packaged. I will call transform-scala-to-classes right before this step in transform-java-to-classes.jar. Support for eclipse and cygwin will for now be dropped, as it clutters up the code with workarounds and therefore increases my chances of failure.
The build process starts out by the root Makefile running build/core/main.mk
build/core/main.mk includes build/core/config.mk which includes build/core/combo/javac.mk which sets HOST_JAVAC, TARGET_JAVAC, and COMMON_JAVAC. COMMON_JAVAC is the "Java compiler command with common arguments," by the look of it the other two variables get these values by default, unless in a special environment (openjdk or eclipse). COMMON_JAVAC is not used outside this file. The other two are only used in build/core/definitions.mk.
build/core/java_library.mk (included by config.mk) seems to only be concerned with building jars. This is out of the scope of us caring. Any interaction with jars presupposes class files which presuppose that we were already successful in building our scala files.
There are checks in main.mk regarding the version of java. We will ignore these and assume that our version of scala is compatible. Right now (in combo/scalac.mk) I am using the same --target arg used in javac.mk. This should perhaps be stored in a variable.
main.mk also includes build/core/definitions.mk which in turn defines some useful functions. The ones we care about here are all-java-files-under and all-subdir-java-files. The latter is used in Android.mk files to find java files. The former is used in the implementation of the latter. I will write Scala equivalents of them.
To figure out how the build process works, I am now running make with -n and others. I got this idea from the stackoverflow article "[Tool for debugging makefiles][2]". I am also investigating debugging with remake.
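For example, to see what the build would do without actually running it (the target name here is only an example), and to print the full command lines via the build system's showcommands goal:
$ make -n LatinIME 2>&1 | less
$ make LatinIME showcommands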
build/core/{config.mk, definitions.mk} gives us light as to which make files/commands are used to do what.
As a possible way of hacking in support on a per project bases, additional code could most likely be added to the project's Android.mk file. From platform/build/core/build-system.html we read "Android.mk is the standard name for the makefile fragments that control the building of a given module. Only the top directory should have a file named "Makefile"." You could create a new target like "scala-build" and run that (make PackageName scala-build) before the final make. One could perhaps also hide it sneakily in a variable assignment, mitigating the need for a target to be called explicitly.
Another way (far far more hackish) is to hijack the command being used for javac. This is set in build/core/combo/javac.mk. Your project's Android.mk will have to include *.scala files in LOCAL_SRC_FILES along with the *.java files.
Guys on reddit say there's a tutorial on integrating Scala into Android with Ant here.
