OpenGL ES source files - Android

I am trying to build the OpenGL ES shared library (libGLESv2.so) from the Android sources, and I would like a better understanding of the internal mechanics of OpenGL ES on Android and of the overall flow.
Please correct me where I am wrong:
I know that on Windows a developer includes gl.h and statically links against OpenGL32(64).lib, which in turn dynamically links to OpenGL32.dll (there is probably a way for the developer to link to OpenGL32.dll dynamically, but that is not important here).
The developer sees only the declarations of the OpenGL APIs, not the implementation, which I assume is hardware dependent.
Now the same scenario on Android: assume the developer imports android.opengl.GLES20 and calls a method such as GLES20.glTexEnvf(...).
I would like to know what is going on behind the scenes in Android (an explanation in terms of plain Linux may also help an Android beginner).
The implementation that resides in the opengl/java/android/opengl/GLES20.java source calls the native C function glTexEnvf, and unlike on Windows we actually have its implementation, which resides in opengl/libagl.
Is that correct?
In any case, what is the GLES2_dbg library in /libs/GLES2_dbg? I can see some kind of debug implementation there, with Python scripts... are they used to build a debug version of OpenGL?
What are the .in files and the gl2.cpp file in /libs/GLES20?
Where are the hardware calls? Does each GPU vendor ship its own libGLESv2 implementation for the hardware calls? I ask because I saw libGLESv2_adreno200.so on my Xperia Arc.
Please help me understand the flow. A link that explains this structure, even for plain Linux, would be great.

On Windows, opengl32.dll contains both a software rasterizer fallback and so-called trampolines into the OpenGL ICD that ships with the GPU driver.
opengl32.lib is not really a library but a cross-reference for the linker: it adds entries to the executable that make the OS dynamically link the program against the DLL at runtime.
On Linux, in the current implementation, libGL.so ships with the graphics driver and contains the vendor-specific implementation. The linkers used on *nix systems don't rely on an extra cross-referencing .lib but take the information directly from the .so.
On Android the libGLES you see is only a kind of placeholder to make linking possible. But ultimately the GPU vendor provides the proper library, which drops into the place where the phony libGLES resided.
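To make that runtime dispatch a bit more concrete, here is a minimal sketch of how a loader can bind to a vendor GLES driver with dlopen/dlsym. This is not the actual Android loader code, and "libGLESv2_vendor.so" is a made-up name standing in for a real vendor library such as libGLESv2_adreno200.so:

    /* simplified sketch of runtime binding to a vendor GLES driver;
     * the real Android loader (Loader.cpp in the EGL sources) is far more involved */
    #include <dlfcn.h>
    #include <stdio.h>

    typedef void (*glClear_t)(unsigned int mask);

    int main(void) {
        /* "libGLESv2_vendor.so" is a placeholder for the vendor's driver library */
        void *drv = dlopen("libGLESv2_vendor.so", RTLD_NOW | RTLD_LOCAL);
        if (!drv) {
            fprintf(stderr, "no vendor driver: %s\n", dlerror());
            return 1;
        }
        /* resolve one entry point; a real loader fills a whole table of hooks this way */
        glClear_t my_glClear = (glClear_t) dlsym(drv, "glClear");
        if (my_glClear)
            printf("resolved glClear at %p\n", (void *) my_glClear);
        dlclose(drv);
        return 0;
    }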
The .in files are nothing special: they are input files used by configure and build systems to generate source files from a template (the .in file), with fields filled in from configuration values.

Thanks for the quick answer. I did a little more digging and found further explanation here:
Missing OpenGL drivers on Android emulator
What I understand now is that libagl is a pure software implementation.
In that case libhgl is actually the GPU vendor's implementation.
I also understood that libEGL opens libGLESv2 (I found this in the code, in Loader.cpp).
So I will ask two more questions:
1. Does libGLESv2 dynamically link to the hardware library, or does libEGL do that? (I found something in EGL's Loader.cpp that seems to dynamically bind the OpenGL APIs.)
2. So when I call an OpenGL API, does the call go through libEGL (since that is where the dynamic binding happens) and from there to libGLESv2?
Thanks a lot for your help; it is starting to make sense now.


What are the most important POSIX functions not available in Android?

I'm about to port a large C++ project (a library project of sorts; it contains absolutely no GUI) to Android. It's actually a Visual C++ project, but it will be ported to Linux as an intermediate step. I know that Android is not a "full" Linux and does not claim to provide the complete set of POSIX functions, but I also know there are a lot of "POSIXish" functions available on Android through the NDK.
Now my actual question is:
Which are the biggest/most important functions that are NOT available on Android compared with the full POSIX set, so that I can keep them in mind when porting from Visual C++ to Linux GCC?
I tried to find something on Google, but found nothing really helpful, just scattered mentions that some POSIX functions exist on Android...
Bionic is Google's own recode of libc. It is small but optimized for Android.
The only big thing I know of that it lacks is indeed the pthread_cancel() function.
My experience is that if you port it successfully to GNU/Linux without pthread_cancel() calls, then you should be mostly OK.
By the way, what kind of library are you trying to build? What does it use? Network, threads...?
PS: Even Linux is not fully POSIX-compliant.
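Since pthread_cancel() is the biggest gap in practice, a common workaround is cooperative cancellation: the worker checks a shared flag and exits cleanly instead of being cancelled asynchronously. A minimal sketch (plain C11, nothing Android-specific):

    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdbool.h>
    #include <stdio.h>
    #include <unistd.h>

    static atomic_bool stop_requested;

    static void *worker(void *arg) {
        (void) arg;
        while (!atomic_load(&stop_requested)) {
            /* do one bounded unit of work, then re-check the flag */
            usleep(10000);
        }
        return NULL;   /* clean exit instead of asynchronous cancellation */
    }

    int main(void) {
        pthread_t t;
        pthread_create(&t, NULL, worker, NULL);
        sleep(1);
        atomic_store(&stop_requested, true);   /* request shutdown */
        pthread_join(t, NULL);
        puts("worker stopped");
        return 0;
    }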
Bionic Wikipedia page
https://en.wikipedia.org/wiki/Bionic_(software)#Differences_from_POSIX
Also has some interesting info:
Although bionic aims to implement all of C11 and POSIX, there are still (as of Oreo) about 70 POSIX functions missing[8] from libc. There are also POSIX functions such as the endpwent/getpwent/setpwent family that are inapplicable to Android because it lacks a passwd database. As of Oreo, libm is complete.
Some functions deliberately do not conform to the POSIX or C standards for security reasons, such as printf which does not support the %n format string.[9]
A quote from the official in-tree Bionic documentation:
https://android.googlesource.com/platform/bionic/+/37ad9597839c70a7ec79578e5072df9c189fc830/docs/status.md
Run ./libc/tools/check-symbols-glibc.py in bionic/ for the current list of POSIX functions implemented by glibc but not by bionic. Currently (2017-10):
aio_cancel
aio_error
aio_fsync
aio_read
aio_return
aio_suspend
aio_write
lio_listio
pthread_cancel
pthread_mutex_consistent
pthread_mutex_getprioceiling
pthread_mutex_setprioceiling
pthread_mutexattr_getprioceiling
pthread_mutexattr_getprotocol
pthread_mutexattr_getrobust
pthread_mutexattr_setprioceiling
pthread_mutexattr_setprotocol
pthread_mutexattr_setrobust
pthread_setcancelstate
pthread_setcanceltype
pthread_testcancel
wordexp
wordfree
libm
Current libm symbols: https://android.googlesource.com/platform/bionic/+/master/libm/libm.map.txt
0 remaining missing POSIX libm functions.
The most obvious feature missing is pthread_cancel().
This blog has some additional details: http://codingrelic.geekhold.com/2008/11/six-million-dollar-libc.html
Good overview of Bionic: https://android-platform.googlegroups.com/attach/0f8eba5ecb95c6f4/OVERVIEW.TXT?gda=HWJaO0UAAAB1RXoVyyH5sRXFfLYnAq48KOFqr-45JqvtfiaR6gxIj4Qe8cBqW3BaWdSUPWi_jHqO3f1cykW9hbJ1ju6H3kglGu1iLHeqhw4ZZRj3RjJ_-A&view=1&part=4
Shared memory is also something you may find implemented differently on Android. I was hit hard while trying to use shm_open and shm_unlink on the Android kernel. Android implements anonymous shared memory (ashmem) instead.
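For completeness, newer NDKs expose ashmem through the ASharedMemory API (android/sharedmem.h, API level 26+); older versions only offer the /dev/ashmem ioctl interface or Java's MemoryFile. A minimal sketch of the API-26+ path; create_shared_buffer is a hypothetical helper name, not an NDK function:

    #include <android/sharedmem.h>   /* NDK, API level 26+ */
    #include <stddef.h>
    #include <sys/mman.h>
    #include <unistd.h>

    /* hypothetical helper: create an ashmem region and map it into this process */
    int create_shared_buffer(size_t size, void **out_ptr) {
        int fd = ASharedMemory_create("demo_region", size);  /* name is only for debugging */
        if (fd < 0)
            return -1;
        void *p = mmap(NULL, size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (p == MAP_FAILED) {
            close(fd);
            return -1;
        }
        *out_ptr = p;
        return fd;   /* pass this fd to another process (e.g. over Binder) to share the region */
    }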

Android API from Go

I know that Go programs can be compiled for Android.
How can I use Android specific API, like getting GPS coordinates or opening a URL with the default browser, from within a Go program?
I'm afraid it's hardly possible at the moment. In the "Meet the Go Team" I/O sessions, the guys from the Go team stated that they have no plans to add Android support to Go.
What we have now is just a compiler for the ARM architecture. Unfortunately, this is pretty much useless for real Android apps, though such programs can be launched from the command line on Android devices.
Most of the Android framework is written in Java, so to interact with it your code would have to be compiled into a *.so library that is loaded and called via JNI. That is not possible with the current Go compiler (gc; not sure about gccgo).
Maybe you will be able to make bindings to the Android NDK API with cgo; that would allow you to create applications in Go for API level 9 (Android 2.3) and above.
UPD: You can now use JNI from Go and create Java bindings automatically with the golang.org/x/mobile package. In Go 1.4 it's still experimental, but there are plans to include it in the Go 1.5 release. The package also provides bindings for GL, audio and user input (hopefully iOS support will be added too, so the same code can one day target both Android and iOS). Anyway, this package is mostly oriented toward writing games in Go rather than using Go as a replacement for Java on Android.
Take a look at my answer to Android App from Go programming language. The goandroid project allows you to create Android apps in Go through the NDK.
(Disclaimer: I'm the author of goandroid)
Edit: As mentioned in the comments, Go 1.5 adds official support for Android apps in pure Go or as a mix of Java and Go. iOS is also expected to arrive in time for the final 1.5 release. See https://github.com/golang/mobile for more details.
The Go 1.4 docs say: "Go 1.4 can build binaries for ARM processors running the Android operating system. It can also build a .so library that can be loaded by an Android application using the supporting packages in the mobile subrepository."
There is the app package in the "golang.org/x/mobile/app" library that lets you write apps for Android (and eventually iOS).
Step 1: Create a platform independent GUI library using Golang that uses OpenGL to draw and an intelligent event and data-binding system to write the apps in. Any software using OpenGL is going to be more-or-less portable. Essentially, re-write Kivy in Golang.
Step 2: Create introspection/reflection based wrapper for using Java classes similar to PyJNIus (also a Kivy project).
Step 3: Lots more hard work, because there is a lot to do to get to the level of Kivy
Step 4: Profit

Is it possible to use GIMP's image capabilities from an Android application?

I had been using OpenCV for some time for programming on Android, and I now see that the GIMP library is much stronger. Where can I find a starting point to learn GIMP?
I also want to know the basic concepts behind GIMP plug-ins. In the past, I used the C APIs in OpenCV. How could I write the code for Android?
Also, what packages do I need to install on Windows to start using GIMP?
Although GIMP does have some standalone libraries that perform some image manipulation, most image manipulation is done either by GIMP's core program or through GIMP's plug-ins. Both approaches need the entire program installed and running (though not necessarily using a display).
I know nothing about Android programming and don't know how one installs ordinary native C code and calls it from Android apps; if you are very familiar with that, you might have a chance in your attempt.
However, GIMP itself relies on an extensive ecosystem of libraries, including but not limited to glib, GTK+, cairo, pango and GEGL, and each of these in turn may have further prerequisites. Since Windows does not have a working package manager to automatically install the libraries and header files of these various projects, working with them natively on Windows is very hard, even though the code of each of them is multiplatform and can run on Windows and other OSes. So hard, in fact, that the people who build GIMP for Windows do so in a Linux environment, from which they cross-compile GIMP for Windows.
Making all of these libraries work on Android is probably not hard if you are using the GNU ecosystem around Android's Linux kernel, and not just the bare Android environment (I don't know enough about Android to even know whether that is possible).
All in all: it will be tough for you, and it will demand a whole lot of research.
One of GIMP's libraries, GEGL (the Generic Graphics Library), has far fewer prerequisites and can be used as an ordinary library. I think you can probably build it with just glib and babl as prerequisites. This is the library that will replace GIMP's current core and reimplement the operations of most existing plug-ins, so it might be enough for you.
If you can get GEGL running and usable on an Android system, share that with the world; it would be, in itself, worthy of a Google Summer of Code project. (And it would still be about an order of magnitude easier than getting GIMP's code in there to be usable as a library from other applications.)
Finally, if you want just a couple of GIMP's effects and the effect is implemented as a GIMP plug-in, the plug-in code is quite straightforward. So, while it would be hard to get the whole GIMP environment onto Android, copying the functions that actually perform the pixel manipulation from GIMP's source tree and converting them to work in a Java method (or a native function) inside your app would not be hard. Just remember to comply with the license in this case: GIMP's plug-in code is under GPLv3 (the GEGL library is only LGPL).
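To give an idea of what such a ported per-pixel routine can look like, here is a minimal sketch that runs a trivial operation (an invert) over an Android Bitmap through the NDK's jnigraphics API rather than a pure Java loop. The Java class and method names are hypothetical; only the AndroidBitmap_* calls are real NDK functions:

    #include <jni.h>
    #include <stdint.h>
    #include <android/bitmap.h>   /* link against -ljnigraphics */

    /* hypothetical native method: void com.example.Filters.invert(Bitmap bmp) */
    JNIEXPORT void JNICALL
    Java_com_example_Filters_invert(JNIEnv *env, jclass cls, jobject bitmap) {
        AndroidBitmapInfo info;
        void *pixels;
        (void) cls;
        if (AndroidBitmap_getInfo(env, bitmap, &info) < 0) return;
        if (info.format != ANDROID_BITMAP_FORMAT_RGBA_8888) return;
        if (AndroidBitmap_lockPixels(env, bitmap, &pixels) < 0) return;

        for (uint32_t y = 0; y < info.height; y++) {
            uint32_t *row = (uint32_t *) ((uint8_t *) pixels + y * info.stride);
            for (uint32_t x = 0; x < info.width; x++) {
                uint32_t p = row[x];
                row[x] = (p & 0xFF000000u) | (~p & 0x00FFFFFFu);  /* invert RGB, keep alpha */
            }
        }
        AndroidBitmap_unlockPixels(env, bitmap);
    }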
In short: no, you can't use GIMP's "libraries" as native code from an Android app. If you can use OpenCV, you have a good chance of being able to use GEGL instead. Only porting the algorithms of certain plug-ins to manipulate pixels in your app would be easier.
However, if your application can delegate image processing to an internet-based server, setting up an HTTP application that receives an image, uses GIMP to process it, and streams it back would be a simple thing to do.
(You could not apply effects in real time, but it would allow one to, for example, take a photo, select a series of effects from menus, and send it to the server for processing.)
GIMP uses quite a bit of memory when loading brushes. If you drop all of the plug-ins you don't need and build it from source, you may be able to get it working, but you will have to build ALL of the linked libraries directly into the executable.
In other words: build the linked libraries directly into the code as a static build. That way things may function properly, unless one of those linked libraries calls another linked library.
Getting the libraries themselves to work on the OS may also give other programs the opportunity to use them. Additionally, GTK+ (the GIMP Toolkit), GIMP's interface toolkit, is rather bloated and ugly.
If all else fails, you'll simply have to settle for a smaller program with the features you're looking for (levels, curves, the clone tool, dodge and burn, etc.). Layers are also nice, but editing a large multi-megapixel image begins to eat up memory rather quickly, and most Android devices don't have a swap partition.

Running a Haskell program on the Android OS

Forenote: This is an extension of the thread started on /r/haskell
Let's start with the facts:
Android is one awesome Operating System
Haskell is the best programming language on the planet
Therefore, clearly, combining them would make Android development that much better. So essentially I would just like to know how I can write Haskell programs for the Android OS. My question is:
How can I get a Haskell program to execute/run on the Android OS?
The way to do it is to first get a Haskell compiler that can target C, using the Android NDK, which comes with a GCC port for ARM architectures. JHC can do this trivially with a very small ini-style file that describes the platform (word size, C compiler, etc.). I've done this with the Wii homebrew dev kit and it was quite easy. However, JHC still has some stability issues with complex code, such as a monad transformer stack over IO, but JHC has been improving a lot over the last six months. There is only one person working on JHC; I just wish more people could help him.
The other option is to build an "unregisterised" port of GHC targeting the NDK GCC. This is a much more involved process, because GHC is not a true cross-compiler at the moment and you need to understand the build system and which parts of it you need to change. Another option is NHC, which can cross-compile to C; like GHC, you need to build NHC targeting a C compiler, and NHC does not have many of GHC's Haskell extensions.
Once you have a Haskell compiler targeting the NDK GCC, you will need to write bindings either to the Android NDK's native glue-code framework (added in Android 2.3), or you must write the JNI glue code between Java, C and Haskell yourself. The former option is the easier solution and, if I remember correctly, might actually be backwards compatible with versions of Android below 2.3.
Once you have this, you must build the Haskell code as a shared or static library that gets linked into the NDK/Java glue code (which is itself a shared library). As far as I'm aware, you cannot officially run native executables on Android. You could probably do it on a rooted phone, so I assume this means you cannot distribute native executables on the app store, even though the NDK GCC port can generate native executables just fine. This also probably kills the option of using LLVM, unless you can get the NDK JNI glue working with LLVM.
The biggest hurdle isn't so much getting a Haskell compiler for Android (which is still a big hurdle); the bigger problem is that someone needs to write binding APIs for the NDK libraries, which is a huge task. The situation is worse if you need to write Android UI code, because there are no NDK APIs for that part of the Android SDK. If you want to write Android UI code in Haskell, somebody will have to write Haskell bindings to Java through JNI/C. Unless there is a more automated process for writing binding libraries (I know there are some; they are just not automated enough for me), the chances of someone doing that are quite low.
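To illustrate the Java-C-Haskell glue described above, here is a minimal sketch of the C side: it starts the Haskell runtime and forwards a JNI call to a function exported from Haskell with foreign export. The Java class name and the hs_compute function are made up for this example; hs_init comes from HsFFI.h as defined by the Haskell FFI spec:

    #include <jni.h>
    #include <stddef.h>
    #include "HsFFI.h"

    /* assumed Haskell side:  foreign export ccall hs_compute :: CInt -> CInt */
    extern HsInt32 hs_compute(HsInt32 x);

    /* hypothetical native methods of a Java class com.example.HaskellBridge */
    JNIEXPORT void JNICALL
    Java_com_example_HaskellBridge_initRTS(JNIEnv *env, jclass cls) {
        int argc = 0;
        char **argv = NULL;
        (void) env; (void) cls;
        hs_init(&argc, &argv);          /* start the Haskell RTS before any Haskell call */
    }

    JNIEXPORT jint JNICALL
    Java_com_example_HaskellBridge_compute(JNIEnv *env, jclass cls, jint x) {
        (void) env; (void) cls;
        return (jint) hs_compute((HsInt32) x);  /* call into the Haskell shared library */
    }

The corresponding Haskell module would foreign export hs_compute and be compiled into the same shared library that the app loads with System.loadLibrary on the Java side.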
L01man: Is there a tutorial about how to do this? For the first part, I understand I have to download JHC. What do I have to write in the ini file, and how do I use it?
Please note, before I answer this question: I haven't used JHC for quite some time since I originally wrote this, and newer versions have been released since, so I do not know how stable JHC currently is when it comes to code generation for more complex Haskell programs. This is a warning to anyone considering a large Haskell program with JHC: do some small tests before you go all in.
JHC does have a manual (http://repetae.net/computer/jhc/manual.html) and a section on setting up cross-compilation and the .ini file options: http://repetae.net/computer/jhc/manual.html#crosscompilation.
L01man: The second part is an alternative to the first. I don't know how to do what you said in the third.
Before you begin, you should have some knowledge of C and be comfortable with the Haskell foreign function interface (FFI) and tools such as hsc2hs. You should also be familiar with using the Android NDK and with building .apk packages that contain shared libraries. You will need all of this to interface between C and Haskell, and between Java/C and Haskell, and to develop Haskell programs for Android that you can officially distribute/sell on the market store.
L01man: I understand that its goal is to create a binding for the Android API. But... does the 4th part say we can't make an .apk with Haskell?
.apk is just an app package file format and is built with the tools that come with the Android SDK (not the NDK); this has very little to do with building the binaries themselves. Android packages can contain native shared libraries; that is what your Haskell program will be, and the native shared/static libraries are generated via the Android NDK.
A language that has recently come to my attention is Eta.
Eta's compiler is a fork of GHC 7.10 which has a JVM backend. It is possible to use the generated JAR files to write Android apps and even use its Foreign Function Interface to call native Android Java libraries.
Brian McKenna has written a blog post about how to configure an Android Studio project to use an Eta library.
There is https://github.com/neurocyte/android-haskell-activity, which demonstrates Haskell code running on Android.
I once came across the same Reddit thread, but it was old and comments were closed. I sent a message to the OP, but I am not sure whether it reached the recipient. Here is my suggestion (it may work for older Android versions where native activities were not possible).
I developed in Haskell some time ago but have currently switched to Smalltalk, and I am developing a port of the Squeak VM to Android. The way I am doing this is similar to what a Haskell-on-Android project would have to deal with: a lump of C code that needs to be called from the Java part of the application (basically, all you can do on Android is handle various events; an application cannot poll for events itself and does not have its own event loop). In my case the code is generated by the Squeak VM build tools; in the case of Haskell on Android it would be the output of GHC or JHC or whatever front end is used. This repo may be worth looking at:
http://gitorious.org/~golubovsky/cogvm/dmg-blessed/trees/master/platforms/android/project
Under "src" there is the Java code which provides for user events interception and sending them to the native code (see the CogView class). The C code of the VM itself is not completely there (see squeakvm.org, the Cog branch for that), but one may get the idea. One also might look under http://gitorious.org/~golubovsky/cogvm/dmg-blessed/trees/master/platforms/android/vm which is the C frontend to the interpreter (including user events handling, some timekeeping, etc.)
Hope this helps.
Dmitry
There is https://github.com/conscell/hugs-android, a port of the Hugs Haskell interpreter to Android.
I think the general answer should come from source-to-source transformation, since loading specially compiled shared objects seems to be a bit of a kludge (involving a GHC-to-C step and a C-to-Java step in the answers above). This question thus falls under the heading of Haskell on the JVM, which has been tried (with a Java intermediate representation as one step) and discussed at length. You could use Frege if the libraries you need compile there. The only remaining steps would be the beginnings of the Android framework API translated into IO () actions, and maybe a wrapper for building the manifest XML and the APK.

Android OpenAL?

Has anyone built OpenAL for Android, or found the shared library for it on the system? This seems like an obvious need for a game of any kind, yet there are no resources out there for it. From what I can tell, the Android Java sound library can't do pitch changes, so there seems to be a need for OpenAL. I know OpenAL Soft can be built on top of ALSA, but I'm not sure whether anyone has done that, and I'm sure it would take me a month.
If there's a good guide somewhere on sound manipulation on Android without OpenAL, that's fine too. It's just that OpenAL is something of a standard for game makers, and it would be nice to port my thousands of lines of code over to this system, which I sort of thought was the point of the NDK before I dug into it and saw that there's almost no shared library access on the system.
Thanks... I hope I can actually do the port without becoming a Java expert myself. Really disliking the NDK so far!
A few options are available now for NDK audio:
It's not OpenAL, but OpenSL ES 1.0.1 is an official part of the NDK as of API level 9 (Android 2.3); a minimal setup sketch follows this list. More information here.
OpenAL Soft has an OpenSL ES backend in its git master (not released as of version 1.13). It is however at this time broken on Android, as it is written for OpenSL ES 1.1, not 1.0.1. See this commit for a fix.
As mentioned in a previous answer, a JNI backend for OpenAL Soft is linked to and described here as the only option for OpenAL on pre-2.3 Android platforms. However, this is an outdated fork of OpenAL Soft - I've updated the backend to the latest version on a github repo here along with the OpenSL ES 1.0.1 fix. Also included is an untested optional patch that claims to provide better performance and less latency.
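For reference, getting a basic OpenSL ES engine up from the NDK looks roughly like the sketch below (error handling trimmed; link against -lOpenSLES, API level 9+). Creating an actual buffer-queue player against the output mix is left out:

    #include <SLES/OpenSLES.h>   /* NDK, API level 9+; link with -lOpenSLES */

    int init_audio(void) {
        SLObjectItf engine_obj, output_mix;
        SLEngineItf engine;

        /* create and realize the engine object, then fetch its engine interface */
        if (slCreateEngine(&engine_obj, 0, NULL, 0, NULL, NULL) != SL_RESULT_SUCCESS)
            return -1;
        (*engine_obj)->Realize(engine_obj, SL_BOOLEAN_FALSE);           /* synchronous realize */
        (*engine_obj)->GetInterface(engine_obj, SL_IID_ENGINE, &engine);

        /* create an output mix; audio players are then created against it */
        (*engine)->CreateOutputMix(engine, &output_mix, 0, NULL, NULL);
        (*output_mix)->Realize(output_mix, SL_BOOLEAN_FALSE);

        /* ... create a buffer-queue audio player here and enqueue PCM data ... */
        return 0;
    }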
Just before Google announced that 3D audio was going to be included in Android 2.3, I managed to compile OpenAL for Android and package it as a shared object.
See http://pielot.org/2010/12/14/openal-on-android/
Might still be helpful if you'd like to target devices < 2.3.
Cheers.
OpenSL is planned for a future Android build; OpenAL isn't available, and the low-level hardware is off-limits to anything you can do in the NDK, so you can't safely build it yourself.
There's no support for low-latency audio even planned; there's a bug to that effect here:
http://code.google.com/p/android/issues/detail?id=3434
Star it if it's relevant to you; maybe Google will listen if it gets enough stars.
EDIT: There IS low-latency audio in Android 4.0+, and OpenSL ES is available now. See this page and the linked pages: http://source.android.com/devices/audio/latency.html. Also see the NDK guide on OpenSL ES.
You can use the NDK to build OpenAL and package it with your APK. That way you can access it from your native code.
