What is the right way to choose an EGL config in Android?

I'm using my own GLSurfaceView and have been struggling with crashes related to the EGL config chooser for a while.
It seems as though requesting RGB_565 by calling setEGLConfigChooser(5, 6, 5, 0, 16, 0) should be the most widely supported. However, running this code on the emulator with host GPU enabled, I still get a crash, seemingly because the host graphics card does not natively support RGB_565. Setting RGBA_8888 by calling setEGLConfigChooser(8, 8, 8, 8, 16, 0) runs fine on my emulator and HTC Rezound device, but I'm still seeing a small number of crash reports in the market.
My guess is that most phones support RGBA_8888 natively now but a small number of my users have phones which are only compatible with RGB_565, which prevents my config chooser from getting a config.
Seeing as how I don't need the alpha channel, is there a right way to try RGBA_8888 first and then fall back to RGB_565? Is there a cleaner way to just ask for any ol' config without caring about the alpha value?
I saw a possible solution to determine ahead of time what the default display config was and request that specifically here: https://stackoverflow.com/a/20918779/234256. Unfortunately, it looks like the suggested getPixelFormat function is deprecated as of API level 17.

From my experience I do not think setEGLConfigChooser() actually works correctly, i.e. it has a bug in its implementation. Across a number of devices I have seen crashes where setEGLConfigChooser() fails to select a valid config even when the underlying surface is of the correct type.
I have found the only reliable way to choose an EGL config is with a custom EGLConfigChooser. This also has the added benefit of choosing a config based on your own rules, e.g. the surface must have depth and should preferably be RGB888, but can settle for RGB565. It is actually pretty straightforward: use eglChooseConfig() to get a list of possible configurations and then return the one that matches your selection criteria.
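For example, here is a minimal sketch of such a chooser that tries RGBA8888 first and falls back to RGB565, which also covers the fallback the question asks about. It assumes an ES 2.0 context and a 16-bit depth buffer; the class name and structure are illustrative, not from any library:

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLDisplay;
import android.opengl.GLSurfaceView;

public class FallbackConfigChooser implements GLSurfaceView.EGLConfigChooser {
    private static final int EGL_OPENGL_ES2_BIT = 4; // not defined in EGL10

    @Override
    public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {
        EGLConfig config = choose(egl, display, 8, 8, 8, 8); // prefer RGBA8888
        if (config == null) {
            config = choose(egl, display, 5, 6, 5, 0);       // fall back to RGB565
        }
        if (config == null) {
            throw new IllegalArgumentException("No suitable EGL config found");
        }
        return config;
    }

    private EGLConfig choose(EGL10 egl, EGLDisplay display, int r, int g, int b, int a) {
        int[] spec = {
            EGL10.EGL_RED_SIZE, r,
            EGL10.EGL_GREEN_SIZE, g,
            EGL10.EGL_BLUE_SIZE, b,
            EGL10.EGL_ALPHA_SIZE, a,
            EGL10.EGL_DEPTH_SIZE, 16,
            EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL10.EGL_NONE
        };
        int[] numConfigs = new int[1];
        // First call just counts the matching configs.
        if (!egl.eglChooseConfig(display, spec, null, 0, numConfigs) || numConfigs[0] <= 0) {
            return null;
        }
        EGLConfig[] configs = new EGLConfig[numConfigs[0]];
        egl.eglChooseConfig(display, spec, configs, numConfigs[0], numConfigs);
        // eglChooseConfig sorts its results, so the first entry is the closest match.
        return configs[0];
    }
}

Install it with glView.setEGLConfigChooser(new FallbackConfigChooser()) before calling setRenderer().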

This GSoC code sample is for enabling MSAA, but it also contains code to select configurations and check whether they are available:
https://code.google.com/p/gdc2011-android-opengl/source/browse/trunk/src/com/example/gdc11/MultisampleConfigChooser.java#90
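The core trick in that sample, sketched here for an EGL10-based chooser (egl and display are the chooseConfig() parameters), is to ask for a multisampled config first and only fall back if none matches:

int[] spec = {
    EGL10.EGL_RED_SIZE, 5,
    EGL10.EGL_GREEN_SIZE, 6,
    EGL10.EGL_BLUE_SIZE, 5,
    EGL10.EGL_DEPTH_SIZE, 16,
    EGL10.EGL_SAMPLE_BUFFERS, 1, // request multisampling
    EGL10.EGL_SAMPLES, 4,        // 4x MSAA
    EGL10.EGL_NONE
};
int[] numConfigs = new int[1];
egl.eglChooseConfig(display, spec, null, 0, numConfigs);
boolean msaaAvailable = numConfigs[0] > 0; // if false, retry with a simpler spec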

Related

On Android, determining whether glBlendFuncSeparateOES is supported

I have some code that uses glBlendFuncSeparateOES and glBlendEquationSeparateOES in order to render onto framebuffers with alpha.
However, I've found that a couple of my target devices do NOT appear to support these functions. They fail silently, and all that happens is that your render mode doesn't get set. My cheapie Kindle Fire tablet and an older Samsung both exhibit this behavior.
Is there a good way, on Android, to query whether they're actually implemented? I have tried eglGetProcAddress, but it returns an address for any string you throw at it!
Currently, I just have the game, on startup, do a quick render on a small FBO to see if the transparency is correct or if it has artifacts. It works, but it's a very kludgy method.
I'd much prefer if there was something like glIsBlendFuncSeparateSupported().
You can get a list of all available extensions using glGetString(GL_EXTENSIONS). This returns a space-separated list of supported extension names. For more details, see the Khronos specification.
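For instance, a minimal check, to be run on the GL thread with a current context (the helper name here is illustrative):

import javax.microedition.khronos.opengles.GL10;

static boolean isExtensionSupported(GL10 gl, String name) {
    String extensions = gl.glGetString(GL10.GL_EXTENSIONS);
    if (extensions == null) return false;
    // Pad with spaces so we don't match a prefix of a longer extension name.
    return (" " + extensions + " ").contains(" " + name + " ");
}

// Usage, e.g. in onSurfaceCreated():
boolean canBlendSeparate = isExtensionSupported(gl, "GL_OES_blend_func_separate");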

Frame Listener for Android Virtual Display (NDK internal build)

I'm building an internal shared library for Android platform. I have the signing keystore from the device manufacturer.
My library makes use of the internal ScreenRecord.cpp file from the Android source. Recording works fine with the MediaCodec encoder; however, I want to access each frame so that I can apply an image overlay (logo) to it before it is passed to the encoder.
There's an overlay example in the Android source too, but that only works on newer versions of Android (5.0 / API 21+). I want an overlay solution for Android KitKat (4.4 / API 19).
Here's a code example that I obtained from minicap:
mVirtualDisplay = android::SurfaceComposerClient::createDisplay(
        android::String8("minicap"), true /* secure */);

LOGI("Creating buffer queue");
mScreenshotClient.getCpuConsumer();
mBufferQueue = mScreenshotClient.mBufferQueue;

LOGI("Creating CPU consumer");
mConsumer = new android::CpuConsumer(mBufferQueue, 3, false);
mConsumer->setName(android::String8("minicap"));
mConsumer->setDefaultBufferSize(targetWidth, targetHeight);
mConsumer->setDefaultBufferFormat(android::PIXEL_FORMAT_RGBA_8888);
// mFrameProxy is a
// class FrameProxy : public android::ConsumerBase::FrameAvailableListener
mConsumer->setFrameAvailableListener(mFrameProxy);

LOGI("Publishing virtual display");
android::SurfaceComposerClient::openGlobalTransaction();
android::SurfaceComposerClient::setDisplaySurface(mVirtualDisplay, mBufferQueue);
android::SurfaceComposerClient::setDisplayProjection(mVirtualDisplay,
        android::DISPLAY_ORIENTATION_0, layerStackRect, visibleRect);
android::SurfaceComposerClient::setDisplayLayerStack(mVirtualDisplay, 0); // default stack
android::SurfaceComposerClient::closeGlobalTransaction();
I set up the above code, but the onFrameAvailable() method of the FrameAvailableListener gets called only once. It never gets called again, even when things change on screen. What am I missing here?
Is there a less tricky way to access the frames before they are passed to the encoder?
An example of adding an overlay is built into the screenrecord sources for Lollipop. As far as I can recall it doesn't rely on any features added in Lollipop, so you should be able to build and run it on 4.4. As noted on bigflake, the --bugreport mode was added to AOSP in the 4.4 time frame, but didn't actually ship with the system until 5.x. (With a minor tweak, it should even run on 4.3, but I haven't tried it.)
The key source files are Overlay.{cpp,h}. It does the same thing that you would do from code written in Java: create a GLConsumer (SurfaceTexture), use that to convert the incoming frames to GLES textures, then render texture + overlay to the video encoder.
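Roughly, the Java-side equivalent of that pipeline looks like the sketch below (assumptions: API 19+, the constructor runs on a thread with a current GLES 2.0 context, the actual drawing into the encoder's input surface is left as comments, and the class name is made up):

import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;

public final class FramePump implements SurfaceTexture.OnFrameAvailableListener {
    private final SurfaceTexture surfaceTexture;
    private final Surface producerSurface;
    private final int textureId;

    // Must be constructed on the GL thread: SurfaceTexture needs the
    // texture to exist in the current context.
    public FramePump(int width, int height) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        textureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setDefaultBufferSize(width, height);
        surfaceTexture.setOnFrameAvailableListener(this);
        // Hand this Surface to the virtual display as its output.
        producerSurface = new Surface(surfaceTexture);
    }

    public Surface getProducerSurface() {
        return producerSurface;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Called once per incoming frame. On the GL thread you would:
        // 1. call surfaceTexture.updateTexImage() to latch the new frame,
        // 2. draw the external texture, then the overlay, with GLES,
        // 3. eglSwapBuffers() on the encoder's input surface.
    }
}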
Sample video is here. Note that it adds a block of text at the very start, and a running timestamp / frame counter in the top-left corner.
Note for anyone else reading this: this code is using internal private APIs that have been changing in recent releases, so any binaries must be built for specific versions of Android, and may not be portable to devices built by different manufacturers even if they run the same version of Android (sometimes the OEMs like to mess with things).
Update: My earlier statements about working on KitKat weren't accurate -- there was a major API shift before the Lollipop version went out. The trick is to grab the sources before this change went in, as that was when the BufferQueue API rewrite reached screenrecord. You can see from the change list that the --bugreport option went in about five months before that.

adb screencap output is different from what's on the device

I have a graphical glitch related to blending in my OpenGL application using Android NDK.
The strange thing is that when I take a screenshot through adb screencap command, the problem completely disappears and the result looks okay.
My question is:
Is there a way to know what happens behind the scenes when these screenshots are made? Is eglChooseConfig called with some specific values for the entire frame, for example? Or is some specific initial GL state forced?
Some background:
My device is using Qualcomm Adreno 320.
The glitch occurs when I call glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) for some of the geometry.
I have also found that setting glColorMask(1, 1, 1, 0) results in a black screen on my device (and only on this device), whereas taking a screenshot produces a complete, correct game frame.
The application outputs no glitches on several other Android devices, and other applications work well, even ones that use blending extensively.
Generally speaking, devices don't have a framebuffer full of pixels that you can copy out of when capturing the screen. The "screen capture" functions are actually "redraw the screen into a buffer" functions. It's possible for the screen capture to be slightly different, deliberately so if a "secure" layer or DRM content is on screen.
Is this a single, fully opaque Surface? Or is it being blended with another layer above or below?
The most common reason for differences is bugs in the Hardware Composer, but it sounds like you're seeing issues when rendering on a single surface, so that is less likely. If you have a rooted device, you can toggle HWC composition with a SurfaceFlinger service call: adb shell service call SurfaceFlinger 1008 i32 1 will disable overlays and force GLES composition. (If none of that made any sense, read through the graphics architecture doc.)
Are you able to post images of the correct and glitched images? (One via screenshot, one by taking a picture of the device with a second device.)
Do you see similar issues if you record the screen with adb shell screenrecord?
The problem disappeared once I commented out the EGL_ALPHA_SIZE setting:
const EGLint attribs[] = {
    EGL_BLUE_SIZE, 8,
    EGL_GREEN_SIZE, 8,
    EGL_RED_SIZE, 8,
    // EGL_ALPHA_SIZE, 8,
    EGL_NONE
};
It looks like with alpha set to 8 bits, eglChooseConfig returned a problematic configuration object.
Funnily enough, the "correct" EGLConfig specifies 0 bits for EGL_ALPHA_SIZE, so at first I would have expected it not to work at all. Other devices don't seem to care about the value and work fine when given only the RGB channel depths.
I have learned a lesson: if there are graphical glitches on your device, always check all possible EGL configurations!
So my conclusion is: yes, adb screencap probably does use its own EGLConfig internally.
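In that spirit, one way to inspect every configuration the device offers is to dump their attributes with eglGetConfigAttrib. A sketch using EGL14 (API 17+), with error handling omitted:

import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLDisplay;
import android.util.Log;

// Dump the R/G/B/A/depth sizes of every EGLConfig the display offers.
EGLDisplay dpy = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
int[] version = new int[2];
EGL14.eglInitialize(dpy, version, 0, version, 1);

int[] num = new int[1];
EGL14.eglGetConfigs(dpy, null, 0, 0, num, 0);
EGLConfig[] configs = new EGLConfig[num[0]];
EGL14.eglGetConfigs(dpy, configs, 0, configs.length, num, 0);

int[] value = new int[1];
for (EGLConfig c : configs) {
    StringBuilder sb = new StringBuilder();
    for (int attr : new int[] {EGL14.EGL_RED_SIZE, EGL14.EGL_GREEN_SIZE,
            EGL14.EGL_BLUE_SIZE, EGL14.EGL_ALPHA_SIZE, EGL14.EGL_DEPTH_SIZE}) {
        EGL14.eglGetConfigAttrib(dpy, c, attr, value, 0);
        sb.append(value[0]).append(' ');
    }
    Log.d("EGL", "config R G B A D: " + sb);
}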

Render-to-texture and synchronization

I have a cross-platform code base (iOS and Android) that uses a standard render-to-texture setup. Each frame (after initialization), the following sequence occurs:
glBindFramebuffer of a framebuffer with a texture color attachment
Render some stuff
*
glBindFramebuffer of the default framebuffer (0 on Android, usually 2 on iOS)
glBindTexture of the texture that was the color attachment to the first framebuffer
Render using the bound texture
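In GLES 2.0 terms, the per-frame sequence is roughly the following (a sketch; fboId and fboTexture are illustrative names for objects created during initialization):

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId); // texture color attachment
// ... render some stuff into the texture ...

// * <- the step referred to below

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);      // default framebuffer
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTexture);  // former color attachment
// ... render using the bound texture ...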
On iOS and some Android devices (including the emulator), this works fine and as expected. On other devices (currently sitting in front of a Samsung Galaxy Note running 4.0.4), the second-phase rendering that uses the texture looks "jumpy". Other animations continue to run at 60 fps on the same screen as the "jumpy" bits; my conclusion is that the changes to the target texture are not always visible in the second rendering pass.
To test this theory, I inserted a glFinish() at the step marked with a * above. With that, the behavior is correct on all devices. Interestingly, glFlush() does NOT fix the problem. But glFinish() is expensive, and I haven't seen any documentation suggesting it should be necessary here.
So, here's my question: What must I do when finished rendering to a texture to make sure that the most-recently-drawn texture is available in later rendering passes?
The code you describe should be fine.
As long as you are using a single context, and not opting in to any extensions that relax synchronization behavior (such as EXT_map_buffer_range), then every command must appear to execute as if it had executed in exactly the same order specified in the API, and in your API usage you're rendering to the texture before reading from it.
Given that, you are probably encountering driver bugs on those devices. Can you list which devices are encountering the issue? You'll probably find common hardware or drivers.
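As an aside that is not part of the answer above: if it does turn out to be a driver synchronization issue and glFinish() remains the only reliable workaround, a cheaper option on devices with OpenGL ES 3.0 would be a fence sync, which waits only for the commands recorded up to the end of the texture pass rather than draining the whole pipeline. A sketch:

import android.opengl.GLES30;

// Immediately after the render-to-texture pass:
long fence = GLES30.glFenceSync(GLES30.GL_SYNC_GPU_COMMANDS_COMPLETE, 0);

// Just before the pass that samples the texture:
GLES30.glClientWaitSync(fence, GLES30.GL_SYNC_FLUSH_COMMANDS_BIT,
        1000000000L /* 1 second timeout, in ns */);
GLES30.glDeleteSync(fence);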

Android "cpu may be pegged" bug

Foreword: This severe bug can cause Android devices to lock up (unable to press Home/Back buttons, needs hard reset). It is associated with OpenGL surfaces and audio playback. Logcat repeats something to the effect of
W/SharedBufferStack( 398): waitForCondition(LockCondition) timed out (identity=9, status=0). CPU may be pegged. trying again.
once every second, hence the name of this bug. The fundamental cause is likely a deadlock when buffering data, be it sound or graphics.
I occasionally get this bug when testing my app on the Asus Eee Transformer tablet. The crash occurs when the sound thread populates MediaPlayer objects using MediaPlayer.create(context, R.raw.someid); and the GLSurfaceView thread loads textures from bitmaps using
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),
        R.drawable.textureMap, opts);
gl.glGenTextures(1, texAtlas, 0);
gl.glBindTexture(GL10.GL_TEXTURE_2D, texAtlas[0]);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
bitmap.recycle();
I don't think the cause is the audio, as the audio does in fact still play (the thread that loads the audio plays it after a delay). If that's right, the cause lies in the OpenGL ES buffering done by the above code.
Related Material
This SO post refers to this bug. They use OpenGL ES 2.0 and the NDK; I use OpenGL ES 1.1 (albeit most devices emulate 1.1 through 2.0, so technically they are using 2.0) and I do not use the NDK. Further, they use Android 2.1 and my crash occurs on Android 3.2.1.
This site links the bug to the AudioTrack object. However, I do not use that in my app.
The Android Bug Tracker lists this as a known bug, but as of yet there is no solution (and it's not fixed in Honeycomb+).
Common Elements
Freeze occurs when buffering. The thing being buffered is usually quite large, so an image (bug occurs more frequently the larger the image) or audio is typically affected.
Freeze only occurs on some devices.
Freeze is not related to a specific Android version - has been recorded on 2.1 and 3.2.1, among others.
Freeze is not related to use of the NDK.
Freeze is not related to a single programming practice (order of buffering, file types, etc.).
My question is pretty simple. Is there a workaround for this issue? If it can't be prevented, is there a way to fail gracefully and prevent the whole device from being bricked?
In the case of my game, the "waitForCondition" problem was noticed on a Samsung Galaxy S (Android 2.3.3). Of course I don't know whether the issue occurs on other devices, but it probably exists there too. Fortunately I was able to reproduce it, as one of my friends has the device and was kind enough to lend me one for a week.
In my case the game is written almost entirely in Java (a few calls through the NDK to OpenGL functions), so I'm not really sure whether this applies to your problem too.
Anyway, it seems the problem is somehow related to OpenGL's internal buffers. In the code presented below, the commented-out line (1) was changed to (2), manual config selection. I haven't tested it thoroughly yet, but since that change I haven't noticed any freezes, so there is hope.
UPDATE 1: As additional info, I think I read somewhere that somebody had the same "CPU may be pegged" problem, and his solution was to set up all the OpenGL surface components to 8 bits (the alpha component too) rather than 565 or 4 bits (I don't remember exactly what the faulty configuration was).
UPDATE 2: One may also consider using the following implementation of the EGLConfigChooser: GdxEglConfigChooser.java. If this doesn't help, eventually use the approach presented in GLSurfaceView20.java.
UPDATE 3: Additionally, simplifying the shader programs as much as possible helped a bit too.
// in Activity...
glView = new GLSurfaceView(this);
glView.setEGLContextClientVersion(2); // OpenGL ES 2.0
// glView.setEGLConfigChooser(false);  // (1) false - no depth buffer
glView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
glView.setEGLConfigChooser(8, 8, 8, 8, 0, 0); // (2) TODO: crashes on devices which don't support this particular configuration
glView.setRenderer(new MyRenderer(this));
Increasing the device's virtual memory reduces how often this issue occurs. Of course this is not an option unless you are the manufacturer of the device.
