While working with the NDK camera interface, I stumbled upon the problem that a SurfaceTexture can request a default buffer size, which the producer may override.
I wanted to get the actual size, but SurfaceTexture offers no way to query it.
So instead I tried this:
// this runs directly after SurfaceTexture.updateTexImage
GLfloat width, height;
glGetTexLevelParameterfv(GL_TEXTURE_EXTERNAL_OES, 0, GL_TEXTURE_WIDTH, &width);
glGetTexLevelParameterfv(GL_TEXTURE_EXTERNAL_OES, 0, GL_TEXTURE_HEIGHT, &height);
This worked flawlessly on a Mali system but returned GL_INVALID_ENUM on an Adreno system.
The extension's description does not introduce GL_TEXTURE_EXTERNAL_OES as a valid parameter for GetTexLevelParameter, so I assume I was just lucky on the Mali system.
Is there any way to query the size of a GL_TEXTURE_EXTERNAL_OES texture in OpenGL ES (3.1)?
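The best workaround I've found so far is to trust the size negotiated on the camera side instead of asking GL. A rough sketch using the Java camera2 API (cameraManager and cameraId are assumed to be set up elsewhere):
// Derive the buffer size from the camera stream configuration instead of GL.
CameraCharacteristics chars = cameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] sizes = map.getOutputSizes(SurfaceTexture.class); // sizes the producer may pick
// Track which of these you configure the capture session with; that is the actual size.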
I want to take a screenshot of the current frame in OpenGL for further processing, and I'm trying to improve the performance of glReadPixels by using a PBO to read the framebuffer asynchronously.
I'm under the impression that glReadPixels should return immediately once a buffer is bound to GL_PIXEL_PACK_BUFFER, but it actually takes as long as, or even longer than, the version without a PBO.
Here is a sample of my code:
// Set up the PBOs once: one buffer per in-flight frame
GLES30.glGenBuffers(nPbo, pboIndex, 0);
for (int i = 0; i < nPbo; i++) {
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIndex[i]);
    GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, size, null, GLES30.GL_STREAM_READ);
}
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
......
// For each frame, trigger an async transfer of the framebuffer into a PBO.
// Note that I don't even map the PBO to memory yet.
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIndex[index]);
// The following is a JNI wrapper for GLES20.glReadPixels that accepts an
// int offset as the last parameter so the bound PBO is used, and the
// slowdown (around 500 ms on my device) happens here
GLES3PBOReadPixelsFix.glReadPixelsPBO(0, 0, mWidth, mHeight, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
Based on this article, the cause of the slowdown could be a conversion between the internal format (which may be GL_BGRA) and the pixel transfer format (GL_RGBA in my code). Changing the transfer format to GL_RGB reduces the latency of glReadPixels to around 100 ms, but when I map the buffer with GLES30.glMapBufferRange the output frame doesn't look correctly rendered. I also tried the GL_BGRA format from GLES11Ext, but that causes GL_INVALID_OPERATION in glReadPixels.
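One thing that can be queried directly (my own addition, not from the article) is the format/type pair the implementation prefers for readback, which should avoid the conversion:
// Ask the driver for its fast readback format/type and use those in glReadPixels.
int[] fmt = new int[1];
int[] type = new int[1];
GLES30.glGetIntegerv(GLES30.GL_IMPLEMENTATION_COLOR_READ_FORMAT, fmt, 0);
GLES30.glGetIntegerv(GLES30.GL_IMPLEMENTATION_COLOR_READ_TYPE, type, 0);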
Is there any other way to make glReadPixels on Android return immediately so that PBO can improve performance?
As Reto suggested, it turns out to be an implementation-specific issue. The GPU I was originally testing on is an Adreno 306. When I run the same code on a Samsung Note 4 (Adreno 420), it works as expected. So it's always worthwhile to test this kind of issue on different devices and GPUs.
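For reference, the pattern that actually lets the PBO pay off is deferred mapping: queue the read into this frame's buffer, but map the buffer that was filled several frames ago. A rough sketch, reusing index, nPbo and size from the setup above:
// Queue the async read into the current PBO (returns without the pixels).
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIndex[index]);
GLES3PBOReadPixelsFix.glReadPixelsPBO(0, 0, mWidth, mHeight,
        GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

// Map the oldest PBO in the ring; its transfer has had nPbo-1 frames to finish.
int oldest = (index + 1) % nPbo;
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboIndex[oldest]);
java.nio.Buffer pixels = GLES30.glMapBufferRange(
        GLES30.GL_PIXEL_PACK_BUFFER, 0, size, GLES30.GL_MAP_READ_BIT);
// ... process pixels ...
GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
index = (index + 1) % nPbo;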
Can someone explain why this line:
GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_R16F, width, height, 0, GLES30.GL_RED, GLES30.GL_HALF_FLOAT, myBuffer);
works on a Tegra 4 but doesn't work on an ARM Mali-T628 MP6?
I am not attaching this to a framebuffer, by the way; I am using it as a read-only texture. The error code returned on the Mali is 1280, while the Tegra 'doesn't complain' at all.
Also, I know that the Tegra 4 has an extension for half-float textures and that this particular Mali doesn't, but since it's OpenGL ES 3.0, shouldn't it support such textures anyway?
That call looks completely valid to me. Error 1280 is GL_INVALID_ENUM, which suggests that one of the three enum arguments is invalid. But each one by itself, as well as their combination, is spec compliant.
The most likely explanation is a driver bug. I found that several ES 3.0 drivers have numerous issues, so it's not a big surprise to discover problems.
The section below was written under the assumption that the texture would be used as a render target (FBO attachment). Please ignore if you are looking for a direct answer to the question.
GL_R16F is not color-renderable in standard ES 3.0.
If you pull up the spec document, which can be found on www.khronos.org (direct link), table 3.13 on pages 130-132 lists all texture formats and their properties. R16F does not have a checkmark in the "Color-renderable" column, which means it cannot be used as a render target.
Correspondingly, R16F is also listed under "Texture-only color formats" in section "Required Texture Formats" on pages 129-130.
This means that the device needs the EXT_color_buffer_half_float extension to support rendering to R16F. This is still the case in ES 3.1.
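If you do need to render to R16F, a runtime check for that extension could look like this (sketch):
// True if half-float color attachments are usable on this device.
String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
boolean halfFloatRenderable = extensions != null
        && extensions.contains("GL_EXT_color_buffer_half_float");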
I have an Android app that decodes video into the yuv420p format and then renders the frames using OpenGL ES.
I use glTexSubImage2D() to upload the Y/U/V buffers to the GPU and then do a YUV-to-RGB conversion in a shader. All EGL/OpenGL setup and rendering code is native.
Now, I am not saying there is no problem with my code, but considering that the same code runs perfectly fine on iOS (iPad/iPhone), Nexus 7, Kindle HD 8.9, Samsung Note 1 and a few other cheap Chinese tablets (A31/RockChip 3188) running Android 4.0/4.1/4.2, I would say it's unlikely that my code is wrong. On those devices, glTexSubImage2D() takes less than 16 ms to upload an SD or 720p HD texture.
However, on the Nexus 10, glTexSubImage2D() takes about 50-90 ms for the same SD or 720p HD texture, which is far too slow for 30 fps or 60 fps video.
I would like to know:
1) if I should pick a different texture format (RGBA or BGRA). Is there a way to detect which texture format a GPU handles best?
2) if there is a feature that is 'off' on all other SoCs but 'on' on the Exynos 5, for example the automatic mipmap generation option (I have it off, by the way);
3) if this is a known issue of the Samsung Exynos SoC - I can't find a support forum for it;
4) if there is any option I need to set when configuring the EGL surface, like transparency or surface format (I have no idea what I am talking about);
5) whether the GPU is doing an implicit format conversion - but I checked, and GL_LUMINANCE is always used. Again, it works on all other platforms;
6) anything else?
My EGL config:
const EGLint attribs[] = {
    EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_BLUE_SIZE, 8,
    EGL_GREEN_SIZE, 8,
    EGL_RED_SIZE, 8,
    EGL_NONE
};
Initial setup:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, ctx->frameW, ctx->frameH, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL); /* also for U/V */
Subsequent partial replacement:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, ctx->frameW, ctx->frameH, GL_LUMINANCE, GL_UNSIGNED_BYTE, yBuffer); /*also for U/V */
I am trying to render video at ~30 FPS or ~60 FPS at SD or 720p HD resolution.
This is a known driver issue that we have reported to ARM. A future update should fix it.
EDIT: Status update
We've now managed to reproduce slow upload conditions for one path on the public firmware, which you are possibly hitting, and this will be fixed in the next driver release.
If you double-buffer texture IDs (e.g. frame N = ID X, N+1 = ID Y, N+2 = ID X, N+3 = ID Y, etc) for the textures you are uploading to it should help avoid this on the current firmware.
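A minimal sketch of that ping-pong (illustrative only; tex, frameIndex, frameW/frameH and yBuffer stand in for your own state):
// Alternate texture IDs per frame so an upload never waits on a texture
// that the GPU is still reading from the previous frame.
int cur = frameIndex & 1; // tex[0]/tex[1] were created with glGenTextures
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[cur]);
GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0,
        frameW, frameH, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuffer);
// ... render this frame using tex[cur] ...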
Thanks,
Iso
I can confirm this has been fixed in Android 4.3 - I'm seeing a performance increase by a factor of 2-3 with RGBA format and by a factor of 10-50 with other texture formats over Android 4.2.2. These results apply for both glTexImage2D and glTexSubImage2D. (Can't add comments yet so I had to put this here)
EDIT: If you're stuck with 4.2.2, you could try using RGBA texture instead, it should have better performance (3-10x or so with larger power-of-two texture sizes).
I have played with OpenGL on Android for a while, on various devices, and unless I'm wrong, the default rendering is always performed in the RGB565 pixel format.
I would however like to render more accurate colors using RGB888.
The GLSurfaceView documentation mentions two methods which relate to pixel formats:
the setFormat() method exposed by SurfaceHolder, as returned by SurfaceView.getHolder()
the GLSurfaceView.setEGLConfigChooser() family of methods
Unless I'm wrong, I think I only need to use the latter. Or is using SurfaceHolder.setFormat() relevant here?
The documentation of the EGLConfigChooser class mentions EGL10.eglChooseConfig(), to discover which configurations are available.
In my case it is OK to fall back to RGB565 if RGB888 isn't available, but I would prefer that to be rare.
So, is it possible to use RGB888 on most devices?
Are there any compatibility problems or weird bugs with this?
Do you have an example of a correct and reliable way to setup the GLSurfaceView for rendering RGB888?
On newer devices, most should support RGBA8888 as a native format. One way to force an RGBA color format is to set the translucency of the surface; you'd still want to use the EGLConfig chooser to pick the best-matching config for the channels, in addition to the depth and stencil buffers.
setEGLConfigChooser(8, 8, 8, 8, 0, 0);
getHolder().setFormat(PixelFormat.RGBA_8888);
However, if I read your question correctly, you're asking for RGB888 support (alpha don't-care), in other words RGBX8888, which might not be supported by all devices (a driver vendor limitation).
Something to keep in mind about performance: since RGBA8888 is the color format natively supported by most GPU hardware, it's best to avoid other color formats, as a non-native format usually translates into a color conversion underneath, adding unnecessary workload to the GPU.
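Putting it together, a GLSurfaceView subclass could look like this (sketch; MyRenderer is assumed to be your own renderer):
import android.content.Context;
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;

public class Rgba8888SurfaceView extends GLSurfaceView {
    public Rgba8888SurfaceView(Context context) {
        super(context);
        setEGLContextClientVersion(2);          // request an ES 2.0 context
        setEGLConfigChooser(8, 8, 8, 8, 0, 0);  // RGBA8888, no depth/stencil
        getHolder().setFormat(PixelFormat.RGBA_8888);
        setRenderer(new MyRenderer());          // MyRenderer is assumed
    }
}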
This is how I do it:
{
    window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // cocos2d will inherit these values
    [window setUserInteractionEnabled:YES];
    [window setMultipleTouchEnabled:NO];

    // must be called before any other call to the director
    [Director useFastDirector];
    [[Director sharedDirector] setDisplayFPS:YES];

    // create an OpenGL view inside a window
    [[Director sharedDirector] attachInView:window];

    // Default texture format for PNG/BMP/TIFF/JPEG/GIF images.
    // It can be RGBA8888, RGBA4444, RGB5_A1, RGB565.
    // You can change it at any time.
    [Texture2D setDefaultAlphaPixelFormat:kTexture2DPixelFormat_];

    glClearColor(0.7f, 0.7f, 0.6f, 1.0f);
    //glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

    [window makeKeyAndVisible];
    [[Director sharedDirector] runWithScene:[GameLayer node]];
}
I hope that helps!
My scene in OpenGL ES requires several large-resolution textures, but they are grayscale, since I am using them only as masks. I need to reduce my memory use.
I have tried loading these textures with Bitmap.Config.ALPHA_8 and as RGB_565. ALPHA_8 actually seems to increase memory use.
Is there some way to get a texture loaded into OpenGL that uses less than 16 bits per pixel?
glCompressedTexImage2D looks promising, but from what I can tell, different phones offer different texture compression methods. Also, I don't know whether the compression actually reduces memory use at runtime. Is the solution to store my textures in both ATITC and PVRTC formats? If so, how do I detect which format the device supports?
Thanks!
PVRTC, ATITC, S3TC and the other GPU-native compressed texture formats should reduce memory usage and improve rendering performance.
For example (sorry, it's in C; you can implement the same check in Java using GL11.glGetString):
const char *extensions = (const char *) glGetString(GL_EXTENSIONS);
int isPVRTCsupported = strstr(extensions, "GL_IMG_texture_compression_pvrtc") != 0;
int isATITCsupported = strstr(extensions, "GL_ATI_texture_compression_atitc") != 0;
int isS3TCsupported = strstr(extensions, "GL_EXT_texture_compression_s3tc") != 0;
if (isPVRTCsupported) {
    /* load PVRTC texture using glCompressedTexImage2D */
} else if (isATITCsupported) {
...
Besides, you can restrict your app to devices that support a given texture format using supports-gl-texture in AndroidManifest.xml.
The AndroidManifest.xml File - supports-gl-texture
EDIT:
MOTODEV - Understanding Texture Compression
With Imagination Technologies-based (aka PowerVR) systems, you should be able to use 4bpp PVRTC and, depending on the texture and quality requirements, maybe even the 2bpp PVRTC variant.
Also, though I'm not sure what is exposed on Android systems, PVRTexTool lists I8 (i.e. greyscale 8bpp) as a target texture format, which would give you a lossless option.
ETC1 texture compression is supported on all Android devices with Android 2.2 and up.
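Since ETC1 is guaranteed there, the framework also ships a helper for it. A sketch of loading a pre-compressed .pkm asset (the "mask.pkm" asset name, context and texId are assumptions):
// ETC1 is RGB-only, so a grayscale mask would store its value replicated
// across the channels (or be packed before compression).
if (ETC1Util.isETC1Supported()) {
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texId);
    InputStream in = context.getAssets().open("mask.pkm");
    ETC1Util.loadTexture(GLES20.GL_TEXTURE_2D, 0, 0,
            GLES20.GL_RGB, GLES20.GL_UNSIGNED_SHORT_5_6_5, in);
    in.close();
}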