I just bought a new SGS3 (I9300, NOT the LTE version) and hoped to continue developing an OpenGL ES (2) application.
Unfortunately, when I compile and run it, I don't see anything.
I get the following LogCat error messages:
D/libEGL(6890): loaded /system/lib/egl/libEGL_mali.so
D/libEGL(6890): loaded /system/lib/egl/libGLESv1_CM_mali.so
D/libEGL(6890): loaded /system/lib/egl/libGLESv2_mali.so
E/(6890): Device driver API match
E/(6890): Device driver API version: 23
E/(6890): User space API version: 23
E/(6890): mali: REVISION=Linux-r3p2-01rel3 BUILD_DATE=Wed Oct 9 21:05:57 KST 2013
D/OpenGLRenderer(6890): Enabling debug mode 0
I also installed a custom ROM (CyanogenMod 11, snapshot M4) but I get the same problem.
When I start the app, I get the blank screen without any vertices rasterized. The clear color works so far so the basic functionality of OpenGL is working.
To be sure, I tried it with the basic tutorial from the Google Developers page, with GLES 1 and GLES 2. Neither works! Here are the screenshots:
http://developer.android.com/training/graphics/opengl/index.html
The project itself works fine on my old Galaxy S1 (also on CyanogenMod) but shows nothing but a blank screen on my SGS3.
Is it possible that the Mali-400 MP4 graphics driver / system interprets GL commands differently? Does it need to be called differently than the Hummingbird GPU in my SGS1?
Does anyone have an idea what to do? Is this a problem with my phone or with Eclipse? Or is this normal and just my lack of understanding? How can I fix this problem?
------- EDIT : SOLUTION FOUND -------
Okay, I found the error. The Google document shows a "wrong" multiplication order of the matrices in the vertex shader:
uniform mat4 uMVPMatrix;
attribute vec4 vPosition;
void main() {
    gl_Position = uMVPMatrix * vPosition;
}
This doesn't seem to be a problem for my old Galaxy S1 but somehow the S3 (or the Mali GPU) is picky about this. I changed the order of multiplication to this:
uniform mat4 uMVPMatrix;
attribute vec4 vPosition;
void main() {
    gl_Position = vPosition * uMVPMatrix;
}
And it works (also on the S1). Still not sure why the S1 works fine with both versions but this solves the problem.
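For what it's worth, the difference between the two shader lines can be reproduced in plain Java (the class and method names below are mine, purely for illustration). With a column-major matrix, as OpenGL conventionally stores them, `uMVPMatrix * vPosition` treats the position as a column vector, while `vPosition * uMVPMatrix` treats it as a row vector, which is the same as multiplying by the transposed matrix:

```java
// Hypothetical helpers mirroring the two GLSL expressions.
// Matrices are column-major float[16], as OpenGL conventionally stores them.
public class MatrixOrderDemo {

    // GLSL "M * v": v is treated as a column vector.
    static float[] mulMatVec(float[] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++) {
            float s = 0f;
            for (int col = 0; col < 4; col++) {
                s += m[col * 4 + row] * v[col]; // element (row, col) of m
            }
            r[row] = s;
        }
        return r;
    }

    // GLSL "v * M": v is treated as a row vector,
    // which is equivalent to transpose(M) * v.
    static float[] mulVecMat(float[] v, float[] m) {
        float[] r = new float[4];
        for (int col = 0; col < 4; col++) {
            float s = 0f;
            for (int row = 0; row < 4; row++) {
                s += v[row] * m[col * 4 + row];
            }
            r[col] = s;
        }
        return r;
    }
}
```

With a translation matrix the two orders give visibly different results, so swapping them only "fixes" things if the matrix uploaded from Java was effectively transposed in the first place — which would explain why one device shows geometry and the other does not.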
Thanks for your help!
Just off the top of my head, did you check the projection settings? It's possible those shapes are being drawn, just not where you expect them to be.
Also, check the return values for the shader loading steps. If there's a compile issue, you'll get an invalid handle for the shader program.
There may be some warnings or errors in your code which aren't bubbling up to LogCat. I've found that using a native OpenGL wrapper from LibGDX actually shows me those errors and is pretty helpful for debugging.
You can find some helpful pointers on setting up the LibGDX libraries here - http://www.learnopengles.com/android-lesson-seven-an-introduction-to-vertex-buffer-objects-vbos/
Related
I am trying to render a smooth gradient from 0% to 10% gray across the screen of an Asus ROG Phone 2, which supposedly has an HDR10 screen. In standard (8-bit?) rendering mode I can clearly see banding between the gradient levels.
I followed the instructions from Android to modify the Vulkan tutorial sample code in the following way:
Added android:colorMode="wideColorGamut" to AndroidManifest.xml.
Changed VkSwapchainCreateInfoKHR.imageFormat, VkAttachmentDescription.format and VkImageViewCreateInfo.format to VK_FORMAT_R16G16B16A16_SFLOAT (everything needed to set up the swapchain).
Changed VkSwapchainCreateInfoKHR.imageColorSpace to VK_COLOR_SPACE_DISPLAY_P3_NONLINEAR_EXT.
I render the gradient dynamically in a custom fragment shader as a function of texture coordinates mapped to a full-screen quad:
layout (location = 0) in vec2 texcoord;
layout (location = 0) out vec4 uFragColor;
void main() {
    uFragColor = vec4(vec3(texcoord.x * 0.1), 1.0);
}
As a result I observe absolutely no difference from the original 8-bit(?) mode. I am not quite sure how to debug this or what to try next.
I also tried to implement the same effect in Unity with HDR options enabled. There I see that all the intermediate rendering passes (in Forward mode) render to Float16 but the last extra pass likely converts it into RGB8. The visual result is the same banding again.
This makes me wonder whether this is caused by some missing setting. In another thread I saw discussion of Window.setFormat(PixelFormat) or SurfaceHolder.setFormat(PixelFormat), but I am not sure how this relates to a native Android app. I cannot see a way to call such a function, and I am not even sure it would make sense.
Thank you for any suggestions.
Where are you setting HDR? HDR means the PQ transfer function; neither 10-bit output nor BT.2020 by itself has anything to do with HDR.
https://developer.android.com/training/wide-color-gamut
This talks only about WCG.
This should be used to trigger HDR. https://developer.android.com/ndk/reference/struct/a-hdr-metadata-smpte2086
For Java this should be used to check whether HDR is there.
https://developer.android.com/reference/android/view/Display.HdrCapabilities?hl=en
See this question with further links: What is the difference between Display.HdrCapabilities and configuration.isScreenHdr
I have an OpenGL ES 3.1 application that renders fine on the desktop but does not render on Android.
The bit that goes wrong is when I use uniform buffer objects (UBOs). In the vertex shader I have the following, for example:
layout (std140, binding = 0) uniform matrixUbo
{
    mat4 projection;
    mat4 view;
};
This works OK using desktop drivers, but on Android it fails. The version of OpenGL ES I am testing on is 3.2 compatible and the function calls are available on Android.
I have tried both setting the bindings in the vertex shader and setting them using the glUniformBlockBinding method; neither works on Android (but both work on the desktop).
If I don't use those two matrices, the objects do render OK (I can see them on my Android phone), but when I include them nothing is drawn, which tells me the matrices are full of zeros.
Is there anything special that needs to be done for UBOs to be supported on Android?
I'm happy to provide more information as required.
To answer my own question: they are supported on Android OpenGL ES 3.1, but when you update the data you need to use a ByteBuffer, not a FloatBuffer, even though the function calls accept either. Strange issue and a pain to debug!!
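As a sketch of what that can look like for the matrixUbo block above (the class and method names are my assumptions, not the poster's code): pack both mat4s into a direct, native-ordered ByteBuffer and hand that ByteBuffer, rather than a FloatBuffer view, to the GL upload call:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class UboUpload {

    // Packs two column-major mat4s (16 floats, 64 bytes each) into a direct,
    // native-ordered ByteBuffer matching the std140 layout of matrixUbo
    // (mat4 members carry no extra padding under std140).
    static ByteBuffer packMatrices(float[] projection, float[] view) {
        ByteBuffer buf = ByteBuffer.allocateDirect(2 * 16 * 4)
                                   .order(ByteOrder.nativeOrder());
        FloatBuffer fb = buf.asFloatBuffer();
        fb.put(projection).put(view);
        // Pass buf itself to GLES31.glBufferSubData(GL_UNIFORM_BUFFER, 0,
        // buf.capacity(), buf) -- not the FloatBuffer view.
        return buf;
    }
}
```

The native byte order matters: a ByteBuffer defaults to big-endian, while the GPU expects the platform's native (little-endian on most Android devices) layout, which is one plausible reason a FloatBuffer path can appear to upload only zeros.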
I am using OpenGL ES to run some shaders on Android.
On some older/cheaper devices highp precision is not supported, so the shader output is incorrect.
I need to know when the app starts whether the device supports high precision. That way I can tell the user "forget it, your device does not support high precision floats" rather than have it output garbage for them.
I found this query code online, but it seems to be for WebGL only:
var highp = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
var highpSupported = highp.precision != 0;
Does anyone have a way I can query an android device (KitKat or higher) to see what precision the GLES shaders will support?
This is the final code I now use, but the contents of range and precision are always -999 no matter where I run the code in my app — before, during or after the GLSurfaceView has been created and GLES output has run.
// NOTE: glGetShaderPrecisionFormat needs a current GL context, so this
// likely has to run on the GL thread (e.g. in Renderer.onSurfaceCreated);
// without a current context the call fails and the buffers stay unchanged.
IntBuffer range = IntBuffer.allocate(2);
IntBuffer precision = IntBuffer.allocate(1);
range.put(0, -999);
range.put(1, -999);
precision.put(0, -999);
android.opengl.GLES20.glGetShaderPrecisionFormat(
        android.opengl.GLES20.GL_FRAGMENT_SHADER,
        android.opengl.GLES20.GL_HIGH_FLOAT, range, precision);
String toastText = "Range[0]=" + range.get(0) + " Range[1]=" + range.get(1)
        + " Precision[0]=" + precision.get(0);
Toast.makeText(getApplicationContext(), toastText, Toast.LENGTH_SHORT).show();
The above code always returns -999 for all 3 values, and the Khronos documentation states that if an error occurs the values will be left unchanged. So it looks like there is an error, or I am not calling it at the right time.
Can someone explain why this line:
GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_R16F, width, height, 0, GLES30.GL_RED, GLES30.GL_HALF_FLOAT, myBuffer);
works on tegra4 but doesn't work on ARM Mali-T628 MP6?
I am not attaching this to a framebuffer, by the way; I am using it as a read-only texture. The error code returned on ARM is 1280, where Tegra 'doesn't complain' at all.
Also, I know that Tegra4 got extension for half float textures, and that specific Mali doesn't have that extension, but since it's OpenGL ES 3.0, shouldn't it support such textures?
That call looks completely valid to me. Error 1280 is GL_INVALID_ENUM, which suggests that one of the 3 enum type arguments is invalid. But each one by itself, as well as the combination of them, is spec compliant.
The most likely explanation is a driver bug. I found that several ES 3.0 drivers have numerous issues, so it's not a big surprise to discover problems.
The section below was written under the assumption that the texture would be used as a render target (FBO attachment). Please ignore if you are looking for a direct answer to the question.
GL_R16F is not color-renderable in standard ES 3.0.
If you pull up the spec document, which can be found on www.khronos.org (direct link), table 3.13 on pages 130-132 lists all texture formats and their properties. R16F does not have the checkmark in the "Color-renderable" column, which means that it can not be used as a render target.
Correspondingly, R16F is also listed under "Texture-only color formats" in section "Required Texture Formats" on pages 129-130.
This means that the device needs the EXT_color_buffer_half_float extension to support rendering to R16F. This is still the case in ES 3.1 as well.
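If you want to guard against this at runtime, one possible check (the helper below is mine; at runtime you would feed it the result of GLES30.glGetString(GLES30.GL_EXTENSIONS), which requires a current GL context) is an exact match against the space-separated extension string:

```java
public class ExtensionCheck {

    // True if the space-separated extension string (as returned by
    // glGetString(GL_EXTENSIONS)) contains exactly this extension name.
    // Exact matching avoids false positives from similarly-named extensions.
    static boolean hasExtension(String extensions, String name) {
        if (extensions == null) {
            return false;
        }
        for (String ext : extensions.split(" ")) {
            if (ext.equals(name)) {
                return true;
            }
        }
        return false;
    }
}
```

Calling hasExtension(exts, "GL_EXT_color_buffer_half_float") would then tell you whether rendering to R16F is supported on the device.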
I've been trying to make a 2.5D engine with depth and normal map textures for a few weeks now, not unlike what's used here Linky. After thinking that drawing a depth map in the fragment shader from a texture was impossible due to ES 2.0 missing the gl_FragDepth variable, I found a tutorial for iOS where they used glBlendEquation with the GL_MIN/GL_MAX modes to "fake" depth buffering of the fragment to a framebuffer texture Linky. Unfortunately, GLES20.glBlendEquation makes the application crash on both my phones (SGS 1/2) with an UnsupportedOperationException. So I'm wondering if anyone has used this function with any success? GL_MIN/GL_MAX also seems to be missing from the Android OpenGL ES 2.0 spec, so I'm probably out of luck here...
Any ideas?
BTW, it does seem to work in GL11Ext, but since I'm using the fragment shader for normal mapping this won't work for me.
I was experimenting on my Vega tablet (Tegra) and this worked for me:
fragment shader:
#extension GL_NV_shader_framebuffer_fetch : require
// makes gl_LastFragColor accessible
precision highp float;
varying vec2 v_texcoord;
uniform sampler2D n_sampler;
void main()
{
    vec4 v_tex = texture2D(n_sampler, v_texcoord);
    gl_FragColor = min(gl_LastFragColor, v_tex); // MIN blending
}
Pretty easy, huh? But I'm afraid this will be NV-only.