I'm trying to encode a movie using MediaCodec and Surfaces (pixel-buffer mode works, but its performance is not good enough). However, every time I call eglSwapBuffers(), it fails with EGL_BAD_SURFACE, and as a result dequeueOutputBuffer() always returns -1 (INFO_TRY_AGAIN_LATER).
I have seen the examples on Bigflake and Grafika and I have another working project where everything is ok, but I need to get this working in another setup which is slightly different.
I currently have a GLSurfaceView which does screen rendering and is supplied with a custom EGLContextFactory/EGLConfigChooser. This allows me to create shared contexts to be used for separate OpenGL rendering in a native library. These are created using EGL10, but this should not be an issue as the underlying contexts only care about the client version, from what I know.
I've made sure the context is recordable, using the following config:
private android.opengl.EGLConfig chooseConfig14(android.opengl.EGLDisplay display) {
    // Configure EGL for recording and OpenGL ES 3.x
    int[] attribList = {
            EGL14.EGL_RED_SIZE, 8,
            EGL14.EGL_GREEN_SIZE, 8,
            EGL14.EGL_BLUE_SIZE, 8,
            EGL14.EGL_ALPHA_SIZE, 8,
            EGL14.EGL_RENDERABLE_TYPE, EGLExt.EGL_OPENGL_ES3_BIT_KHR,
            EGLExt.EGL_RECORDABLE_ANDROID, 1,
            EGL14.EGL_NONE
    };
    android.opengl.EGLConfig[] configs = new android.opengl.EGLConfig[1];
    int[] numConfigs = new int[1];
    if (!EGL14.eglChooseConfig(display, attribList, 0, configs, 0,
            configs.length, numConfigs, 0)) {
        return null;
    }
    return configs[0];
}
Now, I tried to simplify the scenario, so when recording is started I initialise an instance of MediaCodec as an encoder and call createInputSurface() on the GLSurfaceView's thread. After I have a surface, I turn it into an EGLSurface (EGL14), as follows:
EGLSurface createEGLSurface(Surface surface) {
    if (surface == null) return null;
    int[] surfaceAttribs = { EGL14.EGL_NONE };
    android.opengl.EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
    android.opengl.EGLConfig config = chooseConfig14(display);
    EGLSurface eglSurface = EGL14.eglCreateWindowSurface(display, config, surface, surfaceAttribs, 0);
    return eglSurface;
}
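For reference, the encoder initialisation described above is roughly the standard Surface-input MediaCodec setup. This is only a sketch of what that step might look like; the codec type, resolution, bitrate and frame-rate values here are illustrative placeholders, not values from the original project:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Sketch of a Surface-input encoder setup (illustrative parameters).
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called after configure() and before start()
Surface inputSurface = encoder.createInputSurface();
encoder.start();
```

The Surface returned here is what gets wrapped into an EGLSurface by createEGLSurface().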
When a new frame arrives from the camera, I send it to the screen and to another class that handles recording. That class just renders it to the EGLSurface built from MediaCodec's input surface, as follows:
public void drawToSurface(EGLSurface targetSurface, int width, int height, long timestampNano, boolean ignoreOrientation) {
    if (mTextures[0] == null) {
        Log.w(TAG, "Attempting to draw without a source texture");
        return;
    }

    EGLContext currentContext = EGL14.eglGetCurrentContext();
    EGLDisplay currentDisplay = EGL14.eglGetCurrentDisplay();

    EGL14.eglMakeCurrent(currentDisplay, targetSurface, targetSurface, currentContext);
    int error = EGL14.eglGetError();

    ShaderProgram program = getProgramForTextureType(mTextures[0].getTextureType());
    program.draw(width, height, TextureRendererView.LayoutType.LINEAR_HORIZONTAL, 0, 1, mTextures[0]);
    error = EGL14.eglGetError();

    EGLExt.eglPresentationTimeANDROID(currentDisplay, targetSurface, timestampNano);
    error = EGL14.eglGetError();

    EGL14.eglSwapBuffers(currentDisplay, targetSurface);
    error = EGL14.eglGetError();

    Log.d(TAG, "drawToSurface");
}
For some reason, eglSwapBuffers() fails and reports EGL_BAD_SURFACE and I haven't found a way to debug this further.
Update
I've tried querying the current surface after the call that makes it current, and it always returns a malformed surface (looking inside, I can see the handle is 0, and it always fails when queried). It looks like eglMakeCurrent() silently fails to bind the surface to the context.
Moreover, I've determined this issue appears on Qualcomm (Adreno) chips but not on Kirin, so it's definitely related to the vendor's OpenGL implementation. (It's somewhat funny, because I've always found Adreno to be more permissive when it comes to "bad" OpenGL configurations.)
Fixed! It turns out EGL10 and EGL14 appear to play nice together, but in some cases they fail in very subtle ways, such as the one I encountered.
In my case, the EGLContextFactory I wrote was creating a base OpenGL ES context using EGL10 and then created more shared contexts on demand, again using EGL10. While I could retrieve them using EGL14 (either in Java or C), and the context handles were always correct (sharing textures between contexts worked like a charm), things failed mysteriously when trying to use an EGLSurface created from a context of EGL10 origin... on Adreno chips.
The solution was to switch the EGLContextFactory to start from a context created with EGL14 and continue creating shared contexts using EGL14. For the GLSurfaceView, which still required EGL10, I had to use a hack:
@Override
public javax.microedition.khronos.egl.EGLContext createContext(EGL10 egl10, javax.microedition.khronos.egl.EGLDisplay eglDisplay, javax.microedition.khronos.egl.EGLConfig eglConfig) {
    EGLContext context = createContext();
    boolean success = EGL14.eglMakeCurrent(mBaseEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, context);
    if (!success) {
        int error = EGL14.eglGetError();
        Log.w(TAG, "Failed to make the new context current. Error: " + error);
    }
    // Get an EGL10 representation of our (now current) EGL14 context
    javax.microedition.khronos.egl.EGLContext egl14Context = egl10.eglGetCurrentContext();
    // That representation is not directly usable, but a context shared with it is
    javax.microedition.khronos.egl.EGLContext trueEGL10Context = egl10.eglCreateContext(eglDisplay, eglConfig, egl14Context, glAttributeList);
    destroyContext(context);
    return trueEGL10Context;
}
What this does is create a new shared context with EGL14, make it current, and then retrieve an EGL10 representation of it. That representation is not usable directly (for a reason I cannot exactly pin down), but another context shared from it works well. The starting EGL14 context can then be destroyed (in my case it's put into a stack and reused later on).
I would really love to understand why this hack is needed, but I'm glad to have a working solution still.
Related
I am using RecordableSurfaceView
https://github.com/spaceLenny/recordablesurfaceview/blob/master/recordablesurfaceview/src/main/java/com/uncorkedstudios/android/view/recordablesurfaceview/RecordableSurfaceView.java
For Android 6 (API 23) I get this error. Is there a way to fix this?
eglCreateWindowSurface() can only be called with an instance of Surface, SurfaceView, SurfaceTexture or SurfaceHolder at the moment, this will be fixed later.
The relevant code segment:
mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, eglConfig,
        RecordableSurfaceView.this, surfaceAttribs, 0);
EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);

// guarantee to only report surface as created once GL context
// associated with the surface has been created, and call on the GL thread
// NOT the main thread but BEFORE the codec surface is attached to the GL context
if (mRendererCallbacksWeakReference != null
        && mRendererCallbacksWeakReference.get() != null) {
    mRendererCallbacksWeakReference.get().onSurfaceCreated();
}

mEGLSurfaceMedia = EGL14.eglCreateWindowSurface(mEGLDisplay, eglConfig, mSurface,
        surfaceAttribs, 0);
GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
Write a null check for mEGLSurface and you're done:
(mEGLSurface != null)
In the code snippet of RecordableSurfaceView the second call to eglCreateWindowSurface passes in a mSurface variable which is initialized in doSetup via:
mSurface = MediaCodec.createPersistentInputSurface();
I would guess that your codec doesn't support this function and it's somehow returning null, which is causing the exception you see. Or perhaps it's being used by more than one codec or recorder instance?
The only other somewhat related question on SO I could find is: MediaCodec's Persistent Input Surface unsupported by Camera2 Session?
Can you at least clarify from the stack trace where in the library it's crashing? In other words, from which call of eglCreateWindowSurface?
I've written the following function to get the max texture size on Android. The function should run on the UI thread and cannot assume that an EGLContext has been setup already.
It seems to work on my test device, but I'm fairly new to both OpenGL and Android, and I am not fully confident about the code. I relied on trial and error and some copy-pasting without fully understanding things.
Do you see any issues with the code? Specifically:
Is this robust? i.e. have I unknowingly made any bad assumptions?
Have I cleaned up resources correctly?
Can it be made simpler?
Here is the code (in Kotlin):
private fun getGLMaxTextureSize(): Int {
    // Get display.
    val display = EGL14.eglGetDisplay(0)
    // Choose config.
    val configSpec = intArrayOf(EGL14.EGL_NONE)
    val configs = arrayOfNulls<EGLConfig>(1)
    var num_config = IntArray(1)
    EGL14.eglChooseConfig(display, configSpec, 0, configs, 0, 1, num_config, 0)
    // Create context.
    val attribs = intArrayOf(EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE)
    val context = EGL14.eglCreateContext(display, configs[0], EGL14.EGL_NO_CONTEXT, attribs, 0)
    EGL14.eglMakeCurrent(display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, context)
    // Read texture size.
    val intBuffer = IntArray(1)
    GLES20.glGetIntegerv(GLES20.GL_MAX_TEXTURE_SIZE, intBuffer, 0)
    // Cleanup.
    EGL14.eglMakeCurrent(display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT)
    EGL14.eglDestroyContext(display, context)
    return intBuffer[0]
}
Main goal: create a texture in one thread; use the texture in another thread.
What I have done so far.
I created two contexts and two surfaces, and made context1 and surface1 current in the main thread:
surface1 = eglCreateWindowSurface(display, config, engine->app->window, NULL);
context1 = eglCreateContext(display, config, NULL, attribList);
context2 = eglCreateContext(display, config, context1, attribList);

eglMakeCurrent(display, surface1, surface1, context1);

eglQuerySurface(display, surface1, EGL_WIDTH, &w);
eglQuerySurface(display, surface1, EGL_HEIGHT, &h);

EGLint attribpbf[] =
{
    EGL_HEIGHT, h,
    EGL_WIDTH, w,
    EGL_NONE
};

surface2 = eglCreatePbufferSurface(display, config, attribpbf);
Now I created a new thread and in that thread I made context2 and surface2 current.
eglMakeCurrent(display, surface2, surface2, context2);
Then I created a texture, did some rendering into it, and called glFlush().
I checked at this point and the texture was successfully created.
After that I tried to use this texture as a texture attachment in the main thread, but the result was a blank black screen. There was no GL error. I think the texture was not shared successfully.
Can you please tell me what I am doing wrong? Are there cases in which a texture cannot be shared?
I think you have run into a synchronization problem: you render into the texture in one context and then consume it immediately in the other, but the commands submitted to the producing context have not finished yet.
You could place a fence in the producing context's command stream and wait for it to signal in the consuming context.
reference:
https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glFenceSync.xhtml
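A minimal sketch of that fence approach, using the Java GLES 3.0 bindings (the same pattern applies in native code; on an ES 2.0 context you would need the EGL_KHR_fence_sync extension instead). Which thread plays producer and which plays consumer is an assumption here:

```java
import android.opengl.GLES30;

// Producer thread (the context that renders into the texture),
// immediately after the draw calls:
long fence = GLES30.glFenceSync(GLES30.GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
GLES30.glFlush(); // ensure the fence command is actually submitted to the GPU

// Consumer thread (the context that samples the texture), before drawing:
GLES30.glWaitSync(fence, 0, GLES30.GL_TIMEOUT_IGNORED); // server-side wait, does not block the CPU
// Or, to block the CPU until the producer's work completes:
// GLES30.glClientWaitSync(fence, GLES30.GL_SYNC_FLUSH_COMMANDS_BIT, 1_000_000_000L);
GLES30.glDeleteSync(fence);
```

The fence handle itself must of course be passed between the threads in a thread-safe way (e.g. a volatile field or a queue).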
This piece of code used to work on my Nexus 7 2012 under KitKat:
int[] maxSize = new int[1];
GLES10.glGetIntegerv(GL10.GL_MAX_TEXTURE_SIZE, maxSize, 0);
Under KitKat I could obtain the maximum texture size correctly, but after upgrading to the factory Lollipop image, this snippet causes a problem: it only returns 0. Logcat shows this output when the method is reached:
E/libEGL: call to OpenGL ES API with no current context (logged once per thread)
I already have android:hardwareAccelerated="true" in my Manifest.xml. Are there any API changes that I am not aware of that make the above code unusable? Please advise.
The error log points out the basic problem very clearly:
call to OpenGL ES API with no current context (logged once per thread)
You need a current OpenGL context in your thread before you can make any OpenGL calls, which includes your glGetIntegerv() call. This was always true. But it seems like in pre-Lollipop, there was an OpenGL context that was created in the frameworks, and that was sometimes (always?) current when app code was called.
I don't believe this was ever documented or intended behavior. Apps were always supposed to explicitly create a context, and make it current, if they wanted to make OpenGL calls. And it appears like this is more strictly enforced in Lollipop.
There are two main approaches to create an OpenGL context:
Create a GLSurfaceView (documentation). This is the easiest and most convenient approach, but only really makes sense if you plan to do OpenGL rendering to the display.
Use EGL14 (documentation). This provides a lower level interface that allows you to complete the necessary setup for OpenGL calls without creating a view or rendering to the display.
The GLSurfaceView approach is extensively documented with examples and tutorials all over the place. So I will focus on the EGL approach.
Using EGL14 (API level 17)
The following code assumes that you care about ES 2.0, some attribute values would have to be adjusted for other ES versions.
At the start of the file, import the EGL14 class, and a few related classes:
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES20;
Then get a hold of the default display, and initialize. This could get more complex if you have to deal with devices that could have multiple displays, but will be sufficient for a typical phone/tablet:
EGLDisplay dpy = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
int[] vers = new int[2];
EGL14.eglInitialize(dpy, vers, 0, vers, 1);
Next, we need to find a config. Since we won't use this context for rendering, the exact attributes aren't very critical:
int[] configAttr = {
        EGL14.EGL_COLOR_BUFFER_TYPE, EGL14.EGL_RGB_BUFFER,
        EGL14.EGL_LEVEL, 0,
        EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
        EGL14.EGL_SURFACE_TYPE, EGL14.EGL_PBUFFER_BIT,
        EGL14.EGL_NONE
};
EGLConfig[] configs = new EGLConfig[1];
int[] numConfig = new int[1];
EGL14.eglChooseConfig(dpy, configAttr, 0,
        configs, 0, 1, numConfig, 0);
if (numConfig[0] == 0) {
    // TROUBLE! No config found.
}
EGLConfig config = configs[0];
To make a context current, which we will need later, you need a rendering surface, even if you don't actually plan to render. To satisfy this requirement, create a small offscreen (Pbuffer) surface:
int[] surfAttr = {
        EGL14.EGL_WIDTH, 64,
        EGL14.EGL_HEIGHT, 64,
        EGL14.EGL_NONE
};
EGLSurface surf = EGL14.eglCreatePbufferSurface(dpy, config, surfAttr, 0);
Next, create the context:
int[] ctxAttrib = {
        EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
        EGL14.EGL_NONE
};
EGLContext ctx = EGL14.eglCreateContext(dpy, config, EGL14.EGL_NO_CONTEXT, ctxAttrib, 0);
Ready to make the context current now:
EGL14.eglMakeCurrent(dpy, surf, surf, ctx);
If all of the above succeeded (error checking was omitted), you can make your OpenGL calls now:
int[] maxSize = new int[1];
GLES20.glGetIntegerv(GLES20.GL_MAX_TEXTURE_SIZE, maxSize, 0);
Once you're all done, you can tear down everything:
EGL14.eglMakeCurrent(dpy, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
        EGL14.EGL_NO_CONTEXT);
EGL14.eglDestroySurface(dpy, surf);
EGL14.eglDestroyContext(dpy, ctx);
EGL14.eglTerminate(dpy);
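Error checking was omitted above for brevity. In real code you would want to verify each step; a small hypothetical helper like this (the name checkEglError is mine, not part of any API) can be called after each EGL call:

```java
import android.opengl.EGL14;

// Hypothetical helper: throws if the most recent EGL call failed.
static void checkEglError(String msg) {
    int error = EGL14.eglGetError();
    if (error != EGL14.EGL_SUCCESS) {
        throw new RuntimeException(msg + ": EGL error 0x" + Integer.toHexString(error));
    }
}
```

For example, calling checkEglError("eglCreatePbufferSurface") right after creating the surface would surface a bad config immediately instead of failing later at eglMakeCurrent().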
Using EGL10 (API level 1)
If you need something that works on earlier API levels, you can use EGL10 (documentation) instead of EGL14, which has been available since API level 1. The code above, adapted for EGL 1.0, looks like this:
import android.opengl.GLES10;
import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLContext;
import javax.microedition.khronos.egl.EGLDisplay;
import javax.microedition.khronos.egl.EGLSurface;
EGL10 egl = (EGL10)EGLContext.getEGL();
EGLDisplay dpy = egl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);
int[] vers = new int[2];
egl.eglInitialize(dpy, vers);
int[] configAttr = {
        EGL10.EGL_COLOR_BUFFER_TYPE, EGL10.EGL_RGB_BUFFER,
        EGL10.EGL_LEVEL, 0,
        EGL10.EGL_SURFACE_TYPE, EGL10.EGL_PBUFFER_BIT,
        EGL10.EGL_NONE
};
EGLConfig[] configs = new EGLConfig[1];
int[] numConfig = new int[1];
egl.eglChooseConfig(dpy, configAttr, configs, 1, numConfig);
if (numConfig[0] == 0) {
    // TROUBLE! No config found.
}
EGLConfig config = configs[0];
int[] surfAttr = {
        EGL10.EGL_WIDTH, 64,
        EGL10.EGL_HEIGHT, 64,
        EGL10.EGL_NONE
};
EGLSurface surf = egl.eglCreatePbufferSurface(dpy, config, surfAttr);
final int EGL_CONTEXT_CLIENT_VERSION = 0x3098; // missing in EGL10
int[] ctxAttrib = {
        EGL_CONTEXT_CLIENT_VERSION, 1,
        EGL10.EGL_NONE
};
EGLContext ctx = egl.eglCreateContext(dpy, config, EGL10.EGL_NO_CONTEXT, ctxAttrib);
egl.eglMakeCurrent(dpy, surf, surf, ctx);
int[] maxSize = new int[1];
GLES10.glGetIntegerv(GLES10.GL_MAX_TEXTURE_SIZE, maxSize, 0);
egl.eglMakeCurrent(dpy, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_SURFACE,
        EGL10.EGL_NO_CONTEXT);
egl.eglDestroySurface(dpy, surf);
egl.eglDestroyContext(dpy, ctx);
egl.eglTerminate(dpy);
Note that this version of the code uses an ES 1.x context. The reported maximum texture size can be different for ES 1.x and ES 2.0.
The error message is saying that you are calling the GLES function before the OpenGL ES context exists. I have found that Lollipop is stricter about correctness in several areas, which may be why the problem appears now, or there may be some difference in the order in which your app starts up that is causing it. If you posted more of your initialisation code, the reason might be clearer.
Typically you have a class that implements GLSurfaceView.Renderer that has a function:
public void onSurfaceCreated(GL10 gl, EGLConfig config)
In this function, you should be able to call gl.glGetIntegerv safely as at this point you know that the OpenGL ES context has been created. If you are calling it earlier than this, then that would explain the error you are seeing.
For the OpenGL Android project I am working on, I need ES 2.0, but I also need the control over rendering buffers/surfaces that I am accustomed to getting through EGL. I cannot figure out any way to render to an offscreen buffer using GLSurfaceView and then never display that buffer. Even if I use GLSurfaceView.EGLContextFactory, I cannot see a way to accomplish this without EGL 1.2 functions/constants that are not included in Android's EGL10 package (e.g. EGL_CONTEXT_CLIENT_VERSION).
So the obvious questions are: 1) is there a way to use EGL with ES 2.0 despite the omission of EGL_CONTEXT_CLIENT_VERSION and eglBindAPI()? 2) is there some newer API for setting the rendering context used before GLSurfaceView's surfaceCreated(EGLConfig) callback is invoked?
If you can live with the default EGLContextFactory and EGLConfigChooser, you can use the setEGLContextClientVersion() method of the GLSurfaceView.
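For example (MyRenderer is a hypothetical name for your GLSurfaceView.Renderer implementation):

```java
import android.opengl.GLSurfaceView;

GLSurfaceView view = new GLSurfaceView(context);
// Must be called before setRenderer(); requests an ES 2.0 context
view.setEGLContextClientVersion(2);
view.setRenderer(new MyRenderer());
```

With this, the default factory creates the ES 2.0 context for you and no custom EGLContextFactory is needed.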
Otherwise, if you're writing your own EGLContextFactory and EGLConfigChooser, just define the constants yourself. In the config chooser, define
private static final int EGL_OPENGL_ES2_BIT = 4;
then pass this as the value for EGL_RENDERABLE_TYPE to eglChooseConfig, together with other attributes you desire:
int attribs[] = {
        EGL10.EGL_RED_SIZE, mRedSize,
        EGL10.EGL_GREEN_SIZE, mGreenSize,
        EGL10.EGL_BLUE_SIZE, mBlueSize,
        EGL10.EGL_ALPHA_SIZE, mAlphaSize,
        EGL10.EGL_DEPTH_SIZE, mDepthSize,
        EGL10.EGL_SAMPLE_BUFFERS, mSampleBuffers,
        EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL10.EGL_NONE
};
For the context factory, define
private static final int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
and use this when creating a context:
public EGLContext createContext(EGL10 egl, EGLDisplay display, EGLConfig eglConfig)
{
    int[] attrib_list = {
            EGL_CONTEXT_CLIENT_VERSION, 2,
            EGL10.EGL_NONE
    };
    EGLContext context = egl.eglCreateContext(display, eglConfig, EGL10.EGL_NO_CONTEXT, attrib_list);
    return context;
}
When you've written those, pass them to setEGLContextFactory and setEGLConfigChooser, respectively.
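The wiring might look like this inside your GLSurfaceView subclass constructor (MyConfigChooser, MyContextFactory and MyRenderer are hypothetical names for the classes sketched above):

```java
// Both setters must be called before setRenderer(), which starts the GL thread.
setEGLConfigChooser(new MyConfigChooser());
setEGLContextFactory(new MyContextFactory());
setRenderer(new MyRenderer());
```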