I'm porting an iPhone app to Android, and I need to use OpenGL framebuffers. I have a Nexus One, and a call to glGetString(GL_EXTENSIONS) shows that the Nexus One supports the same framebuffer extension as the iPhone. However, I can't seem to call functions related to the OpenGL extension in my GLSurfaceView. When I call a simple framebuffer get function, I get an UnsupportedOperationException.
I can't seem to resolve this issue, and I must have framebuffers to continue development. Do I need to pass some options in when the OpenGL context is created to get a fully capable OpenGL context object? Here's the block of code that I'm trying to run that determines the capabilities of the hardware. It claims to support the extension and my gl object is an instance of GL11ExtensionPack, but the call to glGetFramebufferAttachmentParameterivOES fails with an UnsupportedOperationException.
public void runEnvironmentTests()
{
    String extensions = gl.glGetString(GL11.GL_EXTENSIONS);
    Log.d("Layers Graphics", extensions);

    if (gl instanceof GL11ExtensionPack) {
        Log.d("Layers Graphics", "GL11 Extension Pack supported");
        GL11ExtensionPack g = (GL11ExtensionPack) gl;
        int[] r = new int[1];
        try {
            g.glGetFramebufferAttachmentParameterivOES(
                    GL11ExtensionPack.GL_FRAMEBUFFER_OES,
                    GL11ExtensionPack.GL_COLOR_ATTACHMENT0_OES,
                    GL11.GL_TEXTURE_2D, r, 0);
            Log.d("Layers Graphics", "Framebuffers are supported");
        } catch (UnsupportedOperationException e) {
            e.printStackTrace();
            framebuffersSupported = false;
            Log.d("Layers Graphics", "Framebuffers are NOT supported");
        }
    }
}
If anyone has successfully used the OES framebuffer extension (GL_OES_framebuffer_object), please let me know. I'm beginning to think it may just not be implemented in the Java API!
There's currently a bug in Android in which the Java versions of these functions are unimplemented. The only way to use these extension functions at the moment is to use the NDK.
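If it helps, here is a minimal sketch of what the Java side of such an NDK bridge could look like. The class, method, and library names are hypothetical placeholders; the actual GL_OES_framebuffer_object calls (glGenFramebuffersOES, glBindFramebufferOES, glDeleteFramebuffersOES) would live in the accompanying C code, and everything still has to run on the thread that owns the GL context:

// Hypothetical Java-side bridge; the native implementation would call the
// GL_OES_framebuffer_object entry points (glGenFramebuffersOES etc.) from C.
public class NativeFbo {
    static {
        System.loadLibrary("nativefbo"); // name of your NDK library (assumed)
    }

    // Returns a new framebuffer object id, or 0 on failure.
    public static native int createFramebuffer();

    // Binds the given framebuffer; pass 0 to rebind the default framebuffer.
    public static native void bindFramebuffer(int framebufferId);

    // Deletes the framebuffer when it is no longer needed.
    public static native void deleteFramebuffer(int framebufferId);
}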
Related
I'm tasked with converting an Android app from GLES1 to GLES3. The app does all its OpenGL calls in JNI, runs in a thread, and calls into Java to run basic functions like:
InitOpenGL();
MakeContextCurrent();
GLSwapBuffer();
The app maintains its own clock and so on, so the main loop of the app looks something like this (simplified):
JavaBridge->InitOpenGL();
while (stillRunning())
{
    SleepUntilRightTime();
    UpdateEverything();
    if (shouldDraw())
    {
        JavaBridge->MakeContextCurrent();
        DrawEverything();
        JavaBridge->GLSwapBuffers();
    }
}
So, to accomplish this, the app has its own OpenGL factory, which initializes an OpenGL ES 1.1 context.
I'll cut out everything unnecessary for brevity; here are the basics (all error checking removed to keep it short):
public class GLView extends SurfaceView implements SurfaceHolder.Callback
{
    EGL10 m_GL;
    EGLContext m_GLContext;
    EGLDisplay m_GLDisplay = null;
    EGLSurface m_GLSurface = null;
    EGLConfig m_GLConfig = null;

    public void InitOpenGL()
    {
        m_GL = (EGL10) EGLContext.getEGL();
        m_GLDisplay = m_GL.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);
        m_GL.eglInitialize(m_GLDisplay, null);

        EGLConfig[] configs = new EGLConfig[1];
        int[] config_count = new int[1];
        int[] specs = { EGL10.EGL_ALPHA_SIZE, 8, EGL10.EGL_DEPTH_SIZE, 16, EGL10.EGL_STENCIL_SIZE, 8, EGL10.EGL_NONE };
        m_GL.eglChooseConfig(m_GLDisplay, specs, configs, 1, config_count);
        m_GLConfig = configs[0]; // keep the chosen config for context/surface creation

        m_GLContext = m_GL.eglCreateContext(m_GLDisplay, m_GLConfig, EGL10.EGL_NO_CONTEXT, null);

        SurfaceHolder h = getHolder();
        m_GLSurface = m_GL.eglCreateWindowSurface(m_GLDisplay, m_GLConfig, h, null);
        m_GL.eglMakeCurrent(m_GLDisplay, m_GLSurface, m_GLSurface, m_GLContext);
    }

    public void MakeContextCurrent()
    {
        m_GL.eglMakeCurrent(m_GLDisplay, m_GLSurface, m_GLSurface, m_GLContext);
    }

    public void SwapBuffers()
    {
        m_GL.eglSwapBuffers(m_GLDisplay, m_GLSurface);
    }
}
This all works beautifully: the app runs in its own thread and paints the screen whenever it sees fit (it's a game, by the way, hence the constant loop).
Now, I was hoping that to turn this into an OpenGL ES 3.0 context I could simply request a version number somewhere. I've tried a few things without success (setting GL_VERSION in the attrib list passed to eglCreateContext, making sure the right libraries are linked, fixing the manifest, and so on), but no matter what I do, OpenGL calls in the JNI code either do nothing or crash the program.
I can't even get glClear to work unless I revert everything back to square one. Does anyone have any advice on turning this into a 3.0-capable context?
Okay, I managed to figure this out. For anyone using modern Android, you'll find that EGL10.EGL_CONTEXT_CLIENT_VERSION is not defined. It seems like EGL_VERSION would be the substitute, but it's not.
Why isn't EGL_CONTEXT_CLIENT_VERSION defined? Is it deprecated? Is it shunned? We'll never know. But we DO know that if it WAS defined, it would be 0x3098.
So, making this all magically work was as simple as saying:
int[] attrib_list = new int[] { 0x3098, 3, EGL10.EGL_NONE }; // 0x3098 == EGL_CONTEXT_CLIENT_VERSION, request client version 3
m_GLContext = m_GL.eglCreateContext(m_GLDisplay, m_GLConfig, EGL10.EGL_NO_CONTEXT, attrib_list);
Is it dangerous to do this? Probably. I'll do a little more research into it. If I never return to edit this, then I found no real answer yea or nay.
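To avoid the bare magic number, you can define the constant yourself and then ask EGL what you actually got back. A minimal sketch under the same assumptions as the code above (EGL10 bindings, m_GL/m_GLDisplay/m_GLConfig already set up, and an EGL implementation of at least 1.3, which any GLES2-capable Android device provides):

// EGL_CONTEXT_CLIENT_VERSION is missing from the EGL10 interface, so define it locally.
private static final int EGL_CONTEXT_CLIENT_VERSION = 0x3098;

// Request an OpenGL ES 3.x context.
int[] attribList = { EGL_CONTEXT_CLIENT_VERSION, 3, EGL10.EGL_NONE };
m_GLContext = m_GL.eglCreateContext(m_GLDisplay, m_GLConfig, EGL10.EGL_NO_CONTEXT, attribList);

// Verify what the driver actually created; a driver that cannot satisfy the
// request may hand back EGL10.EGL_NO_CONTEXT instead of silently downgrading.
int[] clientVersion = new int[1];
m_GL.eglQueryContext(m_GLDisplay, m_GLContext, EGL_CONTEXT_CLIENT_VERSION, clientVersion);
android.util.Log.d("GLView", "EGL context client version: " + clientVersion[0]);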
Target: Android API >=23, OpenGL ES 2.
The following code
private void deleteFBO()
{
    android.util.Log.e("FBO", "deleting " + mFramebufferID);

    int[] textureIds = new int[1];
    int[] mFBORenderToTexture = new int[1];
    textureIds[0] = mTextureID;
    mFBORenderToTexture[0] = mFramebufferID;

    if (GLES20.glGetError() != GLES20.GL_NO_ERROR)
        android.util.Log.e("FBO", "error before deleting");

    GLES20.glDeleteTextures(1, textureIds, 0);
    GLES20.glDeleteFramebuffers(1, mFBORenderToTexture, 0);

    if (GLES20.glGetError() != GLES20.GL_NO_ERROR)
        android.util.Log.e("FBO", "error after deleting");
}
doesn't give me any errors (i.e. I never see the 'error before/after deleting' messages), even though it is definitely called from a thread that does NOT hold any OpenGL context.
How is that possible? Or do the glDelete*() calls really fail and my code just doesn't detect it?
It seems I don't understand which OpenGL calls need to be made while holding the context. glDrawArrays certainly gives me an error when I call it without holding the context, and I thought the context had to be current in every single case, including the two glDelete*() calls above.
WHICH OpenGL calls need to be made when holding the context?
All of them. Which includes glGetError(). This means that your error checks themselves are invalid if there is no current context.
That said, I have found some claims that glGetError() returns GL_INVALID_OPERATION if there is no current context, but I have not been able to find that behavior defined in the spec. So until somebody shows me otherwise, I'll stick to my claim that calling glGetError() without a current context gives undefined (i.e. implementation-dependent) results.
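A practical way to sidestep the whole question on Android is to never issue the deletes from your own thread at all, but to hand them to the thread that owns the context. A minimal sketch, assuming the FBO is used by a GLSurfaceView renderer and that mGLSurfaceView is that view (an assumption, since the original post doesn't show how the context is created); GLSurfaceView.queueEvent runs the runnable on the GL thread, and EGL14 is available from API 17, so it fits the stated API >= 23 target:

// Run the deletion on the GL thread so a context is guaranteed to be current.
mGLSurfaceView.queueEvent(new Runnable() {
    @Override
    public void run() {
        // Optional sanity check: is any context current on this thread?
        if (android.opengl.EGL14.eglGetCurrentContext().equals(android.opengl.EGL14.EGL_NO_CONTEXT)) {
            android.util.Log.e("FBO", "no current context, skipping delete");
            return;
        }
        GLES20.glDeleteTextures(1, new int[] { mTextureID }, 0);
        GLES20.glDeleteFramebuffers(1, new int[] { mFramebufferID }, 0);
    }
});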
I want to open an Android tablet's camera and get the data from it at the C level. After that I will modify the data, and doing this in C should be efficient.
I'm currently thinking of using the V4L2 C API. But V4L2's open function needs the camera's device name, such as '/dev/video0', and I can't find anything like that in my tablet's /dev folder. Besides, I'm not sure whether V4L2 is the right solution at all.
Does anyone know anything about this?
on my device "OpenCV for Android" does not provide required performance neither in 'native' mode nor in 'java' mode. it gives FPS=2 in 1920x1080, in same time when java MediaRecorder can record 1920x1080 with FPS=15
I'm trying to solve it using the code from Android Open Source Project used by native Camera application:
static void android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz,
    jobject weak_this, jint cameraId)
{
    sp<Camera> camera = Camera::connect(cameraId);
    if (camera == NULL) {
        jniThrowRuntimeException(env, "Fail to connect to camera service");
        return;
    }

    // make sure camera hardware is alive
    if (camera->getStatus() != NO_ERROR) {
        jniThrowRuntimeException(env, "Camera initialization failed");
        return;
    }

    jclass clazz = env->GetObjectClass(thiz);
    if (clazz == NULL) {
        jniThrowRuntimeException(env, "Can't find android/hardware/Camera");
        return;
    }

    // We use a weak reference so the Camera object can be garbage collected.
    // The reference is only used as a proxy for callbacks.
    sp<JNICameraContext> context = new JNICameraContext(env, weak_this, clazz, camera);
    context->incStrong(thiz);
    camera->setListener(context);

    // save context in opaque field
    env->SetIntField(thiz, fields.context, (int)context.get());
}
You can always build a JNI method for the Java classes to get access from C. Another way could be using OpenCV for Android: OpenCV4Android
This gives you an interface to the camera but, as far as I remember, there is currently no support for Android 4.3+.
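If you stay with the Java camera API and only move the per-frame processing into C, the Java side of that JNI bridge can be as small as a preview callback that forwards each buffer to a native method. A rough sketch with hypothetical names (FrameForwarder, processFrame, and the "nativecam" library are placeholders for your own code):

import android.hardware.Camera;

public class FrameForwarder implements Camera.PreviewCallback {
    static {
        System.loadLibrary("nativecam"); // hypothetical NDK library
    }

    // Implemented in C/C++; receives the NV21 preview buffer for processing.
    private static native void processFrame(byte[] nv21, int width, int height);

    private final int mWidth;
    private final int mHeight;

    public FrameForwarder(int width, int height) {
        mWidth = width;
        mHeight = height;
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        processFrame(data, mWidth, mHeight);
    }
}

Attaching it with camera.setPreviewCallbackWithBuffer() together with camera.addCallbackBuffer() avoids allocating a new byte[] on every frame.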
I'm developing a mobile application that runs on Android and iOS. It's capable of real-time processing of a video stream. On Android I get the preview video stream of the camera via android.hardware.Camera.PreviewCallback.onPreviewFrame. I decided to use the NV21 format, since it should be supported by all Android devices, whereas RGB isn't (or only RGB565 is).
For my algorithms, which mostly are for pattern recognition, I need grayscale images as well as color information. Grayscale is not a problem, but the color conversion from NV21 to BGR takes way too long.
As described, I use the following method to capture the images.
In the app, I override the onPreviewFrame handler of the Camera. This is done in CameraPreviewFrameHandler.java:
@Override
public void onPreviewFrame(byte[] data, Camera camera)
{
    try {
        AvCore.getInstance().onFrame(data, _prevWidth, _prevHeight, AvStreamEncoding.NV21);
    } catch (NativeException e) {
        e.printStackTrace();
    }
}
The onFrame function then calls a native function which fetches the data from the Java objects as local references. The buffer is converted to an unsigned char* byte stream and passed to the following C++ function, which uses OpenCV to convert from NV21 to BGR:
void CoreManager::api_onFrame(unsigned char* rImageData, avStreamEncoding_t vImageFormat, int vWidth, int vHeight)
{
    // rImageData is a local JNI reference to the Java byte array "data" from onPreviewFrame
    Mat bgrMat;   // Holds the converted image
    Mat origImg;  // Holds the original image (OpenCV wrapper around rImageData)
    double ts;    // for profiling

    switch(vImageFormat)
    {
    // other formats
    case NV21:
        origImg = Mat(vHeight + vHeight/2, vWidth, CV_8UC1, rImageData); // fast, only creates a header around rImageData
        bgrMat = Mat(vHeight, vWidth, CV_8UC3); // Prepare Mat for target image

        ts = avUtils::gettime();                    // PROFILING START
        cvtColor(origImg, bgrMat, CV_YUV2BGR_NV21); // 3-channel BGR to match the CV_8UC3 target Mat
        _onFrameBGRConversion.push_back(avUtils::gettime()-ts); // PROFILING END
        break;
    }

    [...APPLICATION LOGIC...]
}
As one might conclude from the comments in the code, I already profiled the conversion, and it turned out that it takes ~30 ms on my Nexus 4, which is unacceptably long for such a "trivial" pre-processing step. (My profiling methods are double-checked and work properly for real-time measurement.)
Now I'm desperately trying to find a faster implementation of this color conversion from NV21 to BGR. This is what I've already done:
1. Adopted the code "convertYUV420_NV21toRGB8888" provided in this topic to C++ (a multiple of the OpenCV conversion time)
2. Modified the code from 1 to use only integer operations (twice the conversion time of the OpenCV solution)
3. Browsed through a couple of other implementations, all with similar conversion times
4. Checked the OpenCV implementation; they use a lot of bit shifting to get performance. I guess I'm not able to do better on my own
Do you have suggestions, know of good implementations, or even have a completely different way to work around this problem? I somehow need to capture RGB/BGR frames from the Android camera, and it should work on as many Android devices as possible.
Thanks for your replies!
Did you try libyuv? I used it in the past, and if you compile it with NEON support it uses assembly code optimized for ARM processors. You can start from there to further optimize for your particular situation.
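Since libyuv is a C++ library, it is usually reached from an Android app through a small JNI wrapper; the Java side is just a declaration. A rough sketch with hypothetical names (the "yuvbridge" library and nv21ToArgb method are placeholders; the native side would call libyuv's NV21-to-ARGB conversion, built with NEON enabled):

public final class YuvConverter {
    static {
        System.loadLibrary("yuvbridge"); // hypothetical JNI wrapper around libyuv
    }

    // Converts an NV21 buffer to 32-bit ARGB; implemented in C++ on top of libyuv.
    public static native void nv21ToArgb(byte[] nv21, int[] argbOut, int width, int height);
}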
For whatever reason, my EGL context client version isn't getting set in my OpenGL application. I set up the context by simply doing the following:
final boolean supportEs2 = configurationInfo.reqGlEsVersion >= 0x20000;

if (supportEs2) { // <-- this resolves to true
    mGLView.setEGLContextClientVersion(2);
    try {
        mGLView.setRenderer(new PongDroidRenderer(getApplicationContext()));
    } catch (IOException e) {
        e.printStackTrace();
    }
}
After that, I get a runtime exception when trying to compile a shader because the context isn't current. After some research I figured out that I can't make OpenGL calls from the main thread; however, as far as I know my application itself isn't multi-threaded (the system is, of course). So I'm hoping someone here has an idea of how to get this working; if more information is needed, just say the word and I'll post it here.
Are you using a GLSurfaceView? The OpenGL context is only valid on the GLSurfaceView's rendering thread (the one that calls onDrawFrame, onSurfaceChanged, onSurfaceCreated, etc.). You should compile your shaders in those callbacks.
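A minimal sketch of what that looks like, assuming a GLES 2.0 pipeline; the class name, shader sources, and helper method are placeholders rather than the asker's actual renderer:

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

public class MyRenderer implements GLSurfaceView.Renderer {
    private static final String VS =
            "attribute vec4 aPosition; void main() { gl_Position = aPosition; }";
    private static final String FS =
            "precision mediump float; void main() { gl_FragColor = vec4(1.0); }";

    private int mProgram;

    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        // Runs on the GLSurfaceView's GL thread, so a context is current here.
        int vs = compile(GLES20.GL_VERTEX_SHADER, VS);
        int fs = compile(GLES20.GL_FRAGMENT_SHADER, FS);
        mProgram = GLES20.glCreateProgram();
        GLES20.glAttachShader(mProgram, vs);
        GLES20.glAttachShader(mProgram, fs);
        GLES20.glLinkProgram(mProgram);
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        GLES20.glClearColor(0f, 0f, 0f, 1f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glUseProgram(mProgram);
        // ... issue draw calls here ...
    }

    private static int compile(int type, String source) {
        int shader = GLES20.glCreateShader(type);
        GLES20.glShaderSource(shader, source);
        GLES20.glCompileShader(shader);
        // A real renderer should also check GL_COMPILE_STATUS here.
        return shader;
    }
}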
You also have to set the minimum supported Android API version to 8, as OpenGL ES 2.0 is only supported on Android 2.2 (API 8) and higher. And, as Tim said, the actual GL work, including compiling your shaders, has to happen in onSurfaceCreated(), onSurfaceChanged(), or onDrawFrame().