For our game on Android we have mainly been using the NDK: the game itself is written in C++ with a thin Java wrapper, so that we can support all devices with GLES 2.0 hardware.
Our implementation is fairly standard; using the GLSurfaceView.Renderer callbacks we:
* create/resume the C++ part of the game in onSurfaceChanged(),
* render in onDrawFrame(),
* inform the engine about the lost GL context in destroyContext(), which usually occurs when the app pauses or gets destroyed.
When the GL context gets recreated (when resuming, in onSurfaceChanged()) we pass that information back to the game and reload all GL resources. Between onSurfaceChanged() and the first onDrawFrame(), though, the screen is black. I have noticed quite a few 3D games that don't have this kind of problem (e.g. Gun Bros), and they don't seem to reload their resources either (unless they keep everything in memory and quickly load it back in).
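For reference, the Java wrapper side of the setup above looks roughly like this (a simplified sketch; the nativeOn*() JNI method names here are placeholders, not our exact API):

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLSurfaceView;

public class EngineRenderer implements GLSurfaceView.Renderer {
    static { System.loadLibrary("game"); }   // the C++ engine

    private native void nativeOnSurfaceChanged(int width, int height); // create/resume engine, reload GL resources
    private native void nativeOnDrawFrame();                           // render one frame
    public  native void nativeOnContextLost();                         // called from our destroyContext() path

    @Override public void onSurfaceCreated(GL10 unused, EGLConfig config) { }

    @Override public void onSurfaceChanged(GL10 unused, int width, int height) {
        nativeOnSurfaceChanged(width, height);
    }

    @Override public void onDrawFrame(GL10 unused) {
        nativeOnDrawFrame();
    }
}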
Any info on why this is happening?
Just an idea, but if you dump the screen with glReadPixels (or similar) in destroyContext(), and the first thing you do in onSurfaceChanged() is upload that image and draw it, you will show the user a valid frame before your first onDrawFrame().
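A minimal sketch of that idea in Java/GLES 2.0 terms (the capture has to run on the GL thread while the old context is still current; the class and helper names here are made up):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import android.opengl.GLES20;

public class SnapshotHelper {
    private ByteBuffer lastFrame;   // CPU-side copy of the last rendered frame
    private int width, height;

    // Call while the old context is still current, before it is lost.
    public void captureFrame(int w, int h) {
        width = w;
        height = h;
        lastFrame = ByteBuffer.allocateDirect(w * h * 4).order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, lastFrame);
    }

    // Call from onSurfaceChanged(), before the real assets are reloaded.
    // Returns the texture name holding the snapshot, or 0 if none was captured;
    // draw a full-screen quad with it on the very first frame.
    public int uploadSnapshot() {
        if (lastFrame == null) return 0;
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        lastFrame.position(0);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, lastFrame);
        return tex[0];
    }
}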
Related
I am happily using libgdx for an Android game application.
At a specific point in the game, I render the scene into a FrameBuffer object and then use its attached color texture to draw that image back to the screen (so that I can render a half-transparent screen with full-color rectangular zones).
The documentation for FrameBuffer says:
FrameBuffers are managed. In case of an OpenGL context loss, which only happens on Android when a user switches to another application or receives an incoming call, the framebuffer will be automatically recreated.
And that works perfectly: I can switch to other applications, put the device in sleep mode, go back to the application, and everything including the framebuffer works as usual.
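For context, the FrameBuffer usage is roughly along these lines (a sketch, not my exact code; "batch" is an existing SpriteBatch and the FBO matches the screen size):

FrameBuffer fbo = new FrameBuffer(Pixmap.Format.RGBA8888,
        Gdx.graphics.getWidth(), Gdx.graphics.getHeight(), false);

// Render the normal scene into the FBO instead of the screen...
fbo.begin();
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
// ... draw the scene here ...
fbo.end();

// ...then draw its color texture back to the screen. The FBO texture is
// upside down in screen coordinates, so flip it via a TextureRegion.
TextureRegion region = new TextureRegion(fbo.getColorBufferTexture());
region.flip(false, true);
batch.begin();
batch.setColor(1, 1, 1, 0.5f);   // half-transparent full-screen copy
batch.draw(region, 0, 0);
batch.setColor(1, 1, 1, 1);
// ... draw the full-color rectangular zones on top ...
batch.end();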
The problem begins when I try to change the device's language while the application is running (using the Android settings menu).
After I change the language to something else, the framebuffer's attached texture becomes completely black (either rendering to it fails, or rendering it to the screen does).
The incredible thing is that, even if I restart the application (i.e. the application reaches its onDestroy() method and exits), the problem does NOT go away; it only does when I kill the application process from the task manager.
I could probably solve this by adding a System.exit(0) inside the onDestroy() method, but does anyone have an idea of what exactly happens when I change the device language?
I cannot think of any possible relation between that and the framebuffer object's state (the other textures keep working as usual!); if anyone could enlighten me it would be greatly appreciated.
I have an OpenGL ES app that works both on iOS and Android. Most of the code was written ages ago by another person and now I have to maintain it. The OpenGL usage seems fairly simple (the game is 2D and uses only textured sprites in a simple manner). But I see two major differences in how the graphics code is implemented on iOS and Android:
1) The iOS code contains this:
glGenFramebuffersOES(1, &m_defaultFramebuffer);
glGenRenderbuffersOES(1, &m_colorRenderbuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, m_defaultFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_colorRenderbuffer);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, m_colorRenderbuffer);
and the Android code does not.
2) When the Android app goes to the background, all OpenGL textures are destroyed (glDeleteTextures) and EGL is shut down using eglTerminate. When the app returns from sleep, EGL is re-initialized and the textures are re-created.
The iOS code does none of this; it just pauses the rendering loop by calling [m_displayLink setPaused:YES];
Other OpenGL-related code is the same for iOS and Android.
Everything works well on both platforms, but I want to have a full understanding of what's going on. Can anybody explain the rationale behind these two differences?
1)
This is just a difference in the APIs. On iOS, you create your own framebuffer to render into when the App starts. On Android the framebuffer is created automatically by GLSurfaceView, so the App doesn't need to create its own.
2)
On iOS, when your App goes to the background, the OpenGL context is preserved, which means all your textures and buffers are still there when it returns to the foreground.
Older versions of Android had only a single OpenGL context, so it was destroyed whenever your App went to the background (so that other Apps could then make use of it).
Later versions of Android do have the option to behave more like iOS, via setPreserveEGLContextOnPause. However, for this to work the Android version has to be 3.x or above (API level 11) and the device must also support it.
When it is not used or supported, the App must delete and re-create all its OpenGL resources when going between background and foreground, which is what your App appears to be doing.
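A minimal sketch of that option, assuming a GLSurfaceView-based setup (the class name is a placeholder):

import android.content.Context;
import android.opengl.GLSurfaceView;
import android.os.Build;

public class GameSurfaceView extends GLSurfaceView {
    public GameSurfaceView(Context context, GLSurfaceView.Renderer renderer) {
        super(context);
        setEGLContextClientVersion(2);          // request a GLES 2.0 context
        if (Build.VERSION.SDK_INT >= 11) {
            // Keep the EGL context alive across onPause()/onResume() where the
            // device supports it; otherwise the context is still lost and the
            // resources must be re-created in onSurfaceCreated() as before.
            setPreserveEGLContextOnPause(true);
        }
        setRenderer(renderer);
    }
}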
I would like to write an application for Android which displays stuff on screen using the framebuffer. This will run only on a specific rooted device, so permissions etc. are not a problem. The same application (a simple test version, anyway) is already running okay on PC/Linux.
The questions:
How do I keep the Android OS from accessing the framebuffer? While my application is running, I would like the OS to never touch the framebuffer: no writes and no ioctls. What do I need to do to get exclusive use of the framebuffer, and then (when my application quits) give it back to the OS?
Are there any differences between Android framebuffer and Linux framebuffer to watch out for?
P.S. I would like to start my application as a regular Android application (with some native code); it just has no visible UI except for the framebuffer draws, which take over the whole screen. It would be nice to still be able to get events from the OS.
See also:
http://www.kandroid.org/online-pdk/guide/display_drivers.html
Hi Alex. I'm not sure why/how to stop the Android OS from writing to the framebuffer, but as long as your Android application is visible and on top, you have control over what is displayed.
Your application should have an activity with a SurfaceView (if you want your application to hide the title bar, call this in your activity's onCreate()):
requestWindowFeature(Window.FEATURE_NO_TITLE);
Your activity should implement SurfaceHolder.Callback to be notified when the surface is ready to be filled with your framebuffer. Get the SurfaceHolder object via SurfaceView.getHolder() in case you want to set the pixel format of the view, etc.
Once the surfaceCreated() callback has been called, you can safely pass the view's Surface (passing the width and height may be a good idea too) to native code, so that you can fill it with your framebuffer using the ANativeWindow API.
Check the NDK sample code to see how to use it: NDK documentation
SurfaceHolder.Callback documentation
SurfaceHolder documentation
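A minimal sketch of the Java side described above (the JNI method names nativeSurfaceReady/nativeSurfaceGone and the library name are placeholders, not a real API):

import android.app.Activity;
import android.os.Bundle;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;

public class FbActivity extends Activity implements SurfaceHolder.Callback {
    static { System.loadLibrary("mynative"); }           // your NDK library

    private native void nativeSurfaceReady(Surface surface, int width, int height);
    private native void nativeSurfaceGone();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);    // hide the title bar
        SurfaceView view = new SurfaceView(this);
        view.getHolder().addCallback(this);               // surface lifecycle callbacks
        setContentView(view);
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // The surface is ready: hand it to native code, which wraps it in an
        // ANativeWindow and starts drawing into it.
        nativeSurfaceReady(holder.getSurface(), width, height);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        nativeSurfaceGone();                              // stop drawing before the surface goes away
    }
}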
Essentially you need to do the following (on JB/KitKat):
Get the native window (ANativeWindow) associated with the Surface via ANativeWindow_fromSurface.
Acquire a reference to the ANativeWindow via ANativeWindow_acquire.
Set the buffer geometry (window, width, height, pixel format) via ANativeWindow_setBuffersGeometry.
Lock the native window's next buffer with ANativeWindow_lock and copy your stored framebuffer into it (apply any dirty rectangle here).
Finally, unlock and post the changes for display via ANativeWindow_unlockAndPost.
Go through the NDK sample examples in case you need sample code: NDK documentation
First a bit of context: I'm developing a video game for both the Android and iPhone platforms. The way the iPhone works, when a user hits the home button and returns to the game later, in most circumstances the game will pick up RIGHT where it left off with no hiccups. I immediately jump back into my rendering and game loop. Setting this up on the iOS platform was absolutely easy for me to do. Accomplishing this on Android has left me in a fit of rage after hours of wrestling with Google results :P
I have my own OpenGL setup for both the iPhone and Android, and everything has been working great. The root of the problem, I believe, is that I need a SurfaceHolder to create an OpenGL context. Here's the sucky part: when the screen loses focus of the game (i.e. the home button is hit), Android calls surfaceDestroyed in my SurfaceView class and basically KILLS my OpenGL context. I could recreate a new one with a new SurfaceHolder when surfaceCreated is called, but then I need to reload all of my art assets, which defeats the purpose of everything I'm trying to accomplish.
Can I somehow prevent the Android OS from killing my surface holder, is there some sort of custom view I can use to get this to work? Is there some setting in the Manifest that can help me out here (I doubt it as I have thoroughly tested most of the flags)? I know this is possible because Angry Birds does this perfectly on the Android OS.
I am almost done creating a game for Android using a port of the Irrlicht 3D engine to Android.
All code, except a minimal framework to make the native calls and play sounds, is written in C++.
Even the OpenGL ES display is created in C++ code, using eglGetDisplay and eglCreateWindowSurface.
The problem I need to solve is that when Home is pressed and the game is then relaunched, the screen is all white.
From other answers I have found that the OpenGL context is lost and then recreated when onSurfaceCreated is called. I thought I could just reload the textures, but that seems to work for only some of them. Also, the background color changes, which is not a resource.
It seems I would have to completely restart the game, but that would be really annoying to the user.
The Quake 3 port has notes about this problem but no solution.
Is there an example anywhere of a game written in native code which correctly handles this situation?
The way I handled the situation is to recreate everything. I made sure that all generated objects, like textures and buffers, were deleted before recreating everything as if it were happening for the first time.
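A rough sketch of that pattern in Java/GLES20 terms (the engine above is native C++, but the same GL calls exist in the NDK; the load/rebuild helpers are placeholders):

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

public class GameRenderer implements GLSurfaceView.Renderer {
    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        // The old context (and every texture/buffer name in it) is already gone,
        // so forget the stale handles and rebuild everything from scratch.
        forgetStaleHandles();
        loadAllTextures();        // decode images again, glGenTextures/glTexImage2D
        rebuildVertexBuffers();   // glGenBuffers/glBufferData again
        // Per-context state like the clear color is also reset with the context,
        // so set it again here (this is why the background color "changes").
        GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1f);
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        // ... draw the scene ...
    }

    private void forgetStaleHandles()   { /* reset cached texture/buffer ids */ }
    private void loadAllTextures()      { /* re-create and upload every texture */ }
    private void rebuildVertexBuffers() { /* re-create and fill every VBO */ }
}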