Android OpenGL ES 1.0: issue displaying two textured planes - android

I have an interesting problem with displaying two (or more) quads at the same time, while displaying one works fine.
I was able to implement this popular tutorial for displaying a simple quad with texture:
http://www.jayway.com/2010/02/15/opengl-es-tutorial-for-android-part-v/
So basically I have a class named SimplePlane that extends the Mesh class, exactly as in the tutorial.
I create two instances of SimplePlane:
public void onSurfaceCreated(...) {
    plane1 = new SimplePlane(1, 1);
    plane2 = new SimplePlane(1, 1);
    plane1.z = 2.0f;
    plane2.z = 3.0f;
}
and then I draw the meshes:
public void draw(GL10 gl) {
    // set all gl variables as usual for OpenGL
    plane1.draw(gl); // is displayed properly
    plane2.draw(gl); // for some reason is not visible, even though it is behind plane1 and bigger, so plane1 cannot be covering it
}
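(For context, the full frame looks roughly like the sketch below; the state calls and the camera translate are placeholders rather than my exact code.)
public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glLoadIdentity();
    gl.glTranslatef(0, 0, -4); // move the scene away from the camera (value is an example)

    gl.glPushMatrix(); // isolate any matrix changes plane1.draw() makes
    plane1.draw(gl);
    gl.glPopMatrix();

    gl.glPushMatrix(); // isolate any matrix changes plane2.draw() makes
    plane2.draw(gl);
    gl.glPopMatrix();
}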
The problem is that only plane1 is displayed.
If I place plane2 first in my code, then plane2 is displayed and plane1 is not.
Mind you, it is not a z-position issue; I ruled that out by making one plane larger and semi-transparent. And if I comment one out, then the other is visible.
I added logging, and both planes' draw methods are being called, but only one is visible.
Am I allowed to take this approach of calling one draw(gl) after another, or do I have to create a group object as in the tutorial?

Related

LibGDX. Android. Black background on transparent DecalSprites

In my App I use several DecalSprites as part of my scene. They all have transparency (PNG textures). When they overlap, some of them show a black background instead of transparency. Those DecalSprites have different Z-coordinates, so they should look like one behind another.
Please note also the line on the border of a texture. This is also something that I'm struggling to remove.
Update 1: I use a PerspectiveCamera in the scene, but all the decals are positioned to face the camera, as in 2D mode. This "black" background appears only in certain cases, e.g. when the camera goes right (and all those decals end up on the left of the scene). I also use the CameraGroupStrategy.
Solved! The reason was that CameraGroupStrategy, when ordering Decals (from farthest to closest to the camera), takes the "combined" vector distance between the camera and the Decal. When my camera panned to the left or right, the distance to the Z-farthest Decal became LESS than that to the Z-closer Decal. This produced the artifact. Fix:
GroupStrategy strategy = new CameraGroupStrategy(cam, new ZStrategyComparator());
And the Comparator:
private class ZStrategyComparator implements Comparator<Decal> {
    @Override
    public int compare(Decal o1, Decal o2) {
        float dist1 = cam.position.dst(0, 0, o1.getPosition().z);
        float dist2 = cam.position.dst(0, 0, o2.getPosition().z);
        return (int) Math.signum(dist2 - dist1);
    }
}
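For completeness, a sketch of wiring the strategy into the DecalBatch (decal1/decal2 stand in for your actual decals):
DecalBatch batch = new DecalBatch(strategy);
// each frame:
batch.add(decal1);
batch.add(decal2);
batch.flush(); // sorts with the custom comparator, then renders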
Thanks to all the guys who tried to help, especially Xoppa, who sent me in the right direction in the libGDX IRC.

OpenGL ES2.0 - After restarting the App, everything looks.. weird

EDIT:
More debugging led me to the fact that glGetAttribLocation returns -1, except on the first start of the Application. The program ID is valid (I guess?); it was 12 in my testing just now. I also tried to retrieve the attribute locations right before drawing again, but that did not work either.
My shader "architecture" now looks like this:
I've turned the shader into a singleton. I.e. only one instance. Using it:
public void useProgram() {
    GLES20.glUseProgram(iProgram);
    getUniformLocations();
    getAttributeLocations();
}
I.e. the program is sent to OpenGL; afterwards I retrieve the uniform and attribute locations of all my variables, which are stored in a HashMap (one for each shader):
@Override
protected void getAttributeLocations() {
    addToGLMap(A_NORMAL, GLES20.glGetAttribLocation(iProgram, A_NORMAL));
    addToGLMap(A_POSITION, GLES20.glGetAttribLocation(iProgram, A_POSITION));
    addToGLMap(A_COLOR, GLES20.glGetAttribLocation(iProgram, A_COLOR));
}
I don't understand why the program's ID is, for example, 12, yet all the attribute locations are non-existent in the second and following runs of my Application...
In my Application I am loading a Wavefront object, and I am also drawing several lines and cubes, just to try things out. After starting the Application "clean", i.e. after rebooting or installing it, everything looks as intended. But if I close the Application and re-open it, it looks weird; a screenshot is at the bottom.
What I'm currently doing:
onSurfaceCreated:
Taking care of culling, clear color, etc.
Clear all loaded objects (just for testing, will of course not delete memory in later phase).
Reload objects (threaded).
My objects are stored like this:
public class WavefrontObject {
    private FloatBuffer mPositionBuffer = null;
    private FloatBuffer mColorBuffer = null;
    private FloatBuffer mNormalBuffer = null;
    private ShortBuffer mIndexBuffer = null;
}
Buffers are filled upon creation of the element.
They are drawn:
mColorBuffer.position(0);
mNormalBuffer.position(0);
mIndexBuffer.position(0);
mPositionBuffer.position(0);
GLES20.glVertexAttribPointer(mShader.getGLLocation(BaseShader.A_POSITION), 3, GLES20.GL_FLOAT, false, 0, mPositionBuffer);
GLES20.glEnableVertexAttribArray(mShader.getGLLocation(BaseShader.A_POSITION));
// etc...
GLES20.glDrawElements(GLES20.GL_TRIANGLES, mIndexBuffer.capacity(), GLES20.GL_UNSIGNED_SHORT, mIndexBuffer);
Do I need to disable the VertexAttribArrays after drawing with them? I am currently overwriting the buffers in each drawing loop, but could they be interacting with other models being drawn?
The model I am loading displays a small toy-plane. After restarting the Application, it looks like this (loading the object, all colors are set to white (for testing)):
So to me it looks like the buffers have left-over data in them? What's the "best practice" for using these buffers? Disabling the arrays? Does OpenGL ES 2.0 offer some sort of "clear buffer" method that I can use before putting my values in?
What was expected to be drawn: at the point where the "weird triangles" and colors originate, there should be the plane model, all in white.
When your application is sent to the background or restarted, its OpenGL context is destroyed.
So all objects (programs and their uniform/attribute handles, etc.) are invalidated.
On reopening you have to clear/re-initialize all singleton objects like yours...
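In practice that means rebuilding the program in onSurfaceCreated() on every start, not just the first. A minimal sketch (compileShader(), clearGLMap() and the shader sources are assumed helpers, not the asker's actual code):
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    // The old program id (e.g. 12) belonged to the destroyed context,
    // so compile and link a fresh program here, every time.
    iProgram = GLES20.glCreateProgram();
    GLES20.glAttachShader(iProgram, compileShader(GLES20.GL_VERTEX_SHADER, vertexSource));
    GLES20.glAttachShader(iProgram, compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource));
    GLES20.glLinkProgram(iProgram);
    clearGLMap();  // drop the stale uniform/attribute locations
    useProgram();  // re-query them against the new program
}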

Andengine swipe navigation

I am new to AndEngine and want to know how I can switch between two BaseGameActivities, and also how to avoid the black screen transition that appears when switching from the first activity to the second. Is there any possible way to do this?
Please help me out.
A BaseGameActivity can be used as any other Android Activity, too:
Intent intent = new Intent(this, MyOtherBaseGameActivity.class);
startActivity(intent);
So if you want to change from your program to another app (maybe by opening the browser…), you can do that as with any other Android App. However, if both Activities are part of your own App, there is rarely a case where this is recommendable (it is like starting a second program), although it is possible to exchange data between the activities as described in this post.
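For reference, data is passed via Intent extras (the key name here is just an example):
Intent intent = new Intent(this, MyOtherBaseGameActivity.class);
intent.putExtra("level", 3); // "level" is an example key
startActivity(intent);

// ...and in MyOtherBaseGameActivity:
int level = getIntent().getIntExtra("level", 0); // 0 is the default value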
But maybe you are only looking for a way to switch between Views in AndEngine. If that's the case, you can switch between Scenes without any transition being necessary:
MyOtherScene secondScene = new MyOtherScene();
mEngine.setScene(secondScene);
That way you can switch what is being displayed without needing to load every image again.
EDIT:
Since you can't use AndEngine to switch between Activities, and a smooth switch between scenes isn't possible either, here is a quick example of how to switch between two screens (e.g. menus). In this example the screens are actually 2 different images (as big as the display … maybe some background images). Note: there is no such thing as a 'screen' in AndEngine; it is simply a self-made class that extends Entity.
Your Screen
public class MyScreen extends Entity {
    private Sprite niceBackgroundImage1;
    private float firstX;

    public MyScreen(Sprite niceBackgroundImage1, Sprite niceBackgroundImage2) {
        this.attachChild(niceBackgroundImage1); // attach something to the screen so you can see it (preferably an image as big as your camera)
        this.attachChild(niceBackgroundImage2);
        this.niceBackgroundImage1 = niceBackgroundImage1; // keep a reference so the image can be moved later
        this.firstX = -1; // this variable remembers the first x coordinate touched
    }

    @Override
    public boolean onAreaTouched(TouchEvent sceneTouchEvent, float touchAreaLocalX, float touchAreaLocalY) {
        if (sceneTouchEvent.getAction() == TouchEvent.ACTION_DOWN) {
            this.firstX = touchAreaLocalX; // remember the x of the first touch
        }
        if (sceneTouchEvent.getAction() == TouchEvent.ACTION_MOVE) {
            if (touchAreaLocalX > firstX + 20) {
                // user swiped from left to right (at least 20 pixels):
                // move niceBackgroundImage1 out of the screen (in 3 seconds)
                // and let it bounce a little before it is completely out
                niceBackgroundImage1.registerEntityModifier(new MoveModifier(3f, 0, niceBackgroundImage1.getWidth(), 0, 0, EaseBounceOut.getInstance()));
                return true;
            }
        }
        return false;
    }
    ...
}
Your Activity
private HUD hud; // the HUD
private MyScreen myScreen;
...
@Override
protected void onCreateResources() {
    this.hud = new HUD(); // init the HUD
    this.myScreen = new MyScreen(image1, image2); // init the screen
    this.hud.attachChild(myScreen); // attach the screen to the HUD
    mEngine.getCamera().setHUD(hud); // attach the HUD to the camera
}

@Override
protected Scene onCreateScene() {
    Scene myScene = new Scene();
    myScene.registerTouchArea(myScreen); // register the touch area of your Screen class with the scene
    return myScene;
}
And this is how it works:
You create your own screen class that extends Entity. An Entity can be anything visible in AndEngine (like a Sprite, a Rectangle or even a whole Scene). Put something into your screen class to make it look nice, preferably a big image that fills the whole display. That image will be responsible for registering the touch afterwards. If the image is too small and the user misses it, no touch will be registered.
In this case I attach the instance of MyScreen to the camera's HUD. That way it will be at a fixed position on the display and have a fixed size (just in case you want to make the scene scrollable or zoomable).
Now when the app starts, the HUD is created and attached to the camera, and with it your MyScreen class. Then the scene is created and the screen's area is registered as a touch area of the scene. When the screen class notices a swipe along the horizontal axis, the first image moves off the screen (in the same direction as the swipe).
But be careful: this is just an example. Nothing is defined about how the touch should behave once the first image has been moved off the screen, how big the screen actually is, etc.
I know this is quite a long example; maybe it won't even work the first time, and it is definitely not the only way switching between different screens can be done. But it shows you how to override the onAreaTouched() method and register an entity modifier to make the image move. Hopefully it leads you in the right direction to accomplish what you want.

Andengine sprites strange behaviour

I have created a little screen manager (to handle multiple scenes), where every class extends a custom class called Screen and does the following (for example) in its load method:
public Scene load() {
    BitmapTextureAtlas mBitmapTextureAtlas = new BitmapTextureAtlas(512, 1024, TextureOptions.BILINEAR_PREMULTIPLYALPHA);
    SceneManager.loadTexture(mBitmapTextureAtlas);
    // ... (bgSprite is created from a region of this atlas; code omitted here) ...
    scene.attachChild(bgSprite);
    return scene;
}
The problem is that sometimes, if I move quickly among screens, some sprites are not rendered and sometimes they are (it depends on how fast I switch between scenes).
I guess the problem might be that I'm attaching the sprites to the scene while they have not yet been fully loaded into memory. Could that be it? Any idea how to solve this problem?
Yes, this happens if you move across scenes quickly, so you can set boolean flags for the sprites: if a flag is true, then perform the operations. This is especially useful when performing collision detection.
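A minimal sketch of that flag idea (the field name and where it is set are assumptions; the point is to touch the sprite only once its texture is known to be loaded):
private volatile boolean bgLoaded = false;

public Scene load() {
    BitmapTextureAtlas atlas = new BitmapTextureAtlas(512, 1024, TextureOptions.BILINEAR_PREMULTIPLYALPHA);
    SceneManager.loadTexture(atlas);
    bgLoaded = true; // the texture is in memory from here on
    // ...
}

// elsewhere, e.g. in an update handler or a collision check:
if (bgLoaded) {
    scene.attachChild(bgSprite); // safe to use the sprite now
}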

Preventing "flickering" when calling Drawable.draw()

I have a little experimentation app (essentially a very cut-down version of the LunarLander demo in the Android SDK), with a single SurfaceView. I have a Drawable "sprite" which I periodically draw into the SurfaceView's Canvas object in different locations, without attempting to erase the previous image. Thus:
private class MyThread extends Thread {
    SurfaceHolder holder; // initialised in ctor (acquired via getHolder())
    Drawable sprite;      // initialised in ctor
    Rect bounds;          // initialised in ctor
    ...

    @Override
    public void run() {
        while (true) {
            Canvas c = holder.lockCanvas();
            synchronized (bounds) {
                sprite.setBounds(bounds);
            }
            sprite.draw(c);
            holder.unlockCanvasAndPost(c);
        }
    }

    /**
     * Periodically called from the activity thread.
     */
    public void updatePos(int dx, int dy) {
        synchronized (bounds) {
            bounds.offset(dx, dy);
        }
    }
}
Running in the emulator, what I'm seeing is that after a few updates have occurred, several old "copies" of the image begin to flicker, i.e. appear and disappear. I initially assumed that perhaps I was misunderstanding the semantics of a Canvas, and that it somehow maintains "layers" that I was thrashing to death. However, I then discovered that I only get this effect if I try to update faster than roughly every 200 ms. So my next best theory is that this is an artifact of the emulator not being able to keep up and tearing the display. (I don't have a physical device to test on yet.)
Is either of these theories correct?
Note: I don't actually want to do this in practice (i.e. draw hundreds of overlaid copies of the same thing). However, I would like to understand why this is happening.
Environment:
Eclipse 3.6.1 (Helios) on Windows 7
JDK 6
Android SDK Tools r9
App is targeting Android 2.3.1
Tangential question:
My run() method is essentially a stripped-down version of how the LunarLander example works (with all the excess logic removed). I don't quite understand why this isn't going to saturate the CPU, as there seems to be nothing to prevent it running at full pelt. Can anyone clarify this?
Ok, I've butchered Lunar Lander in a similar way to you, and having seen the flickering I can tell you that what you are seeing is a simple artefact of the double-buffering mechanism that every Surface has.
When you draw anything on a Canvas attached to a Surface, you are drawing to the 'back' buffer (the invisible one). And when you unlockCanvasAndPost() you are swapping the buffers over... what you drew suddenly becomes visible as the "back" buffer becomes the "front", and vice versa. And so your next frame of drawing is done to the old "front" buffer...
The point is that you always draw to separate buffers on alternate frames. I guess there's an implicit assumption in graphics architecture that you're always going to be writing every pixel.
Having understood this, I think the real question is why doesn't it flicker on hardware? Having worked on graphics drivers in years gone by, I can guess at the reasons but hesitate to speculate too far. Hopefully the above will be sufficient to satisfy your curiosity about this rendering artefact. :-)
You need to clear the previous position of the sprite as well as the new position. This is what the View system does automatically. However, if you use a Surface directly and do not redraw every pixel (either with an opaque color or using a SRC blending mode), you must clear the content of the buffer yourself. Note that you can pass a dirty rectangle to lockCanvas(), and it will do the union for you of the previous dirty rectangle and the one you are passing (this is the mechanism used by the UI toolkit). It will also set the clip rect of the Canvas to the union of these two rectangles.
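A sketch of that approach applied to the loop above (oldBounds is an assumed copy of the sprite's previous bounds):
Rect dirty = new Rect(oldBounds);
dirty.union(bounds);                           // old position + new position
Canvas c = holder.lockCanvas(dirty);           // clip is set to the union
c.drawColor(Color.BLACK, PorterDuff.Mode.SRC); // clear the dirty area
sprite.setBounds(bounds);
sprite.draw(c);
holder.unlockCanvasAndPost(c);
oldBounds.set(bounds);                         // remember for next frame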
As for your second question, unlockCanvasAndPost() will do a vsync, so you will never draw at more than ~60 fps (most devices I've seen have a display refresh rate set around 55 Hz).
