I have an app that I've been playing around with on Android; it uses OpenGL ES.
Currently I load textures from a bitmap like so:
//Load up and flip the texture - then dispose the temp
Bitmap temp = BitmapFactory.decodeResource(Deflecticon.getContext().getResources(), resourceID);
Bitmap bmp = Bitmap.createBitmap(temp, 0, 0, temp.getWidth(), temp.getHeight(), flip, true);
temp.recycle();
//Bind the texture in memory
gl.glBindTexture(GL10.GL_TEXTURE_2D, id);
//Set the parameters of the texture.
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
//On to the GPU
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bmp, 0);
The obvious issue is that the texture I'm using has to be a power of 2. At the moment I'm pre-editing the textures in Photoshop to be a power of 2, simply leaving empty borders. However, this is a little tedious and I want to be able to load them as they are: recognise that they aren't a power of 2 and load them into a texture that is.
I know I could scale the bitmap to a power-of-2 size and simply stretch the texture, but I don't wish to stretch it, and in some cases I may want to put several textures into one "atlas".
I know I can use glTexSubImage2D() to paste into the texture the data I want at the origin I want. This is great!
However, I don't know how to initialise a texture with no data in Android.
In a previously asked question, the suggestion was to call glTexImage2D() with no data and then fill it in.
However, in Android, when you call GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bmp, 0) you don't specify a width/height; I assume it reads those from the bitmap.
What is the best way to do this? Can I create a new blank bitmap of the right power-of-2 size, not filled with any data, use it to initialise the texture, and then paste into it using texSubImage2D? Or should I make a new bitmap, somehow copy the pixels I want into it (I'm not sure whether you can do this easily), leaving borders, and then just use that?
Edit: clarified that I'm using OpenGL.
I think if you create a bitmap with power-of-2 dimensions and then add your bitmap to it, it should work just fine. Maybe something like
Bitmap.createBitmap(notPowerOf2Bitmap, offx, offy, xsize, ysize)
Other than that, I would say suffer through the Photoshop process. How many pictures have you got?
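Note that Bitmap.createBitmap(source, x, y, width, height) can only crop within the source bitmap, so it can't pad a smaller image up to a power of 2 on its own. A minimal sketch of the padding idea, assuming you draw the source into a larger bitmap with a Canvas (nextPowerOf2() is a hypothetical helper, shown below):
//Sketch: pad an arbitrary bitmap into a power-of-two bitmap via a Canvas
Bitmap padToPowerOf2(Bitmap src) {
    int w = nextPowerOf2(src.getWidth());
    int h = nextPowerOf2(src.getHeight());
    Bitmap padded = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(padded);
    //Original pixels land in the top-left corner; the rest stays transparent
    canvas.drawBitmap(src, 0, 0, null);
    return padded;
}
int nextPowerOf2(int n) {
    int p = 1;
    while (p < n) p <<= 1;
    return p;
}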
Non-power-of-two (NPOT) bitmaps are supported on some GLES platforms, but you have to check to see if the appropriate extension exists. Note, however, that at least on PowerVR SGX, even though NPOT is supported, there are still some other fairly arbitrary restrictions; for example, your texture width must be a multiple of 2 (if not a power of 2). Also, NPOT rendering tends to be a bit slower on many GPUs.
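If you want to rely on NPOT support, a quick sketch of the runtime check (the relevant extension string on GLES is GL_OES_texture_npot):
//Sketch: query the extension string before assuming NPOT textures work
String extensions = gl.glGetString(GL10.GL_EXTENSIONS);
boolean npotSupported = extensions != null && extensions.contains("GL_OES_texture_npot");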
One thing you can do is just create a wrapper texture which is a power-of-two size, use glTexSubImage2D to upload your image so it covers only part of that, and then adjust your texture coordinates accordingly. The obvious drawback is that you can't use texture wrapping in that circumstance. If you absolutely must support wrapping, you could just scale your texture to the nearest power-of-two size before you call glTexImage2D, although this usually introduces sampling artifacts and makes things blurry, especially if you're trying to do pixel-precise 2D work.
Another thing you might consider, if you don't need to support wrapping, is to make a "texture atlas", in which you condense all of your textures into a few big textures and have your polygons map to just some portions of the atlas(es). You have to be careful when generating mipmaps, but other than that it usually provides a pretty nice performance benefit, as well as making more efficient use of texture memory, since you're not wasting so much on padded or scaled images.
I have two solutions I have employed for this problem. I can be more specific if necessary, but conceptually you can:
1. Make the image a power of 2, fill the section to crop with 100% transparent alpha, and load the images with alpha enabled.
2. Tweak your texture coordinate buffer so it doesn't sample that section. So instead of using the default
float texture[] = {
0.0f, 1.0f, //
1.0f, 1.0f, //
0.0f, 0.0f, //
1.0f, 0.0f, //
};
as the coordinate array (obviously this is for mapping an image onto a square made of two triangles), scale back by the ratio of the area to crop, e.g.
float texture[] = {
0.0f, 0.75f, //
0.9f, 0.75f, //
0.0f, 0.0f, //
0.9f, 0.0f, //
};
Of course, be precise with your math or the unwanted bit may bleed in, or you'll cut out some of the real image. Obviously this array is calculated on the fly rather than hard-coded as I have demonstrated here.
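A sketch of that on-the-fly calculation, assuming usedWidth/usedHeight are the real image dimensions and texWidth/texHeight the padded power-of-two dimensions (all names here are illustrative):
//Sketch: derive the cropped texture coordinates from the real vs. padded sizes
float uMax = (float) usedWidth / texWidth;   //e.g. 0.9f for a 461px-wide image in a 512px texture
float vMax = (float) usedHeight / texHeight; //e.g. 0.75f for a 384px-tall image in a 512px texture
float texture[] = {
    0.0f, vMax, //
    uMax, vMax, //
    0.0f, 0.0f, //
    uMax, 0.0f, //
};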
Uh, why don't you create two bitmaps? Load the first one as you're doing, then use Bitmap.createScaledBitmap() to turn that bitmap into a power of two. Performance-wise I don't know if it's the fastest method possible, but it works.
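A minimal sketch of that, again assuming a hypothetical nextPowerOf2() helper like the one above (note this stretches the image, unlike the padding approach):
//Sketch: scale a bitmap up to the next power-of-two size
Bitmap scaled = Bitmap.createScaledBitmap(
    original,
    nextPowerOf2(original.getWidth()),
    nextPowerOf2(original.getHeight()),
    true); //true = use bilinear filtering when scaling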
You can use GLES20.glTexImage2D() to create an empty texture with a specified width and height. Example code:
public static int genTexture(int texWidth, int texHeight) {
    //Generate a texture ID
    int[] textureIds = new int[1];
    GLES20.glGenTextures(1, textureIds, 0);
    assertNoError();
    int textureId = textureIds[0];
    //Round the requested size up to a power of 2
    texWidth = Utils.nextPowerOf2(texWidth);
    texHeight = Utils.nextPowerOf2(texHeight);
    //Bind and configure the texture
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
    GLES20.glTexParameteri(
            GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(
            GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    //Allocate storage without uploading any pixel data (the data argument is null)
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
            texWidth, texHeight, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    return textureId;
}
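To answer the original question, once the empty power-of-two texture exists you can paste the non-power-of-two bitmap into a corner of it; a sketch, assuming the genTexture() above and a decoded Bitmap bmp:
//Sketch: upload an NPOT bitmap into a sub-region of the empty texture
int textureId = genTexture(bmp.getWidth(), bmp.getHeight());
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLUtils.texSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, bmp); //level 0, offset (0,0)
Remember to shrink your texture coordinates by width/texWidth and height/texHeight so you only sample the region you filled.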
I would like to create an interactive 2D effect that I can put on anything. So I want to take a mostly transparent effect, render it to a texture, and put it wherever I want simply by putting it on a square.
The problem I encountered is that I can't get rid of the background color. When I put the effect over an object, the background color of the effect blocks out the object that I want to put the effect over.
Here is my code. Can anybody tell me what I'm missing?
Drawing:
GLES20.glClearColor(0.6f, 0.34f, 0.14f, 0.0f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
ball.draw(); //This is the object, a simple square drawn with 2 triangles
particleSystem.renderToTexture(); //effect rendering to texture and drawing it
The render-to-texture code:
public void renderToTexture(){
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[0]);
    GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, renderTex[0], 0);
    GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, depthRb[0]);
    int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
    drawRender(); //draws the effect into the FBO, so the result ends up in renderTex[0]
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, renderTex[0]);
    drawEffect(); //draws a square with the effect texture on it
}
The problem should be somewhere in here, but I have no idea what to do. I tried everything I could think of. I even tried to dispose of the background color in the shader.
The posted code looks perfectly fine. You will need to make sure that the texture you use for the color render target (renderTex[0]) has an alpha component.
Note that the number of texture formats in ES 2.0 that are guaranteed to be color-renderable is very limited. The only two with an alpha component (see table 4.5 in the spec) are GL_RGBA4 and GL_RGB5_A1. Most notably, this does not include GL_RGBA with 8 bits per component.
So for defining a color-renderable texture with alpha component that is guaranteed to work across all ES 2.0 implementations, you will have to use GL_RGBA for the internal format, and GL_UNSIGNED_SHORT_4_4_4_4 or GL_UNSIGNED_SHORT_5_5_5_1 for the type argument of glTexImage2D().
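For example, the allocation might look like this sketch (4 bits per channel, so expect some banding; width and height are assumed to be defined elsewhere):
//Sketch: a color-renderable RGBA texture guaranteed by the base ES 2.0 spec
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
    width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_SHORT_4_4_4_4, null);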
Most common devices (at least all the ones I have seen) do support the OES_rgb8_rgba8 extension, which adds support for rendering to 8 bit component textures. But if you want to be completely portable, you should check for the presence of this extension before using render targets with those formats.
I know this question is old, but I also encountered it just now, and the solution for me was to do something with the GLSurfaceView. Just add these lines after setting the context:
glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 0, 0);
glSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
I was already using GL_RGBA format and was really annoyed that nothing else was working, but this worked like magic.
I'm developing a game for android and this is my first experience with OpenGL.
When the application loads, I create my vertex and texture buffers, and I load images from drawable resources, using GLUtils.texImage2D to upload each image into my texture array.
I was wondering if glBindTexture() was the correct function to use when changing the texture to produce animation.
public void onDraw(GL10 gl){
    sprite.animate();
    gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[sprite.frameNumber]);
    sprite.draw(gl);
}
Code Explanation
sprite.animate() - changes the frame number depending on System.uptimeMillis()
sprite.draw() - does the actual drawing:
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, vertices.length / 3);
//Disable the client state before leaving
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
The function does work, but I wanted to confirm it is the correct one to use, or whether there is an alternative way to do this.
Binding a different texture to animate is one way to do what you want.
A more popular way of doing this is to have all your animation frames in one big texture (pack all the individual frames into a huge rectangle): to draw a different frame, just change the texture coordinates.
For example, pack four frames of animation in a big 2x2 square
1|2
3|4
Then you'll use for texture coordinates (0,0) (0.5,0) (0.5,0.5) (0,0.5) to display frame 1, and the rest should be obvious.
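A sketch of that lookup for an n x n atlas, assuming 0-based frame indices numbered row by row from the top-left, following the convention above where v = 0 is the top of the uploaded bitmap:
//Sketch: texture coordinates for one frame of an n x n atlas
float step = 1.0f / n;          //size of one frame in texture space
float u0 = (frame % n) * step;  //left edge of this frame
float v0 = (frame / n) * step;  //top edge of this frame
float[] texCoords = {
    u0,        v0,        //top left
    u0 + step, v0,        //top right
    u0 + step, v0 + step, //bottom right
    u0,        v0 + step, //bottom left
};
For frame 0 and n = 2 this yields (0,0) (0.5,0) (0.5,0.5) (0,0.5), matching the example above.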
I've tried to follow all the information I could find, but I am not having any luck finding the source of my texture problems and could really use a hand.
The following is a piece of code in which I'm trying to draw 3 pieces of my background using glDrawTexfOES. The 3 pieces should look like green grass.
public void onDrawFrame(GL10 gl) {
    gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
    gl.glLoadIdentity();
    gl.glFrontFace(GL10.GL_CW);
    gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    gl.glEnable(GL10.GL_TEXTURE_2D);
    gl.glBindTexture(GL10.GL_TEXTURE_2D, grass);
    gl.glColor4x(0x10000, 0x00000, 0x10000, 0x10000);
    ((GL11Ext) gl).glDrawTexfOES(0.0f, 0.0f, -8.0f, 32, 32);
    ((GL11Ext) gl).glDrawTexfOES(32.0f, 0.0f, -8.0f, 32, 32);
    ((GL11Ext) gl).glDrawTexfOES(64.0f, 0.0f, -8.0f, 32, 32);
}
Instead of green grass however, I'm getting 3 brown squares:
http://img109.imageshack.us/img109/9670/84615249.jpg
Any help in figuring out why my textures won't display correctly would be most appreciated!
On a related note, for building a simple 2D tiled game is glDrawTexfOES the most efficient method for generating the tiled background?
Thanks in advance,
Harry
You need to define the mapping of the image to the rectangle; you are probably just getting the top-left pixel of the image.
int[] crop = {0, 0, text_dimx, text_dimy};
((GL11) gl).glTexParameteriv(GL10.GL_TEXTURE_2D, GL11Ext.GL_TEXTURE_CROP_RECT_OES, crop, 0);
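One hedged caveat: since bitmaps decode top-down, the drawn image often comes out upside down with that rectangle; a common idiom worth trying in that case is a negative height:
//Sketch: crop rect that flips the image vertically while drawing
int[] cropFlipped = {0, text_dimy, text_dimx, -text_dimy};
((GL11) gl).glTexParameteriv(GL10.GL_TEXTURE_2D, GL11Ext.GL_TEXTURE_CROP_RECT_OES, cropFlipped, 0);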
If you Google most OpenGL functions, you get documentation for the function followed by examples. For that function, the results are dominated by people having trouble.
That function might be slightly faster than using textures the usual way (I haven't used it, so I don't know), but simply applying a texture to a quad needs no extensions and is extremely portable.
Why don't you try that and get back to us? It could be that the function is working perfectly but something went wrong initializing the texture. Using the uncomplicated texture-a-quad method will help you find that out as well.
I am writing a small app that at the moment generates a random map of textures.
I am drawing this map as a 10 x 15 group of "quads", which are in fact all triangle strips. I use the "map" to grab an int, which I then take as the location of the texture for this square in the texture atlas. So, for example, 0 is the bottom-left "tile". The atlas is 128 x 128 and split into 32-pixel tiles.
However, I seem to be getting some odd artifacts where the texture from one tile creeps into the next tile. I wondered if it was the image itself, but as far as I can tell the pixels are exactly where they should be. I then looked at the texture coords I was specifying, but they all look exact (0.0, 0.25, 0.5, 0.75, 1.0, splitting the atlas into the 4 rows and columns I would expect).
The odd thing is if I run it on the emulator I do not get any artifacts.
Is there a setting I am missing that would cause bleeding of 1 pixel? It seems to be vertical only, too; this could be related to the fact that on the phone I am "stretching" the image in that direction, as the phone's screen is larger than normal in that direction.
I load the texture like so:
//Get a new ID
int id = newTextureID(gl);
//We will need to flip the texture vertically
Matrix flip = new Matrix();
flip.postScale(1f, -1f);
//Load up and flip the texture
Bitmap temp = BitmapFactory.decodeResource(context.getResources(), resource);
//Store the widths for the texturemap
int width = temp.getWidth();
int height = temp.getHeight();
Bitmap bmp = Bitmap.createBitmap(temp, 0, 0, width, height, flip, true);
temp.recycle();
//Bind
gl.glBindTexture(GL10.GL_TEXTURE_2D, id);
//Set params
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
//Push onto the GPU
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bmp, 0);
TextureAtlas atlas = new TextureAtlas(id, width, height, tileSize);
return atlas;
I then render it like so:
gl.glBindTexture(GL10.GL_TEXTURE_2D, currentAtlas.textureID);
//Enable the vertices buffer for writing and to be used during our rendering
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
//Specify the location and data format of an array of vertex coordinates to use
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
//Enable the texture buffer
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
gl.glDrawElements(GL10.GL_TRIANGLES, indices.length, GL10.GL_UNSIGNED_SHORT, indexBuffer);
I would take a picture, but I am unsure how to get a screen cap from the phone...
If anyone knows of how I can capture the current frame and perhaps put it out into a file I will do that if it helps explain what is going on!
Look forward to your response.
Edit: Here is a screencap; note I run the app in landscape but the cap is in portrait. Also excuse the horrible textures :D they were merely a placeholder / messing around.
ScreenCap
Well, after speaking to a friend I managed to solve this little problem.
It turns out that if you want exact, pixel-perfect textures, you have to specify the edge of the texture to be halfway into the pixel.
To do this I simply added or subtracted half a pixel to/from the measurement for the texture coords.
Like so:
//Top Left
textureCoords[textPlace] = xAdjust*currentAtlas.texSpaceWidth + currentAtlas.halfPixelAdjust; textPlace++;
textureCoords[textPlace] = (yAdjust+1)*currentAtlas.texSpaceHeight - currentAtlas.halfPixelAdjust; textPlace++;
This was simply calculated when loading the texture atlas:
(float)(0.5 * ((1.0f / numberOfTilesInAtlasRow) / pixelsPerTile));
Although if the height of each tile is different from the width (which could happen), you would need to calculate the two adjustments individually, as sketched below.
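A sketch of the per-axis version; the names are taken from the snippets above and are otherwise illustrative:
//Sketch: half-pixel insets computed per axis, applied to tile (xAdjust, yAdjust)
float halfPixelU = 0.5f * texSpaceWidth / pixelsPerTileX;  //half a pixel in u
float halfPixelV = 0.5f * texSpaceHeight / pixelsPerTileY; //half a pixel in v
float u0 = xAdjust * texSpaceWidth + halfPixelU;           //left edge
float u1 = (xAdjust + 1) * texSpaceWidth - halfPixelU;     //right edge
float v0 = yAdjust * texSpaceHeight + halfPixelV;          //bottom edge
float v1 = (yAdjust + 1) * texSpaceHeight - halfPixelV;    //top edge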
This has solved all the artifacts so I can continue on. Hope it helps someone else!
I'm successfully generating my textures using GLUtils.texImage2D,
but when I use the generated textures I get problems with my alpha: it is darker than wanted.
After having checked several things, I finally came to the conclusion that the problem comes from GLUtils.texImage2D(GL10.GL_TEXTURE_2D, level, bmp, 0).
I created a second function that uses gl.glTexImage2D(GL10.GL_TEXTURE_2D, level, GL10.GL_RGBA, width, height, 0, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, pixels2);
but it is costly in processing to create pixels2, which is a ByteBuffer into which I have to recopy the bytes while converting the values from the bitmap's ARGB to the texture's RGBA.
Has anybody noticed that? And if so, how did you solve it?
jason
Thank you for your answer.
I'm already using
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
and I'm getting this problem
My problem is that the alpha generated by GLUtils isn't the one of the texture; it's darker.
The difference is like looking at a color in the sun and in the shade (if that makes any sense).
I already tried gl.glTexImage2D, but creating the buffer takes too long, unless there is a tool to convert a bitmap to a byte buffer that I don't know of...
GLUtils.texImage2D generates a premultiplied-alpha texture: every pixel's RGB is already multiplied by its alpha value, so you don't need to multiply by alpha once again in the blend function. Try
gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
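To spell out why this works: with premultiplied alpha the blend result = srcRGB * 1 + dstRGB * (1 - srcAlpha) is equivalent to the usual result = srcRGB * srcAlpha + dstRGB * (1 - srcAlpha) for a straight-alpha texture, because srcRGB already carries the alpha factor. Keeping GL_SRC_ALPHA as the source factor multiplies by alpha twice, which is exactly the darkening described above.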
The alpha channel is the poor, mistreated stepchild of a huge number of programmers, is all I can say... but the upload works fairly efficiently if you do this:
Estimate your largest texture (like 1024x1024) and create an int array of that size (1024*1024) that you store in a static variable, or somewhere you can reach it, so that you don't need to recreate that array (allocation time is precious here).
Then do this:
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGBA, width, height,
0, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, IntBuffer.wrap(pixels));
I am sorry not to have found a different solution... The original problem is that the implementor of the GLUtils.texImage2D function mistreated the alpha channel somehow, resulting in black borders when the image is displayed larger than it is: the bilinear filter calculates the color between four pixels, and when the RGB values of pixels with transparent alpha have been mangled (like set to black), the result is a kind of "color bleeding" across the transparent border that forms there. Very ugly. Maybe it was done to improve the compression ratio, as the RGB values in alpha-transparent areas of PSDs contain a lot of junk that, when eliminated, yields a lot of room for improvement for compression algorithms.
Edit: Sadly, this approach only worked correctly for grey images, as the red and blue channels are swapped when fetching the pixels from the bitmap. At least on MY device. I am not sure how it behaves on other devices, but in my case, this here did the trick:
for (int i = 0; i < pixels.length; i++) {
    int argb = pixels[i];
    //Keep alpha and green, swap the red and blue bytes (ARGB -> ABGR)
    pixels[i] = argb&0xff00ff00 | ((argb&0xff)<<16) | ((argb>>16)&0xff);
}
The solution is found here. It is related, as stated by others, to premultiplied alpha.
In the surfaceView Constructor
setEGLConfigChooser(8, 8, 8, 8, 0, 0);
getHolder().setFormat(PixelFormat.RGBA_8888);
In the View.Renderer onSurfaceCreated
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_ONE, GL10.GL_ONE_MINUS_SRC_ALPHA);
Android's Bitmap stores images loaded from PNG with premultiplied colors. GLUtils.texImage2D also uses colors premultiplied by alpha, so you can't get the original colors this way.
In order to load PNG images without the RGB channels being premultiplied, I use the third-party PNGDecoder and load the texture with glTexImage2D. You can get the PNGDecoder library to decode PNG from here: http://twl.l33tlabs.org/#downloads
There is an issue in GLUtils with premultiplied alpha. The only workaround I can propose is to use:
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
In case you need other blend functions you will have to use gl.glTexImage2D.
Android's BitmapFactory.decode() premultiplies alpha by default on loading.
So if you don't want to load premultiplied bitmaps, use BitmapFactory.Options with inPremultiplied set to false when loading the bitmap for the texture:
//kotlin
val options = BitmapFactory.Options()
options.inPremultiplied = false
val bitmap = BitmapFactory.decodeStream(inputStream, null, options)
Then pass this bitmap to GLUtils.texImage2D
P.S.
Nice video for understanding premultiplied alpha:
https://www.youtube.com/watch?v=wVkLeaWQOlQ