Draw in FrameBuffer - android

I'm working on a program that draws some 2D textures and manipulates them; it uses libgdx. I have a problem with FrameBuffer: I draw a texture into my FrameBuffer, and afterwards I need to save the resulting texture (or drawing) and use it in the same FrameBuffer one more time. I tried saving the texture via
Texture texture = mFrameBuffer.getColorBufferTexture()
and I also tried simply binding the texture from the FrameBuffer:
mFilterBuffer.getColorBufferTexture().bind();
The first iteration works fine, but when I try to use the FrameBuffer's own color buffer texture as the input texture, I get a completely black texture.
Code:
public void process(MySprite psObject, float startX, float startY,
                    float endX, float endY, int mWidth, int mHeight) {
    boolean frst = false;
    if (psObject.getFrameBuffer() == null) {
        psObject.setFrameBuffer(new FrameBuffer(Pixmap.Format.RGBA8888,
                psObject.getTexture().getWidth(), psObject.getTexture().getHeight(), true));
    }
    if (pSprite == null || pSprite != psObject) {
        mFrameBuffer = psObject.getFrameBuffer();
        frst = true;
        pSprite = psObject;
    }
    mFrameBuffer.begin();
    Gdx.gl.glViewport(0, 0, psObject.getTexture().getWidth(), psObject.getTexture().getHeight());
    Gdx.graphics.getGL20().glClearColor(0f, 0f, 0f, 1f);
    Gdx.graphics.getGL20().glClear(GL20.GL_COLOR_BUFFER_BIT);
    ShaderProgram shader = MyUtils.newInstance().getCurrentShader();
    if (!shader.isCompiled()) {
        Log.i("ERROR", "SHERROR " + shader.getLog());
    }
    if (shader != null) {
        if (frst) {
            psObject.getTexture().bind();
        } else {
            mFrameBuffer.getColorBufferTexture().bind();
        }
        shader.begin();
        Matrix4 matrix = new Matrix4();
        matrix.setToRotation(1, 0, 0, 180);
        matrix.scale(scaleSizeInFilterProcessor, scaleSizeInFilterProcessor, 1);
        shader.setUniformMatrix("u_worldView", matrix);
        shader.setUniformi("u_texture", 0);
        float[] start = new float[]{0f, 0f};
        float[] end = new float[]{1f, 1f};
        MyUtils.newInstance().getShaderData(shader, start, end, mWidth, mHeight);
        psObject.getMesh().render(shader, GL20.GL_TRIANGLES);
        shader.end();
    }
    mFrameBuffer.end();
}

Your code needs some refactoring ;). Anyway, you can't read from and write to the same FBO, if that's your question.
You'll need two FBOs (say A and B):
Draw the scene to A,
bind A's color texture,
draw the scene to B (now you can read from A).
Note that you can extend the libgdx FBO to have several textures attached to the same FBO.
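A minimal ping-pong sketch of that idea in libgdx (the names originalTexture, shader, mesh, width, height and passes are placeholders, not from the question's code):
FrameBuffer fboA = new FrameBuffer(Pixmap.Format.RGBA8888, width, height, false);
FrameBuffer fboB = new FrameBuffer(Pixmap.Format.RGBA8888, width, height, false);
FrameBuffer src = fboA;
FrameBuffer dst = fboB;
for (int pass = 0; pass < passes; pass++) {
    dst.begin();
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    // the first pass reads the original texture, later passes read what the previous pass wrote
    Texture input = (pass == 0) ? originalTexture : src.getColorBufferTexture();
    input.bind(0);
    shader.begin();
    shader.setUniformi("u_texture", 0);
    mesh.render(shader, GL20.GL_TRIANGLES);
    shader.end();
    dst.end();
    // swap so this pass's output becomes the next pass's input
    FrameBuffer tmp = src; src = dst; dst = tmp;
}
// after the loop, src.getColorBufferTexture() holds the latest result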

Related

Android ARCore render objects without Sceneform

I would like to render spheres, like in the image below, attached to anchors.
Unfortunately all examples are based on Sceneform, which I don't want to use. The spheres should float freely in the air without being bound to a flat surface.
With the Hello_AR example from Google I was able to render a 3D sphere into space and fix it in place by attaching it to an anchor.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    ...
    backgroundRenderer.createOnGlThread(this);
    virtualObject.createOnGlThread(this, "models/sphere.obj", "models/sphere.png");
    virtualObject.setMaterialProperties(0.0f, 0.0f, 0.0f, 0.0f);
    ...
}

@Override
public void onDrawFrame(GL10 gl) {
    ...
    // Get projection matrix.
    float[] projmtx = new float[16];
    camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);

    // Get camera matrix and draw.
    float[] viewmtx = new float[16];
    camera.getViewMatrix(viewmtx, 0);

    // Compute lighting from average intensity of the image.
    // The first three components are color scaling factors.
    // The last one is the average pixel intensity in gamma space.
    final float[] colorCorrectionRgba = new float[] {255f, 0, 0, 255f};
    frame.getLightEstimate().getColorCorrection(colorCorrectionRgba, 0);

    // Visualize anchors created by touch.
    float scaleFactor = 1.0f;
    for (Anchor anchor : anchors) {
        if (anchor.getTrackingState() != TrackingState.TRACKING) {
            continue;
        }
        anchor.getPose().toMatrix(anchorMatrix, 0);
        virtualObject.updateModelMatrix(anchorMatrix, scaleFactor);
        float[] objColor = new float[] {255f, 255f, 255f, 0};
        virtualObject.draw(viewmtx, projmtx, colorCorrectionRgba, objColor);
    }
}
With that I am able to create a black sphere 1 meter away from the camera in the air.
My questions:
Is this a good / correct way to do it?
How do I change the color of the sphere, since the color values have no effect on the object?
How do I make it transparent?
Thank you very much.
You need to attach it to an anchor; you don't need to use Sceneform. Sceneform is only one of the two methods.
As for color and transparency, it depends on how you set up your object. In your code I can see that you're using a material, so it's hard to change the color.
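As a rough sketch only (it assumes the hello_ar-style ObjectRenderer and its shader actually apply the alpha of objColor, which depends on the material setup mentioned above): enable alpha blending around the draw call and pass a colour with a non-zero alpha, using the same 0-255 range the sample uses.
// in onDrawFrame, around the sphere's draw call
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
float[] objColor = new float[] {0f, 0f, 255f, 128f}; // hypothetical: blue at roughly half opacity
virtualObject.draw(viewmtx, projmtx, colorCorrectionRgba, objColor);
GLES20.glDisable(GLES20.GL_BLEND);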

Android - WebRTC: Video call live drawing

I am trying to make a simple feature with WebRTC on an Android (mobile) app.
The app right now can make a simple video call: connect two devices with each other and allow them to hear and see.
What I am trying to achieve is some live drawing during the call. To put it simply: User1 calls User2, the call gets connected, then User1 clicks a draw button which freezes the video frame and allows him to draw on this frozen frame. Obviously, User2 should see this drawing happening live on his phone.
Right now I can freeze the frame (by calling videoCapture.stopCapture()) and draw on it with a custom SurfaceViewRenderer. The problem is that User2 does NOT see the drawing, only the frozen frame.
First I tried to create a new video track containing the drawing canvas AND the frozen frame to draw on, but I couldn't get it to work.
When creating a video track with peerConnectionFactory.createVideoTrack("ARDAMSv1_" + rand, videoSource);
I am supposed to specify the video source of the track, but the source can only be a VideoSource, and a VideoSource can only be created from a VideoCapturer, which is directly linked to a device camera (without any drawing on it, of course). This explains why User2 is not seeing any drawing on his device.
My question here is: how can I create a VideoCapturer which can stream the camera stream (frozen frame) AND a canvas with the drawing on it?
So I tried to implement my own VideoCapturer to either:
1) capture a View (for example the layout containing the drawing and the frozen frame) and stream it as the VideoSource,
OR 2) capture the camera view but also add the drawing to the frame before streaming it.
I couldn't make any of this work because I have no idea how to manipulate the I420Frame object to draw on it and return it with the right callback.
Maybe I am totally wrong with this approach and need to do something completely different, I am open to any suggestion.
PS: I am using Android API 25 with WebRTC 1.0.19742. I do NOT want to use any paid third party SDK/lib.
Does anyone have a clue how to proceed to achieve a simple WebRTC live drawing from one android app to another android app?
We came back to that feature a couple of weeks ago and I managed to find a way.
I extended my own CameraCapturer class to get hold of the camera frame before rendering. I then created my own CanvasView to be able to draw on it.
From there, I merged the two bitmaps together (the camera view + my canvas with the drawing), then drew the result with OpenGL into the buffer and displayed it on the SurfaceView.
If someone is interested I could potentially post some code.
@Override
public void startCapture(int width, int height, int fps) {
    Log.d("InitialsClass", "startCapture");
    surTexture.stopListening();

    cameraHeight = 480;
    cameraWidth = 640;

    int horizontalSpacing = 16;
    int verticalSpacing = 20;
    int x = horizontalSpacing;
    int y = cameraHeight - verticalSpacing;

    cameraBitmap = Bitmap.createBitmap(640, 480, Bitmap.Config.ARGB_8888);

    YuvFrame frame = new YuvFrame(null, PROCESSING_NONE, appContext);
    surTexture.startListening(new VideoSink() {
        @Override
        public void onFrame(VideoFrame videoFrame) {
            frame.fromVideoFrame(videoFrame, PROCESSING_NONE);
        }
    });

    if (captureThread == null || !captureThread.isInterrupted()) {
        captureThread = new Thread(() -> {
            try {
                if (matrix == null) {
                    matrix = new Matrix();
                }
                long start = System.nanoTime();
                capturerObs.onCapturerStarted(true);

                int[] textures = new int[1];
                GLES20.glGenTextures(1, textures, 0);
                YuvConverter yuvConverter = new YuvConverter();

                WindowManager windowManager = (WindowManager) appContext.getSystemService(Context.WINDOW_SERVICE);

                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
                // Set filtering
                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
                GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);

                // The bitmap is drawn on the GPU at this point.
                TextureBufferImpl buffer = new TextureBufferImpl(cameraWidth, cameraHeight - 3, VideoFrame.TextureBuffer.Type.RGB, textures[0], matrix, surTexture.getHandler(), yuvConverter, null);

                Resources resources = appContext.getResources();
                float scale = resources.getDisplayMetrics().density;
                Log.d("InitialsClass before", "camera start capturer width- " + cameraWidth + " height- " + cameraHeight);

                while (!Thread.currentThread().isInterrupted()) {
                    ByteBuffer gBuffer = frame.getByteBuffer();
                    if (gBuffer != null) {
                        Log.d("InitialsClass ", "gBuffer not null");
                        cameraBitmap.copyPixelsFromBuffer(gBuffer);
                    }

                    if (cameraBitmap != null) {
                        if (canvas == null) {
                            canvas = new Canvas();
                        }

                        if (appContext.getResources().getConfiguration().orientation == ORIENTATION_PORTRAIT) {
                            rotationDegree = -90;
                        } else {
                            assert windowManager != null;
                            if (windowManager.getDefaultDisplay().getRotation() == Surface.ROTATION_0) {
                                // clockwise
                                rotationDegree = 0;
                            } else if (windowManager.getDefaultDisplay().getRotation() == Surface.ROTATION_90) {
                                // anti-clockwise
                                rotationDegree = -180;
                            }
                        }

                        canvas.save(); // save the position of the canvas
                        canvas.rotate(rotationDegree, (cameraBitmap.getWidth() / 2), (cameraBitmap.getHeight() / 2)); // rotate the canvas
                        canvas.drawBitmap(cameraBitmap, 0, 0, null); // draw the image on the rotated canvas
                        canvas.restore(); // restore the canvas position

                        matrix.setScale(-1, 1);
                        matrix.postTranslate(/*weakBitmap.get().getWidth()*/ cameraBitmap.getWidth(), 0);
                        matrix.setScale(1, -1);
                        matrix.postTranslate(0, /*weakBitmap.get().getHeight()*/ cameraBitmap.getHeight());
                        canvas.setMatrix(matrix);

                        if (textPaint == null) {
                            textPaint = new TextPaint();
                        }
                        textPaint.setColor(Color.WHITE);
                        textPaint.setTypeface(Typeface.create(typeFace, Typeface.BOLD));
                        textPaint.setTextSize((int) (11 * scale));
                        if (textBounds == null) {
                            textBounds = new Rect();
                        }
                        textPaint.getTextBounds(userName, 0, userName.length(), textBounds);
                        textPaint.setTextAlign(Paint.Align.LEFT);
                        textPaint.setAntiAlias(true);
                        canvas.drawText(userName, x, y, textPaint);

                        if (paint == null) {
                            paint = new Paint();
                        }
                        if (isLocalCandidate) {
                            paint.setColor(Color.GREEN);
                        } else {
                            paint.setColor(Color.TRANSPARENT);
                        }
                        paint.setStrokeWidth(8);
                        paint.setStyle(Paint.Style.STROKE);
                        canvas.drawRect(0, 8, cameraWidth - 8, cameraHeight - 8, paint);

                        if (surTexture != null && surTexture.getHandler() != null && surTexture.getHandler().getLooper().getThread().isAlive()) {
                            surTexture.getHandler().post(() -> {
                                // Load the bitmap into the bound texture.
                                GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, /*weakBitmap.get()*/ cameraBitmap, 0);
                                // We transfer it to the VideoFrame.
                                VideoFrame.I420Buffer i420Buf = yuvConverter.convert(buffer);
                                long frameTime = System.nanoTime() - start;
                                VideoFrame videoFrame = new VideoFrame(i420Buf, 0, frameTime);
                                capturerObs.onFrameCaptured(videoFrame);
                            });
                        }
                    }
                    Thread.sleep(100);
                }
            } catch (InterruptedException ex) {
                Log.d("InitialsClass camera", ex.toString());
                Thread.currentThread().interrupt();
                return;
            }
        });
    }
    captureThread.start();
}
@Anael. Check it out.
I was working on a similar application, so I'll share the draw-on-WebRTC-stream part:
What you need to do is get the stream onto a canvas.
Then have a drawing application edit the canvas; I looked at a William Malone project. (If you want to import pictures, make them transparent!)
Finally (what you missed, I guess), stream from the canvas as you would with any WebRTC source.
A little demo I cooked up specially for you here (local WebRTC, see the log).
PS: I used getDisplayMedia, not getUserMedia, as my webcam is kaput...

OpenGL ES 2.0 render to framebuffer/texture results in black texture

I'm using libgdx, but this is pretty much vanilla OpenGL ES 2.0 stuff. Just try to ignore the Gdx.gl prefix everywhere ^^ I'm testing it on my desktop as well as on an Android device, and it's the same story in both cases.
I have the following code in my window resize event. It is supposed to delete the framebuffer and associated textures if they were already created, and then make new ones of the right size. I'm not sure whether deleting the textures and framebuffer the way I am doing it is even correct.
if (depthTexture >= 0)
{
    Gdx.gl.glDeleteTexture(depthTexture);
    depthTexture = -1;
}
if (colorTexture >= 0)
{
    Gdx.gl.glDeleteTexture(colorTexture);
    colorTexture = -1;
}
if (depthBuffer >= 0)
{
    Gdx.gl.glDeleteFramebuffer(depthBuffer);
    depthBuffer = -1;
}
IntBuffer intBuffer = BufferUtils.newIntBuffer(16); // See http://lwjgl.org/forum/index.php?topic=1314.0;wap2
intBuffer.clear();
Gdx.gl.glGenFramebuffers(1, intBuffer);
frameBuffer = intBuffer.get(0);
intBuffer.clear();
Gdx.gl.glGenTextures(1, intBuffer);
colorTexture = intBuffer.get(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
Gdx.gl.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_RGBA, width, height,
        0, GL20.GL_RGBA, GL20.GL_UNSIGNED_BYTE, null);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_NEAREST);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_NEAREST);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, 0);
intBuffer.clear();
Gdx.gl.glGenTextures(1, intBuffer);
depthTexture = intBuffer.get(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
Gdx.gl.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL20.GL_DEPTH_COMPONENT, width, height,
        0, GL20.GL_DEPTH_COMPONENT, GL20.GL_UNSIGNED_SHORT, null);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_NEAREST);
Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_NEAREST);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, 0);
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, frameBuffer);
Gdx.gl.glFramebufferTexture2D(GL20.GL_FRAMEBUFFER, GL20.GL_COLOR_ATTACHMENT0,
        GL20.GL_TEXTURE_2D, colorTexture, 0);
Gdx.gl.glFramebufferTexture2D(GL20.GL_FRAMEBUFFER, GL20.GL_DEPTH_ATTACHMENT,
        GL20.GL_TEXTURE_2D, depthTexture, 0);
int status = Gdx.gl.glCheckFramebufferStatus(GL20.GL_FRAMEBUFFER);
if (status != GL20.GL_FRAMEBUFFER_COMPLETE)
{
    System.out.println("frame buffer not complete. status " + Integer.toHexString(status));
    System.exit(0);
}
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, 0);
status = Gdx.gl.glCheckFramebufferStatus(GL20.GL_FRAMEBUFFER);
if (status != GL20.GL_FRAMEBUFFER_COMPLETE)
{
    System.out.println("default buffer not complete. status " + Integer.toHexString(status));
    System.exit(0);
}
I'm not at all sure whether I've made mistakes in setting up the framebuffer or the color and depth texture attachments. Anyway, on to the rendering loop:
// update cameras and things
// setup rendering to off screen framebuffer
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, frameBuffer);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
// draw things
// setup rendering to default framebuffer
Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, 0);
Gdx.gl.glViewport(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT | GL20.GL_DEPTH_BUFFER_BIT);
shader.begin();
// setup shader stuff
Gdx.gl.glActiveTexture(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
shader.setUniformi("u_fbDepth", 0);
Gdx.gl.glActiveTexture(1);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
shader.setUniformi("u_fbColor", 1);
// draw things with shader
shader.end();
Again, I'm not sure I'm setting things up the right way. The idea is hopefully clear: render to the off-screen framebuffer, then sample that framebuffer's depth and color textures in the final shader that renders to the default framebuffer.
The depth and color textures that end up in my fragment shader are just empty, however. Black screen. I know the fragment shader is not the problem: if I sample a different texture I see it as expected. I know the drawing itself is not the problem: if I render what I intended for the off-screen framebuffer directly to the default framebuffer, I see what I expect.
I got it. There's a bit of a gotcha with setting active textures. The function glActiveTexture expects one of the GL_TEXTURE0 type constants, but the shader uniform just wants to be the integer in the constant name.
basically
Gdx.gl.glActiveTexture(0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
shader.setUniformi("u_fbDepth", 0);
Gdx.gl.glActiveTexture(1);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
shader.setUniformi("u_fbColor", 1);
needed to be
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE0);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
shader.setUniformi("u_fbDepth", 0);
Gdx.gl.glActiveTexture(GL20.GL_TEXTURE1);
Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTexture);
shader.setUniformi("u_fbColor", 1);

Dynamic texture creation with font using libgdx

I'm working on a word game, and I was dynamically creating the textures for the letter tiles when the game loads, composed of a background image and a font.
To do this I was drawing pixmaps onto pixmaps. This was all fine until I started working on scaling: the font scaling on the pixmaps was terrible, even with bilinear filtering turned on (left image below), even though my scaled fonts were looking pretty good elsewhere.
So to get round this I decided to use a frame buffer: render everything to that, then copy it out to a pixmap and create a texture from that. That way I could use the GPU filtering and it should look exactly the same as my other fonts (middle image below), but it still didn't look quite as nice: there is a slight dark line around the outside, as if the alpha blending isn't working properly.
I then tried drawing straight over the tiles with the font at runtime to make sure it wasn't my imagination, and this definitely looks better, with smooth blending into the image beneath (right image below), but it impacts my frame rate quite a lot.
So my question is, why is drawing to the frame buffer not producing the same result as when I draw to the screen? Code below.
Texture tx = Assets.loadTexture("bubbles/BubbleBlue.png");
tx.setFilter(TextureFilter.Linear, TextureFilter.Linear);
SpriteBatch sb = new SpriteBatch();
FrameBuffer fb = new FrameBuffer(Format.RGBA8888,
LayoutManager.getWidth(), LayoutManager.getHeight(), false);
fb.begin();
sb.begin();
sb.draw(tx, 0, 0, LetterGrid.blockWidth, LetterGrid.blockHeight);
Assets.candara80.font.getRegion().getTexture()
.setFilter(TextureFilter.Linear, TextureFilter.Linear);
Assets.candara80.setSize(0.15f);
TextBounds textBounds = Assets.candara80.getBounds(letter);
Assets.candara80.drawText(sb, letter,
(LetterGrid.blockWidth - textBounds.width) / 2,
(LetterGrid.blockHeight + textBounds.height) / 2);
sb.end();
Pixmap pm = ScreenUtils.getFrameBufferPixmap(0, 0,
(int) LetterGrid.blockWidth, (int) LetterGrid.blockHeight);
Pixmap flipped = flipPixmap(pm);
result = new Texture(flipped);
fb.end();
pm.dispose();
flipped.dispose();
tx.dispose();
fb.dispose();
sb.dispose();
Setting the PROJECTION matrix is the problem.
EXAMPLE
public Texture texture(Color fg_color, Color bg_color)
{
    Pixmap pm = render(fg_color, bg_color);
    texture = new Texture(pm); // ***here's your new dynamic texture***
    disposables.add(texture);  // store the texture
    return texture;
}
//---------------------------
public Pixmap render(Color fg_color, Color bg_color)
{
    int width = Gdx.graphics.getWidth();
    int height = Gdx.graphics.getHeight();
    SpriteBatch spriteBatch = new SpriteBatch();
    m_fbo = new FrameBuffer(Format.RGB565, (int)(width * m_fboScaler), (int)(height * m_fboScaler), false);
    m_fbo.begin();
    Gdx.gl.glClearColor(bg_color.r, bg_color.g, bg_color.b, bg_color.a);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    /**set PROJECTION**/
    Matrix4 normalProjection = new Matrix4().setToOrtho2D(0, 0, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
    spriteBatch.setProjectionMatrix(normalProjection);
    spriteBatch.begin();
    spriteBatch.setColor(fg_color);
    //do some drawing ***here's where you draw your dynamic texture***
    ...
    spriteBatch.end(); // finish writing to the buffer
    Pixmap pm = ScreenUtils.getFrameBufferPixmap(0, 0, width, height); // write the frame buffer to a Pixmap
    m_fbo.end();
    m_fbo.dispose();
    m_fbo = null;
    spriteBatch.dispose();
    return pm;
}
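A hypothetical usage of the helpers above (not part of the original answer): build the Pixmap once, wrap it in a Texture, and dispose of the Pixmap afterwards, since the Texture keeps its own copy on the GPU.
Pixmap pm = render(Color.WHITE, Color.CLEAR);
Texture dynamicTexture = new Texture(pm);
dynamicTexture.setFilter(TextureFilter.Linear, TextureFilter.Linear);
pm.dispose();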

Low FPS with OpenGL on Android

I'm reaching out because I'm trying to use OpenGL on Android in order to make a 2D game :)
Here is my way of working:
I have a class GlRenderer:
public class GlRenderer implements Renderer
In this class, in onDrawFrame I call
GameRender() and GameDisplay()
And in gameDisplay() I have:
gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
// Reset the Modelview Matrix
gl.glMatrixMode(GL10.GL_PROJECTION); //Select The Modelview Matrix
gl.glLoadIdentity(); //Reset The Modelview Matrix
// Point to our buffers
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
// Set the face rotation
gl.glFrontFace(GL10.GL_CW);
for(Sprites...)
{
sprite.draw(gl, att.getX(), att.getY());
}
//Disable the client state before leaving
gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
And in the draw method of sprite I have:
_vertices[0] = x;
_vertices[1] = y;
_vertices[3] = x;
_vertices[4] = y + height;
_vertices[6] = x + width;
_vertices[7] = y;
_vertices[9] = x + width;
_vertices[10] = y + height;
if(vertexBuffer != null)
{
vertexBuffer.clear();
}
// fill the vertexBuffer with the vertices
vertexBuffer.put(_vertices);
// set the cursor position to the beginning of the buffer
vertexBuffer.position(0);
// bind the previously generated texture
gl.glBindTexture(GL10.GL_TEXTURE_2D, textures[0]);
// Point to our vertex buffer
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer.mByteBuffer);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer.mByteBuffer);
// Draw the vertices as triangle strip
gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, _vertices.length / 3);
My problem is that I have a low frame rate; even at 30 FPS I lose some frames sometimes with only 1 sprite (and it is the same with 50).
Am I doing something wrong? How can I improve FPS?
In general, you should not be changing your vertex buffer for every sprite drawn. And by "in general", I pretty much mean "never," unless you're making a particle system. And even then, you would use proper streaming techniques, not write a quad at a time.
For each sprite, you have a pre-built quad. To render it, you use shader uniforms to transform the sprite from a neutral position to the actual position you want to see it on screen.
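A rough sketch of that idea, adapted to the GL10 fixed-function pipeline the question uses (GL10 has no shader uniforms, so the modelview matrix plays that role; the sprite list and the pre-filled buffers are placeholders): build one quad at the origin once, then only move it per sprite.
gl.glMatrixMode(GL10.GL_MODELVIEW);
for (Sprite sprite : sprites) {                                  // hypothetical sprite list
    gl.glPushMatrix();
    gl.glTranslatef(sprite.getX(), sprite.getY(), 0f);           // position via the matrix, not the vertices
    gl.glBindTexture(GL10.GL_TEXTURE_2D, sprite.getTextureId());
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, staticQuadBuffer);   // filled once at init, never rewritten
    gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, staticTexCoordBuffer);
    gl.glDrawArrays(GL10.GL_TRIANGLE_STRIP, 0, 4);
    gl.glPopMatrix();
}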
