Can someone please help me figure out what the problem is with my code? I am trying to load an image on the native side and send the texture to Unity. I am using Unity Pro 5.0.2f1.
Unity Side:
void Start () {
    AndroidJavaObject mImageLoader = new AndroidJavaObject("com.saeid.android.LoadTexture2D");
    Texture2D texture2D = new Texture2D(1920, 1080, TextureFormat.ARGB32, false);
    Int32 texPtr = mImageLoader.Call<Int32>("loadImageReturnTexturePtr", "/storage/sdcard0/Images/test.jpg");
    Debug.Log("texture pointer? " + texPtr);
    Texture2D nativeTexture = Texture2D.CreateExternalTexture(1920, 1080, TextureFormat.ARGB32, false, false, (IntPtr)texPtr);
    texture2D.UpdateExternalTexture(nativeTexture.GetNativeTexturePtr());
    gameObject.GetComponent<Renderer>().material.mainTexture = texture2D;
}
Java Side:
public int loadImageReturnTexturePtr(String imagePath) {
    Log.d(LOGTAG, "loading image1: " + imagePath);
    Bitmap bitmap = BitmapFactory.decodeFile(imagePath);
    Log.d(LOGTAG, "Bitmap is: " + bitmap);
    ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
    bitmap.copyPixelsToBuffer(buffer);
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    int textureId = textures[0];
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, 1920, 1080, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
    Log.d(LOGTAG, "texture id returned: " + textureId);
    return textureId;
}
So, I figured it out... The code is actually correct, except that the texture format has to match on both sides.
In my case I had TextureFormat.ARGB32 (on the Unity side) and GLES20.GL_RGBA (on the Java side), which don't match. Also, GLES20.glTexImage2D(...) somehow didn't work for me; I replaced it with GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);.
Finally, I noticed that the same code works on some Unity versions and not on others: for example, it does not work in 5.0.2f1 but it works in 5.0.3.
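For reference, a minimal sketch of the upload path described above; it reuses imagePath and textureId from the question's loadImageReturnTexturePtr, and the Unity-side TextureFormat still has to be chosen to match the bitmap's pixel layout:
// Sketch only: GLUtils.texImage2D derives the GL format/type from the Bitmap's
// config (typically ARGB_8888), instead of hand-specifying GL_RGBA.
Bitmap bitmap = BitmapFactory.decodeFile(imagePath);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);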
In addition to ensuring that both Unity and the native plugin expect the same image format, an issue I've run into is that creating the texture on the CPU thread fails because that thread has no access to the OpenGL context.
My solution was to create the texture on the render thread instead, using Unity's GL.IssuePluginEvent.
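As a minimal illustration of why the direct call fails, here is a hedged Java-side sketch (the class and log tag names are mine, not from the plugin above) that checks whether the calling thread actually has a current EGL context before touching GLES:
import android.opengl.EGL14;
import android.opengl.EGLContext;
import android.util.Log;

public final class GlContextCheck {
    // When the plugin method is invoked from Unity's scripting (CPU) thread,
    // eglGetCurrentContext() returns EGL_NO_CONTEXT, and glGenTextures /
    // glTexImage2D issued there cannot produce a usable texture.
    public static boolean hasCurrentGlContext() {
        EGLContext ctx = EGL14.eglGetCurrentContext();
        boolean hasContext = !EGL14.EGL_NO_CONTEXT.equals(ctx);
        if (!hasContext) {
            Log.w("GlContextCheck", "No EGL context on this thread; defer texture creation to the render thread");
        }
        return hasContext;
    }
}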
I am using the following method to create an OpenGL ES texture on Android.
private int createTexture(int width, int height, int i) {
    int[] mTextureHandles = new int[1];
    GLES20.glGenTextures(1, mTextureHandles, 0);
    int textureID = mTextureHandles[0];
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureID);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    return textureID;
}
Then I upload a bitmap to this texture and simply render it using GLSurfaceView's Renderer.
Most of the time it works as expected, but randomly the texture is displayed like this (here the GL_TEXTURE_WRAP_S mode is GLES20.GL_CLAMP_TO_EDGE).
After changing the wrap mode:
If GL_TEXTURE_WRAP_S = GLES20.GL_REPEAT, then the texture is (again randomly) displayed like this (notice the color change).
I have already tried using power-of-two textures.
Code for the vertex buffer:
private FloatBuffer createVertexBuffer(RectF glCoords) {
    // RectF glCoords contains GL vertices.
    // Current rect read from Logcat:
    // left = -0.5833333, top = 0.5, right = 0.5833334, bottom = -0.5
    float[] vertices = new float[]{
            glCoords.left, glCoords.top, 0, 1,      // V1 - top left
            glCoords.left, glCoords.bottom, 0, 1,   // V2 - bottom left
            glCoords.right, glCoords.bottom, 0, 1,  // V3 - bottom right
            glCoords.right, glCoords.top, 0, 1      // V4 - top right
    };
    FloatBuffer vBuffer = ByteBuffer.allocateDirect(vertices.length * bytesPerFloat)
            .order(ByteOrder.nativeOrder()).asFloatBuffer();
    vBuffer.put(vertices);
    vBuffer.position(0);
    return vBuffer;
}
An index buffer is not used, as I am rendering the triangles with a triangle fan:
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 6);
It would be really helpful if someone could point out what might be causing this.
Your geometry is a 4-vertex rectangle (quad), but the code asks to draw a fan with 6 vertices: GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 6);
Those extra 2 vertices are not specified in the vertex array, so OpenGL reads beyond the defined part of the buffer. The results are effectively random, and crashes would not be surprising.
Replace 6 with 4 in the glDrawArrays() call; this should fix the problem.
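For completeness, a hedged sketch of the corrected draw; aPositionLocation and vertexBuffer are placeholder names, not taken from the question:
// A quad drawn as a triangle fan needs only its 4 vertices:
// the fan expands to triangles (V1, V2, V3) and (V1, V3, V4).
GLES20.glEnableVertexAttribArray(aPositionLocation);
GLES20.glVertexAttribPointer(aPositionLocation, 4, GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_FAN, 0, 4);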
How can I load an image as a texture and render it through GLES in order to use the MediaCodec Surface input approach?
I started from the EncodeAndMuxTest example.
Thank you in advance.
Look at the samples from Grafika; they will give you insight into how to do it.
Here is code to load a bitmap into a texture:
int mTextureId = -1;

public void loadTexture(Bitmap bitmap) {
    if (mTextureId == -1) {
        // First call: create the texture and set its parameters once.
        int[] textureHandle = new int[1];
        GLES20.glGenTextures(1, textureHandle, 0);
        mTextureId = textureHandle[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureId);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    } else {
        // Subsequent calls: re-bind the existing texture.
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureId);
    }
    // Upload the bitmap into the currently bound texture.
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
}
And here is how you can render it to the input surface:
// Create Fullframe rectangle (a class from grafika),
mInputSurface.makeCurrent();
mFullFrameRect = new FullFrameRect(new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_2D));
....
// And when you want to draw it
mInputSurface.makeCurrent(); // if its not already current
loadTexture(bitmap);
GLES20.glViewport(0, 0, viewWidth, viewHeight);
mFullFrameRect.drawFrame(mTextureId, GlUtil.IDENTITY_MATRIX);
mInputSurface.setPresentationTime(pts);
mInputSurface.swapBuffers();
FullFrameRect, Texture2dProgram, and GlUtil are classes from Grafika, so you should copy them or implement similar functionality yourself.
I'm implementing render-to-texture using FBOs on Android. As a first step I'm creating a texture, but I get error 1280 when calling the GLES20.glGenTextures method.
The texture-creation function is below:
public int CreateTexture(int w, int h) {
    final int[] textureId = new int[1];
    int i;
    // generate one texture into textureId
    GLES20.glGenTextures(1, textureId, 0);
    i = GLES20.glGetError();
    // bind the texture so it is ready to be used
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
    // create the actual texture; the null in the last argument means
    // "allocate storage for the texture, but don't fill it with anything yet"
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, w, h, 0, GLES20.GL_RGBA, GLES20.GL_FLOAT, null);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    if (i != 0) {
        Log.d("ERROR", "ERROR happened: " + i);
        return i;
    }
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
    return textureId[0];
}
When I call this method, it returns error 1280.
You got a GL_INVALID_ENUM error (0x0500 = 1280), which means an unsupported enum value was passed to a GL function. The error is not raised inside the CreateTexture function; since GL error flags stay set until they are queried, it most likely comes from a GL call made before CreateTexture, or from your OpenGL initialization code.
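Since the error state is sticky, a simple way to localize the offending call is to drain any pending errors right before CreateTexture. A rough sketch, with a helper name of my own:
import android.opengl.GLES20;
import android.util.Log;

public final class GlDebug {
    // Drain errors left over from earlier GL calls so that the next
    // glGetError() reflects only what happens after this point.
    public static void drainGlErrors(String where) {
        int error;
        while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
            Log.w("GlDebug", where + ": pending GL error 0x" + Integer.toHexString(error));
        }
    }
}

// Usage on the GL thread:
//   GlDebug.drainGlErrors("before CreateTexture");
//   int tex = CreateTexture(w, h);
// If 0x500 (1280) is logged here, the GL_INVALID_ENUM came from an earlier call
// (for example your init code), not from CreateTexture itself.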
I am loading images as textures in a GLSurfaceView.
The resulting textures look perfectly fine on some devices; on others they appear completely distorted.
This is what it looks like on a Samsung Galaxy Nexus (screen density 2.0):
The same images on a Motorola (screen density 1.5):
Here is my loading code:
FutureTask<Integer> futureTask = new FutureTask<Integer>(new Callable<Integer>() {
    @Override
    public Integer call() throws Exception {
        // Generate texture
        int[] texturenames = new int[1];
        GLES20.glGenTextures(1, texturenames, 0);
        // Bind texture to texture name
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texturenames[0]);
        // Set filtering
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        // Set wrapping mode
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // Scale the bitmap up to a power of two if it is not one already
        Bitmap potTextureBitmap = textureBitmap;
        int potWidth = nextPOT(textureBitmap.getWidth());
        int potHeight = nextPOT(textureBitmap.getHeight());
        if ((textureBitmap.getWidth() != potWidth) || (textureBitmap.getHeight() != potHeight)) {
            potTextureBitmap = Bitmap.createScaledBitmap(textureBitmap, potWidth, potHeight, false);
        }
        // Load the bitmap into the bound texture.
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, potTextureBitmap, 0);
        GLES20.glFlush();
        return Integer.valueOf(texturenames[0]);
    }
});
this.mSurfaceView.queueEvent(futureTask);
What am I doing wrong?
I finally found the cause of this problem.
It's rather specific, but I'll share the details anyhow:
There turned out to be an error in my color calculation. I need to convert hex colors to normalized RGBA (in my case, convert white #ffffffff to {1.0, 1.0, 1.0, 1.0}), but I actually fed non-normalized values into my shader (for example {255, 255, 255, 255}).
When multiplied with the color from my texture, the resulting colors would blow up.
Depending on the graphics card, this either caused the artifacts or not.
So the assumed dependency on the screen resolution was pure coincidence!
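For reference, a small sketch of the kind of conversion involved; the helper name is illustrative, not the original code:
// Convert a packed ARGB hex color (e.g. 0xFFFFFFFF for opaque white) into the
// normalized {r, g, b, a} floats a fragment shader expects, instead of raw 0..255 values.
public static float[] hexToNormalizedRgba(int argb) {
    float a = ((argb >>> 24) & 0xFF) / 255f;
    float r = ((argb >>> 16) & 0xFF) / 255f;
    float g = ((argb >>> 8) & 0xFF) / 255f;
    float b = (argb & 0xFF) / 255f;
    return new float[] { r, g, b, a };  // #ffffffff -> {1.0, 1.0, 1.0, 1.0}
}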
I am rendering to a frame buffer, and then I want to use that as a texture.
I can save the frame buffer to a file using glReadPixels so I am sure that I am rendering OK.
I can bind a fixed texture and render that OK.
But I cannot bind the frame buffer's texture and render it in place of the fixed texture.
// Setup:
private void setupRenderToTexture() {
    fb = new int[numberOfBuffers];
    depthRb = new int[numberOfBuffers];
    renderTex = new int[numberOfBuffers];
    texBuffer = new IntBuffer[numberOfBuffers];
    // generate
    GLES20.glGenFramebuffers(fb.length, fb, 0);
    GLES20.glGenRenderbuffers(depthRb.length, depthRb, 0);
    GLES20.glGenTextures(renderTex.length, renderTex, 0);
    GLErrorHandler.checkGlError("glGenFramebuffers");
    for (int i = 0; i < fb.length; i++) {
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[i]);
        // generate color texture
        GLState.Instance.bindTexture(0, renderTex[i]);
        // parameters
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                // GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                // GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        // create it
        // create an empty intbuffer first?
        int[] fileData = new int[width * height];
        texBuffer[i] = IntBuffer.wrap(fileData);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, texBuffer[i]);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, renderTex[i], 0);
        // create render buffer and bind 16-bit depth buffer
        GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, depthRb[i]);
        GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);
        GLErrorHandler.checkGlError("glRenderbufferStorage or above");
    }
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
}
Use the frame buffer:
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[0]);
Save the frame buffer (for testing):
GLES20.glReadPixels(sourceX, sourceY, width, height, GL10.GL_RGBA,
GL10.GL_UNSIGNED_BYTE, ib);
Then to render to screen:
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[0]);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture); //some simple texture works but:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, renderTex[0]); //frame buffer texture does not work
All I get is a black screen if I bind to renderTex[0]
You are misusing the FBO. You have to attach an empty texture (allocate it with null pixel data) so that the FBO can render into it.
See this page
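A rough sketch of what that change looks like in the setup code above; it reuses width, height, renderTex[i], and depthRb[i] from the question, and the depth-attachment call is an addition of mine, since the original setup allocates the renderbuffer but never attaches it:
// Allocate the color attachment with null pixel data: storage only, nothing uploaded.
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, renderTex[i], 0);
// Attach the depth renderbuffer as well, otherwise it has no effect.
GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, depthRb[i]);
GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);
GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT,
        GLES20.GL_RENDERBUFFER, depthRb[i]);
// Verify completeness before rendering into the FBO.
int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
    Log.e("FBO", "Framebuffer incomplete: 0x" + Integer.toHexString(status));
}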