I use these functions to draw elements on Android with OpenGL ES. In the constructor, after binding the buffers (they are bound correctly, because the elements are drawn when I use GL10), I create the program with the CreateGLProgram function and then call DrawShader. I think the problem is in the draw function. Can anyone tell me what my mistakes are?
PS: I don't post the buffer-binding code because, as I said, the elements are drawn when using GL10. I want to switch to GLES20 because, from several questions and some pages on the Android developer site, I understand it is faster (maybe I'm wrong), partly because it uses static functions.
Here is the code, thanks in advance:
private final String vertexShader="" +
"attribute vec3 vertex; \n" +
"void main(){\n" +
" gl_Position=vertex;\n" +
"}";
private final String fragmentShader="" +
"attribute vec4 colors;\n" +
"void main(){\n" +
" gl_FragColor=colors;\n" +
"}";
public int LoadShader(String shader,int type){
int sha= GLES20.glCreateShader(type);
GLES20.glShaderSource(sha,shader);
GLES20.glCompileShader(sha);
return sha;
}
int program=0;
public void CreateGLProgram()
{
program=GLES20.glCreateProgram();
GLES20.glAttachShader(program,LoadShader(vertexShader,GLES20.GL_VERTEX_SHADER));
GLES20.glAttachShader(program,LoadShader(fragmentShader,GLES20.GL_FRAGMENT_SHADER));
GLES20.glLinkProgram(program);
}
public void DrawShader(){
GLES20.glUseProgram(program);
int vertex_handle=GLES20.glGetAttribLocation(program,"vertex");
GLES20.glVertexAttribPointer(vertex_handle,3,GLES20.GL_FLOAT,false,4,coordinatesbuff);
int frag_handle=GLES20.glGetAttribLocation(program,"colors");
GLES20.glVertexAttribPointer(frag_handle,4,GLES20.GL_FLOAT,false,4,colorbuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES,indicies.length,GLES20.GL_UNSIGNED_SHORT,indiciesbuffer);
GLES20.glDisableVertexAttribArray(vertex_handle);
GLES20.glDisableVertexAttribArray(frag_handle);
}
In addition to what BDL wrote:
You should enable attribute pointers using glEnableVertexAttribArray(int index):
int vertex_handle=GLES20.glGetAttribLocation(program, "vertex");
GLES20.glEnableVertexAttribArray(vertex_handle);
GLES20.glVertexAttribPointer(vertex_handle, 3, GLES20.GL_FLOAT, false, 4, coordinatesbuff);
int frag_handle=GLES20.glGetAttribLocation(program, "colors");
GLES20.glEnableVertexAttribArray(frag_handle);
GLES20.glVertexAttribPointer(frag_handle, 4, GLES20.GL_FLOAT, false, 4, colorbuffer);
You can't have attributes in a fragment shader. If you want per-vertex colors in the fragment shader, you have to define this attribute in the vertex shader and pass the data to the fragment shader in a varying.
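A corrected pair could look something like this (a sketch in the same inline-string style as above; the position attribute is also widened to vec4 so it can be assigned to gl_Position directly):
private final String vertexShader="" +
        "attribute vec4 vertex;\n" +
        "attribute vec4 colors;\n" +   // per-vertex colour is declared in the vertex shader
        "varying vec4 vColor;\n" +     // and handed to the fragment shader as a varying
        "void main(){\n" +
        "  vColor=colors;\n" +
        "  gl_Position=vertex;\n" +
        "}";
private final String fragmentShader="" +
        "precision mediump float;\n" +
        "varying vec4 vColor;\n" +
        "void main(){\n" +
        "  gl_FragColor=vColor;\n" +
        "}";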
Additionally, you should check whether your shaders compile (glGetShaderiv(..., GL_COMPILE_STATUS, ...)) and link (glGetProgramiv(..., GL_LINK_STATUS, ...)) correctly. That would have put you on the right track, since using an attribute in a fragment shader should trigger a compile error. glGetError should be checked as well.
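For example, inside LoadShader and after glLinkProgram you could add checks roughly like these (a sketch; the log tags are arbitrary):
// After glCompileShader(sha):
int[] compiled = new int[1];
GLES20.glGetShaderiv(sha, GLES20.GL_COMPILE_STATUS, compiled, 0);
if (compiled[0] == 0) {
    Log.e("Shader", "compile failed: " + GLES20.glGetShaderInfoLog(sha));
    GLES20.glDeleteShader(sha);
}
// After glLinkProgram(program):
int[] linked = new int[1];
GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linked, 0);
if (linked[0] == 0) {
    Log.e("Shader", "link failed: " + GLES20.glGetProgramInfoLog(program));
}
// A simple glGetError sweep after a block of GL calls:
int err;
while ((err = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
    Log.e("GL", "glError 0x" + Integer.toHexString(err));
}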
I'm currently making an Android player plugin for Unity. The basic idea is that I play the video with MediaPlayer on Android, which provides a setSurface API; the Surface passed in is constructed from a SurfaceTexture, which in the end binds to an OpenGL ES texture. In most other cases, like showing an image, we can just send this texture to Unity in the form of a pointer/id, call Texture2D.CreateExternalTexture there to generate a Texture2D object, and set that on a UI GameObject to render the picture. However, displaying video frames is a little different, since video playback on Android requires a texture of type GL_TEXTURE_EXTERNAL_OES, while Unity only supports the universal type GL_TEXTURE_2D.
To solve the problem, I've googled for a while and learned that I should use a technique called "render to texture". More specifically, I should generate two textures: one for the MediaPlayer and SurfaceTexture on Android to receive video frames, and another for Unity that should also hold the picture data. The first one should be of type GL_TEXTURE_EXTERNAL_OES (let's call it the OES texture for short) and the second of type GL_TEXTURE_2D (let's call it the 2D texture). Both generated textures are empty in the beginning. When bound to the MediaPlayer, the OES texture is updated during video playback; we can then use a FrameBuffer to draw the content of the OES texture onto the 2D texture.
I've written a pure-Android version of this process and it works pretty well when I finally draw the 2D texture on the screen. However, when I publish it as a Unity Android plugin and run the same code from Unity, no picture shows up. Instead, it only displays the preset color from glClearColor, which tells me two things:
The transfer chain OES texture -> FrameBuffer -> 2D texture is complete, and Unity does receive the final 2D texture, because glClearColor is only called when we draw the content of the OES texture into the FrameBuffer.
Something goes wrong in the drawing that happens after glClearColor, because we don't see the video frames. In fact, I also call glReadPixels after drawing and before unbinding the FrameBuffer, which reads data back from the FrameBuffer we bound, and it returns the single color we set in glClearColor.
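A minimal readback for that check could look like this (a sketch; it reads a single RGBA pixel from the currently bound FBO):
// Read one RGBA pixel back from the bound framebuffer to verify its content.
ByteBuffer pixel = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
GLES30.glReadPixels(0, 0, 1, 1, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, pixel);
Log.i(TAG, "pixel(0,0) = " + (pixel.get(0) & 0xFF) + ", " + (pixel.get(1) & 0xFF)
        + ", " + (pixel.get(2) & 0xFF) + ", " + (pixel.get(3) & 0xFF));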
To simplify the code I provide here, I'm going to draw a triangle into a 2D texture through a FrameBuffer. If we can figure out which part is wrong, we can then easily solve the similar problem of drawing video frames.
The function will be called on Unity:
public int displayTriangle() {
Texture2D texture = new Texture2D(UnityPlayer.currentActivity);
texture.init();
Triangle triangle = new Triangle(UnityPlayer.currentActivity);
triangle.init();
TextureTransfer textureTransfer = new TextureTransfer();
textureTransfer.tryToCreateFBO();
mTextureWidth = 960;
mTextureHeight = 960;
textureTransfer.tryToInitTempTexture2D(texture.getTextureID(), mTextureWidth, mTextureHeight);
textureTransfer.fboStart();
triangle.draw();
textureTransfer.fboEnd();
// Unity needs a native texture id to create its own Texture2D object
return texture.getTextureID();
}
Initialization of 2D texture:
protected void initTexture() {
int[] idContainer = new int[1];
GLES30.glGenTextures(1, idContainer, 0);
textureId = idContainer[0];
Log.i(TAG, "texture2D generated: " + textureId);
// texture.getTextureID() will return this textureId
bindTexture();
GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D,
GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);
unbindTexture();
}
public void bindTexture() {
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, textureId);
}
public void unbindTexture() {
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
}
draw() of Triangle:
public void draw() {
float[] vertexData = new float[] {
0.0f, 0.0f, 0.0f,
1.0f, -1.0f, 0.0f,
1.0f, 1.0f, 0.0f
};
vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
.order(ByteOrder.nativeOrder())
.asFloatBuffer()
.put(vertexData);
vertexBuffer.position(0);
GLES30.glClearColor(0.0f, 0.0f, 0.9f, 1.0f);
GLES30.glClear(GLES30.GL_DEPTH_BUFFER_BIT | GLES30.GL_COLOR_BUFFER_BIT);
GLES30.glUseProgram(mProgramId);
vertexBuffer.position(0);
GLES30.glEnableVertexAttribArray(aPosHandle);
GLES30.glVertexAttribPointer(
aPosHandle, 3, GLES30.GL_FLOAT, false, 12, vertexBuffer);
GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 3);
}
vertex shader of Triangle:
attribute vec4 aPosition;
void main() {
gl_Position = aPosition;
}
fragment shader of Triangle:
precision mediump float;
void main() {
gl_FragColor = vec4(0.9, 0.0, 0.0, 1.0);
}
Key code of TextureTransfer:
public void tryToInitTempTexture2D(int texture2DId, int textureWidth, int textureHeight) {
if (mTexture2DId != -1) {
return;
}
mTexture2DId = texture2DId;
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, mTexture2DId);
Log.i(TAG, "glBindTexture " + mTexture2DId + " to init for FBO");
// make 2D texture empty
GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA, textureWidth, textureHeight, 0,
GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null);
Log.i(TAG, "glTexImage2D, textureWidth: " + textureWidth + ", textureHeight: " + textureHeight);
GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
fboStart();
GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
GLES30.GL_TEXTURE_2D, mTexture2DId, 0);
Log.i(TAG, "glFramebufferTexture2D");
int fboStatus = GLES30.glCheckFramebufferStatus(GLES30.GL_FRAMEBUFFER);
Log.i(TAG, "fbo status: " + fboStatus);
if (fboStatus != GLES30.GL_FRAMEBUFFER_COMPLETE) {
throw new RuntimeException("framebuffer " + mFBOId + " incomplete!");
}
fboEnd();
}
public void fboStart() {
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFBOId);
}
public void fboEnd() {
GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
}
And finally some code on Unity-side:
int textureId = plugin.Call<int>("displayTriangle");
Debug.Log("native textureId: " + textureId);
Texture2D triangleTexture = Texture2D.CreateExternalTexture(
960, 960, TextureFormat.RGBA32, false, true, (IntPtr) textureId);
triangleTexture.UpdateExternalTexture(triangleTexture.GetNativeTexturePtr());
rawImage.texture = triangleTexture;
rawImage.color = Color.white;
Well, the code above does not display the expected triangle, only a blue background. I added glGetError after nearly every OpenGL call and no errors are reported.
My Unity version is 2017.2.1. For the Android build I disabled experimental multithreaded rendering; the other settings are all default (no texture compression, no development build, and so on). My app's minimum API level is 5.0 Lollipop and the target API level is 9.0 Pie.
I really need some help, thanks in advance!
Now I found the answer: if you want to do any drawing in your plugin, you should do it at the native layer. So for an Android plugin, call the OpenGL ES APIs from JNI instead of from the Java side. The reason is that Unity only allows drawing on its rendering thread. If you simply call OpenGL ES APIs from the Java side as I did in the question, they actually run on Unity's main thread instead of the rendering thread. Unity provides a method, GL.IssuePluginEvent, to invoke your own functions on the rendering thread, but it requires native code because it takes a function pointer as its callback. Here is a simple example of how to use it:
At JNI side:
// you can copy these headers from https://github.com/googlevr/gvr-unity-sdk/tree/master/native_libs/video_plugin/src/main/jni/Unity
#include "IUnityInterface.h"
#include "UnityGraphics.h"
static void on_render_event(int event_type) {
// do all of your jobs related to rendering, including initializing the context,
// linking shaders, creating program, finding handles, drawing and so on
}
// UnityRenderingEvent is an alias of void(*)(int) defined in UnityGraphics.h
UnityRenderingEvent get_render_event_function() {
UnityRenderingEvent ptr = on_render_event;
return ptr;
}
// notice you should return a long value to Java side
extern "C" JNIEXPORT jlong JNICALL
Java_com_abc_xyz_YourPluginClass_getNativeRenderFunctionPointer(JNIEnv *env, jobject instance) {
UnityRenderingEvent ptr = get_render_event_function();
return (long) ptr;
}
At Android Java side:
class YourPluginClass {
...
public native long getNativeRenderFunctionPointer();
...
}
At Unity side:
private void IssuePluginEvent(int pluginEventType) {
long nativeRenderFuncPtr = Call_getNativeRenderFunctionPointer(); // call through plugin class
IntPtr ptr = (IntPtr) nativeRenderFuncPtr;
GL.IssuePluginEvent(ptr, pluginEventType); // pluginEventType is related to native function parameter event_type
}
void Start() {
IssuePluginEvent(1); // let's assume 1 stands for initializing everything
// get your texture2D id from plugin, create Texture2D object from it,
// attach that to a GameObject, and start playing for the first time
}
void Update() {
// call SurfaceTexture.updateTexImage in plugin
IssuePluginEvent(2); // let's assume 2 stands for transferring TEXTURE_EXTERNAL_OES to TEXTURE_2D through FrameBuffer
// call Texture2D.UpdateExternalTexture to update GameObject's appearance
}
You still need to do the texture transfer, and everything about it should happen at the JNI layer. But don't worry: the steps are nearly the same as in the question description, just in a different language than Java, and there is plenty of material about this process, so you can surely manage it.
Finally, let me state the key to solving this problem again: do your native work at the native layer and don't cling to pure Java... I'm genuinely surprised that there is no blog/answer/wiki telling us to simply write this code in C++. There are open-source implementations like Google's gvr-unity-sdk that give a complete reference, but you may still wonder whether the task could be done without writing any C++ code. Now we know it can't. To be honest, though, I think Unity could make this process easier.
As a starting point I use the Vuforia (version 4) sample called MultiTargets which tracks a 3d physical "cube" in the camera feed and augments it with yellow grid lines along the cube edges.
What I want to achieve is remove the textures and use diffuse lighting on the cube faces instead, by setting my own light position.
I want to do this on native Android and I do NOT want to use Unity.
It's been a hard journey of several days of work and learning. This is my first time working with OpenGL of any kind, and OpenGL ES 2.0 doesn't exactly make it easy for the beginner.
So I have a light source positioned slightly above the top face of my cube. I found that I can get the diffuse effect right if I compute the Lambert factor in model space: everything remains in place regardless of my camera, and only the top face gets any light.
But when I move to eye space, it becomes weird and the light seems to follow my camera around. Other faces get light, not only the top face. I don't understand why that is. For testing, I made sure the light position is where I expect it by using only the distance to the light source to determine pixel brightness in the fragment shader. Therefore I'm fairly confident in the correctness of my "lightDirectionEyespace", and my only explanation is that something must be wrong with the normals. But I think I followed the explanations for creating the normal matrix correctly...
Help please!
Then there is of course the question whether those diffuse calculations SHOULD be performed in eye space? Will there be any disadvantages if I just do it in model space? I suspect that probably when I later use more models and lights and add specular and transparency, it will not work anymore, even though I don't see yet why.
My renderFrame method: (some variable names still contain "bottle", which is the object I want to light next after I get the cube right)
private void renderFrame()
{
ShaderFactory.checkGLError("Check gl errors prior render Frame");
// Clear color and depth buffer
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// Get the state from Vuforia and mark the beginning of a rendering section
final State state=Renderer.getInstance().begin();
// Explicitly render the Video Background
Renderer.getInstance().drawVideoBackground();
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
// Did we find any trackables this frame?
if(0 != state.getNumTrackableResults())
{
// Get the trackable:
TrackableResult result=null;
final int numResults=state.getNumTrackableResults();
// Browse results searching for the MultiTarget
for(int j=0; j < numResults; j++)
{
result=state.getTrackableResult(j);
if(result.isOfType(MultiTargetResult.getClassType()))
break;
result=null;
}
// If it was not found exit
if(null == result)
{
// Clean up and leave
GLES20.glDisable(GLES20.GL_BLEND);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
Renderer.getInstance().end();
return;
}
final Matrix44F modelViewMatrix_Vuforia=Tool.convertPose2GLMatrix(result.getPose());
final float[] modelViewMatrix=modelViewMatrix_Vuforia.getData();
final float[] modelViewProjection=new float[16];
Matrix.scaleM(modelViewMatrix, 0, CUBE_SCALE_X, CUBE_SCALE_Y, CUBE_SCALE_Z);
Matrix.multiplyMM(modelViewProjection, 0, vuforiaAppSession
.getProjectionMatrix().getData(), 0, modelViewMatrix, 0);
GLES20.glUseProgram(bottleShaderProgramID);
// Draw the cube:
GLES20.glEnable(GLES20.GL_CULL_FACE);
GLES20.glCullFace(GLES20.GL_BACK);
GLES20.glVertexAttribPointer(vertexHandleBottle, 3, GLES20.GL_FLOAT, false, 0, cubeObject.getVertices());
GLES20.glVertexAttribPointer(normalHandleBottle, 3, GLES20.GL_FLOAT, false, 0, cubeObject.getNormals());
GLES20.glEnableVertexAttribArray(vertexHandleBottle);
GLES20.glEnableVertexAttribArray(normalHandleBottle);
// add light position and color
final float[] lightPositionInModelSpace=new float[] {0.0f, 1.1f, 0.0f, 1.0f};
GLES20.glUniform4f(lightPositionHandleBottle, lightPositionInModelSpace[0], lightPositionInModelSpace[1],
lightPositionInModelSpace[2], lightPositionInModelSpace[3]);
GLES20.glUniform3f(lightColorHandleBottle, 0.9f, 0.9f, 0.9f);
// create the normalMatrix for lighting calculations
final float[] normalMatrix=new float[16];
Matrix.invertM(normalMatrix, 0, modelViewMatrix, 0);
Matrix.transposeM(normalMatrix, 0, normalMatrix, 0);
// pass the normalMatrix to the shader
GLES20.glUniformMatrix4fv(normalMatrixHandleBottle, 1, false, normalMatrix, 0);
// extract the camera position for lighting calculations (last column of matrix)
// GLES20.glUniform3f(cameraPositionHandleBottle, normalMatrix[12], normalMatrix[13], normalMatrix[14]);
// set material properties
GLES20.glUniform3f(matAmbientHandleBottle, 0.0f, 0.0f, 0.0f);
GLES20.glUniform3f(matDiffuseHandleBottle, 0.1f, 0.9f, 0.1f);
// pass the model view matrix to the shader
GLES20.glUniformMatrix4fv(modelViewMatrixHandleBottle, 1, false, modelViewMatrix, 0);
// pass the model view projection matrix to the shader
// the "transpose" parameter must be "false" according to the spec, anything else is an error
GLES20.glUniformMatrix4fv(mvpMatrixHandleBottle, 1, false, modelViewProjection, 0);
GLES20.glDrawElements(GLES20.GL_TRIANGLES,
cubeObject.getNumObjectIndex(), GLES20.GL_UNSIGNED_SHORT, cubeObject.getIndices());
GLES20.glDisable(GLES20.GL_CULL_FACE);
// disable the enabled arrays after everything has been rendered
GLES20.glDisableVertexAttribArray(vertexHandleBottle);
GLES20.glDisableVertexAttribArray(normalHandleBottle);
ShaderFactory.checkGLError("MultiTargets renderFrame");
}
GLES20.glDisable(GLES20.GL_BLEND);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
Renderer.getInstance().end();
}
My vertex shader:
attribute vec4 vertexPosition;
attribute vec3 vertexNormal;
uniform mat4 modelViewProjectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 normalMatrix;
// lighting
uniform vec4 uLightPosition;
uniform vec3 uLightColor;
// material
uniform vec3 uMatAmbient;
uniform vec3 uMatDiffuse;
// pass to fragment shader
varying vec3 vNormalEyespace;
varying vec3 vVertexEyespace;
varying vec4 vLightPositionEyespace;
varying vec3 vNormal;
varying vec4 vVertex;
void main()
{
// we can just take vec3() of a vec4 and it will take the first 3 entries
vNormalEyespace = vec3(normalMatrix * vec4(vertexNormal, 1.0));
vNormal = vertexNormal;
vVertexEyespace = vec3(modelViewMatrix * vertexPosition);
vVertex = vertexPosition;
// light position
vLightPositionEyespace = modelViewMatrix * uLightPosition;
gl_Position = modelViewProjectionMatrix * vertexPosition;
}
And my fragment shader:
precision highp float; //apparently necessary to force same precision as in vertex shader
//lighting
uniform vec4 uLightPosition;
uniform vec3 uLightColor;
//material
uniform vec3 uMatAmbient;
uniform vec3 uMatDiffuse;
//from vertex shader
varying vec3 vNormalEyespace;
varying vec3 vVertexEyespace;
varying vec4 vLightPositionEyespace;
varying vec3 vNormal;
varying vec4 vVertex;
void main()
{
vec3 normalModel = normalize(vNormal);
vec3 normalEyespace = normalize(vNormalEyespace);
vec3 lightDirectionModel = normalize(uLightPosition.xyz - vVertex.xyz);
vec3 lightDirectionEyespace = normalize(vLightPositionEyespace.xyz - vVertexEyespace.xyz);
vec3 ambientTerm = uMatAmbient;
vec3 diffuseTerm = uMatDiffuse * uLightColor;
// calculate the lambert factor via cosine law
float diffuseLambert = max(dot(normalEyespace, lightDirectionEyespace), 0.0);
// Attenuate the light based on distance.
float distance = length(vLightPositionEyespace.xyz - vVertexEyespace.xyz);
float diffuseLambertAttenuated = diffuseLambert * (1.0 / (1.0 + (0.01 * distance * distance)));
diffuseTerm = diffuseLambertAttenuated * diffuseTerm;
gl_FragColor = vec4(ambientTerm + diffuseTerm, 1.0);
}
I finally solved all the problems.
There were two issues that might be of interest to future readers.
The Vuforia CubeObject class from the official sample (current Vuforia version 4) has wrong normals. They do not all correspond to the vertex definition order. If you're using the CubeObject from the sample, make sure the normal definitions actually correspond to the faces. Vuforia fail...
As suspected, my normalMatrix was built incorrectly. We cannot just invert-transpose the 4x4 modelViewMatrix; we need to first extract the top-left 3x3 submatrix from it and then invert-transpose that.
Here is the code that works for me:
final Mat3 normalMatrixCube=new Mat3();
normalMatrixCube.SetFrom4X4(modelViewMatrix);
normalMatrixCube.invert();
normalMatrixCube.transpose();
This code by itself is not that useful though, because it relies on a custom class Mat3 which I randomly imported from this guy because neither Android nor Vuforia seem to offer any matrix class that can invert/transpose 3x3 matrices. This really makes me question my sanity - the only code that works for such a basic problem has to rely on a custom matrix class? Maybe I'm just doing it wrong, I don't know...
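That said, if you would rather avoid a third-party class, one equivalent route (a sketch; it assumes the bottom row of modelViewMatrix is 0 0 0 1, which holds for a pose matrix combined with scaling) is to invert-transpose the 4x4 with android.opengl.Matrix and then upload only its upper-left 3x3 as a mat3 uniform:
// Sketch: derive a 3x3 normal matrix from the 4x4 model-view matrix.
float[] inverted = new float[16];
float[] invertedTransposed = new float[16];
Matrix.invertM(inverted, 0, modelViewMatrix, 0);
Matrix.transposeM(invertedTransposed, 0, inverted, 0);
// Copy the upper-left 3x3 (column-major) into a 9-element array.
float[] normalMatrix3x3 = new float[9];
for (int col = 0; col < 3; col++) {
    for (int row = 0; row < 3; row++) {
        normalMatrix3x3[col * 3 + row] = invertedTransposed[col * 4 + row];
    }
}
// The shader then declares "uniform mat3 normalMatrix;" and multiplies vec3 normals.
GLES20.glUniformMatrix3fv(normalMatrixHandleBottle, 1, false, normalMatrix3x3, 0);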
Thumbs up for not using the fixed-function pipeline for this! I found your example quite useful for understanding that the light also needs to be transformed into eye space. All the questions I've found just recommend using glLight.
While this helped me get a static light source working, one thing missing from your code, if you also want to transform your model(s) while keeping the light source static (e.g. rotating the object), is to keep track of the original modelview matrix until the view changes, or until you draw another object with a different model matrix. So something like:
vLightPositionEyespace = fixedModelView * uLightPosition;
where fixedModelView can be updated in your renderFrame() method.
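A minimal way to do that (a sketch; fixedModelViewHandle and angleDegrees are made-up names for illustration) is to snapshot the matrix before applying any per-object transforms:
// Keep a copy of the pose/view matrix before object-specific transforms,
// so the light can be placed in eye space independently of the model transform.
float[] fixedModelView = new float[16];
System.arraycopy(modelViewMatrix, 0, fixedModelView, 0, 16);   // snapshot first
Matrix.scaleM(modelViewMatrix, 0, CUBE_SCALE_X, CUBE_SCALE_Y, CUBE_SCALE_Z);
Matrix.rotateM(modelViewMatrix, 0, angleDegrees, 0f, 1f, 0f);  // example per-object transform
// The shader uses fixedModelView only for transforming the light position.
GLES20.glUniformMatrix4fv(modelViewMatrixHandleBottle, 1, false, modelViewMatrix, 0);
GLES20.glUniformMatrix4fv(fixedModelViewHandle, 1, false, fixedModelView, 0);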
This thread on opengl discussion boards helped :)
Please see Edit at end for progress.
I'm in the process of trying to learn OpenGL ES 2.0 (I'm going to be developing on Android devices)
I'm a little confused about the vertex and fragment shaders. I understand their purpose, but if I'm building a shape from a custom-built class (say a 'point') and setting its size and colour or applying a texture, and assuming both shaders are declared and defined in the object class's constructor, would this mean that each instance of that class has its very own pair of shaders?
That is my first question. My second is: if this is the case (a shader pair per object), is this the way to go? I've heard that having one shader pair and switching its parameters isn't a good idea for performance, but if I have 100 sprites, all the same size and colour (or texture), does it make sense for them all to have a different pair of shaders with exactly the same parameters?
I hope I'm asking the right question; I've not been studying ES 2.0 for long, so I find it a little confusing. I currently only have a limited understanding of OpenGL!
Edit
Adding code as requested.
public class Dot {
int iProgId;
int iPosition;
float size = 10;
FloatBuffer vertexBuf;
float r = 1f;
float g = 1f;
float b = 1f;
float a = 1f;
int iBaseMap;
int texID;
Bitmap imgTexture;
//Constructor
public Dot() {
float[] vertices = {
0,0,0f
};
//Create vertex shader
String strVShader =
"attribute vec4 a_position;\n"+
"void main()\n" +
"{\n" +
"gl_PointSize = " +size+ ";\n" +
"gl_Position = a_position;\n"+
"}";
//Create fragment shader
String strFShader =
"precision mediump float;" +
"void main() " +
"{" +
"gl_FragColor = vec4(0,0,0,1);" +
"}";
iProgId = Utils.LoadProgram(strVShader, strFShader);
iPosition = GLES20.glGetAttribLocation(iProgId, "a_position");
vertexBuf = ByteBuffer.allocateDirect(vertices.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
vertexBuf.put(vertices).position(0);
}
My setTexture method
public void setTexture(GLSurfaceView view, Bitmap imgTexture){
this.imgTexture=imgTexture;
//Create vertex shader
String strVShader =
"attribute vec4 a_position;\n"+
"void main()\n" +
"{\n" +
"gl_PointSize = " +size+ ";\n" +
"gl_Position = a_position;\n"+
"}";
//Fragment shader
String strFShader =
"precision mediump float;" +
"uniform sampler2D u_baseMap;" +
"void main()" +
"{" +
"vec4 color;" +
"color = texture2D(u_baseMap, gl_PointCoord);" +
"gl_FragColor = color;" +
"}";
iProgId = Utils.LoadProgram(strVShader, strFShader);
iBaseMap = GLES20.glGetUniformLocation(iProgId, "u_baseMap");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glUniform1i(iBaseMap, 0);
texID = Utils.LoadTexture(view, imgTexture); //See code below
}
LoadTexture() method from my Utils class:
public static int LoadTexture(GLSurfaceView view, Bitmap imgTex) {
    int textures[] = new int[1];
    try {
        GLES20.glGenTextures(1, textures, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, imgTex, 0);
    } catch (Exception e) {
        e.printStackTrace(); // log and fall through
    }
    return textures[0];
}
And finally my Drawing method:
public void drawDot(float x, float y){
float[] vertices = {
x,y,0f
};
vertexBuf = ByteBuffer.allocateDirect(vertices.length * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
vertexBuf.put(vertices).position(0);
GLES20.glUseProgram(iProgId);
GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 0, vertexBuf);
GLES20.glEnableVertexAttribArray(iPosition);
GLES20.glDrawArrays(GLES20.GL_POINTS, 0, 1);
}
So I can create stuff like so:
Dot dot1 = new Dot();
dot1.setSize(40);
dot1.setTexture(myBitmap); //(created earlier with BitmapFactory)
dot1.drawDot(0,0);
Thank you!
Edit 1: Thanks for the answer so far. On further research, it seems a few other people have had this exact same problem. The issue seems to be that I'm not calling glBindTexture in my rendering routine, so OpenGL just uses the last texture that was loaded, which I guess makes sense.
If I put the following into my Rendering routine:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 1);
it will apply the first bitmap and display it
if I put:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 2);
it will apply the second bitmap and display it
So, getting somewhere! But my question would now be how can I get my rendering method to automatically know which bitmap to use based on which object is calling it (the rendering routine)?
Thanks again
How it works (briefly)
Shaders are just programs that run on the graphics card. You compile and link them, and then you can pass variables to them to modify properties of vertices and fragments.
This means that when you call certain drawing functions, such as glDrawElements or glDrawArrays, the vertex data (this means position, texture coords, normals, color, etc. depending on what you want to send) will be sent to the pipeline.
This means that the currently loaded vertex shader will get the vertices one by one and run its code to apply whatever transformations it needs. After that OpenGL will apply rasterization to generate the fragments for the current frame. Then the fragment shader will take every fragment and modify it accordingly.
You can always unload a shader and load a different one. If you need different shaders for different objects, you can group your objects according to their shader and render them independently while reloading the corresponding shader for every group.
However, sometimes it's easier to pass some parameters to the shader and change them for every object. For instance, if you want to render a 3D model, you can split it into submeshes, with every submesh having a different texture. Then, when you pass the vertex data for a mesh, you load its texture and pass it to the shader. For the next mesh you pass another texture, and so on.
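In GLES20 terms, that per-object switch could look roughly like this (a sketch; sharedProgramId, positionHandle, objectCount, textureIds and vertexBuffers are made-up names):
// One shared program; only the bound texture changes per object.
GLES20.glUseProgram(sharedProgramId);
int samplerHandle = GLES20.glGetUniformLocation(sharedProgramId, "u_baseMap");
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glUniform1i(samplerHandle, 0);   // the sampler reads from texture unit 0
for (int i = 0; i < objectCount; i++) {
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[i]);   // this object's texture
    GLES20.glVertexAttribPointer(positionHandle, 3, GLES20.GL_FLOAT, false, 0, vertexBuffers[i]);
    GLES20.glEnableVertexAttribArray(positionHandle);
    GLES20.glDrawArrays(GLES20.GL_POINTS, 0, 1);
}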
In the real world everything is more complex, but I hope it's useful for you to get an idea of how it works.
Your example
You are loading a pair of shaders in the constructor (with no texture), and then creating a new program every time you set a texture. I'm not sure this is the best approach.
Without knowing what Utils.LoadProgram does, it's difficult to say, but you could log the result every time you call it. Maybe the second time you link the program it doesn't work.
If I were you, I would just use a single pair of shaders outside your Dot object. You can pass parameters to the shader (with glUniform...), indicating the dot size, texture, etc.
The setTexture function would then just bind the new texture without loading any shaders. Compile them at the beginning (after setting up the GL context and so on).
Once this works, you may consider changing your shaders per object, but only if it is really necessary.
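A rough sketch of that idea, reusing the Utils.LoadProgram call from the question (the uniform names are assumptions):
// One program shared by all dots; the size comes from a uniform instead of being
// baked into the shader source, and setTexture only needs to remember a texture id.
String vShader =
        "attribute vec4 a_position;\n" +
        "uniform float u_pointSize;\n" +
        "void main() {\n" +
        "  gl_PointSize = u_pointSize;\n" +
        "  gl_Position = a_position;\n" +
        "}";
String fShader =
        "precision mediump float;\n" +
        "uniform sampler2D u_baseMap;\n" +
        "void main() {\n" +
        "  gl_FragColor = texture2D(u_baseMap, gl_PointCoord);\n" +
        "}";
int sharedProgram = Utils.LoadProgram(vShader, fShader);   // compiled and linked once
// Per dot, just before drawing:
GLES20.glUseProgram(sharedProgram);
GLES20.glUniform1f(GLES20.glGetUniformLocation(sharedProgram, "u_pointSize"), size);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID);   // this dot's texture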
I will answer myself as I found out what the problem was.
Added this to my drawDot method:
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texID);
texID being the texture ID that corresponds to the object calling the drawDot() method.
Works perfectly
Hope this helps anyone who may have a similar problem in the future.
I've just switched my code over to using a separate shader instead of passing a boolean uniform to decide which algorithm to use. Unfortunately, after vigorous testing, I've discovered that one of the attributes (halo) is not being passed through to the new shader. The other attribute it uses (position) is passed through, though.
Abridged code follows:
Java code:
// Attributes
protected static int position = 0;
protected static int colour = 1;
protected static int texture = 2;
protected static int halo = 3;
protected static int normal = 4;
protected static int program1;
protected static int program2;
...
// Linking shader1
GLES20.glBindAttribLocation(program1, position, "position");
GLES20.glBindAttribLocation(program1, colour, "colour");
GLES20.glBindAttribLocation(program1, texture, "texCoord");
GLES20.glBindAttribLocation(program1, normal, "normal");
GLES20.glLinkProgram(program1);
...
// Linking shader2
GLES20.glBindAttribLocation(program2, position, "position");
GLES20.glBindAttribLocation(program2, halo, "halo");
GLES20.glLinkProgram(program2);
...
GLES20.glUseProgram(program1);
GLES20.glVertexAttribPointer(
position,
3,
GLES20.GL_FLOAT,
false,
0,
buffer);
...
//Render with program1
...
GLES20.glUseProgram(program2);
GLES20.glVertexAttribPointer(
halo,
1,
GLES20.GL_FLOAT,
false,
0,
doHaloBuffer);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glDisable(GLES20.GL_DEPTH_TEST);
...
// Using lines for testing purposes
GLES20.glDrawElements(GLES20.GL_LINE_LOOP, haloIndexCount, GLES20.GL_UNSIGNED_SHORT, haloIndexBuffer);
...
Fragment shaders are just simple "Render the texture and colour you get" shaders
shader1.vsh:
attribute vec3 position;
attribute vec4 colour;
attribute vec2 texCoord;
attribute vec3 normal;
...
varying vec2 fragTexCoord;
varying vec4 fragColour;
...
// All attributes used at some point
shader2.vsh:
attribute vec3 position;
attribute float halo;
varying vec4 fragColour;
...
vec4 colour = vec4(1.0, 1.0, 0.0, 1.0);
if(halo > 0.5){
colour.g = 0.0;
...
}
fragColour = colour;
...
If I change halo > 0.5 to halo == 0.0, or swap the green values in the statements above, red is rendered; otherwise yellow is rendered.
I tried altering the input buffer to all 1.0 for testing, but it made no difference. It seems that halo is not being passed through.
Previously, I had the two shaders merged and used a boolean uniform to decide which code to run, and it worked fine. Nothing else has changed; the input buffers are the same and the counts are the same. The only difference is that I'm now using separate shaders.
Any thoughts?
Check whether the halo attribute is enabled just before rendering with glDrawElements.
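Something along these lines, just before the draw call (a sketch using the same handles as in the abridged code):
GLES20.glUseProgram(program2);
GLES20.glEnableVertexAttribArray(position);
GLES20.glEnableVertexAttribArray(halo);
GLES20.glVertexAttribPointer(halo, 1, GLES20.GL_FLOAT, false, 0, doHaloBuffer);
// Optional sanity check: query whether the array really is enabled.
int[] enabled = new int[1];
GLES20.glGetVertexAttribiv(halo, GLES20.GL_VERTEX_ATTRIB_ARRAY_ENABLED, enabled, 0);
Log.d("Halo", "halo attribute enabled: " + (enabled[0] != 0));
GLES20.glDrawElements(GLES20.GL_LINE_LOOP, haloIndexCount, GLES20.GL_UNSIGNED_SHORT, haloIndexBuffer);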
I've recently picked up RenderScript and am really loving it, but the lack of documentation and examples isn't helping. I've managed to use the live wallpaper examples to get my own live wallpaper running, but for texturing I have been using the fixed-function shaders.
I've looked at GLSL tutorials, but they don't seem to translate over exactly. I've also looked into the RenderScript source code, but it hasn't been of much help either.
Here is some code I dug up from the RenderScript sources that seems to be what the fixed-function pipeline is doing:
Program vertex
shaderString.append("varying vec4 varColor;\n");
shaderString.append("varying vec2 varTex0;\n");
shaderString.append("void main() {\n");
shaderString.append(" gl_Position = UNI_MVP * ATTRIB_position;\n");
shaderString.append(" gl_PointSize = 1.0;\n");
shaderString.append(" varColor = ATTRIB_color;\n");
shaderString.append(" varTex0 = ATTRIB_texture0;\n");
shaderString.append("}\n");
Program fragment
shaderString.append("varying lowp vec4 varColor;\n");
shaderString.append("varying vec2 varTex0;\n");
shaderString.append("void main() {\n");
shaderString.append(" lowp vec4 col = UNI_Color;\n");
shaderString.append(" gl_FragColor = col;\n");
shaderString.append("}\n");
I don't think these are the best examples, because the fragment shader doesn't seem to touch the varTex0 variable. I've tried to write my own program fragment and use the fixed-function vertex shader.
Here's my fragment shader:
ProgramFragment.Builder b = new ProgramFragment.Builder(mRS);
String s = "void main() {" +
" gl_FragColor = vec4(1.0,1.0,1.0,0.5);" +
"}";
b.setShader(s);
pf = b.create();
mScript.set_gPFLights(pf);
Extremely basic but any attempt at binding a texture has failed. I don't know what variable is needed for the texture.
Could anyone provide an example of a basic program vertex and program fragment that uses textures? Thanks in advance.
Check out the FountainFBO sample. It uses a program fragment with a texture that is used as a frame buffer object.
I finally managed to find the sources for the FixedFunction classes that are used to create the GLSL shaders. They are located within "android_frameworks_base / graphics / java / android / renderscript".
Here is what the fragment shader with these FixedFunction settings :
ProgramFragmentFixedFunction.Builder builder = new ProgramFragmentFixedFunction.Builder(mRS);
builder.setTexture(ProgramFragmentFixedFunction.Builder.EnvMode.REPLACE,
ProgramFragmentFixedFunction.Builder.Format.RGBA, 0); //CHANGED
ProgramFragment pf = builder.create(); //RENAMED
pf.bindSampler(Sampler.WRAP_NEAREST(mRS), 0);
would look like :
ProgramFragment.Builder pfBuilder = new ProgramFragment.Builder(mRS);
String s = "varying vec2 varTex0;" +
"void main() {" +
" lowp vec4 col;" +
" vec2 t0 = varTex0;" +
" col.rgba = texture2D(UNI_Tex0, t0).rgba;" +
" gl_FragColor = col;" +
"}";
pfBuilder.setShader(s);
pfBuilder.addTexture(TextureType.TEXTURE_2D);
pf = pfBuilder.create();
This fragment shader works with the ProgramVertexFixedFunction.
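To actually feed a bitmap into that UNI_Tex0 slot, something roughly like this should work (a sketch; mBitmap is an assumption, and the binding calls are the standard graphics-RenderScript ones):
// Upload a bitmap as a texture allocation and bind it to slot 0 of the custom
// ProgramFragment, together with a sampler.
Allocation tex = Allocation.createFromBitmap(mRS, mBitmap,
        Allocation.MipmapControl.MIPMAP_NONE,
        Allocation.USAGE_GRAPHICS_TEXTURE);
pf.bindTexture(tex, 0);                        // matches UNI_Tex0 in the shader
pf.bindSampler(Sampler.CLAMP_LINEAR(mRS), 0);
mScript.set_gPFLights(pf);                     // hand the program to the script as before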
I haven't gotten around to seeing what the FixedFunction vertex shader looks like but I will update this answer when I do.