Is it possible to use the same SurfaceTexture for two views? If so, how?
I use MediaPlayer to play a video, and I want to play the same video on two different views at the same time.
I tried creating a SurfaceTexture and setting it on both views, but it doesn't work.
public int createTextureObject() {
    // Create an external (OES) texture name, which is what SurfaceTexture requires.
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    int texId = textures[0];
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    return texId;
}
SurfaceTexture st = new SurfaceTexture(createTextureObject());
textureView1.setSurfaceTexture(st);
textureView2.setSurfaceTexture(st);
mMediaPlayer.setSurface(new Surface(st));
It randomly works on one view or the other, but never on both at the same time.
I don't believe shared SurfaceTextures are supported by TextureView. I don't know that there's anything that would prevent it from being possible, but the onFrameAvailable() callback can only notify one object.
(You might be able to jury-rig something where you manually invoke the callback in the second instance from the first, but that seems like asking for trouble.)
An approach that will work is to create the SurfaceTexture as you are doing now, and send the video frames to it, but provide an onFrameAvailable() listener that renders the video frame to the two TextureViews using OpenGL ES.
Various examples of this can be found in Grafika, e.g. "continuous capture" receives Camera input on a SurfaceTexture and then renders it twice (once to the display, once to a video encoder).
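Roughly, that approach looks like the sketch below. This is not the actual Grafika code, just an assumption-laden outline: it assumes a dedicated GL thread whose Handler is glHandler (the two-argument listener overload needs API 21+), an EGL context plus two window surfaces windowSurface1/windowSurface2 created from textureView1.getSurfaceTexture() and textureView2.getSurfaceTexture(), a float[16] field texMatrix, the OES texture id oesTexId from createTextureObject(), and a hypothetical drawOesTexture() helper that draws a full-screen quad with an external-OES shader. MediaPlayer is still fed new Surface(videoTexture) as in the question.

    videoTexture.setOnFrameAvailableListener(st -> {
        // The EGL context must be current on this thread before updateTexImage().
        EGL14.eglMakeCurrent(eglDisplay, windowSurface1, windowSurface1, eglContext);
        st.updateTexImage();                  // latch the newest video frame into the OES texture
        st.getTransformMatrix(texMatrix);     // per-frame texture transform supplied by SurfaceTexture

        // Draw the same frame into each TextureView's window surface.
        for (EGLSurface target : new EGLSurface[] { windowSurface1, windowSurface2 }) {
            EGL14.eglMakeCurrent(eglDisplay, target, target, eglContext);
            drawOesTexture(oesTexId, texMatrix);   // hypothetical helper: full-screen quad sampling the OES texture
            EGL14.eglSwapBuffers(eglDisplay, target);
        }
    }, glHandler);                            // deliver the callback on the GL thread's Handler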
Related
When I render the camera texture to a low-resolution SurfaceView, it looks pixelated.
It seems I need to generate mipmaps for the camera texture, but it doesn't work this way:
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glGenTextures(1, glTextures, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, glTextures[0]);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
mInputSurfaceTexture = new SurfaceTexture(glTextures[0]);
mInputSurfaceTexture.setDefaultBufferSize(CCamera.SIZE.getWidth(), CCamera.SIZE.getHeight());
mInputSurfaceTexture.setOnFrameAvailableListener(new CameraFrameListener(), mGLHandler);
mInputSurface = new Surface(mInputSurfaceTexture);
// feed mInputSurface to the camera service.
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
surfaceTexture.updateTexImage();
GLES20.glGenerateMipmap(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
//GLES11Ext.glGenerateMipmapOES(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
}
BTW, what is the difference between these two:
GLES11Ext.glGenerateMipmapOES
GLES20.glGenerateMipmap
You can't, at least not directly.
Implement an offscreen pass that converts the YUV to RGB and writes an RGB image, and then mipmap that. If you know you only need the low-resolution version, that YUV-to-RGB pass could also implement the initial 2:1 downsample to minimize the memory-bandwidth overhead.
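A rough sketch of that offscreen pass, under the assumption that width and height are the camera preview size, oesProgram is a program whose fragment shader samples the camera texture through samplerExternalOES, and drawFullScreenQuad() is a hypothetical helper that issues the glDrawArrays() call:

    // Allocate an ordinary GL_TEXTURE_2D at half resolution (the 2:1 downsample mentioned above).
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    int rgbTex = tex[0];
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, rgbTex);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width / 2, height / 2, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    // Attach it to a framebuffer so the external texture can be rendered into it.
    int[] fb = new int[1];
    GLES20.glGenFramebuffers(1, fb, 0);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[0]);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, rgbTex, 0);

    // Per frame, after updateTexImage(): draw the OES texture into the FBO as plain RGB...
    GLES20.glViewport(0, 0, width / 2, height / 2);
    GLES20.glUseProgram(oesProgram);      // fragment shader declares: uniform samplerExternalOES sTexture;
    drawFullScreenQuad();                 // hypothetical helper: binds the quad attributes and calls glDrawArrays()
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

    // ...and now mipmapping works, because rgbTex is a regular GL_TEXTURE_2D.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, rgbTex);
    GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);

Note that on strict GLES 2.0 devices, generating mipmaps for a non-power-of-two texture also requires the GL_OES_texture_npot extension, so either check for it or round the offscreen size to a power of two.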
I have four texture .png files for one 3D .obj file. How can I change the texture .png after the 3D .obj has been placed on the surface, like a color-changing feature for ARCore?
Does anyone have any idea?
You can change the texture of the object. Assuming you are looking at the hello_ar_java sample, you can add a method to ObjectRenderer:
public void setTextureOnGLThread(Bitmap textureBitmap) {
// Bind the texture name already allocated.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);
// Set the filtering for handling different sizes to render.
GLES20.glTexParameteri(
GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
// Copy the bitmap contents into the texture buffer.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, textureBitmap, 0);
// Generate the mip map for the different sizes.
GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);
// Unbind the texture.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
}
You need to call this from the GL thread, for example from onDrawFrame().
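For example, a minimal way to wire that up in the sample's renderer might look like the sketch below, assuming virtualObject is the sample's ObjectRenderer instance and pendingTexture/changeObjectTexture are new names you add yourself:

    private volatile Bitmap pendingTexture;

    // Called from the UI thread, e.g. from a button's click listener.
    public void changeObjectTexture(Bitmap bitmap) {
        pendingTexture = bitmap;
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Bitmap bitmap = pendingTexture;
        if (bitmap != null) {
            pendingTexture = null;
            virtualObject.setTextureOnGLThread(bitmap);  // safe here: onDrawFrame runs on the GL thread
        }
        // ... the sample's existing drawing code ...
    }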
I am displaying a video on a GLSurfaceView with a custom renderer that requires that multiple shaders be applied in succession. Currently, it is working successfully with one shader, though I am not sure how to extend the rendering pipeline to apply multiple shaders in succession.
I know that there are some examples concerning applying multiple shaders (using FrameBuffers and RenderBuffers), but I have not found any that deal with an image passed in through a SurfaceTexture.
There is a specific concern I would like to address:
A SurfaceTexture must be bound to a GL_TEXTURE_EXTERNAL_OES texture. On the other hand, a FrameBuffer cannot be bound to a GL_TEXTURE_EXTERNAL_OES texture (typically a GL_TEXTURE_2D is used), so is it even possible to use a FrameBuffer for a multi-pass render when the input texture is of a different format than the output? If not, what are the other options for performing a multi-pass render?
Below is some relevant code in the onSurfaceCreated function of the renderer I am trying to extend to perform multiple passes:
GLES20.glGenTextures(1, this.textureID, 0);
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, this.textureID[0]);
GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
this.surfaceTexture = new SurfaceTexture(this.textureID[0]);
Below is some relevant code in the onDrawFrame function of that renderer:
synchronized (this) {
if (this.updateSurface) {
this.surfaceTexture.updateTexImage();
this.surfaceTexture.getTransformMatrix(this.stMatrix);
this.updateSurface = false;
}
}
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, this.textureID[0]);
GLES20.glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
//apply shader here and call glDrawArrays() at end
One way to approach this would be to use a SurfaceHolder rather than a SurfaceTexture.
From there, you can then get the Surface being held by the SurfaceHolder.
Then you can get the underlying Canvas being drawn to.
Note: Per this answer, you will need to use setBitmap(Bitmap canvas_bitmap) to specify the Bitmap being drawn into.
All the texture setup and parameter work will still need to be done basically the way you showed in your question.
GLES20.glGenTextures(1, this.textureID, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, this.textureID[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
Then dump the bitmap into a texture using:
GLUtils.texImage2D( GLES20.GL_TEXTURE_2D, 0, canvas_bitmap, 0 );
After spending some time on this, it seemed better to ask. Thanks for your help, guys!
Question
How do you render a video frame from MediaPlayer or VideoView to a SurfaceTexture or an OpenGL texture, in order to change the texture/fragment color via GLSL? (We need this for fancy GLES/GLSL video-processing routines.)
Context
a) Google TV (LG G2 2012 device) is an Android 3.2 device with SDK-only support (no NDK)
b) It is easy to render from the camera to a SurfaceTexture, but how do you render video to a SurfaceTexture in Android 3.x? For the camera solution, see below.
c) I'm already rendering video frames to a GLView/GLRenderer, but I'm not grabbing a frame in order to change it via GLSL; that doesn't seem to work. I need accessible GLES/GLSL textures with video data for video processing:
MainActivity class:
public void onCreate(Bundle state) {
super.onCreate(state);
m_View = new GLSimpleView(this);
setContentView(m_View);
m_Holder = m_View.getHolder();
m_Holder.addCallback(this);
m_Holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
GLSimpleView class:
public GLSimpleView(Context context) {
super(context);
m_Renderer = new GLTextureRenderer(context);
this.setRenderer(m_Renderer);
}
GLTextureRenderer class:
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
m_SurfaceTexture = textures[0];
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_SurfaceTexture);
GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
m_Surface = new SurfaceTexture(m_SurfaceTexture);
m_Surface.setOnFrameAvailableListener(this);
// THIS ONE DOESN'T WORK WITH ANDROID 3.x!
// HOW TO BIND m_Surface TO m_MediaPlayer?
//Surface surface = new Surface(m_Surface);
//m_MediaPlayer.setSurface(surface);
//surface.release();
Also compare these:
SurfaceTexture for camera (I need this one for MediaPlayer or VideoView!): Using SurfaceTexture in Android
Video to GLView (no texture access via GLSL!): Playing video in a GLSurfaceView instead of SurfaceView
Android 4.x SDK VideoSurfaceTexture sample (not 3.2 compatible!): http://source-android.frandroid.com/cts/tests/src/android/media/cts/VideoSurfaceView.java
Android MediaPlayer (no support for setSurface() in Android 3.x!): http://developer.android.com/reference/android/media/MediaPlayer.html
Therefore the main question is still: how do you access and manipulate a video frame with Android 3.x? Perhaps a different solution exists? Am I missing something after spending too much time on this? Keep in mind that there is no NDK support on Google TV at all, so we're very limited if we try to manipulate video data.
Megha was kind enough to look into this for me. We don't support this on Google TV for ARM. It's impossible with our current SoCs (systems on a chip).
I am building a simple live wallpaper for Android. I am uploading the required texture into OpenGL ES 2.0 using the code below. I have loaded all my images into a single 2048x2048 file. The code below takes about 900 to 1200 ms to load the texture. Is this a normal time, or am I doing something wrong that makes it slow?
I also clear the list of textures in OpenGL every time onSurfaceCreated is called in my renderer. Is this the right thing to do, or is there a way to simply check whether the previously loaded texture is already in memory and, if so, avoid clearing and reloading it? Please let me know your comments on this. Thank you.
Also, onSurfaceCreated is called on screen-orientation change, so the texture upload happens again. That is not ideal. What is the workaround?
public int addTexture(Bitmap texture) {
int bitmapFormat = texture.getConfig() == Config.ARGB_8888 ? GLES20.GL_RGBA : GLES20.GL_RGB;
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int textureId = textures[0];
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmapFormat, texture, 0);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR_MIPMAP_LINEAR);
GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);
return textureId;
}
A few ways you can improve performance.
Do not load the texture every time onSurfaceChanged is called. Initialize your textureId to -1 (in the constructor/onSurfaceCreated of your renderer) and check at the beginning of onSurfaceChanged whether you already have a valid id; glGenTextures always gives you a positive number.
Do you need the mipmaps? That might be the key point of your method here. Try without the line GLES20.glGenerateMipmap(GLES20.GL_TEXTURE_2D);
2048x2048 is huge. Especially for textures. Do you really need that much detail? Maybe 1024x1024 is enough.
Avoid RGB_888/ARGB_8888 and use RGB_565 instead: you'll get almost the same visual quality for half the size. A sketch combining this with the textureId guard from the first point follows below.
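A minimal sketch of that guard plus the RGB_565 decode, assuming the renderer keeps the question's addTexture() method; loadAtlasBitmap() and R.drawable.atlas are placeholder names for wherever your 2048x2048 atlas actually lives:

    private Bitmap mAtlasBitmap;     // decoded once, outside the GL callbacks
    private int mTextureId = -1;

    private void loadAtlasBitmap(Context context) {
        // Decode the atlas as RGB_565 to halve the amount of data uploaded to the GPU.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inPreferredConfig = Bitmap.Config.RGB_565;
        mAtlasBitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.atlas, options);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        // Only upload when there is no texture id yet; glGenTextures never hands out -1.
        if (mTextureId == -1) {
            mTextureId = addTexture(mAtlasBitmap);
        }
    }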