Render Android MediaCodec output on two views for VR Headset compatibility - android

What I know so far is that I need to use a SurfaceTexture that can be rendered on two TextureViews simultaneously.
So it will be:
MediaCodec -> SurfaceTexture -> 2x TextureViews
But how do I get a SurfaceTexture programmatically to use with the MediaCodec? As far as I know, a new SurfaceTexture is created for every TextureView, so if I have two TextureViews in my activity, I will get two SurfaceTextures!? That's one too many... ;)
Or is there any other way to render the MediaCodec Output to a screen twice?

Do you actually require two TextureViews, or is that just for convenience?
You could, for example, have a single SurfaceView or TextureView that covers the entire screen, and then just render on the left and right sides with GLES. With the video output in a SurfaceTexture, you can render it however you like. The "texture from camera" activity in Grafika demonstrates various ways to manipulate images from a video source.
If you really want two TextureViews, you can have them. Use a single EGL context for the SurfaceTexture and both TextureViews, and just switch between EGL surfaces with eglMakeCurrent() when it's time to render.
In any event, you should be creating your own SurfaceTexture to receive the video, not using one that comes from a TextureView -- see e.g. this bit of code.
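The two-TextureView approach described above can be sketched roughly as follows. This is an outline only, assuming the EGL display/config/context (`mEglDisplay`, `mEglConfig`, `mEglContext`) have been set up elsewhere (e.g. with Grafika's EglCore helper), both TextureViews are already available, and `createExternalTextureId()` and `drawFrame()` are hypothetical helpers:

```java
// One EGL context, two window surfaces -- one per TextureView.
EGLSurface leftSurface = EGL14.eglCreateWindowSurface(mEglDisplay, mEglConfig,
        leftTextureView.getSurfaceTexture(), new int[]{EGL14.EGL_NONE}, 0);
EGLSurface rightSurface = EGL14.eglCreateWindowSurface(mEglDisplay, mEglConfig,
        rightTextureView.getSurfaceTexture(), new int[]{EGL14.EGL_NONE}, 0);

// Our own SurfaceTexture receives the decoder output as an external texture.
int texId = createExternalTextureId();               // glGenTextures + bind to GL_TEXTURE_EXTERNAL_OES
SurfaceTexture videoTexture = new SurfaceTexture(texId);
Surface decoderSurface = new Surface(videoTexture);  // pass this to MediaCodec.configure(...)

videoTexture.setOnFrameAvailableListener(st -> {
    // Same EGL context for both draws; only the target surface changes.
    EGL14.eglMakeCurrent(mEglDisplay, leftSurface, leftSurface, mEglContext);
    videoTexture.updateTexImage();                   // latch the new frame once
    drawFrame(texId);                                // draw a quad textured with the video frame
    EGL14.eglSwapBuffers(mEglDisplay, leftSurface);

    EGL14.eglMakeCurrent(mEglDisplay, rightSurface, rightSurface, mEglContext);
    drawFrame(texId);                                // same frame, second view
    EGL14.eglSwapBuffers(mEglDisplay, rightSurface);
});
```

Note that in real code the frame-available callback arrives on an arbitrary thread, so the EGL work would be posted to the render thread rather than done inline.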

Related

Is it possible to render two video streams simultaneously on a single SurfaceView?

Is it possible to render two video streams simultaneously, using different "sections" of a single SurfaceView?
I have made sample code that renders two videos simultaneously using two SurfaceViews side-by-side, but I am wondering if it is possible to have both videos play on the same SurfaceView.
Using a MediaPlayer, you can set either the SurfaceHolder or the Surface itself as the display. I believe the code to achieve what I am asking would be inside of the surfaceCreated method:
@Override
public void surfaceCreated(SurfaceHolder holder)
{
mediaPlayerTop.setDisplay(holder);
mediaPlayerBottom.setDisplay(holder);
play();
}
However, simply setting both MediaPlayers to the same Surface results in an IllegalStateException when you try to prepare the second MediaPlayer (this is ignoring the fact that they'd probably overlap each other anyway, because I am not setting the position anywhere).
Basically, is what I am trying to achieve possible?
Yes, but it takes some effort.
The basic plan is to direct the output of MediaPlayer to a SurfaceTexture, which converts each incoming frame to a GLES texture. You then render that to the SurfaceView, drawing a rect that fills half the view. You do the same thing for the other MediaPlayer.
The pieces you need can be found in Grafika, e.g. the "texture from camera" Activity takes a video stream from the camera preview, converts it to a GLES texture, and renders it to a SurfaceView.
Simply directing the output of two MediaPlayers to separate SurfaceViews is much easier, but less flexible.
Surfaces are endpoints in a producer-consumer pair. There can only be one producer at a time, so you can't simply direct two MediaPlayers at a single SurfaceView.
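The plan above can be sketched like this. It is only an outline, assuming a GLES render thread with EGL already current on the SurfaceView's Surface; `texIdTop`/`texIdBottom` are external texture ids created beforehand, and `drawTexturedRect()` is a hypothetical quad-drawing helper:

```java
// Each MediaPlayer gets its own SurfaceTexture-backed Surface,
// so there is exactly one producer per Surface.
SurfaceTexture topTexture = new SurfaceTexture(texIdTop);
SurfaceTexture bottomTexture = new SurfaceTexture(texIdBottom);

mediaPlayerTop.setSurface(new Surface(topTexture));       // instead of setDisplay(holder)
mediaPlayerBottom.setSurface(new Surface(bottomTexture));

// Per frame on the GL thread: latch the newest decoded frames,
// then draw each into half of the SurfaceView.
topTexture.updateTexImage();
bottomTexture.updateTexImage();

GLES20.glViewport(0, viewHeight / 2, viewWidth, viewHeight / 2);  // top half
drawTexturedRect(texIdTop);

GLES20.glViewport(0, 0, viewWidth, viewHeight / 2);               // bottom half
drawTexturedRect(texIdBottom);
```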

Android: Attach SurfaceTexture to FrameBuffer

I am performing a video effect that requires dual pass rendering (the texture needs to be passed through multiple shader programs). Attaching a SurfaceTexture to a GL_TEXTURE_EXTERNAL_OES that is passed in the constructor does not seem to be a solution, since the displayed result is only rendered once.
One solution I am aware of is that the first rendering can be done to a FrameBuffer, and then the resulting texture can be rendered to where it actually gets displayed.
However, it seems that a SurfaceTexture must be attached to a GL_TEXTURE_EXTERNAL_OES texture, and not a FrameBuffer. I'm not sure if there is a workaround around this, or if there is a different approach I should take.
Thank you.
SurfaceTexture receives a buffer of graphics data and essentially wraps it up as an "external" texture. If it helps to see source code, start in updateTexImage(). Note the name of the class ("GLConsumer") is a more accurate description of the function than "SurfaceTexture": it consumes frames of graphic data and makes them available to GLES.
SurfaceTexture is expected to work with formats that OpenGL ES doesn't "naturally" work with, notably YUV, so it always uses external textures.
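The consumer-side setup SurfaceTexture expects looks roughly like this (a sketch, assuming a thread with a current EGL context):

```java
// Generate a texture id and bind it as GL_TEXTURE_EXTERNAL_OES --
// this is the target SurfaceTexture requires, not GL_TEXTURE_2D.
int[] ids = new int[1];
GLES20.glGenTextures(1, ids, 0);
int texId = ids[0];

GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

SurfaceTexture surfaceTexture = new SurfaceTexture(texId);
// Each arriving frame is latched into texId by updateTexImage();
// in real code, post this call to the GL thread.
surfaceTexture.setOnFrameAvailableListener(st -> st.updateTexImage());
```

A fragment shader sampling this texture must declare `#extension GL_OES_EGL_image_external : require` and use a `samplerExternalOES` uniform instead of `sampler2D`.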

Android: Dual Pass Render To SurfaceTexture Using OpenGL

In order to perform a Gaussian blur on a SurfaceTexture, I am performing a dual pass render, meaning that I am passing the texture through one shader (horizontal blur) and then through another shader (vertical blur).
I understand the theory behind this: render the first texture to an FBO and the second one onto the SurfaceTexture itself.
There are some examples of this, but none of them seem applicable since a SurfaceTexture uses GL_TEXTURE_EXTERNAL_OES as its target in glBindTexture rather than GL_TEXTURE_2D. Therefore, in the call to glFramebufferTexture2D, GL_TEXTURE_2D cannot be used as the textarget, and I don't think GL_TEXTURE_EXTERNAL_OES can be used in this call.
Can anyone suggest a way to render a texture twice, with the final rendering going to a SurfaceTexture?
Important update: I am using a SurfaceTexture since this is a dynamic blur of a video that plays onto a surface.
Edit: This question was asked with some misunderstanding on my part. A SurfaceTexture is not a display element. It instead receives data from a surface, and is attached to a GL_TEXTURE_EXTERNAL_OES.
Thank you.
Rendering to a SurfaceTexture seems like an odd thing to do here. The point of SurfaceTexture is to take whatever is sent to the Surface and convert it into a GLES "external" texture. Since you're rendering with GLES, you can just use an FBO to render into a GL_TEXTURE_2D for your second pass.
SurfaceTexture is used when receiving frames from Camera or a video decoder because the source is usually YUV. The "external" texture format allows for a wider range of pixel formats, but constrains the uses of the texture. There's no value in rendering to a SurfaceTexture with GLES if your goal is to create a GLES texture.
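The FBO-based second pass suggested above can be sketched as follows. This is an outline under assumptions: GLES 2.0, a current EGL context, `videoExternalTexId` holding the latched video frame, and `drawQuad()` plus the two blur programs being hypothetical helpers:

```java
// Pass 1 target: an ordinary GL_TEXTURE_2D attached to an FBO.
int[] tex = new int[1];
int[] fbo = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0);   // legal: the attachment is a 2D texture

// Pass 1: sample the external (video) texture, write into the FBO texture.
drawQuad(horizontalBlurProgram, videoExternalTexId);

// Pass 2: back to the default framebuffer, sample the GL_TEXTURE_2D result.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
drawQuad(verticalBlurProgram, tex[0]);
```

Only the first pass touches GL_TEXTURE_EXTERNAL_OES; everything after it works with plain 2D textures, which is why glFramebufferTexture2D is never asked to accept an external texture.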

How to take snapshot of surfaceview?

I am working on H.264 video rendering in an Android application using SurfaceView. It has a feature to take a snapshot while rendering the video on the SurfaceView. Whenever I take a snapshot, I get a transparent/black screen only. I use the getDrawingCache() method to capture the screen, but it only returns null. I use the code below to capture the screen.
SurfaceView mSUrfaceView = new SurfaceView(this); //Member variable
if(mSUrfaceView!=null)
mSUrfaceView.setDrawingCacheEnabled(true); // After the video renders on the SurfaceView I enable the drawing cache
Bitmap bm = mSUrfaceView.getDrawingCache(); // return null
Unless you're rendering H.264 video frames in software with Canvas onto a View, the drawing-cache approach won't work (see e.g. this answer).
You cannot read pixels from the Surface part of the SurfaceView. The basic problem is that a Surface is a queue of buffers with a producer-consumer interface, and your app is on the producer side. The consumer, usually the system compositor (SurfaceFlinger), is able to capture a screen shot because it's on the other end of the pipe.
To grab snapshots while rendering video you can render video frames to a SurfaceTexture, which provides both producer and consumer within your app process. You can then render the texture for display with GLES, optionally grabbing pixels with glReadPixels() for the snapshot.
The Grafika app demonstrates various pieces, though none of the activities specifically solves your problem. For example, "continuous capture" directs the camera preview to a SurfaceTexture and then renders it twice (once for display, once for video encoding), which is similar to what you want to do. The GLES utility classes include a saveFrame() function that shows how to use glReadPixels() to create a bitmap.
See also the Android System-Level Graphics Architecture document.
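The glReadPixels() snapshot step can be sketched along the lines of Grafika's saveFrame(). This assumes EGL is current, the draw has completed (before swapBuffers), and the surface's width/height are known:

```java
// Read back the rendered frame as RGBA bytes.
ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
buf.order(ByteOrder.LITTLE_ENDIAN);
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
buf.rewind();

Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bmp.copyPixelsFromBuffer(buf);

// GLES rows are bottom-up; flip vertically before saving.
Matrix m = new Matrix();
m.preScale(1f, -1f);
Bitmap snapshot = Bitmap.createBitmap(bmp, 0, 0, width, height, m, false);
```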

How to record webview activity screen using Android MediaCodec?

I have the task to record user activity in a webview, in other words I need to create an mp4 video file while the user navigates in a webview. Pretty challenging :)
I found that Android 4.3 introduced an update to MediaCodec: it was expanded to include a way to provide input through a Surface (via the createInputSurface method). This allows input to come from a camera preview or from OpenGL ES rendering.
I even found an example where you can record a game written in OpenGL: http://bigflake.com/mediacodec/
My question is: how can I record a WebView's activity? I assume that if I could draw the WebView content to an OpenGL texture, then everything would be fine. But I don't know how to do this.
Can anybody help me on this?
Why not try WebView.onDraw first, instead of using OpenGL? The latter approach may be more complicated, and not supported by all devices.
Once you are able to obtain the screenshots, you can create the video from them (creating a video from an image sequence on Android is a separate task, where MediaCodec should help).
"I assume that If I could draw the webview content to opengl texture".
It is possible.
The SurfaceTexture is basically your entry point into the OpenGL layer. It is initialized with an OpenGL texture id, and performs all of its rendering onto that texture.
The steps to render your view to opengl:
1. Initialize an OpenGL texture.
2. Within an OpenGL context, construct a SurfaceTexture with the texture id. Use SurfaceTexture.setDefaultBufferSize(int width, int height) to make sure you have enough space on the texture for the view to render.
3. Create a Surface constructed with the above SurfaceTexture.
4. Within the View's onDraw, use the Canvas returned by Surface.lockCanvas to do the view drawing. You can obviously do this with any View, and not just WebView. Plus, Canvas has a whole bunch of drawing methods, allowing you to do funky, funky things.
The source code can be found here: https://github.com/ArtemBogush/AndroidViewToGLRendering and you can find some explanations here: http://www.felixjones.co.uk/neo%20website/Android_View/
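The steps above can be sketched as follows. This is an outline only, assuming a current EGL context and a texture id already bound as GL_TEXTURE_EXTERNAL_OES (see the linked sample for the full setup):

```java
// Steps 1-3: SurfaceTexture on our texture id, sized for the view,
// wrapped in a Surface we can draw into with a Canvas.
SurfaceTexture surfaceTexture = new SurfaceTexture(texId);
surfaceTexture.setDefaultBufferSize(viewWidth, viewHeight);
Surface surface = new Surface(surfaceTexture);

// Step 4: draw the view into the Surface's Canvas instead of the screen.
Canvas canvas = surface.lockCanvas(null);
try {
    webView.draw(canvas);   // any View works here, not just WebView
} finally {
    surface.unlockCanvasAndPost(canvas);
}
// After the post, updateTexImage() on the GL thread latches the frame into texId,
// where it can be drawn with GLES or fed to MediaCodec's input surface.
```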
