extract Bitmap from SurfaceView [duplicate]

My team is working on an Android project where we use MediaCodec to decode a video feed and render it onto a Surface. The difficulty I am having now is taking a screenshot of the SurfaceView. Could anyone please help?
The activity implements SurfaceHolder.Callback.
Then we created a SurfaceView and added the callback:
mySurfaceView.getHolder().addCallback(this);
Then we start decoding the video feed (we have a method that does the decoding) in
surfaceChanged(SurfaceHolder holder, int format, int width, int height)
The video works fine. Now I want to take a screenshot of what gets rendered onto the SurfaceView. I have tried several ways to do this, but none succeeded. Here is what I have tried:
I have tried getting the root view's drawing cache.
Tried getting the frame directly from MediaCodec.
Tried the draw() method to draw on a canvas.
As of now I am trying to create a custom SurfaceView and override its onDraw method. However, I have never created a custom SurfaceView before. Am I headed in the right direction?
Help please. Thank you!

You can't capture a SurfaceView, as explained in this other question.
SurfaceViews will rarely want to override onDraw(). Generally the View part of the SurfaceView is a transparent hole.
For the specific case of MediaCodec output, you have a couple of choices. You can direct the output to a SurfaceTexture, and then render the video frame twice with OpenGL ES: once to the SurfaceView for display, and once to an off-screen buffer from which you can capture the pixels.
Another approach is to replace the SurfaceView with a TextureView, and then use the getBitmap() call to grab video frames.
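If you go the TextureView route, the capture itself is nearly a one-liner. A minimal sketch, assuming textureView is a field holding the TextureView that displays the decoded video:

    // TextureView.getBitmap() copies the currently displayed video frame
    // into a Bitmap ("textureView" is an assumed field name).
    if (textureView.isAvailable()) {
        Bitmap frame = textureView.getBitmap();
        // save or process the screenshot here
    }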
Various examples can be found in Grafika.

Related

Is it possible to render two video streams simultaneously on a single SurfaceView?

Is it possible to render two video streams simultaneously, using different "sections" of a single SurfaceView?
I have made sample code that renders two videos simultaneously using two SurfaceViews side-by-side, but I am wondering if it is possible to have both videos play on the same SurfaceView.
Using a MediaPlayer, you can set either the SurfaceHolder or the Surface itself as the display. I believe the code to achieve what I am asking would go inside the surfaceCreated() method:
@Override
public void surfaceCreated(SurfaceHolder holder)
{
    mediaPlayerTop.setDisplay(holder);
    mediaPlayerBottom.setDisplay(holder);
    play();
}
However, simply setting both MediaPlayers to the same Surface results in an IllegalStateException when you try to prepare the second MediaPlayer (ignoring the fact that they would probably overlap each other anyway, since I am not setting the position anywhere).
Basically, is what I am trying to achieve possible?
Yes, but it takes some effort.
The basic plan is to direct the output of MediaPlayer to a SurfaceTexture, which converts each incoming frame to a GLES texture. You then render that to the SurfaceView, drawing a rect that fills half the view. You do the same thing for the other MediaPlayer.
The pieces you need can be found in Grafika, e.g. the "texture from camera" Activity takes a video stream from the camera preview, converts it to a GLES texture, and renders it to a SurfaceView.
Simply directing the output of two MediaPlayers to separate SurfaceViews is much easier, but less flexible.
Surfaces are endpoints in a producer-consumer pair. There can only be one producer at a time, so you can't simply direct two MediaPlayers at a single SurfaceView.
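A rough sketch of that setup, run on the GLES render thread. The mediaPlayerTop/mediaPlayerBottom names are carried over from the question; the EGL and shader boilerplate around this is omitted:

    // Each MediaPlayer gets its own SurfaceTexture-backed Surface.
    int[] tex = new int[2];
    GLES20.glGenTextures(2, tex, 0);
    SurfaceTexture topTexture = new SurfaceTexture(tex[0]);
    SurfaceTexture bottomTexture = new SurfaceTexture(tex[1]);
    mediaPlayerTop.setSurface(new Surface(topTexture));
    mediaPlayerBottom.setSurface(new Surface(bottomTexture));

    // Per frame: latch the newest image from each stream, then draw two
    // textured rects, one into each half of the SurfaceView's EGL surface.
    topTexture.updateTexImage();
    bottomTexture.updateTexImage();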

Use opengl es to render video to SurfaceView but concerned about more overhead

I wrote a video play view: a SurfaceView with a MediaPlayer instance. I attach the MediaPlayer to the SurfaceHolder when the surface is created, then start playback.
This is easy and everyone knows the details. But I want to draw a bitmap, the first frame of the video, onto the SurfaceView first. Canvas is not an option for drawing the bitmap, because locking a Canvas on the Surface prevents the MediaPlayer from connecting to it.
Since API level 14, we can create a Surface from a SurfaceTexture, so we can use OpenGL ES to draw both video frames and bitmaps. But I am concerned about the performance. This way of playing video is more complicated; will it cause more overhead? Can anyone give me some advice?
You have a few options:
1. Use a FrameLayout to put a custom View (or maybe just an ImageView) on top of the SurfaceView. Draw your content there. When video playback starts, hide the View.
2. Connect GLES, draw the first frame, disconnect GLES, connect the MediaPlayer, play the movie. This is essentially what Grafika's PlayMovieSurfaceActivity does to clear the screen to black (see clearSurface()) before playing a movie.
3. As noted in your question, you can send the video to a SurfaceTexture, and then choose to render your content or render the image from the texture onto the SurfaceView.
#1 is the easiest. #3 adds complexity and is more expensive.
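For option #1, a minimal sketch, assuming firstFrameBitmap already holds the frame you want to show (all names here are illustrative):

    // Stack an ImageView over the SurfaceView inside a FrameLayout.
    FrameLayout frame = new FrameLayout(context);
    SurfaceView surfaceView = new SurfaceView(context);
    ImageView firstFrame = new ImageView(context);
    firstFrame.setImageBitmap(firstFrameBitmap);
    frame.addView(surfaceView);
    frame.addView(firstFrame); // later children draw on top

    // Hide the overlay once real video frames start rendering (API 17+).
    mediaPlayer.setOnInfoListener((mp, what, extra) -> {
        if (what == MediaPlayer.MEDIA_INFO_VIDEO_RENDERING_START) {
            firstFrame.setVisibility(View.GONE);
        }
        return true;
    });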

Render Android MediaCodec output on two views for VR Headset compatibility

What I know so far is that I need to use a SurfaceTexture that can be rendered on two TextureViews simultaneously.
So it will be:
MediaCodec -> SurfaceTexture -> 2x TextureViews
But how do I get a SurfaceTexture programmatically to use with MediaCodec? As far as I know, a new SurfaceTexture is created for every TextureView, so if I have two TextureViews in my activity, I will get two SurfaceTextures!? That's one too many... ;)
Or is there any other way to render the MediaCodec Output to a screen twice?
Do you actually require two TextureViews, or is that just for convenience?
You could, for example, have a single SurfaceView or TextureView that covers the entire screen, and then just render on the left and right sides with GLES. With the video output in a SurfaceTexture, you can render it however you like. The "texture from camera" activity in Grafika demonstrates various ways to manipulate the image from a video source.
If you really want two TextureViews, you can have them. Use a single EGL context for the SurfaceTexture and both TextureViews, and just switch between EGL surfaces with eglMakeCurrent() when it's time to render.
In any event, you should be creating your own SurfaceTexture to receive the video, not using one that comes from a TextureView -- see e.g. this bit of code.
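A hedged sketch of creating that app-owned SurfaceTexture; mediaCodec and videoFormat are assumed to exist already, and this must run where an EGL context is current:

    // Generate a GL texture and wrap it in our own SurfaceTexture.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
    SurfaceTexture videoTexture = new SurfaceTexture(tex[0]);
    videoTexture.setOnFrameAvailableListener(st -> { /* schedule a render pass */ });

    // Hand the decoder a Surface wrapping our SurfaceTexture,
    // not one that came from a TextureView.
    Surface decoderSurface = new Surface(videoTexture);
    mediaCodec.configure(videoFormat, decoderSurface, null, 0);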

How to record webview activity screen using Android MediaCodec?

I have the task of recording user activity in a WebView; in other words, I need to create an mp4 video file while the user navigates in a WebView. Pretty challenging :)
I found that Android 4.3 expanded MediaCodec to include a way to provide input through a Surface (via the createInputSurface() method). This allows input to come from the camera preview or from OpenGL ES rendering.
I even found an example where you can record a game written in OpenGL: http://bigflake.com/mediacodec/
My question is: how can I record WebView activity? I assume that if I could draw the WebView content to an OpenGL texture, then everything would be fine. But I don't know how to do this.
Can anybody help me on this?
Why not try WebView.onDraw first, instead of using OpenGL? The latter approach may be more complicated, and is not supported by all devices.
Once you are able to obtain the screenshots, you can create the video; making a video from an image sequence on Android is a separate task where MediaCodec should help.
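A minimal sketch of that first step, assuming webView is laid out and has non-zero dimensions:

    // Render the WebView's current content into an off-screen Bitmap.
    Bitmap shot = Bitmap.createBitmap(webView.getWidth(), webView.getHeight(),
            Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(shot);
    webView.draw(canvas); // same path WebView.onDraw uses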
"I assume that If I could draw the webview content to opengl texture".
It is possible.
The SurfaceTexture is basically your entry point into the OpenGL layer. It is initialized with an OpenGL texture id, and performs all of its rendering onto that texture.
The steps to render your view to OpenGL:
1. Initialize an OpenGL texture.
2. Within an OpenGL context, construct a SurfaceTexture with the texture id. Use SurfaceTexture.setDefaultBufferSize(int width, int height) to make sure you have enough space on the texture for the view to render.
3. Create a Surface constructed with the above SurfaceTexture.
4. Within the View's onDraw, use the Canvas returned by Surface.lockCanvas to do the view drawing. You can obviously do this with any View, and not just WebView. Plus Canvas has a whole bunch of drawing methods, allowing you to do funky, funky things.
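A condensed sketch of the four steps (names are illustrative, and this must run where an OpenGL ES context is current):

    // 1. Initialize an OpenGL texture.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

    // 2. Wrap it in a SurfaceTexture sized for the view.
    SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
    surfaceTexture.setDefaultBufferSize(webView.getWidth(), webView.getHeight());

    // 3. Create a Surface from the SurfaceTexture.
    Surface viewSurface = new Surface(surfaceTexture);

    // 4. Draw the view through the Surface's Canvas.
    Canvas canvas = viewSurface.lockCanvas(null);
    try {
        webView.draw(canvas);
    } finally {
        viewSurface.unlockCanvasAndPost(canvas);
    }
    surfaceTexture.updateTexImage(); // latch the frame into the GL texture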
The source code can be found here: https://github.com/ArtemBogush/AndroidViewToGLRendering and some explanations here: http://www.felixjones.co.uk/neo%20website/Android_View/

Android: get a frame of a video?

I have a video, and I need to get some of its frames.
I used to do the following: create a standard Bitmap with the size of the video in question, create a Canvas, and set it to draw into the Bitmap.
I use a SurfaceView and a SurfaceHolder. A MediaPlayer is drawing on the SurfaceView, and I have a method that calls surfaceView.draw(canvas), which draws on the canvas, which draws into the bitmap, which I eventually take and use...
My problem is that 60% of the time I get black frames. The MediaPlayer plays its content in a separate thread, and I do not know when the video has started and when it has not, so I believe this is why I am getting black frames.
I need a workaround, a fix, or another method to get the video frame.
Thanks in advance.
You might try looking at the source of the Gallery app, or wherever its video thumbnails come from if they are created externally.
How about using MediaMetadataRetriever's getFrameAtTime()? For lower API levels, use this.
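A minimal sketch of the MediaMetadataRetriever approach (videoPath is illustrative; the time argument is in microseconds):

    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    try {
        retriever.setDataSource(videoPath);
        // OPTION_CLOSEST_SYNC is fast but snaps to the nearest sync frame;
        // use OPTION_CLOSEST for an exact (slower) seek.
        Bitmap frame = retriever.getFrameAtTime(2_000_000,
                MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
        // use the frame here...
    } finally {
        retriever.release();
    }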
