I'm using libvlc in an Android app to play a network stream; the native code draws the image on a SurfaceView.
Let's assume the video stops and I have no access to the native code; how can I detect whether the SurfaceView is still updating (i.e. the video is not frozen)?
I tried getViewTreeObserver().addOnDrawListener() and getViewTreeObserver().addOnPreDrawListener(), but they do not have the effect I'm looking for.
Thanks
You can't get that information from the SurfaceView, because the SurfaceView itself does not know.
The SurfaceView's job is to set up the Surface, and create a hole in the View layout that you can see through. Once it has done these things, it is no longer actively involved in the process of displaying video. The content flows from the decoder to the Surface, which is managed by SurfaceFlinger (the system graphics compositor).
This is, for example, how DRM video works. The app "playing" the video has no access to DRM-protected video frames. (Neither does SurfaceFlinger, actually, but that's a longer story.)
The best way to know if content is still arriving is to ask the video source if it is still sending you content. Another approach would be to change your SurfaceView to a TextureView, and provide an onSurfaceTextureUpdated() callback method.
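If you switch to a TextureView, a rough sketch of that callback might look like this (videoTextureView and the surrounding setup are placeholders; how you act on the timestamp, e.g. a periodic freshness check, is up to you):

videoTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    private volatile long lastFrameNanos;

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        // Hand new Surface(st) to the player here.
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        return true;  // let the SurfaceTexture be released
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) {
        // Called once per new frame; if this stops firing, the video is frozen.
        lastFrameNanos = System.nanoTime();
    }
});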
I am not sure exactly what you are trying to achieve here, but you can track the state of the SurfaceView's surface by implementing the SurfaceHolder.Callback interface, which gives you access to the following methods:
surfaceCreated() - This is called immediately after the surface is first created.
surfaceChanged() - This is called immediately after any structural changes (format or size) have been made to the surface.
surfaceDestroyed() - This is called immediately before the surface is destroyed.
Take a look at the documentation for SurfaceView and SurfaceHolder.
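A minimal sketch of wiring up that callback (the view id and surrounding Activity code are placeholders); note that these callbacks report surface lifecycle, not whether new frames are still arriving:

SurfaceView surfaceView = (SurfaceView) findViewById(R.id.surface_view);  // placeholder id
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Surface exists; safe to hand holder.getSurface() to the player.
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // Format or size changed.
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Surface is about to go away; stop rendering into it.
    }
});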
I have a Service running in the background which is started by an activity. The service itself is a background camera recorder: it starts Camera2 and writes to a Surface provided by the MediaRecorder.
When the Activity is running, I also want to show the live stream alongside the background recording. So far, I have been creating a SurfaceView in the Activity and passing it as a target to Camera2 when the surface is created by the Activity. But I have to reinitialize the Camera2 API each time the Surface gets destroyed (i.e. the Activity goes to the background). Is this the right approach to solve this problem? Is it possible for the Service to own the SurfaceView and pass a reference to the Surface back to the Activity so that it can display the live feed without reinitializing the camera device?
Not directly; the SurfaceView has to live in the app process because it's closely tied to the app's UI state (and as you've noticed, it gets torn down every time the app goes into the background). So it can't live in the service.
However, you could have the Service receive camera frames and resend them to the app when it's in the foreground. For example, you could use an ImageReader to read YUV frames from the camera, and then write those (with an ImageWriter) to a Surface the app provides from its own ImageReader (or SurfaceView, if you can do the YV12 bit described below). Then the camera doesn't need to be reconfigured when the app appears or disappears; the service simply drops the frames when the app isn't there to receive them.
That does require the app to draw its own YUV frames, which is annoying, since there's unfortunately no requirement that a TextureView or SurfaceView will accept YUV_420_888 buffers. If your camera device supports YV12 output, then that can be written directly to a SurfaceView or TextureView; otherwise, you'll probably need to implement some custom translation. Most efficiently that can be done in JNI code, where you can use the NDK ANativeWindow APIs to lock the Surface from a SurfaceView, set its format to YV12, and write the YUV_420_888 image data into it (translating pixel stride/row stride as needed).
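A rough sketch of the service-side forwarding (field names, sizes, and helper methods here are placeholders; threading and error handling are omitted, and this assumes the app-provided Surface comes from an ImageReader with a matching YUV_420_888 configuration):

// Fields in the Service (simplified):
ImageWriter appWriter;       // set when the Activity supplies its Surface
ImageReader serviceReader;   // camera output target owned by the Service

void setUpReader(int width, int height, Handler backgroundHandler) {
    serviceReader = ImageReader.newInstance(width, height,
            ImageFormat.YUV_420_888, /*maxImages=*/4);
    serviceReader.setOnImageAvailableListener(reader -> {
        Image image = reader.acquireLatestImage();
        if (image == null) return;
        if (appWriter != null) {
            appWriter.queueInputImage(image);  // forwards the frame and closes the Image
        } else {
            image.close();  // app not visible: just drop the frame
        }
    }, backgroundHandler);
    // serviceReader.getSurface() is what you hand to the Camera2 capture session.
}

void onAppSurfaceAvailable(Surface appSurface) {
    appWriter = ImageWriter.newInstance(appSurface, /*maxImages=*/4);
}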
I know that TextureView showed up in ICS (Android 4.0),
but SurfaceView was not deprecated in ICS.
SurfaceView has a hole-punching structure, so it has many limitations:
you can't stack two SurfaceViews, you can't translate one, etc.
Why is SurfaceView not deprecated even though TextureView exists?
SurfaceView is faster, and can handle DRM-protected video.
The hole-punching structure is necessary because SurfaceView's Surface is handled directly by the system compositor. For TextureView, you draw on a Surface, which is converted to a GL texture within the app, which is rendered a second time by the app onto the View layer. So there's an extra copy.
For DRM-protected video, no user or system code -- not even the Linux kernel -- is allowed to see unencrypted pixels. Only the video decoder and the display hardware. Because SurfaceView just forwards references through, and doesn't touch the actual data, this works.
For more details, see the graphics architecture doc.
I'm confused and would appreciate some comments on this. I was assuming that WebView creates a separate surface to draw into and does not use the activity's default surface. But in the SurfaceFlinger dump, I don't see a new surface getting created when using WebView.
When I do a similar experiment with VideoView, I do see a separate surface getting created.
In WebView too, I wanted to play a video, so I assumed a separate surface would be created and that its resolution would match the video resolution. But if it uses the application's surface, then the maximum resolution of the video is limited to the UI resolution.
In the Chromium code, I see code for a separate surface, but in practice I could not see one getting created.
Can someone help me clarify this?
Thank You.
If you look at the VideoView inheritance graph, you'll notice that it inherits from SurfaceView, while WebView does not, so WebView could only achieve that by creating an external SurfaceView.
And if you search for usages of ExternalVideoSurface in the WebView part of the Chromium code, you will notice that it is only enabled when "video hole" is enabled, which is intended solely for decoding encrypted videos, where WebView needs to do "hole punching". There is a system-API-level setting in WebView that enables this behaviour, but it has its own limitations and is thus not recommended for general use.
I am also curious why WebView does not show up in the SurfaceFlinger dump.
I think the reason is that WebView renders into the associated activity's native window, so there is no separate surface in this situation.
But the situation seems to differ in the latest Android and WebView versions, depending on developer options.
I am trying to decode video samples using the MediaCodec API. I am using a SurfaceView to show the rendered samples. If I press the home button, the app goes into the paused state and the surface is destroyed. When I come back to the resumed state, a new SurfaceView reference is created, but the decoder is unable to pump samples to the SurfaceView, so the screen appears black.
video configure:
videoDecoder.configure(format, surface, null, 0);
So how can I reconfigure videoDecoder in the statement above? It is similar to the following problem:
How to keep decoding alive during screen orientation?
The MediaCodec API does not currently (API 19) provide a way to replace the output Surface.
As in the other question you refer to, I think the way to deal with this will be to decode to a Surface that isn't tied to the view hierarchy (and, hence, doesn't get torn down when the Activity is destroyed).
If you direct the output of the MediaCodec to a SurfaceTexture, you can then render that texture onto the SurfaceView. This will require a bit of GLES code. You can find the necessary pieces in the Grafika sources, but there isn't currently a full implementation of what you want (e.g. PlayMovieActivity decodes video to a SurfaceTexture, but that ST is part of a TextureView, which will get torn down).
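A very rough sketch of the idea (texture creation and the actual GLES draw are omitted; createOesTexture() is an assumed helper, and videoDecoder/format are from the question):

// Decode into an app-managed SurfaceTexture instead of the SurfaceView's
// Surface, so the decoder's output target survives Activity teardown.
int texId = createOesTexture();  // assumed helper: GL_TEXTURE_EXTERNAL_OES texture on your own EGL context
SurfaceTexture decoderTexture = new SurfaceTexture(texId);
Surface decoderSurface = new Surface(decoderTexture);
videoDecoder.configure(format, decoderSurface, null, 0);

decoderTexture.setOnFrameAvailableListener(st -> {
    // On the GL thread: latch the new frame, then render texId with GLES
    // onto whatever display Surface currently exists (e.g. the SurfaceView's).
    st.updateTexImage();
    // ... draw ...
});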
The additional rendering step will increase the GPU load, and won't work for DRM-protected video. For most devices and apps this won't matter.
See also the bigflake examples.
Update: I've added this to Grafika, with a twist. See the "Double decode" example. The output goes to a SurfaceTexture associated with a TextureView. If the screen is rotated (or, currently, blanked by hitting the power button), decoding continues. If you leave the activity with the "back" or "home" button, decoding stops. It works by retaining the SurfaceTexture, attaching it to the new TextureView.
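The retention trick looks roughly like this (a sketch of the pattern rather than Grafika's exact code; the other SurfaceTextureListener methods and the startDecoding() helper are omitted or assumed):

// Keep the SurfaceTexture alive across TextureView teardown, then re-attach
// it to the newly created TextureView so decoding never has to stop.
private SurfaceTexture savedSurfaceTexture;

@Override
public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
    if (savedSurfaceTexture == null) {
        savedSurfaceTexture = st;
        startDecoding(new Surface(st));   // assumed helper that feeds MediaCodec
    } else {
        // New TextureView after rotation: reuse the retained SurfaceTexture.
        textureView.setSurfaceTexture(savedSurfaceTexture);
    }
}

@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
    // Returning false means we keep the SurfaceTexture and release it ourselves later.
    return false;
}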
As you may notice, the camera in Android phones stops working when we minimize the app (for example when we start a new application). My question is: is there any way to create an app with the Android camera which keeps recording even when minimized, so it could be recording video while we do something else on the phone? Or maybe it is only possible without using MediaStore? If you share some links or code which might help me, I'll be grateful. Thanks in advance.
I believe the answer to this is that one must use
public final void setPreviewTexture (SurfaceTexture surfaceTexture)
Added in API level 11
Sets the SurfaceTexture to be used for live preview. Either a surface or surface texture is necessary for preview, and preview is necessary to take pictures.
from https://developer.android.com/reference/android/hardware/Camera.html. And
from https://developer.android.com/reference/android/graphics/SurfaceTexture.html :
The image stream may come from either camera preview or video decode. A SurfaceTexture may be used in place of a SurfaceHolder when specifying the output destination of a Camera or MediaPlayer object. Doing so will cause all the frames from the image stream to be sent to the SurfaceTexture object rather than to the device's display.
and I would really like to try this and send you some code, but I have no phone more recent than Gingerbread, and this was introduced with Honeycomb.
Using a surface associated with an Activity, surfaceDestroyed is called sometime between onPause and onStop when the Activity is being minimised (although, oddly, not when the phone is being put to sleep: How SurfaceHolder callbacks are related to Activity lifecycle?). But I hope that a SurfaceTexture is not destroyed in this way.
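A minimal, untested sketch of the idea (old android.hardware.Camera API, API 11+; the texture name, output path, and profile are illustrative only, and exception handling is reduced to a throws clause):

// Preview into an off-screen SurfaceTexture instead of a SurfaceView, so the
// preview target is not tied to any Activity view hierarchy.
void startBackgroundRecording() throws IOException {
    SurfaceTexture offscreenTexture = new SurfaceTexture(42);  // arbitrary GL texture name
    Camera camera = Camera.open();
    camera.setPreviewTexture(offscreenTexture);
    camera.startPreview();

    MediaRecorder recorder = new MediaRecorder();
    camera.unlock();
    recorder.setCamera(camera);
    recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    recorder.setOutputFile("/sdcard/background_recording.mp4");  // placeholder path
    recorder.prepare();
    recorder.start();
}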