I have two SurfaceViews:
1. A SurfaceView displaying the MediaRecorder preview.
2. A MediaPlayer SurfaceView displaying the media recorded by the MediaRecorder.
I want to display both views simultaneously on the screen, Z ordered.
The MediaPlayer will be playing in full screen and the MediaRecorder preview should appear in the top right corner at a smaller size.
I am able to do this using two surfaces, but the issue is that the MediaRecorder preview always goes to the back of the Z order and gets hidden by the MediaPlayer's full-screen display.
Is there any way to define the Z order of a SurfaceView?
Or is there any other suggestion to make this work? Can I start both the MediaRecorder and the MediaPlayer on a single surface?
Please suggest. Thanks!
I want to display both views simultaneously on the screen, Z ordered.
AFAIK, that is not supported by Android. Android cannot composite multiple SurfaceViews. It can handle a regular View (e.g., Button) on top of a SurfaceView, but not two SurfaceViews Z ordered.
I recommend redesigning your application to have a single SurfaceView at a time.
As of Android 2.0 (API level 5), having two SurfaceViews is supported. You can set the Z order of the two SurfaceViews using setZOrderMediaOverlay(), although apparently it breaks the intended semantics of SurfaceView.
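A minimal sketch of that approach, assuming the two SurfaceViews already exist in the layout (the view IDs here are hypothetical):

SurfaceView playerView = (SurfaceView) findViewById(R.id.player_surface);     // full-screen MediaPlayer display
SurfaceView recorderView = (SurfaceView) findViewById(R.id.recorder_surface); // small MediaRecorder preview
// Leave the player surface at the default Z position and push the recorder
// preview into the "media overlay" layer, which sits above the default
// surface layer but still below the window's UI.
recorderView.setZOrderMediaOverlay(true);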
Related
I am creating a layout of type FrameLayout, in which I am adding two views. The two views are objects of GLSurfaceView and SurfaceView respectively. According to the Android Developers documentation regarding SurfaceView,
"The surface is Z ordered so that it is behind the window holding its SurfaceView; the SurfaceView punches a hole in its window to allow its surface to be displayed."
It works well for me and the SurfaceView always stays behind my GLSurfaceView (used for OpenGL drawings). But when resuming after an external event, the behavior is odd for the following configuration:
Android Version: 4.3
Device Model Number: Nexus 7
Kernel Version: 3.4.0.g1f57c39 (Jun 13)
Build Number: JWR66N
For this configuration, resuming after an external event puts my GLSurfaceView behind the SurfaceView. In other words, the SurfaceView is placed on top in the Z order and my OpenGL drawings are no longer visible. On versions greater than Android 4.3, this behavior is not seen.
I can replicate this behavior on all versions by calling the following SurfaceView method with true as the parameter:
void setZOrderOnTop(boolean onTop)
Is this a known issue? Can anybody help me with this?
Regards,
Sumedh
SurfaceViews have two parts, the Surface and the View. The Surface is a completely independent layer. The View is there so the UI layout code has something to work with. Generally the View is just transparent black, so you can see through to whatever is behind it.
GLSurfaceView is just SurfaceView with some code to manage EGL contexts and threading. Underneath it's just a SurfaceView. So if you have both a SurfaceView and a GLSurfaceView, and they have the same dimensions and Z-order, then one of them is going to "win" and the other is going to "lose" because they're trying to occupy the same space at the same time. There is no defined value for which one will "win", so inconsistent behavior is expected.
One way to avoid clashes is to leave one set to the default Z, and call setZOrderMediaOverlay() on the other. The "media overlay" is still behind the UI, but above the default Surface position. If you use setZOrderOnTop(), the Surface will be positioned above the UI as well.
The upper Surface will need to be rendered with transparent pixels if you want to see something behind it (the same way that the View needs to be transparent to see the Surface).
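If the upper layer is rendered in software, a rough sketch of clearing it to transparent might look like this (assuming overlayView is the SurfaceView that was promoted with setZOrderMediaOverlay(); a translucent pixel format is needed so the cleared area actually shows through):

overlayView.setZOrderMediaOverlay(true);
overlayView.getHolder().setFormat(PixelFormat.TRANSLUCENT);

SurfaceHolder holder = overlayView.getHolder();
Canvas canvas = holder.lockCanvas();
if (canvas != null) {
    // Clear to fully transparent so the lower surface stays visible,
    // then draw only the content this layer should contribute.
    canvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
    Paint paint = new Paint();
    paint.setColor(Color.RED);
    canvas.drawCircle(200f, 200f, 100f, paint);
    holder.unlockCanvasAndPost(canvas);
}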
The most efficient way to avoid this issue is to not have this issue: use one SurfaceView for everything, rendering all of your non-UI-element content to it. This requires a bit more work (and probably a SurfaceTexture) if you're rendering video or showing a camera preview on one of the Surfaces.
You can find some examples in Grafika. The "multi-surface exerciser" demonstrates three overlapping SurfaceViews rendered in software, overlapping with UI elements. Other activities show ways to work with Surfaces, GLES, the camera, and video.
See also the Android System-Level Graphics Architecture doc, which explains all this in much greater detail.
Don't set setZOrderOnTop to true. That will put the surface above all the other layouts.
If you are using multiple SurfaceViews, use this for each SurfaceView:
yourSurfaceView.setZOrderMediaOverlay(true);
Then set setZOrderOnTop to false for the SurfaceView you created later and want to send back behind the other SurfaceViews:
secondSurfaceview.setZOrderOnTop(false);
I have a special design requirement for the app I'm developing right now.
Right now, I have a third-party private video library which plays a video stream. The design of this screen includes a translucent panel overlaid on top of the video, blurring the portion of the video that lies behind.
Normally in order to blur the background, you are supposed to take a screenshot of the view behind, blur it and use it as an image for the foreground view.
In this case, the video keeps on playing, so the blurred image changes every frame. How would you implement this then?
A possible solution would be to create a thread that keeps taking screenshots, cropping them, and putting them as the background. Even better if that view is a SurfaceView, I guess. But I'm wondering what would be the best approach in this case. Would a thread that is continually taking screenshots create a huge performance impact? Is it possible to feed a SurfaceView buffer with these images?
Thanks!
A SurfaceView surface is a consumer of graphics buffers. You can't have two producers for one consumer, which means you can't send the video to it and draw on it at the same time.
You can have multiple layers; the SurfaceView surface is on a separate layer behind the View UI layer. So you could play the video to the SurfaceView's surface, and draw your blur rectangle on the SurfaceView's view. (Normally the SurfaceView's view is completely transparent, and is just used as a place-holder for layout purposes.)
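A minimal sketch of that first option, drawing the translucent panel in the View layer that sits above the video surface (here as a separate plain View stacked over the SurfaceView in the same FrameLayout; the class name and sizing are assumptions, and this gives a dimmed panel rather than a true blur):

public class TranslucentPanel extends View {
    private final Paint paint = new Paint();

    public TranslucentPanel(Context context) {
        super(context);
        paint.setColor(Color.argb(160, 0, 0, 0)); // semi-transparent dark panel
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // Cover only the lower half; the video surface behind the window
        // remains visible through the untouched (transparent) area.
        canvas.drawRect(0, getHeight() / 2f, getWidth(), getHeight(), paint);
    }
}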
Another option would be to render the video frame to a SurfaceTexture. You would then render that texture to the SurfaceView surface with GLES, and render the blur rectangle on top. You can find an example of treating live camera input as a GLES texture in Grafika ("texture from camera" activity). This has the additional advantage that, since you're not interacting with the View system -- the SurfaceView surface is composited by the system, not the app -- you can do it all on an independent thread.
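Sketching the SurfaceTexture route (this assumes an EGL context is already current on the rendering thread, and that the video library can render to a Surface the way android.media.MediaPlayer's setSurface() does; the names are illustrative):

int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
Surface videoSurface = new Surface(surfaceTexture);
mediaPlayer.setSurface(videoSurface);  // decoded frames now arrive as an external GLES texture

surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // On the GL thread: call st.updateTexImage(), draw the textured quad
        // to the SurfaceView surface, then draw the blur/overlay pass on top.
    }
});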
In any event, rendering, grabbing a screenshot, and re-rendering is going to be slower than the options described above.
For more details about why things work the way they do, see the Android System-Level Graphics architecture doc.
I'm trying to do camera recording with drawings on top, on Google Glass, using a LiveCard.
In a regular activity, this would be achieved by using a FrameLayout, with a SurfaceView for the camera preview 'in the back', and another View in front of it used for drawing.
But using a LiveCard, if one needs sub-second updates, one has to use the LiveCard itself as a Surface, according to the documentation: https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/timeline/LiveCard
If your application requires more frequent updates (several times per second) or rendering more elaborate graphics than the standard widgets support, enable direct rendering and add a SurfaceHolder.Callback to the card's surface.

LiveCard liveCard; // initialized elsewhere
liveCard.setDirectRenderingEnabled(true);
liveCard.getSurfaceHolder().addCallback(callback);

You can then draw directly on the surface inside a background thread or in response to external events (for example, sensor or location updates). Use the surfaceCreated and surfaceDestroyed methods to start and stop your rendering logic when the card is displayed or hidden.
Now I can either draw my own stuff on this Surface, or I can give it to the MediaRecorder as the camera preview surface, but I can't do both, as it will fail with an error.
I wonder if anyone has ideas on how to make this work?
The way I'd draw into the LiveCard myself is to 'manually' lock the canvas and call FrameLayout.draw(canvas). One option would be to have a layout that contains two SurfaceViews, one for the camera preview and one for my own drawings, and use the same approach. But even if I define such a layout in XML, I can't get the SurfaceViews created (e.g. the appropriate SurfaceView callbacks are never called, and any attempt to draw on them results in failure).
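For reference, a rough sketch of that manual "draw a view hierarchy into the LiveCard surface" approach (liveCard and frameLayout are assumed to already exist; the measure/layout step is an assumption so the layout has a size before drawing):

SurfaceHolder holder = liveCard.getSurfaceHolder();
Canvas canvas = holder.lockCanvas();
if (canvas != null) {
    frameLayout.measure(
            View.MeasureSpec.makeMeasureSpec(canvas.getWidth(), View.MeasureSpec.EXACTLY),
            View.MeasureSpec.makeMeasureSpec(canvas.getHeight(), View.MeasureSpec.EXACTLY));
    frameLayout.layout(0, 0, canvas.getWidth(), canvas.getHeight());
    frameLayout.draw(canvas);   // renders the ordinary Views; child SurfaceViews won't show up this way
    holder.unlockCanvasAndPost(canvas);
}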
I'm trying to create an Android app based on camera.
I want to split my screen into 2 or 4 views. (When 2 views are shown, one will be above and one will be below. When 4 views are shown, 2 will be on top, side by side, and 2 will be at the bottom, side by side.)
I want to show what the camera is seeing in all the views.
I mean all the views must render according to the camera.
Is it possible? How?
This is not practical AFAIK. The camera will draw to only one SurfaceView, not four. While you could try to make a mock SurfaceView that then passes the information to four separate SurfaceViews, your performance probably will be awful.
The only way I see this being possible is having the setPreviewCallback() method take those frames, converting them to bitmaps or JPEGs, and then drawing those images on the other SurfaceViews. I'm with CommonsWare, though. It will be slow and painful.
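A rough sketch of that callback approach, with the usual caveat that it will be slow (extraViews, holding the additional SurfaceViews, is a hypothetical field; this uses the old android.hardware.Camera API):

camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        // Default preview format is NV21; go NV21 -> JPEG -> Bitmap.
        Camera.Size size = cam.getParameters().getPreviewSize();
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, out);
        byte[] jpeg = out.toByteArray();
        Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);

        // Draw the same frame onto each of the other SurfaceViews.
        for (SurfaceView view : extraViews) {
            Canvas canvas = view.getHolder().lockCanvas();
            if (canvas != null) {
                canvas.drawBitmap(frame, null,
                        new Rect(0, 0, canvas.getWidth(), canvas.getHeight()), null);
                view.getHolder().unlockCanvasAndPost(canvas);
            }
        }
    }
});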
I've actually done that in my app, Face Costume.
What I did was use the camera preview buffer, convert it from YUV to RGB, and render it on an OpenGL texture. You don't have to choose OpenGL; you can also do it using the regular Canvas method.
On dual-core tablets it works well; I haven't tested it on other hardware.
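For illustration, the texture-upload step of that approach might look roughly like this (rgbFrame is assumed to be a Bitmap already converted from the YUV preview buffer, and this is assumed to run on the GL thread of a GLSurfaceView.Renderer):

int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, rgbFrame, 0);
// ...then draw a textured quad once per on-screen view, changing the viewport each time.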
How do I write code which lays out UI elements (Buttons, etc.) over the camera preview on Android?
Put the SurfaceView in a container that allows for Z-axis layering, such as RelativeLayout or FrameLayout. Put the things to appear on top of the SurfaceView later in the XML -- later children of a parent will draw over top of earlier children in the parent.
Here is a project using a SurfaceView for video playback that demonstrates the technique. The same concepts should hold for a SurfaceView for camera preview.
Bear in mind that extra work needs to be done by Android to blend your widgets with the preview, so your preview frame rate may drop.
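A minimal programmatic sketch of the same layering, built inside an Activity's onCreate (the XML approach works identically, since later children draw on top of earlier ones; the button is just a placeholder):

FrameLayout root = new FrameLayout(this);

SurfaceView preview = new SurfaceView(this);   // camera preview is rendered here
root.addView(preview, new FrameLayout.LayoutParams(
        FrameLayout.LayoutParams.MATCH_PARENT, FrameLayout.LayoutParams.MATCH_PARENT));

Button capture = new Button(this);             // added later, so it draws on top of the preview
capture.setText("Capture");
root.addView(capture, new FrameLayout.LayoutParams(
        FrameLayout.LayoutParams.WRAP_CONTENT, FrameLayout.LayoutParams.WRAP_CONTENT,
        Gravity.BOTTOM | Gravity.CENTER_HORIZONTAL));

setContentView(root);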