I have an android app with a RelativeLayout, to which I programmatically add multiple VideoViews, all of which are playing at the same time. The VideoViews can be moved around by dragging, and they can be moved so that they overlap.
The problem is that I cannot find a way to change their z-order at runtime. .bringToFront() brings the view to the front for receiving touch events, but the order of the actual videos remains the same.
I have tried removing and re-adding the VideoView with .removeView()/.addView(), but that stops and resets the video.
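Roughly what I've tried, simplified (container is the RelativeLayout, videoView one of the playing views):

void tryReorder(RelativeLayout container, VideoView videoView) {
    // Only changes which view gets touch events first; the video surface
    // underneath keeps its old order.
    videoView.bringToFront();

    // Changes the order, but playback stops and the VideoView resets.
    container.removeView(videoView);
    container.addView(videoView);
}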
Is there a way to change the z-order of the videos? A "hacky" way would work too, e.g. removing and re-adding the view without stopping and resetting the video.
Thanks
P.S. It's a streaming video, so restarting the video at the same position would not work; the delay would be far too big.
EDIT: Even though people suggest multiple VideoViews should not work, they do work great, even overlapping, at least on my Nexus S, ICS 4.0.3
EDIT 2: Actually, not great: the second video sometimes flickers, but that's fine for this purpose. The 3rd and 4th videos don't flicker, which is a bit weird.
The documentation says the Z-order of SurfaceViews is determined before they are attached to the window, and their order is not altered once attached. Since a VideoView is also a SurfaceView, I don't think it is possible to do this.
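If deciding the order once, before the views are attached, is enough, something along these lines might work; setZOrderMediaOverlay() is documented to have no effect after the view is attached to its window (untested sketch, context and container are placeholders):

void addVideoViews(Context context, RelativeLayout container) {
    // setZOrderMediaOverlay() only takes effect before the view is attached,
    // so the order must be decided up front and cannot change later.
    VideoView front = new VideoView(context);
    front.setZOrderMediaOverlay(true);   // composited above other media surfaces
    container.addView(front);

    VideoView back = new VideoView(context);
    container.addView(back);             // stays in the default, lower media layer
}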
VideoView uses SurfaceView, which is Z-ordered with the window, but you cannot have multiple SurfaceViews z-ordered against each other, because every SurfaceView punches a hole in the window. If you add multiple videos in the same area, they will not play correctly. I used to translate the video off the screen to keep its state. Look for the EcoNews app on Google Play.
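The idea is roughly this: instead of removing the view, push it out of the visible area so its player keeps running (a sketch; the margin handling is just illustrative):

// Park the VideoView outside the visible area instead of removing it,
// so its MediaPlayer keeps playing and keeps its position.
void moveOffScreen(VideoView videoView, int screenWidth) {
    RelativeLayout.LayoutParams lp =
            (RelativeLayout.LayoutParams) videoView.getLayoutParams();
    lp.leftMargin = screenWidth;   // push it past the right edge
    videoView.setLayoutParams(lp);
}

void moveBack(VideoView videoView, int originalLeftMargin) {
    RelativeLayout.LayoutParams lp =
            (RelativeLayout.LayoutParams) videoView.getLayoutParams();
    lp.leftMargin = originalLeftMargin;
    videoView.setLayoutParams(lp);
}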
Related
In a fragment I have a TextureView displaying the camera preview, and on top of that I draw several other views.
My goal is to record a short video of what the user sees (all views) or save several screenshots to compile later into a video.
Since I don't want any disclaimer to show up, nor the Intent associated with it, I don't want to use MediaProjection.
I've tried many things, but they either don't work or screenshot/record all views except the TextureView, which turns out black in the output. Note that I don't wish to use MediaRecorder either, because it would only let me record the TextureView, and I want all of the content to be recorded/screenshotted.
I understand that this is the reason TextureView comes out black.
I have actually managed to get screenshots with the PixelCopy API, particularly this call, but its minimum SDK version is 26, and I need a solution that works down to SDK 24; otherwise it would be an option for me... Also, the ideal scenario would be getting a video directly rather than frames to assemble into a video later.
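For reference, the call I'm using looks roughly like this (the window-based PixelCopy overload, which is the API 26+ part; view and bitmap handling simplified):

// Copies whatever the window currently shows (TextureView included) into a Bitmap.
void captureScreen(Activity activity, View rootView) {
    Bitmap bitmap = Bitmap.createBitmap(
            rootView.getWidth(), rootView.getHeight(), Bitmap.Config.ARGB_8888);
    PixelCopy.request(activity.getWindow(), bitmap, copyResult -> {
        if (copyResult == PixelCopy.SUCCESS) {
            // use the bitmap (save it, queue it as a video frame, ...)
        }
    }, new Handler(Looper.getMainLooper()));
}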
So, can anyone point out a better way of doing this? I'm currently not seeing any alternatives...
Again, I want to give the user a small video of the entire screen display (all views).
Thanks a lot in advance!
I'm using Android's MediaPlayer, which I want to render in two different TextureViews of different sizes. I use the same MediaPlayer for both, calling mediaPlayer.setSurface(surface) to switch from one to the other. When my app switches between them, however, small videos are not displayed properly. They look like this, where the green parts of the video should not be there.
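Roughly, the switch looks like this (simplified; view names are placeholders):

// Switch the shared MediaPlayer onto whichever TextureView should show the video.
// The target TextureView must already have its SurfaceTexture available.
void showOn(MediaPlayer mediaPlayer, TextureView target) {
    SurfaceTexture texture = target.getSurfaceTexture();
    if (texture != null) {
        mediaPlayer.setSurface(new Surface(texture));
    }
}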
How come the MediaPlayer shows different things in different TextureViews? How do I fix this?
I have a drawing app. I want to record the finger drawing and make an .mp4 file that shows all the drawing steps, i.e. how the drawing was created. I can't figure out how to record the screen in my app.
You cannot record screen video without rooting the device.
But in your case, if you want to replay the drawing and you have the state of your drawn items, you can redraw them on a thread with an appropriate delay, and it will look like a video.
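For example, something along these lines; Stroke, DrawingView and addStroke() are placeholders for whatever state and custom view your app already has:

// Replays the recorded strokes one by one with a delay, so the drawing
// appears to build up like a video.
void replay(final List<Stroke> strokes, final DrawingView drawingView) {
    Handler handler = new Handler(Looper.getMainLooper());
    long delayPerStepMs = 100;
    for (int i = 0; i < strokes.size(); i++) {
        final Stroke stroke = strokes.get(i);
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                drawingView.addStroke(stroke);  // hypothetical: append to the view's draw state
                drawingView.invalidate();       // redraw with the new stroke
            }
        }, delayPerStepMs * i);
    }
}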
It's a simple solution without any 3rd-party library: using a Handler, generate screenshots and use a frame animation in full-screen mode. That's it.
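A rough sketch of that idea, assuming the screenshots are already available as Bitmaps (the ImageView and the 100 ms frame duration are just illustrative):

// Builds a frame-by-frame animation from captured screenshots and plays it
// back full screen in an ImageView.
void playFrames(Context context, List<Bitmap> screenshots, ImageView fullScreenView) {
    AnimationDrawable animation = new AnimationDrawable();
    for (Bitmap frame : screenshots) {
        animation.addFrame(new BitmapDrawable(context.getResources(), frame), 100); // ~100 ms per frame
    }
    animation.setOneShot(true);
    fullScreenView.setImageDrawable(animation);
    animation.start();
}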
What are possible reasons that a video might be rendered as shown in the screenshot below? There's supposed to be a fullscreen video container rendering a remote peer's video in a WebRTC chat. There's a smaller inlay for the local video in the lower left. The <video> container does fill the entire size it's supposed to, but it is largely filled with green, with the actual video rendered small and distorted in its upper-left corner. The video is supposed to fill the entire green space. The green colour comes from the video itself; it's a rendering artefact, not my styling.
I have had occasional success getting it to render full screen, but more often than not the video clings to the corner. No code changes between success and failure, just trying again and again. This is running on an Android 4.4.2 tablet, tried both Opera and Chrome. Works fine on desktop browsers.
Looking for any hints as to what may cause this and how one could fix it or work around it.
This problem is indeed, as you already found, a problem with the video driver in combination with hardware decoding of the video stream within Chrome. It can be fixed on the device by disabling hardware video decoding for WebRTC in chrome://flags. The option text is a bit confusing, since you need to enable the "disable" flag in order for hardware video decoding to actually be turned off. I hope there will be a better solution soon.
It appears that this is a graphics driver related bug and nothing that can be fixed in the browser. It simply won't work. (For the record: ASUS Memo Pad 8.)
One possible workaround for the really desperate: hide the actual video, render the video on a <canvas>, traverse from the top-right towards the top-left until you find non-green pixels, do the same bottom-left to top-left, use the discovered offsets to crop the video on the canvas. This will be super slow and ugly, but is a desperate fix if you need it. I haven't coded this yet, but I might if necessary.
I'm trying to figure out if Android can handle two video players occupying the same screen space, preferably with the one on top having alpha-channel regions that are transparent to the one behind.
I know how to implement this code-wise; I'm curious whether anyone knows if this is physically possible before I bother throwing coding time at it.
TIA
AFAIK, no, at least before Android 4.0. You can't have two SurfaceViews overlap.
Now, it is conceivable that this is possible with TextureView with Android 4.0, though I am far from confident of that.
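For anyone who wants to experiment with the TextureView route, a rough, untested sketch (API 14+; each player would need its data source prepared beforehand, and setAlpha() gives whole-view transparency rather than per-pixel alpha from the video):

// Each TextureView gets its own MediaPlayer; the child added last draws on top,
// and setAlpha() works because TextureView composites like a normal view.
void playOn(final MediaPlayer player, TextureView view, float alpha) {
    view.setAlpha(alpha);
    view.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
            player.setSurface(new Surface(texture));
            player.start();
        }
        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) { }
        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) { return true; }
        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture texture) { }
    });
}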
Another option:
Player 1: A stock MediaPlayer that renders onto a SurfaceView.
Player 2: Yet another player that can render onto a GLSurfaceView or a Bitmap. This must be custom-built to decode frames and write to a GLSurfaceView's GL context or a native bitmap via JNI.