I wanted to render a WebView onto a GLSurfaceView, which I managed to do thanks to this:
Is it possible to render an Android View to an OpenGL FBO or texture?
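For reference, here is a minimal sketch of that View-to-texture approach (class and method names are mine, not the exact code from the linked answer): the WebView is software-drawn into a Surface backed by a SurfaceTexture, and the SurfaceTexture feeds a GL_TEXTURE_EXTERNAL_OES texture that the GLSurfaceView renderer samples.

import android.graphics.Canvas;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;
import android.view.View;

public class ViewToGLRenderer {
    private SurfaceTexture surfaceTexture;
    private Surface surface;
    private int glTextureId;

    // Call on the GL thread (e.g. in onSurfaceCreated).
    public void createSurface(int width, int height) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        glTextureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, glTextureId);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);

        surfaceTexture = new SurfaceTexture(glTextureId);
        surfaceTexture.setDefaultBufferSize(width, height);
        surface = new Surface(surfaceTexture);
    }

    // Call on the UI thread whenever the WebView should be redrawn.
    public void drawViewToSurface(View view) {
        Canvas canvas = surface.lockCanvas(null);
        view.draw(canvas);                      // software-draw the View into the Surface
        surface.unlockCanvasAndPost(canvas);
    }

    // Call on the GL thread before drawing the quad that samples the texture
    // (the fragment shader has to use samplerExternalOES).
    public void updateTexture() {
        surfaceTexture.updateTexImage();
    }

    public int getTextureId() {
        return glTextureId;
    }
}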
However, when I tried to load a website with a video, I realized that the video doesn't play on the GLSurfaceView (there is sound, however). I found out through googling that the WebView renders the video on a separate SurfaceView, which is probably why it doesn't appear on my GLSurfaceView.
My question:
How do I go about hacking the webview in order to access the extra video surface?
I consciously say hack... I am ready to use reflection and whatever else is needed, but I don't really know where to start. In the source provided by the SDK Manager, all I can find is the WebViewProvider interface without any implementation. Via grepcode I can find some implementations (http://grepcode.com/file/repo1.maven.org/maven2/org.robolectric/android-all/4.1.2_r1-robolectric-0/android/webkit/HTML5VideoViewProxy.java), but not in the SDK Manager supplied sources. I am running KitKat 4.4 on a Xiaomi RedMi 2 phone.
How can I determine/use the relevant implementation?
Or do you think this isn't even feasible?
With a WebView I don't think it's possible, see this. I'm currently attempting something similar with a GeckoView, because you can get a handle to the surface; see this note and this on how to embed a GeckoView into your project.
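Roughly, the idea is to hand a GeckoSession a Surface of your own (for example one built from a SurfaceTexture, as in the question) instead of letting a GeckoView widget own it. A minimal sketch, assuming a GeckoView release where GeckoDisplay.surfaceChanged(Surface, int, int) is still available (newer releases wrap the arguments in a SurfaceInfo object):

import android.content.Context;
import android.view.Surface;
import org.mozilla.geckoview.GeckoDisplay;
import org.mozilla.geckoview.GeckoRuntime;
import org.mozilla.geckoview.GeckoSession;

public class GeckoToSurface {
    private GeckoSession session;
    private GeckoDisplay display;

    public void start(Context context, Surface surface, int width, int height) {
        // One runtime per process; create() should only be called once.
        GeckoRuntime runtime = GeckoRuntime.create(context);
        session = new GeckoSession();
        session.open(runtime);
        session.loadUri("https://example.com/video.html"); // placeholder URL

        // Acquire a display and hand it our Surface; Gecko composites the
        // whole page, including video, into that Surface.
        display = session.acquireDisplay();
        display.surfaceChanged(surface, width, height);
    }

    public void stop() {
        display.surfaceDestroyed();
        session.releaseDisplay(display);
        session.close();
    }
}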
Related
Is everything in Android actually rendered via OpenGL in the end? I have already checked out this video https://youtu.be/zdQRIYOST64 and the relevant documents, and it seems that not everything is rendered via OpenGL (or perhaps Vulkan nowadays). But when it is not rendered that way, how is it rendered? Via some internal mechanism inside SurfaceFlinger?
Can someone trace the path through the code from the application level to the very last point before the hardware?
I asked Romain Guy on Twitter to answer this question, and this is his answer:
https://twitter.com/romainguy/status/1272314819333337090
Apps are rendered pretty much entirely with OpenGL yes. SurfaceFlinger
avoids using the GPU whenever possible and uses dedicated compositing
hardware instead (hardware composer). But sometimes it falls back to
GL.
I need to play a video on an OpenGL surface. I think I will need to render each frame of the video to a texture in a loop and then render it via OpenGL. Is this possible under iOS and/or Android?
It is possible on iOS, but it's pretty tricky business to get it to run fast enough to keep up with a video stream.
There is an old demo app from Apple called ChromaKey that takes a CVPixelBuffer from Core Video and maps it directly into an OpenGL texture without having to copy the data. That makes performance MUCH better, and is the approach I would suggest.
I don't know if there is more current sample code available that shows how it's done. That code is back from the days of iOS 6, and was written in Objective-C. (I would suggest doing new iOS development in Swift, since that's where Apple is putting its emphasis.)
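On the Android side, the usual zero-copy route is similar in spirit: let the decoder output straight into a SurfaceTexture that is bound to a GL_TEXTURE_EXTERNAL_OES texture, then sample that texture when drawing. A minimal sketch (the class name and URL are placeholders):

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.view.Surface;
import java.io.IOException;

public class VideoTexture implements SurfaceTexture.OnFrameAvailableListener {
    private SurfaceTexture surfaceTexture;
    private MediaPlayer player;
    private volatile boolean frameAvailable;

    // Call on the GL thread; returns the texture id to sample.
    public int start(String videoUrl) throws IOException {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

        surfaceTexture = new SurfaceTexture(tex[0]);
        surfaceTexture.setOnFrameAvailableListener(this);

        player = new MediaPlayer();
        player.setDataSource(videoUrl);
        player.setSurface(new Surface(surfaceTexture)); // decoder renders into the texture
        player.prepare();
        player.start();
        return tex[0];
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        frameAvailable = true;
    }

    // Call once per GL frame before drawing the quad that samples the
    // texture (fragment shader must use samplerExternalOES).
    public void updateIfNeeded() {
        if (frameAvailable) {
            frameAvailable = false;
            surfaceTexture.updateTexImage();
        }
    }
}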
I am confused and am looking forward to some comments on this. I was assuming that WebView creates a separate surface to draw on and does not use the activity's default surface. But in the SurfaceFlinger dump, I don't see a new surface getting created when using a WebView.
When I do a similar experiment using VideoView, I do see a separate surface getting created.
On the WebView I also wanted to play a video, so I was assuming a separate surface would be created and that the surface resolution would therefore match the video resolution. But if it uses the application's surface, then the maximum resolution of the video has to be the UI resolution.
In the Chromium code, I see code for a separate surface, but in practice I could not see one getting created.
Can someone help me clarify this?
Thank You.
If you look at the VideoView inheritance graph, you'll notice that it inherits from SurfaceView, while WebView does not, so WebView can only achieve that by creating an external SurfaceView.
And if you search for usages of ExternalVideoSurface in the WebView part of the Chromium code, you will notice that it is only enabled if "video hole" is enabled, which is intended to be used only for decoding encrypted videos, where WebView needs to do "hole punching". There is a system-API-level setting in WebView that enables this behaviour, but it has its own limitations and is thus not recommended for general use.
I am also curious why WebView does not show up in the SurfaceFlinger dump.
I think the reason is that WebView also renders into the related activity's native window, so there is no separate surface in this situation.
But the situation seems to differ in the latest Android and WebView versions, depending on developer options.
One of the features of Android 4.4 (KitKat) is that it provides a way for developers to capture an MP4 video of the screen using adb shell screenrecord. Does Android 4.4 provide any new APIs for applications to capture and encode video, or does it just provide the screenrecord utility/binary?
I ask because I would like to do some screen capture work in an application I'm writing. Before anyone asks, yes, the application would have framebuffer access. However, the only Android-provided capturing/encoding API that I've seen (MediaRecorder) seems to be limited to recording video from the device's camera.
The only screen capture solutions I've seen mentioned on StackOverflow seem to revolve around taking screenshots at a regular interval or using JNI to encode the framebuffer with a ported version of ffmpeg. Are there more elegant, native solutions?
The screenrecord utility uses private APIs, so you can't do exactly what it does.
The way it works is to create a virtual display, route the virtual display to a video encoder, and then save the output to a file. You can do essentially the same thing, but because you're not running as the "shell" user you'd only be able to see the layers you created. The relevant APIs are designed around creating a Presentation, which may not be exactly what you want.
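A rough sketch of that pipeline on API 19, with placeholder resolution and bitrate (you would still have to drain the encoder's output buffers and feed them to a MediaMuxer yourself):

import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

public class VirtualDisplayRecorder {
    private static final int WIDTH = 1280, HEIGHT = 720, FPS = 30;

    public VirtualDisplay start(Context context) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", WIDTH, HEIGHT);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FPS);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface(); // encoder's input
        encoder.start();

        // Whatever you render onto this display (e.g. via a Presentation)
        // goes straight into the encoder; drain the encoder's output and
        // write it to a MediaMuxer to produce the .mp4.
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        return dm.createVirtualDisplay("recording", WIDTH, HEIGHT,
                context.getResources().getDisplayMetrics().densityDpi,
                inputSurface, DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
    }
}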
See the source code for a CTS test with a trivial example (just uses an ImageView).
Of course, if you happen to be a GLES application, you can just record the output directly (see e.g. EncodeAndMuxTest and the "Record GL app" activity in Grafika).
Well, AFAIK, I don't see API support equivalent to capturing what's going on on the screen.
I'm making an Android game (using AndEngine).
I need to record the gameplay screen.
This is not for making a promotional video; it is for players to review their gameplay.
My app should record the video by itself.
So I can't solve this problem using an existing recording app from the market.
I already checked the code below:
http://code.google.com/p/andengineexamples/source/browse/src/org/anddev/andengine/examples/ScreenCaptureExample.java?spec=svn66d3057f672175a8a21f9d06e7a045942331f65c&r=66d3057f672175a8a21f9d06e7a045942331f65c
It works very well.
But I want to record a gameplay video, not a single screenshot.
I need at least 24 fps for a smooth replay, but if I use glReadPixels I only get 5 fps on my Xoom device.
I searched various websites to solve this optimization problem.
Most people say glReadPixels is too slow for recording video.
http://www.gamedev.net/topic/473794-glreadpixel-takes-tooooo-much-time/
They recommend glCopyTexImage2D instead of glReadPixels, because glCopyTexImage2D is much faster.
But I can't find out how to use glCopyTexImage2D in AndEngine.
Some even say that Android OpenGL ES does not support glCopyTexImage2D.
Maybe another method exists to record smooth video: reading the framebuffer of the Android device.
Most recording apps in the market use this method, but they need root permission to grab the framebuffer.
I've read news that Android will support capturing the screen via SurfaceFlinger after Gingerbread.
But I can't find out how to use the framebuffer without root permission. T_T
These are my guessed solutions:
Use another OpenGL API that is faster than glReadPixels.
Find some Android API that can get the framebuffer without root permission (maybe I can access Android's SURFACE_FLINGER?).
Render to another offscreen texture and record the video from that.
But I don't know how to implement these methods.
Which approach is correct?
Do you have any example code for recording video on Android?
Please help me solve this problem.
If you know any other method, that would be helpful too.
Any help will be appreciated.
Does your device's GPU support ES 3.0? If it does, you can try to use a PBO (pixel buffer object).
Here is a topic you can refer to: Low readback performance with PBO, help !!!!!
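For what it's worth, here is a minimal sketch of the double-PBO readback idea on GLES 3.0 (not AndEngine-specific; it assumes RGBA8 output): glReadPixels into a pixel pack buffer returns without stalling, and you map the buffer that was filled during the previous frame instead.

import android.opengl.GLES30;
import java.nio.Buffer;
import java.nio.ByteBuffer;

public class PboReader {
    private final int[] pbo = new int[2];
    private final int width, height, sizeBytes;
    private int readIndex = 0;

    // Call on the GL thread with a GLES 3.0 context current.
    public PboReader(int width, int height) {
        this.width = width;
        this.height = height;
        this.sizeBytes = width * height * 4; // RGBA8
        GLES30.glGenBuffers(2, pbo, 0);
        for (int i = 0; i < 2; i++) {
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[i]);
            GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, sizeBytes, null,
                    GLES30.GL_STREAM_READ);
        }
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
    }

    // Call once per frame after rendering; returns the pixels requested on
    // the previous call (the very first call returns undefined content).
    public ByteBuffer readFrame() {
        int nextIndex = (readIndex + 1) % 2;

        // Start an asynchronous readback into the current PBO.
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[readIndex]);
        GLES30.glReadPixels(0, 0, width, height,
                GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);

        // Map the PBO that was filled during the previous frame and copy it out.
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[nextIndex]);
        Buffer mapped = GLES30.glMapBufferRange(GLES30.GL_PIXEL_PACK_BUFFER,
                0, sizeBytes, GLES30.GL_MAP_READ_BIT);
        ByteBuffer pixels = null;
        if (mapped != null) {
            pixels = ByteBuffer.allocate(sizeBytes);
            pixels.put((ByteBuffer) mapped);
            pixels.rewind();
        }
        GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);

        readIndex = nextIndex;
        return pixels;
    }
}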