How to play video using OpenGL ES in Android?
Your question is a bit vague. If all you want is to play a video on a GL surface, it's possible. See the MediaPlayer sample code that Dave referred to in his answer. All you have to do there is replace the SurfaceView with a GLSurfaceView, both in the MediaPlayerDemo_Video.java file and in the corresponding layout file (mediaplayer_2.xml).
Also you need to create a custom Renderer class (one that implements the GLSurfaceView.Renderer interface) and set it to your GLSurfaceView.
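A minimal Renderer can be very short. Here is a sketch (the class name is made up; only the GLSurfaceView.Renderer interface is real):

```java
import android.opengl.GLSurfaceView;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

// Minimal sketch of a GLSurfaceView.Renderer; it only clears the screen.
public class VideoRenderer implements GLSurfaceView.Renderer {
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        gl.glClearColor(0f, 0f, 0f, 1f);  // black background
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        gl.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);
    }
}
```

In the activity, call glSurfaceView.setRenderer(new VideoRenderer()) before the view is attached to the window.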
After you do all this, you will have your video playing on a GL surface, but that would be all.
If, on the other hand, you want to manipulate the video, i.e. draw the video frames into a GL texture and add effects or transforms (for example scale, rotate, etc.), then I'm afraid this can't be done with these APIs: your application has no access to the raw video frames. (As later answers note, newer Android releases added SurfaceTexture, which does let you render video frames into a GL texture.)
I'm not sure why you're mentioning OpenGL here, but probably the easiest way to play video is using the VideoView class. You'll want to have a look at the MediaPlayer class too.
You might find the Audio and Video page in the Android Developer Documentation helpful.
There's VideoView sample code and MediaPlayer sample code in the API demos provided with the Android SDK.
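For the simple case, a few lines with VideoView are enough. A sketch, assuming a hypothetical layout id and raw resource name:

```java
// Assumes a <VideoView android:id="@+id/video_view"> in the layout
// and a video file at res/raw/sample.mp4 (both names are made up).
VideoView videoView = (VideoView) findViewById(R.id.video_view);
videoView.setVideoURI(Uri.parse(
        "android.resource://" + getPackageName() + "/" + R.raw.sample));
videoView.setMediaController(new MediaController(this));  // play/pause UI
videoView.start();
```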
Related
In a NativeScript application, I'm trying to apply a real-time effect to video playback (specifically, grayscale). I am currently playing the video in-app using a fork of the nativescript-videoplayer NativeScript plugin. The relevant source code that creates the video view on Android is here:
https://github.com/nstudio/nativescript-videoplayer/blob/master/src/videoplayer.android.ts
In essence, I want to modify it to apply an effect / shader, similar to the answer given here:
https://stackoverflow.com/a/31958741/192694
However, I'm not sure where this shader code would hook up to my existing stream of MediaPlayer creation and setting its SurfaceTexture surface.
The original video player plugin uses a TextureView with a MediaPlayer instance; instead, you can use a GLSurfaceView as shown in the SO answer that @Bill linked.
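The grayscale effect itself boils down to a luminance dot product in the fragment shader. A sketch, embedded as a Java string the way the linked answer does it (the varying/uniform names are assumptions and must match your vertex shader):

```java
// Grayscale fragment shader for an external video texture (SurfaceTexture).
// Uses the Rec. 601 luma weights for the gray value.
private static final String GRAYSCALE_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    vec4 c = texture2D(sTexture, vTextureCoord);\n" +
        "    float y = dot(c.rgb, vec3(0.299, 0.587, 0.114));\n" +
        "    gl_FragColor = vec4(y, y, y, c.a);\n" +
        "}\n";
```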
I am a newbie in OpenGL.
I want to record video and audio from a GLSurfaceView and export it to .mp4 (or other formats).
I have a GLSurfaceView that implements Renderer.
I've tried fadden's examples on bigflake.com, like EncodeAndMuxTest.java, and RecordFBOActivity.java in google/grafika, but without success because I don't know how to implement them.
Is there any example or "How-to" for recording a GLSurfaceView?
You can try INDE Media for Mobile: https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials Its GLCapturer class lets you capture OpenGL output in a few lines of code; samples are here:
https://github.com/INDExOS/media-for-mobile/blob/master/samples/src/main/java/org/m4m/samples/GameRenderer.java
    synchronized (videoCapture) {
        if (videoCapture.beginCaptureFrame()) {
            // ... per-frame setup elided in the sample ...
            renderScene();
            videoCapture.endCaptureFrame();
        }
    }
You can give the Android Breakout patch a try. It adds game recording to Android Breakout.
The main difference when working with GLSurfaceView, rather than SurfaceView, is that GLSurfaceView wants to manage its own EGL context. This requires you to create a second context that shares data with GLSurfaceView's context. It's a bit awkward to manage, but is doable.
You may want to consider switching from GLSurfaceView to SurfaceView. This requires you to do your own EGL setup and thread handling, but you can find examples of both in Grafika. It's a bit more work to get things set up, but it makes fancy stuff like video recording easier. Of course, if you're using a game or graphics engine that requires GLSurfaceView, that won't be an option.
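The context sharing mentioned above comes down to passing GLSurfaceView's context as the share_context argument of eglCreateContext. A sketch using EGL14 (eglDisplay and eglConfig are assumed to come from your own eglGetDisplay/eglChooseConfig setup):

```java
// Must run on the GLSurfaceView render thread, where its context is current.
android.opengl.EGLContext shared = EGL14.eglGetCurrentContext();

int[] contextAttribs = {
        EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
        EGL14.EGL_NONE
};
// eglDisplay / eglConfig come from your own EGL setup (assumed here).
android.opengl.EGLContext encoderContext = EGL14.eglCreateContext(
        eglDisplay, eglConfig, shared, contextAttribs, 0);
// Textures created in either context are now visible to both, so the
// encoder context can render the same frames into its input surface.
```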
I am working on a project where we need to record the rendered OpenGL surface (for example, if we use GLSurfaceView, we need to record the surface along with audio from the mic).
Presently I am using MediaRecorder API by setting the video source as the VIDEO_SOURCE_GRALLOC_BUFFER.
I am using the following sample as the base code
I wanted to know: is this the right way, or is there a better alternative?
The sample test given in the link records the audio and video of the EGL surface, but it is not displayed properly.
What might be the reason? Any help/pointers are really appreciated.
thanks,
Satish
The code you reference isn't a sample, but rather internal test code that exercises a non-public interface. SurfaceMediaSource could change or disappear in a future release and break your code.
Update:
Android 4.3 (API 18) allows Surface input to MediaCodec. The EncodeAndMuxTest sample demonstrates recording OpenGL ES frames to a .mp4 file.
The MediaRecorder class doesn't take Surface input, so in your case you'd need to record the audio separately and then combine it with the new MediaMuxer class.
Update #2:
Android 5.0 (API 21) allows Surface input to MediaRecorder, which is often much more convenient than MediaCodec. If you need to use MediaCodec, Grafika's "Record GL app" activity shows three different ways of recording OpenGL ES output with it.
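The API 21 path looks roughly like this (a sketch; the size, bit rate, and output file are placeholders):

```java
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoSize(1280, 720);
recorder.setVideoEncodingBitRate(4_000_000);
recorder.setVideoFrameRate(30);
recorder.setOutputFile(outputFile.getAbsolutePath());  // outputFile: your File
recorder.prepare();

// Valid between prepare() and stop(); render your GL frames into this Surface.
Surface inputSurface = recorder.getSurface();
recorder.start();
```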
The MediaProjection class can also be useful for screen recording.
I'm looking at the class MediaRecorder of the Android SDK, and I was wondering if it can be used to record a video made from a Surface.
Example: I want to record what I display on my surface (a video game?) into a file.
As I said in the title: I'm not looking to record anything from the camera.
I think it is possible by overriding most of the class, but I'd very much like some ideas...
Besides, I'm not sure how the Camera class is used by MediaRecorder, and what I should get from my Surface to replace it.
Thank you for your interest!
PS: I'm looking at the native code used by MediaRecorder for clues; maybe it will inspire someone else:
http://www.netmite.com/android/mydroid/frameworks/base/media/jni/
The ability to record from a Surface was added in Android Lollipop. Here is the documentation:
http://developer.android.com/about/versions/android-5.0.html#ScreenCapture
Android 4.3 (API 18) adds some new features that do exactly what you want. Specifically, the ability to provide data to MediaCodec from a Surface, and the ability to store the encoded video as a .mp4 file (through MediaMuxer).
Some sample code is available here, including a patch for a Breakout game that records the game as you play it.
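The core of that API 18 approach, stripped down (a sketch: resolution, bit rate, and outputPath are placeholders, and the loop that drains the encoder into the muxer is omitted):

```java
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface();  // draw GL frames here
encoder.start();

// Feed the encoded output into MediaMuxer (drain loop omitted for brevity).
MediaMuxer muxer = new MediaMuxer(outputPath,
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
```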
This is unfortunately not possible at the Java layer. All the communication between the Camera and Media Recorder happens in the native code layers, and there's no way to inject non-Camera data into that pipeline.
While Android 4.1 added the Media Codec APIs, which allow access to the device's video encoders, there's no easy-to-use way to take the resulting encoded streams and save them as a standard video file. You'd have to find a library to do that or write one yourself.
You may wish to trace the rabbit hole from a different folder in AOSP,
frameworks/av/media,
as long as you're comfortable with the NDK (C/C++, JNI, ...) and Android internals (permissions, ...).
It goes quite deep, and I'm not sure how far you can get on a non-rooted device.
Here's an article on how to draw on a EGLSurface and generate a video using MediaCodec:
https://www.sisik.eu/blog/android/media/images-to-video
This uses OpenGL ES, but you can also have MediaCodec provide a surface and then obtain a Canvas from it to draw on; no need for OpenGL ES.
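The Canvas variant looks roughly like this (a sketch: inputSurface is assumed to be the Surface returned by MediaCodec.createInputSurface(), and lockHardwareCanvas() requires API 23):

```java
// Draw one frame onto the encoder's input surface with plain Canvas calls.
Canvas canvas = inputSurface.lockHardwareCanvas();
try {
    canvas.drawColor(Color.BLACK);
    Paint paint = new Paint();
    paint.setColor(Color.WHITE);
    paint.setTextSize(48f);
    canvas.drawText("Hello, frame", 50f, 100f, paint);
} finally {
    inputSurface.unlockCanvasAndPost(canvas);  // submits the frame to the codec
}
```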
I am developing an Android application in which a specific video is played when the poster of a specific movie is shown in front of the camera. I found many AR tutorials that just show a 3D object when a pattern is detected. I need some advice on building an application that can play video in AR using the Android camera and the QCAR SDK.
I don't know QCAR, but of course you can put a SurfaceView on top of the existing SurfaceView of your camera preview and play a video in it.
What I would do is implement your poster-recognition logic first; this will be tricky enough. Then, if it works reliably, I would add another SurfaceView in which the video is played.
You will find out, that the CameraPreview surface has actually to be on top of the video surface.
I assume you are trying to play video on a GL surface. It is possible using a surface video texture, but only ICS (4.0) and above devices support it. Vuforia has a very good sample of this here; you can decide what to do for lower devices, such as playing the video fullscreen.
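On ICS and later, the SurfaceTexture route looks roughly like this (a sketch; mediaPlayer is assumed to be an already-prepared MediaPlayer):

```java
// Create an external OES texture and route MediaPlayer's output into it.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
surface.release();  // MediaPlayer keeps its own reference

// In Renderer.onDrawFrame(): call surfaceTexture.updateTexImage(),
// then sample the texture with a samplerExternalOES fragment shader.
```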