I'm looking at the class MediaRecorder of the Android SDK, and I was wondering if it can be used to record a video made from a Surface.
Example: I want to record what I display on my surface (a video game?) into a file.
As I said in the title: I'm not looking to record anything from the camera.
I think it is possible by overriding most of the class, but I'd very much like some ideas...
Besides, I'm not sure how the Camera class is used by MediaRecorder, and what I should get from my Surface to replace it.
Thank you for your interest!
PS: I'm looking at the native code used by MediaRecorder to get some clues; maybe it will inspire someone else:
http://www.netmite.com/android/mydroid/frameworks/base/media/jni/
The ability to record from a Surface was added in Android Lollipop. Here is the documentation:
http://developer.android.com/about/versions/android-5.0.html#ScreenCapture
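As a rough sketch of that API 21 flow (the helper shape, display name, and null callback/handler are illustrative; the target Surface would typically come from MediaRecorder.getSurface() or a MediaCodec input surface):

```java
import android.hardware.display.DisplayManager;
import android.media.projection.MediaProjection;
import android.view.Surface;

public class ScreenCaptureHelper {
    // The permission round-trip happens in an Activity first:
    //   MediaProjectionManager mgr =
    //       (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    //   startActivityForResult(mgr.createScreenCaptureIntent(), REQUEST_CODE);
    //   ... then mgr.getMediaProjection(resultCode, data) in onActivityResult().

    // Sketch: mirror the screen into any consumer Surface (API 21+).
    public static void startCapture(MediaProjection projection, Surface target,
                                    int width, int height, int dpi) {
        projection.createVirtualDisplay("screen-capture", width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                target, /* callback */ null, /* handler */ null);
    }
}
```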
Android 4.3 (API 18) adds some new features that do exactly what you want. Specifically, the ability to provide data to MediaCodec from a Surface, and the ability to store the encoded video as a .mp4 file (through MediaMuxer).
Some sample code is available here, including a patch for a Breakout game that records the game as you play it.
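In outline, the API 18 pipeline looks something like this (resolution, bit rate, and the output path are placeholder values; draining the encoder's output buffers into the muxer is the part EncodeAndMuxTest fleshes out):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.view.Surface;
import java.io.IOException;

public class SurfaceEncodeSketch {
    // Sketch: encoder takes frames from a Surface, muxer writes the .mp4 (API 18+).
    static void setUp() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface();  // render the game here via EGL
        encoder.start();

        MediaMuxer muxer = new MediaMuxer("/sdcard/game.mp4",
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        // Drain encoder output and feed it to the muxer; add the track (and call
        // muxer.start()) when the encoder reports INFO_OUTPUT_FORMAT_CHANGED.
    }
}
```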
This is unfortunately not possible at the Java layer. All the communication between the Camera and Media Recorder happens in the native code layers, and there's no way to inject non-Camera data into that pipeline.
While Android 4.1 added the MediaCodec APIs, which allow access to the device's video encoders, there's no easy-to-use way to take the resulting encoded streams and save them as a standard video file. You'd have to find a library to do that or write one yourself.
You may wish to trace the rabbit hole from a different folder in AOSP:
frameworks/av/media
as long as you're comfortable with the NDK (C/C++, JNI, ...) and Android internals (permissions, ...).
It goes quite deep, and I'm not sure how far you can get on a non-rooted device.
Here's an article on how to draw on an EGLSurface and generate a video using MediaCodec:
https://www.sisik.eu/blog/android/media/images-to-video
This uses OpenGL ES but you can have MediaCodec provide a surface that you can then obtain a canvas from to draw on. No need for OpenGL ES.
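For the Canvas route, a minimal sketch (assuming an encoder surface obtained from MediaCodec.createInputSurface() as in the linked article; Surface.lockHardwareCanvas() needs API 23, and on older releases lockCanvas() on an encoder surface may fail):

```java
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.Surface;

public class CanvasFrameSketch {
    // Sketch: submit one frame to a MediaCodec input Surface using Canvas, no GL.
    static void drawFrame(Surface encoderInputSurface) {
        Canvas canvas = encoderInputSurface.lockHardwareCanvas();  // API 23+
        try {
            canvas.drawColor(Color.BLACK);
            Paint paint = new Paint();
            paint.setColor(Color.WHITE);
            canvas.drawCircle(200f, 200f, 50f, paint);  // whatever your scene is
        } finally {
            encoderInputSurface.unlockCanvasAndPost(canvas);  // hands the frame to the encoder
        }
    }
}
```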
Related
Assuming we have a Surface in Android that displays a video (e.g. h264) with a MediaPlayer:
1) Is it possible to change the saturation, contrast & brightness of the video displayed on the surface? In real time? E.g., images can use setColorFilter; is there anything similar in Android for processing video frames?
Alternative question (if no. 1 is too difficult):
2) If we would like to export this video with e.g. an increased saturation, we should use a Codec, e.g. MediaCodec. What technology (method, class, library, etc...) should we use before the codec/save action to apply the saturation change?
For display only, one easy approach is to use a GLSurfaceView, a SurfaceTexture to render the video frames, and a MediaPlayer. Prokash's answer links to an open source library that shows how to accomplish that. There are a number of other examples around if you search those terms together. Taking that route, you draw video frames to an OpenGL texture and create OpenGL shaders to manipulate how the texture is rendered. (I would suggest asking Prokash for further details and accepting his answer if this is enough to fill your requirements.)
Similarly, you could use the OpenGL tools with MediaCodec and MediaExtractor to decode video frames. The MediaCodec would be configured to output to a SurfaceTexture, so you would not need to do much more than code some boilerplate to get the output buffers rendered. The filtering process would be the same as with a MediaPlayer. There are a number of examples using MediaCodec as a decoder available, e.g. here and here. It should be fairly straightforward to substitute the TextureView or SurfaceView used in those examples with the GLSurfaceView of Prokash's example.
The advantage of this approach is that you have access to all the separate tracks in the media file. Because of that, you should be able to filter the video track with OpenGL and straight copy other tracks for export. You would use a MediaCodec in encode mode with the Surface from the GLSurfaceView as input and a MediaMuxer to put it all back together. You can see several relevant examples at BigFlake.
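As a sketch of the straight-copy part (paths and buffer size are placeholders; a real pipeline registers the filtered video track too and only calls muxer.start() after all tracks are added):

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.io.IOException;
import java.nio.ByteBuffer;

public class TrackCopySketch {
    // Sketch: copy the audio track of input.mp4 untouched into the output file.
    static void copyAudioTrack() throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource("/sdcard/input.mp4");
        int audioTrack = -1;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            String mime = extractor.getTrackFormat(i).getString(MediaFormat.KEY_MIME);
            if (mime.startsWith("audio/")) { audioTrack = i; break; }
        }
        extractor.selectTrack(audioTrack);

        MediaMuxer muxer = new MediaMuxer("/sdcard/output.mp4",
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        int outTrack = muxer.addTrack(extractor.getTrackFormat(audioTrack));
        muxer.start();

        ByteBuffer buf = ByteBuffer.allocate(256 * 1024);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            info.size = extractor.readSampleData(buf, 0);
            if (info.size < 0) break;
            info.offset = 0;
            info.presentationTimeUs = extractor.getSampleTime();
            info.flags = extractor.getSampleFlags();  // sync-sample flag lines up with the muxer's
            muxer.writeSampleData(outTrack, buf, info);
            extractor.advance();
        }
        muxer.stop();
        muxer.release();
        extractor.release();
    }
}
```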
You can use a MediaCodec without a Surface to access decoded byte data directly and manipulate it that way. This example illustrates that approach. You can manipulate the data and send it to an encoder for export or render it as you see fit. There is some extra complexity in dealing with the raw byte data. Note that I like this example because it illustrates dealing with the audio and video tracks separately.
You can also use FFMpeg, either in native code or via one of the Java wrappers out there. This option is more geared towards export than immediate playback. See here or here for some libraries that attempt to make FFMpeg available to Java. They are basically wrappers around the command line interface. You would need to do some extra work to manage playback via FFMpeg, but it is definitely doable.
If you have questions, feel free to ask, and I will try to expound upon whatever option makes the most sense for your use case.
If you are using a player that supports video filters, then you can do that.
An example of such a player is VLC, which is built around FFmpeg [1].
VLC is pretty easy to compile for Android. Then all you need is libvlc (the aar file) and you can build your own app. See compile instructions here.
You will also need to write your own module. Just duplicate an existing one and modify it. Needless to say, VLC offers strong transcoding and streaming capabilities.
As powerful as VLC for Android is, it has one huge drawback: video filters cannot work with hardware decoding (on Android only). This means all video processing happens on the CPU.
Your other option is to use GLSL / OpenGL ES over surfaces like GLSurfaceView and TextureView, which guarantees GPU acceleration.
I am a newbie in OpenGL.
I want to record video and audio from a GLSurfaceView and export it to .mp4 (or other formats).
I have a GLSurfaceView that implements Renderer.
I've tried using fadden's examples on bigflake.com, like EncodeAndMuxTest.java, or RecordFBOActivity.java in google/grafika, but without success, because I don't know how to adapt them.
Is there any example or "How-to" for recording a GLSurfaceView?
You can try INDE Media for Mobile: https://software.intel.com/en-us/articles/intel-inde-media-pack-for-android-tutorials. Its GLCapturer class lets you capture OpenGL output in a few lines of code; samples are here:
https://github.com/INDExOS/media-for-mobile/blob/master/samples/src/main/java/org/m4m/samples/GameRenderer.java
synchronized (videoCapture) {
    if (videoCapture.beginCaptureFrame()) {
        ...
        renderScene();
        videoCapture.endCaptureFrame();
    }
}
You can give the Android Breakout patch a try. It adds game recording to Android Breakout.
The main difference when working with GLSurfaceView, rather than SurfaceView, is that GLSurfaceView wants to manage its own EGL context. This requires you to create a second context that shares data with GLSurfaceView's context. It's a bit awkward to manage, but is doable.
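The shared-context setup looks roughly like this (EGL14 is API 17+; the config attributes are minimal and illustrative):

```java
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;

public class SharedContextSketch {
    // Sketch: a second EGL context sharing textures with GLSurfaceView's
    // context, so a recorder thread can use the same GL objects.
    static EGLContext createSharedContext() {
        EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        int[] version = new int[2];
        EGL14.eglInitialize(display, version, 0, version, 1);

        // GLSurfaceView's context, queried while on its GL thread:
        EGLContext shared = EGL14.eglGetCurrentContext();

        int[] configAttribs = {
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_NONE };
        EGLConfig[] configs = new EGLConfig[1];
        int[] num = new int[1];
        EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1, num, 0);

        int[] ctxAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
        return EGL14.eglCreateContext(display, configs[0], shared, ctxAttribs, 0);
        // Make this current on the recorder thread against an EGLSurface
        // that wraps the encoder's input Surface.
    }
}
```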
You may want to consider switching from GLSurfaceView to SurfaceView. This requires you to do your own EGL setup and thread handling, but you can find examples of both in Grafika. It's a bit more work to get things set up, but it makes fancy stuff like video recording easier. Of course, if you're using a game or graphics engine that requires GLSurfaceView, that won't be an option.
I'm developing an app for applying effects to the camera image in real-time. Currently I'm using the MediaMuxer class in combination with MediaCodec. Those classes were implemented with Android 4.3.
Now I want to redesign my app and make it compatible with more devices. The only thing I found on the internet was a combination of FFmpeg and OpenCV, but I read that the frame rate is not very good if I want to use a high resolution. Is there any way to encode video in real time while capturing the camera image, without using MediaMuxer and MediaCodec?
PS: I'm using GLSurfaceView for OpenGL fragment shader effects. So this is a must-have.
Real-time encoding of large frames at a moderate frame rate is not going to happen with software codecs.
MediaCodec was introduced in 4.1, so you can still take advantage of hardware-accelerated compression so long as you can deal with the various problems. You'd still need an alternative to MediaMuxer if you want a .mp4 file at the end.
Some commercial game recorders, such as Kamcord and Everyplay, claim to work on Android 4.1+. So it's technically possible, though I don't know if they used non-public APIs to feed surfaces directly into the video encoder.
In pre-Jellybean Android it only gets harder.
(For anyone interested in recording GL in >= 4.3, see EncodeAndMuxTest or Grafika's "Record GL app".)
I am working on a project where we need to record the rendered OpenGL surface. (for example if we use GLsurfaceView, we need to record the surface along with the audio from the MIC)
Presently I am using MediaRecorder API by setting the video source as the VIDEO_SOURCE_GRALLOC_BUFFER.
I am using the following sample as the base code
I wanted to know: is this the right way? Is there a better alternative?
The sample test given in the link records the audio and video of the EGLSurface, but it is not displayed properly.
What might be the reason? Any help/pointers are really appreciated.
thanks,
Satish
The code you reference isn't a sample, but rather internal test code that exercises a non-public interface. SurfaceMediaSource could change or disappear in a future release and break your code.
Update:
Android 4.3 (API 18) allows Surface input to MediaCodec. The EncodeAndMuxTest sample demonstrates recording OpenGL ES frames to a .mp4 file.
The MediaRecorder class doesn't take Surface input, so in your case you'd need to record the audio separately and then combine it with the new MediaMuxer class.
Update #2:
Android 5.0 (API 21) allows Surface input to MediaRecorder, which is often much more convenient than MediaCodec. If you need to use MediaCodec, there is an example showing three different ways of recording OpenGL ES output with it in Grafika's "record GL activity".
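For illustration, the API 21 Surface-input setup might look like this (sizes, rates, and the output path are placeholders):

```java
import android.media.MediaRecorder;
import android.view.Surface;
import java.io.IOException;

public class SurfaceRecorderSketch {
    // Sketch: MediaRecorder taking Surface input, no camera involved (API 21+).
    static Surface startRecorder(MediaRecorder recorder) throws IOException {
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setVideoSize(1280, 720);
        recorder.setVideoFrameRate(30);
        recorder.setVideoEncodingBitRate(4_000_000);
        recorder.setOutputFile("/sdcard/gl-output.mp4");
        recorder.prepare();
        Surface input = recorder.getSurface();  // render your GL frames into this
        recorder.start();
        return input;
    }
}
```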
The MediaProjection class can also be useful for screen recording.
Is it possible, and if so how, to create a "fake" Camera in an Android application? By "fake" I mean an all-software creation that simply looks like a regular Camera to the OS but in actuality takes a Bitmap or byte array as its input data. I want to use such a device with a MediaRecorder to create H.264 videos.
Things this could be used for:
Image slideshow video creation
Screen capture to video file
Caveats: No rooting and no ROM modification
I think what you are looking for is a way to encode videos to H.264 in a way similar to what MediaRecorder does but not from the camera. You do not particularly care whether this is done with a "fake camera" or in some other way, correct? In that case...
You can use the MediaCodec API, available in Android 4.1 and later. You can just give it a series of images, and it will create video encoded with the hardware encoder (where available). Some sample code: Create video from screen grabs in android and Encoding H.264 from camera with Android MediaCodec
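Roughly, feeding frames in by hand looks like this (the yuvFrame and ptsUs parameters are placeholders; converting your Bitmap to the codec's accepted YUV layout is the fiddly part, and getInputBuffers() is the pre-API-21 style those samples use):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;
import java.nio.ByteBuffer;

public class ByteBufferEncodeSketch {
    // Sketch: set up an H.264 encoder and push one hand-converted YUV frame (API 16+).
    static void encodeFrame(byte[] yuvFrame, long ptsUs) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec codec = MediaCodec.createEncoderByType("video/avc");
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        codec.start();

        int inIndex = codec.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer in = codec.getInputBuffers()[inIndex];  // pre-API-21 style
            in.clear();
            in.put(yuvFrame);
            codec.queueInputBuffer(inIndex, 0, yuvFrame.length, ptsUs, 0);
        }
        // Then dequeueOutputBuffer() and write the encoded NAL units yourself;
        // without MediaMuxer (API 18) you need your own .mp4 writer, as the
        // linked samples discuss.
    }
}
```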
If you are expecting to affect other apps with your "fake Camera", that is only possible by modifying the Android source code and rolling your own ROM mod.
Yes, you can!
Without rooting or modifying the ROM, the best way to do this is to build a virtualization app that runs the other app as a plugin, so that you can modify anything in the target app. That is a lot of work, but the good news is that there are several open source projects that already do this.
After that, the next step is not so difficult: you only have to hook the few .so libraries in /system/lib that affect camera recording.
In fact, I have done this on my device, but I modified the system libraries directly, which of course requires root. It works well in almost all apps, except for some apps that use the media service to capture video.
For those, we have to modify the service library, which is a little more difficult.