Video in Android: change visual properties (e.g. saturation, brightness)

Assuming we have a Surface in Android that displays a video (e.g. H.264) with a MediaPlayer:
1) Is it possible to change the saturation, contrast and brightness of the video displayed on the surface, in real time? For example, images can use setColorFilter; is there anything similar in Android to process the video frames?
Alternative question (if no. 1 is too difficult):
2) If we would like to export this video with, e.g., increased saturation, we should use a codec, e.g. MediaCodec. What technology (method, class, library, etc.) should we use before the codec/save step to apply the saturation change?

For display only, one easy approach is to use a GLSurfaceView, a SurfaceTexture to render the video frames, and a MediaPlayer. Prokash's answer links to an open source library that shows how to accomplish that. There are a number of other examples around if you search those terms together. Taking that route, you draw video frames to an OpenGL texture and create OpenGL shaders to manipulate how the texture is rendered. (I would suggest asking Prokash for further details and accepting his answer if this is enough to fill your requirements.)
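To give a feel for the shader side, here is a minimal saturation fragment shader, written as the Java string constant it would typically be embedded as; the varying and uniform names are placeholders for whatever your renderer declares:

    // Saturation shader for video frames arriving as an external OES texture
    // (the format SurfaceTexture produces). uSaturation: 0.0 = grayscale,
    // 1.0 = unchanged, >1.0 = oversaturated.
    private static final String SATURATION_FRAGMENT_SHADER =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "uniform float uSaturation;\n" +
            "void main() {\n" +
            "    vec4 c = texture2D(sTexture, vTexCoord);\n" +
            "    float luma = dot(c.rgb, vec3(0.299, 0.587, 0.114));\n" +
            "    gl_FragColor = vec4(mix(vec3(luma), c.rgb, uSaturation), c.a);\n" +
            "}\n";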
Similarly, you could use the OpenGL tools with MediaCodec and MediaExtractor to decode video frames. The MediaCodec would be configured to output to a SurfaceTexture, so you would not need to do much more than code some boilerplate to get the output buffers rendered. The filtering process would be the same as with a MediaPlayer. There are a number of examples using MediaCodec as a decoder available, e.g. here and here. It should be fairly straightforward to substitute the TextureView or SurfaceView used in those examples with the GLSurfaceView of Prokash's example.
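As a hedged sketch of that boilerplate (videoPath and surfaceTexture stand for a local file path and the SurfaceTexture your GL renderer created; error handling omitted):

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;

    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(videoPath);
    int track = -1;
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        String mime = extractor.getTrackFormat(i).getString(MediaFormat.KEY_MIME);
        if (mime.startsWith("video/")) { track = i; break; }
    }
    extractor.selectTrack(track);
    MediaFormat format = extractor.getTrackFormat(track);

    MediaCodec decoder = MediaCodec.createDecoderByType(
            format.getString(MediaFormat.KEY_MIME));
    decoder.configure(format, new Surface(surfaceTexture), null, 0);
    decoder.start();
    // Feed input buffers from extractor.readSampleData(), then render each
    // decoded frame with decoder.releaseOutputBuffer(index, true).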
The advantage of this approach is that you have access to all the separate tracks in the media file. Because of that, you should be able to filter the video track with OpenGL and straight copy other tracks for export. You would use a MediaCodec in encode mode with the Surface from the GLSurfaceView as input and a MediaMuxer to put it all back together. You can see several relevant examples at BigFlake.
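For the straight-copy part, something along these lines works (a sketch, assuming the muxer has been started and dstTrack came from muxer.addTrack() with this track's format):

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaMuxer;

    static void copyTrack(MediaExtractor extractor, MediaMuxer muxer, int dstTrack) {
        ByteBuffer buffer = ByteBuffer.allocate(256 * 1024);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int size = extractor.readSampleData(buffer, 0);
            if (size < 0) break;                  // end of this track
            // The extractor's sync-sample flag lines up with the muxer's
            // key-frame flag, so passing it through is the usual shortcut.
            info.set(0, size, extractor.getSampleTime(), extractor.getSampleFlags());
            muxer.writeSampleData(dstTrack, buffer, info);
            extractor.advance();
        }
    }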
You can use a MediaCodec without a Surface to access decoded byte data directly and manipulate it that way. This example illustrates that approach. You can manipulate the data and send it to an encoder for export or render it as you see fit. There is some extra complexity in dealing with the raw byte data. Note that I like this example because it illustrates dealing with the audio and video tracks separately.
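As a trivial illustration of that complexity: brightening decoded YUV output means touching only the luma plane, but real code must respect the color format and stride the codec reports. This sketch assumes a planar layout with the Y plane first and no row padding:

    // Add `delta` to every luma byte of one decoded frame (illustrative only).
    static void brighten(ByteBuffer yuv, int width, int height, int delta) {
        int ySize = width * height;
        for (int i = 0; i < ySize; i++) {
            int y = (yuv.get(i) & 0xFF) + delta;
            yuv.put(i, (byte) Math.max(0, Math.min(255, y)));
        }
    }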
You can also use FFmpeg, either in native code or via one of the Java wrappers out there. This option is more geared towards export than immediate playback. See here or here for some libraries that attempt to make FFmpeg available to Java. They are basically wrappers around the command-line interface. You would need to do some extra work to manage playback via FFmpeg, but it is definitely doable.
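For the saturation example, the command you would hand to such a wrapper looks roughly like this (eq is a standard libavfilter filter; the exact run/exec entry point depends on the wrapper library):

    String[] cmd = {
        "ffmpeg", "-i", "in.mp4",
        "-vf", "eq=saturation=1.5",   // >1.0 boosts saturation
        "-c:a", "copy",               // pass the audio through untouched
        "out.mp4"
    };
    // Hand `cmd` (or the joined string) to the wrapper's exec method.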
If you have questions, feel free to ask, and I will try to expound upon whatever option makes the most sense for your use case.

If you are using a player that supports video filters, then you can do that.
An example of such a player is VLC, which is built around FFmpeg [1].
VLC is pretty easy to compile for Android. Then all you need is libvlc (the aar file) and you can build your own app. See the compile instructions here.
You will also need to write your own filter module. Just duplicate an existing one and modify it. Needless to say, VLC offers strong transcoding and streaming capabilities.
As powerful as VLC for Android is, it has one huge drawback: video filters cannot work with hardware decoding (on Android only). This means that the entire video processing runs on the CPU.
Your other option is to use GLSL/OpenGL over surfaces such as GLSurfaceView and TextureView. This guarantees that the processing runs on the GPU.

Related

Is it possible to render frames in ExoPlayer?

I am pulling H.264 and AAC frames, and at the moment I am feeding them to MediaCodec, decoding and rendering them myself, but the code is getting too complicated and I need to cover all cases. I was wondering whether it's possible to set up an ExoPlayer instance and feed the frames to it as a source.
I can only find that it supports normal files and streams, but not separate frames. Do I need to mux the frames myself, and if so, is there an easy way to do it?
If you mean that you are extracting frames from a video file or a live stream, and then want to work on them individually or display them individually, you may find that OpenCV would suit your use case.
You can fairly simply open a stream or file, go frame by frame and do what you want with the resulting decoded bitmap.
This answer has a Python and Android example that might be useful: https://stackoverflow.com/a/58921325/334402
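A rough sketch with OpenCV's Java bindings (on Android you would initialize the library via the OpenCV loader first; error handling omitted):

    import org.opencv.core.Mat;
    import org.opencv.videoio.VideoCapture;

    VideoCapture cap = new VideoCapture("/sdcard/input.mp4"); // file or stream URL
    Mat frame = new Mat();
    while (cap.read(frame)) {
        // `frame` is a BGR Mat; convert with Utils.matToBitmap() if you
        // need an Android Bitmap for display.
    }
    cap.release();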

Editing video frames in ExoPlayer

I have an encoded video stream that I'm playing through ExoPlayer. What I want to do is get each frame of the video and edit it before it is displayed (e.g. changing some pixels).
Is it possible to do this with ExoPlayer? I've been looking at the implementation of MediaCodecVideoRenderer.java in the ExoPlayer source, but it seems that each MediaCodec releases its output buffer to a surface itself, without any possibility of editing the frame before rendering.
It will depend on exactly what you want to modify, but it is possible to use a GLSurfaceView, listen for each frame, and then transform the frame, assuming it is not encrypted (with encrypted content you can usually still apply a transformation, but you definitely should not be able to read the frame itself).
There is a good example project which does something similar to apply filters to videos, extending ExoPlayer - take a look at the EPlayerRenderer class in particular.
https://github.com/MasayukiSuda/ExoPlayerFilter
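Typical usage, paraphrased from that repo's README; the class and method names here are from memory, so check the repo for the current API:

    SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context);
    // ...prepare the player with your MediaSource as usual...
    EPlayerView ePlayerView = new EPlayerView(context);
    ePlayerView.setSimpleExoPlayer(player);
    ePlayerView.setGlFilter(new GlSepiaFilter()); // or your own GlFilter subclass
    ePlayerView.onResume();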
You can also do a similar thing with OpenCV - read in a frame, modify it, and then display it. This may be easier if you are doing complicated image manipulations.

Sniff H.264 raw stream at the video decoder abstract layer

I'm trying to make an H.264 sniffer for my Android distribution.
Basically, what I want is to dump any H.264 stream that is passed from an arbitrary Android application by intercepting it on its way to the hardware video decoder.
I have an ODROID-C1 board (Amlogic S805 SoC) and my Android build setup is ready. Now, what I need to know is where the code is called when a new H.264 frame is ready to be sent for decoding. Surely there must be a common place?
When searching the build tree for files referencing H.264 or OMX, I get various results, including libstagefright/omx/, ffmpeg/libavcodec/, LibPlayer/amffmpeg/ (Amlogic's own fork of FFmpeg) [..].
If you have any ideas or names of functions that are part of the video decoding path, I'll take them :). Thanks!
As far as I know, the Amlogic CPU contains two DSPs for audio and video decoding. Data is delivered to them through the amports driver in kernel space.
The userspace part of this driver is libamcodec, which provides a thin layer over the driver.
I do not know which layers in Android are involved, but most probably they use Amlogic's ffmpeg, which uses libamplayer as a middle layer, which in the end calls into libamcodec.

Android MediaRecorder to record a Surface (not Camera)

I'm looking at the MediaRecorder class of the Android SDK, and I was wondering if it can be used to record a video made from a Surface.
Example: I want to record what I display on my surface (a video game?) into a file.
As I said in the title: I'm not looking to record anything from the camera.
I think it is possible by overriding most of the class, but I'd very much like some ideas...
Besides, I'm not sure how the Camera class is used in MediaRecorder, or what I should get from my Surface to replace it.
Thank you for your interest!
PS: I'm looking at the native code used by MediaRecorder to get some clues; maybe it will inspire someone else:
http://www.netmite.com/android/mydroid/frameworks/base/media/jni/
The ability to record from a Surface was added in Android Lollipop. Here is the documentation:
http://developer.android.com/about/versions/android-5.0.html#ScreenCapture
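The shape of that API, as a hedged sketch (call order matters with MediaRecorder, and getSurface() is only valid between prepare() and start(); sizes, bitrate, and the output path are illustrative):

    import android.media.MediaRecorder;
    import android.view.Surface;

    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setVideoSize(1280, 720);
    recorder.setVideoFrameRate(30);
    recorder.setVideoEncodingBitRate(4_000_000);
    recorder.setOutputFile("/sdcard/capture.mp4");
    recorder.prepare();
    Surface input = recorder.getSurface(); // render your frames into this
    recorder.start();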
Android 4.3 (API 18) adds some new features that do exactly what you want. Specifically, the ability to provide data to MediaCodec from a Surface, and the ability to store the encoded video as a .mp4 file (through MediaMuxer).
Some sample code is available here, including a patch for a Breakout game that records the game as you play it.
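Condensed, the API 18 pattern those samples use looks like this: an encoder whose input is a Surface you render into with GL (parameters are illustrative):

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    Surface inputSurface = encoder.createInputSurface(); // draw GL frames here
    encoder.start();
    // Drain the encoder's output buffers into a MediaMuxer to get a .mp4.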
This is unfortunately not possible at the Java layer. All the communication between the Camera and Media Recorder happens in the native code layers, and there's no way to inject non-Camera data into that pipeline.
While Android 4.1 added the MediaCodec APIs, which allow access to the device's video encoders, there's no easy-to-use way to take the resulting encoded streams and save them as a standard video file. You'd have to find a library to do that or write one yourself.
You may wish to trace the rabbit hole from a different folder in AOSP:
frameworks/av/media
as long as you are comfortable with the NDK (C/C++, JNI, ...) and Android (permissions, ...).
It goes quite deep, and I am not sure how far you can go on a non-rooted device.
Here's an article on how to draw on an EGLSurface and generate a video using MediaCodec:
https://www.sisik.eu/blog/android/media/images-to-video
This uses OpenGL ES, but you can instead have MediaCodec provide a surface and then obtain a canvas from it to draw on, with no need for OpenGL ES.
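A sketch of that Canvas route. Note that software lockCanvas() on a codec input surface is unreliable on many devices; lockHardwareCanvas() (API 23+) is the safer call. Here `encoder` is a configured video encoder as above, and `frameIndex` is an assumed loop counter:

    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import android.view.Surface;

    Surface inputSurface = encoder.createInputSurface();
    Canvas canvas = inputSurface.lockHardwareCanvas();
    canvas.drawColor(Color.BLACK);
    Paint paint = new Paint();
    paint.setColor(Color.WHITE);
    paint.setTextSize(64f);
    canvas.drawText("frame " + frameIndex, 40f, 120f, paint);
    inputSurface.unlockCanvasAndPost(canvas);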

How to render video and audio

I am trying to implement my own media player. What is the best way to render video and audio? At this point I am thinking to use SurfaceView and AudioTrack classes, but not sure if it is the best option. I am interested in SDK and NDK solutions.
File output on a regular desktop is non-blocking; that is, the OS takes care of buffering, and actual disk writes are asynchronous to the thread that initiates the output. Does the same principle apply to video and audio output? If not, I would need to run a separate thread to handle output asynchronously from decoding/demuxing.
What free software decoders are available for Android? I am thinking of using ffmpeg. Can a relatively recent (say, top 30% in terms of CPU power) tablet handle 1,280×720 and 1,920×1,080 formats in software mode?
Rock Player is an open source player for Android (see its official site). You can get the source from its source code download page. It uses ffmpeg, which is an LGPL library. The Rock Player developers put extra effort into writing some assembly to make decoding faster.
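On the blocking question: AudioTrack.write() in streaming mode blocks until the data is queued, so decoding and audio output normally live on separate threads. A minimal playback loop, where decodeNextChunk() is a hypothetical decoder call that fills the buffer with PCM:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    int sampleRate = 44100;
    int bufSize = AudioTrack.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
            AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT,
            bufSize, AudioTrack.MODE_STREAM);
    track.play();
    byte[] pcm = new byte[bufSize];
    while (decodeNextChunk(pcm)) {        // hypothetical: fills pcm with samples
        track.write(pcm, 0, pcm.length);  // blocks until queued
    }
    track.stop();
    track.release();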
