Processing Android video frame by frame while recording

What I am attempting to do is create an application that adds effects to videos while recording. Is there any way to have a callback method receive each frame, apply an effect to it, and have the result recorded? There is currently an application on the Android Market (Videocam Illusion) that claims to be the only application that can do this. Does anyone know how Videocam Illusion does this, or have links to tutorials on video processing for Android?
This is a similar question that is unanswered:
Android preview processing while video recording

Unfortunately (unless I'm unaware of some other method provided by the API), the way this is done is by opening a direct stream to the camera and manipulating it with native code. I've done something similar before when I was working on an eye tracker, so I'll describe roughly how it works:
Open a stream to the camera using the NDK (possibly the Java API, depending on the implementation).
Modify the bytes of the stream: each frame arrives as a separate packet, so you grab each packet from the camera and modify it. You can replace colors or translate pixels, and you can also use OpenGL to transform the image entirely, adding things like glass effects.
Flatten the images back out.
Send each image over to the view controller to be displayed.
One thing to be mindful of is that the loading and sending of the packets and images must happen in about 1/30th of a second per frame, so the code has to be extremely optimized.
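To make the frame-grabbing step concrete, here is a minimal Java sketch using the (since deprecated) android.hardware.Camera preview callback. This only hands you the raw frames; actually recording the modified output would still require feeding them to an encoder such as MediaCodec, since MediaRecorder cannot consume modified frames.

```java
import android.hardware.Camera;
import java.util.Arrays;

// A minimal sketch of per-frame access with the (since deprecated)
// android.hardware.Camera API. Each NV21 preview frame arrives in
// onPreviewFrame, where its bytes can be modified before encoding.
public class FrameGrabber implements Camera.PreviewCallback {

    private final int lumaSize; // NV21: Y plane first, then interleaved VU

    public FrameGrabber(Camera camera) {
        Camera.Size s = camera.getParameters().getPreviewSize();
        lumaSize = s.width * s.height;
        // NV21 uses 12 bits per pixel; preallocate one reusable buffer.
        camera.addCallbackBuffer(new byte[lumaSize * 3 / 2]);
        camera.setPreviewCallbackWithBuffer(this);
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Example effect: neutralize the chroma plane to get a grayscale frame.
        Arrays.fill(data, lumaSize, data.length, (byte) 128);

        // Hand 'data' to a MediaCodec encoder here to record the result;
        // MediaRecorder alone cannot consume the modified frames.

        camera.addCallbackBuffer(data); // return the buffer for reuse
    }
}
```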

Related

Exoplayer 2: Play video in reverse

My Android app plays videos in ExoPlayer 2, and now I'd like to play a video backwards.
I searched around a lot and found only the idea of converting it to a GIF, and this from WeiChungChang.
Is there any more straightforward solution? Another player or a library that implements this for me is probably too much to ask, but converting to a reversed GIF gave me a lot of memory problems and I don't know what to do with the WeiChungChang idea. Playing only MP4 in reverse would be enough, though.
Videos are frequently encoded such that the encoding of a given frame depends on one or more frames before it, and sometimes also on one or more frames after it.
In other words, to reconstruct a frame correctly you may need to refer to one or more previous and one or more subsequent frames.
This allows a video encoder to reduce file or transmission size by fully encoding only the reference frames (often called I-frames) and, for the frames before and/or after them, storing only the delta relative to the reference frames.
Playing a video backwards is not a common player function, and a player would typically have to decode the video as usual (i.e. forwards) to get the frames and then play them in reverse order.
You could extend ExoPlayer to do this yourself, but it may be easier to manipulate the video on the server side first if possible - there are tools that will reverse a video so your players can then play it as normal, for example https://www.videoreverser.com, https://www.kapwing.com/tools/reverse-video etc
If you need to reverse it on the device for your use case, then you could use ffmpeg to achieve this - see an example ffmpeg command here:
https://video.stackexchange.com/a/17739
If you are using ffmpeg, it is generally easiest to use it via a wrapper on Android such as this one, which will also allow you to test the command before you add it to your app:
https://github.com/WritingMinds/ffmpeg-android-java
Note that video manipulation is time- and processor-hungry, so this may be slow and consume more battery than you want on a mobile device if the video is long.
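For reference, a hedged sketch of running that reverse command through the WritingMinds wrapper. The package names and the execute() signature here are assumptions based on that project, so verify them against its README before relying on them:

```java
// Hedged sketch: reversing a clip with ffmpeg's reverse/areverse filters via
// the ffmpeg-android-java wrapper. Package names and the execute() signature
// are assumptions based on that project; verify against its README.
import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;

public class ReverseClip {
    public static void reverse(android.content.Context ctx,
                               String inPath, String outPath) throws Exception {
        // Equivalent shell command: ffmpeg -i in.mp4 -vf reverse -af areverse out.mp4
        // (FFmpeg.loadBinary(...) must have been called once beforehand.)
        String[] cmd = {"-i", inPath, "-vf", "reverse", "-af", "areverse", outPath};
        FFmpeg.getInstance(ctx).execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override public void onSuccess(String message) { /* reversed file ready */ }
            @Override public void onFailure(String message) { /* log and surface error */ }
        });
    }
}
```

Note that ffmpeg's reverse filter buffers frames in memory, so this approach is best suited to short clips.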

Editing video frames in ExoPlayer

I have an encoded video stream that I'm playing through ExoPlayer. What I want to do is get each frame of the video and edit it before it is displayed (e.g. changing some pixels).
Is it possible to do this with ExoPlayer? I've been looking at the implementation of MediaCodecVideoRenderer.java in the ExoPlayer source, but it seems that each MediaCodec releases its output buffer to a surface itself, without any possibility of editing the frame before rendering.
It will depend on exactly what you want to modify, but it is possible to use a GLSurfaceView, listen for each frame, and then transform the frame, assuming it is not encrypted (with encrypted content you can usually still apply transformations, but you definitely should not be able to read the frame itself).
There is a good example project that does something similar to apply filters to videos by extending ExoPlayer - take a look at the EPlayerRenderer class in particular.
https://github.com/MasayukiSuda/ExoPlayerFilter
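For orientation, a hedged sketch of wiring that library up. The package and class names (com.daasuu.epf.EPlayerView, GlSepiaFilter, setGlFilter) are as I recall them from the project's README, so treat them as assumptions and verify against the repository:

```java
import android.content.Context;
import android.widget.FrameLayout;
import com.daasuu.epf.EPlayerView;              // assumed package, per the project README
import com.daasuu.epf.filter.GlSepiaFilter;     // assumed filter class from the same project
import com.google.android.exoplayer2.SimpleExoPlayer;

// Hedged sketch of the ExoPlayerFilter route: decoded frames are routed into
// a GL texture and redrawn through a filter's fragment shader.
public class FilterSetup {
    public static void attach(Context context, FrameLayout layout, SimpleExoPlayer player) {
        EPlayerView playerView = new EPlayerView(context);
        playerView.setSimpleExoPlayer(player);       // route decoded frames into a GL texture
        playerView.setGlFilter(new GlSepiaFilter()); // any GlFilter subclass edits each frame
        layout.addView(playerView);
        playerView.onResume();                       // EPlayerView extends GLSurfaceView
    }
}
```

Any GlFilter subclass can carry a custom fragment shader, which is where the per-pixel edits happen.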
You can also do a similar thing with OpenCV - read in a frame, modify it, and then display it. This may be easier if you are doing complicated image manipulations.

Video in Android : change visual properties (e.g. saturation, brightness)

Assuming we have a Surface in Android that displays a video (e.g. h264) with a MediaPlayer:
1) Is it possible to change the saturation, contrast & brightness of the video displayed on the surface? In real time? E.g. images can use setColorFilter - is there anything similar in Android to process the video frames?
Alternative question (if no. 1 is too difficult):
2) If we would like to export this video with e.g. an increased saturation, we should use a Codec, e.g. MediaCodec. What technology (method, class, library, etc...) should we use before the codec/save action to apply the saturation change?
For display only, one easy approach is to use a GLSurfaceView, a SurfaceTexture to render the video frames, and a MediaPlayer. Prokash's answer links to an open source library that shows how to accomplish that. There are a number of other examples around if you search those terms together. Taking that route, you draw video frames to an OpenGL texture and create OpenGL shaders to manipulate how the texture is rendered. (I would suggest asking Prokash for further details and accepting his answer if this is enough to meet your requirements.)
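To illustrate that route, here is a minimal sketch of the MediaPlayer-to-SurfaceTexture wiring, which is the part most examples share. Shader compilation and the full-screen quad are omitted for brevity:

```java
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.view.Surface;

// Minimal sketch of the GLSurfaceView route: MediaPlayer decodes into a
// SurfaceTexture bound to an external OpenGL texture; a fragment shader then
// controls how each frame is drawn (saturation, contrast, brightness).
public class VideoRenderer implements GLSurfaceView.Renderer {
    private final GLSurfaceView view;
    private final MediaPlayer player;
    private SurfaceTexture surfaceTexture;

    VideoRenderer(GLSurfaceView view, MediaPlayer player) {
        this.view = view;
        this.player = player;
    }

    @Override
    public void onSurfaceCreated(javax.microedition.khronos.opengles.GL10 gl,
                                 javax.microedition.khronos.egl.EGLConfig cfg) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

        surfaceTexture = new SurfaceTexture(tex[0]);
        surfaceTexture.setOnFrameAvailableListener(st -> view.requestRender());
        player.setSurface(new Surface(surfaceTexture)); // decode into the texture
    }

    @Override
    public void onDrawFrame(javax.microedition.khronos.opengles.GL10 gl) {
        surfaceTexture.updateTexImage(); // latch the newest decoded frame
        // Draw a full-screen quad sampling samplerExternalOES; the fragment
        // shader applies the per-pixel saturation/contrast/brightness math.
    }

    @Override
    public void onSurfaceChanged(javax.microedition.khronos.opengles.GL10 gl,
                                 int w, int h) {
        GLES20.glViewport(0, 0, w, h);
    }
}
```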
Similarly, you could use the OpenGL tools with MediaCodec and MediaExtractor to decode video frames. The MediaCodec would be configured to output to a SurfaceTexture, so you would not need to do much more than code some boilerplate to get the output buffers rendered. The filtering process would be the same as with a MediaPlayer. There are a number of examples using MediaCodec as a decoder available, e.g. here and here. It should be fairly straightforward to substitute the TextureView or SurfaceView used in those examples with the GLSurfaceView of Prokash's example.
The advantage of this approach is that you have access to all the separate tracks in the media file. Because of that, you should be able to filter the video track with OpenGL and straight copy other tracks for export. You would use a MediaCodec in encode mode with the Surface from the GLSurfaceView as input and a MediaMuxer to put it all back together. You can see several relevant examples at BigFlake.
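A hedged sketch of that export wiring, with illustrative bitrate and framerate values; the drain loop is only outlined in comments:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.view.Surface;
import java.io.IOException;

// Sketch of the export pipeline: an H.264 encoder whose input Surface your
// OpenGL filter renders into, with a MediaMuxer writing the result to a file.
public class ExportPipeline {
    public static void setUp(int width, int height, String outPath) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000); // illustrative values
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface(); // render filtered frames here
        encoder.start();

        MediaMuxer muxer = new MediaMuxer(outPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        // Drain encoder output in a loop: add the video track to the muxer on
        // INFO_OUTPUT_FORMAT_CHANGED, write each encoded buffer with
        // muxer.writeSampleData(), and copy untouched tracks straight from a MediaExtractor.
    }
}
```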
You can use a MediaCodec without a Surface to access decoded byte data directly and manipulate it that way. This example illustrates that approach. You can manipulate the data and send it to an encoder for export or render it as you see fit. There is some extra complexity in dealing with the raw byte data. Note that I like this example because it illustrates dealing with the audio and video tracks separately.
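A condensed sketch of that Surface-less decode loop (video track selection omitted for brevity):

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

// Decoding with no output Surface so the raw (typically YUV) bytes of each
// frame are directly accessible for manipulation.
public class RawFrameDecoder {
    public static void decode(String path, int videoTrackIndex) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);
        extractor.selectTrack(videoTrackIndex);
        MediaFormat format = extractor.getTrackFormat(videoTrackIndex);

        MediaCodec decoder =
                MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        decoder.configure(format, /* surface= */ null, /* crypto= */ null, 0);
        decoder.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        while (true) {
            if (!inputDone) {
                int inIndex = decoder.dequeueInputBuffer(10_000);
                if (inIndex >= 0) {
                    ByteBuffer inBuf = decoder.getInputBuffer(inIndex);
                    int size = extractor.readSampleData(inBuf, 0);
                    if (size < 0) { // end of stream: signal EOS to the decoder
                        decoder.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        decoder.queueInputBuffer(inIndex, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                ByteBuffer yuv = decoder.getOutputBuffer(outIndex);
                // Modify the raw frame bytes here, or hand them to an encoder.
                decoder.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
            }
        }
        decoder.stop();
        decoder.release();
        extractor.release();
    }
}
```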
You can also use FFMpeg, either in native code or via one of the Java wrappers out there. This option is more geared towards export than immediate playback. See here or here for some libraries that attempt to make FFMpeg available to Java. They are basically wrappers around the command line interface. You would need to do some extra work to manage playback via FFMpeg, but it is definitely doable.
If you have questions, feel free to ask, and I will try to expound upon whatever option makes the most sense for your use case.
If you are using a player that supports video filters, then you can do that.
Example of such a player is VLC, which is built around FFMPEG [1].
VLC is pretty easy to compile for Android. Then all you need is the libvlc (aar file) and you can build your own app. See compile instructions here.
You will also need to write your own module. Just duplicate an existing one and modify it. Needless to say that VLC offers strong transcoding and streaming capabilities.
As powerful as VLC for Android is, it has one huge drawback - video filters cannot work with hardware decoding (on Android only). This means that the entire video processing runs on the CPU.
Your other option is to use GLSL / OpenGL over surfaces like GLSurfaceView and TextureView, which guarantees GPU acceleration.

Android Reverse Playback and Frame by Frame

I am trying to create an app with the following features:
normal video playback
slower video playback
frame by frame
reverse video playback (normal, slower, frame by frame)
seekable to specific times
video scrubbing
no video sound needed
video is recorded via the device's camera
The closest comparison to an app, would be the Ubersense Coach app for iOS and Coach's Eye on Android, though there are a few others, and they have all these features.
I have looked into several choices so far:
First the built in Android Media Player, which can't really do anything I need.
Then the MediaExtractor/decoder, looking through the code for Grafika (https://github.com/google/grafika), which can't play backwards.
Tried to pull out each frame as needed with MediaMetadataRetriever, which is too slow (100 ms per frame), and the GC gets in the way.
Looked for a library that could potentially solve the issue, without luck so far.
With MediaExtractor I already have the ability to play back video forward, frame by frame or at full speed. But I do not have that luxury in reverse, and seeking takes some time since I need it free of artifacts.
When trying to go in reverse, even seeking to a previous sync frame and advancing to the frame before the one I currently have is not doable without huge lag (as expected).
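To illustrate the approach just described, a hedged sketch; the timing helpers (currentFrameTimeUs, frameDurationUs) are placeholders, and the decode loop is only outlined:

```java
import android.media.MediaExtractor;

// Hedged sketch of the reverse-stepping idea: seek back to the previous sync
// (key) frame, then decode forward and discard frames until reaching the one
// just before the current position. The re-decoding is what causes the lag.
public class ReverseStepper {
    public static void stepBackward(MediaExtractor extractor,
                                    long currentFrameTimeUs, long frameDurationUs) {
        long target = currentFrameTimeUs - frameDurationUs; // frame to show next
        extractor.seekTo(target, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
        // From here, feed samples to a decoder and discard output frames until
        // the presentation timestamp reaches 'target'; the cost grows with the
        // distance between 'target' and the preceding key frame.
    }
}
```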
I am wondering if there is a better codec I could use, or a library I have yet to stumble upon. I would rather avoid having to create something custom in native code if possible.

How to stream live webcam video in AndEngine?

Hi, I'm using the game engine AndEngine, and I want to be able to stream live video from a webcam on a robot to my Android app. The reason I'm using AE is that I need game controls to drive my robot. However, I have no idea how to stream video when using AndEngine (or even when not using it, for that matter). The controls and the video feed need to be on the same screen (unless there's absolutely no other way). My question is: how would one put a video stream on top of an AndEngine scene, and/or how would one format that feed so that it didn't obscure the controls? (They're oriented in the bottom left and top right of the screen, which is a pain I know, but I don't think I can change that due to some problems with multi-touch on my device.)
Look at the Augmented Reality example on GitHub.
https://github.com/nicolasgramlich/AndEngineExamples
It could be of use to you. However, I know that this example was problematic and didn't work when I tried it, but maybe you'll have more luck.
