encoding images to video with ffmpeg - android

We are working on an Android 3D animation app.
We need to identify images, then save and encode them to video using FFmpeg (since the Android API does not support this). Once the video is generated, audio is appended to it.
We are facing two problems with this.
The first is a memory leak when saving the identified images for encoding; the emulator's CPU gets overloaded. Is FFmpeg invoked every time an image is selected? How can we resolve this?
Second (assuming we get through the first one), we are not able to encode the selected images: the result is a green video. What could be the reason for this?
Is there any tool other than FFmpeg for encoding images to H.264 video?
Does the image type (raster or vector) affect the video encoding?
Does the Android OS version matter?
Any input on this will be greatly appreciated.
Thanks

I have also played with the idea of using ffmpeg on an Android phone, but I would suggest doing it on a server, which has much more power. On a server you don't need to worry about the CPU load of a smartphone.
In general, to improve your ffmpeg runs you need to post the ffmpeg calls you are using. ffmpeg is quite complex, and the order of the parameters directly affects efficiency.
I don't know which container format you prefer, but maybe a simple MJPEG codec could work for you. AFAIK that is just the JPEG frames concatenated one after another, which should be much simpler than encoding the video to H.264/x264 (ffmpeg uses the latter).
A combination of both might be to generate an MJPEG stream which is converted on the server side to an H.264 video that the client can then download. But that really depends on the length of the video, if you don't want to waste your customers' traffic.
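For reference, here is a minimal sketch of the kind of single ffmpeg invocation that stitches a numbered JPEG sequence into an H.264 MP4 and muxes in an MP3 track, driven from Java through ProcessBuilder against a bundled ffmpeg binary. The binary location, file names and encoder availability (libx264, aac) depend on your ffmpeg build and are assumptions here. Running ffmpeg once over the whole sequence, rather than once per image, keeps the per-frame overhead down, and forcing -pix_fmt yuv420p is worth trying, since an unexpected pixel format is a common cause of green or garbled output.

    import java.io.File;
    import java.io.IOException;

    // Sketch only: stitch numbered JPEG frames into an H.264 MP4 and append an
    // MP3 track by invoking an ffmpeg binary once for the whole sequence.
    // The binary path, frame directory and file names are placeholders.
    public class FrameEncoder {

        public static void encode(File ffmpegBinary, File frameDir, File audioMp3, File output)
                throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    ffmpegBinary.getAbsolutePath(),
                    "-framerate", "25",                 // input frame rate of the image sequence
                    "-i", new File(frameDir, "frame_%04d.jpg").getAbsolutePath(),
                    "-i", audioMp3.getAbsolutePath(),   // audio track to append
                    "-c:v", "libx264",                  // assumes the build includes libx264
                    "-pix_fmt", "yuv420p",              // widely supported pixel format; a
                                                        // mismatch here often shows up as green frames
                    "-c:a", "aac",                      // assumes the build includes an AAC encoder
                    "-shortest",                        // stop when the shorter stream ends
                    output.getAbsolutePath());
            pb.redirectErrorStream(true);
            Process p = pb.start();
            int exitCode = p.waitFor();
            if (exitCode != 0) {
                throw new IOException("ffmpeg failed with exit code " + exitCode);
            }
        }
    }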

Related

Exoplayer 2: Play video in reverse

My Android app plays videos in ExoPlayer 2, and now I'd like to play a video backwards.
I searched around a lot and found only the idea of converting it to a GIF, and this from WeiChungChang.
Is there a more straightforward solution? Another player or a library that implements this for me is probably too much to ask, but converting to a reversed GIF gave me a lot of memory problems, and I don't know what to do with the WeiChungChang idea. Playing only MP4 in reverse would be enough, though.
Videos are frequently encoded such that the encoding for a given frame depends on one or more frames before it, and sometimes on one or more frames after it as well.
In other words, to reconstruct a frame correctly you may need to refer to one or more previous and one or more subsequent frames.
This allows a video encoder to reduce file or transmission size by fully encoding the information only for reference frames, sometimes called I-frames, and storing only the delta to those reference frames for the frames before and/or after them.
Playing a video backwards is not a common player function, and a player would typically have to decode the video as usual (i.e. forwards) to get the frames and then play them in reverse order.
You could extend ExoPlayer to do this yourself, but it may be easier to manipulate the video on the server side first, if possible - there are tools which will reverse a video so that your players can play it as normal, for example https://www.videoreverser.com, https://www.kapwing.com/tools/reverse-video etc.
If you need to reverse it on the device for your use case, then you could use ffmpeg on the device to achieve this - see an example ffmpeg command to do this here:
https://video.stackexchange.com/a/17739
If you are using ffmpeg it is generally easiest to use it via a wrapper on Android such as this one, which will also allow you to test the command before you add it to your app:
https://github.com/WritingMinds/ffmpeg-android-java
Note that video manipulation is time and processor hungry so this may be slow and consume more battery than you want on your mobile device if the video is long.
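For illustration, a minimal sketch of that reverse command driven from Java via ProcessBuilder, assuming an ffmpeg binary is already bundled with the app; the paths are placeholders. The reverse and areverse filters buffer the whole stream in memory, so this is only practical for short clips, which matches the caution above.

    import java.io.IOException;

    public class ReverseClip {
        // Sketch only: reverse a short clip with ffmpeg's reverse/areverse filters.
        // Both filters buffer the entire stream, so keep the clip short.
        public static int reverse(String ffmpegPath, String input, String output)
                throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    ffmpegPath,
                    "-i", input,
                    "-vf", "reverse",    // reverse the video frames
                    "-af", "areverse",   // reverse the audio to match
                    output);
            pb.redirectErrorStream(true);
            return pb.start().waitFor();
        }
    }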

Images to Video converter in Android

I would like to convert multiple images (frames) to a video (MP4) on an Android device. I would also like to convert a video (MP4) into multiple images (one for each frame). I have limited knowledge of FFMPEG, and installing FFMPEG on Android may consume more time. I would like to ask experienced engineers to suggest a better strategy which can take less time to complete this task. Please point me to some open source code which I may modify to complete this task quickly.
First you need to convert each image to YUV format using an image decoder.
Then you can feed each YUV image as video input to the media recorder engine.
Go through the MediaRecorder source code to get more info.
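If you do end up with ffmpeg on the device instead, the video-to-images direction asked about above is a single command. A minimal sketch follows, with placeholder paths and an assumed ffmpeg binary location; the images-to-video direction is the same idea with the arguments swapped (numbered JPEGs in, -c:v libx264 out), as sketched earlier in this page.

    import java.io.IOException;

    // Sketch only: dump every frame of an MP4 to numbered JPEG files with ffmpeg.
    // Binary and file paths are placeholders.
    public class FrameDumper {
        public static int dumpFrames(String ffmpegPath, String inputMp4, String outputDir)
                throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    ffmpegPath,
                    "-i", inputMp4,
                    outputDir + "/frame_%04d.jpg"); // one JPEG per decoded frame
            pb.redirectErrorStream(true);
            return pb.start().waitFor();
        }
    }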

How to record the http live streaming from an IP Cam

I have created an application in which the client can view an IP camera which provides an HTTP live stream of MJPEG using this link:
Android ICS and MJPEG using AsyncTask
Now I want the user to be able to record the video to the memory card.
I have googled for a while and only two approaches come to mind:
Either I keep storing the JPEG images and, when the user clicks stop recording, I somehow join all the images into a 3GP video or some other file format. But I don't know how to create the video from all the images, or whether this would be an efficient approach.
Or I use FFmpeg, in which case I will have to deal with the NDK, and that seems to be a longer path which may lead to nowhere :P
So is FFmpeg a better option? If yes, please share some links, or is the first option better?
Thanks in advance
FFmpeg is the better option, but you'll probably get stuck with a pretty poor encoding resolution/compression. Maybe some low-quality MPEG-4 like Xvid will work, but even that might demand more performance than the CPU can deliver.
Android doesn't have an API to access the video encoder logic in the SoC, so a native implementation is pretty much your only choice. If so, FFmpeg through the NDK is probably the easiest.
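If you go with the first approach anyway (store the JPEGs, stitch later), here is a minimal sketch of the storage side, assuming you already receive each MJPEG frame as a byte array; the directory layout and file naming are placeholders. A single ffmpeg pass (e.g. ffmpeg -framerate 25 -i frame_%05d.jpg ... out.mp4) can then join the files when the user presses stop.

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.Locale;

    // Sketch only: write each incoming MJPEG frame to a sequentially numbered
    // JPEG file so a later ffmpeg pass can stitch them into a video.
    public class MjpegFrameStore {
        private final File dir;
        private int frameIndex = 0;

        public MjpegFrameStore(File recordingDir) {
            this.dir = recordingDir;
        }

        /** Writes one JPEG frame (as received from the camera stream) to disk. */
        public synchronized void writeFrame(byte[] jpegBytes) throws IOException {
            File out = new File(dir, String.format(Locale.US, "frame_%05d.jpg", frameIndex++));
            try (FileOutputStream fos = new FileOutputStream(out)) {
                fos.write(jpegBytes);
            }
        }
    }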

Android: accessing the images that make up a video

I'm making an app that takes a video and does some computation on it. I need to carry out this computation on individual frames of the video. So, I have two questions:
Are videos on Android stored as a sequence of pictures? (I've seen a lot of Android devices that advertise 25-30 fps cameras.) If so, can I, as a developer, get access to the frames that make up a video, and how?
If not, is there any way for me to generate at least 15-20 distinct frames per second from a video taken on an Android device (and, of course, do the computation on those frames)?
Videos are stored as videos. To manipulate frames you can use the FFMPEG library. There are FFMPEG ports to Android, such as in the Dolphin open-source player. This would require C/C++ programming with the NDK, though.
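If the computation can live with sampled frames rather than a true per-frame decode pipeline, the framework's MediaMetadataRetriever can pull individual frames as Bitmaps without ffmpeg or the NDK. A minimal sketch follows, with the sampling rate and paths as placeholders; note that OPTION_CLOSEST decodes exact frames and is slow, and keeping every frame in memory will not scale to long videos.

    import android.graphics.Bitmap;
    import android.media.MediaMetadataRetriever;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch only: sample frames from a video file at a fixed rate using the
    // platform MediaMetadataRetriever. For long videos, process each Bitmap and
    // release it instead of collecting them all in a list.
    public class FrameGrabber {
        public static List<Bitmap> grabFrames(String videoPath, int framesPerSecond, long durationMs)
                throws IOException {
            MediaMetadataRetriever retriever = new MediaMetadataRetriever();
            List<Bitmap> frames = new ArrayList<>();
            try {
                retriever.setDataSource(videoPath);
                long stepUs = 1_000_000L / framesPerSecond;      // microseconds between samples
                for (long t = 0; t < durationMs * 1000L; t += stepUs) {
                    Bitmap frame = retriever.getFrameAtTime(t, MediaMetadataRetriever.OPTION_CLOSEST);
                    if (frame != null) {
                        frames.add(frame);                       // hand off to your computation here
                    }
                }
            } finally {
                retriever.release();
            }
            return frames;
        }
    }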

Example encoding video using FFMPEG for Android

I am working on an Android app that needs to do the following:
- capture an (animated) view to video, including audio (from an MP3 file)
- encode the captured video (probably a bunch of raw image buffers) and the audio to AVI.
After searching, FFMPEG seems the most suitable. Does anybody have sample code to accomplish what I need? I would really appreciate it.
Whyhow
It's not clear what you mean by 'an (animated) view' to capture, but be aware that Android apps running with normal permissions cannot access the raw framebuffer. The computation part of ffmpeg builds in the NDK without undue work, and there is a lot you can read about it on the web, but the output (or, in your case, input) drivers are a bit of a permissions problem. Also, you should expect encoding to be much slower than real time unless you can somehow leverage the hardware acceleration features of your particular device's SoC.
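On the capture side, an app can render its own view hierarchy offscreen without touching the raw framebuffer. A minimal sketch of grabbing one frame per animation step as a Bitmap, assuming the view has already been measured and laid out; the resulting buffers would then be fed to whatever encoder you settle on (ffmpeg, an .avi writer, etc.).

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.view.View;

    // Sketch only: draw a view into an offscreen Bitmap once per animation step.
    public class ViewCapture {
        public static Bitmap captureFrame(View view) {
            Bitmap bitmap = Bitmap.createBitmap(
                    view.getWidth(), view.getHeight(), Bitmap.Config.ARGB_8888);
            Canvas canvas = new Canvas(bitmap);
            view.draw(canvas);   // renders the view (and its children) into the bitmap
            return bitmap;
        }
    }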
If you are building your app for Android, you can use .avi writer code. You can get this code from the Koders website; search for "Koders site" on Google and you will find the link. I have tested the .avi file writer code and it works fine.
