We need an Android app that can encode a folder of images to a video. I have been looking for solutions for a while now, but cannot find anything good. The Android API does not support it. We are trying ffmpeg, but cannot get it to work. We need a working solution; using ffmpeg is not mandatory. A full Android Java solution is also a possibility, since this would work on all Android devices, possibly at the cost of some performance.
The app also needs to be able to add an audio track to the movie if the user chooses to do this.
Any help would be appreciated.
Kind regards,
Aäron
From the FFmpeg FAQ entry "How do I encode single pictures into movies?":
First, rename your pictures to follow a numerical sequence. For example, img1.jpg, img2.jpg, img3.jpg,... Then you may run:
ffmpeg -f image2 -i img%d.jpg /tmp/a.mpg
Adding an audio track should just involve adding another input (e.g., -i audio.mp3), but it may also require explicit stream mapping with -map on older versions.
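For example, a rough sketch combining the two (the file names and MP3 audio track are assumptions, and exact options vary between ffmpeg versions):
ffmpeg -f image2 -i img%d.jpg -i audio.mp3 -map 0:v -map 1:a -shortest /tmp/a.mpg
Here -map selects the video from the first input and the audio from the second, and -shortest stops encoding when the shorter of the two ends.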
I'm working on a feature in which I want to add a picture over a video and save the result to the SD card.
In general, the user selects an image with a semi-transparent background and places it over the video; after pressing the save button, the user gets a new video with the image overlaid on it.
I have heard about ffmpeg and have seen some of the commands it provides, but I don't know where to start. Can anyone provide an example of this?
Thank you.
One common approach is to use an ffmpeg wrapper to access ffmpeg functionality from your Android app.
There are several fairly well-used wrappers available on GitHub - the ones below are particularly well featured and documented (note: I have not used these, as they were not so mature when I was looking at this previously, but if I were doing something like this again now I would definitely build on one of these):
http://writingminds.github.io/ffmpeg-android-java/
https://github.com/guardianproject/android-ffmpeg
Using one of these well-supported and widely used libraries will take care of some common issues you might otherwise encounter - having to load different binaries for different processor types, and some tricky issues with reloading native libraries to avoid crashes on subsequent invocations of the wrapper.
Because this approach uses the standard ffmpeg command-line syntax, it also means you should be able to search for and find help easily on many different operations (anyone using ffmpeg in the 'normal' way will use the same syntax for the ffmpeg command itself).
For example, for your image-overlay case, here are some results from a quick search (ffmpeg syntax can change over time, so it is worth checking against the current documentation):
https://stackoverflow.com/a/32250369/334402
https://superuser.com/a/678171
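To give a rough idea of what this looks like in code, below is a minimal sketch using the ffmpeg-android-java wrapper linked above to run an overlay command for your case. The class and method names are taken from that library's documentation as I recall it, so verify them against the current README; the file paths are placeholders.
import android.content.Context;

import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.LoadBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;
import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegNotSupportedException;

public class OverlayExample {

    // Overlays imagePath on top of videoPath and writes the result to outputPath.
    public static void addImageOverVideo(Context context, String videoPath,
                                         String imagePath, String outputPath) {
        FFmpeg ffmpeg = FFmpeg.getInstance(context);
        try {
            // Extracts and loads the bundled ffmpeg binary for this device's processor type.
            ffmpeg.loadBinary(new LoadBinaryResponseHandler());

            // Standard ffmpeg command-line syntax, split into individual arguments.
            String[] cmd = {"-i", videoPath, "-i", imagePath,
                    "-filter_complex", "[0:v][1:v]overlay[out]",
                    "-map", "[out]", outputPath};

            ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
                @Override
                public void onSuccess(String message) {
                    // The overlaid video has been written to outputPath.
                }

                @Override
                public void onFailure(String message) {
                    // 'message' contains ffmpeg's output - useful for debugging the command.
                }
            });
        } catch (FFmpegNotSupportedException | FFmpegCommandAlreadyRunningException e) {
            // The device is unsupported, or another ffmpeg command is still running.
        }
    }
}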
I am working on an Android camera application that can capture both images and video. Afterwards, users can annotate the image and add a watermark to the video. Drawing annotations on the image went fine, but I have found no solution for the video. On iPhone there is the AVComposition library for drawing watermarks on videos. I don't know whether such a library exists for Android, but I would like to know if someone has come across this requirement and found a solution.
Can someone guide me on how to get started with compositing an image onto a video, or at least adding text to the video somewhere?
In case anyone is still looking for this - ffmpeg can be used on Android to add a watermark to a video, but be aware that video processing on a power- and processing-limited device like a mobile phone may be slow and may drain the battery if used extensively.
An example of the ffmpeg command to add a watermark to a video, taken from the ffmpeg documentation, is:
ffmpeg -i video.mkv -i image.png -filter_complex '[0:v][1:v]overlay[out]' -map '[out]' out.mkv
See here for the official documentation: https://ffmpeg.org/ffmpeg.html and look at '-filter_complex filtergraph (global)'
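By default overlay places the image in the top-left corner; the filter also accepts x and y expressions if you want the watermark elsewhere. For instance, a sketch using the same placeholder file names that pins the image 10 pixels in from the bottom-right corner:
ffmpeg -i video.mkv -i image.png -filter_complex '[0:v][1:v]overlay=main_w-overlay_w-10:main_h-overlay_h-10[out]' -map '[out]' out.mkv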
ffmpeg can be used in Android projects via one of the wrapper libraries that exist to support this. Although ffmpeg was designed as a command-line tool, this approach works reliably in my experience.
The library below is a good example of a well-supported Android ffmpeg wrapper library, and it includes a sample APK, so you may be able to try out the ffmpeg command you need with that and make sure it works before adding it to your own project.
https://github.com/WritingMinds/ffmpeg-android-java
I've been trying to use an ffmpeg binary with command-line access for a while now and getting nowhere (using Runtime.exec).
It looks like the only way I'll be able to get it to work is using a wrapper in C to access the built ffmpeg libraries using JNI...
Main problem: I haven't coded C for more than one and a half decades now and wouldn't know where to begin...
I just need three operations: I need to add audio to a video file, I need to concatenate two video files, and if possible I need to rotate a clip by 90 degrees (but I could do without this)...
Does anyone have any example code that could work for me, or some good places to start (I've already exhausted much of the first pages of various google results to no avail)...
Any help would be greatly appreciated!
There are many open-source projects available, but for simplicity you can start from here.
I believe this is what you are looking for:
https://github.com/hoary/JavaAV
Multiple platforms supported so your code will be more portable.
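Whichever library you end up using, the three operations you list map onto fairly standard ffmpeg-style invocations; rough sketches (file names are placeholders and options vary between versions):
Add audio to a video, keeping the existing video stream:
ffmpeg -i video.mp4 -i audio.mp3 -c:v copy -map 0:v -map 1:a -shortest output.mp4
Concatenate two files with the same codecs, where list.txt contains the lines file 'clip1.mp4' and file 'clip2.mp4':
ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp4
Rotate a clip by 90 degrees clockwise (this re-encodes the video):
ffmpeg -i input.mp4 -vf transpose=1 rotated.mp4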
I have a very basic question regarding Android and ffmpeg. I obtained ffmpeg from http://bambuser.com/opensource and was able to compile it for ARM.
The results are the binaries (ffmpeg) as well as several libsomething.so files.
My question is: Is this enough to decode videos? How do I actually use ffmpeg then?
To load the library I have:
static {
    System.load("/data/data/com.package/lib/libavcodec.so");
}
It loads fine. But what then?
More explanation: I saw other projects where people had their ffmpeg source in a JNI directory in the project. They also created some Android.mk files and some C code along with it. Would I need this as well? Why would I create the .so files first and then copy the ffmpeg source code again?
I know the NDK and how it should work but I've never seen an example of how one would actually call ffmpeg functions using it, because people seem to be hiding their implementations (which is sort of understandable) but not even giving useful pointers or examples.
Let's just say I wanted to decode a video file. Which kind of native methods would I need to implement? How do I run the project? Which data types need to be passed? etc. There are certainly a few people here who have at least done that, I know this from searching for hours and hours.
For your first question:
Just building is not enough to use the ffmpeg libraries properly. You also need to wrap those .so files in the right order, because the .so files NEED other libraries at link time. You can display the header information of a .so file using:
objdump -x libavcodec.so | grep NEEDED
So you need to wrap these .so files through Android.mk. You may check this link.
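For reference, the usual pattern is the NDK's prebuilt-shared-library mechanism; a minimal sketch (module names and paths are placeholders, and you repeat the stanza for each .so in dependency order):
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := libavcodec
LOCAL_SRC_FILES := prebuilt/libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)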
The second one:
You only need the header files from the ffmpeg project. The implementation will be linked from the .so libraries. That's perhaps because the developers didn't bother to filter the header files.
And the last one:
Your thoughts seem right for the time being; most current developers are struggling to use ffmpeg because of the lack of documentation and sample code.
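To make that last point slightly more concrete: the Java side of such a JNI bridge usually just loads the libraries and declares the entry points you choose to expose. Everything below is hypothetical (library names, method names, signatures); the real decoding work happens in the C wrapper you build with the NDK, which would call the ffmpeg APIs such as avformat_open_input() and the avcodec decode functions.
public class FFmpegBridge {

    static {
        // Load the ffmpeg libraries the wrapper depends on first, then the wrapper itself.
        System.loadLibrary("avutil");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("ffmpegbridge"); // your own thin C wrapper built with the NDK
    }

    // Hypothetical native methods - implemented in the C wrapper.
    public static native int openFile(String path);
    public static native byte[] decodeNextFrame();
    public static native void close();
}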
I want to make an app that downloads a specific MP3 file, allows the user to crop/trim it, and re-uploads it back to our server. How can I trim MP3 files inside my app? Do I need to do this in C and then port it / use the NDK? If MP3 is tough, I can switch over to other, easier formats.
It would be great if anyone could show me the right path.
I think that, of all the alternatives, using something like LAME and compiling it for the NDK is your best option. Be warned that there are patent restrictions on the MP3 format (most of which I don't understand), and you may run into problems if you distribute your app. YMMV.
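A minimal sketch of what the Java side might look like if you go that route, assuming you write a small C wrapper around LAME and build it with the NDK (the library name and method signature here are hypothetical):
public class Mp3Trimmer {

    static {
        // Hypothetical native library built with the NDK, wrapping LAME (and a decoder).
        System.loadLibrary("mp3trim");
    }

    // Hypothetical native method: trims inputPath to [startMs, endMs]
    // and writes the result to outputPath, returning 0 on success.
    public static native int trim(String inputPath, String outputPath,
                                  int startMs, int endMs);
}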
I didn't test it on Android yet, but there is a pure Java MP3 decoder / player I wrote a while ago. Maybe you can use parts of it. The code is based on JLayer from JavaZoom (also LGPL).
This library helps to trim MP3, AAC/MP4, WAV, and 3GPP/AMR files:
http://code.google.com/p/ringdroid/