Android ffmpeg simple JNI wrapper

I've been trying to use an ffmpeg binary with command-line access for a while now and getting nowhere (using Runtime.exec).
It looks like the only way I'll be able to get it to work is by using a wrapper in C to access the built ffmpeg libraries via JNI...
Main problem: I haven't coded in C for more than a decade and a half now and wouldn't know where to begin...
I just need three operations: add audio to a video file, concatenate two video files, and, if possible, rotate a clip by 90 degrees (but I could do without this)...
Does anyone have any example code that could work for me, or some good places to start? (I've already exhausted much of the first pages of various Google results to no avail.)
Any help would be greatly appreciated!

There are many open source projects available, but for simplicity you can start from here.

I believe this is what you are looking for:
https://github.com/hoary/JavaAV
Multiple platforms are supported, so your code will be more portable.
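For reference, whichever wrapper you end up using, the three operations you list map to fairly standard ffmpeg command lines. Here is a rough sketch with placeholder file names; the exact options are worth verifying against the documentation for the ffmpeg build you ship:

```java
public final class FfmpegCommands {
    // Mux an audio track into a video file, copying both streams untouched
    // (the audio must already be in an MP4-friendly codec such as AAC).
    // "-shortest" stops at the shorter of the two inputs.
    static final String ADD_AUDIO =
            "-i video.mp4 -i audio.m4a -map 0:v -map 1:a -c copy -shortest out.mp4";

    // Concatenate two files with the concat demuxer; list.txt contains lines
    // of the form: file 'clip1.mp4'
    // Both clips must share the same codecs and parameters for "-c copy" to work.
    static final String CONCAT =
            "-f concat -safe 0 -i list.txt -c copy joined.mp4";

    // Rotate 90 degrees clockwise (this re-encodes the video stream).
    static final String ROTATE_90 =
            "-i clip.mp4 -vf transpose=1 -c:a copy rotated.mp4";
}
```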


How to Create a video in Android?

I am new to Android. I have two files of the same length: one is an audio file and the other is a video file with no audio. I want to make a video with audio by combining these two files. Please help me achieve this.
I assume you have a native Android app, are familiar with Java (or know how to port the code to native C), and are willing to use other open-source classes in your project.
This might give you a head start. Since this project is not actively maintained now, you might have to fork it and reuse its logic in your own code:
https://github.com/tqnst/MP4ParserMergeAudioVideo
Another alternative is an ffmpeg port for Android (however, I am not sure how well this works natively):
https://github.com/WritingMinds/ffmpeg-android-java
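If you go the MP4Parser route, the core of the merge is only a few lines. This is a rough sketch based on the mp4parser/isoparser API; the package names and classes vary between versions, and the audio needs to be in an MP4-compatible codec such as AAC, so treat it as a shape rather than copy-paste code:

```java
import java.io.FileOutputStream;
import java.nio.channels.FileChannel;

import com.coremedia.iso.boxes.Container;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
import com.googlecode.mp4parser.authoring.container.mp4.MovieCreator;

public class MuxAudioVideo {
    public static void mux(String videoPath, String audioPath, String outPath) throws Exception {
        // Parse both inputs; the video file is assumed to contain no audio track.
        Movie video = MovieCreator.build(videoPath);
        Movie audio = MovieCreator.build(audioPath);

        // Append the first audio track to the video's movie structure.
        video.addTrack(audio.getTracks().get(0));

        // Write the combined movie out as a new MP4 file.
        Container out = new DefaultMp4Builder().build(video);
        FileChannel fc = new FileOutputStream(outPath).getChannel();
        out.writeContainer(fc);
        fc.close();
    }
}
```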

Put image over video and save video to sd card Android

I'm working on a feature in which I want to add a picture over a video and save the result to the SD card.
In general, the user selects an image with a semi-transparent background and places that image over the video; after the user presses the save button, they get a new video with the image overlaid on it.
I have heard about ffmpeg and have seen some of the commands it provides, but I don't know where to start. Can anyone provide me with an example?
Thank you.
One common approach is to use an ffmpeg wrapper to access ffmpeg functionality from your Android app.
There are several fairly well-used wrappers available on GitHub - the ones below are particularly well featured and documented (note: I have not used these, as they were not so mature when I looked at this previously, but if I were doing something like this again now I would definitely build on one of them):
http://writingminds.github.io/ffmpeg-android-java/
https://github.com/guardianproject/android-ffmpeg
Using one of the well supported and used libraries will take care of some common issues that you might otherwise encounter - having to load different binaries for different processor types, and some tricky issues with native library reloading to avoid crashes on subsequent invocations of the wrapper.
Because this approach uses the standard ffmpeg command-line syntax, it also means you should be able to find help easily on many different operations (as anyone using ffmpeg in the 'normal' mode will use the same syntax for the ffmpeg command itself).
For example, for your image-overlay case, here are some results from a quick search (ffmpeg syntax can change over time, so it is worth doing a current check):
https://stackoverflow.com/a/32250369/334402
https://superuser.com/a/678171
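To make the wrapper approach more concrete, here is a rough sketch using the ffmpeg-android-java library linked above. I am going from memory of its API, so treat the package names, the callback classes, and whether execute() takes a String or a String[] as assumptions to check against the library's documentation for the release you use:

```java
import android.content.Context;

import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.LoadBinaryResponseHandler;

public class OverlayExample {
    // Burns overlay.png onto input.mp4 at position (10,10) and writes output.mp4.
    public static void addOverlay(Context context, String in, String overlay, String out) {
        try {
            FFmpeg ffmpeg = FFmpeg.getInstance(context);
            // Load the correct ffmpeg binary for this device's ABI (done once per app run).
            ffmpeg.loadBinary(new LoadBinaryResponseHandler());

            String[] cmd = {
                    "-i", in,
                    "-i", overlay,
                    "-filter_complex", "overlay=10:10",
                    "-codec:a", "copy",
                    out
            };
            ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
                @Override public void onSuccess(String message) { /* video written */ }
                @Override public void onFailure(String message) { /* inspect ffmpeg output */ }
            });
        } catch (Exception e) {
            // e.g. binary not supported on this device, or a command is already running.
            e.printStackTrace();
        }
    }
}
```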

Decrease video (.mp4 file) size without losing quality in Android

I have a video (.mp4) file on my SD card. I want to reduce the size of the .mp4 file and upload it to a server.
One way you can do this is to use ffmpeg.
There are several ways of using ffmpeg in an Android program:
use the native libraries directly from C using JNI
use a library which provides a wrapper around the 'ffmpeg' cmd line utility (this also uses JNI inside the wrapper library)
call the ffmpeg cmd line via 'exec' from within your Android app
Of the three, I personally have used the wrapper approach in the past and found it worked well. IMHO, the documentation and examples available with the native libraries represented quite a steep learning curve.
Note, if you do use 'exec' there are some things it is worth being aware of - see bottom of this answer: https://stackoverflow.com/a/25002844/334402.
The wrapper does have limitations - at heart, the ffmpeg cmd line tool is not intended to be used this way and you have to keep that in mind, but it does work. There is an example project available on github which seems to have a reasonable user base - I did not use it myself but I did refer to it and found it useful, especially for an issue you will find if you need to call your ffmpeg wrapper more than once from the same activity or task:
https://github.com/jhotovy/android-ffmpeg
See this answer (and the question and answers it is part of) for some more specifics on the 'calling ffmpeg two times' solution:
https://stackoverflow.com/a/28752190/334402
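For the actual size reduction, whichever of the three routes you pick, the ffmpeg command itself will look something like the sketch below. The file names are placeholders, the encoder names depend on how your ffmpeg binary was built (libx264 is common but not guaranteed), and the CRF value trades size against quality:

```java
public final class CompressCommand {
    // Re-encode the video with H.264 at a higher CRF (smaller file, slightly lower quality),
    // scale it down to 720p height, and copy the audio stream unchanged.
    static final String COMPRESS =
            "-i input.mp4 -vcodec libx264 -crf 28 -preset veryfast -vf scale=-2:720 -acodec copy output.mp4";
}
```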

How to read an AVI file in OpenCV for Android

I need to read an AVI file in Android and process its frames. I know the C++ way but not the Android one! Can someone help me and give me a starting point? I'm using OpenCV 2.4.5.
You can try to compile OpenCV with FFMPEG for Android.
Video IO is not officially supported but if I remember correctly this patch worked: http://code.opencv.org/issues/2546 At least you can use it as a good starting point.
You can use JavaCV; it has ffmpeg and OpenCV wrappers with examples.
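If you try the JavaCV route, reading frames looks roughly like this. It is only a sketch: the package was com.googlecode.javacv in older releases and org.bytedeco.javacv in newer ones, and the file path is just an example:

```java
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;

public class AviReader {
    public static void readFrames(String path) throws Exception {
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(path);
        grabber.start();

        Frame frame;
        while ((frame = grabber.grab()) != null) {
            // frame.image holds the pixel data; convert it with
            // OpenCVFrameConverter (to a Mat) or AndroidFrameConverter (to a Bitmap)
            // and process it here.
        }

        grabber.stop();
        grabber.release();
    }
}
```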
I don't think that it is implemented yet (at least it wasn't six months ago). You might want to split your video into frames and open those frames instead of the video file (have a look at this thread).
I'm not sure if it's useful, but take a look at this:
http://www.stanford.edu/class/ee368/Android/Tutorial-2-OpenCV-for-Android-Setup-Macintosh-API11.pdf

How can we add an animated/simple image as a top layer to a video and export it as a single video in Android?

I need to merge images into a video as an overlay and export it. I have found ways to create a video from images using JavaCV, but I didn't find any jar or library which adds images as an overlay to an existing video. Some of the links suggest using FFMPEG and JNI to achieve this, but sadly I don't have any knowledge of JNI. On iOS the same thing is achieved with the AVFoundation framework.
The above image is a replica of my requirements; if anyone can guide me in the right direction and provide me with some useful material to start with, it would be appreciated.
What I have achieved so far is:
1) Compiled FFMPEG.
2) Generated .so files
3) Compiled and able to run Hello Jni project.
What I am searching for is:
1) Splitting video into frames.
2) Merging my overlay images with video frames
3) Recreating the video with audio.
I know JNI is the only way to achieve this, so I searched a lot but didn't find any good JNI material to start with. I am not asking for the whole code, but if someone could point me to a good tutorial or blog it would be a great help.
Thanks!!
The way to accomplish this is as you said - incorporate a video codec and use it to re-compose the video:
Decode the original video
Draw your overlay over the original video frames
Encode the frames again.
Using FFMPEG with JNI is the obvious solution, but if you find any other codec library that can accomplish the same with pure Java, it will work too.
No knowledge of JNI? It's about time to learn it =)
References for learning:
NDK docs on your file system: http://developer.android.com/tools/sdk/ndk/index.html#Docs
JNI - http://192.9.162.55/docs/books/jni/html/jniTOC.html
FFMPEG for Android - there are many tutorials out there and many source trees that contain a ready build environment.
You can either follow them or just do it on your own - provided that you understand the NDK environment and can read makefiles.
This is an example: http://vec.io/posts/how-to-build-ffmpeg-with-android-ndk
Using FFmpeg to encode/decode frames - sadly, there are no good up-to-date tutorials here, but there is the API documentation: http://ffmpeg.org/doxygen/trunk/index.html
Blending bitmaps - use Android's Canvas infrastructure (see the sketch at the end of this answer), or just manually copy pixels over each other and blend according to alpha values.
Warning - this is a complex library to build and you'd better experiment with easier NDK projects before attempting this one.
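For the bitmap-blending step mentioned in the references above, Android's Canvas does the heavy lifting. A minimal sketch, assuming you already have the decoded frame and the overlay as Bitmaps (the names here are illustrative):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;

public class OverlayBlender {
    // Draws the overlay onto a copy of the decoded frame, respecting the
    // overlay's alpha channel, and returns the blended frame.
    public static Bitmap blend(Bitmap frame, Bitmap overlay, float x, float y) {
        Bitmap result = frame.copy(Bitmap.Config.ARGB_8888, true); // Canvas needs a mutable bitmap
        Canvas canvas = new Canvas(result);
        canvas.drawBitmap(overlay, x, y, null);                    // null Paint = default alpha blending
        return result;
    }
}
```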
