JavaCV and FFmpeg - Android

I was given the task of merging an image and an audio track into a video file on Android. I completed it using JavaCV's FFmpeg bindings, but even after heavy compression my app is still around 15 MB. After analysing the APK, I found that libavcodec.so accounts for 64% of the size. Since my app relies on it, I can't exclude the file.
So my question is: "Is it feasible to complete this task with a much smaller APK, say under 5 MB?"

Yes, you can. If you really want to use JavaCV, then you can remove the unnecessary parts from the sources and recompile them. There is more info in the JavaCPP repo.
The easiest solution to this problem is to simply remove the unused .so files.
Finally, FFmpeg is not strictly necessary here, and you can find another solution. For example, you can use an online service (I've found transloadit, which is free for small apps) or write your own native or Java code. It is not easy, but it is possible.
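For context, the merge itself only touches a thin slice of the JavaCV API, which is why trimming the natives is worth the effort. A minimal sketch of the image-plus-audio merge, assuming JavaCV's FFmpegFrameRecorder, FFmpegFrameGrabber and AndroidFrameConverter (the class name and paths are placeholders, and timing handling is simplified):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import org.bytedeco.javacv.AndroidFrameConverter;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class ImageAudioMerger {

    // Merges one still image and one audio file into an MP4.
    public static void merge(String imagePath, String audioPath, String outPath) throws Exception {
        Bitmap bitmap = BitmapFactory.decodeFile(imagePath);
        Frame imageFrame = new AndroidFrameConverter().convert(bitmap);

        FFmpegFrameGrabber audio = new FFmpegFrameGrabber(audioPath);
        audio.start();

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
                outPath, bitmap.getWidth(), bitmap.getHeight(), audio.getAudioChannels());
        recorder.setFormat("mp4");
        recorder.setFrameRate(30);
        recorder.setSampleRate(audio.getSampleRate());
        recorder.start();

        // Copy the audio packets across and repeat the still image as the
        // video track; a production version would pace frames by timestamp.
        Frame samples;
        while ((samples = audio.grabSamples()) != null) {
            recorder.record(imageFrame);
            recorder.recordSamples(samples.sampleRate, samples.audioChannels, samples.samples);
        }

        recorder.stop();
        recorder.release();
        audio.stop();
        audio.release();
    }
}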

Related

Android APK file too big when using FFmpeg encoder library

I'm developing an app which creates x264 videos with the following library:
com.arthenica:mobile-ffmpeg-full:4.2.2.LTS
but the resulting APK file is too big (~71 MB), so I tried:
com.arthenica:mobile-ffmpeg-min-gpl:4.2.2.LTS
This way, since the library ships with only a few codecs (including the ones I need), the APK size was reduced to ~49 MB, which is much better but still looks too big to me. So I'd like to know whether any of you know a better way to reduce the APK size, because people generally refuse to download such big apps.
As for the rest of the app (drawables, resources and so on), it is well optimized: if I remove this library and rebuild, the size of the app drops to 10 MB.
I was reading this question:
FFMPEG Android Library Increase Size
where user S.R suggests compressing the libraries for all CPU architectures into one archive file, extracting the lib matching the device's CPU into the app's directory, and loading FFmpeg from there, but I really don't know how to do that.
Checking my app's folder structure, I noticed the following folders belonging to the FFmpeg lib:
arm64-v8a => ~16 MB
armeabi-v7a => ~29 MB
x86 => ~17 MB
x86_64 => ~21 MB
But I'm not sure whether I can remove any of them, and as you can see, armeabi-v7a is the largest.
For your requirements, FFmpeg is the best option for this kind of video-processing task, but the problem with FFmpeg is that it increases the size of the app. So I suggest you compile this FFmpeg library into your project yourself; it also has many built-in functions, such as adding music or adding an image to a video. It will definitely help you with your work, and the size is also quite small, ~11 MB.
You can use Android App Bundles (AAB) to reduce the app size further.
Use
implementation "com.arthenica:mobile-ffmpeg-min:4.4.LTS"
according to the documentation.
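For reference, the min package is invoked the same way as the full one. A small sketch, assuming mobile-ffmpeg's FFmpeg.execute entry point (paths are placeholders; note that -min ships without libx264, so H.264 encoding needs one of the *-gpl packages):

import com.arthenica.mobileffmpeg.Config;
import com.arthenica.mobileffmpeg.FFmpeg;

public class Transcoder {

    // Re-encodes a clip with a codec bundled in the -min package.
    public static boolean reencode(String inPath, String outPath) {
        int rc = FFmpeg.execute("-i " + inPath + " -c:v mpeg4 " + outPath);
        return rc == Config.RETURN_CODE_SUCCESS;
    }
}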

How to load an online Android module at run-time?

Is it possible to split my 512 MB Android APK into different parts or modules?
Compile a small part of it into a release APK (small APK size) for the Play Store.
Then fetch the remaining parts or modules from the web or from your own server during the first install on a device.
That is:
Small APK to the Google Play Store.
Fetch the remaining big files after the first install.
I am using Android Studio.
Of course you can; this is easily achieved if all you need to split out is heavy resources (like images, videos, or databases). You must write some logic that downloads those resources and works with them after a successful download.
I can't imagine a situation where you would have to split code into a separate module: compiled code is light and doesn't increase APK size as much as other resources do. Code can become heavy when a lot of it comes from libraries; in that case, I suggest you learn about ProGuard shrinking.
You can also read up on why loading Java modules at runtime is not possible on Android; one of the reasons is the performance gained from the JIT.
There is a workaround with the NDK: How do I import shared object libraries at runtime in Android?
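Combining the download-at-first-run idea with that NDK workaround, here is a rough sketch that fetches a native library matching the device's primary ABI and loads it (the server URL and file layout are invented for the example):

import android.os.Build;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RuntimeSoLoader {

    // Downloads lib<name>.so for this device's primary ABI into app-private
    // storage (run this off the main thread) and loads it.
    public static void fetchAndLoad(File filesDir, String name) throws Exception {
        String abi = Build.SUPPORTED_ABIS[0]; // e.g. "arm64-v8a"
        File target = new File(filesDir, "lib" + name + ".so");
        if (!target.exists()) {
            URL url = new URL("https://example.com/libs/" + abi + "/lib" + name + ".so");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (InputStream in = conn.getInputStream();
                 FileOutputStream out = new FileOutputStream(target)) {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            } finally {
                conn.disconnect();
            }
        }
        // System.load (unlike System.loadLibrary) accepts an absolute path.
        System.load(target.getAbsolutePath());
    }
}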

Decrease video (.mp4 file) size without losing quality in Android

I have a video (.mp4) file on my SD card. I want to reduce the size of the .mp4 file and upload it to a server.
One way you can do this is to use ffmpeg.
There are several ways of using ffmpeg in an Android program:
use the native libraries directly from C using JNI
use a library which provides a wrapper around the ffmpeg command-line utility (this also uses JNI inside the wrapper library)
call the ffmpeg command line via 'exec' from within your Android app
Of the three, I personally have used the wrapper approach in the past and found it worked well. IMHO, the documentation and examples available with the native libraries represented quite a steep learning curve.
Note, if you do use 'exec' there are some things it is worth being aware of - see bottom of this answer: https://stackoverflow.com/a/25002844/334402.
The wrapper does have limitations: at heart, the ffmpeg cmd line tool is not intended to be used this way, and you have to keep that in mind, but it does work. There is an example project available on GitHub which seems to have a reasonable user base. I did not use it myself, but I did refer to it and found it useful, especially for an issue you will hit if you need to call your ffmpeg wrapper more than once from the same activity or task:
https://github.com/jhotovy/android-ffmpeg
See this answer (and the questions and answers it is part of) for some more specifics on the 'calling ffmpeg two times' solution:
https://stackoverflow.com/a/28752190/334402
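Whichever of the three routes you pick, the command itself is the same. As an illustration, a CRF-based re-encode run through a command-line wrapper (shown with the mobile-ffmpeg wrapper purely as an example; it assumes an ffmpeg build that includes libx264, and a higher CRF value means a smaller file, with 23-28 being a common range):

import com.arthenica.mobileffmpeg.Config;
import com.arthenica.mobileffmpeg.FFmpeg;

public class VideoShrinker {

    // Re-encodes the video track with libx264 at CRF 28 and copies the
    // audio track unchanged; returns true on success.
    public static boolean shrink(String inPath, String outPath) {
        int rc = FFmpeg.execute("-i " + inPath
                + " -vcodec libx264 -crf 28 -preset medium -acodec copy "
                + outPath);
        return rc == Config.RETURN_CODE_SUCCESS;
    }
}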

How can we add an animated/simple image as a top layer to a video and export it as a single video in Android?

I need to merge images onto a video as an overlay and export it. I have found ways to create a video from images using JavaCV, but I didn't find any JAR or library that adds images as an overlay to an existing video. Some links suggest using FFmpeg and JNI to achieve this, but sadly I don't have any knowledge of JNI. On iOS, the same thing is achieved with the AVFoundation framework.
The image above is a replica of my requirements; if anyone can guide me in the right direction and provide some useful material to start with, it would be appreciated.
What I have achieved so far:
1) Compiled FFmpeg.
2) Generated the .so files.
3) Compiled and ran the Hello JNI project.
What I am searching for:
1) Splitting the video into frames.
2) Merging my overlay images with the video frames.
3) Recreating the video with audio.
I know JNI is the only way to achieve this, so I searched a lot but didn't find any good JNI material to start with. I am not asking for the whole code, but if someone could point me to a good tutorial or blog, that would be a great help.
Thanks!!
The way to accomplish this is as you said: incorporate a video codec and use it to re-compose the video:
Decode the original video
Draw your overlay over the original video frames
Encode the frames again.
Using FFmpeg with JNI is the obvious solution, but if you find any other codec library that can accomplish the same with pure Java, it will work too.
No knowledge of JNI? Then it's about time to learn it =)
References for learning:
NDK docs on your file system: http://developer.android.com/tools/sdk/ndk/index.html#Docs
JNI - http://192.9.162.55/docs/books/jni/html/jniTOC.html
FFmpeg for Android - there are many tutorials out there and many source trees that contain a ready build environment.
You can either follow them or just do it on your own - provided that you understand the NDK environment and can read makefiles.
This is an example: http://vec.io/posts/how-to-build-ffmpeg-with-android-ndk
Using FFmpeg to encode/decode frames - sadly, there are no good up-to-date tutorials here, but there is the API documentation: http://ffmpeg.org/doxygen/trunk/index.html
Blending bitmaps - use Android's Canvas infrastructure, or just manually copy pixels over each other and blend according to alpha values.
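For the Canvas route, a minimal sketch of the blending step, assuming the video frame has already been decoded into a Bitmap (class and method names are placeholders):

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Paint;

public class FrameBlender {

    // Draws an overlay bitmap onto a decoded frame at (x, y), respecting
    // the overlay's alpha channel (Canvas composites SRC_OVER by default).
    public static Bitmap blend(Bitmap frame, Bitmap overlay, float x, float y) {
        Bitmap result = frame.copy(Bitmap.Config.ARGB_8888, true); // mutable copy
        Canvas canvas = new Canvas(result);
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG | Paint.FILTER_BITMAP_FLAG);
        canvas.drawBitmap(overlay, x, y, paint);
        return result;
    }
}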
Warning - this is a complex library to build and you'd better experiment with easier NDK projects before attempting this one.

How do I actually use ffmpeg on Android?

I have a very basic question regarding Android and ffmpeg. I obtained ffmpeg from http://bambuser.com/opensource and was able to compile it for ARM.
The results are the binaries (ffmpeg) as well as several libsomething.so files.
My question is: Is this enough to decode videos? How do I actually use ffmpeg then?
To load the library I have:
static {
    System.load("/data/data/com.package/lib/libavcodec.so");
}
It loads fine. But what then?
More explanation: I saw other projects where people had their ffmpeg source in a JNI directory in the project. They also created some Android.mk files and some C code along with it. Would I need this as well? Why would I create the .so files first and then copy the ffmpeg source code again?
I know the NDK and how it should work but I've never seen an example of how one would actually call ffmpeg functions using it, because people seem to be hiding their implementations (which is sort of understandable) but not even giving useful pointers or examples.
Let's just say I wanted to decode a video file. Which kind of native methods would I need to implement? How do I run the project? Which data types need to be passed? etc. There are certainly a few people here who have at least done that, I know this from searching for hours and hours.
For your first question:
Just building is not enough to use the ffmpeg libraries properly. You also have to wrap those .so files in the right order, because these .so files NEED other libraries at link time. You can display the header information of a .so file by using:
objdump -x libavcodec.so | grep NEEDED
So you need to wrap these .so files through Android.mk. You may check this link.
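On older Android versions the dynamic linker did not resolve dependencies between app-supplied .so files automatically, so the Java side typically loads them explicitly, leaf dependencies first. A sketch (the names are illustrative; derive the real order from the objdump output above):

public class FfmpegLibs {
    static {
        // Each library's NEEDED entries must already be loaded.
        System.loadLibrary("avutil");     // no ffmpeg dependencies
        System.loadLibrary("swresample"); // NEEDED: avutil
        System.loadLibrary("avcodec");    // NEEDED: avutil, swresample
        System.loadLibrary("avformat");   // NEEDED: avcodec, avutil
        System.loadLibrary("myjni");      // hypothetical JNI glue linked against the above
    }
}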
For the second one:
You only need the header files from the ffmpeg project; the implementation will be linked in from the .so libraries. The projects you saw probably include the whole source because the developers didn't bother to filter out just the header files.
And for the last one:
Your thoughts seem right. For the time being, most developers are struggling to use ffmpeg because of the lack of documentation and sample code.
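To make the missing piece concrete: the native methods you implement are whatever your own JNI glue layer exposes; ffmpeg's functions are never called from Java directly. A purely illustrative Java-side bridge for a decoder (names and signatures are hypothetical, and each native method needs a matching C function compiled into the .so with the NDK):

public class FfmpegBridge {
    static {
        System.loadLibrary("ffmpegbridge"); // hypothetical JNI glue library
    }

    // Opens a video file; returns 0 on success (implemented in C on top of avformat).
    public static native int open(String path);

    // Decodes the next frame into an RGB buffer; returns false at end of stream.
    public static native boolean decodeNextFrame(byte[] rgbOut);

    // Frees all native resources.
    public static native void close();
}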
