I have a simple task: make a video from multiple images and an audio file. After a lot of searching I found that this is possible with FFmpeg. Unfortunately there are no up-to-date tutorials for FFmpeg; the few that exist are outdated and most of them do not work.
I have compiled FFmpeg for Android using NDK android-ndk-r10e and ffmpeg-2.8.6 on my Mac with Android Studio, following this tutorial: http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/
The build produces the .so files shown in the attached image.
Now I can't understand what I should do to integrate them into my Android project. I have also checked:
How to use Ffmpeg in android studio?
how to use ffmpeg in android?
FFmpeg on Android
and mainly this,
http://www.roman10.net/how-to-build-android-applications-based-on-ffmpeg-by-an-example/
https://github.com/roman10/android-ffmpeg-tutorial
But they are not working and produce errors that I have no way to resolve. Can anyone please list the steps I should follow to use the FFmpeg .so files shown in the attached image?
There's a very easy solution for this: a precompiled FFmpeg library for Android, available at https://github.com/WritingMinds/ffmpeg-android-java
Simply include it as a Gradle dependency in your project and add a few methods as per its documentation, and you can run FFmpeg commands on Android; a sketch follows below.
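For illustration, here is a minimal sketch of the images-plus-audio task with this library. The class and method names come from its README (package com.github.hiteshsondhi88.libffmpeg), but the API changed between releases (e.g. execute() taking a String vs. a String[]), so verify against the version you actually pull in via Gradle.

```java
import android.content.Context;

import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.LoadBinaryResponseHandler;

public class SlideshowMaker {
    // imagePattern e.g. "/sdcard/slides/img%03d.jpg"
    public static void make(Context context, String imagePattern,
                            String audioPath, String outputPath) throws Exception {
        FFmpeg ffmpeg = FFmpeg.getInstance(context);
        ffmpeg.loadBinary(new LoadBinaryResponseHandler()); // unpack the bundled binary once

        String[] cmd = {
                "-framerate", "1",            // one still per second
                "-i", imagePattern,
                "-i", audioPath,
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",        // widest player compatibility
                "-c:a", "aac",
                "-strict", "experimental",    // older ffmpeg builds gate the AAC encoder
                "-shortest",                  // stop when the shorter input ends
                outputPath
        };
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override public void onSuccess(String message) { /* video written */ }
            @Override public void onFailure(String message) { /* message holds ffmpeg's log */ }
        });
    }
}
```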
This library is not very actively maintained and is missing some features, but it is still good for many simple tasks.
I'm trying to integrate the LAME library into my application in order to record audio in MP3 format.
Android doesn't support MP3 encoding natively today, but with some digging I found that I can use LAME. The problem is that LAME is written in C.
After more reading I found out that the only way to make it work correctly is to bring the C code into my project and compile it as a library using the NDK, through JNI or something like that.
My problem is that I don't really understand all this NDK and JNI stuff, and most importantly I don't know how to run native code with Android Studio! I went to this page and read a bit about it, but it still doesn't explain how to do it in Android Studio. I am really confused here.
My questions are:
- How can I integrate the NDK with Android Studio? (From what I have read, some people are luckier because it's easier with Eclipse, which I don't use.)
- Is there any other way to use or import the LAME library into my project without the NDK?
Thank you
There are two steps you need to take:
1: Create the JNI wrappers
You should read about JNI; it's complicated. The best way to generate wrappers for a large project is to use Swig to auto-generate them. I recommend following the Swig Android tutorial to learn Swig: http://www.swig.org/Doc2.0/Android.html For a handful of functions you can also write the wrapper by hand, as sketched below.
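To give a concrete picture of what the Java side of such a wrapper looks like, here is a hand-written sketch. All names are hypothetical: each native method must be matched by a C function you implement and export from the .so that ndk-build produces, and "mp3lame-jni" is a made-up module name.

```java
// Hypothetical hand-written JNI wrapper around a few LAME calls; every
// native method below needs a matching exported C function in the
// NDK-built library.
public final class LameEncoder {
    static {
        System.loadLibrary("mp3lame-jni"); // loads libmp3lame-jni.so
    }

    // Each native maps onto the corresponding lame_* C API call.
    public static native long init(int sampleRate, int channels, int bitrateKbps);
    public static native int encode(long handle, short[] pcmLeft, short[] pcmRight,
                                    int samples, byte[] mp3Out);
    public static native int flush(long handle, byte[] mp3Out);
    public static native void close(long handle);
}
```

Swig generates essentially this shape for you, plus the C glue code, which is why it pays off on larger APIs.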
2: Run ndk-build automatically in Android Studio
See my answer to another question for detailed instructions on how to set up the NDK with Android Studio.
I am developing an Android project/app for video compression, and that's how I came to know about ffmpeg. I'm using Ubuntu 11.04 and the Eclipse IDE (ADT bundle).
I came across many topics dealing with ffmpeg on Stack Overflow, but I can't clearly figure out where to get an ffmpeg jar file and how to add it to my project and use it.
Any advice from you will be of great help.
If ffmpeg is so tough to handle for a beginner in Android like me, is there any other way to compress a video on Android?
If you're targeting API level 16 (Android 4.1) or higher, try the MediaCodec class.
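To show the shape of the API, here is a minimal encoder-setup sketch; feeding raw frames into the input buffers and muxing the encoded output into a file are separate steps, omitted here.

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

import java.io.IOException;

public class EncoderSetup {
    // Configures and starts an H.264 encoder. The resolution, bitrate and
    // color format are example values; real code should query the device's
    // codec capabilities first.
    public static MediaCodec createEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);   // ~1 Mbit/s target
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5); // keyframe every 5 s

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        return encoder;
    }
}
```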
Maybe this helps you. It's a Java port of ffmpeg. Though it uses JNI internally, it provides Java functions that hide the mess!
I am new to Android and OpenCV, and I am going to develop an app using the code from this link:
http://geekoverdose.wordpress.com/category/computer-vision/
The sample project says that "you will have to get the opencv libraries precompiled". What do I have to do to run this code correctly? Is a standard OpenCV-for-Android installation enough, or is there anything special I have to do? Thank you.
It will take a little more effort to "run this code" correctly. In short, JavaCV is a wrapper on top of OpenCV: you have to compile OpenCV and correctly move all the shared libraries (.so) into your Android development project, etc. As the original article suggested, you need to go to the JavaCV homepage and read the installation instructions.
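At runtime, "moving the shared libraries" boils down to the .so files sitting in your project's libs/<abi>/ folder so the Java side can load them. A sketch with placeholder library names (use the actual file names, minus the "lib" prefix and ".so" suffix, that your OpenCV/JavaCV build produces):

```java
// Placeholder names: substitute the .so files your build actually produces.
public class NativeLibs {
    static {
        System.loadLibrary("opencv_core");    // libs/<abi>/libopencv_core.so
        System.loadLibrary("opencv_imgproc"); // libs/<abi>/libopencv_imgproc.so
    }
}
```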
How can I easily add WebRTC functionality to my Android app so that I can play a video stream in MediaPlayer?
What library (.jar file) do I have to add to my references?
Could you add some code example please?
Actually, there are no such .jar files for you to reference right now.
WebRTC is designed for web browsers, even though it can be built for Android and iOS.
For your question, I think you need to learn how to build WebRTC on Android or iOS.
For Android, you should learn the NDK and JNI first, then build the whole WebRTC project, or standalone VoE/ViE, or even just the NS/AECM/VAD/AGC modules, for Android.
For iOS, you should also build it yourself, but you may need this help.
All of this info can be found on the internet, especially in the Google group.
Check out the following link: http://www.webrtc.org/reference/getting-started
I have added my project on GitHub: https://github.com/SDkie/Webrtc-for-Android. You can fork it and make changes.
Please take a look at http://www.webrtc.org/native-code/android.
It has all the information on how to build WebRTC for Android.
The Java wrapper for the native WebRTC libraries:
https://code.google.com/p/webrtc/source/browse/trunk/talk/app/webrtc/java/#java%2Fsrc%2Forg%2Fwebrtc
That page also has instructions on how to build the native libraries.
There is also an example app that uses WebRTC to talk to https://apprtc.appspot.com:
https://code.google.com/p/webrtc/source/browse/trunk/webrtc/examples/android/media_demo/README
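For orientation, here is a rough sketch of the core wrapper classes from that trunk. Signatures vary between WebRTC revisions (and on Android the factory's global initialization must be called first), so treat this as a shape, not a drop-in snippet; the observer and ICE server list are placeholders your app supplies.

```java
// Orientation only: class names from the org.webrtc wrapper linked above;
// on Android, PeerConnectionFactory's Android globals must be initialized
// before any of this runs, and exact signatures differ per revision.
import java.util.LinkedList;
import java.util.List;

import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;

public class PeerConnectionSketch {
    public static PeerConnection create(PeerConnection.Observer observer) {
        PeerConnectionFactory factory = new PeerConnectionFactory();
        List<PeerConnection.IceServer> iceServers =
                new LinkedList<PeerConnection.IceServer>();
        iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));
        MediaConstraints constraints = new MediaConstraints();
        return factory.createPeerConnection(iceServers, constraints, observer);
    }
}
```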
I could not find instructions on how to create an Android Studio project and build it with Gradle, though. If anyone finds them, please share.
I hope it helps
One option is to use Crosswalk: you can create a native app using Crosswalk as the WebView in your project.
In my case, WebRTC worked without any problems.
Here are some tutorials on how to add Crosswalk to your project:
https://diego.org/2015/01/07/embedding-crosswalk-in-android-studio/
https://crosswalk-project.org/documentation/embedding_crosswalk.html
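In practice the embedding boils down to using XWalkView where you would otherwise use a WebView; it ships its own Chromium runtime, which is where the WebRTC support comes from. A minimal sketch based on the embedding guide above (constructor and load() as in the Crosswalk embedding API of that era):

```java
import android.app.Activity;
import android.os.Bundle;

import org.xwalk.core.XWalkView;

public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // XWalkView bundles its own Chromium, so WebRTC works even where
        // the system WebView lacks it.
        XWalkView view = new XWalkView(this, this);
        setContentView(view);
        view.load("https://apprtc.appspot.com", null); // any WebRTC-capable page
    }
}
```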
I hope this helps.
Lots of questions and answers are available on ffmpeg and Android, but I did not find anything that directly addresses building ffmpeg with the x264 library.
Actually, I want to make a movie from some still images on Android.
I still haven't found a solution to this problem. Some forums say it can be done using ffmpeg. If I build ffmpeg after downloading it from http://bambuser.com/opensource, it works fine for decoding a video file, but it cannot find any codec when it tries to encode still images into a movie.
That's why I am trying to use x264 as the encoding library with ffmpeg, but when I try to build them together I get an error.
Could you please provide a detailed step-by-step guide to building ffmpeg with the x264 library on Windows or Mac, for Android?
If anyone knows any other way to make a movie from still images on Android, please tell me. Your help will be highly appreciated.
Thank you in advance for your kind response.
I made a tutorial on how to build ffmpeg and x264 for Android:
http://db.tt/TjMqIF3u
You can also download the zip file containing the files you need to build an Android application, plus an executable of the latest ffmpeg to run on Android.
PS: you might need Cygwin to compile ffmpeg on Windows.
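If you go the executable route, invoking the binary from Java looks roughly like this. The paths are hypothetical, and the binary must first be copied somewhere your app can exec from (e.g. its private files directory) and marked executable:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class FfmpegRunner {
    // Builds a slideshow from numbered stills; "-c:v libx264" only works
    // once ffmpeg has been built with the x264 library.
    public static int makeSlideshow(File filesDir)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                new File(filesDir, "ffmpeg").getAbsolutePath(),
                "-framerate", "1",                  // one still per second
                "-i", "/sdcard/slides/img%03d.jpg", // numbered input images
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",              // widest player compatibility
                "/sdcard/slides/out.mp4");
        pb.redirectErrorStream(true);               // fold ffmpeg's log into stdout
        Process p = pb.start();
        BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()));
        while (r.readLine() != null) { /* drain so the pipe never fills */ }
        return p.waitFor();                         // 0 means success
    }
}
```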