Simple build of libavcodec.so and libavformat.so on Mac - Android

Who needs ffmpeg? Not me. What I need is to be able to decode a video stream along with its audio stream, so that I can put the frames on an OpenGL surface in sync with the audio.
FFmpeg is a tool that transcodes video. That is not what I need. I need its libraries.
The problem is that every example for building FFmpeg includes junk I just don't need. The latest example I wasted my time on:
https://github.com/appunite/AndroidFFmpeg
uses things like freetype2 that I really, REALLY, do not need. What's more annoying is that it won't even build as described, because the example references freetype, not freetype2, so the build steps are broken. Don't even get me started on the problems I had with libtool.
The kicker is finding libav.org, where their about page describes the chaos in the FFmpeg project. Perhaps that is why this is so difficult.
So, should it be so hard to build just the shared libs? Can someone point me to some documentation, or a tutorial that works? I admit that this is new territory for me but all I have found using Google is chaos.
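For reference, the kind of code I want these two libraries for is roughly the standard demux-and-decode loop below. This is only a sketch: it uses the newer send/receive decode API (older FFmpeg versions use avcodec_decode_video2 instead), and error handling is omitted.

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Sketch: demux a URL/file with libavformat and decode video frames with
     * libavcodec. The audio stream would be handled the same way with its own
     * decoder context. Error handling is omitted for brevity. */
    static void decode_frames(const char *url)
    {
        AVFormatContext *fmt = NULL;
        avformat_open_input(&fmt, url, NULL, NULL);
        avformat_find_stream_info(fmt, NULL);

        int vidx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        const AVCodec *dec = avcodec_find_decoder(fmt->streams[vidx]->codecpar->codec_id);
        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, fmt->streams[vidx]->codecpar);
        avcodec_open2(ctx, dec, NULL);

        AVPacket *pkt = av_packet_alloc();
        AVFrame *frame = av_frame_alloc();
        while (av_read_frame(fmt, pkt) >= 0) {
            if (pkt->stream_index == vidx) {
                avcodec_send_packet(ctx, pkt);
                while (avcodec_receive_frame(ctx, frame) == 0) {
                    /* frame->data / frame->pts: hand the picture to the OpenGL
                     * renderer here, using pts to stay in sync with the audio. */
                }
            }
            av_packet_unref(pkt);
        }

        av_frame_free(&frame);
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
        avformat_close_input(&fmt);
    }

That is all I need from the build: libavformat for demuxing and libavcodec for decoding, nothing else.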

Related

Alternative for Xuggle-Xuggler / FFmpeg in Xamarin Android?

For days I have been trying to find a working library that can decode the video stream of the Parrot AR Drone 2.0. The problem is that FFmpeg isn't working in Xamarin Android, and Xuggle-Xuggler is Java-only, which makes it really difficult.
Furthermore, I tried to use FFmpeg, but every time I got errors like this: DllImport error loading 'lbavcodec-55': 'dlopen failed: "libavcodec-55" not found'. I have seen a lot of possible solutions but nothing works. I also tried to compile some .dll files which contain the FFmpeg source code, but unfortunately I got the same errors as before.
I just want to create a TCP video stream to "192.168.1.1:5555". After that I want to use some decoder class/library that can decode the bytes to frames, or something like that, and put them on the view using a VideoView, so the frames will be shown on the smartphone.
Has anyone experience with this? Or does someone know a working library for decoding the TCP video stream of the drone?
Thanks.
Good news: I just solved the problem.
There is a possibility to use FFmpeg, but you need to compile it specifically for your platform. This is actually a little harder on Windows than on Ubuntu/Linux. I tried to use a pre-compiled library in Xamarin Android, but there were errors like DllImport error loading 'lbavcodec-55': 'dlopen failed: "libavcodec-55" not found', so that didn't work. Xuggle-Xuggler is a video decoder as well, but it is made for Java only and I am working in Xamarin Android, so I had to find something else.
After several weeks I saw a project which uses OpenCV to decode the video stream of the drone. Then there is this project, https://github.com/AJRdev/ARDrone-Android-GEII, which implements the video stream in two different ways: via OpenCV and via a library called "Vitamio".
What I did was try the Vitamio library, which Xamarin Android supports. There is an old Xamarin Android binding at https://components.xamarin.com/gettingstarted/vitamiobinding, so I decided to use the Vitamio library found here instead: https://github.com/shaxxx/Xamarin.Vitamio. I am using this library because it ships as an .AAR containing the same files as the Vitamio library in the project I mentioned before, and, most importantly, no errors appeared :)
Unfortunately there is no information on the internet about using the Parrot AR Drone 2.0 with Xamarin Android. So, if someone else runs into this problem, you could also use the source code of the official app, "Freeflight 2.4", because that one is made specifically for Android. However, there is a lot of code in the Freeflight 2.4 app, so digging out the video stream part takes a lot of time; I did not have that time, so I chose the easier way I explained above.
After the implementation you should be able to see the video on your smartphone!
Good luck!
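For completeness, for anyone who does get a native FFmpeg build loading correctly: as far as libavformat is concerned the drone's feed is just a TCP URL, so the open step would look roughly like the sketch below. This is untested against the drone's actual stream framing, and error handling is omitted.

    #include <libavformat/avformat.h>

    /* Sketch: open the drone's feed as a plain TCP URL with libavformat.
     * Untested against the drone's actual framing; error handling omitted. */
    static AVFormatContext *open_drone_stream(void)
    {
        AVFormatContext *fmt = NULL;
        avformat_network_init();                 /* required for network protocols */
        if (avformat_open_input(&fmt, "tcp://192.168.1.1:5555", NULL, NULL) < 0)
            return NULL;                         /* connection or probing failed */
        avformat_find_stream_info(fmt, NULL);    /* probe the incoming stream */
        return fmt;                              /* then read/decode packets as usual */
    }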

How to use prebuilt FFmpeg in Android Studio

I'm sure this is a very basic question, but since this is my first time messing around with the NDK, a lot of things are still very unclear to me.
Use case:
I'm trying to develop a video scrubbing feature, so fast and accurate frame seeking is crucial. I've tried most of the available players out there but the performance is still not up to my requirements. That's why I'm going down the FFmpeg route.
Basically, what I'm looking for is FFmpeg input seeking. I've tried WritingMinds' ffmpeg-android-java. However, it is a file-based implementation, which means the out.jpg needs to be written to external storage and read back, which is a big performance hit (roughly 1000 milliseconds per seek).
That's why I'm trying to build my own FFmpeg player to do the input seeking in JNI and push back the byte[] to be displayed in Java.
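Roughly, the native function I have in mind would look like the sketch below. The function name and the init code it relies on are made up, it uses the newer send/receive decode API, and the pixel-format conversion you would need in practice is left out.

    #include <jni.h>
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    /* Assumed to be opened by a separate native init function (not shown). */
    static AVFormatContext *fmt_ctx;
    static AVCodecContext  *codec_ctx;
    static int              video_stream_index;

    /* Sketch only: seek near a position, decode one frame, and return its raw
     * bytes to Java. A real player would convert the frame (e.g. to RGBA with
     * libswscale) before handing it back; here only the first plane is copied
     * to show the byte[] hand-off. */
    JNIEXPORT jbyteArray JNICALL
    Java_com_example_myffmpegplayer_HelloJNI_seekAndGrabFrame(JNIEnv *env, jobject thiz,
                                                              jlong position_ms)
    {
        int64_t ts = av_rescale_q(position_ms, (AVRational){1, 1000},
                                  fmt_ctx->streams[video_stream_index]->time_base);
        av_seek_frame(fmt_ctx, video_stream_index, ts, AVSEEK_FLAG_BACKWARD);
        avcodec_flush_buffers(codec_ctx);        /* drop state from the old position */

        AVPacket *pkt = av_packet_alloc();
        AVFrame  *frame = av_frame_alloc();
        jbyteArray out = NULL;
        while (out == NULL && av_read_frame(fmt_ctx, pkt) >= 0) {
            if (pkt->stream_index == video_stream_index) {
                avcodec_send_packet(codec_ctx, pkt);
                if (avcodec_receive_frame(codec_ctx, frame) == 0) {
                    int size = frame->linesize[0] * frame->height;
                    out = (*env)->NewByteArray(env, size);
                    (*env)->SetByteArrayRegion(env, out, 0, size,
                                               (const jbyte *)frame->data[0]);
                }
            }
            av_packet_unref(pkt);
        }
        av_frame_free(&frame);
        av_packet_free(&pkt);
        return out;
    }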
Question: After a lot of struggling with the NDK, I've managed to set it up and successfully call the JNI method from my Java code. The structure is as below:
MyApp
 -app
 -MyFFmpegPlayer
   -build
   -libs
   -src
     -main
       -java
         -com.example.myffmpegplayer
           +HelloJNI.java
       -jni
         +MyFFmpegPlayer.c
After some failed attempts to build FFmpeg on Windows, I've decided to use WritingMinds' prebuilt FFmpeg. However, after extraction they just come out as plain ffmpeg files (not .so files), so I don't really know how to use them.
I would be very grateful if someone could chime in and give me a good starting point for my next step.
Thank you so much for your time.
So, to answer my own question, which I should change to "How to build FFmpeg and use it with Android Studio": I have created detailed step-by-step instructions that work for me (as of May 24th, 2016).
Sorry, I can't post the whole thing here as it is very long, plus it is easier for me to keep one source of information up to date.
I hope this can help someone, as I know there is way too much confusing and contradictory information on this topic out there.

Android: Build a decoder library from JPEG to MP4 video

I am working on Android 2.2, and my goal is to convert a sequence of images to an MP4 video (MPEG-4 Part 14, to be exact).
The most reasonable solution would be to use the FFmpeg libraries and compile them for Android using the NDK. However, I am looking for a solution without the NDK.
I do not expect you to build this tool for me, or to find someone else who did. I have spent quite some time searching for that, and it's something that probably no one has done yet.
So the only thing I am asking is help finding the specs, so I am able to build the decoder myself. I know that it's not a trivial task (maybe this is why no one has done it yet), but I want to do it anyway. So please just help me get started.

FFmpeg - Finally compiled. Now what?

OK So here is my story:
I am creating an app that requires me to take a couple of images and a video and merge them together. At first I had no idea what to use; I had never heard of FFmpeg or the NDK. After around 5 days of battling the NDK, switching to Ubuntu, and going crazy with ndk-build commands, I finally got FFmpeg to compile using the dolphin-player example. Now that I can run FFmpeg on my computer and Android device, I have no idea what to do next.
Here are the main questions I have:
To use FFmpeg, I saw that I need to use some sort of commands. First off, what are these commands, and where do I run them?
Second, are the commands all I need? By that I mean: can I just run my application normally, execute the commands somewhere in it, and it will do the rest for me? Or do I need some sort of element in the code, for example a VideoEncoder instance or something?
Third, I saw people using the NDK to use FFmpeg. Do I have to? Or is it optional? I would like to avoid using C if possible, as I don't know it at all.
OPTIONAL: Last but not least, is this the best way of handling what I need to do in my application? If so, can someone briefly guide me on how to use FFmpeg to accomplish said task (mention commands or anything like that)?
I know it's a wall of text but every question is important to me!
Thank you very much stackoverflow community!
I see my answer may no longer be relevant to your question, but I'll still put it here, as I've recently gone down that very same path and I understand the pain as well as the confusion caused by this matter (setting up the NDK with the mixed Gradle plugin took me 1 day, building FFmpeg took 2 days, and then I got stuck at "what am I supposed to do next??").
So in short, as @Daniel has pointed out, if you just want to use FFmpeg to run commands such as compressing, cutting, or inserting keyframes, then WritingMinds' prebuilt ffmpeg-android-java is the easiest way to get FFmpeg running in your app. The downside is that, since it just runs commands, it needs an input file and an output file for each operation. See my question here for further clarification.
If you need to do more complex tasks than this, then you have no choice but to build FFmpeg as a library and call its API. I've written down step-by-step instructions that work for me (May 2016). You can see them here:
Building FFmpeg v3.0.2 with NDK r11c (please use Ubuntu if you don't want to rebuild the whole thing; Linux Mint failed for me)
Using FFmpeg in Android Studio 2.1.1
Please don't ask me to copy the whole thing here, as it's a very long set of instructions and it's easier for me to keep one source of information up to date. I hope this can save someone's keyboard ;).
1. FFmpeg can be either an app or a set of libraries. If you use it as an app (with an executable binary installed), you can type the commands in a terminal. The app only has limited functions and may not solve your problem. In that case you need to use FFmpeg as libraries and call its APIs from your program.
2. To my understanding, the commands cannot solve your problem. You need to call the FFmpeg APIs. There are a bunch of code samples for video/image encoding/decoding. You will probably also need a container to package the outcome, and the FFmpeg libraries can do that as well.
3. I prefer the NDK, since FFmpeg is written in C/C++. There are Java wrappers for FFmpeg; if you use one, the NDK is not required. However, not all FFmpeg functions are wrapped well; you may try, and if that falls short, go back to the NDK solution.
4. The simplest way is to decode all your video/images into raw frames, combine them in the desired order, and encode them. In practice, however, this consumes too much memory, so the key point becomes: how can I do the same thing on the fly? It's not too hard once you reach this step; see the sketch below.
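To make point 4 a bit more concrete, the encoding half of that on-the-fly loop looks roughly like the sketch below. It uses the newer send/receive API, assumes the output muxer and encoder contexts are already set up (header written), and omits error handling; the helper name is made up.

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    /* Sketch of the "encode on the fly" half: push one raw frame into the
     * encoder and write whatever packets come out. Only the frame currently
     * in flight is held in memory. */
    static void encode_and_mux(AVFormatContext *out_fmt, AVCodecContext *enc,
                               AVStream *out_stream, AVFrame *raw_frame)
    {
        AVPacket *pkt = av_packet_alloc();

        avcodec_send_frame(enc, raw_frame);           /* raw_frame == NULL flushes */
        while (avcodec_receive_packet(enc, pkt) == 0) {
            av_packet_rescale_ts(pkt, enc->time_base, out_stream->time_base);
            pkt->stream_index = out_stream->index;
            av_interleaved_write_frame(out_fmt, pkt); /* muxes into the output file */
        }
        av_packet_free(&pkt);
    }

The caller decodes each source (video or image) one frame at a time in the desired order, calls encode_and_mux() for every frame, then flushes with a NULL frame and finishes with av_write_trailer().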

How do I use Android OpenCORE codecs using JNI?

I want to use the codecs in Android from my application. For now I just want to use the H.264 codec for testing, unless the MP3 or AAC codecs provide functions for sending the audio to the device's speaker, in which case I would prefer one of those.
I have the NDK installed along with Cygwin, GNU Make, and GNU Awk. I can't figure out what I need to do from here, though. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or how to make the Eclipse plugin know it needs to include the files.
An example or a tutorial would be much appreciated.
EDIT:
It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load.
I'm also confused as to how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand what the purpose of writing a library to access a library is. Couldn't you just use something like System.loadLibrary("opencore.so")?
You cannot build OpenCORE separately; it has to be built with the whole source tree. What are you trying to achieve? If you just want to play video/audio, use a VideoView or MediaPlayer object.
Build the Android source and use the headers and the static library from it. This will propel you straight to the zone of unsupported APIs.
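On the System.loadLibrary question from above: loading a library only maps the .so into the process; Java can then only call functions exported with JNI-style names and signatures, which plain native libraries do not have. That is the purpose of writing a small wrapper library. A minimal sketch of such a shim, with all names made up, looks like this:

    #include <jni.h>

    /* Minimal JNI shim: Java cannot call an arbitrary C function in a loaded
     * .so directly, so the wrapper exports a function whose name and signature
     * follow the JNI convention and then calls into the real library. */
    JNIEXPORT jint JNICALL
    Java_com_example_codec_NativeBridge_decodeBuffer(JNIEnv *env, jobject thiz,
                                                     jbyteArray input)
    {
        jsize len = (*env)->GetArrayLength(env, input);
        jbyte *bytes = (*env)->GetByteArrayElements(env, input, NULL);

        /* ... hand `bytes` to the underlying codec API here ... */

        (*env)->ReleaseByteArrayElements(env, input, bytes, JNI_ABORT);
        return (jint)len;   /* placeholder return value */
    }

On the Java side this pairs with a matching native method declaration and a System.loadLibrary call; note that loadLibrary takes the bare library name (e.g. "codecwrapper"), not a file name ending in .so.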
