Scenario:
I have to read video frames, edit them, and encode them into a new .mp4 file.
From the research I have done, it seems I can do this at the native level. Even if I do it at the native level, which open-source library can I use so that encoding and decoding the video will be faster? I have seen in many places that people use FFmpeg, but encoding an H.264 stream with it brings licensing obligations (LGPL, or GPL if x264 is enabled). Is there another library that has better performance and a license that is easier to comply with?
If you can target Android 4.3 (API 18) or greater, take a look at the bigflake samples. It's easy to adapt them to your needs.
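In that spirit, here is a minimal, untested sketch of the bigflake approach on API 18+: feed frames to a MediaCodec H.264 encoder through its input surface and write the encoded output to an .mp4 with MediaMuxer. Real code also has to render the (edited) frames into the input surface and call signalEndOfInputStream(); those parts are elided.

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.view.Surface;

// Sketch (API 18+): encode frames with MediaCodec, mux them to .mp4 with MediaMuxer.
public class Mp4EncodeSketch {
    public void encode(String outputPath) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Draw your edited frames into this surface (e.g. via OpenGL ES),
        // then call encoder.signalEndOfInputStream() when done.
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();

        MediaMuxer muxer = new MediaMuxer(outputPath,
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        ByteBuffer[] outputBuffers = encoder.getOutputBuffers(); // pre-API-21 style
        int track = -1;
        boolean eos = false;
        while (!eos) {
            int index = encoder.dequeueOutputBuffer(info, 10_000);
            if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = encoder.getOutputBuffers();
            } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer track must be added from the encoder's actual output format.
                track = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
            } else if (index >= 0) {
                if (info.size > 0 && track >= 0) {
                    muxer.writeSampleData(track, outputBuffers[index], info);
                }
                encoder.releaseOutputBuffer(index, false);
                eos = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
        muxer.stop();
        muxer.release();
        encoder.stop();
        encoder.release();
    }
}
```

This only runs on a device or emulator; the bigflake samples cover the GLES rendering and end-of-stream handling that this sketch leaves out.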
If you need to support earlier API versions, you will have to compile and use FFmpeg for Android.
Related
I want to allow users of my app to record video and then post-process it. Basically, all I need is for the video to be square (low resolution, about 400x400) and, when recording is done, to let the user adjust brightness/contrast.
I did some research on that and found the FFmpeg library, which can do it. But I'm not sure I am OK with its licensing. If I use FFmpeg, do I have to release my app's sources as well? My app will be free to download and use, but I am not comfortable with releasing its sources.
Also, about the square recording: since I am supporting API 14, Android doesn't let me set that resolution directly. There are two approaches I can think of:
Record the video at 640x480, then resize/crop it, and after that let the user do the post-processing. This definitely needs FFmpeg.
Capture the camera preview frames, crop them as they arrive, and render them into an MP4 video; once the video is rendered, let the user post-process it further. This needs FFmpeg as well.
My question, then, is: may I use FFmpeg without any worries about licensing?
Or is there another library that lets me do the above and is free to use?
Thanks very much
I am not a lawyer, and this is not legal advice. You should consult your lawyer for real legal advice.
FFmpeg is LGPL. You should read the license; it's somewhat more readable than most legalese.
The LGPL differs from the GPL in that you are not required to distribute your source code as long as you do not incorporate FFmpeg source code into your project. To achieve this, you must use FFmpeg as a dynamically linked library (e.g., .so, .dylib, .framework, or .dll). This is the default configuration.
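On Android, dynamic linking means packaging the FFmpeg .so files in the APK and loading them at runtime rather than statically linking them into your code. A sketch of the Java side; the library and method names below are illustrative and depend on how you built FFmpeg and named your wrapper:

```java
// Sketch: load FFmpeg as shared libraries at runtime (the LGPL-friendly setup).
// Library names are illustrative; they depend on your FFmpeg build configuration.
public class FfmpegLoader {
    static {
        System.loadLibrary("avutil");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("ffmpegjni"); // your own thin JNI wrapper (hypothetical name)
    }

    // Implemented in the JNI wrapper; hypothetical signature.
    public static native String getFfmpegVersion();
}
```

Your own JNI wrapper stays your code; only modifications to FFmpeg itself would have to be published.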
If you modify the FFmpeg source, you must make it available.
You must also comply with the copyright/patent license restrictions of every codec you compile into FFmpeg. These can be identified by the FFmpeg configure options, e.g. --enable-gpl. If you use that configure option, for example, you are agreeing to distribute your source code as well as the FFmpeg source code, subject to the requirements of that codec's license(s). (In the case of x264, I believe there is a commercial license as well as the GPL.)
Straight from the horse's mouth: http://www.ffmpeg.org/legal.html
Especially check the checklist.
For API 11+, you can use the Stagefright framework to encode your video to MP4; you don't need FFmpeg for this.
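For instance, the public MediaRecorder API (backed by Stagefright on those releases) records straight to an MP4 file. A minimal sketch, omitting the camera setup/unlock and error handling a real app needs:

```java
import android.media.MediaRecorder;

// Sketch: record directly to MP4 via MediaRecorder (Stagefright under the hood).
public class Mp4RecorderSketch {
    public MediaRecorder startRecording(String outputPath) throws Exception {
        MediaRecorder recorder = new MediaRecorder();
        // The call order matters: source, then output format, then encoders.
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setVideoSize(640, 480);
        recorder.setOutputFile(outputPath);
        recorder.prepare();
        recorder.start();
        return recorder; // caller must stop() and release() when done
    }
}
```

This covers recording only; the square crop and brightness/contrast post-processing still have to happen elsewhere.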
OTOH, there are quite a few ports of FFmpeg to Android; some even install a separate service whose sole purpose is to provide FFmpeg support for any app on the device. With such an approach you definitely do not violate any software licenses.
I am trying to build a video system on Android. I am using the sample provided by Qualcomm, which allows me to use OpenMAX and get hardware acceleration on Qualcomm devices.
However, this sample only generates a .h264 file, so I am looking for a good way to do the muxing. I've used MediaMuxer before, but it requires Android 4.3 or later, so it doesn't work with this sample (the Qualcomm sample only supports Android 4.2 and earlier).
Does anyone have any ideas? Thank you!
You can use FFmpeg: build FFmpeg for Android, create a JNI wrapper, and easily expose the muxing functionality to the Java level.
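The Java side of such a wrapper can be as small as one native method. Everything below (class, method, and library names) is a hypothetical sketch; the actual muxing would be implemented in C against libavformat:

```java
// Sketch: Java side of a hypothetical JNI wrapper around FFmpeg's muxer.
public class H264Muxer {
    static {
        System.loadLibrary("h264muxer"); // hypothetical native library built with the NDK
    }

    /**
     * Wraps a raw .h264 elementary stream into an .mp4 container without
     * re-encoding (native side uses libavformat). Returns 0 on success.
     * Hypothetical signature for illustration.
     */
    public static native int mux(String h264Path, String mp4Path, int frameRate);
}
```

Since muxing is a stream copy with no GPL-only codecs involved, an LGPL FFmpeg build is sufficient for this.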
Scenario:
I am working on an Android project where, on one particular OpenGL page, I display videos.
FFmpeg is used to obtain frames from the videos (as OpenGL does not support video as a texture), and I use those frames to produce a video effect.
I am using pre-compiled FFmpeg binaries in the project.
I was not aware of the legal implications of using the FFmpeg library.
My superior brought this to my notice: FFmpeg legal reference.
Problem:
I am no legal expert, so the only thing I understood is that using FFmpeg in a commercial free app (free to install, but the service must be paid for) is going to get me and my company into trouble :(
The source of the project, or any part of it, cannot be released in any way. (The client is very strict about this.)
Questions:
1) Is there an alternative to FFmpeg (under an Apache or MIT license) that I can use to obtain video frames?
2) In OpenGL, is getting the video frames and looping through them the only way to play a video? Is there an alternative way to achieve this functionality?
IANAL, but LGPL means that if you compile and use FFmpeg as a shared library (.so file) or a standalone executable, then you are fine, even in a closed-source application that you sell for money.
I am writing an app which needs to decode an H.264 (AVC) bitstream. I found that AVC codec sources exist in /frameworks/base/media/libstagefright/codecs/avc. Does anyone know how to access those codecs in an Android app? I guess it's through JNI, but I'm not clear on how this can be done.
After some investigation, I think one approach is to create my own classes and JNI interfaces in the Android source tree to enable using the codecs in an Android app.
Another way, which does not require any changes to the Android source, is to include the codecs as a shared library in my application using the NDK. Any thoughts on these? Which way is better (if feasible)?
I didn't find much information about Stagefright; it would be great if anyone could point some out. I am developing on Android 2.3.3.
Any comments are highly appreciated. Thanks!
Stagefright does not support decoding an elementary H.264 stream. It does, however, have an H.264 decoder component; in theory that library could be used, but in practice it will be tough to use as a standalone library because of its dependencies.
The best approach would be to use a JNI-wrapped independent H.264 decoder (like the one available with FFmpeg).
I'm looking for a way to decode AAC natively to PCM on Android. The decoder source code is at https://android.googlesource.com/platform/external/opencore/+/master/codecs_v2/audio/aac/dec, but I'm not familiar with NDK at all.
1) There's no way of doing this directly using the Android SDK, but can this be done via the NDK?
2) I would especially be interested in a simple way of accessing the decoder from SDK, with a short "bridge" through the NDK. Is this feasible?
3) Would such a solution work on all Android versions (1.5-2.2)?
4) I guess I could use http://code.google.com/p/aacplayer-android/ instead, but that implementation looks fairly CPU-intensive. Does anyone have experience with it?
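The "short bridge" idea in (2) is feasible and can be a single class with one native method. Everything below (class, method, and library names) is a hypothetical sketch of what such an SDK-to-NDK bridge might look like; the actual decoding would live in a C library built with the NDK:

```java
// Sketch: minimal SDK-to-NDK bridge for a native AAC decoder.
public class AacDecoder {
    static {
        System.loadLibrary("aacdec"); // hypothetical .so built with the NDK
    }

    /**
     * Decodes one AAC frame to 16-bit PCM (implemented in C).
     * Returns the number of PCM samples written to pcmOut.
     * Hypothetical signature for illustration.
     */
    public static native int decodeFrame(byte[] aacFrame, short[] pcmOut);
}
```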
I'm not sure what the policy is on answering really old questions, but what has worked well for me is using OpenSL ES with the NDK; it comes built in, and in fact the NDK ships with a sample, "native-audio", that demonstrates what you need.
One other thing you might look into is FFmpeg; it is LGPL (GPL with certain build options), and TuneIn Radio posted their modifications here: http://radiotime.com/mobile/android#/support/open-source