I am thinking about making a video app on Android. I have come to know that there are two popular libraries, FFmpeg and Vitamio.
I just want to know whether I can do simple video functions like playing a video, grabbing a frame, and converting video to mp3 with both libraries. What are their pros and cons?
Thanks in advance.
Vitamio is much easier to use. It is just like the default MediaPlayer in Android, but with some fancy capabilities. At the same time, it can't do everything you want; as I understand it, it just plays video.
FFmpeg is not as simple. It is quite a powerful library that requires some thoughtful native coding in C. I'm sure you can do whatever you need with it, but it will cost you a lot of effort.
So I would recommend using either Android's default MediaPlayer or Vitamio's for playing video, and implementing the more specific features with the help of FFmpeg.
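For the playback part, the stock route is about as simple as it gets. Here is a minimal sketch using a VideoView inside an Activity; the layout id and file path are placeholders, not something from your project:

    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.MediaController;
    import android.widget.VideoView;

    public class PlayerActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.player);                  // a layout containing a VideoView
            VideoView videoView = (VideoView) findViewById(R.id.video_view);
            videoView.setMediaController(new MediaController(this));
            videoView.setVideoPath("/sdcard/sample.mp4");     // or setVideoURI(...) for a stream
            videoView.start();
        }
    }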
Old question, but since I'm working on this very problem and I found a lot of outdated information, I think it might be worth an answer anyway.
Vitamio is actually based on ffmpeg, which makes the "Vitamio vs ffmpeg" question moot:
What's Vitamio, on the Vitamio website
The Media Formats paragraph states:
Vitamio used FFmpeg as the demuxers and main decoders, many audio and video codecs are packed into Vitamio beside the default media format built in Android platform, some of them are listed below.
the "used" instead of "uses" looks like a typo (the site is choke full of them...).
Therefore the tip of the scale would point toward Vitaminio, it seems, as it's very easy to use. However...
Vitamio is a derivative work of ffmpeg for android, with an unspecified license (the site does spell out quite clearly that individuals can use the library freely in their own apps, though; that part was probably added after the answer by Marcus Wolschon).
ffmpeg for android is a derivative work of ffmpeg (more than that, actually: it's a port) and it's released under the LGPL v3 to abide by its license obligations toward ffmpeg (which it actually does), passing the same obligations on to the user in a viral way (attribution, making the source used for the compilation available, and so on).
ffmpeg (the original work) is released under two licenses: the stricter GPL (of no consequence in this discussion) and the more permissive LGPL (the one picked by ffmpeg for android).
What follows is strictly my biased personal opinion, not a statement about the facts
This leads me to think that unless the company selling Vitamio has some agreement, that we know nothing about, with both the author of ffmpeg for android and ffmpeg itself, Vitamio is seriously violating the copyright of ffmpeg for android (and therefore of ffmpeg).
The fact that Vitamio's website has a lot of broken links, grammatically challenged documentation, and not all the legal information required for an educated choice doesn't exactly play in its favor, if I had to make up my mind between considering them a high-profile company or some individual trying to live off the back of the ffmpeg team's work...
This leads me to the following considerations:
from a practical standpoint, Vitamio may or may not be the best choice: it should be easy to integrate into your project (I say "should" because I haven't managed to do it yet, and I have quite some experience as an Android developer under my belt...). The conditional is warranted, as the software is based on ffmpeg for android, which already makes an effort to ease Android integration.
from a legal standpoint, the situation is even shadier; the options are:
turn a blind eye. The company behind Vitamio says that the product is fine and free to use (more than that, actually: it sells the library), so if there is a licensing issue, it's their issue. As developers we have a semi-legitimate reason to care only up to a point, as we would be as cheated as the people behind ffmpeg (I'm not very convinced by this whole argument myself...).
go for honesty and shun Vitamio, adopting ffmpeg for android instead, which might mean more homework during the integration, both to replicate MediaPlayer and to abide by the LGPL terms, but guarantees a clear conscience.
I haven't made up my mind yet, but I'll probably opt for option 2.
Hope this helps
UPDATE: It looks like Vitamio is (at least partially) complying with the terms of the LGPL, as they are publishing the source code required to build their product:
ffmpeg for Vitamio on GitHub
I don't have the time to find out whether this is all that's required to comply with the original ffmpeg license (I'm skeptical), or how that affects the previous considerations (sorry).
You can't use Vitamio in any project because it has no license.
Without a LICENSE file or any other mention of what rights you are granted, you are granted no rights to use it at all.
See here: http://vitamio.org/topics/93?locale=en
Related
I want to allow users in my app to record video and then post-process it. Basically, all I need is for the video to be square (low res, something like 400x400), and when recording is done, to let the user adjust brightness/contrast.
I did some research and found the ffmpeg library, which can do that. But I'm not sure about its licensing. If I use ffmpeg, do I have to release my app's sources as well? My app will be free to download and use, but I am not comfortable with releasing its sources.
Also, about the square recording: as I am supporting API 14, Android doesn't let me set that resolution directly. There are two ways I can think of:
Record the video in 640x480, then resize/crop it, and after that let the user do the post-processing - I definitely need ffmpeg for that (a rough sketch of such a crop follows this list).
Capture the camera preview frames, crop them as they arrive and render them into an mp4 video, and once the video is rendered let the user post-process it further - I need ffmpeg for that as well.
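For option 1, I imagine shelling out to a bundled ffmpeg binary roughly like this; the binary location, file names and filter values are just my assumptions, not a tested recipe (brightness/contrast could presumably be appended to the same -vf chain later, e.g. with the eq filter):

    import java.io.IOException;

    public class SquareCropSketch {
        // Crop the 640x480 recording to a centred 480x480 square, then scale it to 400x400.
        public static int cropToSquare() throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    "/data/data/com.example.app/files/ffmpeg",   // wherever the binary gets unpacked
                    "-i", "/sdcard/recording.mp4",
                    "-vf", "crop=480:480:80:0,scale=400:400",    // crop=w:h:x:y, then downscale
                    "-c:a", "copy",                              // keep the audio track as-is
                    "/sdcard/square.mp4");
            pb.redirectErrorStream(true);
            Process p = pb.start();
            return p.waitFor();                                  // check the exit code / read the output in practice
        }
    }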
My question, then, is: may I use ffmpeg without any worries about licensing, etc.?
Or is there any other library which allows me to do the above and is free to use?
Thanks very much
I am not a lawyer, and this is not legal advice. You should consult your lawyer for real legal advice.
FFmpeg is LGPL. You should read the license; it's somewhat more readable than most legalese.
The LGPL differs from the GPL in that you are not required to distribute your source code so long as you do not incorporate FFmpeg source code into your project. To achieve this, you must use FFmpeg as a so-called dynamic link library (e.g., .so, .dylib, .framework, .dll, etc). This is the default configuration.
If you modify the FFmpeg source, you must make it available.
You must also comply with the copyright license/patent license restrictions of all codecs you compile with FFmpeg. These can be identified by the FFmpeg configure options, e.g. --enable-gpl. If you use this configure option, for example, you are agreeing to distribute your source code as well as the FFmpeg source code, subject to the requirements of that codec's license(s). (In the case of x264, I believe there is a commercial license as well as the GPL.)
Straight from the horse's mouth: http://www.ffmpeg.org/legal.html
Especially check the checklist.
For API 11+, you can use the stagefright framework to encode your video to mp4 (its public Java wrapper, MediaCodec, arrived in API 16); you don't need ffmpeg for this.
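To make that concrete, here is a minimal sketch of configuring such an encoder through MediaCodec (API 16+). The resolution, bitrate and color format are illustrative, and you still have to feed it frames and mux the output (MediaMuxer exists from API 18) yourself:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    public class EncoderSketch {
        public static MediaCodec createSquareEncoder() throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 400, 400);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            encoder.start();
            // Queue cropped camera frames into the input buffers and write the
            // encoded output into an MP4 container.
            return encoder;
        }
    }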
OTOH, there are quite a few ports of ffmpeg to Android; some even install a separate service whose sole purpose is to provide ffmpeg support for any app on the device. Using such an approach, you definitely do not violate any software licenses.
What is the best and most efficient way of getting G.729 support on an Android device?
In my current application I have to use the G.729 codec. I have searched a lot, but couldn't find any library. Is there any way of using G.729?
EDIT:
Where can I buy such a codec library so I can use it in my app for development?
There are lots of G.729 implementations if you google around for them.
There is the one in CSipSimple that you can take and use.
Here is another one I found which you could most likely adapt to compile under Android.
I have not seen any implementations in Java though, so you will most likely have to use the NDK to compile the C/C++ source and access it from the Android side.
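A hypothetical JNI wrapper for such a port could look like this; the library name and the native method signatures below are assumptions for illustration, not the API of any particular G.729 implementation:

    public class G729Codec {
        static {
            System.loadLibrary("g729");               // hypothetical libg729.so built with the NDK
        }

        // G.729 operates on 10 ms frames: 80 16-bit PCM samples in, 10 bytes out.
        public static native byte[] encode(short[] pcmFrame);
        public static native short[] decode(byte[] g729Frame);
    }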
The other fact you need to consider, though, is that G.729 is NOT royalty free, so you need to pay royalty fees for your use of it. See here for details.
Finally, I ended up using Linphone. It is very easy to build and use. They have also added G.729 codec support, which you can find in the application settings.
From here you can check out the latest source. Please read the README file for the build process.
Really good application to work with.
FFmpeg probably supports that codec. There are a lot of examples on the net of how to integrate ffmpeg into Android (although this is probably not very easy).
I'm looking for a way to integrate opus-codec (the decoder part) with my Android application.
Do you know of any implementations that have done so? We are currently using Ogg Vorbis for spoken prompts and are considering going with either Speex (deprecated, but with a few documented attempts) or Opus (currently no documented attempts).
If we have to go the NDK route, do you think it would give us an improvement in application size? Ogg Vorbis is supported by the platform; neither Speex nor Opus is.
I recommend you have a look at the Opus API documentation. Also, there's now an OpusFile library (equivalent to Vorbis' libvorbisfile) in early development. Otherwise, you can always read the opusdec source code.
Scenario:
I am working on an Android project where, on one particular OpenGL page, I display videos.
FFmpeg is used to obtain frames from the videos (as OpenGL does not support video as a texture directly), and I use those frames to achieve a video effect.
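To give an idea, this is roughly how one decoded RGBA frame ends up in an OpenGL ES 2.0 texture on the Java side; the class, method name, buffer and dimensions are placeholders:

    import android.opengl.GLES20;
    import java.nio.ByteBuffer;

    public class FrameTextureSketch {
        // Must be called on the GL thread; rgbaFrame holds width * height * 4 bytes.
        public static int uploadFrame(ByteBuffer rgbaFrame, int width, int height) {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
                    GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaFrame);
            return tex[0];
        }
    }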
I am using pre-compiled FFmpeg binaries in the project.
I was not aware of the legal implications of using the FFmpeg library.
My superior brought this to my notice: FFmpeg legal reference
Problem:
I am no legal expert, so the only thing I understood is that using FFmpeg in a commercial free app (where the service has to be paid for) is going to get me and my company into trouble :(
The source, or any part of the source, of the project cannot be released in any way. (The client is very strict about this.)
Questions:
1) Is there any alternative to FFmpeg (one under an Apache or MIT license) that I can use to obtain video frames?
2) In OpenGL, is grabbing the video frames and looping through them the only way of playing a video? Is there any alternative method to achieve this functionality?
IANAL, but the LGPL means that if you compile and use ffmpeg as a shared library (.so file) or a standalone executable, then you are fine - even in a closed-source application that you sell for money.
Can somebody give me some direction on how to synthesize the sounds of instruments (piano, drums, guitar, etc.)?
I am not even sure what to look for.
Thanks
Not sure if this is still the case, but Android seems to have latency issues that keep it from doing true real-time sound synthesis. NanoStudio, in my opinion, is the best audio app on iOS, and its author has so far refused to make an Android version because the framework isn't there yet.
See this link:
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=nanostudio+android#hl=en&q=+site:forums.blipinteractive.co.uk+nanostudio+android&bav=on.2,or.r_gc.r_pw.&fp=ee1cd411508a9e34&biw=1194&bih=939
It all depends on what kind of application you're making. If it's going to be an Akai APC-style controller firing off sounds, you could be alright. If you're after true synthesis (crafting waveforms so they replicate pianos, guitars, and drums), which is what JASS (mentioned in another answer) does, then Android might not be able to handle it.
If you're looking for a guide on emulating organic instruments via synthesis, check out the books by Fred Welsh: http://www.synthesizer-cookbook.com/
Synthesizing a guitar, piano, or natural drums would be difficult. Triggering samples that you pass through a synthesis engine is less so. If you want to synthesize analog synth sounds, that's easier.
Here is a project out there you might be able to grab code from:
https://sites.google.com/site/androidsynthesizer/
In the end, if you want to create a full synthesizer or multi-track application, you'll have to render your oscillators, filters, etc. into an audio stream that you play back yourself (for raw PCM that means AudioTrack rather than MediaPlayer). You don't necessarily need MIDI to do that.
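To make that concrete, here is a minimal sketch that renders a single sine oscillator into a PCM stream with AudioTrack; the frequency, amplitude and sample rate are arbitrary, and a real synth would mix several oscillators and filters into the same buffer:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class SineSketch {
        public static void playSine(double seconds) {
            final int sampleRate = 44100;
            int bufSize = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bufSize, AudioTrack.MODE_STREAM);
            track.play();

            short[] buffer = new short[bufSize / 2];
            double phase = 0;
            double step = 2 * Math.PI * 440 / sampleRate;       // 440 Hz oscillator
            long samplesLeft = (long) (seconds * sampleRate);
            while (samplesLeft > 0) {
                for (int i = 0; i < buffer.length; i++) {
                    buffer[i] = (short) (Math.sin(phase) * Short.MAX_VALUE * 0.5);
                    phase += step;
                }
                track.write(buffer, 0, buffer.length);          // blocking write = the render loop
                samplesLeft -= buffer.length;
            }
            track.stop();
            track.release();
        }
    }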
Here is one person's experience:
http://jazarimusic.com/2011/06/audio-on-android-a-developers-perspective/
Interesting read.
Two projects that might be worth looking at are JASS (Java Audio Synthesis System) and Pure Data. Pure Data is quite interesting, though probably the harder path.
MIDI support on Android sucks. (So does audio support in general, but that's another story.) There's an interesting blog post here that discusses the (lack of) MIDI capabilities on Android. Here's what he did to work around some of the limitations:
Personally I solved the dynamic midi generation issue as follows: programmatically generate a midi file, write it to the device storage, initiate a mediaplayer with the file and let it play. This is fast enough if you just need to play a dynamic midi sound. I doubt it’s useful for creating user controlled midi stuff like sequencers, but for other cases it’s great.
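A bare-bones version of that workaround might look like this; the file path is a placeholder, and in practice you would generate and write the .mid bytes yourself first:

    import android.media.MediaPlayer;
    import java.io.IOException;

    public class MidiFileSketch {
        public static void playMidiFile(String path) throws IOException {
            MediaPlayer mp = new MediaPlayer();
            mp.setDataSource(path);                       // e.g. "/sdcard/generated.mid"
            mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                @Override public void onCompletion(MediaPlayer player) { player.release(); }
            });
            mp.prepare();                                 // blocking; use prepareAsync() off the UI thread
            mp.start();
        }
    }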
Android unfortunately took out MIDI support in the official Java SDK.
That is, you cannot simply play MIDI streams; you have to generate the audio yourself and play it through the provided media classes (AudioTrack for raw PCM).
You will have to use some DSP (digital signal processing) knowledge and the NDK in order to do this.
I would not be surprised if there was a general package (not necessarily for Android) to allow you to do this.
I hope this pointed you in the right direction!