There is a 'native-media' project in the NDK samples that calls OMX functions at the C level to decode and play video, but it seems that the NDK does not currently support encoding through OMX. Is that true?
Besides that, I also found this link, where people seem to be discussing using OMX for video encoding, but I can't find any further information about OMX encoding on Android. Does anyone know about that?
We can do that under certain conditions.
Google doesn't provide an API for native-media encoding at the moment. If we want to do that, we need to use code provided by the hardware vendor, compile it into a .so file, and call its functions through that file from the NDK.
I've found a sample provided by Qualcomm; you can find it at this link. After you download the sample, you will find a user guide and a demo.
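To make that approach concrete, here is a minimal sketch of loading a vendor encoder .so from NDK code with dlopen/dlsym. The library and function names (libvendor_enc.so, VendorEncInit, VendorEncEncodeFrame) are hypothetical placeholders; the real symbols come from your vendor's headers and documentation.

    /* Minimal sketch: load a vendor-supplied encoder .so at runtime.
     * All names below are hypothetical; substitute the symbols your
     * vendor actually documents. */
    #include <dlfcn.h>
    #include <stdio.h>

    typedef int (*enc_init_fn)(int width, int height, int bitrate);
    typedef int (*enc_frame_fn)(const unsigned char *yuv,
                                unsigned char *out, int out_cap);

    int encode_with_vendor_lib(void) {
        void *lib = dlopen("libvendor_enc.so", RTLD_NOW);
        if (!lib) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return -1;
        }
        enc_init_fn init = (enc_init_fn)dlsym(lib, "VendorEncInit");
        enc_frame_fn encode = (enc_frame_fn)dlsym(lib, "VendorEncEncodeFrame");
        if (!init || !encode) {
            dlclose(lib);
            return -1;
        }
        init(640, 480, 1000000);
        /* ... feed YUV frames to encode() and collect H.264 output ... */
        dlclose(lib);
        return 0;
    }

The same functions could instead be linked against the vendor's .so at build time; dlopen just avoids a hard link-time dependency on a library that only exists on certain devices.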
I have gone through these links, and a few others as well:
khronos
OpenMax_Development_Guide
bellagio_openmax_il_open_source_implementation_enables_developers_to_create
but all of them only explain the calling sequence, show block diagrams, and so on; they don't explain how to write and build an OpenMAX component and plug it into Android. Even the link on Android building and porting is complicated, and it doesn't say whether you need the whole Android source tree to write and build an OpenMAX plugin, only part of it, or whether you can create one without the Android source at all.
I have a Firefly K3288 board running Android KitKat 4.4, which supports an HEVC hardware decoder, but I want to add an HEVC software decoder.
If anyone knows how to write and build an OpenMAX HEVC video decoder component and plug it into Android, please give some directions.
For the first question, how to develop an OMX component: you will have to write a new component, either from scratch or from a template of existing functions. Please refer to the OpenMAX IL specification, specifically chapter 2.
I would recommend writing a component based on the Bellagio implementation, which can be found here. Please refer to omx_base_video_port.c, as it is essential for your decoder development.
An alternative is to refer to an implementation from one of the vendors. In the AOSP tree, the qcom implementation here could give you a good reference for starting your development.
Note: the OMX wrapper is mostly concerned with state management, context management, and buffer management. How it interacts with your decoder, whether HW or SW, depends on your driver architecture, which you should decide on first. Once that driver architecture is finalized, integrating into OMX should be fairly easy.
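To give a feel for the shape of a component, here is a bare-bones sketch of an IL component using the standard OMX_Core.h / OMX_Component.h headers from the spec. How the init function is named and discovered depends on your IL core's loader (Bellagio, for instance, registers components through its own setup hook), so treat this purely as an illustration of wiring up the function table:

    /* Skeleton of an OMX IL decoder component: the core hands us an
     * OMX_COMPONENTTYPE and we fill in its function pointers. Real
     * components (see Bellagio's base classes) also implement port
     * definitions, state transitions, and buffer queues. */
    #include <OMX_Core.h>
    #include <OMX_Component.h>

    static OMX_ERRORTYPE hevc_send_command(OMX_HANDLETYPE hComp,
            OMX_COMMANDTYPE cmd, OMX_U32 nParam, OMX_PTR pCmdData) {
        /* handle OMX_CommandStateSet, OMX_CommandFlush, ... */
        return OMX_ErrorNone;
    }

    static OMX_ERRORTYPE hevc_empty_this_buffer(OMX_HANDLETYPE hComp,
            OMX_BUFFERHEADERTYPE *pBuffer) {
        /* queue the input (HEVC bitstream) buffer for the decode thread */
        return OMX_ErrorNone;
    }

    static OMX_ERRORTYPE hevc_fill_this_buffer(OMX_HANDLETYPE hComp,
            OMX_BUFFERHEADERTYPE *pBuffer) {
        /* queue an output (YUV) buffer to be filled */
        return OMX_ErrorNone;
    }

    OMX_ERRORTYPE hevc_component_init(OMX_HANDLETYPE hComponent) {
        OMX_COMPONENTTYPE *comp = (OMX_COMPONENTTYPE *)hComponent;
        comp->SendCommand     = hevc_send_command;
        comp->EmptyThisBuffer = hevc_empty_this_buffer;
        comp->FillThisBuffer  = hevc_fill_this_buffer;
        /* GetParameter/SetParameter, SetCallbacks, GetState, etc.
         * must be filled in as well; see chapter 2 of the IL spec. */
        return OMX_ErrorNone;
    }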
For the second question, how to integrate the HEVC decoder, please refer to this question, which has the relevant details.
I want to allow users of my app to record video and then post-process it. Basically, all I need is for the video to be square (low resolution, around 400x400) and, once recording is done, to let the user adjust brightness/contrast.
I did some research on this and found the ffmpeg library, which can do it. But I'm not sure I'm OK with its licensing: when I use ffmpeg, do I have to release my app's sources as well? My app will be free to download and use, but I am not comfortable with releasing its sources.
Also, about the square recording: since I'm supporting API 14, Android doesn't let me set that resolution directly. I can think of two approaches:
Record the video at 640x480, then resize/crop it, and after that let the user do the post-processing - definitely need ffmpeg for that.
Capture camera preview frames, crop them as they arrive (see the sketch after this list), render them into an mp4 video, and once the video is rendered let the user post-process it further - need ffmpeg for that as well.
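For the cropping step itself, no library is strictly needed: the default Android preview format, NV21, can be center-cropped with a plain per-row copy. A minimal sketch, assuming the crop size is even and the destination buffer holds s*s*3/2 bytes:

    /* Center-crop a WxH NV21 preview frame to an s x s square.
     * NV21 layout: w*h bytes of Y, then w*h/2 bytes of interleaved VU
     * at half resolution in both directions. */
    #include <stdint.h>
    #include <string.h>

    void nv21_center_crop(const uint8_t *src, int w, int h,
                          uint8_t *dst, int s) {
        int x0 = ((w - s) / 2) & ~1;   /* keep offsets even so the */
        int y0 = ((h - s) / 2) & ~1;   /* chroma pairs stay aligned */

        /* Y plane: one byte per pixel */
        for (int y = 0; y < s; y++)
            memcpy(dst + y * s, src + (y0 + y) * w + x0, s);

        /* VU plane: each row holds w bytes (w/2 VU pairs) */
        const uint8_t *svu = src + w * h;
        uint8_t *dvu = dst + s * s;
        for (int y = 0; y < s / 2; y++)
            memcpy(dvu + y * s, svu + (y0 / 2 + y) * w + x0, s);
    }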
My question, then: may I use ffmpeg without any worries about licensing?
Or is there another library that allows me to do the above and is free to use?
Thanks very much
I am not a lawyer, and this is not legal advice. You should consult your lawyer for real legal advice.
FFmpeg is LGPL. You should read the license; it's somewhat more readable than most legalese.
The LGPL differs from the GPL in that you are not required to distribute your source code so long as you do not incorporate FFmpeg source code into your project. To achieve this, you must use FFmpeg as a so-called dynamic link library (e.g., .so, .dylib, .framework, .dll, etc). This is the default configuration.
If you modify the FFmpeg source, you must make it available.
You must also comply with the copyright and patent license restrictions of all codecs you compile into FFmpeg. You can distinguish these by the FFmpeg configure options, e.g. --enable-gpl: if you use that option, you are agreeing to distribute your source code as well as the FFmpeg source code, subject to the requirements of that codec's license(s). (In the case of x264, I believe there is a commercial license as well as the GPL.)
Straight from the horse's mouth: http://www.ffmpeg.org/legal.html
Especially check the checklist.
For API 11+, you can use the stagefright framework to encode your video to mp4; you don't need ffmpeg for this.
OTOH, there are quite a few ports of ffmpeg to Android; some even install a separate service whose sole purpose is to provide ffmpeg support to any app on the device. With that approach you definitely do not violate any software licenses.
I am trying to build a video system on Android. I am using the sample provided by Qualcomm, which lets me use OpenMAX with hardware acceleration on Qualcomm devices.
However, this sample only generates a raw .h264 file, so I am looking for a good way to do the muxing. I've used MediaMuxer before, but it requires Android 4.3 or later, so it doesn't work here (the Qualcomm sample only supports Android 4.2 and earlier).
Does anyone have any ideas? Thank you!
You can use ffmpeg: build ffmpeg for Android, create a JNI wrapper, and expose the muxing functionality to the Java level.
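A rough sketch of what that wrapper could look like with libavformat is below. The Java-side class name (com.example.Muxer) is a placeholder, error handling is abbreviated, and very old FFmpeg releases would additionally need av_register_all() and use slightly different calls:

    #include <jni.h>
    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    /* Remux a raw H.264 elementary stream into an .mp4 container. */
    static int mux_h264_to_mp4(const char *in_path, const char *out_path) {
        AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
        AVPacket pkt;
        int64_t frame = 0;

        if (avformat_open_input(&in_ctx, in_path, NULL, NULL) < 0)
            return -1;
        avformat_find_stream_info(in_ctx, NULL);

        avformat_alloc_output_context2(&out_ctx, NULL, "mp4", out_path);
        AVStream *st = avformat_new_stream(out_ctx, NULL);
        avcodec_parameters_copy(st->codecpar, in_ctx->streams[0]->codecpar);

        avio_open(&out_ctx->pb, out_path, AVIO_FLAG_WRITE);
        avformat_write_header(out_ctx, NULL);

        while (av_read_frame(in_ctx, &pkt) >= 0) {
            /* a raw .h264 stream carries no timestamps, so synthesize
             * them from the capture frame rate (30 fps assumed here) */
            pkt.pts = pkt.dts = av_rescale_q(frame++,
                    (AVRational){1, 30}, st->time_base);
            pkt.stream_index = 0;
            av_interleaved_write_frame(out_ctx, &pkt);
            av_packet_unref(&pkt);
        }

        av_write_trailer(out_ctx);
        avio_closep(&out_ctx->pb);
        avformat_free_context(out_ctx);
        avformat_close_input(&in_ctx);
        return 0;
    }

    JNIEXPORT jint JNICALL
    Java_com_example_Muxer_mux(JNIEnv *env, jclass clazz,
                               jstring jin, jstring jout) {
        const char *in  = (*env)->GetStringUTFChars(env, jin, NULL);
        const char *out = (*env)->GetStringUTFChars(env, jout, NULL);
        int ret = mux_h264_to_mp4(in, out);
        (*env)->ReleaseStringUTFChars(env, jin, in);
        (*env)->ReleaseStringUTFChars(env, jout, out);
        return ret;
    }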
I am writing an app which needs to decode an H.264 (AVC) bitstream. I found that AVC codec sources exist in /frameworks/base/media/libstagefright/codecs/avc; does anyone know how one can get access to those codecs from an Android app? I guess it's through JNI, but I'm not clear on how this can be done.
After some investigation, I think one approach is to create my own classes and JNI interfaces in the Android source tree to enable using the codecs from an Android app.
Another way, which does not require any changes to the Android source, is to include the codecs as a shared library in my application using the NDK. Any thoughts on these? Which way is better (if feasible)?
I didn't find much information about Stagefright; it would be great if anyone could point some out. I am developing on Android 2.3.3.
Any comments are highly appreciated. Thanks!
Stagefright does not support elementary H.264 decoding directly, though it does have an H.264 decoder component. In theory this library could be used, but in practice it will be tough to use as a standalone library because of its dependencies.
The best approach would be to use a JNI-wrapped independent H.264 decoder (like the one available with ffmpeg).
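For reference, a minimal sketch of such a decoder behind a JNI wrapper, using FFmpeg's libavcodec. This uses the send/receive API of recent FFmpeg releases (builds from the Android 2.3.3 era used avcodec_decode_video2 instead), and error handling is abbreviated:

    #include <libavcodec/avcodec.h>

    /* Decode one Annex-B H.264 access unit; returns frames produced. */
    static int decode_h264(const uint8_t *data, int size) {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        AVFrame *frame = av_frame_alloc();
        AVPacket *pkt = av_packet_alloc();
        int frames = 0;

        if (avcodec_open2(ctx, codec, NULL) < 0)
            return -1;

        pkt->data = (uint8_t *)data;
        pkt->size = size;

        if (avcodec_send_packet(ctx, pkt) == 0) {
            while (avcodec_receive_frame(ctx, frame) == 0) {
                /* frame->data[0..2] now hold the decoded YUV planes */
                frames++;
            }
        }

        av_packet_free(&pkt);
        av_frame_free(&frame);
        avcodec_free_context(&ctx);
        return frames;
    }

The JNI glue on top of this is the same pattern as for any other native call: a Java_... entry point that passes the bitstream buffer down and the decoded frames back up.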
I want to use the codecs in Android from my application. For now, I just want to use the H.264 codec for testing, unless the mp3 or aac codecs provide functions for sending audio to the device's speaker, in which case I would prefer one of those.
I have the NDK installed, along with Cygwin, GNU Make, and GNU Awk, but I can't figure out what to do from here. I'm downloading the entire OpenCORE tree right now, but I don't even know how to build it or make the Eclipse plugin include the files.
An example or a tutorial would be much appreciated.
EDIT:
It looks like I can use JNI like P/Invoke, which would mean I don't have to build the OpenCORE libraries myself. However, I can't find any documentation on the names of the libraries I need to load.
I'm also confused about how to do it. I'm looking at http://www.koushikdutta.com/2009/01/jni-in-android-and-foreword-of-why-jni.html and I don't understand the purpose of writing a library to access a library. Couldn't you just use something like System.loadLibrary("opencore.so")?
You cannot build OpenCORE separately; it has to be built as part of the whole source tree. What are you trying to achieve? If you just want to play video or audio, use a VideoView or a MediaPlayer object.
Build the Android source and use the headers and the static library from it. Be aware that this will propel you straight into unsupported-API territory.
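As for why a wrapper library is needed at all: System.loadLibrary() only maps the library into the process (and note it takes the bare name: loadLibrary("foo") loads libfoo.so, so "opencore.so" would be wrong either way). Java can then only call native functions exported under JNI's naming convention, Java_<package>_<Class>_<method>, with JNI calling conventions, and a codec library's internal C/C++ symbols don't look like that. So you write a thin bridging .so that exposes JNI-named entry points and calls into the real library. A minimal sketch, with a hypothetical class com.example.Decoder:

    #include <jni.h>

    /* The exported name must exactly match the Java declaration:
     *   package com.example;  class Decoder { native String version(); }
     * A real wrapper would call into the wrapped codec library here
     * instead of returning a constant. */
    JNIEXPORT jstring JNICALL
    Java_com_example_Decoder_version(JNIEnv *env, jobject thiz) {
        return (*env)->NewStringUTF(env, "wrapper 0.1");
    }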