I have a new task of integrating a decoder (HEVC) from FFmpeg into Android's Stagefright. To do this, I first need to create an OMX component; my next step is to register my codec in media_codecs.xml and then register the OMX component in the OMX Core.
Are there any guides or steps for creating an OMX component for a video decoder? Secondly, this decoder plays only elementary streams (.bin or .h265 files), so there is no container format here.
Can anyone provide some steps or guidelines to be followed while creating the OMX component for a video codec? Any sort of pointers would be really helpful to me.
Thanks in advance.
In general, you could follow the steps pointed out in this question for integrating a decoder into the OMX Core.
HEVC is not yet part of the OMX IL specification. Hence, you would have to introduce a new role like video_decoder.hevc for your component while registering in media_codecs.xml. Please do check that your OMX core can support this new role.
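For illustration, a registration entry in media_codecs.xml could look something like the snippet below. The component name OMX.mycompany.hevc.decoder is made up and has to match whatever your OMX core registers, and "video/hevc" is the MIME string the framework later standardized for HEVC; on an older tree you may have to introduce that constant yourself:

```xml
<MediaCodecs>
    <Decoders>
        <!-- Hypothetical component name; must match the name your OMX core exposes -->
        <MediaCodec name="OMX.mycompany.hevc.decoder" type="video/hevc" />
    </Decoders>
</MediaCodecs>
```

The video_decoder.hevc role itself is advertised by the component (e.g. through ComponentRoleEnum), so the XML entry, the role, and the core registration have to stay consistent.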
If you are trying to play only elementary streams, you can consider modifying the stagefright command line utility to read the elementary stream data and feed the decoder.
Another option is to modify the current recordVideo utility to read frame data and create a decoder instead of an encoder. With these, I presume you should be able to drive your decoder from the command line.
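If you take the command-line route, the elementary-stream handling itself is simple. The standalone sketch below (not the actual stagefright/recordVideo code) just splits an Annex-B .h265/.bin file on start codes, which is essentially what the modified utility would need to do before queuing each NAL unit on the decoder's input:

```c++
// Standalone sketch: split an Annex-B elementary stream (.h265/.bin) into NAL units.
#include <cstdio>
#include <cstdint>
#include <vector>

// Returns true if a start code (00 00 01 or 00 00 00 01) begins at offset i.
static bool isStartCode(const std::vector<uint8_t>& b, size_t i) {
    if (i + 3 <= b.size() && b[i] == 0 && b[i + 1] == 0 && b[i + 2] == 1) return true;
    if (i + 4 <= b.size() && b[i] == 0 && b[i + 1] == 0 && b[i + 2] == 0 && b[i + 3] == 1) return true;
    return false;
}

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s stream.h265\n", argv[0]); return 1; }

    FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }
    std::vector<uint8_t> buf;
    for (int c; (c = std::fgetc(f)) != EOF; ) buf.push_back((uint8_t)c);
    std::fclose(f);

    // Skip to the first start code, then walk NAL unit by NAL unit.
    size_t pos = 0;
    while (pos < buf.size() && !isStartCode(buf, pos)) ++pos;

    size_t count = 0;
    while (pos < buf.size()) {
        size_t nalStart = pos;
        pos += (buf[pos + 2] == 1) ? 3 : 4;                 // step over the start code
        size_t next = pos;
        while (next < buf.size() && !isStartCode(buf, next)) ++next;
        // buf[nalStart, next) is one NAL unit (start code included);
        // this is what you would feed to the decoder's input port.
        std::printf("NAL %zu: %zu bytes\n", ++count, next - nalStart);
        pos = next;
    }
    return 0;
}
```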
EDIT: If you wish to build a new OMX component, I would recommend referring to the Bellagio Component Writer's Guide, which gives a pretty comprehensive walkthrough of building a new component. Please do ensure that you are able to identify the dependencies between the Bellagio implementation and your core implementation.
Also, you could look at other public domain OMX implementations as here:
http://androidxref.com/4.4.2_r1/xref/hardware/ti/omap4xxx/domx/
http://androidxref.com/4.4.2_r1/xref/hardware/qcom/media/mm-video-v4l2/vidc/
I feel Bellagio could work as a good starting reference if you haven't built an OMX component before. The sources for Bellagio are available on SourceForge.
I have gone through these links and a few other links as well:
khronos
OpenMax_Development_Guide
bellagio_openmax_il_open_source_implementation_enables_developers_to_create
but all of them just explain the calling sequence, show block diagrams, etc.; they don't explain how to write and build an OpenMAX component and plug it into Android. Even the link about Android building and porting is complicated; it doesn't explain whether you need the whole Android source code to write and build an OpenMAX plugin, only part of the Android source code, or whether you can create it without the Android source code at all.
I have a Firefly K3288 board with Android KitKat 4.4, which supports an HEVC hardware decoder, but I want to add an HEVC software decoder.
If anyone knows how to write and build an OpenMAX HEVC video decoder component and plug it into Android, please give some directions.
For the 1st question of how to develop an OMX component: you will have to write a new component, either from scratch or using a template of existing functions. Please do refer to the OMX IL specification, specifically chapter 2.
I would recommend writing a component based on the Bellagio implementation, which can be found here. Please refer to omx_base_video_port.c, as this is essential for your decoder development.
An alternative could be to refer to the implementation from one of the vendors. In the AOSP tree, please refer to the qcom implementation here, which could give you a good reference for starting your development.
Note: the OMX wrapper is mostly concerned with state management, context management, and buffer management. The interaction with your decoder, whether HW or SW, depends on your driver architecture, which you should decide on. Once this driver architecture is finalized, integrating into OMX should be fairly easy.
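To make the shape of such a component concrete, here is a heavily stripped-down sketch of the entry point and a few handlers. Everything here is illustrative: the entry-point name is made up (Bellagio and Android's OMX cores each have their own constructor/registration mechanism), and real handlers would implement the state machine and buffer queues the note above talks about:

```c++
// Skeleton of an OMX IL component (illustrative only). Assumes the Khronos
// OMX IL headers are on the include path.
#include <OMX_Core.h>
#include <OMX_Component.h>

static OMX_ERRORTYPE sendCommand(OMX_HANDLETYPE hComp, OMX_COMMANDTYPE cmd,
                                 OMX_U32 param, OMX_PTR cmdData) {
    // Drive the state machine here (Loaded -> Idle -> Executing, port flush, ...).
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE getParameter(OMX_HANDLETYPE hComp, OMX_INDEXTYPE index,
                                  OMX_PTR params) {
    // Report port definitions, the video_decoder.hevc role, supported profiles, ...
    return OMX_ErrorNotImplemented;
}

static OMX_ERRORTYPE emptyThisBuffer(OMX_HANDLETYPE hComp,
                                     OMX_BUFFERHEADERTYPE* header) {
    // Queue the compressed input buffer towards the decoder.
    return OMX_ErrorNone;
}

static OMX_ERRORTYPE fillThisBuffer(OMX_HANDLETYPE hComp,
                                    OMX_BUFFERHEADERTYPE* header) {
    // Queue the output buffer to be filled with decoded YUV.
    return OMX_ErrorNone;
}

// Hypothetical entry point called by the OMX core when the component is created.
extern "C" OMX_ERRORTYPE MyHevcComponentInit(OMX_HANDLETYPE hComponent) {
    OMX_COMPONENTTYPE* comp = (OMX_COMPONENTTYPE*)hComponent;
    comp->SendCommand     = sendCommand;
    comp->GetParameter    = getParameter;
    comp->SetParameter    = getParameter;   // a real component uses a separate handler
    comp->EmptyThisBuffer = emptyThisBuffer;
    comp->FillThisBuffer  = fillThisBuffer;
    // pComponentPrivate would hold the ports, buffer queues and decoder context.
    comp->pComponentPrivate = nullptr;
    return OMX_ErrorNone;
}
```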
For the 2nd question on how to integrate the hevc decoder, please refer to this question which has relevant details.
There is a 'native-media' project in the NDK samples, which calls OMX functions at the C level to do video decoding and playback, but it seems that the NDK doesn't support encoding through OMX at the moment. Is that true?
Besides, I also found this link. It seems that people are talking about using OMX to do video encoding, but I can't find more information about OMX encoding on Android. Does anyone know about that?
We can do that under certain conditions.
Google doesn't provide an API for a native-media encoder at the moment. If we want to do that, we need to use the code provided by the hardware company, compile it into a .so file, and call its functions from the NDK.
I've found a sample provided by Qualcomm. You can find it at this link. After you download the sample, you will find a user guide and a demo.
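As a rough sketch of the "call into the vendor .so from the NDK" part, runtime loading could look like the following. The library and symbol names (libvendorenc.so, vendor_encode_frame) are made up; the real names come from whatever the hardware vendor ships:

```c++
// Illustrative only: load a hypothetical vendor encoder library at runtime.
#include <dlfcn.h>
#include <cstdio>

// Made-up signature standing in for the vendor's real encode entry point.
typedef int (*EncodeFrameFn)(const unsigned char* yuv, int yuvSize,
                             unsigned char* outBitstream, int outCapacity);

int main() {
    void* handle = dlopen("libvendorenc.so", RTLD_NOW);
    if (!handle) {
        std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    EncodeFrameFn encodeFrame =
            (EncodeFrameFn)dlsym(handle, "vendor_encode_frame");
    if (!encodeFrame) {
        std::fprintf(stderr, "dlsym failed: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    // In a real app this would sit behind a JNI function and be called per frame.
    dlclose(handle);
    return 0;
}
```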
In the Android 4.2 release, I observe that the Miracast implementation mandates that an OMX encoder support a new extension index, "OMX.google.android.index.prependSPSPPSToIDRFrames". However, when I studied the subsequent implementation of MediaCodec, Converter, and WifiDisplaySource, I observed that there is enough support in the existing framework to provide this feature without the need to add another index to the OMX component.
Can someone please confirm if my understanding is correct? If so, can you kindly provide some further information on reasons/rationale behind the same?
Thanks.
I have found the answer to my question in the latest release, Android 4.2.2. As per my earlier question, Google has decided to support both an index inside the OMX component and Stagefright framework-level handling of prepending SPS and PPS to IDR frames. This way an OMX component needn't support the new index: the creation of the component from the ACodec interface doesn't fail, and the framework takes up the responsibility of prepending.
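For reference, components typically expose such string-named extensions through OMX_GetExtensionIndex, mapping the string to a vendor-range index that SetParameter then accepts with a PrependSPSPPSToIDRFramesParams structure (defined in Android's HardwareAPI.h). The sketch below is illustrative only; the index constant is made up:

```c++
#include <OMX_Core.h>
#include <OMX_Index.h>
#include <string.h>

// Illustrative vendor-range index; a real component picks its own value.
static const OMX_INDEXTYPE kIndexPrependSPSPPS =
        (OMX_INDEXTYPE)(OMX_IndexVendorStartUnused + 0x100);

static OMX_ERRORTYPE getExtensionIndex(OMX_HANDLETYPE hComp,
                                       OMX_STRING name,
                                       OMX_INDEXTYPE* index) {
    if (!strcmp(name, "OMX.google.android.index.prependSPSPPSToIDRFrames")) {
        // SetParameter will later be called with this index and a
        // PrependSPSPPSToIDRFramesParams carrying bEnable.
        *index = kIndexPrependSPSPPS;
        return OMX_ErrorNone;
    }
    return OMX_ErrorUnsupportedIndex;
}
```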
Is there documentation explaining the Android Stagefright architecture?
Can I get some pointers on these subjects?
A good explanation of stagefright is provided at http://freepine.blogspot.com/2010/01/overview-of-stagefrighter-player.html.
There is a new playback engine implemented by Google that comes with Android 2.0 (i.e., Stagefright), which seems to be quite simple and straightforward compared with the OpenCORE solution.
MediaExtractor is responsible for retrieving track data and the corresponding metadata from the underlying file system or HTTP stream;
Leveraging OMX for decoding: there are currently two OMX plugins, adapting to PV's software codecs and the vendor's hardware implementations respectively. There is also a local implementation of software codecs which encapsulates PV's decoder APIs directly;
AudioPlayer is responsible for rendering audio; it also provides the timebase for timing and A/V synchronization whenever an audio track is present;
Depending on which codec is picked, a local or remote renderer will be created for video rendering; the system clock is used as the timebase for video-only playback;
AwesomePlayer works as the engine that coordinates the above modules, and is finally connected into the Android media framework through the StagefrightPlayer adapter.
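To make the flow above concrete, here is a rough sketch of the classic decode path, essentially what the stagefright command-line utility does. This is framework-internal API from that era, so it only builds inside the AOSP tree, and error handling and output handling are omitted:

```c++
#include <media/stagefright/DataSource.h>
#include <media/stagefright/MediaExtractor.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/OMXClient.h>
#include <media/stagefright/OMXCodec.h>

using namespace android;

void decodeFirstTrack(const char* path) {
    sp<DataSource> dataSource = DataSource::CreateFromURI(path);
    sp<MediaExtractor> extractor = MediaExtractor::Create(dataSource);

    // Pick the first track for simplicity; a real player walks every track.
    sp<MediaSource> track = extractor->getTrack(0);

    OMXClient client;
    client.connect();

    // OMXCodec selects a matching OMX component for the track's MIME type.
    sp<MediaSource> decoder = OMXCodec::Create(
            client.interface(), track->getFormat(),
            false /* createEncoder */, track);

    decoder->start();
    MediaBuffer* buffer;
    while (decoder->read(&buffer) == OK) {
        // A player would hand the decoded frame to the (local or remote) renderer here.
        buffer->release();
    }
    decoder->stop();
    client.disconnect();
}
```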
Look at this post.
Also, the Android player is built using PacketVideo (PV) Player, and here are the docs about it (beware of the really slow transfer speed :) ):
PVPlayer SDK Developer's Guide link 1, link 2
PVPlayer return codes link
Starting with Gingerbread, it is the Stagefright framework instead of the PV framework. The above link has good info about the framework. If you have some specific questions, I may be able to help you out.
Thanks, Dolphin
I am writing an app which needs to decode an H.264 (AVC) bitstream. I found that AVC codec sources exist in /frameworks/base/media/libstagefright/codecs/avc; does anyone know how one can get access to those codecs in an Android app? I guess it's through JNI, but I'm not clear about how this can be done.
After some investigation, I think one approach is to create my own classes and JNI interfaces in the Android source to enable using the codecs in an Android app.
Another way, which does not require any changes to the Android source, is to include the codecs as a shared library in my application using the NDK. Any thoughts on these? Which way is better (if feasible)?
I didn't find much information about Stagefright; it would be great if anyone could point some out. I am developing on Android 2.3.3.
Any comments are highly appreciated. Thanks!
Stagefright does not support elementary H.264 decoding. However, it does have an H.264 decoder component. In theory, this library could be used, but in reality it will be tough to use as a standalone library due to its dependencies.
The best approach would be to use a JNI-wrapped independent H.264 decoder (like the one available with FFmpeg).
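As a sketch of that approach, a minimal JNI wrapper around libavcodec could look like the following. It assumes an FFmpeg build linked via the NDK and uses the newer send/receive decode API; the Java class and method names (com.example.Decoder.*) are made up for illustration:

```c++
#include <jni.h>
extern "C" {
#include <libavcodec/avcodec.h>
}

static AVCodecContext* gCtx = nullptr;

extern "C" JNIEXPORT jint JNICALL
Java_com_example_Decoder_init(JNIEnv*, jobject) {
    const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec) return -1;
    gCtx = avcodec_alloc_context3(codec);
    return avcodec_open2(gCtx, codec, nullptr) < 0 ? -1 : 0;
}

// Feeds one Annex-B access unit; returns the number of frames decoded from it.
extern "C" JNIEXPORT jint JNICALL
Java_com_example_Decoder_decodeNal(JNIEnv* env, jobject, jbyteArray data) {
    jsize len = env->GetArrayLength(data);
    jbyte* bytes = env->GetByteArrayElements(data, nullptr);

    // Production code should copy into a buffer padded by AV_INPUT_BUFFER_PADDING_SIZE.
    AVPacket* pkt = av_packet_alloc();
    pkt->data = (uint8_t*)bytes;
    pkt->size = len;

    int frames = 0;
    if (avcodec_send_packet(gCtx, pkt) == 0) {
        AVFrame* frame = av_frame_alloc();
        while (avcodec_receive_frame(gCtx, frame) == 0) {
            ++frames;  // frame->data[] now holds the decoded YUV planes
        }
        av_frame_free(&frame);
    }

    av_packet_free(&pkt);
    env->ReleaseByteArrayElements(data, bytes, JNI_ABORT);
    return frames;
}
```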