Accessing Native Codecs in Android 3.x and Above - android

I would like to use the native decoder for a custom video player. VideoView and MediaPlayer do not provide the functionality my requirements call for.
I am using FFmpeg (a software decoder) right now, but I would prefer to use native hardware decoding if possible. Is there a way to do this through the NDK?

There is currently no public API for accessing the native hardware codecs, and the presence of such hardware isn't guaranteed anyway. While you could go digging into the DSP capabilities of the ARM processors used by some devices, that approach wouldn't be portable across Android devices.
I would recommend sticking with your software approach for guaranteed support on all devices.

Related

Why does WebRTC support H264 in Chrome but not in native applications on some devices?

When I use the official sample to create an offer SDP in Android Chrome, I can find a=rtpmap:100 H264/90000, which means it supports H264.
But if I build AppRTC (the official Android sample) with the official prebuilt libraries, version 1.0.25821, call createOffer, and receive the SDP in SdpObserver::onCreateSuccess, the SDP does not contain H264.
My test device is an Oppo R15 (MTK Helio P60, Android 8.1).
So why does WebRTC support H264 in Chrome but not in a native application on some Android devices?
The Chrome build uses OpenH264, which is not used by regular WebRTC. By regular I mean the stock library; there is a variant from the Chrome build with a software H.264 encoder which you could use, but I wouldn't recommend it.
On Android, WebRTC supports H.264 only if:
1. the device hardware supports it, AND
2. the WebRTC hardware-encoder glue logic supports that hardware encoder. Currently only QCOM and EXYNOS devices are supported, so on any other device, even one with an H.264 hardware encoder, the encoder won't be used, won't be added to the codec factory, and won't appear in the SDP generated by the WebRTC sample apps.
At the Java level, you can see this in HardwareVideoEncoderFactory.java, which checks for QCOM and EXYNOS devices in the isHardwareSupportedInCurrentSdkH264 function (see the sketch below).
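For illustration, here is a condensed paraphrase of that check. It is a sketch from memory of the WebRTC source, not a verbatim copy; the blacklisted models, prefixes, and SDK-version gates vary between releases.

```java
import java.util.Arrays;
import java.util.List;
import android.media.MediaCodecInfo;
import android.os.Build;

// Condensed paraphrase of the check in WebRTC's HardwareVideoEncoderFactory.java;
// exact details vary by release.
final class H264HardwareCheck {
    // Models whose H.264 hardware encoder is known to misbehave (illustrative list).
    private static final List<String> H264_HW_EXCEPTION_MODELS =
            Arrays.asList("SAMSUNG-SGH-I337", "Nexus 7", "Nexus 4");

    static boolean isHardwareSupportedInCurrentSdkH264(MediaCodecInfo info) {
        if (H264_HW_EXCEPTION_MODELS.contains(Build.MODEL)) {
            return false;
        }
        String name = info.getName();
        // Only Qualcomm (from KitKat) and Exynos (from Lollipop) encoders pass;
        // everything else is rejected regardless of its actual capabilities.
        return (name.startsWith("OMX.qcom.")
                        && Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT)
                || (name.startsWith("OMX.Exynos.")
                        && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP);
    }
}
```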
Interestingly, if you are using native code, even the QCOM and EXYNOS hardware encoders are not supported (there is a bug filed on the WebRTC issue tracker). This is because of the tight integration of the HW encoding code with the JNI code; definitely not good modular code.

How to do hardware H.264 video encoding on the Android platform

I'm trying to do hardware H.264 video encoding on the Android platform. I've learned that MediaCodec seems to support hardware video decoding, but does it also support hardware video encoding?
Some Google search results suggest I should look for different solutions for different chips depending on the user's device. Does this mean I should go to each chip vendor's website and search for a separate solution?
Thanks for any advice.
The MediaCodec class also supports video encoding. It was explicitly designed for multi-device, hardware-accelerated media processing, so that the same code runs on every device (from experience I can tell you it won't). A minimal encoder-setup sketch follows at the end of this answer.
Good reading on this topic: http://developer.android.com/reference/android/media/MediaCodec.html
http://bigflake.com/mediacodec/
Remember, MediaCodec's min SDK version is 16 (I recommend targeting API 18+, e.g. for Surface input and the MediaMuxer class), so if you're targeting devices with API < 16, MediaCodec won't do. If you want to target those devices you'll have to use libstagefright and OpenMAX, which I do not recommend.
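As a minimal sketch of the encoder setup (assuming API 18+ for Surface input; the resolution and bitrate values are arbitrary placeholders, and whether the returned codec is hardware-backed depends on the device):

```java
import java.io.IOException;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Minimal sketch: configure a (typically hardware-backed) H.264 encoder.
final class AvcEncoderSetup {
    static MediaCodec createAvcEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // API 18+
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Frames are fed by rendering into this surface (camera or GL);
        // encoded output is drained from the codec's output buffers.
        Surface inputSurface = encoder.createInputSurface(); // API 18+
        encoder.start();
        return encoder;
    }
}
```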

RTP-Server in Android 4.0 and above

Working on Android 4.0 and above.
I am in the process of analyzing ways to live-stream my camera video to a Windows PC using RTP, encoded as MPEG-2.
1. Is there a readily available "rtp-server" in Android 4.0+?
2. Is the following true: "The Android platform lacks support for streaming protocol, which makes it difficult to stream live audio / video to Android enabled devices." (extracted from a website)?
3. So far I have analyzed and used ffserver from the FFmpeg libraries, but the FPS is below 5, which is far too slow. Has anyone explored another solution with a higher FPS?
4. Has anybody tried using Stagefright for the same, i.e. capturing raw data from the camera, sending it to the Stagefright framework for encoding, and then streaming it using RTP?
Many thanks.
The answers to your questions are below. Though the links relate to Android 4.2.2, the same is true for Android 4.0.
1. Yes, there is an RTP transmitter available. You could look at this example in MyTransmitter as a starting point, or consider using the standard recorder as in startRTPRecording. (A sketch of the app-level variant follows this list.)
2. You can stream data via RTP from an Android device to an external sink, or you could have a different use case as in Miracast, a.k.a. Wi-Fi Display. However, streaming from one Android device to another through Wi-Fi Direct is still not completely enabled; that statement mainly comes from the Miracast scenario.
3. You can use the standard Android software, which is capable of high-resolution recording and transmission. This depends mainly on the underlying hardware, as the overhead from the software stack is not very high.
4. Yes. This is already answered in Q1 above.
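To illustrate the app-level route hinted at in Q1, here is a hedged sketch: MediaRecorder writes its output into a pipe instead of a file, and the app reads the encoded stream and packetizes it into RTP itself. This is the approach taken by third-party libraries such as libstreaming, not the platform's internal ARTPWriter path, and H.264 is used here purely for illustration. A real implementation also needs a camera preview surface, permissions, and code that strips the 3GP/MP4 container down to elementary-stream NAL units.

```java
import java.io.IOException;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

// Hedged sketch: route MediaRecorder output into a pipe so the app can
// read the encoded stream and packetize it into RTP itself.
final class RecorderPipeSketch {
    static ParcelFileDescriptor startRecorderIntoPipe() throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write end
        recorder.prepare();
        recorder.start();
        // Read from the returned descriptor on a worker thread, extract
        // NAL units, and send them as RTP packets over a DatagramSocket.
        return pipe[0]; // read end
    }
}
```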

Support for OMX interface in Android StageFright software codecs

Do Android software codecs (for example OMX.PV.mpeg4enc) support the OpenMAX (OMX) interface? For example, do they support standard OMX functions like OMX_FillThisBuffer, OMX_EmptyThisBuffer, etc.? Or do only hardware decoders support the OMX interface?
Yes. OMX.PV.mpeg4enc is an OpenMAX component, but it is no longer used in Android.
It is a software codec, provided by the PacketVideo company. Chipset vendors like TI and Qualcomm provide hardware codecs as OpenMAX IL components.
So yes, both software and hardware codecs support (and must support) the OMX interface in order to be used by the media framework on Android; a quick way to see what is registered is sketched below.
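As a quick illustration (assuming API 21+ for MediaCodecList.ALL_CODECS), the components the framework has registered can be listed from Java. Software codecs typically appear with an OMX.google.* prefix and hardware codecs with a vendor prefix such as OMX.qcom.* or OMX.Exynos.*:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

// Sketch (API 21+): enumerate the codec components registered with the
// media framework. The name reveals the OMX component behind each codec,
// e.g. OMX.google.h264.decoder (software) or OMX.qcom.video.decoder.avc.
final class CodecLister {
    static void logRegisteredCodecs() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.ALL_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            Log.i("CodecLister",
                    (info.isEncoder() ? "encoder: " : "decoder: ") + info.getName());
        }
    }
}
```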
OMX.PV.mpeg4enc is a software codec. An OpenMAX component is essentially a wrapper around multimedia functions; in my understanding, its role is that of a decoder or encoder.
From that point of view it can be understood as a codec, so it supports OMX.

Add a new codec to Android?

I'm very new to Android. I now need to work on adding my own codec to Android; that is, I want to know what steps I need to take to add my own codec.
I'm very fresh at this and need some basics, so can someone please explain the steps I need to take in order to add a new codec to Android?
This is virtually impossible to do in a portable way, as all audio and video codecs are compiled at the platform level (due to the fact that most of the time they require hardware-specific acceleration).
If you are only interested in this working on a specific hardware platform and have an unlocked bootloader (so you can boot a custom build of Android), you can compile the full Android platform from scratch using the AOSP as a base.
Depending on which version of Android you're targeting, you're looking at adding code to either OpenCore or Stagefright (the subsystems Android uses for A/V decoding and parsing). There you can add audio decoders, audio encoders, video encoders, video decoders, and container parsers.
Here is some discussion of adding to Stagefright:
http://freepine.blogspot.com/2010/01/overview-of-stagefrighter-player.html
http://groups.google.com/group/android-porting/msg/5d88e76845a22bbb
However, unless the encoding scheme you wish to support is very simple (what do you want to add?), it is likely to be too CPU-intensive for most Android devices to run without offloading some of the work to another system (like the radio chipset or the GPU).
In the Android framework, MediaCodec is implemented on top of Stagefright, a media playback engine at the native level with built-in software codecs for popular media formats. Stagefright also supports integration with custom hardware codecs provided as OpenMAX components. Here is an article which summarizes the steps.
CHECK HERE
