In Android ICS and later, a new OpenMAX IL API version is in use, making the old binary codec blobs useless. This leaves older devices that otherwise run ICS just fine with broken video playback (YouTube HQ and IMDb, for example), because Android's fallback software decoder is poor compared to what ffmpeg can do on the same device (I tested MXPlayer with the arm6vfp ffmpeg build, and a 720p movie played back great).
I am trying to dig through the Android source code to see where and what exactly I could add or replace to let ffmpeg's much better decoder be used. The problem is that I don't know exactly what code is used to decode video in, for example, the YouTube app, or how that is decided.
So I have two options as far as I can tell:
Figure out which software decoder is currently being used, and try to wrap its external interface around ffmpeg, effectively replacing the slow decoder. The end result would be a single .so I could push to the device.
Figure out how to trick Android into accepting an OMX library based on ffmpeg (I have built one successfully for Android: limoa) and add it somewhere to the list of considered libraries (or better: replace the unusable hardware codec).
As an extension, I'd also like to make camcorder video encoding work through this, so a truly integrated solution would be very welcome. The question is: how, where, and what? Searching the Android source tree gives numerous hits for "H264" and related terms in many different places. I need the lowest and simplest level possible, so I can simply wrap the hypothetical decode(buffer) function call to use ffmpeg (libavcodec).
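For context, the kind of wrapper I have in mind would look something like this on the Java side. Everything here is hypothetical (the library name, the class, the native method signatures); the real work would happen in native code calling into libavcodec:

```java
// Hypothetical JNI wrapper around an ffmpeg-backed decoder .so.
// All names are placeholders, not an existing API.
public class FfmpegDecoder {
    static {
        System.loadLibrary("avdec_wrapper"); // hypothetical native library
    }

    // set up the native AVCodecContext (codecId as defined by libavcodec)
    public native void init(int codecId, int width, int height);

    // decode one compressed packet; returns bytes written to outFrame, < 0 on error
    public native int decode(byte[] inPacket, int inLen, byte[] outFrame);

    // free all native resources
    public native void release();
}
```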
It seems to me that this presentation ("Integrating a Hardware Video Codec into Android Stagefright using OpenMAX IL") is exactly what you'd like to do. Good luck with your project!
Related
I have built the ffmpeg library for my Android device from here: https://github.com/appunite/AndroidFFmpeg. But some video files play very, very slowly (I found that the slow ones are exactly the videos my Android device can play by itself). Here is the build.sh script:
https://github.com/appunite/AndroidFFmpeg/blob/master/FFmpegLibrary/jni/build_android.sh
Maybe this is because of these lines:
--enable-hwaccel=h264_vaapi \
--enable-hwaccel=h264_vaapi \
--enable-hwaccel=h264_dxva2 \
--enable-hwaccel=mpeg4_vaapi \
As I understand it, these lines enable hardware acceleration (the author of that code says this can introduce some bugs). The basic idea of the player is to decode the video and audio streams in native code, then render each video frame into an Android Bitmap and play the audio through Android's MediaPlayer.
Does anyone know how to solve the problem of slow video decoding (maybe by decreasing the video frame resolution, or something else)? I'd be grateful for any help and ideas.
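For reference, the render-into-a-Bitmap step described above might look roughly like this on the Java side (a sketch with made-up names, not code from AndroidFFmpeg; the pixel data would arrive from the native ffmpeg code via JNI):

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.SurfaceHolder;

// Sketch: draw an ARGB frame produced by a native decoder onto a SurfaceView.
public class FrameRenderer {
    private final Bitmap frame;
    private final int width;
    private final int height;

    public FrameRenderer(int width, int height) {
        this.width = width;
        this.height = height;
        frame = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    }

    // pixels: one ARGB int per pixel, as filled in on the JNI side
    public void render(SurfaceHolder holder, int[] pixels) {
        frame.setPixels(pixels, 0, width, 0, 0, width, height);
        Canvas canvas = holder.lockCanvas();
        if (canvas != null) {
            try {
                canvas.drawBitmap(frame, 0, 0, null);
            } finally {
                holder.unlockCanvasAndPost(canvas);
            }
        }
    }
}
```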
Strange that --enable-hwaccel=h264_vaapi is specified twice in a row, but I see that it's in the original build script that you linked to.
DXVA2 refers to DirectX Video Acceleration, available on Windows desktop computers. So that won't help here. VAAPI refers to Video Acceleration API. I was about to say that it targets only Unix desktops, but the Wikipedia page states that it can also target Android.
The likely reason that the decode is slow is that a software decode path is being taken. What type of video data are you decoding, and at what profile and resolution? Generally, it's best to leverage the Android media facilities, such as MediaPlayer for playback, unless you're doing something special. You have probably already researched this option and perhaps you found that you can't obtain raw AndroidBitmaps (I am not too familiar with Android development).
I'm looking at the source for both FFmpeg's VAAPI interface and the VAAPI->Android code. If you have FFmpeg compiled for Android, how is it accessing VAAPI? Do you have VAAPI compiled for Android as well? I have a feeling that VAAPI is not a stock component of Android (but again, I'm not sure), so you may need to ensure that VAAPI is in place. Then, are you correctly asking FFmpeg to use VAAPI? I don't think FFmpeg will autodetect this.
I'm trying to decode a raw h264 stream on "older" Android versions.
I've tried the MediaPlayer class, and it does not seem to support the stream format.
I can see the stream on other Cam Viewer apps from the market, so I figure there must be a way to do it, probably using the NDK.
I've read about OpenMAX and Stagefright, but couldn't find a working example about streaming.
Could someone please point me in the right direction?
Also, I'm reading in several places about "frameworks/av/include/media/stagefright/MediaSource.h" and other sources, but they don't seem to be in either the regular SDK or the NDK.
Where is this source located? Is there another SDK?
Thanks in advance.
Update: I'm receiving the stream over an RTSP connection.
If you only wish to perform a simple experiment to verify certain functionality, you can consider employing the command-line stagefright utility. Please do keep in mind that your streaming input may not be supported by it.
If you wish to build a more comprehensive player pipeline, you can look at the handling for RTSP as in here, or HTTP as in here. Please note that the NuCachedSource2 implementation is essential for streaming input, as it provides a page-cache implementation that acts as a jitter buffer for the streaming data.
Please note one critical point: the command-line stagefright utility doesn't render to the screen. Hence, if you wish to render, you will have to implement the complete playback pipeline, including rendering.
On a related note, if your input is a streaming input, the standard player implementation does have support for it, as can be observed here. Did you face any issues with that?
As fadden has already pointed out, your work is made far simpler by the introduction of MediaCodec in Android 4.x.
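For anyone who can target 4.1+, a minimal MediaCodec decode loop for a raw H.264 stream looks roughly like the sketch below (error handling omitted; on a real stream you would also feed the SPS/PPS NAL units before the first frame):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

// Sketch of decoding raw H.264 with MediaCodec (API 16+).
public class AvcDecoder {
    private MediaCodec codec;

    public void start(Surface surface, int width, int height) throws Exception {
        codec = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        codec.configure(format, surface, null, 0);
        codec.start();
    }

    // feed one access unit from the network, then render any ready output frame
    public void decode(byte[] accessUnit, long presentationTimeUs) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer in = codec.getInputBuffers()[inIndex]; // API 16 style
            in.clear();
            in.put(accessUnit);
            codec.queueInputBuffer(inIndex, 0, accessUnit.length, presentationTimeUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = codec.dequeueOutputBuffer(info, 0);
        if (outIndex >= 0) {
            codec.releaseOutputBuffer(outIndex, true); // true = render to the Surface
        }
    }
}
```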
You could use a third-party lib like android-h264-decoder, which uses JNI to improve performance. Also have a look at this lib from Intel.
Update: MediaCodec wasn't exposed until API 16 (Android 4.1), so that won't work for a 2.3.3 device.
Stagefright and OpenMAX IL were (and still are) internal components of Android. You can find the code (including MediaSource.h) at https://android.googlesource.com/platform/frameworks/av/+/master. Note that the media framework moved to a separate tree, frameworks/av, only recently; before that it was part of frameworks/base, e.g. https://android.googlesource.com/platform/frameworks/base/+/gingerbread/media/
Is it possible to perform adaptive (multibitrate) streaming onto an Android device? If yes, how to do that?
If you have 4.0 or 3.2, you just access the adaptive stream as you would any other video. Literally.
It's plain HTTP access.
So where you would use //mywebsite/video1.mp4 as a data source, you would use the equivalent //mywebsite/video1.m3u8 instead. Now, I'm not including any discussion of how you create your streaming files, only how you would access them.
All the magic happens within the client (e.g. MediaPlayer, VideoView), supported on 4.0 and 3.2. For the record, you may be able to access and run streaming playlists (.m3u8 files) on earlier versions of Android, because manufacturers have sometimes played around with the code. But I haven't found any that actually adapt: they usually stick with the first segment they start on, or default to the lowest-bitrate variant in the bunch and stay there regardless of available bandwidth.
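Concretely, the client code on 4.0/3.2 is exactly the same as for a progressive file; only the URL changes (using the illustrative URL from above):

```java
import android.net.Uri;
import android.widget.VideoView;

// inside an Activity; assumes a VideoView with id video_view in the layout
VideoView videoView = (VideoView) findViewById(R.id.video_view);
// the same call you would use for video1.mp4 -- the framework handles the playlist
videoView.setVideoURI(Uri.parse("http://mywebsite/video1.m3u8"));
videoView.start();
```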
I'm very new to Android, and I need to work on adding my own codec to it. Can someone please explain the basic steps I need to take to add a new codec to Android?
This is virtually impossible to do in a portable way, as all audio and video codecs are compiled at the platform level (because most of the time they require hardware-specific acceleration).
If you are only interested in this working on a specific hardware platform, and you have an unlocked bootloader (so you can boot a custom build of Android), you can compile the full Android platform from scratch using the AOSP as a base.
Depending on which version of Android you're targeting, you're looking at adding code to either OpenCore or Stagefright (the subsystems Android uses for A/V decoding and parsing). There you can add audio decoders, audio encoders, video encoders, video decoders, and container parsers.
Here is some discussion of adding to Stagefright:
http://freepine.blogspot.com/2010/01/overview-of-stagefrighter-player.html
http://groups.google.com/group/android-porting/msg/5d88e76845a22bbb
However, unless the encoding scheme you wish to support is very simple (what are you wanting to add?), it is likely to be too CPU-intensive for most Android devices to run without being able to offload some of the work to another system (like the radio chipset or the GPU).
In the Android framework, MediaCodec is implemented on top of Stagefright, a media playback engine at the native level that has built-in software codecs for popular media formats. Stagefright also supports integration with custom hardware codecs provided as OpenMAX components. Here is an article that summarizes the steps.
I know Android doesn't support MJPEG natively but are there any jar files/drivers available that can be added to a project to make it possible?
There is a View available to display MJPEG streams:
Android and MJPEG Topic
Hardly, unless it's your Android platform (i.e. you are the integrator of special-purpose devices running Android).
A good place to start looking on how the Android framework handles video streams is here:
http://opencore.net/files/opencore_framework_capabilities.pdf
If you want to cook up something entirely incompatible, I guess you could do that with the NDK, jam ffmpeg in there, and with a bit of luck (and a nightmare of supporting different Android devices) get it working.
What is the root problem you are trying to solve? Perhaps we could work something out.
You can of course write or port software to handle any documented video format; the problem is that you won't have the same degree of hardware-optimized code as the built-in video codecs, and you won't have as efficient low-level access to the framebuffer. So your code is likely not to manage full-speed playback. Sometimes that might be okay, if you just want to get a sense of something. Also, MJPEG compresses frames individually, so it should be trivial to write something that skips a lot of frames and decodes only whatever fraction of them it can keep up with.
I think some people have managed to build ffmpeg or mplayer using the optional CPU features of some phones and reach full frame rate for some videos, but it's tricky and device-specific.
I'm probably stating the obvious here, but MJPEG consists simply of multiple JPEGs. If you can grab the individual frames by splitting the stream at the JPEG boundaries, you can probably get that data displayed like any other image.
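As a sketch of that idea: each JPEG starts with the SOI marker (FF D8) and ends with EOI (FF D9), so a naive reader can split the stream on those markers and hand each frame to BitmapFactory (a real MJPEG-over-HTTP stream also carries multipart boundaries and headers that a robust reader would honor):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Naive MJPEG frame splitter: scans for JPEG SOI/EOI markers and decodes
// each complete frame with BitmapFactory.
public class MjpegFrameReader {
    public static Bitmap readFrame(InputStream in) throws IOException {
        ByteArrayOutputStream frame = new ByteArrayOutputStream();
        boolean inFrame = false;
        int prev = in.read();
        int cur;
        while ((cur = in.read()) != -1) {
            if (!inFrame && prev == 0xFF && cur == 0xD8) { // start of image
                inFrame = true;
                frame.write(0xFF);
            }
            if (inFrame) {
                frame.write(cur);
                if (prev == 0xFF && cur == 0xD9) {         // end of image
                    byte[] bytes = frame.toByteArray();
                    return BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                }
            }
            prev = cur;
        }
        return null; // stream ended mid-frame
    }
}
```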
I couldn't find any information on when exactly this was implemented, but as of now (tested on Android 8) you can view an MJPEG stream just fine using a WebView.
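For what it's worth, the WebView route is just a matter of pointing it at the stream URL (the URL below is illustrative):

```java
import android.webkit.WebView;

// assumes a WebView with id web_view in the layout; the camera URL is made up
WebView webView = (WebView) findViewById(R.id.web_view);
webView.loadUrl("http://192.168.0.10:8080/video.mjpeg");
```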