I have built the ffmpeg library for my Android device from here: https://github.com/appunite/AndroidFFmpeg. But some video files play very slowly (I found that the ones playing very slowly are videos my Android device can play by itself). Here is the build.sh script:
https://github.com/appunite/AndroidFFmpeg/blob/master/FFmpegLibrary/jni/build_android.sh
Maybe this is because of these lines:
--enable-hwaccel=h264_vaapi \
--enable-hwaccel=h264_vaapi \
--enable-hwaccel=h264_dxva2 \
--enable-hwaccel=mpeg4_vaapi \
As I understand it, these lines enable hardware acceleration (the author of that code says this can introduce some bugs). The basic idea of the player is to decode the video and audio streams in native code, then render each video frame into an Android Bitmap and play the audio through Android's MediaPlayer.
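Per frame, the Java side of that rendering loop does roughly this (a sketch; surfaceHolder and frameBitmap are placeholder names for the app's own objects, and the native decoder is assumed to have filled frameBitmap):

    // Rough sketch: blit the Bitmap filled by the native decoder onto a SurfaceView.
    // surfaceHolder and frameBitmap are placeholders.
    android.graphics.Canvas canvas = surfaceHolder.lockCanvas();
    if (canvas != null) {
        canvas.drawBitmap(frameBitmap, 0, 0, null);  // draw the decoded frame
        surfaceHolder.unlockCanvasAndPost(canvas);   // present it on screen
    }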
Does anyone know how to solve the problem of slow video decoding (maybe by decreasing the video frame resolution, or something else)? I would appreciate any help and ideas.
Strange that --enable-hwaccel=h264_vaapi is specified twice in a row, but I see that it's in the original build script that you linked to.
DXVA2 refers to DirectX Video Acceleration, available on Windows desktop computers. So that won't help here. VAAPI refers to Video Acceleration API. I was about to say that it targets only Unix desktops, but the Wikipedia page states that it can also target Android.
The likely reason that the decode is slow is that a software decode path is being taken. What type of video data are you decoding, and at what profile and resolution? Generally, it's best to leverage the Android media facilities, such as MediaPlayer for playback, unless you're doing something special. You have probably already researched this option and perhaps you found that you can't obtain raw AndroidBitmaps (I am not too familiar with Android development).
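For reference, a minimal MediaPlayer sketch, assuming an Activity with a SurfaceView (the layout id "video_surface" and the URL are placeholders):

    // Minimal sketch. Imports: android.media.MediaPlayer,
    // android.view.SurfaceHolder, android.view.SurfaceView.
    SurfaceView surfaceView = (SurfaceView) findViewById(R.id.video_surface);
    surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                MediaPlayer player = new MediaPlayer();
                player.setDataSource("http://example.com/video.mp4"); // placeholder
                player.setDisplay(holder);          // render to the surface
                player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    public void onPrepared(MediaPlayer mp) { mp.start(); }
                });
                player.prepareAsync();
            } catch (java.io.IOException e) {
                e.printStackTrace();
            }
        }
        public void surfaceChanged(SurfaceHolder h, int fmt, int w, int ht) { }
        public void surfaceDestroyed(SurfaceHolder h) { }
    });

This path uses the device's own decoders, which is exactly what makes the "videos my device can play by itself" cases fast.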
I'm looking at the source for both FFmpeg's VAAPI interface and the VAAPI->Android code. If you have FFmpeg compiled for Android, how is it accessing VAAPI? Do you have VAAPI compiled for Android as well? I have a feeling that VAAPI is not a stock component of Android (but again, I'm not sure), so you may need to ensure that VAAPI is in place. Then, are you correctly asking FFmpeg to use VAAPI? I don't think FFmpeg will autodetect this.
Related
I've set up Apache 2.0 with several .m3u8 files serving a set of MPEG-2 TS files over HLS. These .ts files were produced with libavformat by transmuxing an MP4 I downloaded from YouTube. When I play the resulting HLS stream in VLC or QuickTime, everything works fine. But on Android (Stagefright 1.2) the video has several problems:
1) The option to go full-screen does not work
2) The video duration says 1:40 when it is actually 2:00
3) The video sometimes fails to start and you have to reload the page
4) The video reliably distorts (tears and pixelates) at transition points when switching between the underlying .ts streams
Some of this is ameliorated if I don't use HTML5's <video> tag. But problem #4 remains.
I can play other .m3u8 files on Stagefright without any of the above problems, so I assume my transmuxing code is wrong; but even forgoing it and using the (recently added) HLS segmenting features of ffmpeg, I have the same problem. Re-encoding with libx264 changes nothing.
I am at wit's end debugging this.
Android's libstagefright (along with the media service's NuPlayer) is not as mature a product as VLC, and it shows a lot of problems that are not present when using VLC; it is much more vulnerable to any broken, corrupted, or out-of-spec content.
Such pixelation/macroblock artifacts usually appear when some frames were dropped (by the Android code, or lost) before decoding.
If those corruptions appear along with some green fields, it might be a problem with synchronizing a format change to a keyframe (which might be the result of a faulty source implementation, or of the part that notifies ANativeWindow about the format change).
In a corner case you might not get any green frames, but the crop/resolution would be off and pixelation might be visible.
What I would do:
1) Check for frame drops
2) Check the frames at the borders of consecutive segments with a stream analyzer
In Android ICS and later, a new OpenMAX IL API version is in use, making old binary blobs useless/unused. This leads older devices that otherwise run ICS just fine to have broken video playback (YouTube HQ and IMDb, for example), because Android's fallback software decoder is poor compared to what ffmpeg can do on the same device (I tested MX Player + arm6vfp ffmpeg, and a 720p movie played back great).
I am trying to dig through the Android source code to see where and what exactly I could add/replace to allow the ffmpeg library's awesomeness to be used. The problem is I don't know exactly what code is used in, for example, the YouTube app to decode video, or how that's decided.
So I have two options as far as I can tell:
Figure out the current software decoder being used, and try to wrap its external interface around ffmpeg, effectively replacing the slow software decoder currently used. The end result would be a single .so I could push to the device.
Figure out how to trick Android into treating an OMX library based on ffmpeg as a usable codec (I have built one successfully for Android: limoa) and add it somewhere to the list of considered libraries (or better: replace the unusable hardware codec).
As an extension, I'd like to also make camcorder video encoding work through this, so a truly integrated solution would be very much wanted. The question is: how, where, and what? Searching the Android source tree gives numerous hits for "H264" and related terms in many different places. I need the lowest-level and simplest entry point possible, so I can simply wrap the hypothetical decode(buffer) function call to use ffmpeg (libavcodec).
It seems to me that this presentation ("Integrating a Hardware Video Codec into Android Stagefright using OpenMAX IL") is exactly what you'd like to do. Good luck with your project!
I have a requirement where I need to transcode small video clips, shot with the native camera app, to a lower-bitrate/resolution MP4 that is shareable via email etc.
What is the best way to transcode/convert the video on the device itself: FFmpeg, or some other library?
P.S. I know this is overkill for the device, but the client leaves me with no option. He doesn't care about battery or the time it takes. I'm targeting quad-cores, where CPU is not a problem.
Your best bet would be to use something like ffmpeg, which has been ported to Android (see this SO post: ffmpeg for a android (using tutorial: "ffmpeg and Android.mk"), and the ffmpeg port for Android here: http://bambuser.com/opensource). You'll have to use JNI etc., but that will save you the hassle of dealing with the byte stream yourself.
Haven't tried it on Android myself, so YMMV:
Is there a Java API for mp4 files?
http://code.google.com/p/mp4parser/
If you're recording on-device, why not set the expected format from your code? It appears the API lets you set video size, frame rate, etc. in the MediaRecorder class.
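For example, something along these lines (a sketch; the sizes, rates and bitrates your device supports will vary, and the output path is a placeholder):

    // Sketch: record directly at a lower resolution/bitrate so no
    // transcoding is needed afterwards. Values are illustrative only.
    // Import: android.media.MediaRecorder.
    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoSize(640, 480);            // target resolution
    recorder.setVideoFrameRate(24);             // device support varies
    recorder.setVideoEncodingBitRate(1000000);  // ~1 Mbps (API 8+)
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    recorder.setOutputFile("/sdcard/clip.mp4"); // placeholder path
    recorder.prepare();
    recorder.start();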
I know Android doesn't support MJPEG natively, but are there any JAR files/drivers available that can be added to a project to make it possible?
There is a View available to display MJPEG streams:
Android and MJPEG Topic
Hardly, unless it's your Android platform (i.e. you are the integrator of special-purpose devices running Android).
A good place to start looking on how the Android framework handles video streams is here:
http://opencore.net/files/opencore_framework_capabilities.pdf
If you want to cook up something entirely incompatible, I guess you could do that with the NDK, jam ffmpeg into there, and with a bit of luck (and a nightmare supporting different Android devices) you can have it working.
What is the root problem you are trying to solve, perhaps we could work something out.
You can of course write or port software to handle any documented video format; the problem is that you won't have the same degree of hardware-optimized code as the built-in video codecs, and won't have as efficient low-level access to the framebuffer. So your code is likely not to be able to play back at full speed. Sometimes that might be okay, if you just want to get a sense of something. Also, MJPEG compresses frames individually, so it should be trivial to write something that just skips a lot of frames and only decodes whatever fraction of them it can keep up with.
I think that some people have managed to build ffmpeg or mplayer using the optional features of the CPUs in some phones and reach full frame rate for some videos, but it's tricky and device-specific.
I'm probably stating the obvious here, but MJPEG consists simply of multiple JPEGs. If you can grab the frames by cutting out data, you can probably get that data to be displayed as any other image.
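A rough sketch of that idea: scan the byte stream for the JPEG SOI/EOI markers (0xFFD8/0xFFD9) and decode each frame as an ordinary JPEG ("in" is a placeholder for your stream source):

    // Rough sketch: split an MJPEG byte stream on the JPEG SOI/EOI markers
    // and decode each frame like a normal JPEG. "in" is a placeholder for
    // your java.io.InputStream (e.g. from an HttpURLConnection).
    java.io.ByteArrayOutputStream frame = new java.io.ByteArrayOutputStream();
    int prev = -1, cur;
    boolean inFrame = false;
    while ((cur = in.read()) != -1) {
        if (!inFrame) {
            if (prev == 0xFF && cur == 0xD8) {      // SOI: frame starts
                inFrame = true;
                frame.reset();
                frame.write(0xFF);
                frame.write(0xD8);
            }
        } else {
            frame.write(cur);
            if (prev == 0xFF && cur == 0xD9) {      // EOI: frame complete
                byte[] jpeg = frame.toByteArray();
                android.graphics.Bitmap bmp =
                        android.graphics.BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
                // display bmp here; to keep up, you can simply skip decoding
                // some frames, since each one is independent
                inFrame = false;
            }
        }
        prev = cur;
    }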
I couldn't find any information on when exactly this was implemented, but as of now (testing on Android 8) you can view MJPEG stream just fine using a WebView.
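For example (a sketch; the view id and stream URL are placeholders):

    // Sketch: point a WebView directly at the MJPEG stream.
    // Import: android.webkit.WebView.
    WebView webView = (WebView) findViewById(R.id.webview);  // placeholder id
    webView.loadUrl("http://192.168.1.10:8080/stream.mjpeg"); // placeholder URL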
I am working on an Android app that needs to do the following:
- capture an (animated) view to video, including audio (from an MP3 file)
- encode the captured video (probably a bunch of raw image buffers) and audio to AVI.
After searching, FFmpeg seems the most suitable. Does anybody have sample code to accomplish what I need? I would really appreciate it.
It's not clear what you mean by 'a (animated) view' to capture, but be aware that Android apps running with normal permissions cannot access the raw framebuffer. The computation part of ffmpeg builds in the NDK without undue work, and there's a lot you can read about it on the web, but the output (or in your case input) drivers are a bit of a permissions problem. Also, you should expect encoding to be much slower than real time unless you can somehow leverage the hardware acceleration features of your particular device's SoC.
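That said, if the 'view' is one of your own app's Views (rather than the whole screen), you can render it into a Bitmap yourself; a minimal sketch ("view" is a placeholder):

    // Minimal sketch: draw one of your own Views into a Bitmap. This only
    // captures your app's view hierarchy, not the screen.
    android.graphics.Bitmap bitmap = android.graphics.Bitmap.createBitmap(
            view.getWidth(), view.getHeight(), android.graphics.Bitmap.Config.ARGB_8888);
    android.graphics.Canvas canvas = new android.graphics.Canvas(bitmap);
    view.draw(canvas);
    // Repeat per frame and hand the pixel buffers to your encoder
    // (e.g. ffmpeg via JNI).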
If you are building your app for Android, then you can use .avi writer code. You can get this code from the Koders website (search for "Koders site" on Google and you will get the link). I have tested the .avi file writer code and it's working fine.