I am developing a player on Android using FFmpeg. However, I found that avcodec_decode_video2 is very slow. Sometimes it takes about 0.1, or even 0.2, seconds to decode a single frame of a 1920 × 1080 video.
How can I improve the speed of avcodec_decode_video2()?
If your device has the necessary hardware and firmware capabilities, you could use ffmpeg with libstagefright support.
Update: here is a simple procedure to decide whether it is worthwhile to switch to libstagefright on your device for a given class of videos. Use ffmpeg on your PC to copy a representative video stream into an MP4 container:
ffmpeg -i your_video -an -vcodec copy test.mp4
and try to open the resulting file with the stock video player on your device. If the video plays with reasonable quality, you can use libstagefright with ffmpeg to improve your player app. If you see "Cannot Play Video", your device's hardware and firmware do not support the video.
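If your ffmpeg build was configured with stagefright support, you would then select that decoder explicitly instead of the software one. A minimal sketch, assuming the decoder is registered under the name libstagefright_h264 (as in builds configured with --enable-libstagefright-h264):

/* Sketch: prefer the stagefright-backed H.264 decoder when it is compiled in.
 * The decoder name is an assumption based on stagefright-enabled builds. */
AVCodec *codec = avcodec_find_decoder_by_name("libstagefright_h264");
if (!codec) {
    /* not available in this build - fall back to the software decoder */
    codec = avcodec_find_decoder(AV_CODEC_ID_H264);
}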
That sounds about right. HD video takes a lot of CPU. Some codecs support multithreaded decoding if your device has multiple cores (a sketch follows below), but that will consume a massive amount of battery and heat up the device. This is why most mobile devices use specialized hardware decoders instead of the CPU. On Android, using the MediaCodec API instead of libavcodec should invoke the hardware decoder.
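For the multithreaded-decoding route, here is a minimal libavcodec sketch, assuming a multi-core device; the thread count is illustrative and error handling is omitted:

/* Sketch: request frame-level multithreaded decoding from libavcodec. */
AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
AVCodecContext *ctx = avcodec_alloc_context3(codec);

ctx->thread_count = 4;               /* e.g. one worker thread per CPU core */
ctx->thread_type  = FF_THREAD_FRAME; /* FF_THREAD_SLICE is the other option */

if (avcodec_open2(ctx, codec, NULL) < 0) {
    /* handle error */
}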
Related
I want to convert all H.265 videos to H.264 and at the same time reduce the resolution to, for example, 720p, to avoid working with very large resolutions and later uploading such large files.
The docs at https://developer.android.com/guide/topics/media/media-formats say that for H.265 the Android OS supports only decoding, not encoding.
I know that FFmpeg would solve all my problems, but including FFmpeg increases the app size considerably, which I'd like to avoid. I am currently trying to use the Android MediaCodec API, but it looks like it works fine for converting H.264 to H.264, yet not for H.265 to H.264.
Do you have any ideas? I don't need to support old Android versions.
Thanks for any advice.
If you're compiling for Android 12 or higher, you can use the built-in media transcoder.
Otherwise, you'll need to include a third-party media transcoding library, and FFmpeg is still your best choice.
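If you do stay with MediaCodec, the usual approach is to wire a hardware H.265 decoder to a hardware H.264 encoder and shuttle the decoded frames between them. Below is a minimal NDK (C) sketch of the setup only, assuming the device exposes both codecs; the buffer loop, AMediaExtractor/AMediaMuxer plumbing, and error handling are omitted, and the numbers are illustrative:

#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>

/* Sketch: hardware HEVC decoder feeding an AVC (H.264) encoder at 720p. */
AMediaCodec *decoder = AMediaCodec_createDecoderByType("video/hevc");
AMediaCodec *encoder = AMediaCodec_createEncoderByType("video/avc");

AMediaFormat *encFmt = AMediaFormat_new();
AMediaFormat_setString(encFmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
AMediaFormat_setInt32(encFmt, AMEDIAFORMAT_KEY_WIDTH, 1280);
AMediaFormat_setInt32(encFmt, AMEDIAFORMAT_KEY_HEIGHT, 720);
AMediaFormat_setInt32(encFmt, AMEDIAFORMAT_KEY_BIT_RATE, 4000000);
AMediaFormat_setInt32(encFmt, AMEDIAFORMAT_KEY_FRAME_RATE, 30);
AMediaFormat_setInt32(encFmt, AMEDIAFORMAT_KEY_I_FRAME_INTERVAL, 1);

AMediaCodec_configure(encoder, encFmt, NULL /* or an input Surface */, NULL,
                      AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
/* configure the decoder with the track format read via AMediaExtractor,
 * AMediaCodec_start() both codecs, then move buffers decoder -> encoder */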
I have recently used an FFmpeg library for Android to compress a video that is 10 seconds long and nearly 25 MB in size. These are the commands I tried:
ffmpeg -i /video.mp4 -vcodec h264 -b:v 1000k -acodec mp2 /output.mp4
OR
ffmpeg -i input.mp4 -vcodec h264 -crf 20 output.mp4
Both of the commands were too slow. I canceled the task before it completed because it was taking too much time; it took more than 8 minutes to process JUST 20% of the video. Time is really critical for me, so I can't opt for ffmpeg. I have the following questions:
Is there something wrong with the commands, or is ffmpeg just slow?
If it is slow, is there any other well-documented and reliable way/library for video compression that I can use on Android?
Your file is in an MP4 container and already has its streams encoded with some codec.
The size of any container (not specifically MP4) depends on what kind of compression (loosely, which codec) is used for the data. This is why you will see different sizes for the same content in different formats.
There are other parameters which can affect the size of the file, i.e. frame rate, resolution, audio bit rate, etc. If you reduce them, the file becomes smaller; e.g. on YouTube you can choose to play a video at a lower quality when bandwidth is an issue.
However, if you choose to do this you will have to re-process the entire file, and that is going to take a lot of time, since you are demuxing the container, decoding the streams, applying filters (reducing the frame rate, scaling, etc.), re-encoding, and then remuxing. This entire process is not worth a few extra MB of savings unless you have some compelling use case.
One solution is to use a more powerful machine, but again this is limited by the architecture and constraints of the application and its ability to make use of a more powerful machine. To answer specifically for ffmpeg: it won't make much difference.
I am searching for a library which offers the ability to stream video from an Android device (5.1+) and record it at the same time.
I tried MediaRecorder - the usual way to record videos on Android - but with it I am not able to stream over WebRTC or RTSP because the camera is busy.
Currently I am using libstreaming. With a little modification the app can record and stream over RTSP concurrently. But this lib lacks support for the hardware codecs in MTK and SPRG chipsets.
I wonder if you can recommend a solution or another library.
At the moment the lib works only on a Nexus 4 with a Qualcomm chipset.
After several days of research, I came to the decision to use a combination of FFmpeg and MediaCodec.
It seems that the only way to get frames from the camera at a high rate is to use the Android MediaCodec API. But MediaCodec supports only the MP4 file format, which is not an option for me (I need TS), while FFmpeg can process/create practically any known video format.
Currently I am trying to make them work together (read the ByteBuffer from MediaCodec and feed the FFmpeg recorder with it); a sketch of the FFmpeg muxing side follows after the links below.
Useful links:
Grafika project: https://github.com/google/grafika
ContinuousCapture and Show + record are the most interesting parts to check
javacpp (specifically its FFmpeg wrapper): https://github.com/bytedeco/javacpp
Has examples of recording and streaming.
kickflip sdk: https://github.com/Kickflip/kickflip-android-sdk
The library which makes the two tools mentioned above work together; it is also open source. Sadly, it doesn't solve my problem fully. The feature I need has been requested but not yet implemented: https://github.com/bytedeco/javacv/issues/95
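For context on the FFmpeg side of that combination: what the recorder ultimately has to do is wrap MediaCodec's encoded H.264 buffers into a TS container. A minimal libavformat sketch (in C, ignoring the JavaCV wrapper; names and numbers are placeholders, error handling omitted):

#include <libavformat/avformat.h>

/* Sketch: mux already-encoded H.264 buffers (e.g. copied out of MediaCodec's
 * output ByteBuffers) into an MPEG-TS file with libavformat. */
static void write_ts(const char *path, uint8_t *encoded_buf, int encoded_size, int64_t pts)
{
    AVFormatContext *oc = NULL;
    avformat_alloc_output_context2(&oc, NULL, "mpegts", path);

    AVStream *st = avformat_new_stream(oc, NULL);
    st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    st->codecpar->codec_id   = AV_CODEC_ID_H264;
    st->codecpar->width      = 1280;   /* illustrative */
    st->codecpar->height     = 720;

    avio_open(&oc->pb, path, AVIO_FLAG_WRITE);
    avformat_write_header(oc, NULL);

    /* in a real recorder this runs once per buffer pulled from MediaCodec */
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.stream_index = st->index;
    pkt.data = encoded_buf;
    pkt.size = encoded_size;
    pkt.pts  = pkt.dts = pts;          /* rescaled into st->time_base */
    av_interleaved_write_frame(oc, &pkt);

    av_write_trailer(oc);
    avio_closep(&oc->pb);
    avformat_free_context(oc);
}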
Why is recording a 1080p, H.264-encoded video in an Android camera application realtime fast, while encoding a video of the same size on Android using FFmpeg is slow?
I know FFmpeg is a software-level encoder and it won't make use of any hardware features.
I know camera applications get buffer data directly from the camera driver.
But where exactly does the difference happen?
Why is the camera application realtime fast?
Does it use the GPU and OpenGL features of the phone to encode the video, making it realtime fast?
Both the camera application and FFmpeg run on the same phone, yet the camera still encodes H.264 in realtime?
I know FFmpeg is a software-level encoder and it won't make use of any hardware features.
You have basically answered this question for yourself. Many devices have hardware codecs that don't rely on the usual CPU instructions for any encoding. FFmpeg won't take advantage of these. (I believe there are hardware optimizations you can build into FFmpeg, though I am not sure of their availability on Android.)
FFmpeg does support NEON optimisations by default on ARM platforms, so the difference is not very visible at resolutions like QVGA or VGA. But the on-chip hardware for encoding video is much faster at higher resolutions like 1080p, while using hardly any ARM cycles. Note that the video encoders use different hardware than the OpenGL engines.
ffmpeg may use the optional x264 encoder if configured this way; note that this has dire licensing implications. x264 is very good and efficient, and when it is built to use sliced multithreading, it can achieve 25 FPS for WVGA video on modern devices like the Samsung S4 (see the sketch after this answer).
ffmpeg can be compiled with libstagefright, which makes use of the built-in hardware decoder, but unfortunately it does not include an encoder.
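To illustrate the sliced-multithreading point, here is a hedged libavcodec sketch, assuming ffmpeg was built with libx264; the thread count is illustrative and error handling is omitted:

/* Sketch: ask the FFmpeg libx264 wrapper for sliced multithreading and a fast preset. */
AVCodec *enc = avcodec_find_encoder(AV_CODEC_ID_H264);
AVCodecContext *enc_ctx = avcodec_alloc_context3(enc);

enc_ctx->thread_count = 4;               /* e.g. one thread per CPU core */
enc_ctx->thread_type  = FF_THREAD_SLICE; /* mapped to x264's sliced-threads mode */

AVDictionary *opts = NULL;
av_dict_set(&opts, "preset", "ultrafast", 0); /* trade compression efficiency for speed */

if (avcodec_open2(enc_ctx, enc, &opts) < 0) {
    /* handle error */
}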
I also met this problem; it bothered me for a long time. I solved it like this:
AVDictionary *param = 0;
// H.264
if (pCodecCtx->codec_id == AV_CODEC_ID_H264) {
    // av_dict_set(&param, "preset", "slow", 0);
    /*
     * ultrafast, superfast, veryfast, faster, fast, medium,
     * slow, slower, veryslow, placebo - this is the x264 encoding speed parameter
     */
    av_dict_set(&param, "preset", "superfast", 0);
    av_dict_set(&param, "tune", "zerolatency", 0);
}
if (avcodec_open2(pCodecCtx, pCodec, &param) < 0) {
    loge("Failed to open encoder!\n");
    return -1;
}
You need to set the preset to superfast or ultrafast.
What formats of video file are supported in the Android emulator?
I understand that it probably won't play in real time, but which ones will play at all?
The secret is that the emulator will play the H.264 baseline profile, while real devices will also play higher profiles.
In order to get a video file that plays properly in the emulator, try these settings:
ffmpeg -i inputvideo.wmv -vcodec libx264 -vprofile baseline outputvideo.mp4
It supports H.263 encoding and decoding, and H.264 AVC and MPEG-4 SP decoding only.
On an emulator the playback may be slow or laggy.
Check out the chart of all supported media formats for more information.