All,
I am using the MediaCodec class to generate a video/avc video stream. The encoding itself works fine, but I want to use B frames for better compression. Though I have set the profile to AVCProfileHigh, the encoder is not generating B frames; the video stream has only I and P frames.
Below is the profile configuration:
mFormat.setInteger(MediaFormat.KEY_PROFILE, MediaCodecInfo.CodecProfileLevel.AVCProfileHigh);
Does MediaCodec support B frames?
If yes, how do I configure them?
This depends on the device. Android O and P briefly enabled B-frames automatically for AVC High profile encoding, but many apps started crashing because they did not expect out-of-order frames, so it was disabled shortly after launch. MediaMuxer also failed on some stress-test streams with B-frames (e.g. a B-frame referencing a frame more than one second in the future). This is fixed in Android Q, though the back reference is still limited to 30 minutes.
Technically, apps can opt into getting B frames with the MediaFormat.KEY_OUTPUT_REORDER_DEPTH format key (after setting High profile), but support for this is not required of devices, and it is not hooked up in AOSP.
There is no guarantee that all devices will support it, but some devices might.
However, at least in earlier versions of Android, if you set the profile parameter you also had to set the level parameter at the same time, otherwise the profile was ignored. See https://stackoverflow.com/a/26293422/3115956 for more details. The catch with the level parameter is that you need to choose a level high enough to support the chosen resolution and frame rate.
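Putting those pieces together, a minimal sketch of the relevant format keys might look like the following (the level and reorder-depth values are assumptions; pick a level that actually covers your resolution and frame rate):

import android.media.MediaCodecInfo;
import android.media.MediaFormat;

MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
// Set profile and level together; on older releases the profile is
// silently ignored when no level accompanies it.
format.setInteger(MediaFormat.KEY_PROFILE,
        MediaCodecInfo.CodecProfileLevel.AVCProfileHigh);
format.setInteger(MediaFormat.KEY_LEVEL, // constant added in API 23
        MediaCodecInfo.CodecProfileLevel.AVCLevel41);
// Opt-in reordering depth described above; devices are not required
// to support this key, so B frames may still not appear.
format.setInteger(MediaFormat.KEY_OUTPUT_REORDER_DEPTH, 4);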
Related
I'm using libstreaming.
I would like to instantiate two MediaCodecs with different settings and bitrates (one low quality, which will be transmitted via HTTP, and one high quality, to be saved to the SD card).
The problem appears to be that I can't create two separate MediaCodec objects with differing settings.
The high-bitrate version is saved as a video containing nothing but a green background unless the dimensions are set below 352x288; the low-bitrate version, however, is successfully (and correctly) streamed to the web.
I am really hoping that I'm doing something obviously wrong and that there's a simple way to save the HD version of the stream to disk.
In general, this should work on most devices - I do it without a problem on a number of devices.
But there are some devices where the encoder driver has restrictions on this - in particular, some Intel devices refuse to create a second encoder instance while one is active. (The Samsung Galaxy Tab 3 10.1 comes to mind - I'm not sure if all other Intel-based devices have the same issue, or only some of them.)
Unfortunately, even though the Android CTS has tests to ensure that the hardware encoder works, there's no test that guarantees you can have more than one encoder active at the same time.
Does your case fail only if you have differing settings on the second encoder, or also if they have the same settings?
If one stream is of a low resolution, you could try using a SW encoder for that instead, while using the HW encoder for the high resolution version. On Android 6.0, the SW encoder OMX.google.h264.encoder should be quite decent, while on older versions, it's close to unusable.
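As a sketch of that split (codec name and sizes are illustrative; the factory methods throw IOException, elided here):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

MediaFormat avcFormat(int width, int height, int bitrate) {
    MediaFormat f = MediaFormat.createVideoFormat("video/avc", width, height);
    f.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
    f.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    f.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
    f.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    return f;
}

// High-resolution stream on the default (typically hardware) encoder.
MediaCodec hw = MediaCodec.createEncoderByType("video/avc");
hw.configure(avcFormat(1280, 720, 4_000_000), null, null,
        MediaCodec.CONFIGURE_FLAG_ENCODE);

// Low-resolution stream pinned to the AOSP software encoder by name,
// so it doesn't compete for the single hardware encoder instance.
MediaCodec sw = MediaCodec.createByCodecName("OMX.google.h264.encoder");
sw.configure(avcFormat(320, 240, 500_000), null, null,
        MediaCodec.CONFIGURE_FLAG_ENCODE);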
I'm working on an Android app that uses MediaCodec to encode H.264 video using the Surface method. I am targeting Android 5.0 and I've followed all the examples and samples from bigflake.com (I started working on this project two years ago, so I've been through all the gotchas and other issues).
Everything works well on a Nexus 6 (which uses the Qualcomm hardware encoder for this), and I'm able to record flawlessly in real time 1440p video with AAC audio, to a multitude of outputs (from local MP4 files up to HTTP streaming).
But when I try to use the app on a Sony Android TV (running Android 5.1) with a Mediatek chipset, all hell breaks loose even at the encoding level. To be more specific:
It's basically impossible to make the hardware encoder ("OMX.MTK.VIDEO.ENCODER.AVC") work properly. With the most basic setup (which succeeds at MediaCodec's level), I almost never get output buffers out of it, only weird, spammy logcat error messages stating that the driver has encountered an error each time a frame should be encoded, like this:
01-20 05:04:30.575 1096-10598/? E/venc_omx_lib: VENC_DrvInit failed(-1)!
01-20 05:04:30.575 1096-10598/? E/MtkOmxVenc: [ERROR] cannot set param
01-20 05:04:30.575 1096-10598/? E/MtkOmxVenc: [ERROR] EncSettingH264Enc fail
Sometimes, trying to configure it to encode at 360 by 640 pixels will succeed in making the encoder actually encode something, but the first problem I notice is that it only ever creates one keyframe: the first video frame. After that, no more keyframes are created, only P-frames. Of course, the I-frame interval was set to a decent value and works with no issues on other devices. Needless to say, this makes it impossible to create seekable MP4 files, or any kind of streamable solution on top.
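One workaround sometimes worth trying for the missing keyframes - a sketch, not a confirmed fix for this chipset - is requesting a sync frame at runtime via setParameters(), available since API 19 (here encoder is the already-configured MediaCodec instance):

import android.media.MediaCodec;
import android.os.Bundle;

Bundle params = new Bundle();
params.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
encoder.setParameters(params); // asks for a keyframe "soon"; drivers may ignore it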
Most of the time, after releasing the encoder, logcat starts spamming endlessly with "Waiting for input frame to be released...", which basically requires a reboot of the device, since nothing will work from that point on anyway.
In the cases where it doesn't go haywire after a simple release(), no problem - the hardware encoder makes sure it cannot be created a second time, and everything falls back to the generic software Google AVC encoder, which of course is basically a mock-up that does little more than spit out an error when asked to encode anything larger than 160p video...
So, my question is: is there any hope of making the MediaCodec API actually work on such a device? My understanding was that there are CTS tests performed by Google/manufacturers (in this case, Sony) that would allow a developer to assume the API is supported on a device that prides itself on running Android 5.1. Am I missing something obvious here? Has anyone actually tried doing this (a simple MediaCodec video encoding test) and succeeded? It's really frustrating!
PS: it's worth mentioning that not even Sony provides a recording capability for this TV set yet, which many people are complaining about anyway. So my guess is that this is more of a Mediatek problem, but still, what exactly is Android's CTS for in this case?
I created an Android app that records the device screen (using the MediaProjection API) and video from the camera at the same time. I use MediaRecorder in both cases. I need a way to find out whether a device is actually capable of recording two video streams simultaneously. I assume there is some limit on the number of streams that can be encoded simultaneously on a given device, but I cannot find any API on the Android platform to query that information.
Things I discovered so far:
The documentation for MediaRecorder.release() advises releasing the MediaRecorder as soon as possible:
" Even if multiple instances of the same codec are supported, some performance degradation may be expected when unnecessary multiple instances are used at the same time."
This suggests that there's a limit on the number of instances of the codec, which directly limits the number of MediaRecorders.
I wrote test code that creates MediaRecorders (configured to use MPEG4/H264) and starts them in a loop - on a Nexus 5 it always fails with java.io.IOException: prepare failed when calling prepare() on the 6th instance. This suggests you can have only 5 instances of MediaRecorder on the Nexus 5.
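A sketch of that probing loop (illustrative configuration; the Surface video source needs API 21, and real code would also feed the returned surfaces):

import android.media.MediaRecorder;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Create recorders until prepare() fails; the count before the failure
// approximates how many codec instances this device allows.
List<MediaRecorder> recorders = new ArrayList<>();
int count = 0;
try {
    while (true) {
        MediaRecorder r = new MediaRecorder();
        r.setVideoSource(MediaRecorder.VideoSource.SURFACE); // API 21+
        r.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        r.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        r.setVideoSize(640, 480);
        r.setVideoFrameRate(30);
        r.setOutputFile(File.createTempFile("probe" + count, ".mp4").getPath());
        r.prepare(); // throws IOException once the limit is hit
        recorders.add(r);
        count++;
    }
} catch (IOException e) {
    // 'count' now holds the number of instances that prepared successfully.
} finally {
    for (MediaRecorder r : recorders) r.release();
}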
I'm not aware of anything you can query for this information, though it's possible something went into Lollipop that I didn't see.
There is a limit on the number of hardware codec instances that is largely dependent on hardware bandwidth. It's not a simple question of how many streams the device can handle -- some devices might be able to encode two 720p streams but not two 1080p streams.
On some devices the codec may fall back to a software implementation if it runs out of hardware resources. Things will work but will be considerably slower. (I've seen this for H.264 decoding, but I don't know if it also happens for encoding.)
I don't believe there is a minimum system requirement in CTS. It would be useful to know that all devices could, say, decode two 1080p streams and encode one 1080p simultaneously, so that a video editor could be made for all devices, but I don't know if such a thing has been added. (Some very inexpensive devices would struggle to meet that.)
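For what it's worth, later releases did add a per-codec hint: from API 23 on, CodecCapabilities exposes getMaxSupportedInstances(). It is only a hint, since the real limit depends on shared hardware resources:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

for (MediaCodecInfo info :
        new MediaCodecList(MediaCodecList.REGULAR_CODECS).getCodecInfos()) {
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (type.equalsIgnoreCase("video/avc")) {
            int max = info.getCapabilitiesForType(type).getMaxSupportedInstances();
            Log.i("CodecProbe", info.getName() + ": max instances = " + max);
        }
    }
}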
I think it really depends on the device and its RAM capacity... You could read the buffers for the screen and the camera as much as you like, but only one read at a time, not simultaneously - I think to prevent concurrency issues - but honestly, I don't know for sure.
On Android 4.1 and above, I am using the MediaCodec framework to decode H.264 data. I see that the codec instance I'm using (obtained via createDecoderByType) supports multiple color formats. However, it always gives the output in the first color format from its supported list.
Is there a way to force the decoder to output decoded data in a particular color format from the ColorFormats it supports? I know the developer docs mention that the KEY_COLOR_FORMAT key can only be set for encoders, but then help me understand: what is the rationale for decoders advertising multiple supported color formats?
No, there is currently no way to specify the color format for the decoder output.
This is especially annoying on devices that use undocumented proprietary buffer layouts.
Directing the output to a Surface results in more consistent and portable behavior, but as of API 19 there's still no convenient way to get at the pixel data (ImageReader doesn't work with MediaCodec output formats, glReadPixels() can be slow and works in RGB, etc). If you can do what you need with OpenGL shaders then things work pretty well (see e.g. the effects in "show + capture camera").
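A minimal sketch of the Surface route (textureId is assumed to be a GL_TEXTURE_EXTERNAL_OES texture name created on the EGL context that will sample the frames; a real player would take the MediaFormat, including the csd buffers, from MediaExtractor):

import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
Surface surface = new Surface(surfaceTexture);

MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
// Output goes to the Surface; the proprietary buffer layout never
// becomes visible to the app.
decoder.configure(format, surface, null, 0);
// Frames are then sampled in a GLES fragment shader via samplerExternalOES.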
I have an application streaming video from the device to a remote computer. When trying to set the frame rate I keep getting:
ERROR/StagefrightRecorder(131): Failed to set frame rate to 15 fps. The actual frame rate is 30
The code I use is:
video = new MediaStreamer();
video.setVideoSource(MediaRecorder.VideoSource.CAMERA);
video.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
video.setVideoFrameRate(frameRate);
Any ideas on how to fix this?
The encoders usually come from the semiconductor vendor, like TI, Qualcomm, etc. It depends on the encoder whether it honors a frame rate modification request or not. From the app layer, you cannot do much about this. The calls that you are making are the right ones; if the underlying encoder supports it, the frame rate will change, otherwise it won't.
Vibgyor
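As a sanity check from the app layer, you can at least feed the recorder a frame rate the device's own recording profiles advertise, instead of a hardcoded value (a sketch; QUALITY_HIGH is illustrative, and CamcorderProfile reports per-device defaults, not every rate the encoder accepts):

import android.media.CamcorderProfile;
import android.media.MediaRecorder;

CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);

MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setVideoFrameRate(profile.videoFrameRate); // what this device claims to support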
I believe the documentation says that you may or may not be able to set the frame rate from the application layer; it depends on whether the underlying codec gives the app that flexibility. I vaguely remember trying to set the frame rate as low as 3-4 fps, but it still used the default frame rate. I have seen in the Stagefright framework that the frame-rate call is passed down to the codec, and it is then up to the codec whether to honor it.
Vibgyor