In my Android app, I need to encode live camera video on the GPU. Here is some relevant code:
MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, WIDTH, HEIGHT);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, BITRATE);
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
this._encoder = MediaCodec.createEncoderByType(MIME_TYPE);
this._encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface s = this._encoder.createInputSurface();
When MIME_TYPE is MediaFormat.MIMETYPE_VIDEO_AVC, this code works as expected. However, when the MIME type is MediaFormat.MIMETYPE_VIDEO_VP8, the call to createInputSurface() throws an IllegalStateException. Here is some more info from logcat:
I/ACodec: setupVideoEncoder succeeded
E/OMXNodeInstance: OMX_GetExtensionIndex OMX.google.android.index.storeMetaDataInBuffers failed
E/ACodec: [OMX.google.vpx.encoder] onCreateInputSurface returning error -2147483648
W/MediaCodec: createInputSurface failed, err=-2147483648
I believe the VP8 encoder is present on the device; otherwise, MediaCodec.configure() would have failed. I'd appreciate your insight into why createInputSurface() fails. Regards.
My tests were on a device running Android 4.3. It seems this problem was fixed in 4.4 and up, per this discussion: https://code.google.com/p/android/issues/detail?id=58834
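If you still need to support 4.3, one possible workaround is to probe at runtime and fall back to AVC when VP8 surface input isn't available. A minimal sketch, assuming the same WIDTH/HEIGHT/BITRATE/FRAME_RATE/IFRAME_INTERVAL constants as above and that the rest of your pipeline can handle either MIME type:
// Sketch: try VP8 with surface input first, fall back to AVC on devices
// (e.g. Android 4.3) where createInputSurface() fails for the VP8 encoder.
private Surface createEncoderInputSurface() {
    String[] mimeTypes = { "video/x-vnd.on2.vp8", "video/avc" };
    for (String mime : mimeTypes) {
        MediaCodec encoder = null;
        try {
            MediaFormat format = MediaFormat.createVideoFormat(mime, WIDTH, HEIGHT);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, BITRATE);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
            encoder = MediaCodec.createEncoderByType(mime);
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface surface = encoder.createInputSurface();  // throws on the broken VP8 path
            this._encoder = encoder;
            return surface;
        } catch (Exception e) {
            if (encoder != null) encoder.release();  // clean up and try the next MIME type
        }
    }
    throw new IllegalStateException("No surface-input encoder available");
}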
Related
I implemented a video encoder with the MediaCodec API to encode 2560x720 video. I verified it on several devices, but I ran into a configuration problem on the Samsung Note 2 (Android 4.4.2). The following is my code to prepare the codec instance:
Format = MediaFormat.createVideoFormat("video/avc", 2560, 720);
Format.setInteger(MediaFormat.KEY_BIT_RATE, (int) (2560 * 720 * 20 * 0.5f));
Format.setInteger(MediaFormat.KEY_FRAME_RATE, RefreshRate);
Format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
Format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
try {
    Codec = MediaCodec.createEncoderByType("video/avc");
    Codec.configure(Format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    CodecFound = true;
} catch (Exception e) {
    CodecFound = false;
    if (DEBUG) Log.e(CodecTag, "Failed to find available codec", e);
}
No exception is thrown in this try-catch block. But after the codec starts, neither encoded data nor the output format comes out of the encoder even though the input buffers are being filled (dequeueOutputBuffer always returns -1). The following is the log after starting the codec, where some errors are reported:
05-04 11:29:17.071: E/MFC_ENC(1006): SOMXVE_Init: isSEIOn : 0
05-04 11:29:17.071: E/MFC_ENC_APP(1006): SsbSipMfcEncInit] IOCTL_MFC_ENC_INIT failed
05-04 11:29:17.111: A/libc(1006): Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1), thread 1403 (AvcEncBufPrc)
05-04 11:29:19.326: E/VideoCodecInfo(1353): Failed to get track format due to codec error
05-04 11:29:25.606: E/ACodec(1353): OMX/mediaserver died, signalling error!
05-04 11:29:25.606: E/MediaCodec(1353): Codec reported an error. (omx error 0x8000100d, internalError -32)
05-04 11:29:25.606: E/AudioService(2462): Media server died.
05-04 11:29:25.606: E/Camera(1353): Error 100
I finally found these commands to get the supported media profiles from the Note 2:
adb shell cat /system/etc/media_codecs.xml
adb shell cat /system/etc/media_profiles.xml
According to the profile list, the maximum resolution the h264 encoder can support is 1920x1080:
<VideoEncoderCap name="h264" enabled="true"
minBitRate="64000" maxBitRate="20000000"
minFrameWidth="176" maxFrameWidth="1920"
minFrameHeight="144" maxFrameHeight="1080"
minFrameRate="1" maxFrameRate="30" />
My questions:
Why can the codec be configured successfully with an unacceptable format? Is that a bug, or a limitation related to the API level?
I heard that there are APIs for checking this kind of capability starting from Android 5.0, but I must support 4.4 in my application. Is there any other way to obtain detailed capability info at run time (other than parsing media_profiles.xml)?
Any help would be really appreciated.
The mediaserver process crashed, apparently due to a null pointer dereference. This process manages all interaction with the media codec hardware. A crash like this indicates a bug in the platform, possibly in the OEM's driver implementation.
The configure() call should fail if the resolution is not supported. My guess is that this particular implementation has a faulty check.
I don't think there's a way to check whether a particular width x height pair works in Android 4.4. You can generally rely on common resolutions like 720p, but some sizes behave strangely, especially non-multiples of 16.
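For reference, the API 21+ check mentioned in the question looks roughly like the sketch below; it won't help on 4.4, since MediaCodecInfo.VideoCapabilities doesn't exist there.
// Sketch, API 21+ only: ask each encoder whether it supports a given size.
static boolean isSizeSupported(String mime, int width, int height) {
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (!info.isEncoder()) continue;
        try {
            MediaCodecInfo.VideoCapabilities caps =
                    info.getCapabilitiesForType(mime).getVideoCapabilities();
            if (caps != null && caps.isSizeSupported(width, height)) {
                return true;  // at least one encoder handles this resolution
            }
        } catch (IllegalArgumentException ignored) {
            // this codec doesn't support the MIME type
        }
    }
    return false;
}
// usage: isSizeSupported("video/avc", 2560, 720)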
I'm getting the following error in logcat while encoding via MediaCodec on Android.
The actual encoding works fine and the output is produced correctly, so I can't really understand why I get this trace. Is it a harmless error trace, or is there something I'm missing?
E/ACodec(6438): [OMX.qcom.video.encoder.h263] storeMetaDataInBuffers (output) failed w/ err -1010
Below is the code where I get the trace:
final int BIT_RATE = 4000000;
final int FRAME_RATE = 30;
final int IFRAME_INTERVAL = 5;
final String MIME_TYPE = "video/avc";
final MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
MediaCodec encoder = MediaCodec.createEncoderByType(MIME_TYPE);
//---------------------------------
// NEXT LINE PRODUCES THE TRACE
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
//---------------------------------
It's harmless, most devices show this. See Q12 at http://bigflake.com/mediacodec/.
This only means that the first way of signaling surface encoding wasn't supported by the encoder, so it fell back to another way of setting it up. (There are multiple ways for the MediaCodec/ACodec layer to tell the individual encoder about it.)
The previous answer has already indicated that the warning is harmless. Here is some additional information on the log and the reasons behind it.
This trace indicates that the encoder does not support storeMetaDataInBuffers on the output port. For an encoder, this mode can be supported on both the input and output ports.
On the input port, this mode is used to pass raw image data in metadata format, i.e. to pass only a reference to the gralloc handle, which the encoder can use to access the data. This is employed by the camera and/or screen-recording applications to pass a reference to the YUV data to the encoder.
Metadata mode was also supported on the output port, for potential encapsulation of the output bitstream data. For example, when a Miracast or Wi-Fi Display session is active and the content being encoded is protected (e.g. premium content), the data between the encoder and the HDCP encryption module needs to be protected, and the metadata format comes in handy there. Not many encoders support this mode, hence the warning you observe.
In my app, I want to encode YUV data into H.264 using the MediaCodec software codec.
I am using Google's software encoder, OMX.google.h264.encoder.
When I use the hardware encoder (OMX.qcom.video.encoder.avc) it works, but when I use the software encoder (OMX.google.h264.encoder) it does not encode the file and gives an error (see the log).
I couldn't identify what the problem is.
Source:
mediaCodec = MediaCodec.createByCodecName("OMX.google.h264.encoder");
//mediaCodec = MediaCodec.createByCodecName(codecInfo.getName());
Log.i(TAG, "codec name : " + mediaCodec.getName());

int mBitrate = (int) ((MainActivity.mHeight * MainActivity.mWidth * MainActivity.frameRate) * 2 * 0.07);
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", MainActivity.mWidth, MainActivity.mHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mBitrate);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, MainActivity.frameRate);
// mediaFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 320*240);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
//mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecProfileLevel.AVCLevel12);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

try {
    mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();
    Log.i(TAG, "H264 Encoder init success");
} catch (IllegalArgumentException e) {
    e.printStackTrace();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (Exception e) {
    e.printStackTrace(); // TODO: handle exception
}
But I am getting this error.
Log:
I/H264Encoder(7772): outputStream initialized
I/OMXClient(7772): Using client-side OMX mux.
I/H264Encoder(7772): found colorFormat: 21
I/OMXClient(7772): Using client-side OMX mux.
E/OMXMaster(7772): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
I/SoftAVCEncoder(7772): Construct SoftAVCEncoder
I/H264Encoder(7772): codec name : OMX.google.h264.encoder
E/SoftAVCEncoder(7772): internalSetParameter: StoreMetadataInBuffersParams.nPortIndex not zero!
E/OMXNodeInstance(7772): OMX_SetParameter() failed for StoreMetaDataInBuffers: 0x80001001
E/ACodec(7772): [OMX.google.h264.encoder] storeMetaDataInBuffers (output) failed w/ err -2147483648
I/ACodec(7772): setupVideoEncoder succeeded
I/H264Encoder(7772): H264 Encoder init success
E/SoftAVCEncoder(7772): Video frame size 1920x1080 must be a multiple of 16
E/SoftAVCEncoder(7772): Failed to initialized encoder params
E/ACodec(7772): [OMX.google.h264.encoder] ERROR(0x80001001)
E/MediaCodec(7772): Codec reported an error. (omx error 0x80001001, internalError -2147483648)
W/System.err(7772): java.lang.IllegalStateException
W/System.err(7772): at android.media.MediaCodec.getBuffers(Native Method)
W/System.err(7772): at android.media.MediaCodec.getInputBuffers(MediaCodec.java:542)
W/System.err(7772): at com.ei.encodertest.H264Encoder.offerEncoder(H264Encoder.java:170)
W/System.err(7772): at com.ei.encodertest.MainActivity$ReadRawFileTask.doInBackground(MainActivity.java:113)
W/System.err(7772): at com.ei.encodertest.MainActivity$ReadRawFileTask.doInBackground(MainActivity.java:1)
W/System.err(7772): at android.os.AsyncTask$2.call(AsyncTask.java:288)
W/System.err(7772): at java.util.concurrent.FutureTask.run(FutureTask.java:237)
W/System.err(7772): at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
W/System.err(7772): at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
W/System.err(7772): at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
W/System.err(7772): at java.lang.Thread.run(Thread.java:841)
The SW encoder OMX.google.h264.encoder is very limited at the moment (edit: On Android 5.0), close to being unusable.
This encoder doesn't allow resolutions that aren't a multiple of 16. In your case, 1920x1080, the height 1080 isn't evenly divisible by 16 and thus isn't acceptable to this encoder. (See https://android-review.googlesource.com/38904 for an attempt at fixing this.)
If you changed it to 1088, the multiple-of-16 constraint would no longer be an issue, but the encoder also won't let you use any resolution above 352x288 (see e.g. https://android-review.googlesource.com/82133).
Finally, on older Android versions (prior to 5.0), it also did output in a slightly different format (missing startcodes, see https://android-review.googlesource.com/42321), which meant that you would have to manually add startcodes at the start of each output packet to be able to use them in certain places (the MediaMuxer might have handled it as it was, though, mostly by chance).
In the current AOSP master (that is, maybe in the next major release, unless that already is being finalized and this change hasn't been included there), the encoder has been replaced with a more capable one, but for existing releases, there's not much you can do other than bundling a better SW encoder within your app.
Edit: The Android M preview that was released today does include the new SW encoder, which should work fine for this usecase.
Edit2: The new encoder was included in the Android 6.0 release (M), so since then, it should be usable.
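If the hardware encoder is an option on the target devices (it works in your log), a common workaround on these releases is to pick it at runtime instead of hardcoding OMX.google.h264.encoder. A rough sketch using the pre-API-21 MediaCodecList calls; treating non-"OMX.google." names as hardware encoders is just a heuristic:
// Sketch: prefer a hardware AVC encoder over the limited software one.
static String selectAvcEncoderName() {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")
                    && !info.getName().startsWith("OMX.google.")) {
                return info.getName();  // e.g. OMX.qcom.video.encoder.avc
            }
        }
    }
    return null;  // only the software encoder (or nothing) is available
}
// usage: mediaCodec = MediaCodec.createByCodecName(selectAvcEncoderName());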
I am well aware that there are a few other topics regarding this exception, but none of them seems to be the source of my problem. I'm trying to record a live video from Google Glass, but only one of the encoders seems to work (mime video/avc). Whenever I try a different encoder, I keep getting IllegalStateExceptions. Relevant code:
MediaFormat format = MediaFormat.createVideoFormat("video/svc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 10000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mEncoder = MediaCodec.createEncoderByType("video/svc");
mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mEncoder.start();
As I said, the app crashes at mEncoder.configure, throwing an IllegalStateException at me:
E/ACodec﹕ [OMX.TI.DUCATI1.VIDEO.H264SVCE] configureCodec returning error -1010
E/MediaCodec﹕ Codec reported an error. (omx error 0x80001001, internalError -1010)
W/System.err﹕ java.lang.IllegalStateException
W/System.err﹕ at android.media.MediaCodec.native_configure(Native Method)
I hope someone can shed some light on what I have done wrong here.
Thanks in advance,
Wolfram
Regarding the IllegalStateException: it seems that a supported KEY_COLOR_FORMAT was not specified. From my testing, Google Glass's MediaCodec API only supports COLOR_TI_FormatYUV420PackedSemiPlanar when the encoding type is 'video/avc'.
I intend to encode raw YUV data to H.264, for which I'm using Android's MediaCodec interface. Below is the snippet I have in place:
MediaCodec mEncoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat mVideoFormat = MediaFormat.createVideoFormat("video/avc", 640 , 480);
mVideoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
mVideoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
mVideoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 24);
mVideoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mVideoFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline);
mEncoder.configure(mVideoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mEncoder.start();
ByteBuffer[] mInputVideoBuffers = mEncoder.getInputBuffers();
ByteBuffer[] mOutputVideoBuffers = mEncoder.getOutputBuffers();
Although it works well on ARM devices, it fails on the Intel x86 device I have (Samsung Tab 3) with the message below:
E/ACodec(21756): [OMX.Intel.VideoEncoder.AVC] ERROR(0x80001001)
E/MediaCodec(21756): Codec reported an error. (omx error 0x80001001, internalError -2147483648)
Any help on this would be useful.
Found the fix for the issue: I did not release the codec before creating another one. Multiple encoder instances are not permitted on the Samsung Tab 3 (an Intel x86 device). This behaviour is pretty inconsistent across Android devices, judging by the other devices on which I've tested my code.
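In code, the fix amounts to tearing down the previous instance before creating a new one; roughly (mEncoder as in the snippet above):
// Sketch: release the old encoder before creating a new one, since this
// device only tolerates a single encoder instance at a time.
if (mEncoder != null) {
    try {
        mEncoder.stop();   // may throw if the codec was never started
    } catch (IllegalStateException ignored) {
    }
    mEncoder.release();    // frees the underlying OMX component
    mEncoder = null;
}
mEncoder = MediaCodec.createEncoderByType("video/avc");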
The code shown won't work on some ARM devices. COLOR_FormatYUV420SemiPlanar isn't supported everywhere.
You need to detect the set of available color formats at runtime. See the isRecognizedFormat() method in EncodeDecodeTest. To pass CTS, the device must support one of those formats. There are five listed, but really there are only two (planar and semi-planar), and they're not radically different.
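A condensed version of that check, adapted from the CTS EncodeDecodeTest (treat it as a sketch):
// Sketch: find a color format for the given MIME type that the app's
// YUV generation code knows how to produce.
static int selectColorFormat(MediaCodecInfo codecInfo, String mimeType) {
    MediaCodecInfo.CodecCapabilities caps = codecInfo.getCapabilitiesForType(mimeType);
    for (int colorFormat : caps.colorFormats) {
        if (isRecognizedFormat(colorFormat)) {
            return colorFormat;
        }
    }
    return 0;  // no suitable format found
}

static boolean isRecognizedFormat(int colorFormat) {
    switch (colorFormat) {
        // the planar and semi-planar YUV 4:2:0 variants the test knows how to fill
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
            return true;
        default:
            return false;
    }
}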
On Intel devices, getting the output format from the encoder was crashing, so I created a MediaFormat myself and supplied it directly to the muxer:
MediaFormat mVideoFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
mVideoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
mVideoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
mVideoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 24);
mVideoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mTrackIndex = mMuxer.addTrack(mVideoFormat);