Why is video made with MediaCodec garbled on the Samsung Galaxy S7?

When I encode a video via Surface -> MediaCodec -> MediaMuxer, I get a very strange result when testing on the Samsung Galaxy S7. On the other devices I tested (a Marshmallow emulator and an HTC Desire), the video comes out correctly, but on this device it is garbled.
The question "Using MediaCodec to save series of images as Video" shows similar output, but I don't see how its solution could apply here, because I am using a Surface as input and set the color format to COLOR_FormatSurface.
I also tried changing the video resolution (settled on 1280 x 720) per "MediaCodec Encoded video has green bar at bottom and chrominance screwed up", but that didn't solve the problem either (cf. "Nexus 7 2013 mediacodec video encoder garbled output").
Does anyone have suggestions for what I might try to get the video formatted correctly?
Here is part of the log from the encoding:
D/ViewRootImpl: #1 mView = android.widget.LinearLayout{1dc79f2 V.E...... ......I. 0,0-0,0 #102039c android:id/toast_layout_root}
I/ACodec: [] Now uninitialized
I/OMXClient: Using client-side OMX mux.
I/ACodec: [OMX.qcom.video.encoder.avc] Now Loaded
W/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
W/ACodec: do not know color format 0x7fa30c06 = 2141391878
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7fa30c08 = 2141391880
W/ACodec: do not know color format 0x7fa30c07 = 2141391879
W/ACodec: do not know color format 0x7f000789 = 2130708361
D/ViewRootImpl: MSG_RESIZED_REPORT: ci=Rect(0, 0 - 0, 0) vi=Rect(0, 0 - 0, 0) or=1
I/ACodec: setupVideoEncoder succeeded
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: [OMX.qcom.video.encoder.avc] Now Loaded->Idle
I/ACodec: [OMX.qcom.video.encoder.avc] Now Idle->Executing
I/ACodec: [OMX.qcom.video.encoder.avc] Now Executing
I/MPEG4Writer: setStartTimestampUs: 0
I/MPEG4Writer: Earliest track starting time: 0
The fifth unrecognized color format (0x7f000789 = 2130708361) appears to be COLOR_FormatSurface... Is that a problem?
Other details:
MIME: video/avc
Resolution: 1280 x 720
Frame rate: 30
IFrame interval: 2
Bitrate: 8847360
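For reference, here is roughly how those parameters map onto the encoder setup; a minimal sketch assuming Surface input (variable names are mine):

MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 8847360);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc"); // throws IOException
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called between configure() and start().
Surface inputSurface = encoder.createInputSurface();
encoder.start();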

Per the Android docs for MediaCodec.createInputSurface():
The Surface must be rendered with a hardware-accelerated API, such as
OpenGL ES. lockCanvas(android.graphics.Rect) may fail or produce
unexpected results.
I must have missed (or ignored) that when writing the code. Since I was using lockCanvas() to get a Canvas to draw my video frames on, the code broke on this device. As a quick fix, I now use lockHardwareCanvas() when the API level is >= 23 (it is unavailable before that, and the lockCanvas() path ran fine on API level 19).
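In case it helps anyone else, a minimal sketch of the quick fix (drawFrame() is a hypothetical placeholder for my drawing code):

Canvas canvas;
if (Build.VERSION.SDK_INT >= 23) {
    // Hardware-accelerated canvas; safe for an encoder input Surface.
    canvas = inputSurface.lockHardwareCanvas();
} else {
    // Ran fine on my API 19 device, but per the docs it may fail or misbehave.
    canvas = inputSurface.lockCanvas(null);
}
try {
    drawFrame(canvas); // hypothetical: app-specific frame drawing
} finally {
    inputSurface.unlockCanvasAndPost(canvas);
}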
Long term, however (for me and anyone else who might stumble across this), I may have to move to OpenGL rendering for a more permanent and stable solution. It's not worth going that route, though, unless I find a device that does not work with this quick fix.

If you are still looking for an example of rendering bitmaps to an input Surface, I was able to get this to work.
See my answers here:
https://stackoverflow.com/a/49331192/7602598
https://stackoverflow.com/a/49331352/7602598
https://stackoverflow.com/a/49331295/7602598

Related

Android MediaCodec decode is SUPER SLOW

Android MediaCodec decode takes a super long time, about 115 to 118 ms per frame. This is an H.264 frame. The Android device has a Qualcomm Snapdragon 845 processor, so I assume the Android MediaCodec APIs target the Qualcomm GPU and not the ARM CPU cores.
Has anyone experienced such issues before, and can you provide guidance on how to make this decode go faster?
The code is all native code, no Java at all. With no Java, I have no active window and no surface texture, so the Grafika examples don't help here. I am using Android P (9.0), API 28, NDK 19.2.5x.
Here's how my code is setup:
Step 1: I have two codec instances configured on two separate threads as follows:
codecData.codec = AMediaCodec_createDecoderByType("video/avc");
codecData.format_eye = AMediaFormat_new(); // the format must be allocated before setting keys
AMediaFormat_setString(codecData.format_eye, AMEDIAFORMAT_KEY_MIME, "video/avc");
AMediaFormat_setInt32(codecData.format_eye, AMEDIAFORMAT_KEY_HEIGHT, 1920);
AMediaFormat_setInt32(codecData.format_eye, AMEDIAFORMAT_KEY_WIDTH, 1080);
AMediaFormat_setFloat(codecData.format_eye, AMEDIAFORMAT_KEY_FRAME_RATE, 60.0f);
// Followed by AMediaCodec_configure(codecData.codec, codecData.format_eye,
// nullptr /* no surface */, nullptr, 0) and AMediaCodec_start(codecData.codec).
Step 2: I enqueue the encoded buffers using these calls, which take 14 to 17 ms at 60 FPS input, with two separate threads populating the individual codec queues:
bufIdx = AMediaCodec_dequeueInputBuffer(codecData.codec, -1); // -1 makes it a blocking call
auto buf = AMediaCodec_getInputBuffer(codecData.codec, bufIdx, &bufSize); // bufIdx should be checked for >= 0 first
uint64_t presentTime = presentTimer.getTimeUs();
memcpy(buf, data, size); // assumes size <= bufSize
AMediaCodec_queueInputBuffer(codecData.codec, bufIdx, 0, size, presentTime, 0);
Step 3: I dequeue the decoded buffers as follows; this takes 115 to 118 ms per frame per codec at 60 FPS output. The dequeuing for both codecs is done by one consumer thread, which goes through both codec instances one at a time:
AMediaCodecBufferInfo info_eye;
// Timeout of 1 us; returns AMEDIACODEC_INFO_TRY_AGAIN_LATER if no output is ready yet.
bufIdx = AMediaCodec_dequeueOutputBuffer(codecData.codec, &info_eye, 1);
auto decodedBuf = AMediaCodec_getOutputBuffer(codecData.codec, bufIdx, &bufSize);
Step 4: The decoded buffer is then fed to an NV12-to-RGBA shader on the render thread, which populates the texture in about 2 ms. This texture then gets displayed.
I am expecting 60 FPS but get about 50 FPS due to the delays in Step 3, i.e. the 115 to 118 ms latency is killing me :-(
Any ideas? Appreciate any and all help.

MediaCodec.createInputSurface() throws IllegalStateException on some devices

I'm working on a video processing app. The app has one Activity that contains a Fragment. The Fragment in turn contains a VideoSurfaceView, derived from GLSurfaceView, on which I show users a preview of the video with an effect applied (using OpenGL). After previewing, users can start processing the video.
To process the video, I mainly apply the method described here.
Everything works fine on most devices, except the Oppo Mirror 3 (Android 4.4). On this device, every time I try to create a Surface using MediaCodec.createInputSurface(), it throws java.lang.IllegalStateException with code -38.
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/SoftAVCEncoder: internalSetParameter: StoreMetadataInBuffersParams.nPortIndex not zero!
E/OMXNodeInstance: OMX_SetParameter() failed for StoreMetaDataInBuffers: 0x80001001
E/ACodec: [OMX.google.h264.encoder] storeMetaDataInBuffers (output) failed w/ err -2147483648
E/OMXNodeInstance: createInputSurface requires COLOR_FormatSurface (AndroidOpaque) color format
E/ACodec: [OMX.google.h264.encoder] onCreateInputSurface returning error -38
E/VideoProcessing: java.lang.IllegalStateException
at android.media.MediaCodec.createInputSurface(Native Method)
at com.ltpquang.android.core.processing.codec.VideoEncoder.<init>(VideoEncoder.java:46)
at com.ltpquang.android.core.VideoProcessing.setupVideo(VideoProcessing.java:200)
at com.ltpquang.android.core.VideoProcessing.<init>(VideoProcessing.java:167)
at com.ltpquang.android.ui.activity.PreviewEditActivity.lambda$btNext$12(PreviewEditActivity.java:723)
at com.ltpquang.android.ui.activity.PreviewEditActivity.access$lambda$12(PreviewEditActivity.java)
at com.ltpquang.android.ui.activity.PreviewEditActivity$$Lambda$13.run(Unknown Source)
at java.lang.Thread.run(Thread.java:841)
Playing around a little bit, I observed that:
BEFORE creating and adding the VideoSurfaceView to the layout, I can create a MediaCodec encoder and obtain its input surface successfully. I can create as many as I want if I release the previous MediaCodec before creating a new one (sketched below); otherwise I can obtain one and only one input surface, regardless of how many MediaCodec instances I create.
AFTER creating and adding the VideoSurfaceView to the layout, there is no chance I can get the input surface from the MediaCodec; it always throws java.lang.IllegalStateException.
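A minimal sketch of that release-before-create pattern (variable names are mine):

// On this device, releasing the previous encoder first is the only way
// to obtain another input surface.
if (encoder != null) {
    encoder.release();
}
encoder = MediaCodec.createEncoderByType("video/avc"); // throws IOException
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface();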
I've tried removing the VideoSurfaceView from the layout and setting it to null before creating the surface, but no luck.
I also tried the suggestions from here and here, but they didn't help.
From this, it seems that my device can only get the software codec, so I can't create the input surface.
My questions are:
Why is that?
If the device's resources are limited, what can I do (release something, for example) to continue the process?
If it is related to the software codec, what should I do? How can I detect and release the resource?
Is this related to GL contexts? If yes, what should I do? Should I manage the contexts myself?
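For what it's worth, a minimal sketch that lists which AVC encoders the device actually reports (the log above shows the software OMX.google.h264.encoder being picked), using the pre-Lollipop MediaCodecList API:

// API 16+: list every encoder that advertises video/avc support.
for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
    MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (type.equalsIgnoreCase("video/avc")) {
            Log.d("CodecList", info.getName()); // software codecs are typically OMX.google.*
        }
    }
}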

MediaCodec config passed, but input resolution is actually not supported

I implemented a video encoder with the MediaCodec API to encode 2560x720 video. I verified it on several devices, but I encountered a configuration problem on the Samsung Note 2 (Android 4.4.2). The following is my code to prepare the codec instance:
Format = MediaFormat.createVideoFormat("video/avc", 2560, 720);
Format.setInteger(MediaFormat.KEY_BIT_RATE, (int) (2560 * 720 * 20 * 0.5f));
Format.setInteger(MediaFormat.KEY_FRAME_RATE, RefreshRate);
Format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
Format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
try {
    Codec = MediaCodec.createEncoderByType("video/avc");
    Codec.configure(Format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    CodecFound = true;
} catch (Exception e) {
    CodecFound = false;
    if (DEBUG) Log.e(CodecTag, "Failed to find available codec", e);
}
No exception is thrown in this try-catch block. But after the codec starts, the encoder outputs neither encoded data nor an output format, even when the input buffers are filled (dequeueOutputBuffer always returns -1, i.e. INFO_TRY_AGAIN_LATER). The following is the log after starting the codec, where some errors are reported:
05-04 11:29:17.071: E/MFC_ENC(1006): SOMXVE_Init: isSEIOn : 0
05-04 11:29:17.071: E/MFC_ENC_APP(1006): SsbSipMfcEncInit] IOCTL_MFC_ENC_INIT failed
05-04 11:29:17.111: A/libc(1006): Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1), thread 1403 (AvcEncBufPrc)
05-04 11:29:19.326: E/VideoCodecInfo(1353): Failed to get track format due to codec error
05-04 11:29:25.606: E/ACodec(1353): OMX/mediaserver died, signalling error!
05-04 11:29:25.606: E/MediaCodec(1353): Codec reported an error. (omx error 0x8000100d, internalError -32)
05-04 11:29:25.606: E/AudioService(2462): Media server died.
05-04 11:29:25.606: E/Camera(1353): Error 100
I finally found these commands to get the supported media profiles from the Note 2:
adb shell cat /system/etc/media_codecs.xml
adb shell cat /system/etc/media_profiles.xml
According to the profile list, the maximum resolution the H.264 encoder can support is 1920x1080:
<VideoEncoderCap name="h264" enabled="true"
minBitRate="64000" maxBitRate="20000000"
minFrameWidth="176" maxFrameWidth="1920"
minFrameHeight="144" maxFrameHeight="1080"
minFrameRate="1" maxFrameRate="30" />
My questions:
Why can the codec be configured successfully with an unsupported format? Is that a bug, or a limitation related to the API level?
I heard there are APIs for checking this kind of capability starting from Android 5.0, but I must support 4.4 in my application. Is there any other way to obtain detailed capability info at run time (other than parsing media_profiles.xml)?
Any help would be really appreciated.
The mediaserver process crashed, apparently due to a null pointer dereference (the SIGSEGV at 0x00000000 in your log). This process manages all interaction with the media codec hardware. A crash like this indicates a bug in the platform, possibly in the OEM's driver implementation.
The configure() call should fail if the resolution is not supported; my guess is that this particular implementation has a faulty check.
I don't think there's a way to check whether a particular width x height pair works in Android 4.4. You can generally rely on common resolutions like 720p, but some sizes behave strangely, especially non-multiples of 16.
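On Android 5.0+ (API 21) the capability APIs you mention do exist; for reference, a minimal sketch of the size check (not available on your 4.4 target):

// API 21+: ask the AVC encoder whether it supports a given size.
MediaCodec codec = MediaCodec.createEncoderByType("video/avc"); // throws IOException
MediaCodecInfo.VideoCapabilities caps = codec.getCodecInfo()
        .getCapabilitiesForType("video/avc")
        .getVideoCapabilities();
boolean supported = caps.isSizeSupported(2560, 720);
codec.release();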

Android MediaCodec Format/Resolution Change mid-stream

I'm processing a live stream via MediaCodec and have a scenario where the MediaFormat changes mid-stream (i.e. the resolution of the video being decoded changes). Since I'm attaching the decoder to a Surface for rendering, as soon as I detect the change in resolution on the incoming stream, I recreate the decoder before feeding it the new-resolution buffers (providing it with the proper new MediaFormat).
I've been getting some weird errors that don't give me much information about what could be wrong, e.g. when calling MediaCodec.configure with the new format and the same Surface:
android.media.MediaCodec$CodecException: Error 0xffffffea
at android.media.MediaCodec.native_configure(Native Method)
at android.media.MediaCodec.configure(MediaCodec.java:577)
Fetching CodecException.getDiagnosticInfo() shows nothing I can really use to understand the reason for the failure: android.media.MediaCodec.error_neg_22.
I've also noted the following in the logs and found some related information, and I'm wondering if there's something I need to do with the Surface itself (like detaching it from the old decoder instance before giving it to the new one):
07-09 15:00:17.217 E/BufferQueueProducer( 139): [SurfaceView] connect(P): already connected (cur=3 req=3)
07-09 15:00:17.217 E/MediaCodec( 5388): native_window_api_connect returned an error: Invalid argument (-22)
07-09 15:00:17.218 E/MediaCodec( 5388): configure failed with err 0xffffffea, resetting...
Calling stop() and release(), as well as reinitializing any references I had from getInputBuffers() and getOutputBuffers(), did the trick. At least I don't get the messages/exceptions anymore. Now I just need to figure out the Surface reference part, as the resized stream (when the resolution changes) is still being fit into the original Surface dimensions instead of the Surface adjusting to the new resolution.
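In outline, the sequence that worked looks like this (a sketch; the same Surface is reused, and the variable names are mine):

// Tear down the old decoder completely before creating a new one.
decoder.stop();
decoder.release();
inputBuffers = null;   // drop stale references from getInputBuffers()
outputBuffers = null;  // and getOutputBuffers()

decoder = MediaCodec.createDecoderByType("video/avc"); // throws IOException
decoder.configure(newFormat, surface, null, 0);
decoder.start();
inputBuffers = decoder.getInputBuffers();
outputBuffers = decoder.getOutputBuffers();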
If your decoder supports adaptive playback, then apparently you can alter some codec parameters on the fly:
https://stackoverflow.com/a/34427724/1048170
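On API 19+ you can query that feature before relying on it; a minimal sketch:

// True if this decoder can handle resolution changes without a full reconfigure.
MediaCodecInfo.CodecCapabilities caps =
        decoder.getCodecInfo().getCapabilitiesForType("video/avc");
boolean adaptive = caps.isFeatureSupported(
        MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback);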

Min undequeued buffer count exceeded

I am using a SurfaceTexture to get preview frames in the following way.
First, I set a preview texture:
camera.setPreviewTexture(new SurfaceTexture(0));
Then, just before starting the preview and then each time onPreviewFrame is called, I set the callback buffer like this:
camera.addCallbackBuffer(buffer);
camera.setPreviewCallbackWithBuffer(this);
It works. Sometimes I take a picture using camera.takePicture(null, null, callback), which results in onPictureTaken being called successfully, and the image is saved. Since I want to restart the preview after the picture has been taken, I do the following:
try {
    camera.setPreviewTexture(new SurfaceTexture(0));
    camera.startPreview();
} catch (IOException e) {
    // error handling elided in the original
}
The preview restarts and everything seems to be fine. But the following error is reported in my logcat, seemingly after the preview has been restarted:
E/BufferQueue﹕ [unnamed-5682-5] dequeueBuffer: min undequeued buffer count (2) exceeded (dequeued=5 undequeudCount=1)
Am I missing something? Should I release the old texture at some point?
Configuration: Samsung Galaxy S4, Samsung Galaxy S5, Nexus 5, running on Android KitKat.
EDIT: I am not sure whether it is linked or not, but after a while my app does not take pictures anymore, and the following messages appear continuously in my logcat:
E/LocSvc_api_v02( 318): I/---> locClientSendReq line 2332 QMI_LOC_INJECT_SENSOR_DATA_REQ_V02
E/gsiff_dmn( 318): I/loc_api_resp_ind_callback: Received LocAPI Resp ind = 77
E/LocSvc_api_v02( 318): D/loc_sync_process_ind:172]: loc_sync_array not in use
E/LocSvc_utils_q( 318): D/msg_q_rcv: Received message 0xB899D940 rv = 0
E/gsiff_dmn( 318): I/gsiff_data_task: Handling message type = 4
E/gsiff_dmn( 318): I/gsiff_daemon_inject_sensor_data_handler: Sending Sensor Data to LocApi. opaque_id = 1226
E/LocSvc_api_v02( 318): I/---> locClientSendReq line 2332 QMI_LOC_INJECT_SENSOR_DATA_REQ_V02
E/gsiff_dmn( 318): I/loc_api_resp_ind_callback: Received LocAPI Resp ind = 77
E/LocSvc_api_v02( 318): D/loc_sync_process_ind:172]: loc_sync_array not in use
E/mm-camera( 284): [cpp_hardware_process_frame:997] Too many cpp frames dropped!!
E/mm-camera( 284): cpp_thread_handle_process_buf_event:224] get buffer fail. drop frame id:1845 identity:0x20002
W/QCamera2HWI( 269): [CHECK_BUF_LOCK] Too many preview buffer is locked by surfaceflinger : 29
E/mm-camera( 284): [cpp_hardware_process_frame:997] Too many cpp frames dropped!!
E/mm-camera( 284): cpp_thread_handle_process_buf_event:224] get buffer fail. drop frame id:1846 identity:0x20002
EDIT 2: If, instead of a new SurfaceTexture(0), I always use the same SurfaceTexture (kept as a member), some of the errors disappear and the app continues to work. The "min undequeued buffer count exceeded" error and the "Too many preview buffer is locked by surfaceflinger" warning remain.
It seems that the camera is holding buffers that are not being dequeued by your activity. You have to find a way to clear the camera's buffer queue when you start a new preview.
As you can find in the Android documentation for the Camera class:
The buffer queue will be cleared if this method [setPreviewCallbackWithBuffer] is called with a null callback, setPreviewCallback(Camera.PreviewCallback) is called, or setOneShotPreviewCallback(Camera.PreviewCallback) is called.
So maybe it is enough to remove your callback when you take a picture and then reinstate it when you restart your preview.
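In code, that might look like the following sketch (reusing one SurfaceTexture kept as a member, per your EDIT 2; callback names are mine):

// Clear the camera's buffer queue before taking the picture.
camera.setPreviewCallbackWithBuffer(null);
camera.takePicture(null, null, pictureCallback);

// When restarting the preview, reinstall the callback and buffer.
camera.setPreviewTexture(surfaceTexture); // throws IOException; same member instance
camera.addCallbackBuffer(buffer);
camera.setPreviewCallbackWithBuffer(previewCallback);
camera.startPreview();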
