I'm trying to get the video stream of a camera into an ImageReader surface so that I can process the images. I found a lot of examples that deal with the camera2 API, but I can't use that because my video stream comes from an external camera.
Ideally, I would have two surfaces: one for a preview and one from the ImageReader to process the images. Similar to this. I understand that you combine the two surfaces with a CaptureRequest.Builder and then .addTarget(surface). The problem is that I don't have a CameraDevice to call createCaptureRequest on.
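For reference, the two-surface setup with camera2 would look roughly like this (a sketch of the API I can't use here; previewSurface, imageReader, backgroundHandler and an open cameraDevice are assumed to exist):

try {
    final CaptureRequest.Builder builder =
            cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);            // surface 1: on-screen preview
    builder.addTarget(imageReader.getSurface());  // surface 2: frame processing

    cameraDevice.createCaptureSession(
            Arrays.asList(previewSurface, imageReader.getSurface()),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    try {
                        // Every frame of the repeating request is delivered to both targets.
                        session.setRepeatingRequest(builder.build(), null, backgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                }
            }, backgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}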
The code I am using can be found here.
I tried to just create an ImageReader and its surface and pass it to the startDecoding function. But this didn't work; I got this error:
E/JNI: close+++++++
E/BufferQueueProducer: [ImageReader-1280x720f32315659m16-17834-0] dequeueBuffer: BufferQueue has been abandoned
E/ACodec: NATIVE_WINDOW_MIN_UNDEQUEUED_BUFFERS query failed: No such device (19)
E/ACodec: Failed to allocate output port buffers after port reconfiguration: (-19)
E/ACodec: signalError(omxError 0x80001001, internalError -19)
E/MediaCodec: Codec reported err 0xffffffed, actionCode 0, while in state 6
E/AccessHeadCameraActivity: Error has occured.
java.lang.IllegalStateException
at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
Any hint into the right direction would be nice!
Update 1:
The error results from the return value of dequeueOutputBuffer, which is -1. According to the MediaCodec docs, -1 is INFO_TRY_AGAIN_LATER, which means the call timed out. But why does that happen?
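For context, my call sits in the usual output loop, roughly like this (a sketch of the standard MediaCodec pattern, not my exact code):

MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int index = mediaCodec.dequeueOutputBuffer(info, 10000); // timeout in microseconds
if (index >= 0) {
    // A decoded frame is ready; rendering it pushes it to the output surface.
    mediaCodec.releaseOutputBuffer(index, true /* render */);
} else if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
    // -1: the timeout expired without any output buffer becoming available.
} else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    MediaFormat newFormat = mediaCodec.getOutputFormat();
}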
Update 2:
I don't have a surfaceCreated callback anymore (because I don't have the SurfaceView anymore), so that code moved into onCreate. Everything else is pretty much the same as in here:
@Override
public void onCreate(Bundle savedInstanceState) {
    getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_acess_headcamera);

    mediaManager = (MediaManager) getUnitManager(FuncConstant.MEDIA_MANAGER);

    setupImageReader();

    StreamOption streamOption = new StreamOption();
    streamOption.setChannel(StreamOption.MAIN_STREAM);
    streamOption.setDecodType(StreamOption.HARDWARE_DECODE);
    streamOption.setJustIframe(false);
    mediaManager.openStream(streamOption);

    surface = imageReader.getSurface();
    startDecoding(surface);

    initListener();
}
private void setupImageReader() {
    imageReader = ImageReader.newInstance(width, height, ImageFormat.YV12,
            IMAGE_READER_BUFFER_SIZE);
    imageReader.setOnImageAvailableListener(onImageAvailableListener, backgroundHandler);
}
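The listener itself is just the usual acquire-and-close pattern (sketched here; the actual processing is omitted):

private final ImageReader.OnImageAvailableListener onImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = reader.acquireLatestImage();
                if (image == null) {
                    return;
                }
                // Process the frame via image.getPlanes() here.
                image.close(); // always close, or the producer runs out of buffers
            }
        };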
Related
I'm trying to write a simple video encoder that uses the Android platform's MediaCodec class in "surface input" mode.
These are the steps I'm following (supporting code left out for the sake of brevity):
mediaCodec = MediaCodec::CreateByType(looper, "video/avc", true);
mediaCodec->configure(config, NULL, NULL, CONFIGURE_FLAG_ENCODE);
mediaCodec->createInputSurface(&inputSurface);
mediaCodec->start();
Following this, I'm trying to dequeue a buffer from the created input surface (which is an IGraphicBufferProducer interface object), but it fails with the NO_INIT error:
inputSurface->dequeueBuffer(&slot, &fence, w, h, format, 0);
The error message in the ADB log is:
BufferQueueProducer: [GraphicBufferSource] dequeueBuffer: BufferQueue has no connected producer
Any idea why the buffer queue has no connected producer? I would assume that the MediaCodec class would handle the creation of the buffer queue as well as the connection of the producer and consumers to the queue.
I'm using Android API level 26 (7.1.2). I'm using the platform-level libs because my use case requires access to GraphicBuffer objects.
Thanks in advance!
EDIT: The general idea is to:
1. Dequeue buffers from the input surface & fill them.
2. Queue the filled buffers back to the input surface (which would presumably trigger the media codec (video encoder) instance that the surface belongs to).
3. Dequeue output buffers (containing raw H.264 bitstream data) from the media codec instance, and write them to file.
4. Release output buffers back to the media codec instance.
From the IGraphicBufferProducer documentation:
// * NO_INIT - the buffer queue has been abandoned or the producer is not
// connected.
I guess the part that is missing in your code is this "connect". IGraphicBufferProducer has such a method; are you using it?
I'm working on a video processing app. The app has one Activity that contains a Fragment. The Fragment in turn contains a VideoSurfaceView derived from GLSurfaceView, which I use to show users a preview of the video with the effect applied (using OpenGL). After previewing, users can start processing the video.
To process the video, I mainly apply the method described here.
Everything works fine on most devices, except the Oppo Mirror 3 (Android 4.4). On this device, every time I try to create a Surface using MediaCodec.createInputSurface(), it throws java.lang.IllegalStateException with code -38.
E/OMXMaster: A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
E/SoftAVCEncoder: internalSetParameter: StoreMetadataInBuffersParams.nPortIndex not zero!
E/OMXNodeInstance: OMX_SetParameter() failed for StoreMetaDataInBuffers: 0x80001001
E/ACodec: [OMX.google.h264.encoder] storeMetaDataInBuffers (output) failed w/ err -2147483648
E/OMXNodeInstance: createInputSurface requires COLOR_FormatSurface (AndroidOpaque) color format
E/ACodec: [OMX.google.h264.encoder] onCreateInputSurface returning error -38
E/VideoProcessing: java.lang.IllegalStateException
at android.media.MediaCodec.createInputSurface(Native Method)
at com.ltpquang.android.core.processing.codec.VideoEncoder.<init>(VideoEncoder.java:46)
at com.ltpquang.android.core.VideoProcessing.setupVideo(VideoProcessing.java:200)
at com.ltpquang.android.core.VideoProcessing.<init>(VideoProcessing.java:167)
at com.ltpquang.android.ui.activity.PreviewEditActivity.lambda$btNext$12(PreviewEditActivity.java:723)
at com.ltpquang.android.ui.activity.PreviewEditActivity.access$lambda$12(PreviewEditActivity.java)
at com.ltpquang.android.ui.activity.PreviewEditActivity$$Lambda$13.run(Unknown Source)
at java.lang.Thread.run(Thread.java:841)
Playing around a little bit, I observed that:
BEFORE creating and adding the VideoSurfaceView to the layout, I can create a MediaCodec encoder and obtain the input surface successfully. I can create as many as I want, provided I release the previous MediaCodec before creating a new one; otherwise I can obtain one and only one input surface, regardless of how many MediaCodecs I have.
AFTER creating and adding the VideoSurfaceView to the layout, there is no chance that I can get the input surface from the MediaCodec; it always throws java.lang.IllegalStateException.
I've tried removing the VideoSurfaceView from the layout and setting it to null before creating the surface, but no luck.
I also tried the suggestions from here and here, but they didn't help.
From this, it seems that my device can only get the software codec, so I can't create the input surface.
My questions are:
Why is that?
If the device's resources are limited, what can I do (release something, for example) to continue the process?
If it is related to the software codec, what should I do? How can I detect and release the resource?
Is this related to GL contexts? If yes, what should I do? Should I manage the contexts myself?
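For the detection part, a capability scan over MediaCodecList at least tells me up front whether a non-software surface-input encoder exists at all (a sketch; treating the OMX.google. name prefix as "software codec" is a heuristic I'm assuming):

// Scan for a hardware AVC encoder that advertises surface input.
private static boolean hasHardwareSurfaceEncoder(String mime) {
    for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder() || info.getName().startsWith("OMX.google.")) {
            continue; // OMX.google.* are the software implementations
        }
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase(mime)) {
                continue;
            }
            for (int color : info.getCapabilitiesForType(type).colorFormats) {
                if (color == MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface) {
                    return true; // surface input (createInputSurface) is supported
                }
            }
        }
    }
    return false;
}

If this returns true but createInputSurface() still fails after the VideoSurfaceView is added, that would point towards the hardware encoder instances being exhausted rather than missing.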
I'm processing a live stream via MediaCodec and have a scenario where the MediaFormat changes mid-stream (i.e. the resolution of the video being decoded changes). Since I'm attaching the decoder to a Surface to render it, as soon as I detect the change in resolution on the incoming stream I recreate the decoder before feeding it the new-resolution buffers (providing it with the proper new MediaFormat).
I've been getting some weird errors which don't give me much info as to what could be wrong, e.g. when calling MediaCodec.configure with the new format and the same Surface:
android.media.MediaCodec$CodecException: Error 0xffffffea
at android.media.MediaCodec.native_configure(Native Method)
at android.media.MediaCodec.configure(MediaCodec.java:577)
Fetching CodecException.getDiagnosticInfo() shows nothing that I can really use to understand the reason for the failure: android.media.MediaCodec.error_neg_22
I've also noted the following in the logs and found some related information, and I am wondering if there's something I need to do regarding the Surface itself (like detaching it from the old instance of the decoder before giving it to the new one):
07-09 15:00:17.217 E/BufferQueueProducer( 139): [SurfaceView] connect(P): already connected (cur=3 req=3)
07-09 15:00:17.217 E/MediaCodec( 5388): native_window_api_connect returned an error: Invalid argument (-22)
07-09 15:00:17.218 E/MediaCodec( 5388): configure failed with err 0xffffffea, resetting...
Looks like calling stop() and release(), as well as reinitializing any references I had to getInputBuffers() and getOutputBuffers(), did the trick. At least I don't get the messages/exceptions anymore. Now I just need to figure out the Surface reference part, as it seems the resized stream (when the resolution changes) is still being fit into the original surface dimensions instead of the Surface adjusting to the new resolution.
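Concretely, the teardown/recreate sequence looks roughly like this (a sketch with exception handling omitted; newFormat and surface stand in for the new stream's values):

// Tear the old decoder down completely before reusing the Surface.
decoder.stop();
decoder.release();

// Recreate with the new stream format; any input/output buffer references
// from the old instance are stale and must be re-fetched.
decoder = MediaCodec.createDecoderByType("video/avc");
decoder.configure(newFormat, surface, null, 0);
decoder.start();
inputBuffers = decoder.getInputBuffers();
outputBuffers = decoder.getOutputBuffers();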
If your decoder supports adaptive playback, then apparently you can alter some codec parameters on the fly:
https://stackoverflow.com/a/34427724/1048170
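Whether adaptive playback is available can be checked at runtime (a sketch; FEATURE_AdaptivePlayback exists since API 19):

MediaCodecInfo.CodecCapabilities caps =
        decoder.getCodecInfo().getCapabilitiesForType("video/avc");
boolean adaptive = caps.isFeatureSupported(
        MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback);
// With adaptive == true, a mid-stream resolution change can be absorbed by
// setting KEY_MAX_WIDTH / KEY_MAX_HEIGHT on the format instead of reconfiguring.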
I want to encode YUV data into H.264 using the MediaCodec software codec.
I use the Google software encoder OMX.google.h264.encoder.
When I use the hardware encoder (OMX.qcom.video.encoder.avc) it works, but when I use the software encoder (OMX.google.h264.encoder) it does not encode the file and gives an error (see the log below).
I couldn't identify what the problem is.
Source:
mediaCodec = MediaCodec.createByCodecName("OMX.google.h264.encoder");
//mediaCodec = MediaCodec.createByCodecName(codecInfo.getName());
Log.i(TAG, "codec name : " + mediaCodec.getName());

int mBitrate = (int) ((MainActivity.mHeight * MainActivity.mWidth * MainActivity.frameRate) * 2 * 0.07);
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
        MainActivity.mWidth, MainActivity.mHeight);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mBitrate);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, MainActivity.frameRate);
//mediaFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 320 * 240);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, colorFormat);
//mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecProfileLevel.AVCLevel12);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

try {
    mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();
    Log.i(TAG, "H264 Encoder init success");
} catch (IllegalArgumentException e) {
    e.printStackTrace();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (Exception e) {
    e.printStackTrace(); // TODO: handle exception
}
But I get this error.
Log:
I/H264Encoder(7772): outputStream initialized
I/OMXClient(7772): Using client-side OMX mux.
I/H264Encoder(7772): found colorFormat: 21
I/OMXClient(7772): Using client-side OMX mux.
E/OMXMaster(7772): A component of name 'OMX.qcom.audio.decoder.aac' already exists, ignoring this one.
I/SoftAVCEncoder(7772): Construct SoftAVCEncoder
I/H264Encoder(7772): codec name : OMX.google.h264.encoder
E/SoftAVCEncoder(7772): internalSetParameter: StoreMetadataInBuffersParams.nPortIndex not zero!
E/OMXNodeInstance(7772): OMX_SetParameter() failed for StoreMetaDataInBuffers: 0x80001001
E/ACodec(7772): [OMX.google.h264.encoder] storeMetaDataInBuffers (output) failed w/ err -2147483648
I/ACodec(7772): setupVideoEncoder succeeded
I/H264Encoder(7772): H264 Encoder init success
E/SoftAVCEncoder(7772): Video frame size 1920x1080 must be a multiple of 16
E/SoftAVCEncoder(7772): Failed to initialized encoder params
E/ACodec(7772): [OMX.google.h264.encoder] ERROR(0x80001001)
E/MediaCodec(7772): Codec reported an error. (omx error 0x80001001, internalError -2147483648)
W/System.err(7772): java.lang.IllegalStateException
W/System.err(7772): at android.media.MediaCodec.getBuffers(Native Method)
W/System.err(7772): at android.media.MediaCodec.getInputBuffers(MediaCodec.java:542)
W/System.err(7772): at com.ei.encodertest.H264Encoder.offerEncoder(H264Encoder.java:170)
W/System.err(7772): at com.ei.encodertest.MainActivity$ReadRawFileTask.doInBackground(MainActivity.java:113)
W/System.err(7772): at com.ei.encodertest.MainActivity$ReadRawFileTask.doInBackground(MainActivity.java:1)
W/System.err(7772): at android.os.AsyncTask$2.call(AsyncTask.java:288)
W/System.err(7772): at java.util.concurrent.FutureTask.run(FutureTask.java:237)
W/System.err(7772): at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
W/System.err(7772): at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
W/System.err(7772): at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
W/System.err(7772): at java.lang.Thread.run(Thread.java:841)
The SW encoder OMX.google.h264.encoder is very limited at the moment (edit: On Android 5.0), close to being unusable.
This encoder doesn't allow using resolutions that aren't a multiple of 16. In your case, 1920x1080, the height 1080 isn't evenly dividable by 16, and thus isn't acceptable for this encoder. (See https://android-review.googlesource.com/38904 for an attempt at fixing this.)
If you changed it to 1088, the multiple-of-16 constraint wouldn't be an issue, but the encoder also won't allow you to use any resolution above 352x288 (see e.g. https://android-review.googlesource.com/82133).
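Rounding a dimension up to the next multiple of 16 is a one-liner, but note that the extra rows would then need to be cropped away again on the decoding side:

// Round height up to the next multiple of 16, e.g. 1080 -> 1088.
int alignedHeight = (height + 15) & ~15;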
Finally, on older Android versions (prior to 5.0), it also did output in a slightly different format (missing startcodes, see https://android-review.googlesource.com/42321), which meant that you would have to manually add startcodes at the start of each output packet to be able to use them in certain places (the MediaMuxer might have handled it as it was, though, mostly by chance).
In the current AOSP master (that is, maybe in the next major release, unless that already is being finalized and this change hasn't been included there), the encoder has been replaced with a more capable one, but for existing releases, there's not much you can do other than bundling a better SW encoder within your app.
Edit: The Android M preview that was released today does include the new SW encoder, which should work fine for this usecase.
Edit2: The new encoder was included in the Android 6.0 release (M), so since then, it should be usable.
I am using FFmpegFrameRecorder to capture preview frames from the camera. I am using these settings:
mVideoRecorder = new FFmpegFrameRecorder(mVideoPath, 300, 300, 1);
mVideoRecorder.setFormat("mp4");
mVideoRecorder.setSampleRate(44100);
mVideoRecorder.setFrameRate(30);
mVideoRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
mVideoRecorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
mVideoRecorder.setVideoQuality(0);
mVideoRecorder.setAudioQuality(0);
mVideoRecorder.setVideoBitrate(768000);
mVideoRecorder.setAudioBitrate(128000);
mVideoRecorder.setGopSize(1);
After I have finished capturing all frames by calling the .record(IplImage) method, I call mVideoRecorder.stop().
But from time to time the stop() method throws:
org.bytedeco.javacv.FrameRecorder$Exception: av_interleaved_write_frame() error -22 while writing interleaved video frame.
at org.bytedeco.javacv.FFmpegFrameRecorder.record(FFmpegFrameRecorder.java:727)
at org.bytedeco.javacv.FFmpegFrameRecorder.stop(FFmpegFrameRecorder.java:613)
I have not seen any regularity in this behaviour, nor have I been able to find out what error -22 is. After that, no ffmpeg call on the file at mVideoPath works (I guess the file is not even valid due to that error).
I would really appreciate any help with this issue, thanks :)
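For completeness, my capture flow boils down to the sketch below; guarding stop() and always calling release() in a finally block is a defensive assumption on my part, not a confirmed fix:

try {
    mVideoRecorder.start();
    while (capturing) {
        mVideoRecorder.record(grabbedImage); // one IplImage per preview frame
    }
    mVideoRecorder.stop(); // flushes buffered packets and writes the trailer
} catch (FrameRecorder.Exception e) {
    e.printStackTrace(); // the av_interleaved_write_frame() error surfaces here
} finally {
    try {
        mVideoRecorder.release(); // free native resources in either case
    } catch (FrameRecorder.Exception e) {
        e.printStackTrace();
    }
}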