I received this Crashlytics crash event:
Fatal Exception: java.lang.IllegalStateException: start failed
at android.media.MediaCodec.native_start(MediaCodec.java)
at android.media.MediaCodec.start(MediaCodec.java:2322)
at com.liveviewsports.lvsrtspplayer.decoder.H264Decoder.initializeAndStartDecoder(H264Decoder.kt:52)
at com.liveviewsports.lvsrtspplayer.RTSPPlayer.didReceiveParameterSets(RTSPPlayer.kt:108)
at com.liveviewsports.lvsrtspplayer.rtsp.RTSPClient.parseSPSandPPS(RTSPClient.kt:215)
at com.liveviewsports.lvsrtspplayer.rtsp.RTSPClient.parseRTSPResponse(RTSPClient.kt:139)
at com.liveviewsports.lvsrtspplayer.rtsp.RTSPClient.listenForConnections$lambda-1(RTSPClient.kt:107)
at com.liveviewsports.lvsrtspplayer.rtsp.RTSPClient.$r8$lambda$uHlr4eXY-NjjcFuziT8ck_Gn4Pk()
at com.liveviewsports.lvsrtspplayer.rtsp.RTSPClient$$ExternalSyntheticLambda1.run(:2)
at java.lang.Thread.run(Thread.java:1012)
I can't reproduce this on my own devices, so I can't find the exact cause.
fun initializeAndStartDecoder(outputSurface: Surface, sps: ByteArray, pps: ByteArray) {
    if (isDecoderRunning.get()) return

    codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
    val decodingVideoFormat: MediaFormat =
        MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 720, 1280).apply {
            setInteger(MediaFormat.KEY_CAPTURE_RATE, 30)
            setByteBuffer("csd-0", ByteBuffer.wrap(sps))
            setByteBuffer("csd-1", ByteBuffer.wrap(pps))
        }
    bufferInfo = MediaCodec.BufferInfo()

    try {
        codec?.configure(decodingVideoFormat, outputSurface, null, 0)
        codec?.start()
    } catch (exc: IllegalArgumentException) {
        Logger.w("Error while configuring H264 decoder; aborting: ${exc.localizedMessage}")
        stopAndDeinit()
        return
    } catch (exc: IllegalStateException) {
        Logger.w("Error while configuring H264 decoder; aborting: ${exc.localizedMessage}")
        stopAndDeinit()
        return
    }

    isDecoderRunning.set(true)
}
How can I reproduce and fix this issue?
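For what it's worth, MediaCodec.CodecException is a subclass of IllegalStateException, and start() commonly fails this way when the device's hardware codec instances are exhausted (other apps, or leaked codec instances, holding them). A hypothetical sketch for provoking that condition in a test; the instance count is device-dependent and this is an illustration, not a guaranteed reproduction:

// Hypothetical reproduction: hardware AVC decoders are a scarce resource, so
// holding many configured/started instances tends to make a later start()
// fail with an IllegalStateException on real devices.
val codecs = mutableListOf<MediaCodec>()
try {
    repeat(32) {
        val codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
        val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 720, 1280)
        codec.configure(format, null, null, 0)
        codec.start() // expected to throw once no codec resources remain
        codecs.add(codec)
    }
} finally {
    // always release, otherwise this test itself leaks codec instances
    codecs.forEach { it.release() }
}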
Related
I am decoding an mp3 file. First I convert the mp3 file into 1000-byte chunks (byte arrays), put them in a CircularArray, and then pass them to the MediaCodec callback for decoding (one byte array at a time); I followed this link. It works fine on Samsung devices, but on other devices (Vivo, Pixel 3a) it crashes at mediaCodec.getInputBuffer(index) inside the onInputBufferAvailable callback with an IllegalStateException. My code is as follows:
var decoder: MediaCodec = MediaCodec.createDecoderByType("audio/mpeg")
decoder.configure(format, null, null, 0)
decoder.setCallback(object : MediaCodec.Callback() {
    override fun onInputBufferAvailable(mediaCodec: MediaCodec, i: Int) {
        while (true) {
            if (circularArray!!.size() > 0) {
                val data: ByteArray = circularArray.popFirst()
                val info = MediaCodec.BufferInfo()
                val buffer = mediaCodec.getInputBuffer(i)
                buffer!!.put(data, 0, data.size)
                mediaCodec.queueInputBuffer(i, 0, data.size, 0, 0)
                break
            }
        }
    }

    override fun onOutputBufferAvailable(mediaCodec: MediaCodec, i: Int, info: MediaCodec.BufferInfo) {
        // DECODING PACKET ENDED
        val outBuffer = mediaCodec.getOutputBuffer(i)
        val chunk = ByteArray(info.size)
        outBuffer!![chunk] // Read the buffer all at once
        outBuffer!!.clear()
        Log.d(TAG, "onOutputBufferAvailable: ${info.size}")
        audioTrack!!.write(chunk, info.offset, info.offset + info.size)
        mediaCodec.releaseOutputBuffer(i, false)
    }

    override fun onError(mediaCodec: MediaCodec, e: MediaCodec.CodecException) {}

    override fun onOutputFormatChanged(mediaCodec: MediaCodec, mediaFormat: MediaFormat) {}
})
decoder.start()
I converted my file like this
val tempBuf = ByteArray(1000)
var byteRead: Int
try {
    val bufferedInputStream = BufferedInputStream(FileInputStream(mp3File))
    while (bufferedInputStream.read(tempBuf).also { byteRead = it } != -1) {
        circularArray.addLast(tempBuf.copyOf())
    }
    bufferedInputStream.close()
    Thread(aacDecoderAndPlayRunnable).start()
} catch (e: java.lang.Exception) {
    Log.d(TAG, "fileToInputStream: ${e.message}")
    e.printStackTrace()
    null
}
The app crashes with the IllegalStateException described above. Even if I try to get the format from mediaCodec inside the callback, it throws an exception and crashes anyway. I also checked the codec's supportedTypes, and it does support audio/mpeg.
First of all, MediaCodec works with a queue of input buffers; you can read more about this in the docs.

The second parameter of the onInputBufferAvailable callback is the index of the buffer. When calling getInputBuffer() you must pass this index rather than a hard-coded 0:

val buffer = mediaCodec.getInputBuffer(i)

Second, consider using MediaExtractor instead of reading the file yourself. It supplies you with presentation timestamps and flags to pass into queueInputBuffer().

Third, remove the while (true) loop: you can only queue one buffer per callback invocation. A sketch of the combined fixes follows.
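A minimal sketch of the suggested approach, as an illustration rather than drop-in code; mp3File, audioTrack, and TAG are assumptions carried over from the question. MediaExtractor feeds the codec exactly one sample per callback, with real timestamps and flags. Note that setCallback() must be called before configure() for asynchronous mode:

val extractor = MediaExtractor().apply {
    setDataSource(mp3File.absolutePath)
    selectTrack(0) // assumes the first track is the mp3 audio track
}
val format = extractor.getTrackFormat(0)

val decoder = MediaCodec.createDecoderByType("audio/mpeg")
// setCallback() must precede configure() when using asynchronous mode
decoder.setCallback(object : MediaCodec.Callback() {
    override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
        val buffer = codec.getInputBuffer(index) ?: return
        val size = extractor.readSampleData(buffer, 0)
        if (size < 0) {
            // no more samples: signal end of stream
            codec.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
        } else {
            // one buffer per callback, with the extractor's real timestamp
            codec.queueInputBuffer(index, 0, size, extractor.sampleTime, 0)
            extractor.advance()
        }
    }

    override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
        val outBuffer = codec.getOutputBuffer(index) ?: return
        val chunk = ByteArray(info.size)
        outBuffer.get(chunk)
        audioTrack.write(chunk, 0, info.size)
        codec.releaseOutputBuffer(index, false)
    }

    override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
        Log.e(TAG, "decoder error", e)
    }

    override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {}
})
decoder.configure(format, null, null, 0)
decoder.start()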
I am trying to write the audio signal recorded from the phone microphone to a PCM file, using the Android AudioRecord class. The PCM files I get are too small, and when I convert them to WAV there is only a click sound (10-20 ms). The file size is roughly a couple of KB per minute. Please help! Changing the buffer size does not help.
val SAMPLE_RATE = 44100
val AUDIO_SOURCE = MediaRecorder.AudioSource.MIC
val CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_STEREO
val AUDIO_FORMAT = AudioFormat.ENCODING_PCM_16BIT
val BUFFER_SIZE_RECORDING = AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_CONFIG, AUDIO_FORMAT)
fun startRecording(dir: String) {
    isRecordingAudio = true
    if (ActivityCompat.checkSelfPermission(
            fragmentActivitySender,
            Manifest.permission.RECORD_AUDIO
        ) != PackageManager.PERMISSION_GRANTED
    ) {
        return
    }
    recorder = AudioRecord(
        AUDIO_SOURCE, SAMPLE_RATE, CHANNEL_CONFIG, AUDIO_FORMAT,
        BUFFER_SIZE_RECORDING
    )
    if (this::recorder.isInitialized) {
        recorder.startRecording()
        recordingThread = thread(true) {
            writeAudioDataToFile(dir)
        }
    }
}
private fun writeAudioDataToFile(dir: String) {
    val audioBuffer = ByteArray(BUFFER_SIZE_RECORDING)
    val outputStream: FileOutputStream?
    try {
        outputStream = FileOutputStream(dir)
    } catch (e: FileNotFoundException) {
        return
    }
    while (isRecordingAudio) {
        val read = recorder.read(audioBuffer, 0, BUFFER_SIZE_RECORDING)
        try {
            outputStream.write(read)
            // clean up file writing operations
        } catch (e: IOException) {
            e.printStackTrace()
        }
    }
    try {
        outputStream.flush()
        outputStream.close()
    } catch (e: IOException) {
        Log.e(ContentValues.TAG, "exception while closing output stream $e")
        e.printStackTrace()
    }
}
First thing: you are flushing the file inside the loop, and that is wrong. Move outputStream.flush() outside of the loop, to the line just above outputStream.close(); read more about it here.

Second issue: you need to write audioBuffer, not the read variable. read only holds the return value of the read() call (the number of bytes read, or an error code); the actual audio data is in the buffer. A corrected loop is sketched below.
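Putting both fixes together, the loop body would look roughly like this (note that write(read) as originally written emits a single byte whose value is the return code):

while (isRecordingAudio) {
    val read = recorder.read(audioBuffer, 0, BUFFER_SIZE_RECORDING)
    if (read > 0) {
        try {
            // write the bytes actually captured into audioBuffer
            outputStream.write(audioBuffer, 0, read)
        } catch (e: IOException) {
            e.printStackTrace()
        }
    }
}
// flush and close exactly once, after the recording loop ends
outputStream.flush()
outputStream.close()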
Firstly, I know using OpenGL ES would be more efficient, but it is not an option here.

So, given that the user should be able to save frames at their original size from an H.264 live stream, there are two scenarios; which one is better performance-wise?

Using MediaCodec in asynchronous mode, getting the YUV image and showing it on an ImageView. (Does this have overhead compared to the second option?)

Using MediaCodec in synchronous mode, setting a TextureView's surface as the MediaCodec's output surface in configure(), and calling textureView.getBitmap() whenever the user wants a screenshot.

(SurfaceView is ruled out: you cannot retrieve the frame bitmap after rendering, because it is a pure output element.)
Code for option 1:
val frame = ... // ByteArray from server

mediaCodec.setCallback(object : MediaCodec.Callback() {
    override fun onInputBufferAvailable(_codec: MediaCodec, index: Int) {
        try {
            val buffer = _codec.getInputBuffer(index)
            buffer?.put(frame)
            mediaCodec.queueInputBuffer(index, 0, frame.size, 0, 0)
        } catch (e: Exception) {
            try {
                _codec.flush()
            } catch (e: Exception) {
            }
        }
    }

    override fun onOutputBufferAvailable(_codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
        try {
            val outputIndex = index
            val image: Image? = _codec.getOutputImage(outputIndex)
            if (image == null) {
                return
            }
            val rect = image.cropRect
            val yuvImage = YuvImage(
                YUV_420_888toNV21(image),
                ImageFormat.NV21,
                rect.width(),
                rect.height(),
                null
            )
            val stream = ByteArrayOutputStream()
            yuvImage.compressToJpeg(Rect(0, 0, rect.width(), rect.height()), 100, stream)
            frameBitmap = BitmapFactory.decodeByteArray(stream.toByteArray(), 0, stream.size())
            imageView.setImageBitmap(frameBitmap)
            _codec.stop()
            stream.close()
            image.close()
            if (outputIndex >= 0) {
                _codec.releaseOutputBuffer(outputIndex, false)
            }
        } catch (e: Exception) {
        }
    }

    override fun onError(_codec: MediaCodec, e: MediaCodec.CodecException) {
    }

    override fun onOutputFormatChanged(_codec: MediaCodec, format: MediaFormat) {
    }
})

try {
    mediaCodec.start()
} catch (e: Exception) {
    mediaCodec.flush()
}
Code for option 2:
val frame = ... // ByteArray from server

try {
    val index = mediaCodec.dequeueInputBuffer(-1)
    if (index >= 0) {
        val buffer = mediaCodec.getInputBuffer(index)
        buffer?.put(frame)
        mediaCodec.queueInputBuffer(index, 0, frame.size, 0, 0)
        val info = MediaCodec.BufferInfo()
        val outputIndex = mediaCodec.dequeueOutputBuffer(info, 0)
        if (outputIndex >= 0) {
            mediaCodec.releaseOutputBuffer(outputIndex, true)
            lastRenderTime = System.currentTimeMillis()
        }
    } else {
    }
} catch (e: Exception) {
    // mediaCodec.flush()
}
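For reference, the screenshot path in option 2 is just a call on the view itself; a minimal sketch, where textureView and the saveBitmap() helper are assumptions, not part of the question's code:

// Option 2: TextureView can return its currently displayed frame as a Bitmap.
fun captureFrame() {
    val bitmap: Bitmap? = textureView.getBitmap(textureView.width, textureView.height)
    if (bitmap != null) {
        saveBitmap(bitmap) // hypothetical helper that persists the screenshot
    }
}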
I'm trying to produce an mp4 file. I shoot video using the camera2 API and can save it as a raw AVC file using MediaCodec. But I do not understand how to rework this code to encode into an mp4 file using MediaMuxer. Sorry for my English; this was translated with a translator.
private class EncoderCallback : MediaCodec.Callback() {
    override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
    }

    override fun onOutputBufferAvailable(codec: MediaCodec, index2: Int, info: MediaCodec.BufferInfo) {
        outPutByteBuffer = mCodec!!.getOutputBuffer(index2)
        val outDate = ByteArray(info.size)
        outPutByteBuffer!![outDate]
        try {
            Log.i("EncoderCallBack", " outDate.length : " + outDate.size)
            outputStream!!.write(outDate, 0, outDate.size)
        } catch (e: IOException) {
            e.printStackTrace()
        }
        mCodec!!.releaseOutputBuffer(index2, false)
    }

    override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
        Log.i("EncoderCallBack", "Error: $e")
    }

    override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
        Log.i("EncoderCallBack", "encoder output format changed: $format")
    }
}
After initializing the MediaCodec, I record the video:

var texture: SurfaceTexture = textureViewOver
texture.setDefaultBufferSize(320, 240)
surface = Surface(texture)
builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
builder.addTarget(surface)
builder.addTarget(mEncoderSurface!!)
mCameraDevice.createCaptureSession(
    mutableListOf(surface, mEncoderSurface),
    object : CameraCaptureSession.StateCallback() { ...
The muxer code is what's missing:

override fun onOutputBufferAvailable(codec: MediaCodec, index2: Int, info: MediaCodec.BufferInfo) {
    outPutByteBuffer = mCodec!!.getOutputBuffer(index2)
    mediaMuxer?.writeSampleData(trackIndex, outPutByteBuffer!!, info)
    mCodec!!.releaseOutputBuffer(index2, false)
}
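For that snippet to work, the muxer has to be created, given its track in onOutputFormatChanged(), and started before any writeSampleData() call. A minimal sketch of the lifecycle; outputPath is a hypothetical file path, and a single video track is assumed:

private val mediaMuxer = MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)
private var trackIndex = -1
private var muxerStarted = false

override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
    // The encoder delivers its final format (including csd-0/csd-1) here;
    // this is the right moment to add the track and start the muxer.
    trackIndex = mediaMuxer.addTrack(format)
    mediaMuxer.start()
    muxerStarted = true
}

override fun onOutputBufferAvailable(codec: MediaCodec, index2: Int, info: MediaCodec.BufferInfo) {
    val buffer = codec.getOutputBuffer(index2) ?: return
    if (info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG != 0) {
        // SPS/PPS are already carried by the track format; skip them as samples.
        info.size = 0
    }
    if (muxerStarted && info.size > 0) {
        mediaMuxer.writeSampleData(trackIndex, buffer, info)
    }
    codec.releaseOutputBuffer(index2, false)
    if (info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) {
        mediaMuxer.stop()
        mediaMuxer.release()
    }
}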
I am building a streaming app and am facing a problem. I want to live-stream the camera feed to the server, and I expect to get ByteBuffers in onOutputBufferAvailable(). I am getting output buffers, but I never receive MediaCodec.BUFFER_FLAG_END_OF_STREAM when I call stopVideoCapture().

Here are the code segments.

Creating the MediaCodec:
private val recorderStreamSurface by lazy {
    val format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, width, height)
    val frameRate = 30 // 30 fps
    var recorderStreamSurface: Surface? = null

    // Set some required properties. The media codec may fail if these aren't defined.
    format.setInteger(
        MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
    )
    format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000) // 6 Mbps
    format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
    format.setInteger(MediaFormat.KEY_CAPTURE_RATE, frameRate)
    format.setInteger(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 1000000 / frameRate)
    format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1)
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1) // 1 second between I-frames

    videoEncoder = MediaCodec.createEncoderByType(VIDEO_MIME_TYPE)
    // Create a MediaCodec encoder and configure it. Get a Surface we can use for recording into.
    try {
        videoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        recorderStreamSurface = videoEncoder.createInputSurface()
        videoEncoder.setCallback(object : MediaCodec.Callback() {
            override fun onError(codec: MediaCodec, exception: MediaCodec.CodecException) {
                Log.d(TAG, "==onError $codec $exception")
                serverChannel.onError(exception)
            }

            override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
                Log.d(TAG, "video encoder: output format changed")
            }

            override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
                Log.d(TAG, "video encoder: returned input buffer: $index")
                val frameData: ByteArray = queue.take().array()
                val inputData = codec.getInputBuffer(index)
                inputData!!.clear()
                inputData.put(frameData)
                codec.queueInputBuffer(index, 0, frameData.size, 0, 0)
            }

            override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
                Log.d(TAG, "video encoder: returned output buffer: $index flag : ${info.flags}")
                Log.d(TAG, "video encoder: returned buffer of size " + info.size)
                if ((info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.i(TAG, "serverChannel.onCompleted()1")
                }
                videoEncoder.releaseOutputBuffer(index, false)
            }
        })
        videoEncoder.start()
    } catch (e: IOException) {
        videoEncoder.stop()
        videoEncoder.release()
        serverChannel.onError(e)
    }
    recorderStreamSurface
}
Local variables:
lateinit var videoEncoder: MediaCodec
val queue: ArrayBlockingQueue<ByteBuffer> = ArrayBlockingQueue<ByteBuffer>(10)
val targets by lazy { listOf(viewFinder.holder.surface, recorderStreamSurface!!) }
private const val VIDEO_MIME_TYPE = "video/avc"
val cameraId = "1"
val fps = 30
val width = 1080
val height = 1920
Record Request
private val recordRequest: CaptureRequest by lazy {
    // Capture request holds references to target surfaces
    session.device.createCaptureRequest(CameraDevice.TEMPLATE_RECORD).apply {
        // Add the preview and recording surface targets
        for (target: Surface in targets) {
            addTarget(target)
        }
        // Sets user requested FPS for all targets
        set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(fps, fps))
    }.build()
}
And finally, starting and stopping the recording:
private fun startVideoCapture() {
    // Prevents screen rotation during the video recording
    requireActivity().requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_LOCKED

    session.setRepeatingRequest(previewRequest, null, cameraHandler)

    // Start recording repeating requests, which will stop the ongoing preview
    // repeating requests without having to explicitly call `session.stopRepeating`
    session.setRepeatingRequest(recordRequest, null, cameraHandler)
    recordingStartMillis = System.currentTimeMillis()
    Log.d(TAG, "Recording started")
}

private fun stopVideoCapture() {
    // Unlocks screen rotation after recording finished
    requireActivity().requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED

    videoEncoder.stop()
    videoEncoder.release()
    Log.d(TAG, "Recording stopped")
    session.setRepeatingRequest(previewRequest, null, cameraHandler)
}
You must pass the BUFFER_FLAG_END_OF_STREAM flag with the last data you queue for encoding:

codec.queueInputBuffer(index, 0, frameData.size, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
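Note that this applies to the ByteBuffer input path. Since the question's encoder is fed via createInputSurface(), the end of stream has to be signalled on the codec itself instead. A sketch of how stopVideoCapture() might change, assuming stop()/release() are deferred until the EOS buffer is observed in onOutputBufferAvailable:

private fun stopVideoCapture() {
    requireActivity().requestedOrientation = ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED

    // With an input Surface there is no queueInputBuffer() call to attach the
    // flag to; signalEndOfInputStream() makes the encoder emit
    // BUFFER_FLAG_END_OF_STREAM on its output side.
    videoEncoder.signalEndOfInputStream()

    // Defer videoEncoder.stop()/release() until onOutputBufferAvailable sees
    // the EOS flag; stopping immediately discards any pending output buffers.
    session.setRepeatingRequest(previewRequest, null, cameraHandler)
}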