I am decoding an mp3 file. First I convert the mp3 file into chunks of ByteArray of size 1000, put them in a circularArray, and then pass them to the MediaCodec callback for decoding (one ByteArray at a time); I follow this link. It works fine on Samsung devices, but on devices other than Samsung (Vivo, Pixel 3a) it crashes at mediaCodec.getInputBuffer(index) in the onInputBufferAvailable callback with an IllegalStateException. My code is as follows:
var decoder: MediaCodec = MediaCodec.createDecoderByType("audio/mpeg")
decoder.configure(format, null, null, 0)
decoder.setCallback(object : MediaCodec.Callback() {
    override fun onInputBufferAvailable(mediaCodec: MediaCodec, i: Int) {
        while (true) {
            if (circularArray!!.size() > 0) {
                val data: ByteArray = circularArray.popFirst()
                val info = MediaCodec.BufferInfo()
                val buffer = mediaCodec.getInputBuffer(i)
                buffer!!.put(data, 0, data.size)
                mediaCodec.queueInputBuffer(i, 0, data.size, 0, 0)
                break
            }
        }
    }

    override fun onOutputBufferAvailable(mediaCodec: MediaCodec, i: Int, info: MediaCodec.BufferInfo) {
        // DECODING PACKET ENDED
        val outBuffer = mediaCodec.getOutputBuffer(i)
        val chunk = ByteArray(info.size)
        outBuffer!!.get(chunk) // Read the buffer all at once
        outBuffer.clear()
        Log.d(TAG, "onOutputBufferAvailable: ${info.size}")
        audioTrack!!.write(chunk, info.offset, info.offset + info.size)
        mediaCodec.releaseOutputBuffer(i, false)
    }

    override fun onError(mediaCodec: MediaCodec, e: MediaCodec.CodecException) {}
    override fun onOutputFormatChanged(mediaCodec: MediaCodec, mediaFormat: MediaFormat) {}
})
decoder.start()
I converted my file like this:

val tempBuf = ByteArray(1000)
var byteRead: Int
try {
    val bufferedInputStream = BufferedInputStream(FileInputStream(mp3File))
    while (bufferedInputStream.read(tempBuf).also { byteRead = it } != -1) {
        circularArray.addLast(tempBuf.copyOf())
    }
    bufferedInputStream.close()
    Thread(aacDecoderAndPlayRunnable).start()
} catch (e: java.lang.Exception) {
    Log.d(TAG, "fileToInputStream: ${e.message}")
    e.printStackTrace()
}
The exception where the app crashes is the IllegalStateException mentioned above.

Even if I try to get the format from mediaCodec in the callback, it throws an exception and crashes anyway. I also checked the supported types from the codec; it does support audio/mpeg.
First of all, MediaCodec works with a queue of input buffers; you can read more about it in the docs.
The second parameter of the onInputBufferAvailable callback is the index of the buffer. When calling getInputBuffer() you must pass this index instead of 0:
val buffer = mediaCodec.getInputBuffer(i)
Second, consider using MediaExtractor instead of reading the file yourself. It supplies you with presentation timestamps and flags to pass into queueInputBuffer().
Third, you need to remove the while (true) loop. You can only queue one buffer per callback.
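Putting these points together, here is a minimal sketch of an extractor-driven async decoder. This is a sketch, not your exact code: mp3Path, audioTrack, and TAG are assumed to exist, and error handling is omitted.

val extractor = MediaExtractor()
extractor.setDataSource(mp3Path)         // assumed path to the mp3 file
val format = extractor.getTrackFormat(0) // assume track 0 is the audio track
extractor.selectTrack(0)

val decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME)!!)
// In asynchronous mode the callback must be registered before configure()
decoder.setCallback(object : MediaCodec.Callback() {
    override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
        val buffer = codec.getInputBuffer(index) ?: return
        val size = extractor.readSampleData(buffer, 0)
        if (size < 0) {
            // No more samples: signal end of stream, don't loop or re-queue
            codec.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
        } else {
            codec.queueInputBuffer(index, 0, size, extractor.sampleTime, extractor.sampleFlags)
            extractor.advance()
        }
    }

    override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
        val outBuffer = codec.getOutputBuffer(index) ?: return
        val chunk = ByteArray(info.size)
        outBuffer.get(chunk)
        audioTrack.write(chunk, 0, chunk.size)
        codec.releaseOutputBuffer(index, false)
    }

    override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
        Log.e(TAG, "decoder error", e)
    }

    override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {}
})
decoder.configure(format, null, null, 0)
decoder.start()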
I am getting some audio streaming data as a base64 String. I convert it into a ByteArray and write it to a local mp3 file, to play in MediaPlayer. But MediaPlayer throws error (1, -2147483648). I tried many SO posts but nothing works. How do I solve this?

**What I am trying to do is fetch the base64 string, save it locally, and play it.**
val file = requireContext().getExternalFilesDir(null)?.absolutePath + "/audioRecording1.mp3"
val mediaPlayer = MediaPlayer()
try {
    val output = FileOutputStream(file)
    output.write(mp3SoundByteArray)
    output.close()
    val fis = FileInputStream(file)
    mediaPlayer.setDataSource(fis.fd)
    fis.close()
    mediaPlayer.setAudioAttributes(
        AudioAttributes.Builder()
            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .build()
    )
    mediaPlayer.prepareAsync()
    mediaPlayer.setOnPreparedListener {
        mediaPlayer.start()
    }
    mediaPlayer.setOnErrorListener { mediaPlayer, i, i2 ->
        Log.v("", "$i, $i2")
        true
    }
} catch (e: Exception) {
    toast(e.message!!)
}
Could you please tell me how to overcome this?

I am not sure, but it seems that you have trouble with saving the file:
fun saveFile(responseBody: ResponseBody?, pathWhereYouWantToSaveFile: String) {
    val body = responseBody ?: return
    var input: InputStream? = null
    try {
        val uri = Uri.parse(pathWhereYouWantToSaveFile)
        input = body.byteStream()
        val parcelFileDescriptor =
            context.contentResolver.openFileDescriptor(uri, FileConst.WRITE_MODE)
        val fileOutputStream = FileOutputStream(parcelFileDescriptor?.fileDescriptor)
        fileOutputStream.use { output ->
            val buffer = ByteArray(BUFFER_SIZE)
            var read: Int
            while (input.read(buffer).also { read = it } != END_OF_FILE) {
                output.write(buffer, START_OFFSET, read)
            }
            output.flush()
        }
    } catch (exception: Exception) {
        logErrorIfDebug(exception)
    } finally {
        input?.close()
    }
}
const val READ_MODE = "r"
const val WRITE_MODE = "w"
const val START_OFFSET = 0
const val END_OF_FILE = -1
const val BYTES_IN_KILOBYTE = 1024 // assumed value; this constant was missing from the snippet
const val BUFFER_SIZE = 4 * BYTES_IN_KILOBYTE
Try this in your ViewModel or data source layer, then send the result to the UI layer and use it there.

Have you checked that your file is saved correctly? You can go to the directory and try to open the file. If everything is okay, you can pass it by URI to your media player.

Also check whether you are using one path for saving and a different one for retrieving.

A better way to play media is ExoPlayer: https://exoplayer.dev/

But the native library can also work with an internal URI path.
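For illustration, a hypothetical call site (the downloadAudio() endpoint and audioFileUri are made-up names, not from the original post) could look like:

// Hypothetical Retrofit endpoint returning Call<ResponseBody>
val response = api.downloadAudio().execute()
saveFile(response.body(), audioFileUri)

// Later, hand the same URI to the player
mediaPlayer.setDataSource(context, Uri.parse(audioFileUri))
mediaPlayer.prepareAsync()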
If you just take a random part of a base64-encoded audio stream, then after decoding your ByteArray will contain a fragment of an audio file: some audio stream bytes, not a complete, valid mp3 file with headers and such.

If you had said "I am getting an mp3 file in one base64 String", then your approach would be OK.
I have solved the issue, without writing any header, in the following way:

val clipData = android.util.Base64.decode(data, 0)
val output = FileOutputStream(file, true) // append each decoded chunk
output.write(clipData)
output.close()
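Opening the FileOutputStream with the second argument true makes it append, so each decoded chunk is written after the previous one, and the 0 passed to Base64.decode() is Base64.DEFAULT. As the previous answer notes, the result is only playable if the concatenated chunks form one complete mp3 stream.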
I am decoding H.264 video data received through a websocket with an async Android MediaCodec.
My question is how to handle input buffers when there is no new encoded data to decode?
Currently I queue the input buffers without any data when there is no new encoded data from the websocket:
override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
    if (videoFrames.isNotEmpty()) {
        val currentFrame = videoFrames.removeFirst()
        val inputBuffer = codec.getInputBuffer(index)
        inputBuffer?.put(currentFrame)
        codec.queueInputBuffer(index, 0, currentFrame.size, 0, 0)
    } else {
        codec.queueInputBuffer(index, 0, 0, 0, 0)
    }
}
where videoFrames is a list of frames with encoded video data.
I do it like this because if I instead remove the else clause and don't queue the empty input buffer, the input buffer never becomes available again.
Is there a different way of handling this? It feels wrong to me to queue an empty input buffer.
An alternative solution that seemed to increase the decoder's performance a bit was to store the available input buffer indices in a queue and fetch them when new encoded video data is received. Something like the following:

val videoFrames = ConcurrentList<ByteArray>()
val availableBuffers = mutableListOf<Int>()
var decoder: MediaCodec? = null

fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
    // Don't queue anything yet; just remember the free buffer index
    availableBuffers.add(index)
}

fun processFrame(frame: ByteArray) {
    var frameToProcess = frame
    if (videoFrames.isNotEmpty()) {
        // Keep arrival order: enqueue the new frame and take the oldest one
        videoFrames.add(frame)
        frameToProcess = videoFrames.removeFirst()
    }
    if (availableBuffers.isNotEmpty() && decoder != null) {
        val bufferIndex = availableBuffers.removeFirst()
        val inputBuffer = decoder!!.getInputBuffer(bufferIndex)
        inputBuffer?.put(frameToProcess)
        decoder!!.queueInputBuffer(bufferIndex, 0, frameToProcess.size, 0, 0)
    } else {
        // No free input buffer right now; keep the frame for later
        videoFrames.add(frameToProcess)
    }
}
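A caveat not covered in the answer: onInputBufferAvailable runs on the codec's callback thread, while processFrame presumably runs on the websocket's receive thread, so availableBuffers should also be a thread-safe collection. For example, a ConcurrentLinkedQueue could replace the plain list:

import java.util.concurrent.ConcurrentLinkedQueue

val availableBuffers = ConcurrentLinkedQueue<Int>()

fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
    availableBuffers.offer(index)
}

// In processFrame: poll() returns null instead of throwing when the queue is empty
val bufferIndex = availableBuffers.poll() ?: return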
I'm trying to create an Android app that connects to the Doorbird device. I know the company's official app, but I need more features, tailored to my needs.

For someone who doesn't know what a Doorbird device is: Doorbird is a smart intercom made by the Doorbird company. The device can transmit audio and video to any consumer, like an Android system, over HTTP and RTSP, and it can receive an audio stream and play it, for example to let an Android device record audio and transmit it to Doorbird. The audio is in G.711 µ-law format.

I was able to receive the video and audio streams from Doorbird, and that works perfectly, but I do not succeed in transmitting audio, in the µ-law format of course, to Doorbird.
The error I get is
HTTP FAILED: java.net.ProtocolException: Unexpected status line:
I tried to transmit the same bytes I get from Doorbird back to Doorbird, but I still get the same error.

Of course, I work according to the API they published, but there is not much information about the agreed protocol for transmitting audio.

Official Doorbird API

Is there an example of an Android project that integrates with Doorbird?

Can anyone help with broadcasting audio to Doorbird?

Which protocol should it be?

I'd appreciate help even from someone who knows how to transmit audio to Doorbird with other tools or systems, not just Android OS.

This is what I tried: I received the data from Doorbird (and as I said, that works), waited 3 seconds, and transmitted it back to Doorbird with the Retrofit library.
const val AUDIO_PATH =
    "http://192.168.1.187/bha-api/audio-receive.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX"

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    //InputStream inputStream = getResources().openRawResource(R.raw.piano12);
    val thread = Thread { this.playUrl() }
    thread.start()
    //val inStr = assets.open("doorbird_record")
}

private fun playUrl() {
    val inStr = URL(AUDIO_PATH).openStream()
    val buffer = ByteArray(1000)
    var i = 0
    //while (inStr.read(buffer).also { i = it } != -1) {
    Handler(Looper.getMainLooper()).postDelayed({
        //inStr.close()
        inStr.read(buffer)
        Log.d("DoorbirdLog", inStr.toString())
        val part = MultipartBody.Part.createFormData(
            "doorbirdStream", "doorbird", buffer.toRequestBody(
                ("audio/basic").toMediaType()
            )
        )
        //val rb = file.asRequestBody(("audio/*").toMediaType())
        val call = NetworkManager.instanceServiceApi.upload(part)
        call.enqueue(object : Callback<ResponseBody> {
            override fun onResponse(
                call: Call<ResponseBody>,
                response: Response<ResponseBody>
            ) {
                val i = response.body()
                Log.d("success", i.toString())
            }

            override fun onFailure(call: Call<ResponseBody>, t: Throwable) {
                Log.d("failed", t.message.toString())
            }
        })
    }, 3000)
}
And the Retrofit instance:
@Multipart
@Headers(
    "Content-Type: audio/basic",
    "Content-Length: 9999999",
    "Connection: Keep-Alive",
    "Cache-Control: no-cache"
)
@POST("audio-transmit.cgi?http-user=XXXXXX0001&http-password=XXXXXXXXXX")
fun upload(@Part part: MultipartBody.Part): Call<ResponseBody>
I'd appreciate your assistance
Eventually, I was able to find a solution. I'll briefly present it here for anyone else who needs to integrate with Doorbird.
private const val FREQUENCY_SAMPLE_RATE_TRANSMIT = 8000
private const val RECORD_STATE_STOPPED = 0

override suspend fun recordAndTransmitAudio(audioTransmitUrl: String) =
    withContext(Dispatchers.IO) {
        val minBufferSize = AudioRecord.getMinBufferSize(
            FREQUENCY_SAMPLE_RATE_TRANSMIT, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT
        )
        mRecorder = AudioRecord(
            MediaRecorder.AudioSource.VOICE_COMMUNICATION,
            FREQUENCY_SAMPLE_RATE_TRANSMIT, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, minBufferSize
        )
        mRecorder?.let { enableAcousticEchoCanceler(it.audioSessionId) }
        mRecorder?.startRecording()
        val bufferShort = ShortArray(minBufferSize)
        val buffer = ByteArray(minBufferSize)
        val urlConnection = URL(audioTransmitUrl).openConnection() as HttpURLConnection
        urlConnection.apply {
            doOutput = true
            setChunkedStreamingMode(minBufferSize)
        }
        val output = DataOutputStream(urlConnection.outputStream)
        output.flush()
        try {
            mRecorder?.let { recorder ->
                while (recorder.read(bufferShort, 0, bufferShort.size) != RECORD_STATE_STOPPED) {
                    G711UCodecManager.encode(bufferShort, minBufferSize, buffer, 0)
                    output.write(buffer)
                }
            }
        } catch (e: Exception) {
            Log.d(TAG, e.message.toString())
        }
        output.close()
        urlConnection.disconnect()
    }
First, we prepare the necessary parameters for recording and transmission:

- Get the minimum buffer size for recording
- Define the AudioRecord object we will record with
- Enable acoustic echo cancellation
- Start recording
- Open a connection to the transmit URL
- Loop while the recording has not stopped
- Encode the recorded data from 16-bit PCM to G.711 µ-law

And of course, after the recording is finished, clean up the resources.
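The G711UCodecManager helper is not shown in the answer. As a point of reference, a minimal G.711 µ-law encoder with the same call signature could look like the sketch below; it is the standard clip, bias, and compress algorithm, and the object name simply mirrors the call above:

// Hypothetical stand-in for the poster's G711UCodecManager helper
object G711UCodecManager {
    private const val BIAS = 0x84   // standard µ-law bias (132)
    private const val CLIP = 32635  // clip to avoid overflow after biasing

    /** Encode [samples] 16-bit PCM values from [input] into µ-law bytes in [output]. */
    fun encode(input: ShortArray, samples: Int, output: ByteArray, offset: Int) {
        for (i in 0 until samples) output[offset + i] = linearToMuLaw(input[i])
    }

    private fun linearToMuLaw(pcm16: Short): Byte {
        var pcm = pcm16.toInt()
        val sign = if (pcm < 0) { pcm = -pcm; 0x80 } else 0
        if (pcm > CLIP) pcm = CLIP
        pcm += BIAS
        // Find the segment (exponent) of the biased magnitude
        var exponent = 7
        var mask = 0x4000
        while (pcm and mask == 0 && exponent > 0) {
            exponent--
            mask = mask shr 1
        }
        val mantissa = (pcm shr (exponent + 3)) and 0x0F
        // µ-law bytes are transmitted inverted
        return (sign or (exponent shl 4) or mantissa).inv().toByte()
    }
}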
I am building a streaming app and I am facing a problem. Here is the code.

I want to live-stream the camera feed to the server, and I expect to get ByteBuffers in onOutputBufferAvailable(). I am getting output buffers, but I never get MediaCodec.BUFFER_FLAG_END_OF_STREAM when I call stopVideoCapture().

Here are the code segments.
Creating Media Codec
private val recorderStreamSurface by lazy {
    val format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, width, height)
    val frameRate = 30 // 30 fps
    var recorderStreamSurface: Surface? = null
    // Set some required properties. The media codec may fail if these aren't defined.
    format.setInteger(
        MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface
    )
    format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000) // 6Mbps
    format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate)
    format.setInteger(MediaFormat.KEY_CAPTURE_RATE, frameRate)
    format.setInteger(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 1000000 / frameRate)
    format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1)
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1) // 1 second between I-frames
    videoEncoder = MediaCodec.createEncoderByType(VIDEO_MIME_TYPE)
    // Create a MediaCodec encoder and configure it. Get a Surface we can use for recording into.
    try {
        videoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        recorderStreamSurface = videoEncoder.createInputSurface()
        videoEncoder.setCallback(object : MediaCodec.Callback() {
            override fun onError(codec: MediaCodec, exception: MediaCodec.CodecException) {
                Log.d(TAG, "==onError $codec $exception")
                serverChannel.onError(exception)
            }

            override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
                Log.d(TAG, "video encoder: output format changed")
            }

            override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
                Log.d(TAG, "video encoder: returned input buffer: $index")
                val frameData: ByteArray = queue.take().array()
                val inputData = codec.getInputBuffer(index)
                inputData!!.clear()
                inputData.put(frameData)
                codec.queueInputBuffer(index, 0, frameData.size, 0, 0)
            }

            override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
                Log.d(TAG, "video encoder: returned output buffer: $index flag : ${info.flags}")
                Log.d(TAG, "video encoder: returned buffer of size " + info.size)
                if ((info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.i(TAG, "serverChannel.onCompleted()1")
                }
                videoEncoder.releaseOutputBuffer(index, false)
            }
        })
        videoEncoder.start()
    } catch (e: IOException) {
        videoEncoder.stop()
        videoEncoder.release()
        serverChannel.onError(e)
    }
    recorderStreamSurface
}
Local variables
lateinit var videoEncoder: MediaCodec
val queue: ArrayBlockingQueue<ByteBuffer> = ArrayBlockingQueue<ByteBuffer>(10)
val targets by lazy { listOf(viewFinder.holder.surface, recorderStreamSurface!!) }
private const val VIDEO_MIME_TYPE = "video/avc"
val cameraId = "1"
val fps = 30
val width = 1080
val height = 1920
Record Request
private val recordRequest: CaptureRequest by lazy {
    // Capture request holds references to target surfaces
    session.device.createCaptureRequest(CameraDevice.TEMPLATE_RECORD).apply {
        // Add the preview and recording surface targets
        for (target: Surface in targets) {
            addTarget(target)
        }
        // Sets user requested FPS for all targets
        set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(fps, fps))
    }.build()
}
And finally, start and stop recording:
private fun startVideoCapture() {
    // Prevents screen rotation during the video recording
    requireActivity().requestedOrientation =
        ActivityInfo.SCREEN_ORIENTATION_LOCKED
    session.setRepeatingRequest(previewRequest, null, cameraHandler)
    // Start recording repeating requests, which will stop the ongoing preview
    // repeating requests without having to explicitly call `session.stopRepeating`
    session.setRepeatingRequest(recordRequest, null, cameraHandler)
    recordingStartMillis = System.currentTimeMillis()
    Log.d(TAG, "Recording started")
}

private fun stopVideoCapture() {
    // Unlocks screen rotation after recording finished
    requireActivity().requestedOrientation =
        ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED
    videoEncoder.stop()
    videoEncoder.release()
    Log.d(TAG, "Recording stopped")
    session.setRepeatingRequest(previewRequest, null, cameraHandler)
}
You must pass the flag BUFFER_FLAG_END_OF_STREAM with the last data to encode:

codec.queueInputBuffer(index, 0, frameData.size, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
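One more point, not stated in the answer: the encoder in the question is fed through an input surface (createInputSurface()), and in that mode the client does not own the input buffers, so onInputBufferAvailable is not the place to deliver EOS. With surface input, the end of stream is signaled on the codec itself. A sketch of how stopVideoCapture() could change under that assumption:

private fun stopVideoCapture() {
    // With an input surface, this queues an EOS that will later arrive in
    // onOutputBufferAvailable() with BUFFER_FLAG_END_OF_STREAM set
    videoEncoder.signalEndOfInputStream()
    // Call videoEncoder.stop() / release() only after that EOS buffer is observed
}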
I'm having trouble finding a way to save a FloatArray buffer of audio data produced by TarsosDSP on Android, using Kotlin. The goal is to keep a buffer of one second of audio that is continuously updated with new data, older data being discarded, and to save that buffer on request.

I've tried to find a solution using the TarsosDSP library, but it wants to write a continuous stream to a wav file; I need it limited to only one second, and saved on demand. This WavFileWriter looked promising -> https://github.com/philburk/jsyn/blob/master/src/com/jsyn/util/WaveFileWriter.java but when I added it to my Android project it turned out to depend on javax packages, which are not supported on Android. Searching for a library that could solve this turned up few results.
private val SAMPLE_RATE = 16000
private val BUFFER_SIZE = 1024
private val SECONDS = 1.0
private val sampleFileName: String = "audio_sample.wav"
private var audioBuffer = FloatArray(SAMPLE_RATE * SECONDS.toInt())
private var dispatcher =
    AudioDispatcherFactory.fromDefaultMicrophone(SAMPLE_RATE, BUFFER_SIZE, 128)

init {
    blankProcessor = object : AudioProcessor {
        override fun processingFinished() {}
        override fun process(audioEvent: AudioEvent): Boolean {
            val buffer = audioEvent.floatBuffer
            val insertPoint = audioBuffer.size - buffer.size
            // Shift the existing samples left by one chunk, then append the new
            // chunk at the end, so audioBuffer always holds the most recent second
            audioBuffer.copyInto(audioBuffer, 0, buffer.size, audioBuffer.size)
            buffer.copyInto(audioBuffer, insertPoint)
            return true
        }
    }
    dispatcher.addAudioProcessor(blankProcessor)
    audioThread = Thread(dispatcher, "Audio Thread")
}

private fun writeWavFile() {
    val file = File(context.cacheDir.absolutePath + "/" + sampleFileName)
    // missing wav write code
}
TarsosDSP offers the WriterProcessor class for writing audio to a file:
https://github.com/JorenSix/TarsosDSP/blob/c26e5004e203ee79be1ec25c2603b1f11b69d276/src/core/be/tarsos/dsp/writer/WriterProcessor.java
Here's your modified example:
private var dispatcher =
    AudioDispatcherFactory.fromDefaultMicrophone(SAMPLE_RATE, BUFFER_SIZE, 128)

init {
    blankProcessor = object : AudioProcessor {
        override fun processingFinished() {}
        override fun process(audioEvent: AudioEvent): Boolean {
            val buffer = audioEvent.floatBuffer
            val insertPoint = audioBuffer.size - buffer.size
            audioBuffer.copyInto(audioBuffer, 0, buffer.size, audioBuffer.size)
            buffer.copyInto(audioBuffer, insertPoint)
            return true
        }
    }
    dispatcher.addAudioProcessor(blankProcessor)

    // The important bit: audioFormat should match the dispatcher's format,
    // e.g. TarsosDSPAudioFormat(SAMPLE_RATE.toFloat(), 16, 1, true, false)
    val outputFile = File(context.filesDir, "file_name")
    val randomAccessFile = RandomAccessFile(outputFile, "rw")
    val fileWriter = WriterProcessor(audioFormat, randomAccessFile)
    dispatcher.addAudioProcessor(fileWriter)

    audioThread = Thread(dispatcher, "Audio Thread")
}
}
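And if you do want to save only the in-memory one-second buffer on demand, rather than streaming everything to a file, note that a WAV file is just a 44-byte RIFF header followed by PCM samples, so it can be written by hand. A minimal sketch of the question's missing writeWavFile(), assuming 16 kHz mono and converting the FloatArray to 16-bit little-endian PCM (java.nio.ByteBuffer/ByteOrder and java.io imports assumed):

private fun writeWavFile() {
    val file = File(context.cacheDir, sampleFileName)
    // Convert float samples in [-1, 1] to 16-bit little-endian PCM
    val pcm = ByteBuffer.allocate(audioBuffer.size * 2).order(ByteOrder.LITTLE_ENDIAN)
    for (sample in audioBuffer) {
        pcm.putShort((sample.coerceIn(-1f, 1f) * Short.MAX_VALUE).toInt().toShort())
    }
    val dataSize = pcm.position()
    // 44-byte canonical WAV header for uncompressed PCM
    val header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN).apply {
        put("RIFF".toByteArray())
        putInt(36 + dataSize)           // remaining chunk size
        put("WAVE".toByteArray())
        put("fmt ".toByteArray())
        putInt(16)                      // fmt chunk size for PCM
        putShort(1)                     // audio format: 1 = PCM
        putShort(1)                     // channels: mono
        putInt(SAMPLE_RATE)
        putInt(SAMPLE_RATE * 2)         // byte rate = sampleRate * channels * 2
        putShort(2)                     // block align = channels * 2
        putShort(16)                    // bits per sample
        put("data".toByteArray())
        putInt(dataSize)
    }
    FileOutputStream(file).use { out ->
        out.write(header.array())
        out.write(pcm.array(), 0, dataSize)
    }
}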