I am trying to write the audio signal recorded by the phone microphone to a PCM file using the Android AudioRecord class. The PCM files I am getting are too small, and when I convert them to WAV there is only a click sound (10-20 ms). The file size is roughly a couple of KB per minute. Please help! Changing the buffer size does not help.
val SAMPLE_RATE = 44100
val AUDIO_SOURCE = MediaRecorder.AudioSource.MIC
val CHANNEL_CONFIG = AudioFormat.CHANNEL_IN_STEREO
val AUDIO_FORMAT = AudioFormat.ENCODING_PCM_16BIT
val BUFFER_SIZE_RECORDING =
AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_CONFIG, AUDIO_FORMAT)
fun startRecording(dir: String) {
isRecordingAudio = true
if (ActivityCompat.checkSelfPermission(
fragmentActivitySender,
Manifest.permission.RECORD_AUDIO
) != PackageManager.PERMISSION_GRANTED
) {
return
}
recorder = AudioRecord(
AUDIO_SOURCE, SAMPLE_RATE, CHANNEL_CONFIG, AUDIO_FORMAT,
BUFFER_SIZE_RECORDING
)
if (this::recorder.isInitialized) {
recorder.startRecording()
recordingThread = thread(true) {
writeAudioDataToFile(dir)
}
}
}
private fun writeAudioDataToFile(dir: String) {
val audioBuffer = ByteArray(BUFFER_SIZE_RECORDING)
val outputStream: FileOutputStream?
try {
outputStream = FileOutputStream(dir)
} catch (e: FileNotFoundException) {
return
}
while (isRecordingAudio) {
val read = recorder.read(audioBuffer,0, BUFFER_SIZE_RECORDING)
try {
outputStream.write(read)
// clean up file writing operations
} catch (e: IOException) {
e.printStackTrace()
}
}
try {
outputStream.flush()
outputStream.close()
} catch (e: IOException) {
Log.e(ContentValues.TAG, "exception while closing output stream $e")
e.printStackTrace()
}
}
First issue: you are flushing the file inside the loop, and that is wrong. Move outputStream.flush() outside the loop, to the line just above outputStream.close().
Second issue: you need to write audioBuffer, not the read variable. read is only the return value of the read() call (the number of bytes read, or an error code); the actual audio data is in the buffer.
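A minimal corrected version of the loop, keeping the names from your code and writing only the bytes that read() actually returned:
while (isRecordingAudio) {
    // read() returns the number of bytes read, or a negative error code
    val read = recorder.read(audioBuffer, 0, BUFFER_SIZE_RECORDING)
    if (read > 0) {
        try {
            // write the audio data itself, sliced to the number of bytes read
            outputStream.write(audioBuffer, 0, read)
        } catch (e: IOException) {
            e.printStackTrace()
        }
    }
}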
I am decoding an mp3 file. First I split the mp3 file into chunks of ByteArray of size 1000, put them in a circularArray, and then pass them to the MediaCodec callback for decoding (one ByteArray at a time), following this link. It works fine on Samsung devices, but on other devices (Vivo, Pixel 3a) it crashes at mediaCodec.getInputBuffer(index) in the onInputBufferAvailable callback with an IllegalStateException. My code is as follows:
var decoder: MediaCodec = MediaCodec.createDecoderByType("audio/mpeg")
decoder.configure(format, null, null, 0)
decoder.setCallback(object : MediaCodec.Callback() {
override fun onInputBufferAvailable(mediaCodec: MediaCodec, i: Int) {
while (true) {
if (circularArray!!.size() > 0) {
val data: ByteArray = circularArray.popFirst()
val info = MediaCodec.BufferInfo()
val buffer = mediaCodec.getInputBuffer(i)
buffer!!.put(data, 0, data.size)
mediaCodec.queueInputBuffer(i, 0, data.size, 0, 0)
break
}
}
}
override fun onOutputBufferAvailable(mediaCodec: MediaCodec, i: Int, info: MediaCodec.BufferInfo) {
//DECODING PACKET ENDED
val outBuffer = mediaCodec.getOutputBuffer(i)
val chunk = ByteArray(info.size)
outBuffer!![chunk] // Read the buffer all at once
outBuffer!!.clear()
Log.d(TAG, "onOutputBufferAvailable: ${info.size}")
audioTrack!!.write(chunk, info.offset, info.offset + info.size)
mediaCodec.releaseOutputBuffer(i, false)
}
override fun onError(mediaCodec: MediaCodec, e: MediaCodec.CodecException) {}
override fun onOutputFormatChanged(mediaCodec: MediaCodec, mediaFormat: MediaFormat) {}
})
decoder!!.start()
I converted my file like this
val tempBuf = ByteArray(1000)
var byteRead: Int
try {
val bufferedInputStream = BufferedInputStream(FileInputStream(mp3File))
while (bufferedInputStream.read(tempBuf).also { byteRead = it } != -1) {
circularArray.addLast(tempBuf.copyOf())
}
bufferedInputStream.close()
Thread(aacDecoderAndPlayRunnable).start()
} catch (e: java.lang.Exception) {
Log.d(TAG, "fileToInputStream: ${e.message}")
e.printStackTrace()
null
}
The exception where the app crashes is the IllegalStateException mentioned above, thrown from mediaCodec.getInputBuffer(index).
Even if I try to get the format from mediaCodec inside the callback, it throws an exception and crashes anyway. I also checked the codec's supported types, and it does support audio/mpeg.
First of all, MediaCodec works with a queue of input buffers; you can read more about that in the docs.
The second parameter of the onInputBufferAvailable callback is the index of the buffer. When calling getInputBuffer() you must pass this index instead of 0:
val buffer = mediaCodec.getInputBuffer(i)
Second, consider using MediaExtractor instead of reading the file yourself. It supplies you with presentation timestamps and flags to pass into queueInputBuffer().
Third, you need to remove the while (true) loop. You can only queue one buffer per callback.
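A rough sketch of what the MediaExtractor-based approach could look like (audioTrack is assumed to be the AudioTrack from your code, mp3Path the path to your mp3 file; error handling omitted):
val extractor = MediaExtractor()
extractor.setDataSource(mp3Path)
val format = extractor.getTrackFormat(0)
extractor.selectTrack(0)

val decoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME)!!)
decoder.configure(format, null, null, 0)
decoder.setCallback(object : MediaCodec.Callback() {
    override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {
        // Fill exactly one input buffer per callback -- no while (true) loop.
        val inputBuffer = codec.getInputBuffer(index) ?: return
        val sampleSize = extractor.readSampleData(inputBuffer, 0)
        if (sampleSize < 0) {
            codec.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM)
        } else {
            codec.queueInputBuffer(index, 0, sampleSize, extractor.sampleTime, 0)
            extractor.advance()
        }
    }

    override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
        val outBuffer = codec.getOutputBuffer(index) ?: return
        val chunk = ByteArray(info.size)
        outBuffer.get(chunk)
        audioTrack?.write(chunk, 0, info.size)
        codec.releaseOutputBuffer(index, false)
    }

    override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {}
    override fun onOutputFormatChanged(codec: MediaCodec, mediaFormat: MediaFormat) {}
})
decoder.start()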
I have implemented On-device TTS from the Huawei ML Kit in my app and it works well.
Now I would like to find out the duration of the synthesized audio. I want to for example display the remaining time while the audio is playing.
I tried writing the generated audio fragments from the callback to a .pcm file
override fun onAudioAvailable(
taskId: String?,
audioFragment: MLTtsAudioFragment?,
offset: Int,
range: android.util.Pair<Int, Int>?,
bundle: Bundle?
) {
if (taskId != null) {
if (audioFragment != null) {
val fileName = "audio.pcm"
writeToFile(audioFragment.audioData, fileName, true)
}
}
}
fun writeToFile(buffer: ByteArray?, strFileName: String?, append: Boolean) {
if (speechFile == null) {
val pcmFileDir: File = view.getExternalFilesDir("/PCM")!!
speechFile = File(pcmFileDir, strFileName)
}
var raf: RandomAccessFile? = null
var out: FileOutputStream? = null
try {
if (append) {
raf = RandomAccessFile(speechFile, "rw")
speechFile?.length()?.let { raf.seek(it) }
raf.write(buffer)
} else {
out = FileOutputStream(speechFile)
out.write(buffer)
out.flush()
}
} catch (e: IOException) {
e.printStackTrace()
} finally {
try {
raf?.close()
out?.close()
} catch (e: IOException) {
e.printStackTrace()
}
}
}
and getting the duration that way, but Android's MediaPlayer.getDuration() doesn't seem to work with a .pcm file.
Is there a better way to get the duration of the audio? If not, is it possible to calculate the duration of the .pcm file somehow?
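For raw PCM the duration can be computed directly from the file size and the format parameters (sample rate, channel count, bytes per sample). A minimal sketch, assuming 16-bit samples; check the sample rate and channel count your ML Kit TTS configuration actually produces:
// Duration of a raw PCM file in milliseconds.
fun pcmDurationMillis(
    pcmFile: File,
    sampleRate: Int,
    channelCount: Int = 1,
    bytesPerSample: Int = 2
): Long {
    val bytesPerSecond = sampleRate * channelCount * bytesPerSample
    return pcmFile.length() * 1000L / bytesPerSecond
}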
I have configured a Bluetooth device that sends data to an Android app.
But I am receiving the data in an unreadable format.
The data I receive from Bluetooth is in the format below:
���
��
���
Here is my Android code:
private val mmInStream: InputStream?
private val mmOutStream: OutputStream?
init {
var tmpIn: InputStream? = null
var tmpOut: OutputStream? = null
// Get the input and output streams, using temp objects because
// member streams are final
try {
tmpIn = mmSocket.inputStream
tmpOut = mmSocket.outputStream
} catch (e: IOException) {
Log.e(TAG, ": " + e.message)
}
mmInStream = tmpIn
mmOutStream = tmpOut
}
override fun run() {
val buffer = ByteArray(1024) // buffer store for the stream
var bytes = 0 // bytes returned from read()
// Keep listening to the InputStream until an exception occurs
while (true) {
try {
bytes = mmInStream?.read(buffer) ?:0
val readMessage = String(buffer, 0, bytes)
Log.e("Arduino Message", readMessage.toString())
handler?.obtainMessage(MESSAGE_READ, readMessage)?.sendToTarget()
} catch (e: IOException) {
handler?.obtainMessage(CONNECTING_STATUS, 0, -1)?.sendToTarget()
e.printStackTrace()
break
}
}
}
Please convert the received bytes to a hex string via the bytesToHexString extension function below and log it:
fun ByteArray.bytesToHexString(
spaces: Boolean = false
): String {
val format = if (spaces) "%02x " else "%02x"
val sb = StringBuilder()
for (i in 0 until size) {
sb.append(format.format(this[i]))
}
return sb.toString()
}
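For example, inside the read loop of run() above, logging only the bytes actually read (a sketch using the names from your code):
val bytes = mmInStream?.read(buffer) ?: 0
if (bytes > 0) {
    Log.d("BluetoothData", buffer.copyOf(bytes).bytesToHexString(spaces = true))
}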
There are quite a few questions/answers on this, and I have tried many without success. I receive a compressed string that was created with a MemoryStream and DeflateStream (C#). The following decompression function works fine:
fun decompress(string: String): String? {
var decompressedString: String? = ""
try {
val bytes: ByteArray = Base64.decode(string, Base64.DEFAULT)
val inflater = Inflater(true)
val outputStream = ByteArrayOutputStream()
val buffer = ByteArray(1024)
inflater.setInput(bytes)
while (!inflater.finished()) {
val count = inflater.inflate(buffer)
outputStream.write(buffer, 0, count)
}
inflater.end()
outputStream.close()
decompressedString = outputStream.toString("UTF8")
} catch (e: IOException) {
e.printStackTrace()
}
return decompressedString
}
At a later time I need to compress the data and send it back. My attempts to compress the data have been unsuccessful; the server keeps telling me that the "block length does not match with its complement." I use the following function for compressing:
fun compress(string: String): String? {
var compressedString: String? = null
try {
val bytes: ByteArray = string.toByteArray(charset("UTF-8"))
// Compress the bytes
val deflater = Deflater()
//val outputStream = ByteArrayOutputStream()
val buffer = ByteArray(1024)
deflater.setInput(bytes)
deflater.finish()
deflater.deflate(buffer)
deflater.end()
//outputStream.close()
compressedString = Base64.encodeToString(buffer, Base64.DEFAULT)
} catch (e: IOException) {
e.printStackTrace()
}
return compressedString
}
The problem isn't server-side, as it works fine with an iOS app but not with Android. I've tried many variants of this, all without success.
Anyone have any suggestions on what it is that I am doing incorrectly and what I need to do to get it to work?
Thanks ^.^
In case anyone else runs into this problem: I was able to solve it by changing the deflate function to the following.
var compressedString: String? = ""
val bytes: ByteArray = string.toByteArray(charset("UTF-8"))
val deflater = Deflater(1, true)
deflater.setInput(bytes)
deflater.finish()
val outputStream = ByteArrayOutputStream(bytes.size)
try {
val bytesCompressed = ByteArray(Short.MAX_VALUE.toInt())
val numberOfBytesAfterCompression = deflater.deflate(bytesCompressed)
val returnValues = ByteArray(numberOfBytesAfterCompression)
System.arraycopy(bytesCompressed, 0, returnValues, 0, numberOfBytesAfterCompression)
compressedString = Base64.encodeToString(returnValues, Base64.DEFAULT)
} catch (e: IOException) {
e.printStackTrace()
} finally {
deflater.end()
outputStream.close()
}
Obtained from a Deflater examples site.
Apparently the prior function adds 2 additional bytes, and this is what was causing the issue. After the change, the 2 bytes are not added. I don't quite understand how or why, so if someone knows and wishes to share, please do so.
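For what it's worth, those 2 extra bytes are most likely the zlib header that Deflater writes in front of the raw DEFLATE data when it is constructed without nowrap (it also appends a 4-byte Adler-32 checksum at the end). Deflater(level, true) emits raw DEFLATE only, which is what C#'s DeflateStream produces and expects. A small sketch illustrating the difference:
import java.util.zip.Deflater

fun deflateOf(input: ByteArray, nowrap: Boolean): ByteArray {
    val deflater = Deflater(Deflater.DEFAULT_COMPRESSION, nowrap)
    deflater.setInput(input)
    deflater.finish()
    val buffer = ByteArray(64)
    val n = deflater.deflate(buffer)
    deflater.end()
    return buffer.copyOf(n)
}

fun main() {
    val input = "hello".toByteArray(Charsets.UTF_8)
    val wrapped = deflateOf(input, nowrap = false) // starts with the zlib header, typically 0x78 0x9c
    val raw = deflateOf(input, nowrap = true)      // starts directly with DEFLATE block data
    println("wrapped: %02x %02x ...".format(wrapped[0], wrapped[1]))
    println("raw:     %02x %02x ...".format(raw[0], raw[1]))
}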
I am downloading a file using OkHttp3, and I want to show the download speed, but I am confused about how to measure it.
I tried getting the current millis before reading the buffer and calculating after it is written, but it always returns a static value.
Following is my download function.
fun download(fileName: String) {
val request = Request.Builder().url(url)
.get().build()
val call = OkHttpClient().newCall(request)
val response = call.execute()
if (response.isSuccessful) {
var inputStream: InputStream? = null
try {
inputStream = response.body()?.byteStream()
val buffer = ByteArray(8192)
val mediaFile = File(downloadDir, fileName)
val output = RandomAccessFile(mediaFile, "rw")
output.seek(0)
while (true) {
val readed = inputStream?.read(buffer)
if (readed == -1 || readed == null) {
break
}
output.write(buffer, 0, readed)
downloaded.addAndGet(readed.toLong())
}
output.close()
} catch (e: IOException) {
// TODO: handle IOException
Log.e(TAG, "${e.message}")
} finally {
inputStream?.close()
}
}
}
It's a very simple problem; I got confused by overthinking it. Anyway, here is the solution.
All I need to do is store the number of downloaded bytes in a variable, and after 1 s subtract it from the newly downloaded total. That gives me the bytes downloaded in that second, which I can then convert into a speed like kbps or Mbps.
fun getSpeed(callback: (String) -> Unit) {
doAsync {
var prevDownloaded = 0L
while (true) {
if (contentLength != null) {
if (downloaded.get() >= contentLength!!) {
break
}
}
if (prevDownloaded != 0L) {
callback(formatBytes(downloaded.get() - prevDownloaded))
}
prevDownloaded = downloaded.get()
Thread.sleep(1000)
}
}
}
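The formatBytes helper isn't shown above; a possible sketch that turns the per-second byte count into a human-readable speed string (here using KB/s and MB/s rather than bits):
fun formatBytes(bytesPerSecond: Long): String {
    val kb = bytesPerSecond / 1024.0
    val mb = kb / 1024.0
    return when {
        mb >= 1.0 -> "%.1f MB/s".format(mb)
        kb >= 1.0 -> "%.1f KB/s".format(kb)
        else -> "$bytesPerSecond B/s"
    }
}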