I am trying to analyze internal audio from the Android system to know the wave amplitude of the sound that is being played at the moment. I start playing audio on the device, and only after roughly four seconds does the app get the sound that was playing. I think the problem could be the priority of my app on the system, which prevents it from getting audio in real time, or close to it.
I am using AudioRecord from Android:
internalRecorder = AudioRecord.Builder()
    .setAudioFormat(audioFormat)
    .setBufferSizeInBytes(124800)
    .setAudioPlaybackCaptureConfig(config)
    .build()
and the config:
val config = AudioPlaybackCaptureConfiguration.Builder(mediaProjection!!)
    .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
    .addMatchingUsage(AudioAttributes.USAGE_UNKNOWN)
    .addMatchingUsage(AudioAttributes.USAGE_GAME)
    .build()
I have tried making an accessibility service, but that did not improve the performance; the audio buffer still lags by seconds.
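For context, the read loop I use looks roughly like the sketch below (the chunk size, the isCapturing flag and the peak calculation are simplified placeholders, not my exact code):

import kotlin.math.abs

// Simplified read loop: read small chunks so the amplitude can be inspected
// as soon as data arrives; values here are illustrative.
val chunk = ShortArray(1764) // ~20 ms of 44.1 kHz 16-bit stereo
internalRecorder.startRecording()
while (isCapturing) { // isCapturing: assumed stop flag
    val read = internalRecorder.read(chunk, 0, chunk.size)
    if (read > 0) {
        var peak = 0
        for (i in 0 until read) {
            peak = maxOf(peak, abs(chunk[i].toInt()))
        }
        // peak is the amplitude of this ~20 ms window
    }
}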
Related
I am making an android app that records audio from a bluetooth earpiece. Everything works fine and the app is able to record audio through the earpiece.
However, I will need the exact (or very close to exact) millisecond of when the first audio signal was recorded if the audio recordings are going to be useful to me. I tried getting the timestamp with System.currentTimeMillis() right after calling .start() on the MediaRecorder, as mentioned in this SO question.
Sadly, when I use this method to get the start time of my audio recording, it is often off by quite a lot (100 ms or more).
By getting the time before calling MediaRecorder.start() and after the call, like this:
audioStartTime1 = System.currentTimeMillis();
recorder.start();
audioStartTime2 = System.currentTimeMillis();
I can see that anywhere between 50 ms and 150 ms elapse between audioStartTime1 and audioStartTime2, so starting the MediaRecorder takes quite some time. That would be no problem at all if the recording started right after recorder.start() returns, but it doesn't: after recorder.start() has returned, it still takes some time until the Bluetooth earpiece really starts recording. This delay is different every time and can be more than 100 ms.
Is there some kind of way for me to get a more exact timestamp for the first audio sample?
Maybe store the time of when the MediaRecorder gets the first data packet over Bluetooth? How would I do that, though?
This person here also has a similar problem with video recording and did not get any useful answers.
I would really appreciate some help.
Thanks.
I have a class A which uses an AudioRecorder instance to get the audio amplitude level. I also have a class B in which I want to record audio to an AMR file every 5 minutes.
How can I use both functionalities at the same time? As far as I know, only one object can have access to the MIC at a given time.
I need to record audio and then send the recorded clip to the server every 5 seconds. For example, 5 seconds after recording starts, it will send a 5-second clip. Another 5 seconds after that, it will send the full 10-second clip from the start to that point.
How can I do this?
I'm following this: Android AudioRecord class - process live mic audio quickly, to set up a callback function.
However, it's not working.
Assuming you don't need to send the audio in real time, why don't you record the whole audio file first and then split it into 5-second increments? You could then send all of those files to the server.
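For example, a rough sketch of splitting raw 16-bit PCM into 5-second chunks (the sample rate, channel count and the splitIntoFiveSecondChunks helper are assumptions on my part, not a fixed recipe):

import java.io.File

// Assumed format: 16-bit mono PCM at 44100 Hz, so 5 s = 44100 * 2 * 5 bytes.
const val CHUNK_BYTES = 44100 * 2 * 5

fun splitIntoFiveSecondChunks(recording: File): List<ByteArray> {
    val data = recording.readBytes()
    val chunks = mutableListOf<ByteArray>()
    var offset = 0
    while (offset < data.size) {
        val end = minOf(offset + CHUNK_BYTES, data.size)
        chunks += data.copyOfRange(offset, end)
        offset = end
    }
    return chunks // each element can then be sent to the server
}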
I am developing an Android application that is time-sensitive, and I would like to record precise timestamps of when a particular audio sample is recorded. I am using AudioRecord.startRecording().
I want the timestamp to be noted exactly when the audio is processed by the hardware, to avoid any errors. Is it possible to achieve this kind of accuracy, and if so, how should I proceed?
Any pointers in this regard are highly appreciated.
You can use the system time for the timestamp, like below:
int bufferReadResult = audioRecord.read(buffer, 0, blockSize); // read one block of samples
long startTime = System.nanoTime() / 1000; // timestamp in microseconds, taken right after the read returns
You can count the samples between two events in the recorded audio signal and get the precise elapsed time, since you know the sample frequency. AFAIK, you can't get the absolute time of the acquired signal, because the audio is sampled in real time by a local oscillator and a chunk is sent to you only once the operating system has free time to deliver it. I'm not sure about this last part.
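For instance, the elapsed time is just the sample count divided by the sample rate (44100 Hz below is only an example):

// Elapsed time between two positions in the recorded signal, derived from
// the number of samples between them and the known sample rate.
fun elapsedMillis(samplesBetweenEvents: Long, sampleRateHz: Int = 44100): Double =
    samplesBetweenEvents * 1000.0 / sampleRateHz

// e.g. elapsedMillis(22050) == 500.0 (half a second at 44.1 kHz)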
I am creating an app like Talking Tom. I am using AudioRecorder for recording. I only have the basic Android SDK, no SoundTouch. The problem I am facing is that I need to record only audio that is above a certain volume.
The statement:
if (recorder.getState() == AudioRecord.STATE_INITIALIZED) {
    return recorder;
}
The recorder gets initialized even for low volumes and keeps on recording even when I stop speaking and only a very low-level background sound remains.
You could run your AudioRecord continuously and do a level check (RMS or whatever measurement you prefer) at relatively close intervals. Once the sound level exceeds your desired threshold, save the buffer data until it drops below the stop threshold for a given amount of time, otherwise throw the data away.
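A rough sketch of that idea, assuming 16-bit PCM buffers (THRESHOLD, saveChunk and discard are hypothetical placeholders for your own logic):

import kotlin.math.sqrt

// RMS level of the first validSamples entries of a 16-bit PCM buffer.
fun rms(buffer: ShortArray, validSamples: Int): Double {
    var sum = 0.0
    for (i in 0 until validSamples) {
        val s = buffer[i].toDouble()
        sum += s * s
    }
    return sqrt(sum / validSamples)
}

// Inside the read loop:
// val read = recorder.read(buffer, 0, buffer.size)
// if (read > 0 && rms(buffer, read) > THRESHOLD) saveChunk(buffer, read) else discard()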