Distinguish source when capturing audio with AudioRecord - android

The AudioRecord class allows recording of phone calls with one of the following options as the recording source:
VOICE_UPLINK: The audio transmitted from your end to the other party. In other words, what you speak into the microphone.
VOICE_DOWNLINK: The audio transmitted from the other party to your end.
VOICE_CALL: VOICE_UPLINK + VOICE_DOWNLINK.
I'd like to build an app that records both VOICE_UPLINK & VOICE_DOWNLINK and identifies the source of the voice.
When using VOICE_CALL as the AudioSource option, the UP/DOWN-LINK streams are bundled together into the received data buffer, which makes it hard to identify the source of the voice.
Using two AudioRecords with VOICE_UPLINK & VOICE_DOWNLINK does not work - the second AudioRecord fails to start because the first AudioRecord locks the recording stream.
Is there any creative way to bypass the locking problem presented in case (2), thus enabling recording of the VOICE_UPLINK & VOICE_DOWNLINK streams simultaneously while easily identifying the source?
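For reference, here is a minimal sketch of the two-instance approach from case (2), assuming the RECORD_AUDIO permission is granted and the device exposes these sources; the class and method names are made up for illustration. On most devices the second instance either fails to initialise or never starts recording, which is exactly the lock described above:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.util.Log;

public final class DualLinkProbe {

    private static AudioRecord build(int source) {
        int sampleRate = 8000; // example rate; call audio is commonly 8 kHz
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        return new AudioRecord(source, sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);
    }

    public static void probe() {
        AudioRecord uplink = build(MediaRecorder.AudioSource.VOICE_UPLINK);
        AudioRecord downlink = build(MediaRecorder.AudioSource.VOICE_DOWNLINK);

        if (uplink.getState() == AudioRecord.STATE_INITIALIZED) uplink.startRecording();
        if (downlink.getState() == AudioRecord.STATE_INITIALIZED) downlink.startRecording();

        // The second instance typically never reaches RECORDSTATE_RECORDING,
        // because the first one already holds the call-recording input.
        Log.d("DualLinkProbe", "downlink recording: "
                + (downlink.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING));

        uplink.release();
        downlink.release();
    }
}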

Related

Android Call Recording Downlink muted

I am trying to record both uplink and downlink voice on Android. Regarding the law and everything, I am already aware, so please do not post comments about legality.
The code below works fine, except that when I mute the microphone it won't record the downlink voice.
I am using Android 8.1. I've tried a third-party app called ACR on the same device and it works fine: while I am muted, it still records the downlink voice.
// Push the voice-call stream to maximum volume, then record from the microphone
val audioManager = applicationContext.getSystemService(Context.AUDIO_SERVICE) as AudioManager
val maximumVolume = audioManager.getStreamMaxVolume(AudioManager.STREAM_VOICE_CALL)
audioManager.setStreamVolume(AudioManager.STREAM_VOICE_CALL, maximumVolume, 0)

val audioSource = MediaRecorder.AudioSource.MIC
val mediaRecorder = MediaRecorder()
mediaRecorder.apply {
    setAudioSource(audioSource)
    setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
    setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
    // audioChannels, audioSamplingRate, audioEncodingBitRate and path are defined elsewhere in the app
    setAudioChannels(audioChannels)
    setAudioSamplingRate(audioSamplingRate)
    setAudioEncodingBitRate(audioEncodingBitRate)
    setOutputFile(path)
    prepare()
    start()
}
This is not an issue. You set the MediaRecorder to use MIC as the input, so if you mute the microphone it's obvious that the input signal is lost/muted. When you use the word "downlink" I expected to see a different input source, such as VOICE_CALL or VOICE_DOWNLINK, instead of MIC. Trying to record a voice call through the MIC is wrong in my opinion because: (1) you have to set the speaker to maximum volume and route the voice call through it; (2) while recording a voice call from the MIC, the caller hears everything that happens around your device and everything you say to other people; (3) this method records a lot of noise and echo. The right way is to record from VOICE_CALL, but most new devices (running newer Android versions) prevent recording from this source and allow it only for system apps. ACR uses a workaround by calling hidden API methods, but that could stop working at any time due to Android updates.
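For comparison, this is a minimal sketch of the VOICE_CALL approach the answer recommends; the class name and output-path handling are illustrative only, and as noted above, on recent Android versions this source is reserved for system/privileged apps, so an ordinary app should expect prepare()/start() to fail:

import android.media.MediaRecorder;
import java.io.IOException;

public final class VoiceCallRecorder {

    // Returns a started recorder, or null where the platform refuses VOICE_CALL.
    public static MediaRecorder startCallRecording(String outputPath) {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
        recorder.setOutputFile(outputPath);
        try {
            recorder.prepare();
            recorder.start();
            return recorder;
        } catch (IOException | RuntimeException e) {
            // Non-system apps on recent Android versions usually fail here.
            recorder.release();
            return null;
        }
    }
}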

Is it possible to record audio from two mic inputs independently at a time

I tried to create new AudioRecord instances like:
mAudioInstance = new Record(MediaRecorder.AudioSource.MIC);
mAudioInstanceSecond = new Record(MediaRecorder.AudioSource.CAMCORDER);
When I tried to start recording, I got the following warning message for the second record instance:
"startInput() input failed: other input already started"
So I can't use the second mic for recording; I'm only able to record from the first mic.
Is there any way to use two audio inputs for recording at the same time on an Android device?
Note: I am using a Nexus 9, which has a mic port near the camera, so I believe the second mic instance is a valid one.
You can do this by making a stereo recording with AudioRecord
(http://developer.android.com/reference/android/media/AudioRecord.html).
See this answer: https://stackoverflow.com/a/15418720/7795876
Specifying the audio format as stereo and the audio source as CAMCORDER automatically selects two microphones, one for each channel, on a (compatible) two-microphone device.
E.g.:
audioRecorder = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER,
        sampleRate, android.media.AudioFormat.CHANNEL_IN_STEREO,
        android.media.AudioFormat.ENCODING_PCM_16BIT, bufferSize);
This will initialise a new AudioRecord instance that can record from two device microphones in stereo, in 16-bit PCM format.
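To actually treat the two microphones independently, the interleaved stereo buffer can be split into a left and a right channel after each read. A rough sketch, assuming 16-bit PCM; the class name, sample rate and buffer sizes are illustrative:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public final class TwoMicReader {

    // Reads one interleaved stereo buffer and splits it into per-microphone channels.
    public static void readBothMics() {
        int sampleRate = 44100; // illustrative value
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER,
                sampleRate, AudioFormat.CHANNEL_IN_STEREO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);

        short[] interleaved = new short[2048]; // L,R,L,R,... sample pairs
        short[] firstMic = new short[1024];    // left channel
        short[] secondMic = new short[1024];   // right channel

        recorder.startRecording();
        int read = recorder.read(interleaved, 0, interleaved.length);
        for (int i = 0; i + 1 < read; i += 2) {
            firstMic[i / 2] = interleaved[i];
            secondMic[i / 2] = interleaved[i + 1];
        }
        recorder.stop();
        recorder.release();
    }
}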

Recording from AudioRecorder while another MediaRecorder is recording

I am attempting to add custom voice commands to a Glass app by using AudioRecord to grab input from the microphone and do speech recognition on it. Everything is working great except for the command to "stop recording", where I need to grab the current microphone input at the same time that a MediaRecorder object is recording the video.
Specifically, I don't get any errors, but doing a
int bufferResult = recorder.read(readBuffer, 0, readBuffer.length);
results in 0 bytes being read, and bufferResult being 0. readBuffer.length is 64000 bytes (4 seconds' worth of audio).
I suspect that there is some sort of lock on the underlying resources that is preventing AudioRecord from doing .read()s while MediaRecorder is recording. Has anyone run into this issue before? More generally, is there any way to get the audio from the microphone while MediaRecorder is recording, either via AudioRecord or otherwise?
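There is no public API to share a single input between AudioRecord and MediaRecorder, but the suspected lock can at least be detected: an AudioRecord whose input is held by another client typically fails to reach RECORDSTATE_RECORDING and/or returns 0 from read(), matching the behaviour described above. A small probe sketch, with an illustrative class name and an 8 kHz mono rate (which roughly matches the 64000-byte / 4-second buffer mentioned):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public final class MicAvailabilityCheck {

    public static boolean micCurrentlyReadable() {
        int rate = 8000; // 8 kHz, 16-bit mono
        int minBuf = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord probe = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf);
        try {
            if (probe.getState() != AudioRecord.STATE_INITIALIZED) {
                return false; // construction failed, e.g. input unavailable
            }
            probe.startRecording();
            byte[] buf = new byte[minBuf];
            int n = probe.read(buf, 0, buf.length);
            return probe.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING && n > 0;
        } finally {
            probe.release();
        }
    }
}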

Unexpected noise when using audiotrack + audiorecord class in android

I am building an Android IP phone application on Android 2.1.
My application provides a simple IP phone function. My program consists of a listener thread which receives commands and polls the user to start the call.
The AudioTrack class receives audio data and plays it back,
while AudioRecord records audio data and streams it out to the other side.
When only one user is streaming to the other, the sound quality is good. However, when the receiver side also starts recording and streaming, both sides hear weird sounds and loud noise, although both sides can still hear what the other says.
Is it not suitable to use the AudioTrack and AudioRecord classes on the same side? I cannot figure out the problem. Can anyone suggest a solution?
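A likely contributor when both sides capture and play at the same time is acoustic echo between each device's speaker and microphone. Android 2.1 offers no built-in help, but on API 16+ the usual mitigation is the VOICE_COMMUNICATION source plus the platform echo canceller; a hedged sketch (class name illustrative, availability device-dependent):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.media.audiofx.AcousticEchoCanceler;

public final class EchoCancelledCapture {

    // Builds an AudioRecord on the voice-communication source and attaches the
    // platform echo canceller to its session when the device offers one.
    public static AudioRecord create(int sampleRate) {
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.VOICE_COMMUNICATION,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);
        if (AcousticEchoCanceler.isAvailable()) {
            AcousticEchoCanceler aec = AcousticEchoCanceler.create(recorder.getAudioSessionId());
            if (aec != null) {
                aec.setEnabled(true);
            }
        }
        return recorder;
    }
}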

Can an Android app have exclusive access to the audio h/w?

I am trying to build an application that will alert the user in case of an emergency by playing an audio file. To cover situations where the user may be playing loud music and the emergency announcement might not be heard (because the audio hardware is shared by multiple apps), can I get exclusive access to the audio output so that only my audio stream is audible and all the others are stopped/killed/muted?
You can use AudioManager to set your audio stream to 'solo', which will mute the other audio streams (setStreamSolo), or you can individually mute other streams using setStreamMute.
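A brief sketch of both calls mentioned; note that setStreamSolo and setStreamMute are deprecated from API 23 onward, where audio focus and setStreamVolume are the supported alternatives, and the class and stream choices here are illustrative:

import android.content.Context;
import android.media.AudioManager;

public final class EmergencyAudioGrab {

    public static void grab(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        // Mute everything except the stream the alert plays on...
        am.setStreamSolo(AudioManager.STREAM_ALARM, true);
        // ...or mute a specific competing stream directly.
        am.setStreamMute(AudioManager.STREAM_MUSIC, true);
    }

    public static void release(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        am.setStreamSolo(AudioManager.STREAM_ALARM, false);
        am.setStreamMute(AudioManager.STREAM_MUSIC, false);
    }
}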
