Split the two channels of AudioRecord with CHANNEL_IN_STEREO - Android

I'm working on a project using stereo recording on Android phones (Note 3), and I need to split the data from the two channels (left and right). Any idea how to do that?
Currently I use AudioRecord to record the sound from the internal microphones, and I can record and save the sound to .raw and .wav files.
Some of the code follows:
private int audioSource = MediaRecorder.AudioSource.MIC;
private static int sampleRateInHz = 44100;
private static int channelConfig = AudioFormat.CHANNEL_IN_STEREO;
private static int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
bufferSizeInBytes = AudioRecord.getMinBufferSize(sampleRateInHz,
channelConfig, audioFormat);
audioRecord = new AudioRecord(audioSource, sampleRateInHz,
channelConfig, audioFormat, bufferSizeInBytes);
// some other codes....
//get the data from audioRecord
readsize = audioRecord.read(audiodata, 0, bufferSizeInBytes);

Finally, I found the answer. I used stereo recording on an Android phone, with audioFormat set to PCM_16BIT.
private int audioSource = MediaRecorder.AudioSource.MIC;
private static int sampleRateInHz = 48000;
private static int channelConfig = AudioFormat.CHANNEL_IN_STEREO;
private static int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
which means the data is stored in the buffer as interleaved bytes:
leftChannel data: [0,1],[4,5]...
rightChannel data: [2,3],[6,7]...
So here is the code for splitting the stereo data (reading into a byte array):
byte[] audioData = new byte[bufferSizeInBytes];
byte[] leftChannelAudioData = new byte[bufferSizeInBytes / 2];
byte[] rightChannelAudioData = new byte[bufferSizeInBytes / 2];
readSize = audioRecord.read(audioData, 0, bufferSizeInBytes); // number of bytes read
for (int i = 0; i < readSize / 2; i = i + 2) {
    // each 4-byte frame is [left low, left high, right low, right high]
    leftChannelAudioData[i] = audioData[2 * i];
    leftChannelAudioData[i + 1] = audioData[2 * i + 1];
    rightChannelAudioData[i] = audioData[2 * i + 2];
    rightChannelAudioData[i + 1] = audioData[2 * i + 3];
}
Then you can write the data to file.
leftChannelFos = new FileOutputStream(rawLeftChannelDataFile);
rightChannelFos = new FileOutputStream(rawRightChannelDataFile);
leftChannelBos = new BufferedOutputStream(leftChannelFos);
rightChannelBos = new BufferedOutputStream(rightChannelFos);
leftChannelDos = new DataOutputStream(leftChannelBos);
rightChannelDos = new DataOutputStream(rightChannelBos);
leftChannelDos.write(leftChannelAudioData);
rightChannelDos.write(rightChannelAudioData);
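By the way, if you read into a short[] instead of a byte[], each array element is already a complete 16-bit sample, so the split reduces to even/odd indices. A minimal sketch of that variant (array names are illustrative):
short[] audioData = new short[bufferSizeInBytes / 2];
int readSize = audioRecord.read(audioData, 0, audioData.length); // returns the number of shorts read
short[] leftChannelAudioData = new short[readSize / 2];
short[] rightChannelAudioData = new short[readSize / 2];
for (int i = 0; i < readSize / 2; i++) {
    leftChannelAudioData[i] = audioData[2 * i];      // even samples = left channel
    rightChannelAudioData[i] = audioData[2 * i + 1]; // odd samples = right channel
}
Note that DataOutputStream.writeShort() writes big-endian, while WAV files expect little-endian, so with this variant you would convert each sample back to a little-endian byte pair before writing a .wav payload.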
Happy coding!

Related

Cannot play PCM stream with Encoding ENCODING_PCM_FLOAT

I know that there are a lot of questions about this topic, but I'm having an issue with ENCODING_PCM_FLOAT.
I have a PCM stream with the following properties:
Encoding: 32-bit float
Byte order: little endian
Channels: 2 (stereo)
Sample rate: 48000 Hz
And I want to feed it to AudioTrack. I use the following APIs:
private final int SAMPLE_RATE = 48000;
private final int CHANNEL_COUNT = AudioFormat.CHANNEL_OUT_STEREO;
private final int ENCODING = AudioFormat.ENCODING_PCM_FLOAT;
...
int bufferSize = AudioTrack.getMinBufferSize(SAMPLE_RATE, CHANNEL_COUNT, ENCODING);
AudioAttributes audioAttributes = new AudioAttributes.Builder()
.setUsage(AudioAttributes.USAGE_MEDIA)
.setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
.build();
AudioFormat audioFormat = new AudioFormat.Builder()
.setEncoding(ENCODING)
.setSampleRate(SAMPLE_RATE)
.build();
audioTrack = new AudioTrack(audioAttributes, audioFormat, bufferSize,
        AudioTrack.MODE_STREAM, AudioManager.AUDIO_SESSION_ID_GENERATE);
audioTrack.play();
And then I start listening to the audio stream:
private void startListening() {
    while (true) {
        try {
            byte[] buffer = new byte[8192];
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            mSocket.receive(packet);
            FloatBuffer floatBuffer = ByteBuffer.wrap(packet.getData()).asFloatBuffer();
            float[] audioFloats = new float[floatBuffer.capacity()];
            floatBuffer.get(audioFloats);
            for (int i = 0; i < audioFloats.length; i++) {
                audioFloats[i] = audioFloats[i] / 0x8000000;
            }
            audioTrack.write(audioFloats, 0, audioFloats.length, AudioTrack.WRITE_NON_BLOCKING);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
But I don't hear any sound at all. I can play the byte array as PCM_16 (without converting it to a float array), but it contains a lot of noise. So I think the stream input is not the problem.
If you have any idea, please let me know.
Thanks for reading!
for (int i = 0; i < audioFloats.length; i++) {
    audioFloats[i] = audioFloats[i] / 0x8000000;
}
I was an idiot; the code block above is not needed. After removing it, the audio plays normally.
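One more thing worth checking with float PCM received over the network: ByteBuffer.wrap() defaults to big-endian, while the stream described above is little-endian. A minimal sketch of the conversion with the byte order set explicitly, and using the packet's actual length rather than the buffer's capacity (both details are my assumptions, not something the original setup confirmed was needed):
// Interpret only the received bytes as little-endian 32-bit floats.
FloatBuffer floatBuffer = ByteBuffer.wrap(packet.getData(), 0, packet.getLength())
        .order(ByteOrder.LITTLE_ENDIAN)
        .asFloatBuffer();
float[] audioFloats = new float[floatBuffer.remaining()];
floatBuffer.get(audioFloats);
// ENCODING_PCM_FLOAT expects samples already in [-1.0f, 1.0f]; no rescaling needed.
audioTrack.write(audioFloats, 0, audioFloats.length, AudioTrack.WRITE_NON_BLOCKING);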

Delay in audio for a mic-to-speaker direct streaming application on Android

I am trying to build a mic application where the sound from the mic is played directly through the speaker. The problem is that there is a delay in the sound heard. The code is given below. Is there a way to avoid this delay? I have heard that we can avoid it by adding native code in C/C++ and then calling it from Java. Is that possible? If so, how?
public class MainActivity extends AppCompatActivity {
    boolean isRecording;
    AudioManager am;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
        Record record = new Record();
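        // note: calling run() executes the loop on the UI thread; record.start() would run it on a background thread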
        record.run();
    }

    public class Record extends Thread {
        static final int bufferSize = 200000;
        final short[] buffer = new short[bufferSize];
        short[] readBuffer = new short[bufferSize];

        public void run() {
            isRecording = true;
            android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
            int buffersize = AudioRecord.getMinBufferSize(11025, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord arec = new AudioRecord(MediaRecorder.AudioSource.MIC, 11025, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, buffersize);
            AudioTrack atrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL, 11025, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, buffersize, AudioTrack.MODE_STREAM);
            am.setRouting(AudioManager.MODE_NORMAL, AudioManager.ROUTE_EARPIECE, AudioManager.ROUTE_ALL);
            atrack.setPlaybackRate(11025);
            byte[] buffer = new byte[buffersize];
            arec.startRecording();
            atrack.play();
            while (isRecording) {
                arec.read(buffer, 0, buffersize);
                atrack.write(buffer, 0, buffer.length);
            }
            arec.stop();
            atrack.stop();
            isRecording = false;
        }
    }
}
Use this class to set up native audio on Android: https://github.com/superpoweredSDK/Low-Latency-Android-Audio-iOS-Audio-Engine/tree/master/Superpowered/AndroidIO
You can find example projects there as well.
Well, instead of writing your own native code, you can try this library called Superpowered, which claims to have low-latency audio.
Hope this works for you; the source is also available on GitHub.
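If you first want to stay in plain Java, one cheap mitigation is to read and write in chunks matching the device's native buffer size instead of the full minimum buffer size, since oversized buffers add delay. A minimal sketch (API 17+; arec, atrack, and isRecording as in the question):
// Query the device's preferred frames-per-buffer (API 17+).
String frames = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
int framesPerBuffer = frames != null ? Integer.parseInt(frames) : 256;
byte[] chunk = new byte[framesPerBuffer * 2]; // 16-bit mono: 2 bytes per frame
while (isRecording) {
    int n = arec.read(chunk, 0, chunk.length);
    if (n > 0) {
        atrack.write(chunk, 0, n);
    }
}
This only trims buffering delay; it will not match the latency of a native (OpenSL ES or Superpowered) audio path.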

Play sound from array on Android

Solved: I forgot the track.play(); at the end...
I want to play a sound on my Android smartphone (4.0.4, API level 15).
I tried to play some random noise, but it's not working:
public class Sound {
    private static int length = 22050 * 10; //10 seconds long
    private static byte[] data = new byte[length];

    static void fillRandom() {
        new Random().nextBytes(data); //Create some random noise to listen to.
    }

    static void play() {
        fillRandom();
        final int TEST_SR = 22050; //This is from an example I found online.
        final int TEST_CONF = AudioFormat.CHANNEL_OUT_MONO;
        final int TEST_FORMAT = AudioFormat.ENCODING_PCM_16BIT;
        final int TEST_MODE = AudioTrack.MODE_STATIC; //I need static mode.
        final int TEST_STREAM_TYPE = AudioManager.STREAM_ALARM;
        AudioTrack track = new AudioTrack(TEST_STREAM_TYPE, TEST_SR, TEST_CONF, TEST_FORMAT, length, TEST_MODE);
        track.write(data, 0, length);
    }
}
I have played around a little with the variables, but could not get it to work.
All you have left to do is play it. Add this line to the end of your play() function:
track.play();
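For MODE_STATIC the order matters: the data has to be written before play() is called. A minimal sketch of how the end of play() would then look:
track.write(data, 0, length); // in static mode, load the whole buffer first
track.play();                 // then start playback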

How to get PCM from the FM radio source on Android?

I'm trying to get raw data in PCM format from the FM radio source. I do this:
int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE_16kHz, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT);
mRecorder = new AudioRecord(AudioSource.FM_RX, SAMPLE_RATE_16kHz, AudioFormat.CHANNEL_IN_MONO,
AudioFormat.ENCODING_PCM_16BIT, bufSize);
mBuffer = new short[bufSize / 2];
mRecorder.startRecording();
and when I call in a loop:
int ret = mRecorder.read(mBuffer, 0, mBuffer.length);
the value in ret is 0 and the buffer is empty.
But if I change AudioSource.FM_RX to AudioSource.MIC I can get data from the microphone. What am I doing wrong?
AudioSource.FM_RX might be available via CyanogenMod but not in the standard API (see e.g. this question).
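If you still want to probe for it at runtime, you can check whether the AudioRecord actually initialized before reading. A short sketch reusing the question's constants (on unsupported sources the constructor may also simply throw):
mRecorder = new AudioRecord(AudioSource.FM_RX, SAMPLE_RATE_16kHz,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
if (mRecorder.getState() != AudioRecord.STATE_INITIALIZED) {
    // this source (or parameter combination) is not supported on this device
    mRecorder.release();
    mRecorder = null;
}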

How to record FM audio in Android?

I need to record the songs being played by an FM app.
I checked MediaRecorder.AudioSource but could not find what to use for setAudioSource.
Can anyone please help me?
Thanks,
Ramachandran.R
There is no FM radio support in the Android SDK. Various device manufacturers may have hacked in their own FM radio support, but you would have to contact those manufacturers to learn what APIs, if any, they have for them.
Try this code:
int audioSource = MediaRecorder.AudioSource.VOICE_DOWNLINK;
int sampleRateInHz = 8000;
int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;
int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
bufferSize = AudioRecord.getMinBufferSize(sampleRateInHz,
channelConfig, audioFormat);
AudioRecord recordInstance = new AudioRecord(audioSource,
sampleRateInHz, channelConfig, audioFormat, bufferSize);
recordInstance.startRecording();
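Note that this snippet only starts the capture; to actually get audio you still need a read loop after it. A minimal sketch (the isRecording flag and the output handling are illustrative):
byte[] audioBuffer = new byte[bufferSize];
while (isRecording) {
    int read = recordInstance.read(audioBuffer, 0, audioBuffer.length);
    if (read > 0) {
        // write audioBuffer[0..read) to a file or stream
    }
}
recordInstance.stop();
recordInstance.release();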
