Sometimes silent recording with AudioRecord - Android

I am using AudioRecord to record voice on an Android phone. It works most of the time, but occasionally on my Moto G it records silent data: when I replay the recording I do not hear anything. The audio file is present in storage, and when I play it with a media player on my PC it plays, but without any sound.
My code:
int mSampleRate = 8000;
int mBufferSize = AudioRecord.getMinBufferSize(mSampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
byte[] bData = new byte[mBufferSize];
ByteArrayOutputStream encodedStream = new ByteArrayOutputStream();
while (mIsRecording) {
    // read() returns the number of bytes actually read (or a negative error code)
    int read = mRecorder.read(bData, 0, mBufferSize);
    if (read > 0) {
        encode(bData, encodedStream);
    }
}
Any help would be highly appreciated.

Thanks for the help, guys. It was my mistake: I was also using AudioRecord through JNI, and I was not releasing that instance. That is why whenever I tried to use AudioRecord from Java (the Android SDK) it captured no audio. I hope this helps someone else.

Related

Very long MediaExtractor delay/latency for online audio stream

I'm writing a small (personal use) app that plays an audio stream from the internet. The stream is an AAC audio stream. I can play that stream just fine on Linux through mpv, and it plays immediately with no perceivable latency.
On Android, I'm first creating a MediaExtractor to extract the data to decode, like this:
// tested URLs:
// https://r.dcs.redcdn.pl/sc/o2/Eurozet/live/audio.livx
// https://rs101-krk.rmfstream.pl/RMFFM48
String streamSource = ...;
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(streamSource);
extractor.selectTrack(0);
// it takes ~5 seconds to get there
this.format = extractor.getTrackFormat(0);
For some reason there is a few seconds of delay between the start of the download and actually being able to get any information (even the track count, or selecting a track) from the MediaExtractor. This happens both when I construct it with a URL and when I create my own StreamMediaSource; in the latter case it only outputs any information after first reading about 130-150 KiB of data, which at 64 kbps is 16+ seconds of audio.
Is there any way to replicate the instant playback I can get on a PC? Why does it have to read that much data to proceed?
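For what it's worth, the question's estimate checks out: at 64 kbps, reading 130-150 KiB before producing track information corresponds to well over 16 seconds of buffered audio. A quick sketch of the arithmetic (plain Java, nothing Android-specific):

```java
public class BufferDelay {
    // Seconds of audio represented by `bytes` of data at `bitrateBps` bits per second
    static double secondsBuffered(long bytes, long bitrateBps) {
        return (bytes * 8.0) / bitrateBps;
    }

    public static void main(String[] args) {
        long lower = 130L * 1024;  // 130 KiB
        long upper = 150L * 1024;  // 150 KiB
        System.out.printf("%.1f s%n", secondsBuffered(lower, 64_000)); // ≈ 16.6 s
        System.out.printf("%.1f s%n", secondsBuffered(upper, 64_000)); // ≈ 19.2 s
    }
}
```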

Is it possible to record audio from two mic inputs independently at the same time?

I tried to create two AudioRecord instances like this:
mAudioInstance = new Record(MediaRecorder.AudioSource.MIC);
mAudioInstanceSecond = new Record(MediaRecorder.AudioSource.CAMCORDER);
When I tried to start recording, I got the following warning for the second record instance:
"startInput() input failed: other input already started"
So I can't use the second mic for recording; I am only able to record from the first mic.
Is there any way to use two audio inputs for recording at the same time on an Android device?
Note: I am using a Nexus 9, which has a mic port near the camera, so I believe the second mic instance is a valid one.
You can do this by doing a stereo recording using AudioRecord
(http://developer.android.com/reference/android/media/AudioRecord.html).
Refer to this: https://stackoverflow.com/a/15418720/7795876
Specifying a stereo audio format with the CAMCORDER audio source automatically selects two microphones, one for each channel, on a (compatible) two-microphone device.
Eg:-
// CHANNEL_IN_STEREO replaces the deprecated CHANNEL_CONFIGURATION_STEREO
audioRecorder = new AudioRecord(MediaRecorder.AudioSource.CAMCORDER,
        sampleRate, android.media.AudioFormat.CHANNEL_IN_STEREO,
        android.media.AudioFormat.ENCODING_PCM_16BIT, bufferSize);
This initialises a new AudioRecord instance that can record from two device microphones in stereo, in 16-bit PCM format.
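Since the two microphones end up as the left and right channels of one stereo stream, getting two "independent" recordings is then a matter of de-interleaving the PCM data. A minimal sketch (plain Java, assuming 16-bit interleaved frames as produced by the configuration above):

```java
import java.util.Arrays;

public class Deinterleave {
    // Split interleaved stereo 16-bit PCM (L,R,L,R,...) into two mono channels
    static short[][] split(short[] interleaved) {
        int frames = interleaved.length / 2;
        short[] left = new short[frames];
        short[] right = new short[frames];
        for (int i = 0; i < frames; i++) {
            left[i] = interleaved[2 * i];       // mic 1
            right[i] = interleaved[2 * i + 1];  // mic 2
        }
        return new short[][] { left, right };
    }

    public static void main(String[] args) {
        short[] stereo = { 10, -10, 20, -20, 30, -30 };
        short[][] channels = split(stereo);
        System.out.println(Arrays.toString(channels[0])); // [10, 20, 30]
        System.out.println(Arrays.toString(channels[1])); // [-10, -20, -30]
    }
}
```

Each buffer returned by AudioRecord.read() can be split this way and written to two separate files or streams.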

Android: listen to the auxiliary port

I am looking for a way to have an Android application listen to a signal coming in from the auxiliary port and immediately play it back. Usually this port would be used as an output for sound (headphones), but I need it to listen instead (more like a microphone). My end goal is to connect a male-to-male auxiliary (audio) cable from my TV's headphone jack to the Android device's headphone jack. Then I need an application that can broadcast that input (from the TV) to some Bluetooth headphones. So my question is: how can I set a listener for the headphone jack and immediately play back the audio that is received/recorded?
Things I have looked into -
Audio Recording, Audio Track
As you pointed out, it is essential that you understand the AudioRecord and AudioTrack classes from your link and play with some examples.
One idea comes to mind: it is possible to play sounds using the AudioTrack class, which lets you set AudioRecord aside at first. With AudioTrack you can generate your own samples and play them back; by the nature of a waveform, each will sound different. There are online tone generators you can check out if you want to understand this better: http://onlinetonegenerator.com/ Generating samples that represent those sound waves is a good first step, since you will be able to hear them from your phone.
After that you should start playing with the connection between Bluetooth devices and the phone. Once you know how the sound wave works, you can try to send that data over the Bluetooth protocol to the device and check whether it works.
The way you communicate any result from an AsyncTask back to the main thread is through publishProgress(nameOfTheSample), so when using the Bluetooth connection you could hand the samples over from there.
Once you are done with that, you can worry about the sound from the TV.
private class GenerateSamples extends AsyncTask<Void, Void, Void> {
    private volatile boolean isPlaying = true;

    @Override
    protected Void doInBackground(Void... params) {
        final int SAMPLE_RATE = 8000;
        int minSize = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack audioTrack = new AudioTrack(
                AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minSize,
                AudioTrack.MODE_STREAM);
        audioTrack.play();
        // One period of a sine wave, hard-coded as 16-bit samples
        short[] buffer = {
                8130, 15752, 22389, 27625, 31134, 32695, 32210, 29711, 25354, 19410, 12253,
                4329, -3865, -11818, -19032, -25055, -29511, -32121, -32722, -31276, -27874,
                -22728, -16160, -8582, -466
        };
        while (isPlaying) {
            audioTrack.write(buffer, 0, buffer.length);
        }
        audioTrack.release();
        return null;
    }
}
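If you would rather compute the waveform than hard-code it, the sample buffer can be generated. A sketch of building one period of a sine tone as 16-bit samples (the 320 Hz tone and 8000 Hz sample rate are arbitrary choices here that happen to yield a 25-sample period, the same length as the hard-coded buffer):

```java
public class ToneBuffer {
    // One period of a sine tone at `freqHz`, sampled at `sampleRate`, as 16-bit PCM
    static short[] onePeriod(int sampleRate, int freqHz) {
        int samples = sampleRate / freqHz;
        short[] buffer = new short[samples];
        for (int i = 0; i < samples; i++) {
            double angle = 2.0 * Math.PI * i / samples;
            buffer[i] = (short) (Math.sin(angle) * Short.MAX_VALUE);
        }
        return buffer;
    }

    public static void main(String[] args) {
        short[] buffer = onePeriod(8000, 320);
        System.out.println(buffer.length); // 25
        // Feed `buffer` to audioTrack.write(buffer, 0, buffer.length) in a loop
    }
}
```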

Recording from AudioRecorder while another MediaRecorder is recording

I am attempting to add custom voice commands to a Glass app by using AudioRecord to grab input from the microphone and running speech recognition on it. Everything works great except for the "stop recording" command, where I need to grab the current microphone input at the same time that a MediaRecorder object is recording the video.
Specifically, I don't get any errors, but doing
int bufferResult = recorder.read(readBuffer, 0, readBuffer.length);
results in 0 bytes being read, with bufferResult being 0. readBuffer.length is 64000 bytes (4 seconds' worth of audio).
I suspect that there is some sort of lock on the underlying resources that prevents AudioRecord from doing read()s while MediaRecorder is recording. Has anyone run into this issue before? More generally, is there any way to get the audio from the microphone while MediaRecorder is recording, either via AudioRecord or otherwise?
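One thing worth ruling out first is that read() is signalling an error rather than an empty read: it returns the number of bytes read, or a negative error code. A small helper for interpreting the result (the constants mirror the values documented for android.media.AudioRecord; a return of 0 with no error usually means the recorder simply isn't receiving data, which matches the suspected lock):

```java
public class ReadResult {
    // Values documented for android.media.AudioRecord's error constants
    static final int ERROR = -1;
    static final int ERROR_BAD_VALUE = -2;
    static final int ERROR_INVALID_OPERATION = -3;
    static final int ERROR_DEAD_OBJECT = -6;

    static String describe(int result) {
        if (result > 0) return result + " bytes read";
        switch (result) {
            case 0:                       return "no data (source may be busy or not started)";
            case ERROR_INVALID_OPERATION: return "recorder not initialised or not recording";
            case ERROR_BAD_VALUE:         return "invalid arguments";
            case ERROR_DEAD_OBJECT:       return "audio service died";
            default:                      return "generic error";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(0));  // no data (source may be busy or not started)
        System.out.println(describe(-3)); // recorder not initialised or not recording
    }
}
```

Logging describe(bufferResult) after each read() would at least distinguish "mic held by MediaRecorder" from a misconfigured AudioRecord.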

Reading an audio file in C and forwarding it over Bluetooth to play with Android AudioTrack

What I am trying to do: read a .wav file in C (Linux), forward the buffer data through a Bluetooth RFCOMM socket, receive the buffer in Android, and hand it to AudioTrack to play (I need the Android application to play the audio stream).
code :
1 - C code for RFCOMM socket creation: Ccode for rfcomm socket
2 - C code for forwarding data:
FILE *fp;
char buffer[1024];
size_t n;
fp = fopen("feelgood.wav", "rb");  // open the .wav file in binary mode
while ((n = fread(buffer, 1, sizeof(buffer), fp)) > 0) {
    // write exactly the number of bytes read; strlen() is wrong for binary data
    status = write(bluetooth_socket, buffer, n);
    usleep(100000);
}
fclose(fp);
3 - Android code for reading from the socket is something like this:
// AudioTrack initialisation for streaming; this format must match the .wav data
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT,
        10000, AudioTrack.MODE_STREAM);
track.play();
// Receiving data from the socket in a loop
byte[] buffer = new byte[1024];
int bytes;
while ((bytes = socket.getInputStream().read(buffer)) > 0) {
    track.write(buffer, 0, bytes);
}
Problem: I do not understand why AudioTrack is not playing properly (a hint of the music is heard, with a lot of noise). How can I get noise-free audio on the Android side with this approach? Is it an AudioTrack configuration problem or a buffer problem?
Related question (Receive audio via Bluetooth in Android), but I cannot follow the A2DP approach with Android as the sink.
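One likely source of the noise is that the C side sends the raw .wav file, so the 44-byte header is played as audio, and the AudioTrack format chosen here (44100 Hz, mono, 8-bit PCM) may not match what the header declares (the file is quite possibly 16-bit). A sketch of parsing the relevant header fields on the receiving side, assuming a canonical 44-byte PCM WAV header, so the AudioTrack can be configured to match and the header skipped:

```java
public class WavHeader {
    int channels, sampleRate, bitsPerSample;

    // Parse a canonical 44-byte PCM WAV header (fields are little-endian)
    static WavHeader parse(byte[] h) {
        WavHeader w = new WavHeader();
        w.channels = le16(h, 22);
        w.sampleRate = le32(h, 24);
        w.bitsPerSample = le16(h, 34);
        return w;
    }

    static int le16(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8);
    }

    static int le32(byte[] b, int off) {
        return (b[off] & 0xFF) | ((b[off + 1] & 0xFF) << 8)
                | ((b[off + 2] & 0xFF) << 16) | ((b[off + 3] & 0xFF) << 24);
    }

    public static void main(String[] args) {
        byte[] header = new byte[44];
        // Fake a 44100 Hz, stereo, 16-bit header for illustration
        header[22] = 2;                               // channels
        header[24] = 0x44; header[25] = (byte) 0xAC;  // 44100 = 0xAC44
        header[34] = 16;                              // bits per sample
        WavHeader w = parse(header);
        System.out.println(w.sampleRate + " Hz, " + w.channels + " ch, "
                + w.bitsPerSample + "-bit"); // 44100 Hz, 2 ch, 16-bit
    }
}
```

On the Android side, read the first 44 bytes from the socket, parse them this way, construct the AudioTrack with the matching sample rate, channel mask, and encoding, and only write the bytes after the header to track.write().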
