Does anyone know whether AudioRecord and AudioTrack need to be separated in some special way (thread, Runnable, Service, etc.) for them to work together? Right now I can start an AudioTrack and then an AudioRecord without issues.
But if I start and stop the AudioTrack while recording, the AudioRecord starts outputting zeros, as if it were muted (but it is not muted or stopped).
If I start the AudioRecord and then the AudioTrack, the AudioRecord is also "muted".
Also weird: when I unplug and re-plug my wired headset, it starts recording/outputting something other than zeros again. This makes me suspect my phones (a Lenovo B and a Lenovo C2) are too cheap and have circuit/hardware or build issues, but I do not know.
Has anyone heard of this issue, where an AudioRecord suddenly goes "muted", or responds to plugging/unplugging a wired headset without any related settings or methods being applied?
Code update
class myRecordAndPlayer {

    public void initiateRecorder() {
        if (audio.getMode() != AudioManager.MODE_IN_CALL) {
            audio.setMode(AudioManager.MODE_IN_CALL);
            // audio.setSpeakerphoneOn(false); for MODE_IN_COMMUNICATION (that mode produces more echo/crosstalk)
        }
        rec = true;
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
                AudioRecord audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                        AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 10);
                try {
                    audioRecorder.startRecording();
                    while (rec) {
                        bytes_read = audioRecorder.read(buf_audio, 0, buf_audio_len);
                        // ...

    public void initiatePlayer() {
        if (!play) {
            play = true;
            android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
            Thread receiveThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                            AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE, AudioTrack.MODE_STREAM);
                    track.play();
                    try {
                        while (play) {
                            track.write(bufToPlay, 0, bufToPlay.length);
                            // ...
Not tested.
private Thread audioRecordThread;
private AudioRecord audioRecorder;
private AudioTrack audioTrack;

public void initialize() {
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE, AudioTrack.MODE_STREAM);
    audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
            AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 10);
    audioRecordThread = new Thread(new Runnable() {
        public void run() {
            writeAudioDataToAudioTrack();
        }
    });
}

public void start() {
    audioTrack.play();
    audioRecorder.startRecording();
    audioRecordThread.start();
}

private void writeAudioDataToAudioTrack() {
    short[] buffer = new short[BUF_SIZE];
    while (AudioRecord.RECORDSTATE_RECORDING == audioRecorder.getRecordingState()) {
        // Read from the recorder and hand the samples straight to the track.
        int read = audioRecorder.read(buffer, 0, buffer.length);
        if (read > 0) {
            audioTrack.write(buffer, 0, read);
        }
    }
}
Related
I am working with Visualizer. It gets data from an AudioTrack and displays it when I click a button. In the button handler I call the function DrawStart, shown below:
private void DrawStart() {
    if (startDrawing) {
        initRecorder();
        mVisualizerView.link(track);
        startRecording();
    } else {
        DrawStop();
    }
}
It works well for about the first 10 clicks. If I call DrawStart more than 10 times, I get this error:
Fatal signal 11 (SIGSEGV) at 0x00030000 (code=1), thread 8164 (Visualizer)
Could you help me fix it? Thanks so much. Here are my sub-functions:
private void initRecorder() {
    _audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
    track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
    _audioManager.startBluetoothSco();
    _audioManager.setMode(AudioManager.STREAM_VOICE_CALL);
}
private void startRecording() {
    recordingThread = new AudioRecordingThread(track, mRecorder, bufferSize, SAMPLE_RATE, new AudioRecordingHandler() {
        // Do something
    });
    recordingThread.start();
}

private void DrawStop() {
    if (recordingThread != null) {
        recordingThread = null;
    }
    track.release();
    startDrawing = true;
}
And
public void link(AudioTrack player) {
    if (player == null) {
        throw new NullPointerException("Cannot link to null AudioTrack");
    }
    int playerId = player.getAudioSessionId();
    // Create the Visualizer object and attach it to our audio session.
    mVisualizer = new Visualizer(playerId);
    mVisualizer.setScalingMode(Visualizer.SCALING_MODE_NORMALIZED);
    mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
}
It was fixed by setting the application's hardwareAccelerated attribute to false in AndroidManifest.xml:
<application
    android:hardwareAccelerated="false"
    ... >
I am developing real-time audio processing software that should update the UI at least every 100 ms from a worker thread.
However, this is harder to achieve than it looks.
I am calling runOnUiThread(uiUpdaterRunnable) from the worker thread, but the delay before uiUpdaterRunnable executes is variable and generally more than 300 ms.
I tried AsyncTask with publishProgress, but it gave similar results.
How can I update the UI from the worker thread to get at least 10 FPS?
Here is the Runnable of my worker thread:
new Runnable() {
    public void run() {
        try {
            int sampleRate = 44100;
            int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
            int channelConfig = MediaRecorder.AudioSource.MIC;
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
            audioRecorder = new AudioRecord(channelConfig, sampleRate, AudioFormat.CHANNEL_IN_MONO, audioFormat, bufferSize);
            if (bufferSize == AudioRecord.ERROR_BAD_VALUE)
                Log.i("AudioRecord", "AudioRecord Bad Buffer Size");
            if (audioRecorder.getState() == AudioRecord.STATE_INITIALIZED)
                Log.i("AudioRecord", "AudioRecord Initialized Successfully");
            audioRecorder.startRecording();
            while (recording) {
                startTime = System.currentTimeMillis();
                short[] buffer = new short[bufferSize];
                audioRecorder.read(buffer, 0, bufferSize);
                double[] bufferDouble = shortToDoubleArray(buffer);
                final DataPoint[] resultArray = getFFTResult(bufferDouble);
                // This is where I update the UI
                runOnUiThread(new Runnable() {
                    public void run() {
                        updateGraph(resultArray);
                    }
                });
            }
            releaseAudioResources();
        } catch (Exception e) {
            if (e.getMessage() == null)
                e.printStackTrace();
            else
                Log.e("Exception", e.getMessage());
            releaseAudioResources();
        }
    }
}
....
// Declare this outside your worker Runnable (it is created on the UI thread):
Handler mHandler = new Handler() {
    @Override
    public void handleMessage(Message msg) {
        super.handleMessage(msg);
        updateGraph((DataPoint[]) msg.obj);
    }
};

// To call it inside the Runnable, pass the data along with the message:
mHandler.obtainMessage(0, resultArray).sendToTarget();
I tried to use AudioTrack.write() to hear the recorded sound, but there is no output. What I need is to record audio via AudioRecord and play it back via AudioTrack without saving it to the SD card or internal storage, so any help will be appreciated.
public class MainActivity extends Activity {

    short[] buffer = new short[512];
    AudioManager am = null;
    AudioRecord record = null;
    AudioTrack track = null;

    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
        init();
        final Button jjbt1 = (Button) findViewById(R.id.jt1);
        final Button jjbt2 = (Button) findViewById(R.id.jt2);
        am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
        am.setMode(AudioManager.MODE_IN_COMMUNICATION);
        record.startRecording();
        track.play();
        jjbt1.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                for (int i = 0; i < 512; i++) {
                    record.read(buffer, 0, 512);
                }
                record.stop();
            }
        });
        jjbt2.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                try {
                    track.write(buffer, 0, buffer.length);
                } catch (Exception de) {
                    Toast.makeText(getBaseContext(), de.getMessage().toString(), Toast.LENGTH_LONG).show();
                }
            }
        });
    }

    private void init() {
        int min = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, 8000, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, min);
        int maxJitter = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, 8000, AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
    }
}
This works for me:
boolean isRecording = false;
AudioManager am = null;
AudioRecord record = null;
AudioTrack track = null;

@Override
protected void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    setVolumeControlStream(AudioManager.MODE_IN_COMMUNICATION);
    initRecordAndTrack();
    am = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
    am.setSpeakerphoneOn(true);
    (new Thread()
    {
        @Override
        public void run()
        {
            recordAndPlay();
        }
    }).start();
    Button startButton = (Button) findViewById(R.id.start_button);
    startButton.setOnClickListener(new OnClickListener()
    {
        @Override
        public void onClick(View v)
        {
            if (!isRecording)
            {
                startRecordAndPlay();
            }
        }
    });
    Button stopButton = (Button) findViewById(R.id.stop_button);
    stopButton.setOnClickListener(new OnClickListener()
    {
        @Override
        public void onClick(View v)
        {
            if (isRecording)
            {
                stopRecordAndPlay();
            }
        }
    });
}

private void initRecordAndTrack()
{
    int min = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    record = new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION, 8000, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, min);
    if (AcousticEchoCanceler.isAvailable())
    {
        AcousticEchoCanceler echoCanceler = AcousticEchoCanceler.create(record.getAudioSessionId());
        echoCanceler.setEnabled(true);
    }
    int maxJitter = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
    track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, 8000, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
}

private void recordAndPlay()
{
    short[] lin = new short[1024];
    int num = 0;
    am.setMode(AudioManager.MODE_IN_COMMUNICATION);
    while (true)
    {
        if (isRecording)
        {
            num = record.read(lin, 0, 1024);
            track.write(lin, 0, num);
        }
    }
}

private void startRecordAndPlay()
{
    record.startRecording();
    track.play();
    isRecording = true;
}

private void stopRecordAndPlay()
{
    record.stop();
    track.pause();
    isRecording = false;
}
You also need two buttons in your activity_main layout, with the ids start_button and stop_button.
This sample also enables an AcousticEchoCanceler!
Good luck!
I am currently trying to build an amplifier for Android. The goal is to play back what is being recorded, simultaneously. I created a thread to take care of this, but the sound comes out choppy. Here is what I tried.
private class RecordAndPlay extends Thread {
    int bufferSize;
    AudioRecord aRecord;
    short[] buffer;

    public RecordAndPlay() {
        bufferSize = AudioRecord.getMinBufferSize(22050, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        buffer = new short[bufferSize];
    }

    @Override
    public void run() {
        aRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, 22050, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        try {
            aRecord.startRecording();
        } catch (Exception e) {
        }
        int bufferedResult = aRecord.read(buffer, 0, bufferSize);
        final AudioTrack aTrack = new AudioTrack(AudioManager.STREAM_MUSIC, samplingRate, AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferedResult, AudioTrack.MODE_STREAM);
        aTrack.setNotificationMarkerPosition(bufferedResult);
        aTrack.setPlaybackPositionUpdateListener(new OnPlaybackPositionUpdateListener() {
            @Override
            public void onPeriodicNotification(AudioTrack track) {
                // TODO Auto-generated method stub
            }

            @Override
            public void onMarkerReached(AudioTrack track) {
                Log.d("Marker reached", "...");
                aTrack.release();
                aRecord.release();
                run();
            }
        });
        aTrack.play();
        aTrack.write(buffer, 0, buffer.length);
    }

    public void cancel() {
        aRecord.stop();
        aRecord.release();
    }
}
Your playback is choppy because the AudioTrack is being starved: it is not receiving data smoothly. In your code you recursively call run() and create a new AudioTrack per marker event. Instead, instantiate the AudioRecord and AudioTrack only once and just handle their events. Also, to help smooth out playback, start recording slightly before playback and maintain a queue of the recorded buffers. You can then manage passing these buffers to the AudioTrack and make sure there is always a new buffer to submit on each marker event.
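The hand-off described above can be sketched, stripped of the Android APIs, as a bounded producer/consumer queue: one thread stands in for the recorder filling chunks, the caller stands in for the playback loop draining them. The class name, chunk size, and queue depth are illustrative, not from the original answer:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ChunkQueueDemo {

    // Pumps `chunks` buffers from a producer ("recorder") thread to the caller
    // ("player") through a bounded FIFO queue; returns how many arrived in order.
    public static int pump(int chunks) throws InterruptedException {
        final BlockingQueue<short[]> queue = new ArrayBlockingQueue<>(8);

        Thread recorder = new Thread(() -> {
            for (int i = 0; i < chunks; i++) {
                short[] buf = new short[4];
                buf[0] = (short) i;          // stand-in for audioRecord.read(...)
                try {
                    queue.put(buf);          // blocks when the player falls behind
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        recorder.start();

        int played = 0;
        while (played < chunks) {
            short[] buf = queue.take();      // stand-in for audioTrack.write(...)
            if (buf[0] != played) {
                break;                       // out-of-order chunk: give up
            }
            played++;
        }
        recorder.join();
        return played;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("played " + pump(32) + " chunks in order");
    }
}
```

In the real app, `queue.take()` would happen wherever the playback side needs its next buffer (e.g. the marker callback), so the AudioTrack always has data ready instead of waiting on a fresh read.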
I am trying to stream live audio from an Axis network security camera over a multipart HTTP stream, encoded as G.711 µ-law, 8 kHz, 8-bit samples, on an Android phone. It seems like this should be pretty straightforward, and this is the basis of my code. I reused some streaming code I had that grabbed JPEG frames from an MJPEG stream; it now grabs 512-byte blocks of audio data and hands them down to the AudioTrack. The audio sounds all garbled and distorted, though. Am I missing something obvious?
@Override
public void onResume() {
    super.onResume();
    int bufferSize = AudioTrack.getMinBufferSize(8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT);
    mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 8000, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_8BIT, bufferSize, AudioTrack.MODE_STREAM);
    mAudioTrack.play();
    thread.start();
}
class StreamThread extends Thread {
    public boolean running = true;

    public void run() {
        try {
            MjpegStreamer streamer = MjpegStreamer.read("/axis-cgi/audio/receive.cgi?httptype=multipart");
            while (running) {
                byte[] buf = streamer.readMjpegFrame();
                if (buf != null && mAudioTrack != null) {
                    mAudioTrack.write(buf, 0, buf.length);
                }
            }
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
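One likely cause, offered as an assumption rather than something stated in the thread: ENCODING_PCM_8BIT expects linear PCM samples, but G.711 µ-law bytes are companded, so writing them to the AudioTrack unchanged produces exactly this kind of garbled distortion. Each byte first has to be expanded to a 16-bit linear sample. A minimal decoder following the standard G.711 µ-law expansion (the G711 class name is illustrative):

```java
public class G711 {

    // Decode one G.711 mu-law byte to a 16-bit linear PCM sample
    // (standard expansion: bias 0x84, 3-bit exponent, 4-bit mantissa).
    public static short ulawToLinear(byte ulaw) {
        int u = ~ulaw & 0xFF;              // mu-law bytes are stored complemented
        int sign = u & 0x80;
        int exponent = (u >> 4) & 0x07;
        int mantissa = u & 0x0F;
        int magnitude = ((mantissa << 3) + 0x84) << exponent;
        return (short) (sign != 0 ? 0x84 - magnitude : magnitude - 0x84);
    }

    // Decode a block of mu-law bytes (e.g. one 512-byte chunk from the stream).
    public static short[] decode(byte[] src, int len) {
        short[] out = new short[len];
        for (int i = 0; i < len; i++) {
            out[i] = ulawToLinear(src[i]);
        }
        return out;
    }
}
```

The decoded short[] would then be written with an AudioTrack configured for ENCODING_PCM_16BIT (and the 16-bit write overload) instead of ENCODING_PCM_8BIT.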