Crash when calling AudioTrack multiple times - Android

I am working with Visualizer. It gets data from an AudioTrack and displays it when I click a button. The button calls the function DrawStart shown below:
private void DrawStart() {
    if (startDrawing) {
        initRecorder();
        mVisualizerView.link(track);
        startRecording();
    } else {
        DrawStop();
    }
}
It works well for about the first 10 clicks. If I call DrawStart more than 10 times, it crashes with this error:
Fatal signal 11 (SIGSEGV) at 0x00030000 (code=1), thread 8164 (Visualizer)
Could you help me fix it? Thanks so much. Here are my sub-functions:
private void initRecorder() {
    _audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    int maxJitter = AudioTrack.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
    track = new AudioTrack(AudioManager.MODE_IN_COMMUNICATION, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, maxJitter, AudioTrack.MODE_STREAM);
    _audioManager.startBluetoothSco();
    _audioManager.setMode(AudioManager.STREAM_VOICE_CALL);
}
private void startRecording() {
    recordingThread = new AudioRecordingThread(track, mRecorder, bufferSize, SAMPLE_RATE, new AudioRecordingHandler() {
        // Do something
    });
    recordingThread.start();
}
private void DrawStop() {
    if (recordingThread != null) {
        recordingThread = null;
    }
    track.release();
    startDrawing = true;
}
And here is the link method:
public void link(AudioTrack player) {
    if (player == null) {
        throw new NullPointerException("Cannot link to null MediaPlayer");
    }
    int playerId = player.getAudioSessionId();
    // Create the Visualizer object and attach it to our media player.
    mVisualizer = new Visualizer(playerId);
    mVisualizer.setScalingMode(Visualizer.SCALING_MODE_NORMALIZED);
    mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
}

It was fixed by setting the application's hardwareAccelerated attribute to false in AndroidManifest.xml:
<application
    android:hardwareAccelerated="false"
    ... >

Related

Detect the presence of environmental noise

I am very new to Android development and don't have much experience, so this question might be very simple. I want to detect whether there is noise in the environment using the microphone. If there is no mic on the phone, I will show a relevant toast.
I found the code from here: android: Detect sound level
On the main activity I have a button. Pressing the button toasts the result, but I only ever get 0.0 even though there is noise in the room. Could someone give me a hint on this, please?
public class MainActivity extends AppCompatActivity implements SensorEventListener {
    double soundLevel;

    protected void onCreate(Bundle savedInstanceState) {
        noiseButton = findViewById(R.id.noiseCheck);
        PackageManager PM = this.getPackageManager();
        final boolean microphone = PM.hasSystemFeature(PackageManager.FEATURE_MICROPHONE);
        noiseButton.setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                if (microphone) {
                    double soundLevel = detectEnvironmentalNoise();
                    Toast.makeText(mContext, "Environmental noise level is " + soundLevel, Toast.LENGTH_LONG).show();
                } else {
                    Toast.makeText(mContext, "This device is not equipped with a microphone to detect environmental noise", Toast.LENGTH_LONG).show();
                }
            }
        });
    }

    public double detectEnvironmentalNoise() {
        AudioRecord audio = null;
        int sampleRate = 8000;
        double lastLevel = 0;
        try {
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);
            audio = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                    AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        } catch (Exception e) {
            android.util.Log.e("TrackingFlow", "Exception", e);
        }
        audio.startRecording();
        short[] buffer = new short[100];
        int bufferReadResult = 1;
        if (audio != null) {
            // Sense the voice...
            bufferReadResult = audio.read(buffer, 0, 10000);
            double sumLevel = 0;
            for (int i = 0; i < bufferReadResult; i++) {
                sumLevel += buffer[i];
            }
            lastLevel = Math.abs((sumLevel / bufferReadResult));
        }
        return lastLevel;
    }
}
You need to implement getMaxAmplitude() for AudioRecord as described in the post below:
Implement getMaxAmplitude for audioRecord
Or you can use MediaRecorder as below:
public class SoundMeter {
    private MediaRecorder mRecorder = null;

    public void start() {
        if (mRecorder == null) {
            mRecorder = new MediaRecorder();
            mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
            mRecorder.setOutputFile("/dev/null");   // discard the encoded output; we only need the amplitude
            try {
                mRecorder.prepare();
            } catch (IOException e) {
                e.printStackTrace();
            }
            mRecorder.start();
        }
    }

    public void stop() {
        if (mRecorder != null) {
            mRecorder.stop();
            mRecorder.release();
            mRecorder = null;
        }
    }

    public double getAmplitude() {
        if (mRecorder != null)
            return mRecorder.getMaxAmplitude();
        else
            return 0;
    }
}
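For reference, a minimal usage sketch (not part of the original answer) that starts the meter and polls it every half second with a Handler; the names are illustrative:
final SoundMeter soundMeter = new SoundMeter();
final Handler pollHandler = new Handler();
soundMeter.start();
pollHandler.postDelayed(new Runnable() {
    @Override
    public void run() {
        // getMaxAmplitude() returns the peak since the previous call (0 on the very first call)
        double amplitude = soundMeter.getAmplitude();
        Log.d("SoundMeter", "amplitude = " + amplitude);
        pollHandler.postDelayed(this, 500);   // poll again in 500 ms
    }
}, 500);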

AudioTrack AudioRecord android issue, silence

Does anyone know whether AudioRecord and AudioTrack need to be separated in a special way or type of thread, Runnable, Service, etc. for them to work together? Right now I can start an AudioTrack, then an AudioRecord, without issues.
But if I start and stop the AudioTrack while recording, the AudioRecord starts to output 0's as if it were muted (but it is not muted or stopped).
If I start the AudioRecord and then the AudioTrack, the AudioRecord is also "muted".
Also weird: when I unplug and re-plug my wired headset, it starts recording/outputting something other than 0's again, which makes me think my phones (Lenovo B and Lenovo C2) are too cheap (circuit/hardware issues) or have build issues, but I do not know.
Has anyone heard of this issue of a suddenly "muted" AudioRecord, or an AudioRecord that responds to unplugging/plugging a wired headset without any settings/methods applied for it?
Code update
class myRecordAndPlayer {
    public void initiateRecorder() {
        if (audio.getMode() != AudioManager.MODE_IN_CALL) {
            audio.setMode(AudioManager.MODE_IN_CALL);
            // audio.setSpeakerphoneOn(false); for MODE_IN_COMMUNICATION (that mode produces more echo/crosstalk)
        }
        rec = true;
        Thread thread = new Thread(new Runnable() {
            @Override
            public void run() {
                android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
                AudioRecord audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
                        AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 10);
                try {
                    audioRecorder.startRecording();
                    while (rec) {
                        bytes_read = audioRecorder.read(buf_audio, 0, buf_audio_len);
                        // ...

    public void initiatePlayer() {
        if (!play) {
            play = true;
            android.os.Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO);
            Thread receiveThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                            AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE, AudioTrack.MODE_STREAM);
                    track.play();
                    try {
                        while (play) {
                            track.write(bufToPlay, 0, bufToPlay.length);
                            // ...
Not tested.
private Thread audioRecordThread;
private AudioRecord audioRecorder;
private AudioTrack audioTrack;

public void initialize() {
    audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT, BUF_SIZE, AudioTrack.MODE_STREAM);
    audioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
            AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 10);
    audioRecordThread = new Thread(new Runnable() {
        public void run() {
            writeAudioDataToAudioTrack();
        }
    });
}

public void start() {
    audioTrack.play();
    audioRecorder.startRecording();
    audioRecordThread.start();
}

private void writeAudioDataToAudioTrack() {
    byte[] bufToPlay = new byte[BUF_SIZE];
    while (AudioRecord.RECORDSTATE_RECORDING == audioRecorder.getRecordingState()) {
        int read = audioRecorder.read(bufToPlay, 0, bufToPlay.length);   // read from the mic...
        if (read > 0) {
            audioTrack.write(bufToPlay, 0, read);                        // ...and play it back
        }
    }
}
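A companion stop method is not shown above; a possible sketch, assuming the same fields, could look like this (leaving the recording state also ends the write loop; in production code you would join audioRecordThread before releasing):
public void stop() {
    if (audioRecorder != null && audioRecorder.getState() == AudioRecord.STATE_INITIALIZED) {
        audioRecorder.stop();      // leaving RECORDSTATE_RECORDING terminates writeAudioDataToAudioTrack()
        audioRecorder.release();
    }
    if (audioTrack != null && audioTrack.getState() == AudioTrack.STATE_INITIALIZED) {
        audioTrack.stop();
        audioTrack.release();
    }
}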

Sound synchronization?

I'm a newbie in programming and I'm trying to make an audio app that plays audio samples, but I want to sync all of them. Something like a verification loop that counts 0, 1, 2, 3, and when the user clicks "play/stop", the audio only starts/stops when the loop is at "0".
This is one of my classes, where I define the play and stop methods.
public class Sample {
    private static final int SAMPLE_RATE = 44100;

    private String name;
    private AudioTrack audioTrack;
    private int loopPoint;
    int soundId;
    private Uri uri;
    private Context context;
    private MediaPlayer currentPlayer;
    private boolean isImported;
    private boolean isLooping = false;

    public Sample(String name, byte[] soundBytes) {
        this.name = name;
        loopPoint = soundBytes.length / 2;
        isImported = false;
        audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                soundBytes.length, AudioTrack.MODE_STATIC);
        audioTrack.write(soundBytes, 0, soundBytes.length);
    }

    public Sample(String name, File file, Context context) {
        this.name = name;
        this.context = context;
        isImported = true;
        uri = Uri.parse(file.getAbsolutePath());
    }

    public String getName() {
        return name;
    }

    public void updateSample(byte[] soundBytes) {
        if (!isImported) {
            audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, soundBytes.length,
                    AudioTrack.MODE_STATIC);
            audioTrack.write(soundBytes, 0, soundBytes.length);
        }
    }

    public void play(boolean isLooped) {
        isLooping = isLooped;
        audioTrack.setPlaybackRate(88200);
        if (isImported) {
            if (currentPlayer != null) {
                currentPlayer.seekTo(0);
            } else {
                currentPlayer = MediaPlayer.create(context, uri);
            }
            currentPlayer.setLooping(isLooped);
            currentPlayer.start();
        } else {
            audioTrack.stop();
            audioTrack.reloadStaticData();
            if (isLooped) {
                audioTrack.setLoopPoints(0, loopPoint, -1);
            } else {
                audioTrack.setLoopPoints(0, 0, 0);
            }
            audioTrack.play();
        }
    }

    public void stop() {
        try {
            if (isImported && currentPlayer != null) {
                currentPlayer.stop();
                currentPlayer.release();
                currentPlayer = null;
            } else if (!isImported && audioTrack != null) {
                audioTrack.stop();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        isLooping = false;
    }

    public boolean isImported() {
        return isImported;
    }

    public boolean isLooping() {
        return isLooping;
    }
}
You can use currentPlayer.getCurrentPosition() to get the current position of playback of the sample in milliseconds.
Say your tempo was 120 bpm. That's half a second per beat.
Now say we are in 4/4 time; then we have 4 beats in a bar, so each bar is 2 seconds long in total.
So to stop playback only at the start of a bar, you would need to stop when the current position is a multiple of 2000 milliseconds.
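For example, the bar length in milliseconds can be derived from the tempo and time signature (bpm and beatsPerBar are assumed values, not taken from the question):
int bpm = 120;                                    // tempo
int beatsPerBar = 4;                              // 4/4 time
int barLengthMillis = beatsPerBar * 60000 / bpm;  // 4 * 500 ms = 2000 ms per bar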
You could make the MediaPlayer stop at the right time with something like the logic below:
void onUserPressedStop() {
    // Check if the remainder is 0 when you divide by the bar length
    if ((currentPlayer.getCurrentPosition() % barLengthMillis) == 0) {
        // We are at the start of a bar, so stop and release the MediaPlayer
        stopMyMediaPlayer();
    } else {
        // We're not at the start of a bar, so find out how much longer we need to wait
        final int waitTime = barLengthMillis - (currentPlayer.getCurrentPosition() % barLengthMillis);
        // Stop and release the MediaPlayer once the necessary time has elapsed
        final Handler handler = new Handler();
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                stopMyMediaPlayer();
            }
        }, waitTime);
    }
}

Audio Recording and Streaming in Android

I am developing an Android app and want to accomplish the feature below.
I will use my phone's built-in mic to record, and at the same time I want the recorded audio to be played back through either the phone's speakers or headphones.
Is it feasible? If yes, please help me with this.
Here is a simple recording and playback application that uses Android's AudioRecord and AudioTrack.
Design:
The recorded audio is written to a buffer and played back from the same buffer. This mechanism runs in a loop (on an Android thread) controlled by buttons.
Code
private String TAG = "AUDIO_RECORD_PLAYBACK";
private boolean isRunning = true;
private Thread m_thread;               /* Thread for running the loop */
private AudioRecord recorder = null;
private AudioTrack track = null;
int bufferSize = 320;                  /* Buffer for recording data */
byte buffer[] = new byte[bufferSize];

/* Method to enable/disable buttons */
private void enableButton(int id, boolean isEnable) {
    ((Button) findViewById(id)).setEnabled(isEnable);
}
The GUI has two buttons, START and STOP.
Enable the buttons:
enableButton(R.id.StartButton, true);
enableButton(R.id.StopButton, false);

/* Assign button click handlers */
((Button) findViewById(R.id.StartButton)).setOnClickListener(btnClick);
((Button) findViewById(R.id.StopButton)).setOnClickListener(btnClick);
Mapping the START and STOP buttons in the OnClickListener:
private View.OnClickListener btnClick = new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.StartButton: {
                Log.d(TAG, "======== Start Button Pressed ==========");
                isRunning = true;
                do_loopback(isRunning);
                enableButton(R.id.StartButton, false);
                enableButton(R.id.StopButton, true);
                break;
            }
            case R.id.StopButton: {
                Log.d(TAG, "======== Stop Button Pressed ==========");
                isRunning = false;
                do_loopback(isRunning);
                enableButton(R.id.StopButton, false);
                enableButton(R.id.StartButton, true);
                break;
            }
        }
    }
};
Start the Thread:
private void do_loopback(final boolean flag) {
    m_thread = new Thread(new Runnable() {
        public void run() {
            run_loop(flag);
        }
    });
    m_thread.start();
}
Method for Initializing AudioRecord and AudioTrack:
public AudioTrack findAudioTrack(AudioTrack track) {
    Log.d(TAG, "===== Initializing AudioTrack API ====");
    int m_bufferSize = AudioTrack.getMinBufferSize(8000,
            AudioFormat.CHANNEL_OUT_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    if (m_bufferSize != AudioTrack.ERROR_BAD_VALUE) {
        track = new AudioTrack(AudioManager.STREAM_MUSIC, 8000,
                AudioFormat.CHANNEL_OUT_MONO,
                AudioFormat.ENCODING_PCM_16BIT, m_bufferSize,
                AudioTrack.MODE_STREAM);
        if (track.getState() == AudioTrack.STATE_UNINITIALIZED) {
            Log.e(TAG, "===== AudioTrack Uninitialized =====");
            return null;
        }
    }
    return track;
}
public AudioRecord findAudioRecord(AudioRecord recorder) {
    Log.d(TAG, "===== Initializing AudioRecord API =====");
    int m_bufferSize = AudioRecord.getMinBufferSize(8000,
            AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT);
    if (m_bufferSize != AudioRecord.ERROR_BAD_VALUE) {
        recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, m_bufferSize);
        if (recorder.getState() == AudioRecord.STATE_UNINITIALIZED) {
            Log.e(TAG, "====== AudioRecord Uninitialized ====== ");
            return null;
        }
    }
    return recorder;
}
The values for findAudioRecord or findAudioTrack can change based on the device.
Please refer to this question.
Code for Running the loop:
public void run_loop(boolean isRunning) {
    /** == If the Stop button is pressed == **/
    if (isRunning == false) {
        Log.d(TAG, "===== Stop Button is pressed ===== ");
        if (AudioRecord.STATE_INITIALIZED == recorder.getState()) {
            recorder.stop();
            recorder.release();
        }
        if (AudioTrack.STATE_INITIALIZED == track.getState()) {
            track.stop();
            track.release();
        }
        return;
    }

    /** ======= Initialize AudioRecord and AudioTrack ======== **/
    recorder = findAudioRecord(recorder);
    if (recorder == null) {
        Log.e(TAG, "======== findAudioRecord : Returned Error! =========== ");
        return;
    }
    track = findAudioTrack(track);
    if (track == null) {
        Log.e(TAG, "======== findAudioTrack : Returned Error! ========== ");
        return;
    }

    if ((AudioRecord.STATE_INITIALIZED == recorder.getState()) &&
            (AudioTrack.STATE_INITIALIZED == track.getState())) {
        recorder.startRecording();
        Log.d(TAG, "========= Recorder Started... =========");
        track.play();
        Log.d(TAG, "========= Track Started... =========");
    } else {
        Log.d(TAG, "==== Initialization failed for AudioRecord or AudioTrack =====");
        return;
    }

    /** ------------------------------------------------------ **/
    /* Recording and playing in chunks of 320 bytes */
    bufferSize = 320;
    while (isRunning == true) {
        /* Read from the mic & write to the device */
        recorder.read(buffer, 0, bufferSize);
        track.write(buffer, 0, bufferSize);
    }
    Log.i(TAG, "Loopback exit");
    return;
}
Please include the following permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.RECORD_AUDIO" > </uses-permission>
The above procedure can also write to or read from a file using the same APIs.
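As a rough sketch of that file variant (not part of the original answer; the file name is a placeholder), the read loop could dump the captured chunks to a raw PCM file instead of writing them to the AudioTrack:
try {
    FileOutputStream pcmOut = new FileOutputStream(new File(getFilesDir(), "loopback.pcm"));
    while (isRunning) {
        int read = recorder.read(buffer, 0, bufferSize);
        if (read > 0) {
            pcmOut.write(buffer, 0, read);   // raw 16-bit PCM, no header
        }
    }
    pcmOut.close();
} catch (IOException e) {
    Log.e(TAG, "Writing PCM to file failed", e);
}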
Why use AudioRecord over MediaRecorder? See here.
The code is tested (on a Google Nexus 5) and working perfectly.
Note: please add some error-checking code for recorder.read and track.write in case they fail. The same applies to findAudioRecord and findAudioTrack.
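A sketch of the kind of checks that note suggests, inside the loop (the error return codes are negative constants on AudioRecord/AudioTrack):
int read = recorder.read(buffer, 0, bufferSize);
if (read == AudioRecord.ERROR_INVALID_OPERATION || read == AudioRecord.ERROR_BAD_VALUE) {
    Log.e(TAG, "recorder.read failed: " + read);
    return;
}
int written = track.write(buffer, 0, read);
if (written < 0) {   // e.g. AudioTrack.ERROR_INVALID_OPERATION
    Log.e(TAG, "track.write failed: " + written);
    return;
}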
First, in the onCreate method, create a MediaRecorder object and the path to the file where you want to save the recorded data.
String outputFile = Environment.getExternalStorageDirectory()
        .getAbsolutePath() + "/myrecording.3gp";        // define outputFile outside the onCreate method
MediaRecorder myAudioRecorder = new MediaRecorder();     // define this outside the onCreate method
myAudioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myAudioRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myAudioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
myAudioRecorder.setOutputFile(outputFile);
You can call these three functions from buttons to start recording, stop recording, and play back the recording:
public void start(View view) {
    try {
        myAudioRecorder.prepare();
        myAudioRecorder.start();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    start.setEnabled(false);
    stop.setEnabled(true);
    Toast.makeText(getApplicationContext(), "Recording started", Toast.LENGTH_LONG).show();
}

public void stop(View view) {
    myAudioRecorder.stop();
    myAudioRecorder.release();
    myAudioRecorder = null;
    stop.setEnabled(false);
    play.setEnabled(true);
    Toast.makeText(getApplicationContext(), "Audio recorded successfully",
            Toast.LENGTH_LONG).show();
}

public void play(View view) throws IllegalArgumentException,
        SecurityException, IllegalStateException, IOException {
    MediaPlayer m = new MediaPlayer();
    m.setDataSource(outputFile);
    m.prepare();
    m.start();
    Toast.makeText(getApplicationContext(), "Playing audio", Toast.LENGTH_LONG).show();
}
As described in the developer documentation here, Android supports the RTSP protocol (for real-time streaming) as well as the HTTP/HTTPS live streaming draft protocol.
There is also an example here. You should have basic knowledge of a streaming server, such as Red5 or Wowza.
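For completeness, a minimal sketch (not from the linked example) of playing an RTSP audio stream with MediaPlayer; the URL is a placeholder, not a real endpoint:
MediaPlayer streamPlayer = new MediaPlayer();
streamPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
try {
    streamPlayer.setDataSource("rtsp://example.com/live/stream");
    streamPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start();
        }
    });
    streamPlayer.prepareAsync();   // network streams should be prepared asynchronously
} catch (IOException e) {
    Log.e("Streaming", "Failed to set data source", e);
}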

Just hearing clicking noise on AudioTrack playback

I'm trying to record sound from the mic and then play it back, but all I hear is a clicking sound. Both the AudioTrack and the AudioRecord use the same settings and are initialised correctly. Here is my code:
stop.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        isRecording = false;
        recordingThread.join();
        player = findAudioTrack();
        player.play();
        for (int i = 0; i < audioQueue.size(); i++) {
            int written = player.write(audioQueue.get(i), 0,
                    audioQueue.get(i).length);
        }
        player.stop();
        player.release();
        player = null;
    }
});
}
private void startRecording() {
    recorder = findAudioRecord();
    recorder.startRecording();
    isRecording = true;
    recordingThread = new Thread(new Runnable() {
        public void run() {
            addAudioToQueue();
        }
    }, "AudioRecorder Thread");
    recordingThread.start();
}

private void addAudioToQueue() {
    short[] capturedAudio = new short[recordBufferSize / 2];
    while (isRecording) {
        int read = recorder.read(capturedAudio, 0, capturedAudio.length);
        audioQueue.add(capturedAudio);
    }
    recorder.stop();
    recorder.release();
    recorder = null;
}
}
Does anyone know why this is?
Here is the full source code:
https://www.dropbox.com/s/h38cs9vjkztyyro/AudioTesting.java
It probably has to do with the fact that you are using a short array instead of a byte array.
public int read (byte[] audioData, int offsetInBytes, int sizeInBytes)
It's likely that this function only fills the first byte of each short, so half of each element is one byte of audio data and the other byte is junk or zeroes.
Try using a byte array instead of short.
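Following that suggestion, a sketch of the recording loop reading into a byte[] instead of a short[], and allocating a fresh buffer on every read so each queue entry keeps its own data (field names follow the question; audioQueue would then hold byte arrays):
private void addAudioToQueue() {
    while (isRecording) {
        byte[] capturedAudio = new byte[recordBufferSize];      // new buffer each pass
        int read = recorder.read(capturedAudio, 0, capturedAudio.length);
        if (read > 0) {
            audioQueue.add(capturedAudio);                      // audioQueue as a List<byte[]>
        }
    }
    recorder.stop();
    recorder.release();
    recorder = null;
}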
