I have a problem reading audio data from the microphone on Android. My app must change the lightness of a logo picture depending on the volume of the sound read from the microphone. I use the AudioRecord class to get the audio data. Depending on sample rate and buffer size, every 2nd/4th/8th read operation comes back with a latency of 100-200 ms, so my app can't react to volume changes properly. I tried reducing the buffer size and moving the data processing to another thread (using AudioRecord.setPositionNotificationPeriod() and a callback), but that didn't help much; the latency just became a little shorter. Here is my code:
package com.mikesoft.Clown;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.AudioRecord.OnRecordPositionUpdateListener;
import android.media.MediaRecorder.AudioSource;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.TextView;
public class TestActivity extends Activity implements OnClickListener, OnRecordPositionUpdateListener {
private static final String LOG_TAG = TestActivity.class.getSimpleName();
//arrays for selecting usable audioformat
private static int[] mSampleRates = new int[] {8000, 11025, 22050, 44100};
private static short[] audioFormat = new short[] {AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT};
private static short[] channelConfig = new short[] {AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.CHANNEL_CONFIGURATION_STEREO};
private short[][] buffers;
private int ix;
private AudioRecord recorder = null;
private int bufferSize = 0;
private AudioThread recordingThread = null;
private boolean isRecording = false;
/** The OpenGL View */
private LogoView glSurface;
private TextView mVolumeText;
//markers
private short[] buffer;
private long read_time;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.layout_test);
findViewById(R.id.btnStart).setOnClickListener(this);
findViewById(R.id.btnStop).setOnClickListener(this);
mVolumeText = (TextView)findViewById(R.id.volume);
setRecordMode();
glSurface = (LogoView)findViewById(R.id.glSurface);
}
private void setRecordMode() {
((Button) findViewById(R.id.btnStop)).setClickable(isRecording);
((Button) findViewById(R.id.btnStart)).setClickable(!isRecording);
}
public void startRecording() {
if (null == recorder) {
recorder = findAudioRecord();
recorder.setRecordPositionUpdateListener(this);
recorder.startRecording();
isRecording = true;
recordingThread = new AudioThread();
}
Log.d(LOG_TAG, "Audio recording started");
}
//thread for executing audio recorder
private class AudioThread extends Thread {
private AudioThread() {
start();
}
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
recordAudio();
}
}
private void recordAudio() {
ix = 0;
int read = 0;
recorder.setPositionNotificationPeriod(bufferSize);
while (isRecording) {
//long t1 = System.currentTimeMillis();
buffer = buffers[ix++ % buffers.length];
read = recorder.read(buffer, 0, buffer.length);
//time after reading
//read_time = System.currentTimeMillis();
//Log.d(LOG_TAG, "read bytes: " + read);
//Log.d(LOG_TAG, "read_time=" + (read_time - t1));
//int vol = 0;
//for (int i=0;i<buffer.length;i++)
// vol = Math.max(vol, Math.abs(buffer[i]));
//float norm = (float)vol/(float)Short.MAX_VALUE;
//glSurface.setLightness(norm);
}
}
//initialize AudioRecord with suitable format
public AudioRecord findAudioRecord() {
//for (int rate : mSampleRates) {
int rate = 22050;
for (short format : audioFormat) {
for (short channel : channelConfig) {
try {
Log.d(LOG_TAG, "Attempting rate " + rate + "Hz, bits: " + format + ", channel: "
+ channel);
bufferSize = AudioRecord.getMinBufferSize(rate, channel, format);
if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
// check if we can instantiate and have a success
AudioRecord recorder = new AudioRecord(AudioSource.MIC, rate, channel, format, bufferSize);
//bufferSize/=8;
if (recorder.getState() == AudioRecord.STATE_INITIALIZED) {
buffers = new short[256][bufferSize];
return recorder;
}
}
} catch (Exception e) {
Log.e(LOG_TAG, rate + "Exception, keep trying.",e);
}
}
}
//}
return null;
}
public void stopRecording() {
if (null != recorder) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
}
Log.d(LOG_TAG, "Audio recording stopped");
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.btnStart:
startRecording();
setRecordMode();
break;
case R.id.btnStop:
stopRecording();
setRecordMode();
break;
default:
break;
}
}
@Override
public void onMarkerReached(AudioRecord arg0) {
// TODO Auto-generated method stub
}
@Override
public void onPeriodicNotification(AudioRecord arg0) {
float norm;
short[] buf = buffers[ix % buffers.length];
int vol = 0;
for (int i = 0; i < buf.length; i++)
vol = Math.max(vol, Math.abs(buf[i]));
norm = (float) vol / (float) Short.MAX_VALUE;
glSurface.setLightness(norm);
}
}
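The peak computation in onPeriodicNotification can be factored into a small plain-Java helper, shown here as a sketch independent of the Android classes above (the class and method names are illustrative):

```java
public class PeakVolume {
    // Normalised peak amplitude of a 16-bit PCM buffer, in roughly [0, 1].
    // abs is taken on an int so Short.MIN_VALUE (-32768) does not overflow.
    public static float normalizedPeak(short[] buf) {
        int peak = 0;
        for (short s : buf) {
            peak = Math.max(peak, Math.abs((int) s));
        }
        return (float) peak / Short.MAX_VALUE;
    }

    public static void main(String[] args) {
        short[] quiet = {0, 100, -200, 50};
        short[] loud = {0, Short.MAX_VALUE, -3, 4};
        // a louder buffer yields a larger normalised peak
        System.out.println(normalizedPeak(quiet) < normalizedPeak(loud));
    }
}
```

The same value could then be passed to glSurface.setLightness() as in the listing.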
I tried devices and emulators with Android 1.6-3.2, but it's the same everywhere. Is there a solution to my problem in Java, or should I use an NDK approach?
I will appreciate any help.
Related
I am trying to record audio at a different frequency. I have been trying to implement it with the following code, but it's not working (the app unexpectedly stops).
//importing packages
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.view.KeyEvent;
import android.view.View;
import android.widget.Button;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
public class MainActivity extends Activity {
private static final int RECORDER_SAMPLERATE = 44100;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private AudioRecord recorder = null;
private Thread recordingThread = null;
private boolean isRecording = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
setButtonHandlers();//handling buttons
enableButtons(false);
int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE,
RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
}
private void setButtonHandlers() {
((Button) findViewById(R.id.Start)).setOnClickListener(btnClick);
((Button) findViewById(R.id.Stop)).setOnClickListener(btnClick);
}
private void enableButton(int id, boolean isEnable) {
((Button) findViewById(id)).setEnabled(isEnable);
}
private void enableButtons(boolean isRecording) {
enableButton(R.id.Start, !isRecording);
enableButton(R.id.Stop, isRecording);
}
int BufferElements2Rec = 1024; // 1024 shorts = 2048 bytes, since each element is 2 bytes
int BytesPerElement = 2; // 2 bytes per sample in 16-bit PCM
public void Record()
{
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
RECORDER_SAMPLERATE, RECORDER_CHANNELS,
RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
private byte[] short2byte(short[] sData) {
int shortArrsize = sData.length;
byte[] bytes = new byte[shortArrsize * 2];
for (int i = 0; i < shortArrsize; i++) {
bytes[i * 2] = (byte) (sData[i] & 0x00FF);
bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
sData[i] = 0;
}
return bytes;
}
private void writeAudioDataToFile() {
// Write the output audio in byte
String filePath = Environment.getExternalStorageDirectory().getAbsolutePath()+"/myrec.pcm";
short sData[] = new short[BufferElements2Rec];
FileOutputStream os = null;
try {
os = new FileOutputStream(filePath);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(sData, 0, BufferElements2Rec);
System.out.println("Short writing to file" + sData.toString());
try {
// // writes the data to file from buffer
// // stores the voice buffer
byte bData[] = short2byte(sData);
os.write(bData, 0, BufferElements2Rec * BytesPerElement);
} catch (IOException e) {
e.printStackTrace();
}
}
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
public void Record_Stop()
{
if (null != recorder) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
}
}
private View.OnClickListener btnClick = new View.OnClickListener() {
public void onClick(View v) {
switch (v.getId()) {
case R.id.Start: {
enableButtons(true);
Record();
break;
}
case R.id.Stop: {
enableButtons(false);
Record_Stop();
break;
}
}
}
};
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
finish();
}
return super.onKeyDown(keyCode, event);
}
}
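As a side note, the short2byte helper in the code above (apart from its zeroing of the source array) performs a little-endian 16-bit conversion that can also be expressed with ByteBuffer; a sketch, with an illustrative class name:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ShortsToBytes {
    // Little-endian 16-bit PCM: low byte first, matching the manual loop above.
    public static byte[] toBytes(short[] data) {
        ByteBuffer b = ByteBuffer.allocate(data.length * 2).order(ByteOrder.LITTLE_ENDIAN);
        b.asShortBuffer().put(data); // the short view writes through to the backing array
        return b.array();
    }

    public static void main(String[] args) {
        byte[] out = toBytes(new short[]{0x1234, -1});
        System.out.println(out.length);
    }
}
```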
As soon as I press the start button, the app unexpectedly stops:
com.example.kaushal.learningfft E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.kaushal.learningfft, PID: 4705
java.lang.IllegalStateException: startRecording() called on an uninitialized AudioRecord.
    at android.media.AudioRecord.startRecording(AudioRecord.java:976)
    at com.example.kaushal.learningfft.MainActivity.Record(MainActivity.java:57)
    at com.example.kaushal.learningfft.MainActivity$2.onClick(MainActivity.java:126)
    at android.view.View.performClick(View.java:5610)
    at android.view.View$PerformClick.run(View.java:22265)
    at android.os.Handler.handleCallback(Handler.java:751)
    at android.os.Handler.dispatchMessage(Handler.java:95)
    at android.os.Looper.loop(Looper.java:154)
    at android.app.ActivityThread.main(ActivityThread.java:6077)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:865)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:755)
I'm developing an Android app (compileSdkVersion 23) to record audio using AudioRecord; the reason for using it is to get the frequency via FFT in real time.
Besides that, I need to save the recorded sound so I can check it (for this part, tracking the frequency is unnecessary).
How do I save the recorded sound to a file using AudioRecord on Android?
Also, am I using AudioRecord correctly?
Here is the code:
public class MainActivity extends Activity {
int frequency = 8000;
int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
AudioRecord audioRecord;
RecordAudio recordTask;
int blockSize;// = 256;
boolean started = false;
boolean CANCELLED_FLAG = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
blockSize = 256;
final Button btRec = (Button) findViewById(R.id.btRec);
btRec.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (started == true) {
//started = false;
CANCELLED_FLAG = true;
//recordTask.cancel(true);
try{
audioRecord.stop();
}
catch(IllegalStateException e){
Log.e("Stop failed", e.toString());
}
btRec.setText("Start");
// canvasDisplaySpectrum.drawColor(Color.BLACK);
}
else {
started = true;
CANCELLED_FLAG = false;
btRec.setText("Stop");
recordTask = new RecordAudio();
recordTask.execute();
}
}
});
}
private class RecordAudio extends AsyncTask<Void, double[], Boolean> {
@Override
protected Boolean doInBackground(Void... params) {
int bufferSize = AudioRecord.getMinBufferSize(frequency,
channelConfiguration, audioEncoding);
audioRecord = new AudioRecord(
MediaRecorder.AudioSource.DEFAULT, frequency,
channelConfiguration, audioEncoding, bufferSize);
int bufferReadResult;
short[] buffer = new short[blockSize];
double[] toTransform = new double[blockSize];
try {
audioRecord.startRecording();
} catch (IllegalStateException e) {
Log.e("Recording failed", e.toString());
}
while (started) {
if (isCancelled() || (CANCELLED_FLAG == true)) {
started = false;
//publishProgress(cancelledResult);
Log.d("doInBackground", "Cancelling the RecordTask");
break;
} else {
bufferReadResult = audioRecord.read(buffer, 0, blockSize);
for (int i = 0; i < blockSize && i < bufferReadResult; i++) {
toTransform[i] = (double) buffer[i] / 32768.0; // signed 16 bit
}
//transformer.ft(toTransform);
//publishProgress(toTransform);
}
}
return true;
}
}
}
You have to download your file and save it in the cache; then, for any request, check whether the cache file is available: use it if it is, otherwise request a new file.
For complete help, look into one of my answers: Download and cache media files.
I want to capture audio from an Android device. My code below seems to successfully create a wav file on the SD card, but it cannot be played. I tried playing it with different media players, but none work. There must be an issue in my code causing this.
Code:
public class MainActivity extends ActionBarActivity {
private static final String LOG_TAG = "AudioRecordTest";
static final int AUDIO_PORT = 2048;
static final int SAMPLE_RATE = 8000;
static final int SAMPLE_INTERVAL = 20; // milliseconds
static final int SAMPLE_SIZE = 2; // bytes per sample
static final int BUF_SIZE = SAMPLE_INTERVAL * SAMPLE_INTERVAL * SAMPLE_SIZE * 2;
private static int[] mSampleRates = new int[]{44100, 44056, 47250, 48000, 22050, 16000, 11025, 8000};
private Thread recordingThread = null;
private boolean isRecording = false;
int BufferElements2Rec = 1024; // 1024 shorts = 2048 bytes, since each element is 2 bytes
int BytesPerElement = 2; // 2 bytes per sample in 16-bit PCM
private int bufferSize;
private AudioRecord recorder;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
startRecording();
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
private void startRecording() {
recorder = findAudioRecord();
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
// convert short to byte
private byte[] short2byte(short[] sData) {
int shortArrsize = sData.length;
byte[] bytes = new byte[shortArrsize * 2];
for (int i = 0; i < shortArrsize; i++) {
bytes[i * 2] = (byte) (sData[i] & 0x00FF);
bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
sData[i] = 0;
}
return bytes;
}
public AudioRecord findAudioRecord() {
for (int rate : mSampleRates) {
for (short audioFormat : new short[]{AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT}) {
for (short channelConfig : new short[]{AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO}) {
try {
Log.d(LOG_TAG, "Attempting rate " + rate + "Hz, bits: " + audioFormat + ", channel: "
+ channelConfig);
bufferSize = AudioRecord.getMinBufferSize(rate, channelConfig, audioFormat);
if (bufferSize != AudioRecord.ERROR_BAD_VALUE) {
// check if we can instantiate and have a success
recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, rate, channelConfig, audioFormat, bufferSize);
if (recorder.getState() == AudioRecord.STATE_INITIALIZED)
return recorder;
}
} catch (Exception e) {
Log.e(LOG_TAG, rate + "Exception, keep trying.", e);
}
}
}
}
return null;
}
private void writeAudioDataToFile() {
/*// Write the output audio in byte
short sData[] = new short[BufferElements2Rec];
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(sData, 0, BufferElements2Rec);
System.out.println("Short writing to file" + sData.toString());
// // stores the voice buffer
byte bData[] = short2byte(sData);
sendLiveAudio(bData);
}*/
String filePath = "/sdcard/test.wav";
short sData[] = new short[bufferSize / 2];
FileOutputStream os = null;
try {
os = new FileOutputStream(filePath);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(sData, 0, bufferSize / 2);
Log.d("eray", "Short writing to file" + sData.toString());
try {
// // writes the data to file from buffer
// // stores the voice buffer
byte bData[] = short2byte(sData);
os.write(bData, 0, bufferSize);
} catch (IOException e) {
e.printStackTrace();
}
}
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
private void stopRecording() {
// stops the recording activity
if (null != recorder) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
}
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onBackPressed() {
super.onBackPressed();
stopRecording();
}
}
Try this:
public class Audio_Record extends Activity {
private static final int RECORDER_SAMPLERATE = 8000;
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private AudioRecord recorder = null;
private Thread recordingThread = null;
private boolean isRecording = false;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
setButtonHandlers();
enableButtons(false);
int bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE,
RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
}
private void setButtonHandlers() {
((Button) findViewById(R.id.btnStart)).setOnClickListener(btnClick);
((Button) findViewById(R.id.btnStop)).setOnClickListener(btnClick);
}
private void enableButton(int id, boolean isEnable) {
((Button) findViewById(id)).setEnabled(isEnable);
}
private void enableButtons(boolean isRecording) {
enableButton(R.id.btnStart, !isRecording);
enableButton(R.id.btnStop, isRecording);
}
int BufferElements2Rec = 1024; // 1024 shorts = 2048 bytes, since each element is 2 bytes
int BytesPerElement = 2; // 2 bytes per sample in 16-bit PCM
private void startRecording() {
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
RECORDER_SAMPLERATE, RECORDER_CHANNELS,
RECORDER_AUDIO_ENCODING, BufferElements2Rec * BytesPerElement);
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
public void run() {
writeAudioDataToFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
//convert short to byte
private byte[] short2byte(short[] sData) {
int shortArrsize = sData.length;
byte[] bytes = new byte[shortArrsize * 2];
for (int i = 0; i < shortArrsize; i++) {
bytes[i * 2] = (byte) (sData[i] & 0x00FF);
bytes[(i * 2) + 1] = (byte) (sData[i] >> 8);
sData[i] = 0;
}
return bytes;
}
private void writeAudioDataToFile() {
// Write the output audio in byte
String filePath = "/sdcard/voice8K16bitmono.pcm";
short sData[] = new short[BufferElements2Rec];
FileOutputStream os = null;
try {
os = new FileOutputStream(filePath);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
while (isRecording) {
// gets the voice output from microphone to byte format
recorder.read(sData, 0, BufferElements2Rec);
System.out.println("Short writing to file" + sData.toString());
try {
// // writes the data to file from buffer
// // stores the voice buffer
byte bData[] = short2byte(sData);
os.write(bData, 0, BufferElements2Rec * BytesPerElement);
} catch (IOException e) {
e.printStackTrace();
}
}
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
private void stopRecording() {
// stops the recording activity
if (null != recorder) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
}
}
private View.OnClickListener btnClick = new View.OnClickListener() {
public void onClick(View v) {
switch (v.getId()) {
case R.id.btnStart: {
enableButtons(true);
startRecording();
break;
}
case R.id.btnStop: {
enableButtons(false);
stopRecording();
break;
}
}
}
};
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
finish();
}
return super.onKeyDown(keyCode, event);
}
}
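Note that the answer above writes raw PCM to a .pcm file, which most players cannot open directly. To get a playable .wav, prepend a canonical 44-byte RIFF header before the PCM data; a sketch (the helper name is illustrative, and the constants in main match the answer's 8 kHz mono 16-bit settings):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

public class WavHeader {
    // Build the canonical 44-byte RIFF/WAVE header for a PCM payload.
    // pcmBytes: length of the raw PCM payload in bytes.
    public static byte[] header(int pcmBytes, int sampleRate, int channels, int bitsPerSample) {
        int byteRate = sampleRate * channels * bitsPerSample / 8;
        int blockAlign = channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes(StandardCharsets.US_ASCII));
        b.putInt(36 + pcmBytes);            // chunk size = total file size - 8
        b.put("WAVE".getBytes(StandardCharsets.US_ASCII));
        b.put("fmt ".getBytes(StandardCharsets.US_ASCII));
        b.putInt(16);                        // size of the PCM fmt sub-chunk
        b.putShort((short) 1);               // audio format 1 = linear PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) blockAlign);
        b.putShort((short) bitsPerSample);
        b.put("data".getBytes(StandardCharsets.US_ASCII));
        b.putInt(pcmBytes);
        return b.array();
    }

    public static void main(String[] args) {
        byte[] h = header(16000, 8000, 1, 16);
        System.out.println(h.length); // 44
    }
}
```

Writing this header to the FileOutputStream before the recording loop (with the final payload size patched in afterwards, or known in advance) is what makes the file playable.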
I am trying to record voice using the AudioRecord class and write the read bytes to the speaker using the AudioTrack class. I can hear the voice coming from the speaker, but it is very low and a lot of noise comes along with the recorded voice.
Is there any solution to reduce the noise and amplify the actual voice?
I am using the code below:
package com.my.mic.record;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.util.Log;
public class Record extends Thread
{
int numCrossing,p,numSamples,af;
static final int bufferSize = 200000;
short[] buffer = new short[bufferSize];
short[] readBuffer = new short[bufferSize];
boolean isRecording;
AudioManager am;
public AudioRecord arec;
public AudioTrack atrack;
private int sampleRate = 8000;
public void run() {
isRecording = true;
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
int buffersize =
AudioRecord.getMinBufferSize(sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT);
arec = new
AudioRecord(MediaRecorder.AudioSource.MIC,sampleRate,AudioFormat.CHANNEL_CONFIGURATION_MONO,AudioFormat.ENCODING_PCM_16BIT,
buffersize);
//buffersize = arec.getMinBufferSize(arec.getSampleRate(), arec.getChannelConfiguration(), arec.getAudioFormat());
atrack = new
AudioTrack(AudioManager.STREAM_MUSIC,arec.getSampleRate(),arec.getChannelConfiguration(),arec.getAudioFormat(),
buffersize,AudioTrack.MODE_STREAM);
am.setRouting(AudioManager.MODE_NORMAL,AudioManager.ROUTE_EARPIECE,
AudioManager.ROUTE_ALL);
//am.setMode(AudioManager.MODE_NORMAL);
Log.d("SPEAKERPHONE", "Is speakerphone on? : " +
am.isSpeakerphoneOn());
atrack.setPlaybackRate(sampleRate);
byte[] buffer = new byte[buffersize];
arec.startRecording();
atrack.play();
//atrack.setStereoVolume(atrack.getMaxVolume(), atrack.getMaxVolume());
final float frequency = sampleRate;
float increment = (float) ((2 * Math.PI) * frequency / 44100); // angular increment for each sample
float angle = 0;
//AndroidAudioDevice device = new AndroidAudioDevice( );
while(isRecording) {
try {
arec.read(buffer, 0, buffersize);
atrack.write(buffer, 0, buffer.length);
} catch (Exception e) {
Log.d("Record", ""+e);
}
}
arec.stop();
atrack.stop();
//device.releaseTrack();
isRecording = false;
}
public boolean isRecording() {
return isRecording;
}
public void setRecording(boolean isRecording) {
this.isRecording = isRecording;
}
public AudioManager getAm() {
return am;
}
public void setAm(AudioManager am) {
this.am = am;
}
public void stopRecording(){
arec.stop();
arec.release();
atrack.stop();
atrack.release();
arec=null;
atrack=null;
setRecording(false);
this.stop();
}
}
Go to this working audio record example. If you are testing in the emulator, the sample rate will default to 8000. On a device, increase the sample rate to 44100 or higher and record; it will work better.
I made a little signal-processing app. It processes an audio signal (Morse code) at a certain frequency with the Goertzel algorithm. The application saves a temporary file to the filesystem and, after recording is finished, starts to detect signals. Now I get a result with a bunch of magnitudes.
I don't really know what to read from those magnitudes. How can I decode the Morse code from them? How do I read them? I tried to find references, but nowhere is it explained what the result is and how to read it.
EDIT:
My Morse code application is made with Delphi and uses the Windows Beep function to send signals at a certain frequency. I'm using 1200 Hz for the signals. The pauses between signals and words, and the Morse beeps, are as Wikipedia describes. Everything is accurate.
Goertzel.java:
public class Goertzel {
private float samplingRate;
private float targetFrequency;
private int n;
private double coeff, Q1, Q2;
private double sine, cosine;
public Goertzel(float samplingRate, float targetFrequency, int inN) {
this.samplingRate = samplingRate;
this.targetFrequency = targetFrequency;
n = inN;
sine = Math.sin(2 * Math.PI * (targetFrequency / samplingRate));
cosine = Math.cos(2 * Math.PI * (targetFrequency / samplingRate));
coeff = 2 * cosine;
}
public void resetGoertzel() {
Q1 = 0;
Q2 = 0;
}
public void initGoertzel() {
int k;
float floatN;
double omega;
floatN = (float) n;
k = (int) (0.5 + ((floatN * targetFrequency) / samplingRate));
omega = (2.0 * Math.PI * k) / floatN;
sine = Math.sin(omega);
cosine = Math.cos(omega);
coeff = 2.0 * cosine;
resetGoertzel();
}
public void processSample(double sample) {
double Q0;
Q0 = coeff * Q1 - Q2 + sample;
Q2 = Q1;
Q1 = Q0;
}
public double[] getRealImag(double[] parts) {
parts[0] = (Q1 - Q2 * cosine);
parts[1] = (Q2 * sine);
return parts;
}
public double getMagnitudeSquared() {
return (Q1 * Q1 + Q2 * Q2 - Q1 * Q2 * coeff);
}
}
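The filter above can be exercised without Android: feed it a synthesized sine and the magnitude is large only when the tone matches the target frequency. A self-contained sketch (the filter recurrence is re-inlined from the Goertzel class above; class and method names are illustrative):

```java
public class GoertzelDemo {
    // Same recurrence and magnitude formula as the Goertzel class above, condensed.
    public static double magnitude(double targetHz, double sampleRate, double[] samples) {
        double coeff = 2 * Math.cos(2 * Math.PI * targetHz / sampleRate);
        double q1 = 0, q2 = 0;
        for (double s : samples) {
            double q0 = coeff * q1 - q2 + s;  // Q0 = coeff*Q1 - Q2 + sample
            q2 = q1;
            q1 = q0;
        }
        return Math.sqrt(q1 * q1 + q2 * q2 - q1 * q2 * coeff);
    }

    // Unit-amplitude sine of n samples at the given frequency.
    public static double[] tone(double hz, double sampleRate, int n) {
        double[] out = new double[n];
        for (int i = 0; i < n; i++) out[i] = Math.sin(2 * Math.PI * hz * i / sampleRate);
        return out;
    }

    public static void main(String[] args) {
        double on = magnitude(1200, 8000, tone(1200, 8000, 800));  // tone at the target
        double off = magnitude(1200, 8000, tone(400, 8000, 800));  // tone far away
        System.out.println(on > 10 * off); // the filter responds strongly only near 1200 Hz
    }
}
```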
SoundCompareActivity.java
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
public class SoundCompareActivity extends Activity {
private static final int RECORDER_SAMPLE_RATE = 8000; // at least 2 times
// higher than sound
// frequency,
private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_CONFIGURATION_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private AudioRecord recorder = null;
private int bufferSize = 0;
private Thread recordingThread = null;
private boolean isRecording = false;
private Button startRecBtn;
private Button stopRecBtn;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
startRecBtn = (Button) findViewById(R.id.button1);
stopRecBtn = (Button) findViewById(R.id.button2);
startRecBtn.setEnabled(true);
stopRecBtn.setEnabled(false);
bufferSize = AudioRecord.getMinBufferSize(RECORDER_SAMPLE_RATE,
RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING);
startRecBtn.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
Log.d("SOUNDCOMPARE", "Start Recording");
startRecBtn.setEnabled(false);
stopRecBtn.setEnabled(true);
stopRecBtn.requestFocus();
startRecording();
}
});
stopRecBtn.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
Log.d("SOUNDCOMPARE", "Stop recording");
startRecBtn.setEnabled(true);
stopRecBtn.setEnabled(false);
startRecBtn.requestFocus();
stopRecording();
}
});
}
private void startRecording() {
recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
RECORDER_SAMPLE_RATE, RECORDER_CHANNELS,
RECORDER_AUDIO_ENCODING, bufferSize);
recorder.startRecording();
isRecording = true;
recordingThread = new Thread(new Runnable() {
@Override
public void run() {
writeAudioDataToTempFile();
}
}, "AudioRecorder Thread");
recordingThread.start();
}
private String getTempFilename() {
File file = new File(getFilesDir(), "tempaudio");
if (!file.exists()) {
file.mkdirs();
}
File tempFile = new File(getFilesDir(), "signal.raw");
if (tempFile.exists())
tempFile.delete();
return (file.getAbsolutePath() + "/" + "signal.raw");
}
private void writeAudioDataToTempFile() {
byte data[] = new byte[bufferSize];
String filename = getTempFilename();
FileOutputStream os = null;
try {
os = new FileOutputStream(filename);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
int read = 0;
if (os != null) {
while (isRecording) {
read = recorder.read(data, 0, bufferSize);
if (read != AudioRecord.ERROR_INVALID_OPERATION) {
try {
os.write(data);
} catch (IOException e) {
e.printStackTrace();
}
}
}
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
private void deleteTempFile() {
File file = new File(getTempFilename());
file.delete();
}
private void stopRecording() {
if (recorder != null) {
isRecording = false;
recorder.stop();
recorder.release();
recorder = null;
recordingThread = null;
}
new MorseDecoder().execute(new File(getTempFilename()));
}
}
MorseDecoder.java:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.os.AsyncTask;
import android.util.Log;
public class MorseDecoder extends AsyncTask<File, Void, Void> {
private FileInputStream is = null;
@Override
protected Void doInBackground(File... files) {
int index;
//double magnitudeSquared;
double magnitude;
int bufferSize = AudioRecord.getMinBufferSize(8000,
AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
Goertzel g = new Goertzel(8000, 1200, bufferSize);
g.initGoertzel();
for (int i = 0; i < files.length; i++) {
byte[] data = new byte[bufferSize];
try {
is = new FileInputStream(files[i]);
while(is.read(data) != -1) {
ShortBuffer sbuf = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
short[] audioShorts = new short[sbuf.capacity()];
sbuf.get(audioShorts);
float[] audioFloats = new float[audioShorts.length];
for (int j = 0; j < audioShorts.length; j++) {
audioFloats[j] = ((float)audioShorts[j]) / 0x8000;
}
for (index = 0; index < audioFloats.length; index++) {
g.processSample(data[index]);
}
magnitude = Math.sqrt(g.getMagnitudeSquared());
Log.d("SoundCompare", "Relative magnitude = " + magnitude);
g.resetGoertzel();
}
is.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
return null;
}
}
EDIT2:
I noticed some bugs in the sample processing; the code in the while loop is now:
while(is.read(data) != -1) {
ShortBuffer sbuf = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer();
short[] audioShorts = new short[sbuf.capacity()];
sbuf.get(audioShorts);
float[] audioFloats = new float[audioShorts.length];
for (int j = 0; j < audioShorts.length; j++) {
audioFloats[j] = ((float)audioShorts[j]) / 0x8000;
}
for (index = 0; index < audioFloats.length; index++) {
g.processSample(audioFloats[index]);
magnitude = Math.sqrt(g.getMagnitudeSquared());
Log.d("SoundCompare", "Relative magnitude = " + magnitude);
}
//magnitude = Math.sqrt(g.getMagnitudeSquared());
//Log.d("SoundCompare", "Relative magnitude = " + magnitude);
g.resetGoertzel();
}
The output of your Goertzel filter will increase when a tone within its passband is present, and then decrease when the tone is removed. In order to detect pulses of a tone, e.g. morse code, you need some kind of threshold detector on the output of the filter which will just give a boolean value for "tone present" / "tone not present" on a sample-by-sample basis. Try plotting the output values and it should be obvious once you see it in graphical form.
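A minimal sketch of such a threshold detector, turning a sequence of per-block magnitudes into tone-on/tone-off runs (the threshold value and block granularity are assumptions you would tune from the observed noise floor, not part of the Goertzel output itself):

```java
import java.util.ArrayList;
import java.util.List;

public class ToneGate {
    // Collapse filter magnitudes into runs of {state, length}:
    // state 1 = tone present (magnitude above threshold), state 0 = tone absent.
    public static List<int[]> runs(double[] mags, double threshold) {
        List<int[]> out = new ArrayList<>();
        if (mags.length == 0) return out;
        boolean on = mags[0] > threshold;
        int len = 0;
        for (double m : mags) {
            boolean nowOn = m > threshold;
            if (nowOn == on) {
                len++;
            } else {
                out.add(new int[]{on ? 1 : 0, len}); // close the previous run
                on = nowOn;
                len = 1;
            }
        }
        out.add(new int[]{on ? 1 : 0, len}); // close the final run
        return out;
    }

    public static void main(String[] args) {
        double[] mags = {0.1, 5.0, 5.2, 0.2, 0.1, 6.0};
        for (int[] r : runs(mags, 1.0)) System.out.println(r[0] + " x" + r[1]);
    }
}
```

The run lengths are then what you compare against the Morse timing rules (a dash is three dot-units, etc.) to recover symbols.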
Plot the signal magnitudes on a graph versus time (some CW decoding apps for the PC do this in real-time). Now figure out what the graph for each Morse code symbol should look like. Then study some pattern matching algorithms. If there is enough noise present, you may want to try some statistical pattern matching methods.
Here's the Wikipedia link for proper Morse Code timing.