Audio Recorder: get noise level in decibels - Android

I am trying to fetch the noise level of recorded audio in decibels. I am using the following code, but it is not giving the correct output:
byte[] audioData = new byte[bufferSize];
recorder.read(audioData, 0, bufferSize);
ByteBuffer bb = ByteBuffer.wrap(audioData);
int sampleSize = bb.getInt();
Now if I log sampleSize, it gives a very huge value like 956318464.
Can anybody tell me how to get the correct noise level in decibels?

First off: decibels are a ratio. You can't just "get decibels"; you have to compare the measured volume to a baseline measurement. In terms of amplitude the equation is
dB = 20 * log10(amplitude / baseline_amplitude);
(use a factor of 10 instead of 20 when comparing powers rather than amplitudes). If you're recording the audio now, get the amplitude with MediaRecorder.getMaxAmplitude(). For a baseline amplitude, measure the expected background noise.
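A minimal sketch of that idea (the names are assumptions: the baseline constant is a placeholder you would measure from background noise, and recorder is a MediaRecorder that has already been prepared and started):
private static final double BASELINE_AMPLITUDE = 1.0; // assumed placeholder: measure your noise floor

double getNoiseDb(MediaRecorder recorder) {
    // getMaxAmplitude() returns the maximum absolute sample value since the previous call (0..32767)
    int amplitude = recorder.getMaxAmplitude();
    if (amplitude <= 0)
        return Double.NEGATIVE_INFINITY; // no data yet / silence
    // 20 * log10 because we are comparing amplitudes, not powers
    return 20.0 * Math.log10(amplitude / BASELINE_AMPLITUDE);
}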

private static final float MAX_16_BIT = 32768;
private static final float FUDGE = 0.6f;

public int calculatePowerDb(short[] sdata, int off, int samples) {
    // Accumulate the sum and the sum of squares of the samples.
    double sum = 0;
    double sqsum = 0;
    for (int i = 0; i < samples; i++) {
        final long v = sdata[off + i];
        sum += v;
        sqsum += v * v;
    }

    // Power = mean square minus the squared mean (removes any DC offset),
    // normalised to full scale for 16-bit samples.
    double power = (sqsum - sum * sum / samples) / samples;
    power /= MAX_16_BIT * MAX_16_BIT;

    // Convert to dB relative to full scale; FUDGE is a small calibration offset.
    double result = Math.log10(power) * 10f + FUDGE;
    return (int) result;
}
This method works fine for me.
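For reference, a hedged usage sketch: assuming the AudioRecord from the question is configured for AudioFormat.ENCODING_PCM_16BIT, you can read the samples into a short[] and pass them straight to the method above.
// Sketch: recorder is an AudioRecord using ENCODING_PCM_16BIT; bufferSize is a sample count.
short[] samples = new short[bufferSize];
int read = recorder.read(samples, 0, bufferSize); // read 16-bit samples, not raw bytes
if (read > 0) {
    int db = calculatePowerDb(samples, 0, read);  // dB relative to full scale, so usually negative
    Log.d("Noise", "level = " + db + " dB");
}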

Related

Incorrect frequency on manually drawn graph

I am currently writing a spectrum analyzer for Android for university, and part of this involves plotting the FFT of sound. However, I am having an issue with plotting the frequencies. The frequency values start off correct, but as I move to higher frequencies the error becomes greater and greater (at 3000 Hz, the graph shows ~3750 Hz). I feel as though there is an error in the way I am calculating the x-axis/frequency values. This is a manually drawn graph for speed purposes.
If more info/code is needed, just let me know, but my guess is that it is something simple that I have overlooked. Thanks.
xVal is the frequency value, and the scale value is there to scale it to the real graph dimensions.
int length = currentWaveDataDouble.length;
int pow2 = Integer.highestOneBit(length) << 1;
int sampleRate = 44100;
...
// actual plot part
for (int i = 0; i < p2.length; i++) {
    float xVal = (float) (i * scaleX.ScaleValue(((double) sampleRate / (pow2 >> 1))));
    if (xVal < maxFreqPlus1) {
        xVal += axisWidth + yAxisMargin;
        float yVal = (float) scaleY.ScaleValue(p2[i]);
        yVal += axisWidth + xAxisMargin;
        canvas.drawPoint(xVal, yVal, marker);
        if (yVal > yMax) {
            yMax = yVal;
            xMax = xVal;
        }
    }
}
Screenshot: frequency generator set to 4000 Hz.
Screenshot: frequency generator set to 1000 Hz (the graph reads about 1250 Hz).
Found the issue: it was in the scaler.
ValueScaler scaleY = new ValueScaler(0,maxAmpPlus1 - yAxisMargin,0,baseY);
ValueScaler scaleX = new ValueScaler(0,maxFreqPlus1 - xAxisMargin,0,baseX);
I wasn't taking the x and y margins into account when scaling the numbers.

Simple Music Visualisation with a Circle

I'm trying to build a simple music visualisation app which should just resize a circle: if the part of the music currently playing is loud, the circle should get bigger, and if not, it should get smaller.
To visualize the circle I created a custom View class which draws the circle in its onDraw method.
To get information out of the current audio, I found Android's Visualizer class and also used setDataCaptureListener.
mVisualizer = new Visualizer(mMediaPlayer.getAudioSessionId());
mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[0]);
mVisualizer.setDataCaptureListener(
        new Visualizer.OnDataCaptureListener() {
            public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
                mVisualizerView.updateVisualizer(bytes);
            }

            public void onFftDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
            }
        }, (int) (Visualizer.getMaxCaptureRate() / 1.5), true, false);
But my problem is that I don't really know how to use the byte array that comes back to work out whether the music got louder or quieter in general.
I tried taking the average of the array, but that gave me completely bad results: the circle changed its size erratically. I thought the array might contain too many outliers/extreme values (which was true), so I calculated the median of the array instead. That gave better results, but it still isn't what I want: it's not very smooth, and it's too complex, because I always have to sort the array, which is not really efficient. What am I getting wrong?
I'm really a beginner in this audio FX area, so I'm sorry if this is a dumb question or a naive attempt.
Thank you for your help!
EDIT:
private float schwelle = 5000;   // "schwelle" = threshold
private float last = 0;
...
float summe = 0;                 // "summe" = sum of the waveform bytes
for (Byte currentByte : mBytes)
    summe += currentByte;

if (summe > schwelle && summe > last) {
    last = summe;                // make the circle bigger
} else {
    last -= 100;                 // make the circle smaller
}
canvas.drawCircle(getWidth() / 2, getHeight() / 2, last / 100, mForePaint);
A really good Git project is https://github.com/felixpalmer/android-visualizer.
I came up with this myself (it's a lot simpler than the Git solution):
You can use the values of the array to draw the waveform on the outline of a circle using trigonometry, and make the starting radius of the circle bigger if the sum of the array exceeds a certain threshold:
class StarWaveformRenderer implements Renderer {
    private Paint p = new Paint();
    private static final int BOOST_TRASH_HOLD = 10000;
    private float stretchFade = 1; // circle fades after a prominent beat

    @Override
    public void render(Canvas canvas, byte[] data) {
        if (data == null || data.length == 0)
            return;
        int centerX = canvas.getWidth() / 2;
        int centerY = canvas.getHeight() / 2;
        float stretch = stretchFade;
        int sum = RenderUtils.sum(data);
        p.setColor((p.getColor() + sum / 2)); // change the color of the circle
        if (sum > BOOST_TRASH_HOLD) { // prominent beat
            stretch = (float) Math.min(canvas.getWidth(), canvas.getHeight()) / Byte.MAX_VALUE / 3; // maximum
            stretchFade = stretch;
        }
        double radDif = 2 * Math.PI / data.length; // the angle between each element of the array
        double radPos = 0;
        float lX = (float) Math.cos(radPos) * data[0] + centerX;
        float lY = (float) Math.sin(radPos) * data[0] + centerY;
        float cX;
        float cY;
        for (byte b : data) {
            cX = (float) Math.cos(radPos) * b * stretch + centerX;
            cY = (float) Math.sin(radPos) * b * stretch + centerY; // position on the outline; stretch indicates the prominence of the beat
            canvas.drawLine(lX, lY, cX, cY, p);
            lX = cX;
            lY = cY;
            radPos += radDif;
        }
        stretchFade = Math.max(1, stretchFade / 1.2f); // beat fades out
    }
}
You can program your own renderers and let the user select which one to use. Just pass the array from onWaveFormDataCapture to the render method; a minimal wiring sketch follows.
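To illustrate that wiring, a hedged sketch (VisualizerView and mRenderer are assumed names for this example, not part of the linked project): the capture callback hands the bytes to the view, and the view forwards them to the selected Renderer in onDraw.
// Sketch: assumed custom view that owns the currently selected renderer.
class VisualizerView extends View {
    private byte[] mBytes;
    private Renderer mRenderer = new StarWaveformRenderer();

    public VisualizerView(Context context) {
        super(context);
    }

    // Called from onWaveFormDataCapture with the captured waveform bytes.
    public void updateVisualizer(byte[] bytes) {
        mBytes = bytes;
        invalidate(); // trigger a redraw
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        mRenderer.render(canvas, mBytes);
    }
}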
Utils for analysing the waveform (the amplitude is stored in a somewhat unusual way):
class RenderUtils {
    private static final byte SHIFT = Byte.MAX_VALUE;

    static int sum(byte[] data) {
        int sum = 0;
        for (byte b : data)
            sum += b;
        return sum;
    }

    static int toAmplitude(byte b) {
        return b > 0 ? b + SHIFT : -b; // +127 = high positive; +1 = low positive; -127 = low negative; -1 = high negative
    }

    static float toAmplitude(float f) {
        return f > 0 ? f + SHIFT : -f; // +127 = high positive; +1 = low positive; -127 = low negative; -1 = high negative
    }
}
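As a rough way to address the jumpy circle from the question, here is a hedged sketch (LoudnessTracker and the smoothing factor are assumptions, not part of the code above): it averages the decoded amplitudes and smooths the result over time, which is also cheaper than sorting for a median.
// Sketch: average the decoded amplitudes (roughly 0..254) and smooth them
// so the circle radius does not jump on every capture.
class LoudnessTracker {
    private static final float ALPHA = 0.2f; // assumed smoothing factor
    private float smoothed = 0f;

    float update(byte[] waveform) {
        if (waveform == null || waveform.length == 0)
            return smoothed;
        long total = 0;
        for (byte b : waveform)
            total += RenderUtils.toAmplitude(b);
        float average = (float) total / waveform.length;
        smoothed += ALPHA * (average - smoothed); // move gradually toward the new average
        return smoothed; // scale this to get the circle radius
    }
}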

Get amplitude from MediaPlayer using Visualizer

I've been reading other posts about calculating the amplitude in real time from a MediaPlayer, but I'm not clear on how to get a value that is useful for me. What I need is a linear amplitude value normalized to 0-100, but the other posts perform a dB calculation that doesn't make much sense to me, because the values are not normalized against the maximum 0 dB value (from How to calculate the audio amplitude in real time (android)):
double amplitude = 0;
for (int i = 0; i < audioData.length / 2; i++) {
    double y = (audioData[i * 2] | audioData[i * 2 + 1] << 8) / 32768.0;
    // depending on your endianness:
    // double y = (audioData[i * 2] << 8 | audioData[i * 2 + 1]) / 32768.0;
    amplitude += Math.abs(y);
}
amplitude = amplitude / audioData.length / 2;
I've seen that to calculate the dB I should do as below (from How to compute decibel (dB) of Amplitude from Media Player?):
double sum = 0;
for (int i = 0; i < audioData.length / 2; i++) {
    double y = (audioData[i * 2] | audioData[i * 2 + 1] << 8) / 32768.0;
    sum += y * y;
}
double rms = Math.sqrt(sum / audioData.length / 2);
dbAmp = 20.0 * Math.log10(rms);
I've tried that solution, but the real-time values are near 0 and sometimes above 0; I mean, something between -Infinity (no sound) and about 1.2 (if I drop the 20.0 multiplier), or other values of that order. Anyway, I'd like to obtain a normalized value in [0-100], not a dB value.
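For what it's worth, a hedged sketch of one way to get a 0-100 value, assuming 16-bit little-endian PCM as in the quoted code: compute the RMS normalized to full scale (0..1) and scale it linearly, or map the dB value onto 0-100 against a chosen floor (the -60 dB floor below is an arbitrary assumption).
// Sketch: audioData is assumed to hold 16-bit little-endian PCM samples.
double sum = 0;
int sampleCount = audioData.length / 2;
for (int i = 0; i < sampleCount; i++) {
    // mask the low byte to avoid sign extension before combining the two bytes
    double y = ((audioData[i * 2] & 0xFF) | (audioData[i * 2 + 1] << 8)) / 32768.0;
    sum += y * y;
}
double rms = Math.sqrt(sum / sampleCount);          // 0.0 .. 1.0 of full scale
int linearLevel = (int) Math.round(rms * 100.0);    // linear 0..100

// Alternative: normalize dBFS (<= 0, full scale = 0 dB) against an assumed -60 dB floor.
double dbfs = 20.0 * Math.log10(Math.max(rms, 1e-9));
int dbLevel = (int) Math.round(Math.min(100.0, Math.max(0.0, (dbfs + 60.0) / 60.0 * 100.0)));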

Implementation of FFT array

I have a problem implementing an FFT algorithm in Android.
Let's say that I have a WAV file 8000 bytes long.
I am aware that you have to select a size for the FFT (which also has to be a power of 2). My problem is that I am not really sure how to proceed from there.
Let's say that I have chosen an FFT size of N = 1024.
I basically have two options in mind:
1) Apply the FFT algorithm directly to the whole array of 8000 bytes.
2) Divide the 8000-byte WAV array into chunks of 1024 bytes (padding the last chunk with 0s so that there are exactly 8 chunks), then apply the FFT to each of these chunks, and finally concatenate the transformed chunks again into one single array to display.
8000*2*1 sec = 8192
I think it's option 2, but I am not completely sure.
Here is the FFT class that I am using:
package com.example.acoustics;

public class FFT {
    int n, m;

    // Lookup tables. Only need to recompute when size of FFT changes.
    double[] cos;
    double[] sin;

    public FFT(int n) {
        this.n = n;
        this.m = (int) (Math.log(n) / Math.log(2));

        // Make sure n is a power of 2
        if (n != (1 << m))
            throw new RuntimeException("FFT length must be power of 2");

        // precompute tables
        cos = new double[n / 2];
        sin = new double[n / 2];
        for (int i = 0; i < n / 2; i++) {
            cos[i] = Math.cos(-2 * Math.PI * i / n);
            sin[i] = Math.sin(-2 * Math.PI * i / n);
        }
    }

    /***************************************************************
     * fft.c
     * Douglas L. Jones
     * University of Illinois at Urbana-Champaign
     * January 19, 1992
     * http://cnx.rice.edu/content/m12016/latest/
     *
     * fft: in-place radix-2 DIT DFT of a complex input
     *
     * input:
     * n: length of FFT: must be a power of two
     * m: n = 2**m
     * input/output
     * x: double array of length n with real part of data
     * y: double array of length n with imag part of data
     *
     * Permission to copy and use this program is granted
     * as long as this header is included.
     ****************************************************************/
    public void fft(double[] x, double[] y) {
        int i, j, k, n1, n2, a;
        double c, s, t1, t2;

        // Bit-reverse
        j = 0;
        n2 = n / 2;
        for (i = 1; i < n - 1; i++) {
            n1 = n2;
            while (j >= n1) {
                j = j - n1;
                n1 = n1 / 2;
            }
            j = j + n1;
            if (i < j) {
                t1 = x[i];
                x[i] = x[j];
                x[j] = t1;
                t1 = y[i];
                y[i] = y[j];
                y[j] = t1;
            }
        }

        // FFT
        n1 = 0;
        n2 = 1;
        for (i = 0; i < m; i++) {
            n1 = n2;
            n2 = n2 + n2;
            a = 0;
            for (j = 0; j < n1; j++) {
                c = cos[a];
                s = sin[a];
                a += 1 << (m - i - 1);
                for (k = j; k < n; k = k + n2) {
                    t1 = c * x[k + n1] - s * y[k + n1];
                    t2 = s * x[k + n1] + c * y[k + n1];
                    x[k + n1] = x[k] - t1;
                    y[k + n1] = y[k] - t2;
                    x[k] = x[k] + t1;
                    y[k] = y[k] + t2;
                }
            }
        }
    }
}
I think you can use the entire array with the FFT; there is no problem with that. You can use 2^13 = 8192 and complete the array with zeros; this is called zero padding and is used in more than one implementation of the FFT. If your routine works properly, there is no problem running it over the entire array. If you instead compute the FFT over sections of size 1024, you get a segmented Fourier transform that does not describe the full spectrum of the signal well, because the FFT uses every position of the input array to compute each value of the transformed array, so bin one, for example, will not be correct unless you use the entire signal.
This is my analysis of your question. I am not a hundred percent sure, but my knowledge of Fourier series tells me that this is roughly what happens if you compute a segmented Fourier transform instead of transforming the entire series.
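To make the whole-array option concrete, a hedged sketch under the assumption that the 8000 samples have already been converted to a double[] called samples: pad to the next power of two (8192 here) and run the FFT class from the question once over the padded buffer.
// Sketch: samples is assumed to hold the 8000 audio samples as doubles.
int fftSize = Integer.highestOneBit(samples.length);
if (fftSize < samples.length)
    fftSize <<= 1;                                   // next power of two: 8192 for 8000 samples

double[] real = new double[fftSize];                 // zero padding: the unused tail stays 0
double[] imag = new double[fftSize];
System.arraycopy(samples, 0, real, 0, samples.length);

FFT fft = new FFT(fftSize);
fft.fft(real, imag);                                 // in-place transform of the whole signal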

Calculate Frequency from sound input using FFT

My app displays the peak frequency of the input sound in RPM. I have an array of doubles containing the samples in the time domain:
audioRecord.read(buffer, 0, 1024);
Then I ran an FFT on it:
transformer.ft(toTransform);
using the FFT class linked Here.
Then I took the maximum magnitude of the complex values returned by the FFT:
// block size = 1024
double magnitude[] = new double[blockSize / 2];
for (int i = 0; i < magnitude.length; i++) {
    double R = toTransform[2 * i] * toTransform[2 * i];         // real part squared
    double I = toTransform[2 * i + 1] * toTransform[2 * i + 1]; // imaginary part squared (note: index is 2 * i + 1)
    magnitude[i] = Math.sqrt(I + R);
}

int maxIndex = 0;
double max = magnitude[0];
for (int i = 1; i < magnitude.length; i++) {
    if (magnitude[i] > max) {
        max = magnitude[i];
        maxIndex = i;
    }
}
Now I have the index of the maximum magnitude.
1 - How can I get the peak frequency from it, in detail please?
2 - Is there any ready-made function like ComputeFrequency() or getFrequency()?
Thanks in advance :)
The frequency corresponding to a given FFT bin index is given by:
f = i * Fs / N;
where:
Fs = sample rate (Hz)
N = FFT size
i = bin index
So for your peak index maxIndex and FFT size blockSize the frequency of the peak will be:
f = maxIndex * Fs / blockSize;
See this answer for more details.
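As a quick worked example with the numbers from the question (sample rate and block size as given; the peak index here is hypothetical):
// Example: Fs = 44100 Hz, blockSize = 1024, as in the question.
int sampleRate = 44100;
double binWidth = (double) sampleRate / blockSize;   // about 43.07 Hz per bin
double peakFrequency = maxIndex * binWidth;          // e.g. maxIndex = 93 gives roughly 4005 Hz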
