AudioTrack plays static sound - android

I want to play music tracks with the AudioTrack class in Android, using stereo 16-bit PCM. Here's my code for MusicListFragment.java:
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.AdapterView;
import android.widget.ListView;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;

public class MusicListFragment extends Fragment implements AdapterView.OnItemClickListener {

    private AudioTrack mAudioTrack;

    public MusicListFragment() {
    }

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        // Inflate the layout for this fragment
        View view = inflater.inflate(R.layout.fragment_music_list, container, false);
        ListView musicListView = (ListView) view.findViewById(R.id.music_list_view);
        musicListView.setAdapter(new MusicListAdapter(getActivity()));

        int minBufferSize = AudioTrack.getMinBufferSize(
                22000,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);
        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 22000,
                AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT, minBufferSize,
                AudioTrack.MODE_STREAM);

        musicListView.setOnItemClickListener(this);
        return view;
    }

    @Override
    public void onItemClick(AdapterView<?> adapterView, View view, int i, long l) {
        Music music = (Music) adapterView.getItemAtPosition(i);
        InputStream inputStream = null;
        byte[] bytes = new byte[512];
        mAudioTrack.play();
        try {
            File file = new File(music.getPath());
            inputStream = new FileInputStream(file);
            int k;
            while ((k = inputStream.read(bytes)) != -1) {
                mAudioTrack.write(bytes, 0, k);
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The adapter works fine, since I tested it using the MediaPlayer class. (I can provide my adapter and other classes too, if you want, but I doubt they are the issue.) My list view shows the title, artist and album of each song, and also stores each song's path.
I could play songs easily with MediaPlayer, but I'm having problems with AudioTrack. The code makes the device play static, like an old TV with no signal! :)
As you can see in the item click listener, I'm:
1. getting the music item that was selected,
2. reading the music file into an InputStream,
3. and finally writing the bytes to the AudioTrack instance. (I've also tried moving the mAudioTrack.play() call after the try/catch statement, with no luck.)
What am I doing wrong here?

Playing the binary contents of a compressed audio file to the AudioTrack, perhaps? This won't work unless your music files are stored in a raw, uncompressed format: AudioTrack expects PCM data. Even the header on a .wav file would sound like static, until you reached the raw samples.
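For instance, a minimal sketch of streaming a canonical .wav, assuming a plain 44-byte RIFF/WAVE header and 16-bit PCM data that matches the track's configured sample rate and channel layout (playWav is a hypothetical helper, not from the answer above):

private void playWav(File file, AudioTrack track) throws IOException {
    InputStream in = new FileInputStream(file);
    try {
        long toSkip = 44; // canonical header size; real files can carry extra chunks
        while (toSkip > 0) {
            long skipped = in.skip(toSkip);
            if (skipped <= 0) break;
            toSkip -= skipped;
        }
        byte[] chunk = new byte[4096];
        int n;
        track.play();
        while ((n = in.read(chunk)) != -1) {
            track.write(chunk, 0, n); // raw PCM samples go straight to the track
        }
    } finally {
        in.close();
    }
}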

Thanks to @greeble31's answer I understand my issue now. I searched for how to decode .mp3 and .wav files to PCM and found some useful answers here and here, in case anyone needs them.
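For reference, a hedged sketch of one common decoding approach: pull compressed samples with MediaExtractor, decode them to PCM with MediaCodec (the getInputBuffer/getOutputBuffer accessors need API 21+), and write the decoded buffers to the AudioTrack. decodeToTrack is a hypothetical helper name, and error handling and output-format changes are simplified:

import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import java.io.IOException;
import java.nio.ByteBuffer;

static void decodeToTrack(String path, AudioTrack track) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(path);
    MediaFormat format = extractor.getTrackFormat(0); // assume track 0 is the audio track
    extractor.selectTrack(0);
    MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
    codec.configure(format, null, null, 0);
    codec.start();
    track.play();
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean inputDone = false, outputDone = false;
    while (!outputDone) {
        if (!inputDone) {
            int inIndex = codec.dequeueInputBuffer(10000);
            if (inIndex >= 0) {
                int size = extractor.readSampleData(codec.getInputBuffer(inIndex), 0);
                if (size < 0) { // no more compressed data
                    codec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
        }
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            ByteBuffer outBuf = codec.getOutputBuffer(outIndex);
            byte[] pcm = new byte[info.size];
            outBuf.get(pcm);
            track.write(pcm, 0, pcm.length); // decoded PCM: this is what AudioTrack expects
            codec.releaseOutputBuffer(outIndex, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) outputDone = true;
        }
    }
    codec.stop();
    codec.release();
    extractor.release();
}

Ideally the AudioTrack itself is configured with the sample rate and channel count reported by the decoder's output format rather than hard-coded values.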

Related

How to play InputStream of an audio file that's not within a url or storage?

Background
I've succeeded in uploading an audio file (3gp) to Google Drive.
Now I want to be able to play the file within the app.
The Google Drive API only lets you get the InputStream of the file that's stored there.
The problem
None of MediaPlayer's setDataSource variants fit my case, where all I have is an InputStream:
http://developer.android.com/reference/android/media/MediaPlayer.html#setDataSource(java.io.FileDescriptor)
I know I could save the file from Google Drive to the cache and play it from there, but I want to avoid the storage handling and play the file on the fly.
What I've tried
I searched for this issue and only found that it might be possible using AudioTrack (here). It might also be possible using the new Jelly Bean features (shown here, found from here), but I'm not sure, as it's quite low level.
Sadly, using AudioTrack I got the wrong sounds being played (noise).
I've also noticed that MediaPlayer has the option to set the data source to a MediaDataSource (here), but not only am I not sure how to use it, it also requires API 23 and above.
Of course, I tried using the url that Google Drive provides, but it serves other purposes and doesn't point at the audio file itself, so it can't be used with MediaPlayer.
The question
Given an InputStream, is it possible to use AudioTrack or something else to play a 3gp audio file?
Is there maybe a support library solution for this?
If your minSdkVersion is 23 or higher, you can use setDataSource(MediaDataSource) and supply your own subclass of the abstract MediaDataSource class.
For older devices, you should be able to use a pipe created from ParcelFileDescriptor. You would have a thread that writes data to your end of the pipe, and pass the FileDescriptor (from getFileDescriptor()) for the player's end to setDataSource(FileDescriptor).
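A rough sketch of that pipe approach (streamToPlayer is a hypothetical helper name; note that a pipe is not seekable, so this only suits formats the player can consume as a pure stream):

import android.media.MediaPlayer;
import android.os.ParcelFileDescriptor;
import java.io.InputStream;
import java.io.OutputStream;

static void streamToPlayer(final InputStream in, MediaPlayer player) throws Exception {
    // pipe[0] is the read end (for the player), pipe[1] the write end (for us).
    ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
    final OutputStream out = new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]);
    new Thread(new Runnable() {
        @Override
        public void run() {
            byte[] buf = new byte[8192];
            try {
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            } catch (Exception ignored) {
                // the player may close its end early; just stop pumping
            } finally {
                try { out.close(); } catch (Exception ignored) {}
            }
        }
    }).start();
    player.setDataSource(pipe[0].getFileDescriptor());
    player.prepareAsync();
}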
The simplest MediaDataSource implementation example:
import android.media.MediaDataSource;
import android.os.Build;
import android.support.annotation.RequiresApi;

import java.io.IOException;
import java.io.InputStream;

@RequiresApi(api = Build.VERSION_CODES.M)
public class InputStreamMediaDataSource extends MediaDataSource {

    private InputStream is;
    private long streamLength = -1, lastReadEndPosition;

    public InputStreamMediaDataSource(InputStream is, long streamLength) {
        this.is = is;
        this.streamLength = streamLength;
        if (streamLength <= 0) {
            try {
                // Note: a correct stream length from InputStream#available()
                // is not guaranteed by every InputStream implementation!
                this.streamLength = is.available();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public synchronized void close() throws IOException {
        is.close();
    }

    @Override
    public synchronized int readAt(long position, byte[] buffer, int offset, int size) throws IOException {
        if (position >= streamLength)
            return -1;
        if (position + size > streamLength)
            size -= (position + size) - streamLength;
        if (position < lastReadEndPosition) {
            // The player seeked backwards; an InputStream can only move
            // forward, so reopen it from the beginning.
            is.close();
            lastReadEndPosition = 0;
            is = getNewCopyOfInputStreamSomeHow(); // new FileInputStream(mediaFile), for example
        }
        long skipped = is.skip(position - lastReadEndPosition);
        if (skipped == position - lastReadEndPosition) {
            int bytesRead = is.read(buffer, offset, size);
            if (bytesRead == -1)
                return -1; // end of stream; don't move lastReadEndPosition backwards
            lastReadEndPosition = position + bytesRead;
            return bytesRead;
        } else {
            return -1;
        }
    }

    @Override
    public synchronized long getSize() throws IOException {
        return streamLength;
    }
}
To use it with API >= 23 you have to provide the streamLength value, and if (or rather when) MediaPlayer seeks backwards, i.e. position < lastReadEndPosition, you have to know how to create a new copy of the InputStream.
Usage example:
You have to create an Activity, initialize the MediaPlayer (there are a lot of examples of file playback around), and place the following code instead of the old player.setDataSource("/path/to/media/file.3gp"):
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
    // It is just an example! If you have a real file in phone storage, you
    // don't need to wrap it in an InputStream to play it in MediaPlayer!
    File file = new File("/path/to/media/file.3gp");
    player.setDataSource(new InputStreamMediaDataSource(new FileInputStream(file), file.length()));
} else {
    player.setDataSource(this, mediaUri);
}
If your file is a com.google.api.services.drive.model.File object on Google Drive and you have a com.google.api.services.drive.Drive instance called drive, you can get the stream with:
InputStream is = drive.getRequestFactory().buildGetRequest(new GenericUrl(file.getDownloadUrl())).execute().getContent();
In the case of Build.VERSION.SDK_INT < Build.VERSION_CODES.M, I had to set up an HTTP server on the localhost of the Android device (by means of NanoHTTPD) and transfer the byte stream through this server to MediaPlayer by uri: player.setDataSource(this, mediaUri).
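A rough sketch of that local-server fallback, assuming the NanoHTTPD 2.3.x API (newChunkedResponse and newFixedLengthResponse exist there; older releases used Response constructors instead). The class name, port, and stream factory are illustrative choices, not from the answer above:

import fi.iki.elonen.NanoHTTPD;
import java.io.InputStream;
import java.util.concurrent.Callable;

// Serves a fresh InputStream for every request on http://127.0.0.1:8089/.
public class AudioStreamServer extends NanoHTTPD {
    private final Callable<InputStream> streamFactory;

    public AudioStreamServer(Callable<InputStream> streamFactory) {
        super("127.0.0.1", 8089);
        this.streamFactory = streamFactory;
    }

    @Override
    public Response serve(IHTTPSession session) {
        try {
            // Chunked transfer, since the stream length isn't known up front.
            return newChunkedResponse(Response.Status.OK, "audio/3gpp", streamFactory.call());
        } catch (Exception e) {
            return newFixedLengthResponse(Response.Status.INTERNAL_ERROR, "text/plain", e.toString());
        }
    }
}

After server.start(), the player can then be pointed at Uri.parse("http://127.0.0.1:8089/").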
For anyone interested in using a MediaDataSource implementation, I've created one that reads ahead and caches buffers of data. It works with any InputStream; I mainly created it for reading networked files using JCIFS SmbFiles.
You can find it at https://github.com/SteveGreatApe/BufferedMediaDataSource
If you are asking about a MediaPlayer that plays a file from an audio path, maybe this code can help:
File directory = Environment.getExternalStorageDirectory();
File file = new File(directory + "/AudioRecorder");
String audioSavePathInDevice = file.getAbsolutePath() + "/" + "sample.wav";

mediaPlayer = new MediaPlayer();
try {
    mediaPlayer.setDataSource(audioSavePathInDevice);
    mediaPlayer.prepare();
} catch (IOException e) {
    e.printStackTrace();
}
mediaPlayer.start();

Record sound using AudioRecord()

I want to record some audio using AudioRecord. In order to initialize an AudioRecord object you must provide several arguments (e.g. rate, channel, encoding), and since different combinations of arguments are supported by different hardware, I went and checked working apps like
Ringdroid:
Audio recording done in Ringdroid
and Rehearsal Assistant:
Audio recording in Rehearsal Assistant
As mentioned in the documentation of the AudioRecord class, the configuration that should always work is rate = 44100 and channel = CHANNEL_IN_MONO.
I am using the same arguments when initializing my AudioRecord object, but I still get a runtime error saying that my object is uninitialized. Since Ringdroid works fine on my device (Nexus 5), I have used the same configuration when creating my AudioRecord object.
package com.example.android.visualizeaudio;

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.view.Menu;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;

public class MainActivity extends Activity {

    int mSampleRate = 44100;
    Button startButton;
    boolean started = false;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        startButton = (Button) this.findViewById(R.id.start_button);
    }

    private void RecordAudio() {
        int minBufferSize = AudioRecord.getMinBufferSize(
                mSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        // Make sure minBufferSize can contain at least 1 second of audio (16-bit samples).
        if (minBufferSize < mSampleRate * 2) {
            minBufferSize = mSampleRate * 2;
        }
        AudioRecord audioRecord = new AudioRecord(
                MediaRecorder.AudioSource.MIC,
                mSampleRate,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBufferSize
        );
        audioRecord.startRecording();
        // Do some stuff here with the recorded data
        audioRecord.stop();
        audioRecord.release();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    public void startRec(View view) {
        if (started) {
            started = false;
            startButton.setText("Start");
        } else {
            started = true;
            startButton.setText("Stop");
            Toast.makeText(this, "Recording started", Toast.LENGTH_LONG).show(); // show() was missing
            RecordAudio();
        }
    }
}
I am attaching the object inspection during debugging in case it provides more insight
Thank you
Switching to SDK target version 22 did the trick. With target SDK 23 I had these errors; I didn't know why at the time, but it seemed as if the resource I was trying to access was held by the OS. (The likely cause is that API 23 introduced runtime permissions: with targetSdkVersion 23, RECORD_AUDIO must be granted at runtime, not just declared in the manifest, or the AudioRecord object stays uninitialized.)
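If you do want to keep targetSdkVersion 23 or higher, here is a minimal sketch of requesting the permission at runtime (using the support library, as elsewhere in this thread; REQUEST_RECORD_AUDIO is an arbitrary request code):

import android.Manifest;
import android.content.pm.PackageManager;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;

private static final int REQUEST_RECORD_AUDIO = 1;

private void recordWithPermissionCheck() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED) {
        // Ask the user; the result arrives in onRequestPermissionsResult().
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.RECORD_AUDIO}, REQUEST_RECORD_AUDIO);
    } else {
        RecordAudio(); // permission already granted, safe to open the mic
    }
}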

Android app to listen for sound and record volume, using Phonegap/Cordova

I'm trying to write an app that will listen for sound over a phone/tablet's microphone. I think capturing sound is not too hard; I've found some code here.
I was wondering how I would go about associating a volume level with it. Ideally I'd like to convert the sound level into decibels, but any arbitrary scale would do just fine.
In the end I just used the Java SDK to do this, and when I later converted the app to iOS I rewrote it in Objective-C.
On Android you can import the following libraries:
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
And then make some calls like this to see the amplitude of the audio recording:
int minSize = AudioRecord.getMinBufferSize(8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord ar = new AudioRecord(MediaRecorder.AudioSource.MIC, 8000, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minSize);
short[] buffer = new short[minSize];
ar.startRecording();
while (true) {
    int read = ar.read(buffer, 0, minSize);
    long amplitudeSum = 0;
    for (int i = 0; i < read; i++) {
        amplitudeSum += Math.abs(buffer[i]); // |sample| == sqrt(sample^2)
    }
    Log.i("NOISE LEVEL", Long.toString(amplitudeSum / Math.max(read, 1)));
}
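To turn that arbitrary amplitude into something decibel-like, you can express it relative to the 16-bit full scale. This helper is a sketch of my own, not part of the original answer:

// Converts a mean absolute amplitude of 16-bit PCM samples to dBFS
// (decibels relative to full scale; 0 dBFS is the loudest possible level).
static double toDbfs(double meanAmplitude) {
    if (meanAmplitude <= 0) {
        return Double.NEGATIVE_INFINITY; // silence
    }
    return 20.0 * Math.log10(meanAmplitude / 32767.0);
}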

android sdk realtime audio record

I use this code to record and play back recorded audio in real time using the AudioTrack and AudioRecord classes:
package com.example.audiotrack;

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;

public class MainActivity extends Activity {

    private int freq = 8000;
    private AudioRecord audioRecord = null;
    private Thread Rthread = null;
    private AudioManager audioManager = null;
    private AudioTrack audioTrack = null;
    byte[] buffer = new byte[freq];

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
        final int bufferSize = AudioRecord.getMinBufferSize(freq,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, freq,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                MediaRecorder.AudioEncoder.AMR_NB, bufferSize);
        audioTrack = new AudioTrack(AudioManager.ROUTE_HEADSET, freq,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                MediaRecorder.AudioEncoder.AMR_NB, bufferSize,
                AudioTrack.MODE_STREAM);
        audioTrack.setPlaybackRate(freq);
        final byte[] buffer = new byte[bufferSize];
        audioRecord.startRecording();
        Log.i("info", "Audio Recording started");
        audioTrack.play();
        Log.i("info", "Audio Playing started");
        Rthread = new Thread(new Runnable() {
            public void run() {
                while (true) {
                    try {
                        audioRecord.read(buffer, 0, bufferSize);
                        audioTrack.write(buffer, 0, buffer.length);
                    } catch (Throwable t) {
                        Log.e("Error", "Read write failed");
                        t.printStackTrace();
                    }
                }
            }
        });
        Rthread.start();
    }
}
My problems:
1. The quality of the audio is bad.
2. When I try different frequencies, the app crashes.
The audio quality can be bad because you are using the AMR codec to compress the audio data. AMR's compression is based on an acoustic model of human speech, so any sound other than speech will come through in poor quality.
Instead of
MediaRecorder.AudioEncoder.AMR_NB
try
AudioFormat.ENCODING_PCM_16BIT
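Putting that together, the constructors from the question would look roughly like this. Note that this sketch also swaps AudioManager.ROUTE_HEADSET, which is not a stream type, for STREAM_MUSIC, and uses the CHANNEL_IN_/CHANNEL_OUT_ constants:

// PCM 16-bit on both sides, mono in and mono out.
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, freq,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, freq,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize,
        AudioTrack.MODE_STREAM);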
AudioRecord is a low-level tool, so you must take care of parameter compatibility on your own. As said in the documentation, many frequencies are not guaranteed to work.
So it is a good idea to go through all the combinations and check which of them are accessible before trying to record or play; a sketch of this probing approach follows after the link below.
A nice solution was mentioned a few times on Stack Overflow, e.g. here:
Frequency detection on Android - AudioRecord
check the public AudioRecord findAudioRecord() method.
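A sketch along the lines of that linked method: probe candidate configurations and keep the first one that actually initializes.

// Probes common configurations and returns the first AudioRecord that
// reaches STATE_INITIALIZED, or null if none work on this device.
public AudioRecord findAudioRecord() {
    int[] sampleRates = {8000, 11025, 22050, 44100};
    int[] encodings = {AudioFormat.ENCODING_PCM_8BIT, AudioFormat.ENCODING_PCM_16BIT};
    int[] channels = {AudioFormat.CHANNEL_IN_MONO, AudioFormat.CHANNEL_IN_STEREO};
    for (int rate : sampleRates) {
        for (int encoding : encodings) {
            for (int channel : channels) {
                int bufferSize = AudioRecord.getMinBufferSize(rate, channel, encoding);
                if (bufferSize <= 0) {
                    continue; // combination not supported at all
                }
                AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT,
                        rate, channel, encoding, bufferSize);
                if (recorder.getState() == AudioRecord.STATE_INITIALIZED) {
                    return recorder;
                }
                recorder.release(); // clean up the failed attempt
            }
        }
    }
    return null;
}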

Calculating the DFT with Jtransform's DoubleFFT_1D() for android

I have been searching everywhere for a reliable method to calculate the FFT of an audio byte stream received by a native function in the Android SDK (through the Eclipse IDE). I have come across the libgdx FFT and JTransforms (found here: JTransforms).
I downloaded them and added the .jar files to a libs folder in the root directory of the project. I then linked the project to the new .jar files through Project properties > Java Build Path > Libraries.
My src java file looks like this, attempting to use JTransforms:
package com.spectrum;

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.widget.LinearLayout;
import android.widget.TextView;
import android.view.View;

import com.badlogic.gdx.audio.analysis.*;
import edu.emory.mathcs.jtransforms.fft.*;
import edu.emory.mathcs.utils.*;

public class spectrum extends Activity {

    static byte[] sin = null;
    static int f = 2000;
    static int fs = 44100;
    double buf;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        //initialize(new SpectrumDesktop(), false);
        sin = playsin(f, fs);
        buf = bytearray2double(sin);
        public DoubleFFT_1D(512); //<Undefined>
        public void complexForward(sin) //<Undefined>
        playsound(sin);
    }

    public static double bytearray2double(byte[] b) {
        ByteBuffer buf = ByteBuffer.wrap(b);
        return buf.getDouble();
    }

    private void playsound(byte[] sin2) {
        int intSize = android.media.AudioTrack.getMinBufferSize(44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack at = new AudioTrack(AudioManager.STREAM_MUSIC, 44100, AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                AudioFormat.ENCODING_PCM_16BIT, intSize, AudioTrack.MODE_STREAM);
        at.play();
        at.write(sin2, 0, sin2.length);
        at.stop();
        at.release();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        // TODO Auto-generated method stub
        if (mMediaPlayer != null) {
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
    }

    public native byte[] playsin(int f, int fs);

    /** Load jni .so on initialisation */
    static {
        System.loadLibrary("SPL");
    }
}
In this example I am only using the JTransforms packages; however, I have been getting the same compile error for the libgdx packages. The compiler says that DoubleFFT_1D and complexForward are undefined.
So there is something I am missing, like not linking my libraries correctly; I am not sure.
Any help would be greatly appreciated. Am I meant to declare an instance of DoubleFFT_1D and complexForward before onCreate or something?
I know this is a noob question, but I am new to object-oriented languages and learning Java on the go. Thanks :)
You first need to create a Fourier transform object:
DoubleFFT_1D fft = new DoubleFFT_1D(n);
where n is the size of the transform you want. Note that the input array may have to be 2 times bigger than you expect, since the method takes real and imaginary parts side by side in the same array.
Then you can apply the methods to fft, e.g.
fft.complexForward(data);
where data is a double[]. Perhaps surprisingly, the result is saved back into the input array.
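A small self-contained sketch of that usage, with the interleaved real/imaginary layout spelled out (the package name matches the edu.emory.mathcs builds used in the question; newer JTransforms releases live under org.jtransforms):

import edu.emory.mathcs.jtransforms.fft.DoubleFFT_1D;

public class FftExample {
    public static void main(String[] args) {
        int n = 512; // number of complex points in the transform
        // Interleaved layout: data[2*i] = real part, data[2*i+1] = imaginary
        // part, which is why the array is twice the transform size.
        double[] data = new double[2 * n];
        for (int i = 0; i < n; i++) {
            data[2 * i] = Math.sin(2 * Math.PI * 8 * i / n); // test tone in bin 8
            data[2 * i + 1] = 0.0;                           // purely real input
        }
        DoubleFFT_1D fft = new DoubleFFT_1D(n);
        fft.complexForward(data); // in place: data now holds the spectrum
        double magnitude = Math.hypot(data[2 * 8], data[2 * 8 + 1]);
        System.out.println("Magnitude at bin 8: " + magnitude);
    }
}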
1. In Project properties > Java Build Path > Order and Export, check all your added dependencies so they are included with the project class files.
2. Select Android Tools > Fix Project Properties.
Then run your app.
