Playing audio is too slow in Android

I'm having a problem with Android's MediaPlayer: it is too slow when calling the prepare method. I've tried simply keeping a Vector of the few MediaPlayer objects (with their data sources preloaded), but calling .start() multiple times results in weird issues.
The first issue is that it skips every other play, and sometimes playback is half (or less) as loud.
The tones are very short but need to be played as quickly as possible. My source code is posted below.
Any help is greatly appreciated.
Kevin
package com.atClass.lemon;
import java.util.Vector;
import com.atClass.cardShoe.SettingTools.SETTING_PREF;
import com.atClass.cardShoe.SettingTools.SETTING_STUB;
import com.atClass.cardShoe.SettingTools.SETTING_VALUE;
import android.content.res.AssetFileDescriptor;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnCompletionListener;
import android.net.Uri;
import android.util.Config;
import android.util.Log;
public class MediaHandler {
public static int cRepeat;
public static float cVolume = Integer.valueOf(Prefs.cPrefsGet.getString(SETTING_PREF.annunciator_volume.name()+SETTING_STUB._int.name(), PrefDefaults.getDefault(SETTING_PREF.annunciator_volume,SETTING_STUB._int)));
public static boolean cVolumeEnabled = !(Prefs.cPrefsGet.getString(SETTING_PREF.annunciator_volume.name()+SETTING_STUB._value.name(),PrefDefaults.getDefault(SETTING_PREF.annunciator_volume)).equals(SETTING_VALUE.disabled.name()));
static Vector <MediaPlayer> cQuickMediaPlayerList = new Vector<MediaPlayer>();
public static enum AUDIO_CLIP {
app_boot_sound(R.raw.windows_hardware_insert),
app_results_sound(R.raw.windows_exclamation),
app_warning_sound(R.raw.windows_hardware_fail),
app_card_draw_sound(R.raw.fs_beep5),
app_lid_open_sound(R.raw.windows_hardware_fail),
app_top_tigger_overdraw_sound(R.raw.fs_beep6),
test(R.raw.fs_beep4);
private int enumResourceId;
AUDIO_CLIP(int input){ enumResourceId = input;}
int getItem(){return enumResourceId;}
}
public static int getAudioClipIndex(AUDIO_CLIP iAudioClip){
for (int i=0; i<AUDIO_CLIP.values().length; i++){
if (AUDIO_CLIP.values()[i] == iAudioClip){
return i;
}
}
return 0;
}
public static void setupQuickMediaPlayer(){
cQuickMediaPlayerList.clear();
for (int i=0; i<AUDIO_CLIP.values().length; i++){
MediaPlayer lMediaPlayer = new MediaPlayer();
final AssetFileDescriptor afd = Global.gContext.getResources().openRawResourceFd(AUDIO_CLIP.values()[i].getItem());
try{
lMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
afd.close();
lMediaPlayer.prepare();
}catch(Exception e){}
lMediaPlayer.setVolume(cVolume,cVolume);
lMediaPlayer.setLooping(false);
lMediaPlayer.setOnCompletionListener(new OnCompletionListener(){
@Override
public void onCompletion(MediaPlayer lMediaPlayer) {
lMediaPlayer.release();
try{lMediaPlayer.prepare();}catch(Exception e){e.printStackTrace();}
}});
cQuickMediaPlayerList.add(lMediaPlayer);
}
}
public static void playAudio(AUDIO_CLIP iAudioClip){
float volume = cVolume;
volume++;
volume /= 10;
playAudio(iAudioClip,volume);
}
public static void playAudio(final AUDIO_CLIP iAudioClip, final float iVolume){
Thread lThread = new Thread(new Runnable(){
public void run() {
//int resourceId = iAudioClip.getItem();
Log.d(Global.TAG,"--> Playing audio clip: " + iAudioClip.name() + "," + iAudioClip.getItem() + "," + getAudioClipIndex(iAudioClip));
if (cVolumeEnabled == true){
//Log.d(Global.TAG,"--> Supplying volume: " + iVolume);
//Works but is too slow
// try {
// final MediaPlayer lMediaPlayer = new MediaPlayer();
// AssetFileDescriptor afd = Global.gContext.getResources().openRawResourceFd(iAudioClip.getItem());
// lMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
// afd.close();
// lMediaPlayer.prepare();
// lMediaPlayer.setVolume(iVolume,iVolume);
// lMediaPlayer.setLooping(false);
// lMediaPlayer.setOnCompletionListener(new OnCompletionListener(){
// @Override
// public void onCompletion(MediaPlayer arg0) {
// lMediaPlayer.release();
// }});
// lMediaPlayer.start();
// }catch(Exception e){}
try{
//Works half the time
cQuickMediaPlayerList.get(getAudioClipIndex(iAudioClip)).start();
}catch(Exception e){}
}
}
});
lThread.setPriority(Thread.MAX_PRIORITY);
lThread.start();
}
}

You should use SoundPool instead: http://developer.android.com/reference/android/media/SoundPool.html
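For the clips in the question, a minimal sketch of that approach might look like the class below. This is only an illustration, not your code: it assumes it lives in the same package as MediaHandler (so AUDIO_CLIP, getAudioClipIndex() and Global.gContext are visible) and it uses the pre-API-21 SoundPool constructor.
import android.media.AudioManager;
import android.media.SoundPool;
public class QuickSoundPlayer {
    private final SoundPool soundPool;
    private final int[] soundIds = new int[MediaHandler.AUDIO_CLIP.values().length];
    public QuickSoundPlayer() {
        // Up to 4 overlapping streams on the music stream (pre-API-21 constructor)
        soundPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);
        for (MediaHandler.AUDIO_CLIP clip : MediaHandler.AUDIO_CLIP.values()) {
            // load() decodes each resource up front, so play() later is nearly instant
            soundIds[MediaHandler.getAudioClipIndex(clip)] =
                    soundPool.load(Global.gContext, clip.getItem(), 1);
        }
    }
    public void play(MediaHandler.AUDIO_CLIP clip, float volume) {
        // play(soundID, leftVolume, rightVolume, priority, loop, rate)
        soundPool.play(soundIds[MediaHandler.getAudioClipIndex(clip)], volume, volume, 1, 0, 1f);
    }
}
Each clip is decoded once by load(), so repeated play() calls skip the MediaPlayer prepare() cost entirely.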

In your onCompletionListener, you call release(), followed by prepare(). This is illegal, and is probably why you're having problems starting them multiple times. If you want to call it again, don't use release(), because that frees all the resources for the MP, and should only be called when you are done with it. Use stop() instead.
However, this still won't speed up the prepare(). You might want to try seekTo(0) instead, but even then it might not be as fast as you want. It really depends on how fast you're talking about.
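Put together, that advice could look roughly like this sketch (ReusableTonePlayer is a hypothetical name, not the asker's class): prepare() once up front, rewind with seekTo(0) for each replay, and keep release() for when the clip is retired.
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.MediaPlayer;
import java.io.IOException;
public class ReusableTonePlayer {
    private final MediaPlayer player = new MediaPlayer();
    // Prepare once, up front, so later start() calls do not pay the prepare() cost.
    public ReusableTonePlayer(Context context, int rawResId) throws IOException {
        AssetFileDescriptor afd = context.getResources().openRawResourceFd(rawResId);
        player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        afd.close();
        player.prepare();
    }
    // Rewind and restart instead of release()+prepare() in a completion listener.
    public void play() {
        if (player.isPlaying()) {
            player.pause();
        }
        player.seekTo(0);
        player.start();
    }
    // Call only when the clip is no longer needed at all.
    public void dispose() {
        player.release();
    }
}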

Related

Why does this metronome app crash? (Android)

I'm working on a very small Android Project that uses this exact code from github.
However, when I (or you) intermittently button mash the start/stop button... the app eventually crashes. Unfortunately this can take a little while to reproduce... but it will happen!
Oh, I forgot the desired result!!
The desired result is that this crash does not occur. :)
Does anyone know why this crash occurs? The author of this code has had an open bug/issue for this on GitHub since March of 2013... so I'm pretty sure it's not a particularly stupid question... and if you do know the answer to this, you would no doubt be hailed as a boss.
I have been dissecting the code, print debugging, and researching AsyncTask, Handlers, and AudioTrack for a couple of days now but I can't figure it out... I will though if nobody else beats me to it.
This is the stack trace:
E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #4
Process: com.example.boober.beatkeeper, PID: 15664
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.IllegalStateException: Unable to retrieve AudioTrack pointer for write()
at android.media.AudioTrack.native_write_byte(Native Method)
at android.media.AudioTrack.write(AudioTrack.java:1761)
at android.media.AudioTrack.write(AudioTrack.java:1704)
at com.example.boober.beatkeeper.AudioGenerator.writeSound(AudioGenerator.java:55)
at com.example.boober.beatkeeper.Metronome.play(Metronome.java:60)
at com.example.boober.beatkeeper.MainActivity$MetronomeAsyncTask.doInBackground(MainActivity.java:298)
at com.example.boober.beatkeeper.MainActivity$MetronomeAsyncTask.doInBackground(MainActivity.java:283)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
You could just go to GitHub and download the original code, but in order to satisfy Stack Overflow requirements, I have also provided the even-more-concise "minimal working example" below, which you can cut and paste into Android Studio if you prefer.
MainActivity:
import android.graphics.Color;
import android.os.AsyncTask;
import android.os.Handler;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.TextView;
public class MainActivity extends AppCompatActivity {
String TAG = "AAA";
Button playStopButton;
TextView currentBeat;
// important objects
MetronomeAsyncTask aSync;
Handler mHandler;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
currentBeat = findViewById(R.id.currentBeatTextView);
playStopButton = findViewById(R.id.playStopButton);
// important objects
aSync = new MetronomeAsyncTask();
}
// only called from within playStopPressed()
private void stopPressed() {
aSync.stop();
aSync = new MetronomeAsyncTask();
}
// only called from within playStopPressed()
private void playPressed() {
//aSync.execute();
aSync.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, (Void[])null);
}
public synchronized void playStopButtonPressed(View v) {
boolean wasPlayingWhenPressed = playStopButton.isSelected();
playStopButton.setSelected(!playStopButton.isSelected());
if (wasPlayingWhenPressed) {
stopPressed();
} else {
playPressed();
}
}
// METRONOME BRAIN STUFF ------------------------------------------
private Handler getHandler() {
return new Handler() {
@Override
public void handleMessage(Message msg) {
String message = (String) msg.obj;
if (message.equals("1")) {
currentBeat.setTextColor(Color.GREEN);
}
else {
currentBeat.setTextColor(Color.BLUE);
}
currentBeat.setText(message);
}
};
}
private class MetronomeAsyncTask extends AsyncTask<Void, Void, String> {
MetronomeBrain metronome;
MetronomeAsyncTask() {
mHandler = getHandler();
metronome = new MetronomeBrain(mHandler);
Runtime.getRuntime().gc(); // <---- don't know if this line is necessary or not.
}
protected String doInBackground(Void... params) {
metronome.setBeat(4);
metronome.setNoteValue(4);
metronome.setBpm(100);
metronome.setBeatSound(2440);
metronome.setSound(6440);
metronome.play();
return null;
}
public void stop() {
metronome.stop();
metronome = null;
}
public void setBpm(short bpm) {
metronome.setBpm(bpm);
metronome.calcSilence();
}
public void setBeat(short beat) {
if (metronome != null)
metronome.setBeat(beat);
}
}
}
MetronomeBrain:
import android.os.Handler;
import android.os.Message;
public class MetronomeBrain {
private double bpm;
private int beat;
private int noteValue;
private int silence;
private double beatSound;
private double sound;
private final int tick = 1000; // samples of tick
private boolean play = true;
private AudioGenerator audioGenerator = new AudioGenerator(8000);
private Handler mHandler;
private double[] soundTickArray;
private double[] soundTockArray;
private double[] silenceSoundArray;
private Message msg;
private int currentBeat = 1;
public MetronomeBrain(Handler handler) {
audioGenerator.createPlayer();
this.mHandler = handler;
}
public void calcSilence() {
silence = (int) (((60 / bpm) * 8000) - tick);
soundTickArray = new double[this.tick];
soundTockArray = new double[this.tick];
silenceSoundArray = new double[this.silence];
msg = new Message();
msg.obj = "" + currentBeat;
double[] tick = audioGenerator.getSineWave(this.tick, 8000, beatSound);
double[] tock = audioGenerator.getSineWave(this.tick, 8000, sound);
for (int i = 0; i < this.tick; i++) {
soundTickArray[i] = tick[i];
soundTockArray[i] = tock[i];
}
for (int i = 0; i < silence; i++)
silenceSoundArray[i] = 0;
}
public void play() {
calcSilence();
do {
msg = new Message();
msg.obj = "" + currentBeat;
if (currentBeat == 1)
audioGenerator.writeSound(soundTockArray);
else
audioGenerator.writeSound(soundTickArray);
if (bpm <= 120)
mHandler.sendMessage(msg);
audioGenerator.writeSound(silenceSoundArray);
if (bpm > 120)
mHandler.sendMessage(msg);
currentBeat++;
if (currentBeat > beat)
currentBeat = 1;
} while (play);
}
public void stop() {
play = false;
audioGenerator.destroyAudioTrack();
}
public double getBpm() {
return bpm;
}
public void setBpm(int bpm) {
this.bpm = bpm;
}
public int getNoteValue() {
return noteValue;
}
public void setNoteValue(int bpmetre) {
this.noteValue = bpmetre;
}
public int getBeat() {
return beat;
}
public void setBeat(int beat) {
this.beat = beat;
}
public double getBeatSound() {
return beatSound;
}
public void setBeatSound(double sound1) {
this.beatSound = sound1;
}
public double getSound() {
return sound;
}
public void setSound(double sound2) {
this.sound = sound2;
}
}
AudioGenerator:
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
public class AudioGenerator {
private int sampleRate;
private AudioTrack audioTrack;
public AudioGenerator(int sampleRate) {
this.sampleRate = sampleRate;
}
public double[] getSineWave(int samples,int sampleRate,double frequencyOfTone){
double[] sample = new double[samples];
for (int i = 0; i < samples; i++) {
sample[i] = Math.sin(2 * Math.PI * i / (sampleRate/frequencyOfTone));
}
return sample;
}
public byte[] get16BitPcm(double[] samples) {
byte[] generatedSound = new byte[2 * samples.length];
int index = 0;
for (double sample : samples) {
// scale to maximum amplitude
short maxSample = (short) ((sample * Short.MAX_VALUE));
// in 16 bit wav PCM, first byte is the low order byte
generatedSound[index++] = (byte) (maxSample & 0x00ff);
generatedSound[index++] = (byte) ((maxSample & 0xff00) >>> 8);
}
return generatedSound;
}
public void createPlayer(){
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT, sampleRate,
AudioTrack.MODE_STREAM);
audioTrack.play();
}
public void writeSound(double[] samples) {
byte[] generatedSnd = get16BitPcm(samples);
audioTrack.write(generatedSnd, 0, generatedSnd.length);
}
public void destroyAudioTrack() {
audioTrack.stop();
// This line seems to be a most likely culprit of the start/stop crash.
// Is this line even necessary?
audioTrack.release();
}
}
Layout:
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.example.boober.android_metronome.MainActivity">
<Button
android:id="#+id/playStopButton"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginBottom="8dp"
android:layout_marginEnd="8dp"
android:layout_marginStart="8dp"
android:layout_marginTop="8dp"
android:onClick="playStopButtonPressed"
android:text="Play"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<TextView
android:id="#+id/currentBeatTextView"
android:layout_width="100dp"
android:layout_height="50dp"
android:layout_marginEnd="8dp"
android:layout_marginStart="8dp"
android:layout_marginTop="32dp"
android:text="TextView"
android:gravity="center_vertical"
android:textAlignment="center"
android:textSize="30sp"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/playStopButton" />
</android.support.constraint.ConstraintLayout>
After thinking about dmarin's comment and reading the code, I arrive at the conclusion that dmarin answered your question. It's a race condition, and it's also an access to an object which is not initialized. So the short solution is: the code needs to check whether the accessed data is initialized. The AudioTrack object can be checked for null, or for whether getState() equals STATE_INITIALIZED. Unfortunately, the problem does not disappear with my setup (Android Studio 3.1.2, Android SDK Build-Tools 28-rc2).
private boolean isInitialized() {
return audioTrack.getState() == AudioTrack.STATE_INITIALIZED;
}
A code analysis shows that AsyncTasks and AudioTracks are created repeatedly. To minimize that, create the AsyncTask only once in the onCreate() function and make the AudioTrack object static.
MainActivity
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
currentBeat = findViewById(R.id.currentBeatTextView);
playStopButton = findViewById(R.id.playStopButton);
// important objects
aSync = new MetronomeAsyncTask();
aSync.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, (Void[])null);
}
AudioGenerator
public class AudioGenerator {
/*changed to static*/
private static AudioTrack audioTrack;
...
}
I admit just changing it to static is not a beautiful solution. But since I only need one pipe to the AudioService, this will do.
Creating the audio-pipe, stopping the playing of the audio and freeing the resource will look like this:
public void createPlayer(){
if (audioTrack == null || ! isInitialized())
audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
sampleRate, AudioFormat.CHANNEL_CONFIGURATION_MONO,
AudioFormat.ENCODING_PCM_16BIT, sampleRate,
AudioTrack.MODE_STREAM);
if (isInitialized()){
audioTrack.play();
}
}
public void destroyAudioTrack() {
if (isInitialized()) {
audioTrack.stop();
}
}
public void stopRelease() {
if (isInitialized()) {
audioTrack.stop();
audioTrack.release();
}
}
I repurpose the boolean play. Also, the beat counter currentBeat is reset when the play button is pressed. To access those variables from MainActivity they are changed from private to public, which is not the best solution.
// only called from within playStopPressed()
private void stopPressed() {
aSync.metronome.play = false;
}
// only called from within playStopPressed()
private void playPressed() {
aSync.metronome.play = true;
aSync.metronome.currentBeat = 1;
}
In play() of MetronomeBrain the loop becomes an endless loop, which is dealt with below (the task is cancelled in onDestroy). That is why the play boolean can be repurposed: whether the tones are actually played is now a separate condition that depends on play.
public void play() {
calcSilence();
/* A change for the do-while loop: it now runs forever and needs to be
stopped externally. The play flag decides whether audio is written. */
do {
msg = new Message();
msg.obj = "" + currentBeat;
if (currentBeat == 1 && play)
audioGenerator.writeSound(soundTockArray);
else if (play)
audioGenerator.writeSound(soundTickArray);
if (bpm <= 120)
mHandler.sendMessage(msg);
audioGenerator.writeSound(silenceSoundArray);
if (bpm > 120)
mHandler.sendMessage(msg);
currentBeat++;
if (currentBeat > beat)
currentBeat = 1;
} while (true);
}
Now the loop runs forever, but it only plays when play is set to true. If clean-up is necessary, it can be done at the end of the Activity lifecycle in MainActivity, like this:
@Override
protected void onDestroy() {
aSync.metronome.stopReleaseAudio(); //calls the stopRelease()
aSync.cancel(true);
super.onDestroy();
}
As stated, the code could be further improved, but it gives a fair hint and enough material to think about and learn from regarding AsyncTasks, services such as the audio service, and the Activity lifecycle.
References
- https://developer.android.com/reference/android/os/AsyncTask
- https://developer.android.com/reference/android/media/AudioManager
- https://developer.android.com/reference/android/media/AudioTrack
- https://developer.android.com/reference/android/app/Activity#activity-lifecycle
TL;DR: Make sure the objects are initialized before accessing them, create everything only once, and destroy it when you no longer need it, e.g. at the end of the activity.

Send Android TextToSpeech to just one stereo channel

On Android, I want to play TextToSpeech output through only one sound channel (think Shoulder Angel). To do this, I am currently using tts.synthesizeToFile(), and then playing back the dynamically-created file using the MediaPlayer. I use mediaPlayer.setVolume(0.0f, 1.0f) to play the audio through only one channel.
My working code is below.
My question is: is there a more direct way of playing TTS output through a single channel?
Using TextToSpeech to synthesize the file is time-consuming, and using MediaPlayer to play it back uses more resources than strictly necessary. I want this to be responsive and to work on low-end devices, so being kind to the CPU is important.
MainActivity.java
package com.example.pantts;
import android.app.Activity;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.speech.tts.TextToSpeech;
import android.os.Bundle;
import android.speech.tts.UtteranceProgressListener;
import android.util.Log;
import java.io.File;
import java.io.FileDescriptor;
import java.io.FileInputStream;
import java.util.HashMap;
import java.util.Locale;
public class MainActivity extends Activity implements TextToSpeech.OnInitListener {
private TextToSpeech tts;
private String toSpeak = "Hello, right ear!";
private static final String FILE_ID = "file";
private HashMap<String, String> hashMap = new HashMap<String, String>();
private String filename;
private MediaPlayer mediaPlayer;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
filename = getFilesDir() + "/" + "tts.wav";
Log.d("LOG", "file: " + filename);
// /data/data/com.example.pantts/files/tts.wav
mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
tts = new TextToSpeech(this, this);
tts.setOnUtteranceProgressListener(mProgressListener);
}
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
tts.setLanguage(Locale.UK);
hashMap.put(TextToSpeech.Engine.KEY_PARAM_UTTERANCE_ID, FILE_ID);
// Using deprecated call for API 20 and earlier
tts.synthesizeToFile(toSpeak, hashMap, filename);
Log.d("LOG", "synthesizeToFile queued");
}
}
private UtteranceProgressListener mProgressListener =
new UtteranceProgressListener() {
@Override
public void onStart(String utteranceId) {
Log.d("LOG", "synthesizeToFile onStart " + utteranceId);
}
@Override
public void onError(String utteranceId) {
Log.d("LOG", "synthesizeToFile onError " + utteranceId);
}
@Override
public void onDone(String utteranceId) {
if (utteranceId.equals(FILE_ID)) { // Thanks to Hoan Nguyen for correcting this
Log.d("LOG", "synthesizeToFile onDone " + utteranceId);
try {
File ttsFile = new File(filename);
FileInputStream inputStream = new FileInputStream(ttsFile);
FileDescriptor fileDescriptor = inputStream.getFD();
mediaPlayer.reset();
mediaPlayer.setDataSource(fileDescriptor);
inputStream.close();
mediaPlayer.prepare();
mediaPlayer.setVolume(0.0f, 1.0f); // right channel only
mediaPlayer.start();
} catch (Exception e) {
e.printStackTrace();
}
}
}
};
}
There is nothing wrong with the synthesis; it is the comparison that is wrong. It should be
if (utteranceId.equals(FILE_ID))

android audio - soundpool alternatives

I like the Android SoundPool class for its simplicity, and it works well with the standard audio files I am using in my app. Now I want to make it possible for the user to specify certain sounds by choosing audio files on the SD card. Unfortunately I run into limitations of SoundPool; when the sound file is too big I get an
AudioFlinger could not create track. status: -12
response. It seems I have to switch to MediaPlayer, but before getting into the complexity of MediaPlayer again I wanted to ask if there is an audio library available for Android which
has the simplicity of SoundPool for playing various sounds
doesn't have the limitations of SoundPool regarding the size of the files.
Thank you very much.
martin
For now I came up with a very simple AudioPool class which plays audio added to it in sequence using the MediaPlayer class. This implementation is certainly not mature yet; I just thought I'd share it, as it at least gives some idea of how this can be approached easily. If you see any problems with this class please let us know.
Usage:
AudioPool ap = new AudioPool();
File root = Environment.getExternalStorageDirectory() ;
int id1 = ap.addAudio(root + "/gong1.mp3");
int id2 = ap.addAudio(root + "/gong2.mp3");
int id3 = ap.addAudio(root + "/gong3.mp3");
ap.playAudio(id1);
ap.playAudio(id3);
ap.playAudio(id3);
ap.playAudio(id2);
which will play gong1 -> gong3 -> gong3 -> gong2 in sequence. As this is basically what I need, I leave it here ...
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnCompletionListener;
import android.util.Log;
public class AudioPool {
static String TAG = "AudioPool";
MediaPlayer mPlayer;
int mAudioCounter;
int mCurrentId;
HashMap<Integer, String> mAudioMap;
LinkedList<Integer> mAudioQueue;
public AudioPool() {
mAudioMap = new HashMap<Integer, String>();
mAudioQueue = new LinkedList<Integer>();
mAudioCounter = 0;
}
public int addAudio(String path) {
Log.d(TAG, "adding audio " + path + " to the pool");
if (mAudioMap.containsValue(path)) {
return getAudioKey(path);
}
mAudioCounter++;
mAudioMap.put(mAudioCounter, path);
return mAudioCounter;
}
public boolean playAudio(int id) {
if (mAudioMap.containsKey(id) == false) {
return false;
}
if (mPlayer == null) {
setupPlayer();
}
if (mPlayer.isPlaying() == false) {
return prepareAndPlayAudioNow(id);
} else {
Log.d(TAG, "adding audio " + id + " to the audio queue");
mAudioQueue.add(id);
}
return true;
}
public Integer[] getAudioIds() {
return (Integer[]) mAudioMap.keySet().toArray(
new Integer[mAudioMap.keySet().size()]);
}
public void releaseAudioPlayer() {
if (mPlayer != null) {
mPlayer.release();
mPlayer = null;
}
}
private boolean prepareAndPlayAudioNow(int id) {
mCurrentId = id;
try {
Log.d(TAG, "playing audio " + id + " now");
mPlayer.reset();
mPlayer.setDataSource(mAudioMap.get(id));
mPlayer.prepare();
mPlayer.start();
return true;
} catch (Exception e) {
Log.d(TAG, "problems playing audio " + e.getMessage());
return false;
}
}
private boolean playAudioAgainNow() {
try {
mPlayer.seekTo(0);
mPlayer.start();
return true;
} catch (Exception e) {
Log.d(TAG, "problems playing audio");
return false;
}
}
private void setupPlayer() {
mPlayer = new MediaPlayer();
mPlayer.setOnCompletionListener(new OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
audioDone();
}
});
}
private void audioDone() {
if (mAudioQueue.size() > 0) {
Log.d(TAG, mAudioQueue.size() + " audios in queue");
int nextId = mAudioQueue.removeFirst();
if (mCurrentId == nextId) {
playAudioAgainNow();
} else {
prepareAndPlayAudioNow(nextId);
}
} else {
releaseAudioPlayer();
}
}
private int getAudioKey(String path) {
for (Map.Entry<Integer, String> map : mAudioMap.entrySet()) {
if (map.getValue().compareTo(path) == 0) {
return map.getKey();
}
}
return -1;
}
}
Thanks to dorjeduck for the solution, but his class is based on MediaPlayer, which has huge latency.
What does that mean? It means that between calling these:
mPlayer.prepare();
mPlayer.start();
and actually hearing the sound, the delay is very noticeable. For example, when you need to play one track and immediately play another, you will hear a delay even on high-end hardware.
The solution is to load all of the bytes into memory before playing, and use AudioTrack to play those bytes.
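To illustrate the idea (the general technique only, not the internals of the library mentioned below): keep the decoded 16-bit PCM bytes in memory and hand them to an AudioTrack in static mode, so replaying is nearly instant. InMemoryClip is a hypothetical name and the sketch assumes you already have mono 16-bit PCM data in a byte array.
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
public class InMemoryClip {
    private final AudioTrack track;
    public InMemoryClip(byte[] pcm16Mono, int sampleRate) {
        // MODE_STATIC keeps the whole clip in the AudioTrack's own buffer,
        // so play() starts without any preparation step.
        track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                pcm16Mono.length, AudioTrack.MODE_STATIC);
        track.write(pcm16Mono, 0, pcm16Mono.length);
    }
    public void play() {
        if (track.getPlayState() != AudioTrack.PLAYSTATE_STOPPED) {
            track.stop();            // stop any ongoing playback first
        }
        track.reloadStaticData();    // rewind to the start of the static buffer
        track.play();
    }
    public void release() {
        track.release();
    }
}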
I have written SoundPoolCompat, which uses AudioTrack under the hood. You can pass a custom bufferSize, and all data within that buffer will be loaded into memory and played with small latency, like SoundPool does. All data that exceeds that bufferSize will be loaded on demand (which adds latency, similar to MediaPlayer). The API is very similar to SoundPool, and it adds the ability to load sounds from a Uri (for example Google Drive). There is also a playOnce method; all resources are unloaded after the file is played.
implementation 'com.olekdia:sound-pool:3.0.2'
https://gitlab.com/olekdia/common/libraries/sound-pool

Android MediaPlayer works fine in Custom audio Streaming application up to Android 2.1 but not in higher versions

EDIT:
Android 2.2 MediaPlayer is working fine with one SHOUTcast URL but not with the other one
I need to play audio files from external URLs (a SHOUTcast stream). Currently the audio files are downloaded incrementally and are played as soon as we get enough audio in the phone's local temporary storage. I am using the StreamingMediaPlayer class.
Check this piece of code:
private MediaPlayer createMediaPlayer(File mediaFile)
throws IOException {
MediaPlayer mPlayer = new MediaPlayer();
//example of mediaFile =/data/data/package/cache/playingMedia0.dat
FileInputStream fis = new FileInputStream(mediaFile);
mPlayer.setDataSource(fis.getFD());
mPlayer.prepare();
return mPlayer;
}
Current status:
1- It works fine from Android 1.6 to 2.1 but not in the higher versions like Android 2.2.
2- The "mPlayer.setDataSource(fis.getFD())" is the line which throws the error.
3- The error is "Unable to to create media player"
Other solutions tried:
I tried the alternate solution below but nothing has worked so far.
Android 2.2 MediaPlayer is working fine with one SHOUTcast URL but not with the other one
What am I looking for?
My goal is to have a piece of code which works on Android 2.1 and higher.
This issue is also discussed here:
1- Inconsistent 2.2 Media Player Behavior
2- android code for streaming shoutcast stream breaks in 2.2
3- This issue is also discussed in a lot of questions on this site, but I found the answer nowhere.
4- markmail.org
LogCat trace:
Unable to to create media player
Error copying buffered conent.
java.lang.NullPointerException
com.ms.iradio.StreamingMediaPlayer.startMediaPlayer(StreamingMediaPlayer.java:251)
com.ms.iradio.StreamingMediaPlayer.access$2(StreamingMediaPlayer.java:221)
com.ms.iradio.StreamingMediaPlayer$2.run(StreamingMediaPlayer.java:204)
android.os.Handler.handleCallback(Handler.java:587)
android.os.Handler.dispatchMessage(Handler.java:92)
android.os.Looper.loop(Looper.java:123)
android.app.ActivityThread.main(ActivityThread.java:3683)
java.lang.reflect.Method.invokeNative(Native Method)
java.lang.reflect.Method.invoke(Method.java:507)
com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:839)
com.android.internal.os.ZygoteInit.main(ZygoteInit.java:597)
dalvik.system.NativeStart.main(Native Method)
The problem is that content type "audio/aacp" streaming is not supported directly. Some decoding libraries can be used to play "aacp", please see the solution below:
Freeware Advanced Audio (AAC) Decoder for Android
How to use this library?
Consider legal issues while using it.
[T]he project http://code.google.com/p/aacplayer-android/ is licensed under GPL, so you can create commercial apps on top of it, but you need to fulfill the GPL - mainly this means publishing your code as well. If you use the second project http://code.google.com/p/aacdecoder-android/ , then you do not need to publish your code (the library is licensed under LGPL).
The StreamingMediaPlayer class is using a double-buffering technique to get around limitations in pre-1.2 releases of Android. All production releases of Android OS have included a MediaPlayer that supports streaming media(1). I would recommend doing that rather than using this double-buffering technique to get around the problem.
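For reference, streaming the URL directly with MediaPlayer is roughly this (a sketch only; the stream URL is whatever SHOUTcast URL you are using, and error handling is omitted):
import android.media.AudioManager;
import android.media.MediaPlayer;
import java.io.IOException;
public class DirectStreamPlayer {
    public MediaPlayer playStream(String streamUrl) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource(streamUrl);           // http URL of the stream
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                mp.start();                        // start once enough is buffered
            }
        });
        player.prepareAsync();                     // do not block the UI thread
        return player;
    }
}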
Android OS 2.2 replaced the old media player code with the Stagefright framework, which probably is behaving differently in this case.
The line numbers in your stack trace don't map to the file you link to, so I assume there's a different version that you're actually using. I'm going to guess that that NullPointerException is being reported by MediaPlayer but neither the FileInputStream nor the returned FileDescriptor can be null.
(1) Prior to version 2.2 the media player wouldn't recognize ShoutCast streams with an "ICY/1.1" version header in the response. By creating a proxy that replaces this with "HTTP/1.1" you can resolve that. See the StreamProxy class here for an example.
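The linked StreamProxy class is not reproduced here, but the core of that workaround can be sketched as below. This is a hypothetical, heavily simplified version (single client, no request-header forwarding, no reconnection or error handling): listen on a local port, fetch the remote stream over a raw socket, rewrite the leading ICY status line to HTTP/1.1, and pipe everything else through unchanged. MediaPlayer is then pointed at http://127.0.0.1:<localPort>/ instead of the SHOUTcast URL.
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.URL;
public class IcyRewriteProxy implements Runnable {
    private final URL remoteUrl;
    private final ServerSocket serverSocket;
    public IcyRewriteProxy(String url, int localPort) throws Exception {
        this.remoteUrl = new URL(url);
        this.serverSocket = new ServerSocket(localPort);
    }
    public void run() {
        try {
            Socket client = serverSocket.accept();
            // Talk to the SHOUTcast server over a raw socket so the literal
            // status line it sends back is visible to us.
            int port = remoteUrl.getPort() == -1 ? 80 : remoteUrl.getPort();
            String path = remoteUrl.getPath().isEmpty() ? "/" : remoteUrl.getPath();
            Socket remote = new Socket(remoteUrl.getHost(), port);
            OutputStream remoteOut = remote.getOutputStream();
            remoteOut.write(("GET " + path + " HTTP/1.0\r\nHost: "
                    + remoteUrl.getHost() + "\r\n\r\n").getBytes());
            remoteOut.flush();
            InputStream remoteIn = remote.getInputStream();
            OutputStream clientOut = client.getOutputStream();
            // Read the status line and rewrite a leading "ICY" to "HTTP/1.1".
            StringBuilder statusLine = new StringBuilder();
            int b;
            while ((b = remoteIn.read()) != -1 && b != '\n') {
                statusLine.append((char) b);
            }
            clientOut.write((statusLine.toString().replaceFirst("^ICY", "HTTP/1.1") + "\n").getBytes());
            // Pipe the remaining headers and audio data through unchanged.
            byte[] buf = new byte[8192];
            int n;
            while ((n = remoteIn.read(buf)) != -1) {
                clientOut.write(buf, 0, n);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}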
I am using this code and it runs on 2.2 and higher versions for streaming and downloading.
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import android.content.Context;
import android.media.MediaPlayer;
import android.os.Environment;
import android.os.Handler;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.ProgressBar;
import android.widget.TextView;
public class StreamingMediaPlayer {
private static final int INTIAL_KB_BUFFER = 96*10;//assume 96kbps*10secs/8bits per byte
private TextView textStreamed;
private ImageButton playButton;
private ProgressBar progressBar;
ProgressBar pb;
int audiofiletime=0;
private long mediaLengthInSeconds;
private int totalKbRead = 0;
int totalsize=0;
int numread;
int totalBytesRead = 0;
private final Handler handler = new Handler();
private MediaPlayer mediaPlayer;
private File downloadingMediaFile;
private boolean isInterrupted;
private Context context;
private int counter = 0;
public StreamingMediaPlayer(Context context,TextView textStreamed, ImageButton playButton, Button streamButton,ProgressBar progressBar,ProgressBar pb)
{
this.context = context;
this.textStreamed = textStreamed;
this.playButton = playButton;
this.progressBar = progressBar;
this.pb=pb;
}
/**
* Progressively download the media to a temporary location and update the MediaPlayer as new content becomes available.
*/
public void startStreaming(final String mediaUrl) throws IOException {
//this.mediaLengthInSeconds = 100;
Runnable r = new Runnable() {
public void run() {
try {
downloadAudioIncrement(mediaUrl);
} catch (IOException e) {
Log.e(getClass().getName(), "Unable to initialize the MediaPlayer for fileUrl=" + mediaUrl, e);
return;
}
}
};
new Thread(r).start();
}
/**
* Download the url stream to a temporary location and then call the setDataSource
* for that local file
*/
#SuppressWarnings({ "resource", "unused" })
public void downloadAudioIncrement(String mediaUrl) throws IOException {
URLConnection cn = new URL(mediaUrl).openConnection();
cn.connect();
InputStream stream = cn.getInputStream();
if (stream == null) {
Log.e(getClass().getName(), "Unable to create InputStream for mediaUrl:" + mediaUrl);
}
///////////////////save sdcard///////////////
File direct = new File(Environment.getExternalStorageDirectory()+"/punya");
if (!direct.exists()) {
direct.mkdir(); // create the directory if it does not exist
}
String[] files=mediaUrl.split("/");
String fileName=files[files.length-1];
fileName = fileName.replace(".m4a", ".rdo");
//create a new file, to save the downloaded file
File file = new File(direct,fileName);
#SuppressWarnings("resource")
FileOutputStream fileOutput = new FileOutputStream(file);
///////////////////end/////////////////
totalsize=cn.getContentLength();
//mediaLengthInKb = 10000;
downloadingMediaFile = new File(context.getCacheDir(),fileName);
if (downloadingMediaFile.exists()) {
downloadingMediaFile.delete();
}
FileOutputStream out = new FileOutputStream(downloadingMediaFile);
byte buf[] = new byte[16384];
int incrementalBytesRead = 0;
do {
numread = stream.read(buf);
if (numread <= 0)
break;
out.write(buf, 0, numread);
fileOutput.write(buf, 0, numread);
totalBytesRead += numread;
incrementalBytesRead += numread;
totalKbRead = totalBytesRead/1000;
// pb.setMax(100);
// pb.setProgress(totalKbRead);
testMediaBuffer();
fireDataLoadUpdate();
} while (validateNotInterrupted());
stream.close();
if (validateNotInterrupted()) {
fireDataFullyLoaded();
}
}
private boolean validateNotInterrupted() {
if (isInterrupted) {
if (mediaPlayer != null) {
mediaPlayer.pause();
//mediaPlayer.release();
}
return false;
} else {
return true;
}
}
/**
* Test whether we need to transfer buffered data to the MediaPlayer.
* Interacting with the MediaPlayer on a non-UI thread can cause crashes, so perform this using a Handler.
*/
private void testMediaBuffer() {
Runnable updater = new Runnable() {
public void run() {
if (mediaPlayer == null) {
// Only create the MediaPlayer once we have the minimum buffered data
if ( totalKbRead >= INTIAL_KB_BUFFER) {
try {
startMediaPlayer();
} catch (Exception e) {
Log.e(getClass().getName(), "Error copying buffered conent.", e);
}
}
} else if ( mediaPlayer.getDuration() - mediaPlayer.getCurrentPosition() <= 1000 ){
// NOTE: The media player has stopped at the end so transfer any existing buffered data
// We test for < 1second of data because the media player can stop when there is still
// a few milliseconds of data left to play
transferBufferToMediaPlayer();
}
}
};
handler.post(updater);
}
private void startMediaPlayer() {
try {
//File bufferedFile = new File(context.getCacheDir(),"playingMedia" + (counter++) + ".m4a");
//moveFile(downloadingMediaFile,bufferedFile);
// Log.e(getClass().getName(),"Buffered File path: " + bufferedFile.getAbsolutePath());
// Log.e(getClass().getName(),"Buffered File length: " + bufferedFile.length()+"");
mediaPlayer = createMediaPlayer(downloadingMediaFile);
//mediaPlayer.start();
startPlayProgressUpdater();
//playButton.setEnabled(true);
playButton.setVisibility(View.VISIBLE);
} catch (IOException e) {
Log.e(getClass().getName(), "Error initializing the MediaPlayer.", e);
return;
}
}
private MediaPlayer createMediaPlayer(File mediaFile)
throws IOException {
MediaPlayer mPlayer = new MediaPlayer();
mPlayer.setOnErrorListener(
new MediaPlayer.OnErrorListener() {
public boolean onError(MediaPlayer mp, int what, int extra) {
Log.e(getClass().getName(), "Error in MediaPlayer: (" + what +") with extra (" +extra +")" );
return false;
}
});
FileInputStream fis = new FileInputStream(mediaFile);
mPlayer.setDataSource(fis.getFD());
mPlayer.prepare();
return mPlayer;
}
/**
* Transfer buffered data to the MediaPlayer.
* NOTE: Interacting with a MediaPlayer on a non-main UI thread can cause thread-lock and crashes so
* this method should always be called using a Handler.
*/
private void transferBufferToMediaPlayer() {
try {
boolean wasPlaying = mediaPlayer.isPlaying();
int curPosition = mediaPlayer.getCurrentPosition();
File oldBufferedFile = new File(context.getCacheDir(),"playingMedia" + counter + ".m4a");
File bufferedFile = new File(context.getCacheDir(),"playingMedia" + (counter++) + ".m4a");
bufferedFile.deleteOnExit();
moveFile(downloadingMediaFile,bufferedFile);
//mediaPlayer.pause();
mediaPlayer.release();
mediaPlayer = createMediaPlayer(bufferedFile);
mediaPlayer.seekTo(curPosition);
boolean atEndOfFile = mediaPlayer.getDuration() - mediaPlayer.getCurrentPosition() <= 1000;
if (wasPlaying || atEndOfFile){
mediaPlayer.start();
}
oldBufferedFile.delete();
}catch (Exception e) {
Log.e(getClass().getName(), "Error updating to newly loaded content.", e);
}
}
private void fireDataLoadUpdate() {
Runnable updater = new Runnable() {
public void run() {
//float loadProgress = ((float)totalBytesRead/(float)mediaLengthInKb);
//float per = ((float)numread/mediaLengthInKb) * 100;
float per = ((float)totalBytesRead/totalsize) * 100;
textStreamed.setText((totalKbRead + " Kb (" + (int)per + "%)"));
progressBar.setSecondaryProgress((int)(per));
pb.setSecondaryProgress((int)(per));
}
};
handler.post(updater);
}
private void fireDataFullyLoaded() {
Runnable updater = new Runnable() {
public void run() {
transferBufferToMediaPlayer();
downloadingMediaFile.delete();
textStreamed.setText(("Download completed" ));
}
};
handler.post(updater);
}
public MediaPlayer getMediaPlayer() {
return mediaPlayer;
}
public void startPlayProgressUpdater() {
audiofiletime =mediaPlayer.getDuration();
float progress = (((float)mediaPlayer.getCurrentPosition()/ audiofiletime) * 100);
progressBar.setProgress((int)(progress));
//pb.setProgress((int)(progress*100));
if (mediaPlayer.isPlaying()) {
Runnable notification = new Runnable() {
public void run() {
startPlayProgressUpdater();
}
};
handler.postDelayed(notification,1000);
}
}
public void interrupt() {
playButton.setEnabled(false);
isInterrupted = true;
validateNotInterrupted();
}
/**
* Move the file in oldLocation to newLocation.
*/
public void moveFile(File oldLocation, File newLocation)
throws IOException {
if ( oldLocation.exists( )) {
BufferedInputStream reader = new BufferedInputStream( new FileInputStream(oldLocation) );
BufferedOutputStream writer = new BufferedOutputStream( new FileOutputStream(newLocation, false));
try {
byte[] buff = new byte[5461];
int numChars;
while ( (numChars = reader.read( buff, 0, buff.length ) ) != -1) {
writer.write( buff, 0, numChars );
}
} catch( IOException ex ) {
throw new IOException("IOException when transferring " + oldLocation.getPath() + " to " + newLocation.getPath());
} finally {
try {
if ( reader != null ){
writer.close();
reader.close();
}
} catch( IOException ex ){
Log.e(getClass().getName(),"Error closing files when transferring " + oldLocation.getPath() + " to " + newLocation.getPath() );
}
}
} else {
throw new IOException("Old location does not exist when transferring " + oldLocation.getPath() + " to " + newLocation.getPath() );
}
}
}

Android AudioRecord - Won't Initialize 2nd time

Hey, I'm currently trying to get AudioRecord to work, because I need it in a bigger project. But it seems to mess up a lot.
I have been trying a lot of things, so I went back to basics when I traced this bug.
I am using my Samsung Galaxy S as my debug device.
My problem is that the first time after a reboot of my device I can initialize the AudioRecord object I create without problems.
But the second time I run it, it won't initialize the AudioRecord object.
I have tried several frequencies, FYI.
Here is my code:
package android.audiorecordtest;
import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.TextView;
public class AudioRecordTest extends Activity {
int frequency;
AudioRecord audRec;
TextView txtVw;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
txtVw = (TextView) findViewById(R.id.txtVw);
frequency=8000;
int bufferSize=(AudioRecord.getMinBufferSize(frequency, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT))*2;
if (bufferSize>0) {
audRec = new AudioRecord(MediaRecorder.AudioSource.MIC, frequency, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
int status = audRec.getState();
if (status == AudioRecord.STATE_INITIALIZED) {
txtVw.setText("Initialized" + frequency);
} else {
txtVw.setText("Not Initialized i=" + frequency);
}
}
}
}
After a few hours of looking through logcat information I found this event:
02-28 10:46:37.048: DEBUG/dalvikvm(4477): GC_EXPLICIT freed 1801 objects / 98944 bytes in 97ms
02-28 10:46:37.048: VERBOSE/AudioRecord(4477): stop
Which seems to "release the native hold on the AudioRecord.
So i tried doing an override of finalize with my Audiorecord object.release(). This didnt work though.. Anyone have any idea?
I was able to reproduce your problem (on a Samsung phone). I added an onDestroy() method releasing the record:
@Override
public void onDestroy() {
super.onDestroy();
System.out.println("OnDestroy");
audRec.release();
}
After adding this, the audioRecord seems to initialize correctly every time the activity is started.
I had the same problem; usually audRec.release() does help, but if you need to stop and start several times the following code is more robust. Also, in my case the recording took place in a separate thread, and Android sometimes kills threads that run for a long time. So take a look at this code: it makes sure the recording is kept even when the other thread is dead, and upon the following start() it stops and releases the old instance:
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
public class RecorderSingleton {
private static final int FREQUENCY = 16000;
public static RecorderSingleton instance = new RecorderSingleton();
private AudioRecord recordInstance = null;
private int bufferSize;
private RecorderSingleton() {
bufferSize = AudioRecord.getMinBufferSize(FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
}
public boolean init() {
recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, FREQUENCY, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (recordInstance.getState() == AudioRecord.STATE_UNINITIALIZED) {
return false;
}
return true;
}
public int getBufferSize() {
return bufferSize;
}
public boolean start() {
if (recordInstance != null && recordInstance.getState() != AudioRecord.STATE_UNINITIALIZED) {
if (recordInstance.getRecordingState() != AudioRecord.RECORDSTATE_STOPPED) {
recordInstance.stop();
}
recordInstance.release();
}
if (!init()) {
return false;
}
recordInstance.startRecording();
return true;
}
public int read(short[] tempBuffer) {
if (recordInstance == null) {
return AudioRecord.ERROR_INVALID_OPERATION;
}
int ret = recordInstance.read(tempBuffer, 0, bufferSize);
return ret;
}
public void stop() {
if (recordInstance == null) {
return;
}
recordInstance.stop();
recordInstance.release();
}
}
Then if you have a recorder thread you can use it as follows:
import android.media.AudioRecord;
import android.util.Log;
public class Recorder implements Runnable {
private int requiredSamples;
private int takenSamples = 0;
private boolean cancelled = false;
public void run() {
// We're important...
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
int bufferRead = 0;
int bufferSize = RecorderSingleton.instance.getBufferSize();
short[] tempBuffer = new short[bufferSize];
if (!RecorderSingleton.instance.start()) {
return;
}
try {
Log.d(RoomieConstants.LOG_TAG, "Recorder Started");
while (takenSamples < requiredSamples && !cancelled) {
bufferRead = RecorderSingleton.instance.read(tempBuffer);
if (bufferRead == AudioRecord.ERROR_INVALID_OPERATION) {
throw new IllegalStateException("read() returned AudioRecord.ERROR_INVALID_OPERATION");
} else if (bufferRead == AudioRecord.ERROR_BAD_VALUE) {
throw new IllegalStateException("read() returned AudioRecord.ERROR_BAD_VALUE");
}
takenSamples += bufferRead;
// do something with the samples ...
// ...
// ...
}
} finally {
// Close resources...
stop();
}
}
public void stop() {
RecorderSingleton.instance.stop();
}
public void cancel() {
cancelled = true;
}
}
To answer my own question: the only way I found to make AudioRecord usable is to never keep it as a global variable. I don't know why, but it seems it won't let you release the resources of the instance correctly if you do so.
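For what it is worth, that pattern could look something like this sketch (hypothetical class name and sample rate; the point is only that the AudioRecord stays local to the method and is released in a finally block, so no reference outlives the recording):
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
public class OneShotRecorder {
    public short[] recordOnce(int numSamples) {
        int sampleRate = 16000; // example rate, use whatever your project needs
        int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        // Local variable only: nothing outside this method keeps a reference.
        AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        short[] samples = new short[numSamples];
        try {
            if (record.getState() != AudioRecord.STATE_INITIALIZED) {
                return samples; // could not grab the microphone this time
            }
            record.startRecording();
            int read = 0;
            while (read < numSamples) {
                int n = record.read(samples, read, numSamples - read);
                if (n <= 0) {
                    break;
                }
                read += n;
            }
            record.stop();
        } finally {
            record.release(); // always give the native resources back
        }
        return samples;
    }
}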
You should try to call audRec.stop() to release the resource.
My AudioRecord didn't initialize because it was static
