I want to play a horn sound in my app without any lag. I am using the MediaPlayer class, but it lags when the file is played again.
Here is the thread I use to reduce the lag (mp_horn is the MediaPlayer instance created from the sound file). This thread gives a much better result than calling mediaPlayer.setLooping(true):
@Override
public void run() {
    try {
        if (mp_horn != null && mp_horn.isPlaying()) {
            final long durationTotal_horn = mp_horn.getDuration();
            long durationCurrent_horn = mp_horn.getCurrentPosition();
            // Once playback passes 90% of the track, jump back near the
            // start so the horn keeps looping without a gap.
            if (durationCurrent_horn >= 0.90 * durationTotal_horn) {
                Log.v("arrrrrr", durationCurrent_horn + "......." + durationTotal_horn);
                mp_horn.seekTo((int) (durationTotal_horn * 0.0000001));
            }
        }
    } catch (IllegalStateException e) {
        e.printStackTrace();
    }
}
Probably
if (mp_horn != null && mp_horn.isPlaying())
should be:
if (mp_horn != null && !mp_horn.isPlaying())
Otherwise, while the horn is playing, you keep doing all these calculations over and over again, which I am guessing is what causes the lag.
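A minimal sketch of how that check might be used in the polling thread (restarting the horn only after it has stopped; the seekTo(0)/start() calls are an assumption, not code from the question):

@Override
public void run() {
    // Nothing runs while the horn is still playing; restart it once it stops.
    if (mp_horn != null && !mp_horn.isPlaying()) {
        mp_horn.seekTo(0); // rewind to the beginning
        mp_horn.start();   // play the horn again
    }
}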
I am working on a streaming radio application. Everything is working fine except that changing the equalizer effect does not affect the sound.
Changing the equalizer effect by calling usePreset(preset) does not make any audible difference.
Even though there is no error, usePreset does not change the sound. Why?
I have tested on a Samsung Galaxy S II running Android 4.0.3.
public void startPlayer() {
    //
    // Check whether we can acquire the audio focus
    // to start the player
    //
    if (!requestAudioFocus()) {
        return;
    }

    if (null != mAudioPlayer) {
        if (mAudioPlayer.isPlaying()) {
            mAudioPlayer.stop();
        }
        mAudioPlayer.reset();
    } else {
        mAudioPlayer = new MediaPlayer();
        mAudioPlayer.reset();
    }

    try {
        notifyProgressUpdate(PLAYER_INITIALIZING);

        try {
            mEqualizer = new Equalizer(0, mAudioPlayer.getAudioSessionId());
            mEqualizer.setEnabled(true);
            Log.d(TAG,
                    "Audio Session ID " + mAudioPlayer.getAudioSessionId()
                            + "Equalizer " + mEqualizer + " Preset "
                            + mEqualizer.getCurrentPreset());
        } catch (Exception ex) {
            mEqualizer = null;
        }

        mAudioPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        mAudioPlayer.setDataSource(mCurrentTrack.getStreamURL());

        //
        // Add the Listener to track the player status
        //
        mAudioPlayer.setOnCompletionListener(this);
        mAudioPlayer.setOnBufferingUpdateListener(this);
        mAudioPlayer.setOnPreparedListener(this);
        mAudioPlayer.setOnInfoListener(this);
        mAudioPlayer.setOnErrorListener(this);

        notifyProgressUpdate(PLAYER_BUFFERING);
        mAudioPlayer.prepareAsync();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (SecurityException e) {
        e.printStackTrace();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
// Get the available presets from the equalizer
public String[] getEqualizerPresets() {
    String[] presets = null;
    short noOfPresets = -1;
    if (null != mEqualizer) {
        noOfPresets = mEqualizer.getNumberOfPresets();
        presets = new String[noOfPresets];
        for (short index = 0; index < noOfPresets; index++) {
            presets[index] = mEqualizer.getPresetName(index);
        }
    }
    return presets;
}
// Set the user-preferred preset
public void setEqualizerPreset(int position) {
    if (null != mEqualizer) {
        Log.d(TAG, "setting equalizer effects " + position);
        Log.d(TAG, "Equalizer " + mEqualizer + " set Preset " + position);
        mEqualizer.usePreset((short) position);
        Log.d(TAG, "Equalizer " + mEqualizer + " current Preset "
                + mEqualizer.getCurrentPreset());
    }
}
I'd appreciate your help in identifying the issue.
EDIT
This issue is not resolved yet. I did not find any sample code that explains Equalizer preset usage.
Any reference to a code sample that uses presets is welcome.
This is the full source code for an equalizer; I hope it will help you.
I have the same problem. When I run it on the emulator it produces an error that I don't really understand; it always says something like ...audiofx.Equalizer and audiofx.AudioEffect. But I have discovered that if you have another media player installed (n7player in my case), closing it and then trying your media player again can work. It works in my case, but I think there should be a proper way to get hold of whichever equalizer is currently active.
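For reference, a minimal sketch of the Equalizer preset API against a local, prepared MediaPlayer (this only mirrors the calls already used in the question and is not a verified fix for the streaming case; context and R.raw.sample are placeholder names):

// Sketch only: attach an Equalizer to a prepared MediaPlayer and apply a preset.
MediaPlayer player = MediaPlayer.create(context, R.raw.sample); // returns a prepared player
Equalizer eq = new Equalizer(0, player.getAudioSessionId());
eq.setEnabled(true);

short presetCount = eq.getNumberOfPresets();
for (short p = 0; p < presetCount; p++) {
    Log.d("EQ", "Preset " + p + ": " + eq.getPresetName(p));
}

eq.usePreset((short) 0); // e.g. apply the first available preset
player.start();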
I am trying to run the audio recording example from http://developer.android.com/guide/topics/media/index.html. It is working fine; what I need is to continuously show the max amplitude while recording voice. What is the best approach for that?
getMaxAmplitude() gives the max amplitude of the audio sampled since the last call, so I take a sample every 250 milliseconds and read the max amplitude:
public void run() {
    while (true) {
        Message msg = mHandler.obtainMessage();
        Bundle b = new Bundle();
        try {
            sleep(250); // sample every 250 ms
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        if (mRecorder != null) {
            amplitude = mRecorder.getMaxAmplitude();
            b.putLong("currentTime", amplitude);
            Log.i("AMPLITUDE", String.valueOf(amplitude));
        } else {
            b.putLong("currentTime", 0);
        }
        msg.setData(b);
        mHandler.sendMessage(msg);
    }
}
I used a message Handler to update the front end from the background thread.
Create a thread which runs all the time.
In the thread do this:
int amp = mrec.getMaxAmplitude();
if (amp > 0)
yourcode;
Do you need more information on the thread?
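A minimal sketch of such a thread (assuming mrec is an already-started MediaRecorder; uiHandler and updateAmplitudeMeter() are hypothetical names for a main-thread Handler and your UI update method):

// Sketch only: poll getMaxAmplitude() until the thread is interrupted.
Thread amplitudePoller = new Thread(new Runnable() {
    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            int amp = mrec.getMaxAmplitude(); // max amplitude since the last call
            if (amp > 0) {
                final int value = amp;
                uiHandler.post(new Runnable() {
                    @Override
                    public void run() {
                        updateAmplitudeMeter(value); // update your meter view
                    }
                });
            }
            try {
                Thread.sleep(100); // polling interval; adjust as needed
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
    }
});
amplitudePoller.start();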
I've been working with Eclipse ADT for about 2 months. In that time I have built a small utility that lets me select an IP address and port and then send a file to that combination. The utility works as intended, but when I type in the wrong file name, the application hangs.
@Override
public void run() {
    if (data != null) {
        this.send(data);
    } else if (this.file != null) {
        if (file.exists()) {
            this.send(file);
        } else {
            transferError = new FileNotFoundException("The specified file could not be found");
        }
    }
}
I've even tried the following, in the hope that one or the other would throw, but I was unsuccessful in both cases.
public void run() {
    if (data != null) {
        this.send(data);
    } else if (this.file != null) {
        if (file.exists()) {
            this.send(file);
        } else {
            transferError = new FileNotFoundException("The specified file could not be found");
        }
    }
    try {
        throw new Exception("blah blah blah");
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I've moved the exception around, added the one above, and tried placing it in different spots, all without success. Again, I'm exceptionally new to this and got here by basically piecing together various TCP client examples. Aside from showing me a way to raise the error correctly, please help me understand why the first version isn't working and why the one you suggest would.
In your else block you aren't throwing the transferError you create:
throw transferError;
However, you probably won't be able to do that, because FileNotFoundException is a checked exception and the run() method doesn't declare any thrown exceptions. You probably need to find a different way to present the error to the user, for example with a Toast (see the sketch below).
Your second block doesn't work because you are catching the very exception you throw.
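A rough sketch of reporting the error instead of throwing it (assuming the class running this code holds a Context in a field named context; requires android.os.Handler, android.os.Looper and android.widget.Toast):

@Override
public void run() {
    if (data != null) {
        this.send(data);
    } else if (this.file != null) {
        if (file.exists()) {
            this.send(file);
        } else {
            // run() can't throw a checked exception, so report the problem
            // on the main thread instead of setting a field nobody reads.
            new Handler(Looper.getMainLooper()).post(new Runnable() {
                @Override
                public void run() {
                    Toast.makeText(context,
                            "The specified file could not be found",
                            Toast.LENGTH_LONG).show();
                }
            });
        }
    }
}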
Situation: I would like to pull a network resource (a file), display it for 30 seconds, then load the next file. While displaying the initial file for 30 seconds, I'd like to preload the next file.
Question: Should I be using concurrent threads with a lock (like a LinkedBlockingQueue?) or a Handler? Or is there something I'm missing?
Currently, onCreate calls an AsyncTask whose for loop iterates through an array of file paths one at a time; inside the loop it launches a second AsyncTask, which downloads the file at that path and then, in onPostExecute, assigns the file to the UI view.
This works, but I am unable to set up the timing so that the first file loads into the UI view, then 30 seconds later the next file loads, and so on. Right now it loads the files fine, but the first may take 40 seconds to display, the second 25 seconds, and the third 60.
(FYI, the files are of uniform size and only take 5-15 seconds to load.)
Here's a general example (not compilable):
onCreate() {
    new FirstAsyncTask().execute(filePaths);
}

private class FirstAsyncTask extends AsyncTask<String[], Void, Void> {
    protected Void doInBackground(String[]... x) {
        try {
            for (int i = 0; x[0] != null && i < x[0].length; i++) {
                long startT = System.currentTimeMillis();
                if (x[0][i].isFile()) {
                    SmbFile g = new SmbFile(x[0][i].getPath());
                    new SecondAsyncTask().execute(g);
                    long TimeNow = System.currentTimeMillis() - 30000;
                    if (startT > TimeNow) {
                        try {
                            // Wait out the remainder of the 30-second slot.
                            Thread.sleep(startT - TimeNow);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                        }
                    }
                }
                if (i == x[0].length - 1) {
                    i = 0; // start over from the first path
                }
            }
            return null;
        } catch (IOException e) {
            return null;
        }
    }
}
private class SecondAsyncTask extends AsyncTask<File, Integer, File> {
    protected File doInBackground(File... x) {
        File fileTemp;
        try {
            long startT = System.currentTimeMillis();
            fileTemp = (get streamed file); // pseudocode: download the file here
            long TimeNow = System.currentTimeMillis() - 30000;
            if (startT > TimeNow) {
                // Hold the result until the 30-second slot has elapsed.
                Thread.sleep(startT - TimeNow);
            }
            return fileTemp;
        } catch (IOException e) {
            e.printStackTrace();
            return null;
        } catch (InterruptedException e) {
            e.printStackTrace();
            return null;
        }
    }

    protected void onPostExecute(File result) {
        if (result != null) {
            FileView centerFile = (FileView) findViewById(R.id.FileView1);
            centerFile.setFile(result);
        }
    }
}
It looks like a simple producer-consumer approach would be sufficient for what you need. I would keep an array of URLs that point to your network resources, with synchronized access. A single AsyncTask is enough to pop one URL off the queue, download it, and display it. How you re-invoke the task is up to you: you can display the downloaded resource and at the same time run the task again, putting the newly downloaded resource into a temporary placeholder (a Drawable member field, for example). Then have a timer that every 30 seconds swaps the temporary resource onto the display and starts the download/cache task again (a rough sketch of this approach is shown below).
Of course, this is a very simple solution that does not handle cases where your task hasn't finished within the 30-second time frame, etc.
Another approach would be to have a boolean flag that you set to true every 30 seconds. Your AsyncTask runs, downloads the image, and in onPostExecute checks whether that flag is true; if it is, it displays the image, sets the flag to false, and finishes. If the flag is false, it enters a loop that sleeps for, say, 200 ms and checks the condition again, and so on.
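A rough sketch of the first approach (FileView, setFile() and R.id.FileView1 come from the question; DownloadTask, pendingFile, nextPath() and uiHandler are placeholder names, not a definitive implementation):

// Sketch only: a 30-second timer that swaps in whatever was preloaded
// and immediately starts preloading the next file.
private File pendingFile;                 // set by DownloadTask.onPostExecute()
private final Handler uiHandler = new Handler();

private final Runnable swapAndPreload = new Runnable() {
    @Override
    public void run() {
        if (pendingFile != null) {
            FileView centerFile = (FileView) findViewById(R.id.FileView1);
            centerFile.setFile(pendingFile);     // show what was preloaded
            pendingFile = null;
        }
        new DownloadTask().execute(nextPath());  // preload the next file
        uiHandler.postDelayed(this, 30000);      // run again in 30 seconds
    }
};

// In onCreate(): uiHandler.post(swapAndPreload);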
I created a small TTS app implementing OnUtteranceCompleteListener and, while things seem to be working exactly as expected, I noticed the following on LogCat (one for each completed utterance):
03-01 20:47:06.436: VERBOSE/TtsService(381): TTS callback: dispatch completed to 1
Again, this seems to be benign but I don't understand what '1' means. All such lines for all utterances say "completed to 1", even for utterance IDs that are greater than 1.
What does '1' mean in this log?
BTW, this message is not generated by my code but rather by the TTS engine (Pico) itself.
Looking at the TTSService.java source code available at http://eyes-free.googlecode.com, you can find the function dispatchUtteranceCompletedCallback():
private void dispatchUtteranceCompletedCallback(String utteranceId, String packageName) {
    /* Legacy support for TTS */
    final int oldN = mCallbacksOld.beginBroadcast();
    for (int i = 0; i < oldN; i++) {
        try {
            mCallbacksOld.getBroadcastItem(i).markReached("");
        } catch (RemoteException e) {
            // The RemoteCallbackList will take care of removing
            // the dead object for us.
        }
    }
    try {
        mCallbacksOld.finishBroadcast();
    } catch (IllegalStateException e) {
        // May get an illegal state exception here if there is only
        // one app running and it is trying to quit on completion.
        // This is the exact scenario triggered by MakeBagel
        return;
    }
    /* End of legacy support for TTS */
    ITtsCallbackBeta cb = mCallbacksMap.get(packageName);
    if (cb == null) {
        return;
    }
    Log.v(SERVICE_TAG, "TTS callback: dispatch started");
    // Broadcast to all clients the new value.
    final int N = mCallbacks.beginBroadcast();
    try {
        cb.utteranceCompleted(utteranceId);
    } catch (RemoteException e) {
        // The RemoteCallbackList will take care of removing
        // the dead object for us.
    }
    mCallbacks.finishBroadcast();
    Log.v(SERVICE_TAG, "TTS callback: dispatch completed to " + N);
}
1 is the current value of N, which is initialized by the return value from mCallbacks.beginBroadcast().
beginBroadcast() is a method of the class RemoteCallbackList, and its documentation states that it "Returns the number of callbacks in the broadcast, to be used with getBroadcastItem(int) to determine the range of indices you can supply".
Does this help?
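To make the meaning of N concrete, here is a small illustration of the RemoteCallbackList pattern (ITtsCallbackBeta and utteranceCompleted() come from the quoted service code; the rest is generic and only for illustration):

// N is simply the number of clients currently registered in the list.
RemoteCallbackList<ITtsCallbackBeta> callbacks = new RemoteCallbackList<ITtsCallbackBeta>();

// Each bound client registers its callback once, e.g. callbacks.register(clientCallback);

int n = callbacks.beginBroadcast();   // e.g. 1 when a single app is listening
for (int i = 0; i < n; i++) {
    try {
        callbacks.getBroadcastItem(i).utteranceCompleted("utteranceId");
    } catch (RemoteException e) {
        // RemoteCallbackList removes dead clients automatically.
    }
}
callbacks.finishBroadcast();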