I am new to Android development and am looking for a more efficient or faster way to switch a MediaPlayer data source from an onTouch method. I'm trying to create an instrument that plays like a flute, but the audio source won't switch fast enough when I press (touch) the buttons.
I am using the playNote() method below to switch between the audio files. Any advice is appreciated.
public class PlayAggeion extends Activity {
ImageButton patC1;
int soundIsOn = 1;
MediaPlayer mp;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_play_aggeion);
onConfigurationChanged(null);
addListenerPatima();
mp = new MediaPlayer();
playNote(R.raw.aa);
}
public void addListenerPatima() {
patC1 = (ImageButton) findViewById(R.id.patC1);
patC1.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
switch(event.getAction())
{
case MotionEvent.ACTION_DOWN:
playNote(R.raw.bb);
return true;
case MotionEvent.ACTION_UP:
playNote(R.raw.aa);
return true;
}
return false;
}
});
}
public void playNote(int note){
// Play note
try {
mp.reset();
mp.setDataSource(getApplicationContext(), Uri.parse("android.resource://" + getPackageName() + "/" + note));
mp.prepare();
mp.setLooping(true);
mp.start();
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
I think you should use SoundPool instead of MediaPlayer. SoundPool lets you preload a number of sound files and play them one after another without any additional delay. It is often used in games and soundboard apps, so it seems to match your needs perfectly.
More info:
http://developer.android.com/reference/android/media/SoundPool.html
Nice tutorial:
http://www.vogella.com/articles/AndroidMedia/article.html#tutorial_soundpool
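To make that concrete, here is a rough, untested sketch of how your button could drive a SoundPool instead of a MediaPlayer. It reuses the R.raw.aa and R.raw.bb resources and the patC1 button from your code; everything else (field names, volumes) is just illustrative:
// Make soundPool, noteA, noteB and currentStream fields of the Activity,
// and load the samples once in onCreate(); loading happens asynchronously,
// but short note samples are normally ready almost immediately.
SoundPool soundPool = new SoundPool(2, AudioManager.STREAM_MUSIC, 0);
int noteA = soundPool.load(this, R.raw.aa, 1);
int noteB = soundPool.load(this, R.raw.bb, 1);
int currentStream = 0;

patC1.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                soundPool.stop(currentStream);                            // stop whatever note is looping
                currentStream = soundPool.play(noteB, 1f, 1f, 1, -1, 1f); // loop = -1: loop until stopped
                return true;
            case MotionEvent.ACTION_UP:
                soundPool.stop(currentStream);
                currentStream = soundPool.play(noteA, 1f, 1f, 1, -1, 1f);
                return true;
        }
        return false;
    }
});
The key difference from your playNote() is that the decoding work happens once up front, so the per-touch cost is just a play()/stop() call.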
Related
I have searched Stack Overflow and Google but can't find an answer to my question. I've found other questions and answers, but they relate to sounds saved in the app, whereas I'm creating an app that gets data from a Parse server: it fetches mp3 files, displays them in a ListView, and plays a track when its item is clicked. But here comes the problem: when you play a sound and click on another one, the first just doesn't stop while the second starts to play.
I have tried the following code, but it's just not working.
Here's my code:
play.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
final MediaPlayer mediaPlayer = new MediaPlayer();
final MediaPlayer scndmediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
scndmediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
if (mediaPlayer.isPlaying()) {
Toast.makeText(getContext(), "First is playing", Toast.LENGTH_SHORT).show();
try {
mediaPlayer.stop();
scndmediaPlayer.setDataSource(audioFileURL);
scndmediaPlayer.prepare();
scndmediaPlayer.start();
//soundtoolbar.setTitle(name);
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e1) {
e1.printStackTrace();
} catch (IllegalStateException e1) {
e1.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
} else {
try {
if (scndmediaPlayer.isPlaying()){
scndmediaPlayer.stop();
}
Toast.makeText(getContext(), "First is starting", Toast.LENGTH_SHORT).show();
mediaPlayer.setDataSource(audioFileURL);
mediaPlayer.prepare();
mediaPlayer.start();
soundtoolbar.setTitle(name);
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e1) {
e1.printStackTrace();
} catch (IllegalStateException e1) {
e1.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
playPause.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (mediaPlayer.isPlaying() || scndmediaPlayer.isPlaying()) {
mediaPlayer.pause();
scndmediaPlayer.pause();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_play_arrow_white_24dp));
} else {
mediaPlayer.start();
scndmediaPlayer.start();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_pause_white_24dp));
}
}
});
}
});
I've created two MediaPlayers with the code above, and when the user clicks the play button it first checks whether either player is running.
I'm trying to achieve the following: when the user clicks the play button, it checks whether the first mediaPlayer is running. If it is, it should just stop it and start the second scndmediaPlayer, or vice versa: if the second is playing, it stops that one and starts the first. So it becomes a loop: the first is playing, the user clicks another button, stop the first, start the second; the user clicks another button, is the first playing? No. Is the second playing? Yes. Stop the second and start the first.
But I can't find where the problem is in my code.
Please help me with this. I've been trying to resolve it for two days but haven't been able to...
Thanks :)
EDIT: I tried using one MediaPlayer and doing the following: check whether the MediaPlayer is playing; it isn't, so start it. The user clicks the button again, and it stops the MediaPlayer and starts it with the new audioFileURL. BUT the MediaPlayer forgets that it's playing. It seems to just start the track and then forget about it. To check this, I set a Toast that shows when the MediaPlayer isn't playing, and it appears every time I click a track in the list, which means the player forgets that it already has a track playing...
EDIT 2: I managed to get to the following behavior: it plays the track; the user clicks another track; it stops the MediaPlayer but doesn't play the new track; the user clicks once again; it plays the new track; the user clicks the new track and the app crashes...
EDIT 3: Posting my entire class:
public class MyAdapter extends ParseQueryAdapter<ParseObject> {
public Button playPause, next, previous;
public Toolbar soundtoolbar;
boolean isPlaying = false;
public MyAdapter(Context context) {
super(context, new ParseQueryAdapter.QueryFactory<ParseObject>() {
public ParseQuery create() {
ParseQuery query = new ParseQuery("MyClass");
query.orderByDescending("createdAt");
return query;
}
});
}
@Override
public View getItemView(final ParseObject object, View v, final ViewGroup parent) {
if (v == null) {
v = View.inflate(getContext(), R.layout.activity_audio_files_item, null);
}
super.getItemView(object, v, parent);
final Button play = (Button) v.findViewById(R.id.play);
playPause = TabFragment1.playPause;
next = TabFragment1.next;
previous = TabFragment1.previous;
soundtoolbar = TabFragment1.soundtoolbar;
final ParseFile descr = object.getParseFile("audiofile");
final String name = object.getString("name");
final String audioFileURL = descr.getUrl();
final SlidingUpPanelLayout slidingUpPanelLayout = TabFragment1.spanel;
play.setText(name);
final MediaPlayer mediaPlayer = new MediaPlayer();
play.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (isPlaying != true) {
Toast.makeText(getContext(), name+" is playing", Toast.LENGTH_SHORT).show();
try {
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(audioFileURL);
mediaPlayer.prepareAsync();
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
soundtoolbar.setTitle(name);
slidingUpPanelLayout.setPanelState(SlidingUpPanelLayout.PanelState.EXPANDED);
mediaPlayer.start();
isPlaying = true;
}
});
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e2) {
e2.printStackTrace();
} catch (IllegalStateException e3) {
e3.printStackTrace();
} catch (IOException e4) {
e4.printStackTrace();
} catch (NullPointerException e5) {
e5.printStackTrace();
}
} else {
mediaPlayer.stop();
Toast.makeText(getContext(), "Starting "+name, Toast.LENGTH_SHORT).show();
try {
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.prepareAsync();
mediaPlayer.setDataSource(audioFileURL);
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
soundtoolbar.setTitle(name);
slidingUpPanelLayout.setPanelState(SlidingUpPanelLayout.PanelState.EXPANDED);
mediaPlayer.start();
}
});
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e2) {
e2.printStackTrace();
} catch (IllegalStateException e3) {
e3.printStackTrace();
} catch (IOException e4) {
e4.printStackTrace();
} catch (NullPointerException e5){
e5.printStackTrace();
}
}
playPause.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (mediaPlayer.isPlaying()) {
mediaPlayer.pause();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_play_arrow_white_24dp));
} else {
mediaPlayer.start();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_pause_white_24dp));
}
}
});
}
});
return v;
}
}
Somebody please help...
I suggest you check out SoundPool; it will help you. One more thing: you could put the media URLs into an array or something similar and use one MediaPlayer. That way you avoid having two MediaPlayers and avoid a memory leak.
http://developer.android.com/reference/android/media/SoundPool.html
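For the single-MediaPlayer suggestion, a rough, untested sketch (audioFileURL is the variable from your code; the method name and everything else is just illustrative) might look like this:
// One shared player, e.g. a field of the adapter, reused for every track.
MediaPlayer player = new MediaPlayer();

void playTrack(String audioFileURL) {
    try {
        player.reset();                                    // drop the previous track, whatever state it was in
        player.setAudioStreamType(AudioManager.STREAM_MUSIC);
        player.setDataSource(audioFileURL);
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();                                // start only once the stream is ready
            }
        });
        player.prepareAsync();                             // don't block the UI thread on a network stream
    } catch (IOException e) {
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    }
}
Because every item's click handler would go through the same player, the previous track is always stopped (via reset()) before the new one starts.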
My problem is really simple: I'm using a MediaRecorder to record voice while the user is pressing on a FAB, and playing it afterwards (when he/she releases). The issue is that I lose a few seconds near the end of the recording, and I can't figure out why (they never get played back). Code (only relevant parts are shown):
Variables
double record_length = 0;
boolean recording = false;
String outputFile;
Handler myHandler = new Handler();
MediaRecorder recorder = new MediaRecorder();
OnTouchListener
findViewById(R.id.record_record).setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
if (event.getAction() == MotionEvent.ACTION_DOWN) {
findViewById(R.id.delete_swipe).setVisibility(View.VISIBLE);
StartRecord();
} else if (event.getAction() == MotionEvent.ACTION_UP) {
if(recording){
EndRecord();
}
findViewById(R.id.delete_swipe).setVisibility(View.INVISIBLE);
}
return true;
}
});
.
public void StartRecord() {
recording = true;
record_length = 0;
SharedPreferences saved_login = getSharedPreferences("FalloundLogin", 0);
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.OutputFormat.AMR_NB);
//removed construction of outputFile, but it is generated correctly - I checked
recorder.setOutputFile(outputFile);
try {
recorder.prepare();
recorder.start();
} catch (IOException e) {
e.printStackTrace();
}
myHandler.postDelayed(UpdateUploadLength, 200);
}
.
public void EndRecord() {
recording = false;
try {
recorder.stop();
recorder.reset();
recorder = null;
} catch (IllegalStateException e) {
e.printStackTrace();
}
MediaPlayer m = new MediaPlayer();
try {
m.setDataSource(outputFile);
} catch (IOException e) {
e.printStackTrace();
}
try {
m.prepare();
} catch (IOException e) {
e.printStackTrace();
}
m.start();
}
I need the recording to be a maximum of 27 seconds. To avoid complications, I tested without this extra termination condition and am including the Runnable just for completeness.
private Runnable UpdateUploadLength = new Runnable(){
@Override
public void run() {
if(recording == true) {
record_length += 0.2;
if (record_length < 27) {
myHandler.postDelayed(UpdateUploadLength, 200);
} else {
//TODO: stop recording
myHandler.removeCallbacks(UpdateUploadLength);
}
}
}
};
I've been trying for a few hours with no luck, so any help is appreciated. (Also, I don't know if it's bad to ask multiple questions in the same post, but is there any way to get better audio quality from MediaRecorder?)
Thanks in advance.
This is an answer to your second question. Yes, you can get much better quality. The library offers more encoding types, file formats, and parameters. Example:
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setAudioSamplingRate(44100);
mediaRecorder.setAudioEncodingBitRate(256000);
This code will set your recorder to produce m4a (MPEG-4) files with AAC encoding, a 44.1 kHz sampling rate, and a bitrate of around 256 kbps.
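As a sketch of how those calls could slot into your existing StartRecord() (outputFile is your existing path, which would presumably end in .m4a for this container), note that the order of the configuration calls matters:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);          // source first
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);     // then the container
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);        // then the codec
recorder.setAudioSamplingRate(44100);
recorder.setAudioEncodingBitRate(256000);
recorder.setOutputFile(outputFile);
try {
    recorder.prepare();                                          // all configuration must happen before prepare()
    recorder.start();
} catch (IOException e) {
    e.printStackTrace();
}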
In my application I need to play a sound at different moments and wait for it to end before doing anything else. Right now I have this code for playing sound:
private MediaPlayer mPlayer = null;
private boolean playSound(){
mPlayer = new MediaPlayer();
try{
mPlayer.setDataSource(openFileInput(fileName).getFD());
mPlayer.prepare();
mPlayer.start();
} catch (Exception e){
e.printStackTrace();
Log.e("player", "error playing sound: "+fileName);
return false;
}
while(mPlayer.isPlaying());
mPlayer.release();
mPlayer = null;
return true;
}
It's not working as I expected: when I change an image before playing the sound, the change only appears after the sound has finished playing (I suppose it's because of the UI thread queue or something like that).
I don't know if this is a bad solution or if there is any better solution for my case:
the sound playing must start after all previous work is finished.
the app must not do any other work while playing the sound.
the code to execute after the sound playing is not always the same.
You should implement a completion listener:
public class myclass extends .... implements OnCompletionListener, ... {
private int nextAction; // not the smartest solution, but good enough :)
...
private boolean playSound(int next){
nextAction = next;
mPlayer = new MediaPlayer();
try{
mPlayer.setDataSource(openFileInput(fileName).getFD());
mPlayer.prepare();
mPlayer.setOnCompletionListener(this);
mPlayer.start();
} catch (Exception e){
e.printStackTrace();
Log.e("player", "error playing sound: "+fileName);
return false;
}
return true;
}
@Override
public void onCompletion(MediaPlayer mp) {
// do your stuff, destroy the mplayer, if needed
switch(nextAction){
case ACTION1: ...; break;
case ACTION2: ...; break; //..and so on
}
}
}
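A call site could then look roughly like this; ACTION1 and ACTION2 are hypothetical int constants you would define for your own follow-up work:
private static final int ACTION1 = 1; // e.g. "show next screen"
private static final int ACTION2 = 2; // e.g. "start next round"

// after all the previous work is finished:
if (playSound(ACTION1)) {
    // playback has started; onCompletion() will run the ACTION1 branch when the sound ends,
    // so nothing here needs to block or busy-wait
}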
I'm making an app that streams audio from a site. I've got a menu, and when I click the button to open the radio activity it can take 8-20 seconds to load and sometimes force closes. Any help would be awesome, thanks.
Code:
public class Radio extends Activity {
private MediaPlayer mp;
private ImageButton pauseicon;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.player_1);
pauseicon = (ImageButton) findViewById(R.id.pauseicon);
getActionBar().setDisplayHomeAsUpEnabled(true);
/**
* Play button click event plays a song and changes button to pause
* image pauses a song and changes button to play image
* */
String res = "http://216.235.91.36/play?s=magic24point7&d=LIVE365&r=0&membername=&session=magic24point7:0&AuthType=NORMAL&app_id=live365%3ABROWSER&SaneID=24.79.96.172-13316781890137014897763&tag=live365";
mp = new MediaPlayer();
try {
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp.setDataSource(res);
mp.prepare();
mp.start();
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
}
pauseicon.setOnClickListener(new View.OnClickListener() {
public void onClick(View v) {
// TODO Auto-generated method stub
// No need to check if it is pauseicon
if (mp.isPlaying()) {
mp.pause();
((ImageButton) v).setImageResource(R.drawable.playicon);
} else {
mp.start();
((ImageButton) v).setImageResource(R.drawable.pauseicon);
}
}
});
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case android.R.id.home:
NavUtils.navigateUpFromSameTask(this);
if (mp != null)
if (mp.isPlaying())
mp.stop();
mp.release();
return true;
default:
return super.onOptionsItemSelected(item);
}
}
@Override
public void onBackPressed() {
if (mp != null) {
if (mp.isPlaying())
mp.stop();
mp.release();
}
// there is no reason to call super.finish(); here
// call super.onBackPressed(); and it will finish that activity for you
super.onBackPressed();
}
}
Use prepareAsync() and setOnPreparedListener() instead of prepare(). prepare() blocks the UI thread until it returns and is not recommended for a stream. This may be the cause of your crash.
mp = new MediaPlayer();
try {
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp.setDataSource(res);
mp.setOnPreparedListener(new OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer player) {
mp.start();
}
});
mp.prepareAsync();
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
}
http://developer.android.com/reference/android/media/MediaPlayer.html#prepare()
Prepares the player for playback, synchronously. After setting the datasource and the display surface, you need to either call prepare() or prepareAsync(). For files, it is OK to call prepare(), which blocks until MediaPlayer is ready for playback.
Otherwise I think the network is your bottleneck. The fastest way to speed things up is to ensure your server/client communication is quick. There doesn't seem to be anything inherently slow about your code.
I've been working on an app where the user touches the screen to start a movie. The screen image is the first frame of the movie. Once the touch happens, the movie plays. I do this by putting a jpg of the first frame in front of the movie, and then removing the jpg once I think the movie is playing. (Figuring out when that happens is impossible, but that's another issue. And on older devices if you remove the image too soon, you get black.)
Tested this on probably six different devices. Today the seventh: Kindle Fire HD. On this device, the movies are all brighter than the corresponding jpgs. On all other devices, they match perfectly. Any ideas what could cause this or how to fix?
(Another issue with the HD is that movies take a REALLY long time to start playing. But that's another issue.)
EDIT: here is my main.xml:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:orientation="vertical" >
<ImageView
android:id="#+id/iv"
android:layout_width="fill_parent"
android:layout_height="fill_parent"/>
<VideoView
android:id="#+id/vv"
android:layout_width="fill_parent"
android:layout_height="fill_parent"/>
</FrameLayout>
and here is code:
public class VideoTestActivity extends Activity implements SurfaceHolder.Callback, OnPreparedListener, OnCompletionListener {
private VideoView vv;
private ImageView iv;
private Bitmap b;
private MediaPlayer mp = new MediaPlayer();
private static final String TAG = VideoTestActivity.class.getSimpleName();
private volatile boolean prepared = false;
private volatile boolean readytoplay = false;
private volatile boolean playing = false;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.main);
iv = (ImageView)findViewById(R.id.iv);
iv.bringToFront();
vv = (VideoView)findViewById(R.id.vv);
b = BitmapFactory.decodeResource(getResources(), R.drawable.babyblack);
iv.setBackgroundColor( 0xFFDFA679 );
vv.getHolder().addCallback(this);
mp.setOnPreparedListener( this );
mp.setOnCompletionListener( this );
try {
mp.setDataSource( this, Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.ape) );
} catch (IllegalArgumentException e) {
Log.d(TAG,"illegal argument exception on set data source");
e.printStackTrace();
} catch (SecurityException e) {
Log.d(TAG,"security exception on set data source");
e.printStackTrace();
} catch (IllegalStateException e) {
Log.d(TAG,"illegal state exception on set data source");
e.printStackTrace();
} catch (IOException e) {
Log.d(TAG,"IO exception on set data source");
e.printStackTrace();
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
float dx, dy;
Log.d(TAG,"touch event");
if ( !playing && event.getAction() == MotionEvent.ACTION_UP ) {
Log.d(TAG,"action up");
if ( prepared ) {
playing = true;
Log.d(TAG,"hardware accelerated: iv="+iv.isHardwareAccelerated()+", vv="+vv.isHardwareAccelerated());
mp.start();
Log.d(TAG, "playing video in onTouch callback");
Log.d(TAG,"hardware accelerated: iv="+iv.isHardwareAccelerated()+", vv="+vv.isHardwareAccelerated());
} else
readytoplay = true;
}
return true;
}
@Override
public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
// TODO Auto-generated method stub
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
// TODO Auto-generated method stub
Log.d(TAG,"surface is created");
mp.setDisplay( vv.getHolder() );
try {
mp.prepareAsync();
} catch (IllegalArgumentException e) {
Log.d(TAG,"illegal argument exception on prepare");
e.printStackTrace();
} catch (SecurityException e) {
Log.d(TAG,"security exception on prepare");
e.printStackTrace();
} catch (IllegalStateException e) {
Log.d(TAG,"illegal state exception on prepare");
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
// TODO Auto-generated method stub
}
@Override
public void onPrepared(MediaPlayer mp) {
Log.d(TAG,"video is prepared");
prepared = true;
if ( readytoplay ) {
playing = true;
mp.start();
iv.setVisibility( View.GONE );
Log.d(TAG,"playing video from prepared callback");
}
}
@Override
public void onCompletion(MediaPlayer arg0) {
Log.d(TAG,"video is done");
playing = false;
iv.setVisibility( View.VISIBLE );
}}
I changed the ImageView to have no image, but just a solid-colored background. The only data file you need is an mp4 movie. When you touch the screen, the movie plays, hidden behind the ImageView. The screen immediately brightens when I touch it (mp.start() happens), then the movie starts playing, and it gradually dims a bit, then brightens again, and finally stabilizes when the movie is done.
I tried hardware acceleration, and no hardware acceleration; no difference. I tried plugging the Kindle Fire HD in, and not plugging it in; no difference.
I would post the 2-second mp4 file that I am using but don't know how.
Looks like it's by design, per this forum post: https://forums.developer.amazon.com/forums/thread.jspa?threadID=450#1780