Kindle Fire HD: movies are brighter than jpgs - android

I've been working on an app where the user touches the screen to start a movie. The screen image is the first frame of the movie. Once the touch happens, the movie plays. I do this by putting a jpg of the first frame in front of the movie, and then removing the jpg once I think the movie is playing. (Figuring out when that happens is impossible, but that's another issue. And on older devices if you remove the image too soon, you get black.)
Tested this on probably six different devices. Today the seventh: Kindle Fire HD. On this device, the movies are all brighter than the corresponding jpgs. On all other devices, they match perfectly. Any ideas what could cause this or how to fix?
(Another issue with the HD is that movies take a REALLY long time to start playing. But that's another issue.)
EDIT: here is my main.xml:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >
    <ImageView
        android:id="@+id/iv"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"/>
    <VideoView
        android:id="@+id/vv"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"/>
</FrameLayout>
and here is code:
public class VideoTestActivity extends Activity implements SurfaceHolder.Callback, OnPreparedListener, OnCompletionListener {
private VideoView vv;
private ImageView iv;
private Bitmap b;
private MediaPlayer mp = new MediaPlayer();
private static final String TAG = VideoTestActivity.class.getSimpleName();
private volatile boolean prepared = false;
private volatile boolean readytoplay = false;
private volatile boolean playing = false;
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.main);
iv = (ImageView)findViewById(R.id.iv);
iv.bringToFront();
vv = (VideoView)findViewById(R.id.vv);
b = BitmapFactory.decodeResource(getResources(), R.drawable.babyblack);
iv.setBackgroundColor( 0xFFDFA679 );
vv.getHolder().addCallback(this);
mp.setOnPreparedListener( this );
mp.setOnCompletionListener( this );
try {
mp.setDataSource( this, Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.ape) );
} catch (IllegalArgumentException e) {
Log.d(TAG,"illegal argument exception on set data source");
e.printStackTrace();
} catch (SecurityException e) {
Log.d(TAG,"security exception on set data source");
e.printStackTrace();
} catch (IllegalStateException e) {
Log.d(TAG,"illegal state exception on set data source");
e.printStackTrace();
} catch (IOException e) {
Log.d(TAG,"IO exception on set data source");
e.printStackTrace();
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
float dx, dy;
Log.d(TAG,"touch event");
if ( !playing && event.getAction() == MotionEvent.ACTION_UP ) {
Log.d(TAG,"action up");
if ( prepared ) {
playing = true;
Log.d(TAG,"hardware accelerated: iv="+iv.isHardwareAccelerated()+", vv="+vv.isHardwareAccelerated());
mp.start();
Log.d(TAG, "playing video in onTouch callback");
Log.d(TAG,"hardware accelerated: iv="+iv.isHardwareAccelerated()+", vv="+vv.isHardwareAccelerated());
} else
readytoplay = true;
}
return true;
}
@Override
public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
// TODO Auto-generated method stub
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
// TODO Auto-generated method stub
Log.d(TAG,"surface is created");
mp.setDisplay( vv.getHolder() );
try {
mp.prepareAsync();
} catch (IllegalArgumentException e) {
Log.d(TAG,"illegal argument exception on prepare");
e.printStackTrace();
} catch (SecurityException e) {
Log.d(TAG,"security exception on prepare");
e.printStackTrace();
} catch (IllegalStateException e) {
Log.d(TAG,"illegal state exception on prepare");
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
// TODO Auto-generated method stub
}
@Override
public void onPrepared(MediaPlayer mp) {
Log.d(TAG,"video is prepared");
prepared = true;
if ( readytoplay ) {
playing = true;
mp.start();
iv.setVisibility( View.GONE );
Log.d(TAG,"playing video from prepared callback");
}
}
@Override
public void onCompletion(MediaPlayer arg0) {
Log.d(TAG,"video is done");
playing = false;
iv.setVisibility( View.VISIBLE );
}}
I changed the ImageView to have no image, but just a solid-colored background. The only data file you need is an mp4 movie. When you touch the screen, the movie plays, hidden behind the ImageView. The screen immediately brightens when I touch it (mp.start() happens), then the movie starts playing, and it gradually dims a bit, then brightens again, and finally stabilizes when the movie is done.
I tried hardware acceleration, and no hardware acceleration; no difference. I tried plugging the Kindle Fire HD in, and not plugging it in; no difference.
I would post the 2-second mp4 file that I am using but don't know how.

Looks like it's by design, per this forum post: https://forums.developer.amazon.com/forums/thread.jspa?threadID=450#1780

Related

Android MediaPlayer: Multiple sounds playing at the same time

I have searched and researched Stack Overflow and Google but can't find any answer to MY question. I've found other questions and answers, but they were related to sounds saved in the app. I'm creating an app which gets data from a Parse server: it fetches mp3 files, displays them in a ListView, and when an item is clicked it plays that track. But here comes the problem: when you play a sound and click on another one, the first just doesn't stop and the second starts to play.
I have tried with the following code but it's just not working.
Here's my code:
play.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
final MediaPlayer mediaPlayer = new MediaPlayer();
final MediaPlayer scndmediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
scndmediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
if (mediaPlayer.isPlaying()) {
Toast.makeText(getContext(), "First is playing", Toast.LENGTH_SHORT).show();
try {
mediaPlayer.stop();
scndmediaPlayer.setDataSource(audioFileURL);
scndmediaPlayer.prepare();
scndmediaPlayer.start();
//soundtoolbar.setTitle(name);
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e1) {
e1.printStackTrace();
} catch (IllegalStateException e1) {
e1.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
} else {
try {
if (scndmediaPlayer.isPlaying()){
scndmediaPlayer.stop();
}
Toast.makeText(getContext(), "First is starting", Toast.LENGTH_SHORT).show();
mediaPlayer.setDataSource(audioFileURL);
mediaPlayer.prepare();
mediaPlayer.start();
soundtoolbar.setTitle(name);
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e1) {
e1.printStackTrace();
} catch (IllegalStateException e1) {
e1.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
playPause.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (mediaPlayer.isPlaying() || scndmediaPlayer.isPlaying()) {
mediaPlayer.pause();
scndmediaPlayer.pause();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_play_arrow_white_24dp));
} else {
mediaPlayer.start();
scndmediaPlayer.start();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_pause_white_24dp));
}
}
});
}
});
I've created two MediaPlayers with the code above, and when the user clicks the play button it first checks whether either player is running.
I'm trying to achieve the following: when the user clicks the play button, it checks whether the (1st) mediaPlayer is running. If it's running, it should stop it and launch the (2nd) scndmediaPlayer, or vice versa: if the second is playing, stop it and launch the first. So it creates a loop: 1st is playing? User clicks another button: stop the first, launch the second. User clicks another button: first is playing? No. Second is playing? Yes. Stop the second and launch the first.
But I can't find where the problem is in my code.
Please help me with this. I've been trying to resolve it for 2 days but I'm unable to...
Thanks :)
EDIT: I tried using one MediaPlayer and doing the following: check if the MediaPlayer is playing. No, it isn't. Start it. The user clicks the button again, and it stops the MediaPlayer and starts it with the new audioFileURL. BUT: the MediaPlayer forgets that it's playing. It seems like it just starts the track and then forgets. To check whether that's true I set a Toast: when the MediaPlayer isn't playing, the toast shows, and it shows every time I click a track in the list, which means it forgets that it has a track playing...
EDIT 2: I managed to get the following behavior: it plays the track. The user clicks another track. It stops the MediaPlayer but doesn't play the new track. The user clicks once again. It plays the new track. The user clicks the new track and the app crashes...
EDIT 3: Posting my entire class:
public class MyAdapter extends ParseQueryAdapter<ParseObject> {
public Button playPause, next, previous;
public Toolbar soundtoolbar;
boolean isPlaying = false;
public MyAdapter(Context context) {
super(context, new ParseQueryAdapter.QueryFactory<ParseObject>() {
public ParseQuery create() {
ParseQuery query = new ParseQuery("MyClass");
query.orderByDescending("createdAt");
return query;
}
});
}
@Override
public View getItemView(final ParseObject object, View v, final ViewGroup parent) {
if (v == null) {
v = View.inflate(getContext(), R.layout.activity_audio_files_item, null);
}
super.getItemView(object, v, parent);
final Button play = (Button) v.findViewById(R.id.play);
playPause = TabFragment1.playPause;
next = TabFragment1.next;
previous = TabFragment1.previous;
soundtoolbar = TabFragment1.soundtoolbar;
final ParseFile descr = object.getParseFile("audiofile");
final String name = object.getString("name");
final String audioFileURL = descr.getUrl();
final SlidingUpPanelLayout slidingUpPanelLayout = TabFragment1.spanel;
play.setText(name);
final MediaPlayer mediaPlayer = new MediaPlayer();
play.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (isPlaying != true) {
Toast.makeText(getContext(), name+" is playing", Toast.LENGTH_SHORT).show();
try {
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.setDataSource(audioFileURL);
mediaPlayer.prepareAsync();
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
soundtoolbar.setTitle(name);
slidingUpPanelLayout.setPanelState(SlidingUpPanelLayout.PanelState.EXPANDED);
mediaPlayer.start();
isPlaying = true;
}
});
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e2) {
e2.printStackTrace();
} catch (IllegalStateException e3) {
e3.printStackTrace();
} catch (IOException e4) {
e4.printStackTrace();
} catch (NullPointerException e5) {
e5.printStackTrace();
}
} else {
mediaPlayer.stop();
Toast.makeText(getContext(), "Starting "+name, Toast.LENGTH_SHORT).show();
try {
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mediaPlayer.prepareAsync();
mediaPlayer.setDataSource(audioFileURL);
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
soundtoolbar.setTitle(name);
slidingUpPanelLayout.setPanelState(SlidingUpPanelLayout.PanelState.EXPANDED);
mediaPlayer.start();
}
});
} catch (IllegalArgumentException e1) {
e1.printStackTrace();
} catch (SecurityException e2) {
e2.printStackTrace();
} catch (IllegalStateException e3) {
e3.printStackTrace();
} catch (IOException e4) {
e4.printStackTrace();
} catch (NullPointerException e5){
e5.printStackTrace();
}
}
playPause.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (mediaPlayer.isPlaying()) {
mediaPlayer.pause();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_play_arrow_white_24dp));
} else {
mediaPlayer.start();
playPause.setBackground(getContext().getDrawable(R.drawable.ic_pause_white_24dp));
}
}
});
}
});
return v;
}
}
Somebody please help...
I suggest you check out SoundPool; it will help you. One more thing: you could put the media URLs into an array (or something like it) and use one MediaPlayer. That way you avoid having two MediaPlayers, and you avoid a memory leak.
http://developer.android.com/reference/android/media/SoundPool.html
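To make the single-player suggestion concrete: with one player and a list of URLs, the click handler only ever has to stop the one current track before starting the next. Below is a minimal, pure-Java sketch of that switching logic; the Player interface is hypothetical, standing in for the subset of MediaPlayer used in the question (play/stop/isPlaying), so it models the control flow rather than the Android API:

```java
import java.util.Objects;

public class TrackSwitcher {
    // Hypothetical stand-in for the MediaPlayer calls used in the question.
    public interface Player {
        void play(String url);
        void stop();
        boolean isPlaying();
    }

    private final Player player;
    private String currentUrl;

    public TrackSwitcher(Player player) {
        this.player = player;
    }

    // One handler for every list item: with a single player, the old
    // track is always stopped before (or instead of) starting a new one.
    public void onTrackClicked(String url) {
        boolean wasPlaying = player.isPlaying();
        if (wasPlaying) {
            player.stop();
        }
        // Clicking the currently playing track toggles it off;
        // clicking any other track switches to it.
        if (!wasPlaying || !Objects.equals(url, currentUrl)) {
            player.play(url);
        }
        currentUrl = url;
    }
}
```

With this shape there is never a second MediaPlayer to leak: each click either toggles the current track or replaces it.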

How to capture screenshot or video frame of VideoView in Android

I've tried the following tutorial from this blog ( http://android-er.blogspot.kr/2013/05/get-current-frame-in-videoview-using.html ), which shows how to capture a video frame using MediaMetadataRetriever from a video source. However, it only works if the video is located locally on the phone.
Is there a way to capture a video frame while the VideoView is streaming the video over IP?
VideoView is a subclass of SurfaceView, so PixelCopy can take a screenshot of a VideoView as well.
Put this method in some Util class
/**
* Pixel copy to copy SurfaceView/VideoView into BitMap
*/
fun usePixelCopy(videoView: SurfaceView, callback: (Bitmap?) -> Unit) {
val bitmap: Bitmap = Bitmap.createBitmap(
videoView.width,
videoView.height,
Bitmap.Config.ARGB_8888
);
try {
// Create a handler thread to offload the processing of the image.
val handlerThread = HandlerThread("PixelCopier");
handlerThread.start();
PixelCopy.request(
videoView, bitmap,
PixelCopy.OnPixelCopyFinishedListener { copyResult ->
if (copyResult == PixelCopy.SUCCESS) {
callback(bitmap)
}
handlerThread.quitSafely();
},
Handler(handlerThread.looper)
)
} catch (e: IllegalArgumentException) {
callback(null)
// PixelCopy may throw IllegalArgumentException, make sure to handle it
e.printStackTrace()
}
}
Usage:
usePixelCopy(videoView) { bitmap: Bitmap? ->
processBitMp(bitmap)
}
I have found a solution to this problem. It appears that the VideoView does not allow this, for low-level hardware/GPU reasons, while it uses a SurfaceView.
The solution is to use a TextureView and a MediaPlayer to play a video inside it. The Activity will need to implement TextureView.SurfaceTextureListener. When taking a screenshot with this solution, the video freezes for a short while. Also, the TextureView does not display a default UI for the playback controls (play, pause, FF/RW, play time, etc.). That is one drawback. If you have another solution, please let me know :)
Here is the solution:
public class TextureViewActivity extends Activity
implements TextureView.SurfaceTextureListener,
OnBufferingUpdateListener,
OnCompletionListener,
OnPreparedListener,
OnVideoSizeChangedListener
{
private MediaPlayer mp;
private TextureView tv;
public static String MY_VIDEO = "https://www.blahblahblah.com/myVideo.mp4";
public static String TAG = "TextureViewActivity";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_texture_view);
tv = (TextureView) findViewById(R.id.textureView1);
tv.setSurfaceTextureListener(this);
}
public void getBitmap(TextureView vv)
{
String mPath = Environment.getExternalStorageDirectory().toString()
+ "/Pictures/" + Utilities.getDayTimeString() + ".png";
Toast.makeText(getApplicationContext(), "Capturing Screenshot: " + mPath, Toast.LENGTH_SHORT).show();
Bitmap bm = vv.getBitmap();
if(bm == null)
Log.e(TAG,"bitmap is null");
OutputStream fout = null;
File imageFile = new File(mPath);
try {
fout = new FileOutputStream(imageFile);
bm.compress(Bitmap.CompressFormat.PNG, 90, fout);
fout.flush();
fout.close();
} catch (FileNotFoundException e) {
Log.e(TAG, "FileNotFoundException");
e.printStackTrace();
} catch (IOException e) {
Log.e(TAG, "IOException");
e.printStackTrace();
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.media_player_video, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
if (id == R.id.action_settings) {
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height)
{
Surface s = new Surface(surface);
try
{
mp = new MediaPlayer();
mp.setDataSource(MY_VIDEO);
mp.setSurface(s);
mp.prepare();
mp.setOnBufferingUpdateListener(this);
mp.setOnCompletionListener(this);
mp.setOnPreparedListener(this);
mp.setOnVideoSizeChangedListener(this);
mp.setAudioStreamType(AudioManager.STREAM_MUSIC);
mp.start();
Button b = (Button) findViewById(R.id.textureViewButton);
b.setOnClickListener(new OnClickListener(){
@Override
public void onClick(View v)
{
TextureViewActivity.this.getBitmap(tv);
}
});
}
catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}

"Skipped frames. The application may be doing too much work on its main thread" error on Android 4.3 Nexus 7 [duplicate]

This question already has answers here:
The application may be doing too much work on its main thread
(21 answers)
Closed 1 year ago.
import java.io.BufferedReader;
public class Main extends Activity implements SurfaceHolder.Callback,
MediaPlayer.OnCompletionListener, View.OnClickListener, OnInitListener {
String SrcPath = "";
MediaPlayer mp;
SurfaceView mSurfaceView;
private SurfaceHolder holderrrr;
Boolean play = false;
String t_alarm1 = "alarm.xml", t_alarm2 = "alarm2.xml", text;
// TextToSpeach
private TextToSpeech mText2Speech;
@Override
protected void onCreate(Bundle savedInstanceState) {
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder()
.permitAll().build();
StrictMode.setThreadPolicy(policy);
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
addListenerOnButton();
mText2Speech = new TextToSpeech(Main.this, Main.this);
}
// menu
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.main, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case R.id.action_settings: {
Intent myIntent = new Intent(this, menu.class);
Main.this.startActivity(myIntent);
return true;
}
}
return true;
}
// exit watcher (double back-press to quit)
private static final long DOUBLE_PRESS_INTERVAL = 2000000000; // 2 seconds, in nanoseconds
private long lastPressTime;
@Override
public void onBackPressed() {
Toast.makeText(Main.this, getString(R.string.kilepes_dupla),
Toast.LENGTH_SHORT).show();
long pressTime = System.nanoTime();
if (pressTime - lastPressTime <= DOUBLE_PRESS_INTERVAL) {
// this is a double click event
System.exit(0);
}
lastPressTime = pressTime;
}
public void onInit(int status) {
if (status == TextToSpeech.SUCCESS) {
mText2Speech.setLanguage(Locale.getDefault()); // default language for TTS
}
}
private void addListenerOnButton() {
final Button button5 = (Button) findViewById(R.id.button5);
button5.setOnClickListener(new View.OnClickListener() {
public void onClick(View v) {
// Perform action on click
if (play) {
try{
mp.stop();
mp.release();
mp = null;
}catch(Exception e){
}
button5.setText("Start");
play = false;
} else {
try {
mp = new MediaPlayer();
mSurfaceView = (SurfaceView) findViewById(R.id.surface);
holderrrr = mSurfaceView.getHolder();
play = true;
button5.setText("Stop");
surfaceCreated(holderrrr);
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
});
} // end of addListenerOnButton
public void surfaceCreated(SurfaceHolder holder) {
mp.setDisplay(holder);
holder = mSurfaceView.getHolder();
holder.addCallback(this);
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
try {
mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer mp) {
mp.stop();
mp.release();
Toast.makeText(Main.this,
"Video playback finished!",
Toast.LENGTH_SHORT).show();
// button5.setText("Start");
//play = false;
}
});
mp.setDataSource(SrcPath);
mp.prepare();
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
// Get the dimensions of the video
// int videoWidth = mp.getVideoWidth();
// int videoHeight = mp.getVideoHeight();
// Get the width of the screen
// int screenWidth = getWindowManager().getDefaultDisplay().getWidth();
// Get the SurfaceView layout parameters
android.view.ViewGroup.LayoutParams lp = mSurfaceView.getLayoutParams();
// Set the width of the SurfaceView to the width of the screen
// lp.width = screenWidth;
lp.width = 420;
// Set the height of the SurfaceView to match the aspect ratio of the
// video
// be sure to cast these as floats otherwise the calculation will likely
// be 0
// lp.height = (int) (((float) videoHeight / (float) videoWidth) *
// (float) screenWidth);
lp.height = 390;
// Commit the layout parameters
mSurfaceView.setLayoutParams(lp);
// Start video
mp.start();
}
}
}
It works fine on Android 4.1.2 on a Galaxy S3, but it gives me the error message in the title, and it doesn't show the first 3 seconds of video...
Please give me some advice or a solution, because I have no idea how to get rid of this kind of error.
In simple terms, this error means you are asking your Android system to do too much work on the main thread of this particular application. There are some good general answers on this error, one good example being:
https://stackoverflow.com/a/21126690/334402
For your specific example, you may simply be asking it to do too much video-related work, which is very processor-hungry, on the main thread. It would be worth looking at your prepare() call, for example; if you are using a streamed source, consider using prepareAsync() instead. See below, from the Android documentation:
public void prepareAsync ()
Added in API level 1. Prepares the player for playback, asynchronously.
After setting the datasource and the display surface, you need to
either call prepare() or prepareAsync(). For streams, you should call
prepareAsync(), which returns immediately, rather than blocking until
enough data has been buffered.
One reason why you may be seeing problems on the Nexus 7 and not on the Galaxy S3 is that the Nexus has a significantly bigger screen, and your video source may offer different encodings for different-sized devices; the larger ones quite likely require more processing to decode and render.

Is there a faster way to switch a MediaPlayer's datasource?

I am new to Android development, so I am reaching out to see if there is a more efficient or faster way to switch a MediaPlayer data source with an onTouch method. I'm trying to create an instrument that will play like a flute, but the audio source won't switch fast enough when I press (touch) the buttons.
I am using the playNote() method to switch between the audio files. Any advice is appreciated.
public class PlayAggeion extends Activity {
ImageButton patC1;
int soundIsOn = 1;
MediaPlayer mp;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_play_aggeion);
onConfigurationChanged(null);
addListenerPatima();
mp = new MediaPlayer();
playNote(R.raw.aa);
}
public void addListenerPatima() {
patC1 = (ImageButton) findViewById(R.id.patC1);
patC1.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
switch(event.getAction())
{
case MotionEvent.ACTION_DOWN:
playNote(R.raw.bb);
return true;
case MotionEvent.ACTION_UP:
playNote(R.raw.aa);
return true;
}
return false;
};
});
}
public void playNote(int note){
// Play note
try {
mp.reset();
mp.setDataSource(getApplicationContext(), Uri.parse("android.resource://" + getPackageName() + "/" + note));
mp.prepare();
mp.setLooping(true);
mp.start();
} catch (IllegalArgumentException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
I think you should use SoundPool instead of MediaPlayer. SoundPool lets you preload a number of sound files and play them one after another without additional delay. It is often used in games and soundboard apps, so it seems to match your needs perfectly.
More info:
http://developer.android.com/reference/android/media/SoundPool.html
Nice tutorial:
http://www.vogella.com/articles/AndroidMedia/article.html#tutorial_soundpool

is it possible to display video information from an rtsp stream in an android app UI

I have managed to get a working video player that can stream RTSP links; however, I'm not sure how to display the video's current time position in the UI. I have used the getDuration and getCurrentPosition calls, stored this information in a string, and tried to display it in the UI, but it doesn't seem to work.
**in main.xml:**
<TextView android:id="@+id/player"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_margin="1px"
android:text="@string/cpos"
/>
**in strings.xml:**
<string name="cpos"></string>
**in Player.java**
private void playVideo(String url) {
try {
media.setEnabled(false);
if (player == null) {
player = new MediaPlayer();
player.setScreenOnWhilePlaying(true);
} else {
player.stop();
player.reset();
}
player.setDataSource(url);
player.getCurrentPosition();
player.setDisplay(holder);
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setOnPreparedListener(this);
player.prepareAsync();
player.setOnBufferingUpdateListener(this);
player.setOnCompletionListener(this);
} catch (Throwable t) {
Log.e(TAG, "Exception in media prep", t);
goBlooey(t);
try {
try {
player.prepare();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Log.v(TAG, "Duration: ===> " + player.getDuration());
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
private Runnable onEverySecond = new Runnable() {
public void run() {
if (lastActionTime > 0
&& SystemClock.elapsedRealtime() - lastActionTime > 3000) {
clearPanels(false);
}
if (player != null) {
timeline.setProgress(player.getCurrentPosition());
//stores getCurrentPosition as a string
cpos = String.valueOf(player.getCurrentPosition());
System.out.print(cpos);
}
if (player != null) {
timeline.setProgress(player.getDuration());
//stores getDuration as a string
cdur = String.valueOf(player.getDuration());
System.out.print(cdur);
}
if (!isPaused) {
surface.postDelayed(onEverySecond, 1000);
}
}
};
Your code snippet looks significantly like my vidtry sample. getCurrentPosition() and getDuration() work for HTTP streaming, such as for use in updating the progress bar.
I have not tried vidtry with an RTSP video stream, mostly because I don't know of any.
Check the SDP response from the server to ensure that it is sending the duration in the response (live streams don't have a recognizable time and that may cause the client to not provide this information.)
E.g. a live feed will look like:
a=range:npt=0-
Whereas a VoD clip should look like:
a=range:npt=0-399.1680
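As an illustration of the distinction above, the VoD duration can be parsed out of such a range line. This is a sketch, not part of the original answer; the SdpRange class name is just for illustration:

```java
public class SdpRange {
    // Parse an SDP line like "a=range:npt=0-399.1680" into a duration
    // in seconds. Returns -1 for a live stream ("a=range:npt=0-"),
    // whose range is open-ended, or for an unrecognized line.
    public static double durationSeconds(String rangeLine) {
        String prefix = "a=range:npt=";
        if (!rangeLine.startsWith(prefix)) return -1;
        String npt = rangeLine.substring(prefix.length());
        int dash = npt.indexOf('-');
        if (dash < 0 || dash == npt.length() - 1) return -1; // live: no end time
        double start = Double.parseDouble(npt.substring(0, dash));
        double end = Double.parseDouble(npt.substring(dash + 1));
        return end - start;
    }
}
```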
If getCurrentPosition() doesn't work but you know the duration (either getDuration() works or you have an alternate way of getting this information), you could calculate the position by watching the buffering events and tracking it yourself. Your approach is more desirable than this one, though.
If I got you right, you want to show the elapsed time in a TextView, e.g. as hh:mm:ss?
If so, I'll give you a little walkthrough on how to do that.
private TextView mElapsedTimeText;
private VideoView mVideoView;
private Thread mThread;
@Override
public void onCreate(Bundle savedInstanceState) {
/* here goes your code */
// let's assume that your IDs are elapsedId and videoId
mElapsedTimeText = (TextView) findViewById(R.id.elapsedId);
mVideoView = (VideoView) findViewById(R.id.videoId);
mThread = new Thread() {
@Override
public void run() {
mElapsedTimeText.setText(getNiceString());
mVideoView.postDelayed(mThread, 1000);
}
};
/* here goes your code */
}
public String getNiceString() {
String result = "";
int position = mVideoView.getCurrentPosition();
/* here goes your code */
//result is hh:mm:ss formatted string
return result;
}
@Override
public void onPrepared(MediaPlayer mp) {
/* here goes your code */
// you have to trigger the process somewhere
mVideoView.postDelayed(mThread, 1000);
/* here goes your code */
}
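A plain-Java way to fill in the getNiceString() body left as an exercise above, assuming the position is in milliseconds (which is what getCurrentPosition() returns). The TimeFormat class name is just for illustration:

```java
public class TimeFormat {
    // Format a media position given in milliseconds as hh:mm:ss.
    public static String toHms(int positionMs) {
        int totalSeconds = positionMs / 1000;
        int hours = totalSeconds / 3600;
        int minutes = (totalSeconds % 3600) / 60;
        int seconds = totalSeconds % 60;
        return String.format("%02d:%02d:%02d", hours, minutes, seconds);
    }

    public static void main(String[] args) {
        System.out.println(toHms(3723000)); // prints 01:02:03
    }
}
```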
And one more thing I forgot to mention. In order to make this work your activity class has to implement the OnPreparedListener interface.
I hope you or someone else will find this post useful.
Best regards,
Igor
