How to add a pause and resume feature while recording video in Android

Requirement:
I want to develop an app that records video with a pause and resume feature.
What I have tried:
I have implemented video recording using a SurfaceView.
What I have already researched:
I have searched many sites, but so far I haven't found a solution. I know there is no built-in pause/resume option for video recording in Android, and that it can be achieved by recording separate clips and merging them.
What I need:
Please share any external library available for this, or guide me on how to achieve it if you already have. Also, please share any resource on how to merge videos; I have searched but haven't found a proper one.

Finally I found the answer :)
I researched FFmpeg and spent several more days digging into it, but couldn't find proper resources for it, so I tried the mp4parser library instead and successfully completed my requirement.
Code for merging multiple videos:
public class MergeVide extends AsyncTask<String, Integer, String> {

    @Override
    protected void onPreExecute() {
        progressDialog = ProgressDialog.show(Video.this,
                "Preparing for upload", "Please wait...", true);
        // do initialization of required objects here
    }

    @Override
    protected String doInBackground(String... params) {
        try {
            // Build a Movie object for each recorded clip
            // (path + filename + "1.mp4", "2.mp4", ...)
            String[] paths = new String[count];
            Movie[] inMovies = new Movie[count];
            for (int i = 0; i < count; i++) {
                paths[i] = path + filename + String.valueOf(i + 1) + ".mp4";
                inMovies[i] = MovieCreator.build(new FileInputStream(
                        paths[i]).getChannel());
            }

            // Collect the audio and video tracks from every clip
            List<Track> videoTracks = new LinkedList<Track>();
            List<Track> audioTracks = new LinkedList<Track>();
            for (Movie m : inMovies) {
                for (Track t : m.getTracks()) {
                    if (t.getHandler().equals("soun")) {
                        audioTracks.add(t);
                    }
                    if (t.getHandler().equals("vide")) {
                        videoTracks.add(t);
                    }
                }
            }

            // Append the tracks end-to-end into a single output movie
            Movie result = new Movie();
            if (audioTracks.size() > 0) {
                result.addTrack(new AppendTrack(audioTracks
                        .toArray(new Track[audioTracks.size()])));
            }
            if (videoTracks.size() > 0) {
                result.addTrack(new AppendTrack(videoTracks
                        .toArray(new Track[videoTracks.size()])));
            }

            // Write the merged movie to a single mp4 file
            BasicContainer out = (BasicContainer) new DefaultMp4Builder()
                    .build(result);
            FileChannel fc = new RandomAccessFile(
                    Environment.getExternalStorageDirectory() + "/wishbyvideo.mp4",
                    "rw").getChannel();
            out.writeContainer(fc);
            fc.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }

        String mFileName = Environment.getExternalStorageDirectory()
                .getAbsolutePath();
        mFileName += "/wishbyvideo.mp4";
        filename = mFileName;
        return mFileName;
    }

    @Override
    protected void onPostExecute(String value) {
        super.onPostExecute(value);
        progressDialog.dismiss();
        // Hand the merged file off to the preview/upload activity
        Intent i = new Intent(Video.this, VideoUpload.class);
        i.putExtra("videopath", value);
        i.putExtra("id", id);
        i.putExtra("name", name);
        i.putExtra("photo", photo);
        startActivity(i);
        finish();
    }
}
Here count is simply the number of video files. The code above merges the clips and sends the final file path to another activity, where I preview the video.
Before using the code above, make sure you add the mp4parser library to your project.
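For reference, this is the Gradle coordinate used elsewhere in this thread for mp4parser (version 1.0.2 at the time; newer releases may exist):
compile 'com.googlecode.mp4parser:isoparser:1.0.2'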

Related

Casting videos in a queue with Chromecast stops playing automatically after some time

Casting a queue of videos with Chromecast works fine. Per my requirement, video needs to play constantly for hours on the screen. For that I fetch a batch of 5 to 10 video URLs from the server; when only 2 videos remain in the queue, I fetch a new batch and append it. The videos are around 40 to 50 seconds long.
Playback continues for about 45 to 60 minutes, no more than that, and then stops.
I want it to play for hours.
Can anyone help me out of this issue? Any help will be useful.
Here is my code to play the queue.
public void queuePlay(ArrayList<CastModel> data) {
    ArrayList<MediaQueueItem> queueList = new ArrayList<>();
    for (int i = 0; i < data.size(); i++) {
        MediaMetadata mediaMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
        mediaMetadata.putString(MediaMetadata.KEY_TITLE, data.get(i).vTitle);
        mediaMetadata.putString(MediaMetadata.KEY_SUBTITLE, data.get(i).vName);
        mediaMetadata.addImage(new WebImage(Uri.parse(data.get(i).vImage)));
        JSONObject extraData = null;
        try {
            extraData = getJsonOfObject(data.get(i));
            if (extraData == null)
                extraData = new JSONObject();
        } catch (Exception e) {
            Log.i(TAG, "queuePlay: exception " + e.toString());
        }
        MediaInfo mediaInfo = new MediaInfo.Builder(data.get(i).vVideo)
                .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
                .setContentType("video/mp4")
                .setMetadata(mediaMetadata)
                .setCustomData(extraData)
                .setStreamDuration(30 * 1000)
                .build();
        MediaQueueItem item = new MediaQueueItem.Builder(mediaInfo).build();
        queueList.add(item);
    }
    MediaQueueItem[] queueArray = new MediaQueueItem[queueList.size()];
    queueArray = queueList.toArray(queueArray);
    remoteMediaClient = sessionManager.getCurrentCastSession().getRemoteMediaClient();
    remoteMediaClient.queueLoad(queueArray, 0, REPEAT_MODE_REPEAT_OFF, null);
    remoteMediaClient.addListener(new RemoteMediaClient.Listener() {
        @Override
        public void onStatusUpdated() {
            try {
                Thread.sleep(1000); // Hold for a while
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            MediaStatus mMediaStatus = remoteMediaClient.getMediaStatus();
            if (mMediaStatus != null && mMediaStatus.getQueueItems() != null) {
                if (queueItemPlayedPosition < mMediaStatus.getCurrentItemId()) {
                    Log.w(TAG, "onStatusUpdated: Delete video " + queueItemPlayedPosition);
                    updateCastList(false);
                    queueItemPlayedPosition++;
                }
                Log.e(TAG, "onStatusUpdated getCurrentItemId " + remoteMediaClient.getMediaStatus().getCurrentItemId() + " *** onStatusUpdated: getQueueItemCount *** " + mMediaStatus.getQueueItemCount());
            }
        }

        @Override
        public void onMetadataUpdated() {
        }

        @Override
        public void onQueueStatusUpdated() {
        }

        @Override
        public void onPreloadStatusUpdated() {
        }

        @Override
        public void onSendingRemoteMediaRequest() {
        }
    });
}
I haven't played with the Cast SDK much, but I found these Autoplay & Queueing APIs, which might provide what you're looking for, as they describe ways to play videos continuously using autoplay. A rough sketch is below.
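This is my own sketch, not code from those docs: MediaQueueItem.Builder exposes setAutoplay and setPreloadTime, which are the knobs those APIs describe. The 20-second preload value is an arbitrary choice.
// Sketch: build each queue item so it autoplays and preloads ahead of time.
// mediaInfo is assumed to be built exactly as in queuePlay() above.
MediaQueueItem item = new MediaQueueItem.Builder(mediaInfo)
        .setAutoplay(true)     // play this item automatically when it is reached
        .setPreloadTime(20)    // start preloading 20 seconds before the previous item ends
        .build();
queueList.add(item);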

Pause & Resume with Android MediaRecorder (API level < 24)

While using MediaRecorder, we don't have pause/resume for API levels below 24.
So one way to do this is:
On the pause event, stop the recorder, which finalizes the current recorded file.
On resume, start recording again into a new file, and keep doing so until the user presses stop.
Finally, merge all the files (a sketch of the stop/restart part follows this list).
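A minimal sketch of the stop/restart idea, assuming a helper that configures a fresh MediaRecorder for each segment (the newSegmentFile() and configureRecorder() names here are hypothetical placeholders, not from the original post):
private final List<String> segmentPaths = new ArrayList<>();
private MediaRecorder recorder;

// Called from the pause button: finalize the current segment.
private void onPauseRecording() {
    recorder.stop();       // closes and finalizes the current mp4/m4a segment
    recorder.release();
    recorder = null;
}

// Called from the resume button: start a brand-new segment file.
private void onResumeRecording() throws IOException {
    String path = newSegmentFile();    // hypothetical: returns a fresh output path
    segmentPaths.add(path);
    recorder = new MediaRecorder();
    configureRecorder(recorder, path); // hypothetical: sets sources, encoders, output file
    recorder.prepare();
    recorder.start();
}
// After the final stop, merge everything in segmentPaths (see below).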
Many people have asked this question on SO, but I couldn't find a way to solve it. People talk about creating multiple media files by stopping recording on pause and restarting on resume. So my question is: how can we merge/join all those media files programmatically?
Note: in my case the container is MPEG-4 (m4a for audio and mp4 for video).
I tried using SequenceInputStream to merge the InputStreams of the generated files, but the result always contains only the first file.
Code snippet:
// inputStreams: List<InputStream> over the recorded files;
// fileOutputStream: stream for the merged output; buffer/oneByte as usual
Enumeration<InputStream> enu = Collections.enumeration(inputStreams);
SequenceInputStream sqStream = new SequenceInputStream(enu);
while ((oneByte = sqStream.read(buffer)) != -1) {
    fileOutputStream.write(buffer, 0, oneByte);
}
sqStream.close();
while (enu.hasMoreElements()) {
    InputStream element = enu.nextElement();
    element.close();
}
fileOutputStream.flush();
fileOutputStream.close();
I could solve this problem using the mp4parser library. Many thanks to the author of that library :)
(A byte-level concatenation like the one above cannot work: each MP4/M4A file carries its own container metadata, so the container has to be rebuilt around the appended tracks.)
Add the dependency below to your Gradle file:
compile 'com.googlecode.mp4parser:isoparser:1.0.2'
The solution is to stop the recorder when the user pauses and start again on resume, as already mentioned in many other answers on Stack Overflow. Store all the generated audio/video files in an array and use the method below to merge them. The example is taken from the mp4parser library and modified slightly for my needs.
public static boolean mergeMediaFiles(boolean isAudio, String[] sourceFiles, String targetFile) {
    try {
        // "soun" selects audio tracks, "vide" selects video tracks
        String mediaKey = isAudio ? "soun" : "vide";
        List<Movie> listMovies = new ArrayList<>();
        for (String filename : sourceFiles) {
            listMovies.add(MovieCreator.build(filename));
        }
        List<Track> listTracks = new LinkedList<>();
        for (Movie movie : listMovies) {
            for (Track track : movie.getTracks()) {
                if (track.getHandler().equals(mediaKey)) {
                    listTracks.add(track);
                }
            }
        }
        // Append all the selected tracks into one output movie
        Movie outputMovie = new Movie();
        if (!listTracks.isEmpty()) {
            outputMovie.addTrack(new AppendTrack(listTracks.toArray(new Track[listTracks.size()])));
        }
        Container container = new DefaultMp4Builder().build(outputMovie);
        FileChannel fileChannel = new RandomAccessFile(targetFile, "rw").getChannel();
        container.writeContainer(fileChannel);
        fileChannel.close();
        return true;
    } catch (IOException e) {
        Log.e(LOG_TAG, "Error merging media files. exception: " + e.getMessage());
        return false;
    }
}
Pass the isAudio flag as true for audio files and false for video files; a usage example follows.
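For example, merging two recorded video segments into one file (the paths here are hypothetical):
// Hypothetical segment paths produced by the stop/restart recording approach
String[] segments = {"/sdcard/rec_segment1.mp4", "/sdcard/rec_segment2.mp4"};
boolean merged = mergeMediaFiles(false, segments, "/sdcard/rec_merged.mp4"); // false = video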
Another solution is merging with FFmpeg.
Add this line to your app's build.gradle:
implementation 'com.writingminds:FFmpegAndroid:0.3.2'
Then use the code below to merge videos.
String textFile = "";
try {
    textFile = getTextFile().getAbsolutePath();
} catch (IOException e) {
    e.printStackTrace();
}

// ffmpeg concat demuxer: -f concat -safe 0 -i list.txt -c copy output.mp4
String[] cmd = new String[]{
        "-y",
        "-f",
        "concat",
        "-safe",
        "0",
        "-i",
        textFile,
        "-c",
        "copy",
        "-preset",
        "ultrafast",
        getVideoFilePath()};

mergeVideos(cmd);
getTextFile()
private File getTextFile() throws IOException {
    // Write the list file expected by ffmpeg's concat demuxer:
    // one line per clip, of the form: file '/path/to/clip.mp4'
    videoFiles = new String[]{firstPath, secondPath, thirdPatch};
    File file = new File(getActivity().getExternalFilesDir(null), System.currentTimeMillis() + "inputFiles.txt");
    FileOutputStream out = new FileOutputStream(file, false);
    PrintWriter writer = new PrintWriter(out);
    StringBuilder builder = new StringBuilder();
    for (String path : videoFiles) {
        if (path != null) {
            builder.append("file ");
            builder.append("'");
            builder.append(path);
            builder.append("'\n");
        }
    }
    builder.deleteCharAt(builder.length() - 1); // drop the trailing newline
    String text = builder.toString();
    writer.print(text);
    writer.close();
    out.close();
    return file;
}
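For reference, with three clips the generated list file will contain lines in the concat demuxer's format, e.g. (hypothetical paths):
file '/storage/emulated/0/Android/data/com.example/files/clip1.mp4'
file '/storage/emulated/0/Android/data/com.example/files/clip2.mp4'
file '/storage/emulated/0/Android/data/com.example/files/clip3.mp4'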
getVideoFilePath()
private String getVideoFilePath() {
    final File dir = getActivity().getExternalFilesDir(null);
    return (dir == null ? "" : (dir.getAbsolutePath() + "/"))
            + System.currentTimeMillis() + ".mp4";
}
mergeVideos()
private void mergeVideos(String[] cmd) {
    FFmpeg ffmpeg = FFmpeg.getInstance(getActivity());
    try {
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
            @Override
            public void onStart() {
                startTime = System.currentTimeMillis();
            }

            @Override
            public void onProgress(String message) {
            }

            @Override
            public void onFailure(String message) {
                Toast.makeText(getActivity(), "Failed " + message, Toast.LENGTH_SHORT).show();
            }

            @Override
            public void onSuccess(String message) {
            }

            @Override
            public void onFinish() {
                Toast.makeText(getActivity(), "Videos are merged", Toast.LENGTH_SHORT).show();
            }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // Handle if FFmpeg is already running
    }
}
Run this code before merging:
private void checkFfmpegSupport() {
    FFmpeg ffmpeg = FFmpeg.getInstance(this);
    try {
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onStart() {
            }

            @Override
            public void onFailure() {
                Toast.makeText(VouchActivity.this, "FFmpeg not supported on this device :(", Toast.LENGTH_SHORT).show();
            }

            @Override
            public void onSuccess() {
            }

            @Override
            public void onFinish() {
            }
        });
    } catch (FFmpegNotSupportedException e) {
        // Handle if FFmpeg is not supported by the device
    }
}

Audio tracks while merging multiple videos using FFMPEG in Android?

I have a problem with video processing on Android.
To merge multiple videos, I'm currently using the FFmpeg C++ library via JavaCV.
Here is my code:
protected Void doInBackground(String... params) {
    String firstVideo = params[0];
    String secondVideo = params[1];
    String outPutVideo = params[2];
    try {
        FrameGrabber grabber1 = new FFmpegFrameGrabber(firstVideo);
        grabber1.start();
        FrameGrabber grabber2 = new FFmpegFrameGrabber(secondVideo);
        grabber2.start();
        // Recorder configured from the second clip's video parameters
        FrameRecorder recorder2 = new FFmpegFrameRecorder(outPutVideo, grabber2.getImageWidth(),
                grabber2.getImageHeight(), grabber1.getAudioChannels());
        recorder2.setVideoCodec(grabber2.getVideoCodec());
        recorder2.setFrameRate(grabber2.getFrameRate());
        recorder2.setSampleFormat(grabber2.getSampleFormat());
        recorder2.setSampleRate(grabber2.getSampleRate());
        recorder2.setAudioChannels(2);
        recorder2.start();
        Frame frame;
        int j = 0;
        // Copy all frames of the first video, then all frames of the second
        while ((frame = grabber1.grabFrame()) != null) {
            j++;
            recorder2.record(frame);
        }
        while ((frame = grabber2.grabFrame()) != null) {
            recorder2.record(frame);
        }
        recorder2.stop();
        grabber2.stop();
        grabber1.stop();
    } catch (Exception e) {
        e.printStackTrace();
        success = false;
    }
    return null;
}
The first video has no sound; the second has an audio track.
In the merged result, the second video's audio starts at the very beginning of the output instead of where the second video begins.
I have tried and searched for many hours but cannot find a solution. Please advise if you have experience with this!

AudioRecord writing/reading raw PCM data to file

File pcmFile = new File(mediaPath, TEMP_PCM_FILE_NAME);
if (pcmFile.exists())
    pcmFile.delete();
int total = 0;
mAudioRecordInstance.startRecording();
try {
    DataOutputStream pcmDataOutputStream = new DataOutputStream(
            new BufferedOutputStream(new FileOutputStream(pcmFile)));
    while (isRecording) {
        mAudioRecordInstance.read(mBuffer, 0, mBufferSize);
        for (int i = 0; i < mBuffer.length; i++) {
            Log.d("Capture", "PCM Write:[" + i + "]:" + mBuffer[i]);
            pcmDataOutputStream.writeShort(mBuffer[i]);
            total++;
        }
    }
    pcmDataOutputStream.close();
} catch (IOException e) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.ERROR_CREATING_FILE;
            showDialog(e.getValue());
            actionButton.performClick();
        }
    });
    return;
} catch (OutOfMemoryError om) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.OUT_OF_MEMORY;
            showDialog(e.getValue());
            System.gc();
            actionButton.performClick();
        }
    });
}
Log.d("Capture", "Stopping recording!!!");
mAudioRecordInstance.stop();
Log.d("Capture", "Processing starts");
short[] shortBuffer = new short[total];
try {
    DataInputStream pcmDataInputStream = new DataInputStream(
            new BufferedInputStream(new FileInputStream(pcmFile)));
    for (int j = 0; pcmDataInputStream.available() > 0; j++) {
        shortBuffer[j] = pcmDataInputStream.readShort();
        Log.d("Capture", "PCM Read:[" + j + "]:" + shortBuffer[j]);
    }
    outStream.write(Utilities.shortToBytes(shortBuffer));
    pcmDataInputStream.close();
} catch (IOException e) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.ERROR_CREATING_FILE;
            showDialog(e.getValue());
            outFile = null;
            actionButton.performClick();
        }
    });
    return;
} catch (OutOfMemoryError om) {
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            DialogCodes e = DialogCodes.OUT_OF_MEMORY;
            showDialog(e.getValue());
            System.gc();
            actionButton.performClick();
        }
    });
}
I am trying to write PCM data to a temp file so that I can process it later without losing anything recordable. Initially I tried processing in the same recording loop, but the recorded duration didn't match the actual duration. Now what I want is to read shorts from the PCM file and write them to a WAV file with a header (and process the short data later if this issue is fixed). If I open the file in Audacity it comes out empty. If I write directly to the WAV file instead of the temp PCM file, it works fine.
The other issue is that I am using a Handler to run a thread in which I update the recording duration and the VU meter view. I use the mBuffer data to display in the VU meter view, invalidated every second. No synchronization is used on the data, but it still affects the recorded duration; sometimes it comes out to be three times the original duration.
Questions: (1) Why does reading and writing the PCM data through a temp file cause the WAV file to be empty? (2) Why does reading from the unsynchronized short buffer (a member variable) in a Handler-managed thread add duration to the WAV data, even when I write the recorded buffer to the WAV file directly?
It was all in the header:
http://gitorious.org/android-eeepc/base/blobs/48276ab989a4d775961ce30a43635a317052672a/core/java/android/speech/srec/WaveHeader.java
Once I fixed that, everything was fine.
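For reference, a minimal sketch of writing the standard 44-byte RIFF/WAVE header for 16-bit PCM, in the spirit of the linked WaveHeader class (this helper and its parameter names are my own, not the linked code):
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class WavHeader {
    // Writes the standard 44-byte header that precedes raw 16-bit PCM data.
    public static void write(DataOutputStream out, int sampleRate,
                             int channels, int pcmDataLength) throws IOException {
        final int bitsPerSample = 16;
        final int byteRate = sampleRate * channels * bitsPerSample / 8;
        final int blockAlign = channels * bitsPerSample / 8;
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes());            // chunk ID
        b.putInt(36 + pcmDataLength);        // chunk size = total file size - 8
        b.put("WAVE".getBytes());            // format
        b.put("fmt ".getBytes());            // subchunk 1 ID
        b.putInt(16);                        // subchunk 1 size (PCM)
        b.putShort((short) 1);               // audio format: 1 = linear PCM
        b.putShort((short) channels);
        b.putInt(sampleRate);
        b.putInt(byteRate);
        b.putShort((short) blockAlign);
        b.putShort((short) bitsPerSample);
        b.put("data".getBytes());            // subchunk 2 ID
        b.putInt(pcmDataLength);             // size of the PCM payload in bytes
        out.write(b.array());
        // ...then write the PCM samples (little-endian 16-bit) after this header.
    }
}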

MediaPlayer stutters at start of mp3 playback

I've been having a problem playing an mp3 file stored in a raw resource: when the file first starts playing, it generates perhaps a quarter of a second of sound and then restarts. (I know that this is basically a duplicate of the problem described here, but the solution offered there hasn't worked for me.) I have tried several things and have made some progress on the problem, but it isn't totally fixed.
Here's how I'm setting up to play a file:
mPlayer.reset();
try {
    AssetFileDescriptor afd = getResources().openRawResourceFd(mAudioId);
    if (afd == null) {
        Toast.makeText(mOwner, "Could not load sound.",
                Toast.LENGTH_LONG).show();
        return;
    }
    mPlayer.setDataSource(afd.getFileDescriptor(),
            afd.getStartOffset(), afd.getLength());
    afd.close();
    mPlayer.prepare();
} catch (Exception e) {
    Log.d(LOG_TAG, "Could not load sound.", e);
    Toast.makeText(mOwner, "Could not load sound.", Toast.LENGTH_LONG)
            .show();
}
If I exit the activity (which calls mPlayer.release()) and come back to it (creating a new MediaPlayer), the stutter is usually (but not always) gone—provided I load the same sound file. I tried a couple of things that made no difference:
Load the sound file as an asset instead of as a resource.
Create the MediaPlayer using MediaPlayer.create(getContext(), mAudioId) and skip the calls to setDataSource(...) and prepare().
Then I noticed that LogCat always shows this line at about the time that playback starts:
DEBUG/AudioSink(37): bufferCount (4) is too small and increased to 12
It got me wondering if the stuttering is due to the apparent rebuffering. This led me to try something else:
After calling prepare(), call mPlayer.start() and immediately call mPlayer.pause().
To my pleasant surprise, this had a big effect. A great deal of the stutter is gone, plus no sound (that I can hear) is actually played at that point in the process.
However, it still stutters from time to time when I call mPlayer.start() for real. Plus, this seems like a huge kludge. Is there any way to kill this problem completely and cleanly?
EDIT More info; not sure if related. If I call pause() during playback, seek to an earlier position, and call start() again, I hear a short bit (~1/4 sec) of additional sound from where it was paused before it starts playing at the new position. This seems to point to more buffering problems.
Also, the stuttering (and paused buffer) problems show up on emulators from 1.6 through 3.0.
AFAIK the buffers that MediaPlayer creates internally are for storing decompressed samples, not for storing prefetched compressed data. I suspect your stuttering comes from I/O slowness as it loads more MP3 data for decompression.
I recently had to solve a similar problem with video playback. Thanks to MediaPlayer being unable to play an arbitrary InputStream (the API is strangely lame) the solution I came up with was to write a small in-process webserver for serving up local files (on the SD card) over HTTP. MediaPlayer then loads it via a URI of the form http://127.0.0.1:8888/videofilename.
EDIT:
Below is the StreamProxy class I use to feed content into a MediaPlayer instance. The basic use is that you instantiate it, start() it, and set your media player going with something like MediaPlayer.setDataSource("http://127.0.0.1:8888/localfilepath");
I should note that it is rather experimental and probably not entirely bug-free. It was written to solve a similar problem to yours, namely that MediaPlayer cannot play a file that is also being downloaded. Streaming a file locally in this way works around that restriction (i.e. I have a thread downloading the file while the StreamProxy feeds it into mediaplayer).
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketException;
import java.net.SocketTimeoutException;
import java.net.UnknownHostException;
import android.os.AsyncTask;
import android.os.Looper;
import android.util.Log;
public class StreamProxy implements Runnable {
    private static final String TAG = "StreamProxy"; // assumed; TAG was not declared in the snippet
    private static final int SERVER_PORT = 8888;
    private Thread thread;
    private boolean isRunning;
    private ServerSocket socket;
    private int port;

    public StreamProxy() {
        // Create listening socket bound to localhost only
        try {
            socket = new ServerSocket(SERVER_PORT, 0, InetAddress.getByAddress(new byte[]{127, 0, 0, 1}));
            socket.setSoTimeout(5000);
            port = socket.getLocalPort();
        } catch (UnknownHostException e) { // impossible
        } catch (IOException e) {
            Log.e(TAG, "IOException initializing server", e);
        }
    }

    public void start() {
        thread = new Thread(this);
        thread.start();
    }

    public void stop() {
        isRunning = false;
        thread.interrupt();
        try {
            thread.join(5000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void run() {
        Looper.prepare();
        isRunning = true;
        while (isRunning) {
            try {
                Socket client = socket.accept();
                if (client == null) {
                    continue;
                }
                Log.d(TAG, "client connected");
                StreamToMediaPlayerTask task = new StreamToMediaPlayerTask(client);
                if (task.processRequest()) {
                    task.execute();
                }
            } catch (SocketTimeoutException e) {
                // Do nothing
            } catch (IOException e) {
                Log.e(TAG, "Error connecting to client", e);
            }
        }
        Log.d(TAG, "Proxy interrupted. Shutting down.");
    }
    private class StreamToMediaPlayerTask extends AsyncTask<String, Void, Integer> {
        String localPath;
        Socket client;
        int cbSkip;

        public StreamToMediaPlayerTask(Socket client) {
            this.client = client;
        }

        public boolean processRequest() {
            // Read HTTP headers
            String headers = "";
            try {
                headers = Utils.readTextStreamAvailable(client.getInputStream());
            } catch (IOException e) {
                Log.e(TAG, "Error reading HTTP request header from stream:", e);
                return false;
            }
            // Get the important bits from the headers
            String[] headerLines = headers.split("\n");
            String urlLine = headerLines[0];
            if (!urlLine.startsWith("GET ")) {
                Log.e(TAG, "Only GET is supported");
                return false;
            }
            urlLine = urlLine.substring(4);
            int charPos = urlLine.indexOf(' ');
            if (charPos != -1) {
                urlLine = urlLine.substring(1, charPos);
            }
            localPath = urlLine;
            // See if there's a "Range:" header
            for (int i = 0; i < headerLines.length; i++) {
                String headerLine = headerLines[i];
                if (headerLine.startsWith("Range: bytes=")) {
                    headerLine = headerLine.substring(13);
                    charPos = headerLine.indexOf('-');
                    if (charPos > 0) {
                        headerLine = headerLine.substring(0, charPos);
                    }
                    cbSkip = Integer.parseInt(headerLine);
                }
            }
            return true;
        }
        @Override
        protected Integer doInBackground(String... params) {
            long fileSize = GET CONTENT LENGTH HERE; // placeholder: supply the content length
            // Create HTTP header
            String headers = "HTTP/1.0 200 OK\r\n";
            headers += "Content-Type: " + MIME TYPE HERE + "\r\n"; // placeholder: supply the MIME type
            headers += "Content-Length: " + fileSize + "\r\n";
            headers += "Connection: close\r\n";
            headers += "\r\n";
            // Begin with HTTP header
            int fc = 0;
            long cbToSend = fileSize - cbSkip;
            OutputStream output = null;
            byte[] buff = new byte[64 * 1024];
            try {
                output = new BufferedOutputStream(client.getOutputStream(), 32 * 1024);
                output.write(headers.getBytes());
                // Loop as long as there's stuff to send
                while (isRunning && cbToSend > 0 && !client.isClosed()) {
                    // See if there's more to send
                    File file = new File(localPath);
                    fc++;
                    int cbSentThisBatch = 0;
                    if (file.exists()) {
                        FileInputStream input = new FileInputStream(file);
                        input.skip(cbSkip);
                        int cbToSendThisBatch = input.available();
                        while (cbToSendThisBatch > 0) {
                            int cbToRead = Math.min(cbToSendThisBatch, buff.length);
                            int cbRead = input.read(buff, 0, cbToRead);
                            if (cbRead == -1) {
                                break;
                            }
                            cbToSendThisBatch -= cbRead;
                            cbToSend -= cbRead;
                            output.write(buff, 0, cbRead);
                            output.flush();
                            cbSkip += cbRead;
                            cbSentThisBatch += cbRead;
                        }
                        input.close();
                    }
                    // If we did nothing this batch, block for a second
                    if (cbSentThisBatch == 0) {
                        Log.d(TAG, "Blocking until more data appears");
                        Thread.sleep(1000);
                    }
                }
            } catch (SocketException socketException) {
                Log.e(TAG, "SocketException() thrown, proxy client has probably closed. This can exit harmlessly");
            } catch (Exception e) {
                Log.e(TAG, "Exception thrown from streaming task:");
                Log.e(TAG, e.getClass().getName() + " : " + e.getLocalizedMessage());
                e.printStackTrace();
            }
            // Cleanup
            try {
                if (output != null) {
                    output.close();
                }
                client.close();
            } catch (IOException e) {
                Log.e(TAG, "IOException while cleaning up streaming task:");
                Log.e(TAG, e.getClass().getName() + " : " + e.getLocalizedMessage());
                e.printStackTrace();
            }
            return 1;
        }
    }
}
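A hedged usage sketch based on the description above (the local file path is hypothetical, and the URL path must resolve to a readable file on the device):
// Start the proxy, then point MediaPlayer at the loopback URL.
StreamProxy proxy = new StreamProxy();
proxy.start();
try {
    MediaPlayer player = new MediaPlayer();
    player.setDataSource("http://127.0.0.1:8888/sdcard/myvideo.mp4"); // hypothetical path
    player.prepare();
    player.start();
} catch (IOException e) {
    e.printStackTrace();
}
// ...later, when playback is finished:
// proxy.stop();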
Would using prepareAsync and responding to setOnPreparedListener suit you better? Depending on your activity workflow, when the MediaPlayer is first initialized you could set the preparation listener and then call mPlayer.prepareAsync() later once you're actually loading the resource, then start playback there. I use something similar, albeit for a network-based streaming resource:
MediaPlayer m_player;
private ProgressDialog m_progressDialog = null;
...
try {
    if (m_player != null) {
        m_player.reset();
    } else {
        m_player = new MediaPlayer();
    }
    m_progressDialog = ProgressDialog
            .show(this,
                    getString(R.string.progress_dialog_please_wait),
                    getString(R.string.progress_dialog_buffering),
                    true);
    m_player.setOnPreparedListener(this);
    m_player.setAudioStreamType(AudioManager.STREAM_MUSIC);
    m_player.setDataSource(someSource);
    m_player.prepareAsync();
} catch (Exception ex) {
}
...
public void onPrepared(MediaPlayer mp) {
    if (m_progressDialog != null && m_progressDialog.isShowing()) {
        m_progressDialog.dismiss();
    }
    m_player.start();
}
There's obviously more to a complete solution (error-handling, etc.) but I think this should work as a good example to start from that you can pull the streaming out of.
