UPDATE:
I found this post, which describes exactly the problem I am seeing. It turns out that because I am using a pipe in my DocumentsProvider to stream content from Dropbox, ExoPlayer doesn't know the size of the file ahead of time, and so by default it was not saving the content to the cache.
So I ended up doing what I presume the author did - I created a custom CacheDataSource for these situations that alters the DataSpec.flags variable in the open() method of that class:
public long open(DataSpec dataSpec) throws IOException {
try {
key = cacheKeyFactory.buildCacheKey(dataSpec);
uri = dataSpec.uri;
actualUri = getRedirectedUriOrDefault(cache, key, /* defaultUri= */ uri);
httpMethod = dataSpec.httpMethod;
if ( !dataSpec.isFlagSet(DataSpec.FLAG_ALLOW_CACHING_UNKNOWN_LENGTH) ) { // <-- update here
flags = (dataSpec.flags | DataSpec.FLAG_ALLOW_CACHING_UNKNOWN_LENGTH);
} else {
flags = dataSpec.flags;
}
readPosition = dataSpec.position;
It's not the optimal solution, and I also chimed in on the other post to request a more supported way to indicate that this flag should be set.
But at least now my streamed files are being saved in the cache.
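For completeness, wiring the modified class in is just a matter of returning it from a DataSource.Factory. This is only a sketch: MyCacheDataSource is a hypothetical name for the copied CacheDataSource containing the open() change above, and the cache and upstream factory are whatever you already use elsewhere.
public class MyCacheDataSourceFactory implements DataSource.Factory {
    private final Cache cache;
    private final DataSource.Factory upstreamFactory;
    public MyCacheDataSourceFactory(Cache cache, DataSource.Factory upstreamFactory) {
        this.cache = cache;
        this.upstreamFactory = upstreamFactory;
    }
    @Override
    public DataSource createDataSource() {
        // MyCacheDataSource is the copy of CacheDataSource that contains the
        // FLAG_ALLOW_CACHING_UNKNOWN_LENGTH tweak shown in open() above.
        return new MyCacheDataSource(cache, upstreamFactory.createDataSource());
    }
}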
I am implementing a custom CacheDataSourceFactory for ExoPlayer 2 in order to cache videos streamed to ExoPlayer.
I have reviewed several posts here; this one was helpful in getting the general approach right for caching a video into a directory of my choice.
I noticed that when handling a URI that resolves to my custom DocumentsProvider, the cache defined by the CacheDataSourceFactory is only used to store what looks like a "pointer" or "index" file ("cached_content_index.exi"). Looking in that file, I see the URI of the video streamed by my custom DocumentsProvider. However, the actual video is not in the cache.
Here is the relevant portion of my provider; it's quite straightforward:
// Return a descriptor that will stream the file
Timber.d("In openDocument of DropboxProvider for Id: %s, streaming from source", documentId);
ParcelFileDescriptor[] pipe;
try {
pipe = ParcelFileDescriptor.createPipe();
// Get input stream for the pipe
DbxDownloader downloader = mDbxClient.files().download(fileMetadata.getPathLower(), fileMetadata.getRev());
new TransferThread(downloader.getInputStream(), new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]), signal, fileMetadata.getSize()).start();
return pipe[0];
} catch (DbxException dbe) {
Timber.d("Got IDbxException when streaming content: %s", dbe.getMessage());
} catch (IOException ioe) {
Timber.d("Got IOException when streaming content: %s", ioe.getMessage());
} catch (Exception e) {
Timber.d("Got Exception when streaming content: %s", e.getMessage());
}
return null;
And the TransferThread:
private static class TransferThread extends Thread {
final InputStream in;
final OutputStream out;
final CancellationSignal signal;
final long size;
TransferThread(InputStream in, OutputStream out, CancellationSignal signal, long size) {
this.in = in;
this.out = out;
this.signal = signal;
this.size = size;
}
@Override
public void run() {
int biteSize = (8*1024);
if ( size <= (biteSize * 8) ) {
biteSize = Math.max( ((int)(size / (biteSize*2))) * (biteSize * 2), biteSize);
}
Timber.d("TransferThread: File size is: %s, buffer biteSize set to: %d", InTouchUtils.getFormattedFileSize(size), biteSize);
byte[] buf = new byte[biteSize];
int len;
try {
while ( ((len=in.read(buf)) >= 0) && (signal == null || !signal.isCanceled()) ) {
out.write(buf, 0, len);
}
} catch (IOException e) {
// When Glide is used to request a URI where this provider resolves the query,
// it will close the stream out from under us once it has fetched enough bytes
to render a single frame as an image if the URI points to a video, so
// we swallow that exception here, only logging the error if it isn't that EPIPE
// (broken pipe due to one end being closed) exception.
if ( !(e.getMessage().contains("EPIPE"))) {
Timber.d("TransferThread: Got IOException transferring file: %s", e.getMessage());
}
} finally {
try {
if (in != null) {
in.close();
}
if ( out != null ) {
out.flush();
out.close();
}
Timber.d("TransferThread: Finished streaming file.");
} catch (IOException ioe) {
Timber.d("TransferThread: Got IOException closing file: %s", ioe.getMessage());
}
}
}
}
Again - ExoPlayer seems quite happy with the ParcelFileDescriptor it receives from the DocumentsProvider in this case - it takes the bytes streamed to it and plays the video. I am just not seeing the video file end up in the cache.
I also tried an example streaming a video from my Google Drive (which uses the out-of-the-box documents provider from the SAF), and this time the video did wind up in the cache.
Since they both use the same MediaSource instance, there must be something the Google Docs provider does, which my custom Dropbox DocumentsProvider is not doing, that lets ExoPlayer know to place the resulting streamed video in the cache.
Does anyone know how to get to the source code of the DocumentsProvider that ships with the SAF that manages access to Google Docs? I'd like to see what it is doing in its openDocument() method.
Is the fact that the Dropbox provider is utilizing a Pipe in its ParcelFileDescriptor something that ExoPlayer doesn't handle?
Other Ideas?
Related
I am using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is that the stream should be sent and received on the device, and from the local network we should be able to view the RTSP stream, e.g. in VLC Media Player.
The issue I am facing is that when I change the resolution, e.g. to 640*480, I get a black screen while streaming live. The default demo supports 320*240, which works fine. I have also changed the bitrate and framerate to match the 640*480 resolution, but that didn't help.
Any help would be appreciated.
You might be using the old library that the SpyDroid demo uses.
I had the same issue with that code; I fixed it the following way:
Steps:
1.) Include the LibStreaming library, as it is the latest library and supports Lollipop and above.
2.) Find the H263Stream class and change the following method:
From
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
createCamera();
updateCamera();
try {
if (mQuality.resX>=640) {
// Using the MediaCodec API with the buffer method for high resolutions is too slow
mMode = MODE_MEDIARECORDER_API;
}
EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
} catch (Exception e) {
// Fallback on the old streaming method using the MediaRecorder API
Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
mMode = MODE_MEDIARECORDER_API;
return testH264();
}
}
To
@SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
createCamera();
updateCamera();
try {
if (mQuality.resX>=1080) {
// Using the MediaCodec API with the buffer method for high resolutions is too slow
mMode = MODE_MEDIARECORDER_API;
}
EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
} catch (Exception e) {
// Fallback on the old streaming method using the MediaRecorder API
Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
mMode = MODE_MEDIARECORDER_API;
return testH264();
}
}
- You can see the difference: the resolution check was changed from "640" to "1080".
- I don't know the exact reason, but the above change worked for me.
- Let me know if you run into any problems.
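In case it is useful, when you set the quality from your own code it normally goes through libstreaming's SessionBuilder. A minimal sketch based on the library's README example (640*480 at 20 fps and 500 kbps are just illustrative values; adjust the encoder and bitrate to your needs):
Session session = SessionBuilder.getInstance()
        .setContext(getApplicationContext())
        .setSurfaceView(mSurfaceView)
        .setAudioEncoder(SessionBuilder.AUDIO_NONE)
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        // VideoQuality(resX, resY, framerate, bitrate)
        .setVideoQuality(new VideoQuality(640, 480, 20, 500000))
        .build();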
In my case the problem was in MediaRecorder:
file: VideoStream.java
method: encodeWithMediaRecorder
MediaRecorder doesn't support a ParcelFileDescriptor correctly. I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
//mMediaRecorder.setOutputFile(fd); //disable..
and then I run a new thread to copy bytes from tmpFileToStream to mParcelWrite:
public void run()
{
FileDescriptor fd = mParcelWrite.getFileDescriptor();
try {
InputStream isS = new FileInputStream(tmpFileToStream); //read from local file..
FileOutputStream outputStream = new FileOutputStream(fd);
while (!Thread.interrupted()) {
int content;
while ((content = isS.read()) != -1) {
outputStream.write(content);
}
Thread.sleep(10);
}
} catch (Exception e)
{
Log.e(TAG, "E.. " + e.getMessage());
}
}
/**
* Video encoding is done by a MediaRecorder.
*/
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
Log.d(TAG,"Video encoded using the MediaRecorder API");
Log.d(TAG,"Roz" + mRequestedQuality.resX + " x " + mRequestedQuality.resY + " frame " + mRequestedQuality.framerate );
// We need a local socket to forward data output by the camera to the packetizer
createSockets();
// Reopens the camera if needed
destroyCamera();
createCamera();
// The camera must be unlocked before the MediaRecorder can use it
unlockCamera();
this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
Log.d(TAG,"Video record to " + this.tmpFileToStream.getAbsolutePath() );
try {
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
// mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setVideoEncoder(mVideoEncoder);
mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
mMediaRecorder.setVideoSize(mRequestedQuality.resX,mRequestedQuality.resY);
mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
// The bandwidth actually consumed is often above what was requested
mMediaRecorder.setVideoEncodingBitRate((int)(mRequestedQuality.bitrate*0.8));
// We write the output of the camera in a local socket instead of a file !
// This one little trick makes streaming feasible quite simply: data from the camera
// can then be manipulated at the other end of the socket
FileDescriptor fd = null;
if (sPipeApi == PIPE_API_PFD) {
fd = mParcelWrite.getFileDescriptor();
} else {
fd = mSender.getFileDescriptor();
}
mMediaRecorder.setOutputFile(this.tmpFileToStream); // save to a local file.. afterwards read and stream that..
// mMediaRecorder.setOutputFile(fd); //disable..
// copy bytes from local file to mParcelWrite.. :)
if(tInput == null) {
this.tInput = new Thread(this);
this.tInput.start();
}
mMediaRecorder.prepare();
mMediaRecorder.start();
} catch (Exception e) {
StringWriter sw = new StringWriter();
PrintWriter pw = new PrintWriter(sw);
e.printStackTrace(pw);
Log.d(TAG, "Error stack " + sw.toString());
throw new ConfNotSupportedException(e.getMessage());
}
InputStream is = null;
if (sPipeApi == PIPE_API_PFD) {
is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
} else {
is = mReceiver.getInputStream();
}
// This will skip the MPEG4 header; if this step fails we can't stream anything :(
try {
byte buffer[] = new byte[4];
// Skip all atoms preceding mdat atom
while (!Thread.interrupted()) {
while (is.read() != 'm');
is.read(buffer,0,3);
if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
}
} catch (IOException e) {
Log.e(TAG,"Couldn't skip mp4 header :/");
stop();
throw e;
}
// The packetizer encapsulates the bit stream in an RTP stream and send it over the network
mPacketizer.setInputStream(is);
mPacketizer.start();
mStreaming = true;
}
That's an unusual solution, but it works for other resolutions.
I have an app for Android which downloads hundreds of files from the Internet. Some files turn out to be 0-byte after download. The app attempts to detect such cases and delete such files after download but sometimes it fails. The problem is more frequent on Android 4.x devices.
Here is the method which does the downloading. It gets the number of actually read bytes from inputStream.read(buffer).
public class Utils
{
public static class DownloadFileData
{
int nTotalSize;
int nDownloadedSize;
}
public interface ProgressCallback
{
void onProgress(long nCurrent, long nMax);
}
public static boolean downloadFile(String sFileURL, File whereToSave, DownloadFileData fileData, ProgressCallback progressCallback)
{
InputStream inputStream = null;
FileOutputStream fileOutput = null;
try
{
URL url = new URL(sFileURL);
URLConnection connection = url.openConnection();
//set up some things on the connection
connection.setDoOutput(true);
connection.connect();
fileOutput = new FileOutputStream(whereToSave);
inputStream = connection.getInputStream();
fileData.nTotalSize = connection.getContentLength();
fileData.nDownloadedSize = 0;
byte[] buffer = new byte[1024];
int bufferLength = 0; //used to store a temporary size of the buffer
// now, read through the input buffer and write the contents to the file
while ((bufferLength = inputStream.read(buffer)) > 0)
{
// if interrupted, don't download the file further and return
// also restore the interrupted flag so that the caller stopped also
if (Thread.interrupted())
{
Thread.currentThread().interrupt();
return false;
}
// add the data in the buffer to the file in the file output stream
fileOutput.write(buffer, 0, bufferLength);
// add up the size so we know how much is downloaded
fileData.nDownloadedSize += bufferLength;
if (null != progressCallback && fileData.nTotalSize > 0)
{
progressCallback.onProgress(fileData.nDownloadedSize, fileData.nTotalSize);
}
}
return true;
}
catch (FileNotFoundException e)
{
return false; // swallow a 404
}
catch (IOException e)
{
return false; // swallow a 404
}
catch (Throwable e)
{
return false;
}
finally
{
// in any case close input and output streams
if (null != inputStream)
{
try
{
inputStream.close();
inputStream = null;
}
catch (Exception e)
{
}
}
if (null != fileOutput)
{
try
{
fileOutput.close();
fileOutput = null;
}
catch (Exception e)
{
}
}
}
}
Here is the piece of code which processes the downloads. Since the number of read bytes is sometimes incorrect (it is > 0 while the real file has a size of 0 bytes), I check the size of the downloaded file with outputFile.length(). But this again gives a value > 0 even though the file is really 0 bytes. I also tried to just create a new File object and read its size with recheckSizeFile.length(). The size is still reported as > 0 while the file is really 0 bytes.
Utils.DownloadFileData fileData = new Utils.DownloadFileData();
boolean bDownloadedSuccessully = Utils.downloadFile(app.sCurrenltyDownloadedFile, outputFile, fileData, new Utils.ProgressCallback()
{
... // progress bar is updated here
});
if (bDownloadedSuccessully)
{
boolean bIsGarbage = false;
File recheckSizeFile = new File(sFullPath);
long nDownloadedFileSize = Math.min(recheckSizeFile.length(), Math.min(outputFile.length(), fileData.nDownloadedSize));
// if the file is 0bytes, it's garbage
if (0 == nDownloadedFileSize)
{
bIsGarbage = true;
}
// if this is a video and if of suspiciously small size, it's
// garbage, too
else if (Utils.isStringEndingWith(app.sCurrenltyDownloadedFile, App.VIDEO_FILE_EXTENSIONS) && nDownloadedFileSize < Constants.MIN_NON_GARBAGE_VIDEO_FILE_SIZE)
{
bIsGarbage = true;
}
if (bIsGarbage)
{
++app.nFilesGarbage;
app.updateLastMessageInDownloadLog("File is fake, deleting: " + app.sCurrenltyDownloadedFile);
// delete the garbage file
if (null != outputFile)
{
if (!outputFile.delete())
{
Log.e("MyService", "Failed to delete garbage file " + app.sCurrenltyDownloadedFile);
}
}
}
else
{
... // process the normally downloaded file
}
I am not sure but I think there is a bug in Android with reading file size. Has anyone seen a similar problem? Or am I maybe doing something wrong here?
Thanks!
EDIT: how I determine that the files are 0-byte:
All the files which get downloaded go through the described routines. When I later view the download folder with a file browser (Ghost Commander), some of the files (maybe 10%) are 0-byte. They can't be played by a video player (they show a "broken file" icon).
It looks to me like your problem is that you only check for "garbage" files if the Utils.downloadFile call returns true. If the download fails in the getInputStream call or the first read, you will have created a file with zero length which will never be deleted.
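For example, a small sketch of that cleanup, reusing the variable names from the question:
if (!bDownloadedSuccessully)
{
    // A failure in getInputStream() or the first read() can still leave an
    // empty file behind, so clean it up here too, not only in the "garbage" branch.
    if (null != outputFile && outputFile.exists() && 0 == outputFile.length())
    {
        if (!outputFile.delete())
        {
            Log.e("MyService", "Failed to delete empty file after failed download: " + app.sCurrenltyDownloadedFile);
        }
    }
}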
You should call flush() on your FileOutputStream to ensure that all data is written to the file. This should make your issue with 0-byte files occur less often.
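For example, a sketch of the question's finally block with the extra calls added (getFD().sync() additionally asks the kernel to commit the bytes to storage):
if (null != fileOutput)
{
    try
    {
        fileOutput.flush();          // hand any buffered bytes to the OS
        fileOutput.getFD().sync();   // optionally force them onto storage before closing
        fileOutput.close();
        fileOutput = null;
    }
    catch (Exception e)
    {
    }
}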
Checking for 0-byte files using File.length() should work properly. Can you open a shell (adb shell) on the device and run ls -l to check whether the byte count it displays is 0 (maybe your file manager has some weird issue)? Also please verify (with a debugger or some log statements) that sFullPath contains the correct file paths. I can't see where sFullPath gets set in your code above, or why you don't just use outputFile instead of recreating another File object.
I started developing an Android app that has to interact with MMS attachments; in particular, it has to get attachments such as text, bitmaps, audio, video, etc. and store them on the phone in a specific folder.
So I started reading some books and some posts on the web, but it isn't a very common topic, and I didn't find an official way to do what I want to do.
I found a fairly good article here on Stack Overflow: How to Read MMS Data in Android?. It works very well for me, but there are two problems:
The article shows how to get MMS data by querying the "hidden" SMS/MMS content provider, and as far as I know Google doesn't guarantee that the current structure will be kept in every future Android release.
The article only explains how to get text data and bitmap data from an MMS... what about video/audio? I tried to get a video/audio stream from an InputStream the way the example does with bitmaps, unfortunately with no luck...
I'm very disappointed by the absence of an official tutorial or how-to on this topic, because SMS and MMS management is a very common need in mobile development.
I hope someone can help me....
Thanks in advance!!
I found a fairly simple way to read video/audio data from an MMS, so I decided to publish this part of my class that provides MMS attachments, for all users that need it.
private static final int RAW_DATA_BLOCK_SIZE = 16384; //Set the block size used to write a ByteArrayOutputStream to byte[]
public static final int ERROR_IO_EXCEPTION = 1;
public static final int ERROR_FILE_NOT_FOUND = 2;
public static byte[] LoadRaw(Context context, Uri uri, int Error){
InputStream inputStream = null;
byte[] ret = new byte[0];
//Open inputStream from the specified URI
try {
inputStream = context.getContentResolver().openInputStream(uri);
//Try read from the InputStream
if(inputStream!=null)
ret = InputStreamToByteArray(inputStream);
}
catch (FileNotFoundException e1) {
Error = ERROR_FILE_NOT_FOUND;
}
catch (IOException e) {
Error = ERROR_IO_EXCEPTION;
}
finally{
if (inputStream != null) {
try {
inputStream.close();
}
catch (IOException e) {
//Problem on closing stream.
//The return state does not change.
Error = ERROR_IO_EXCEPTION;
}
}
}
//Return
return ret;
}
//Create a byte array from an open inputStream. Read blocks of RAW_DATA_BLOCK_SIZE byte
private static byte[] InputStreamToByteArray(InputStream inputStream) throws IOException{
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[RAW_DATA_BLOCK_SIZE];
while ((nRead = inputStream.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, nRead);
}
buffer.flush();
return buffer.toByteArray();
}
In this way you can extract "Raw" data such as Audio/Video/Images from MMS by passing:
the context where you need to use this function
the URI of the MMS part that contains data you want to extract (for ex. "content://mms/part/2")
the parameter intended to return an error code from the procedure (note that Java passes int by value, so to actually receive the code the caller would need an int[] or a small holder object).
Once you have your byte[], you can create an empty File and then use a FileOutputStream to write the byte[] into it. If the file path/extension is correct and your app has all the right permissions, you'll be able to store your data.
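For example, a minimal sketch of that last step (the folder name, file name and extension are placeholders; pick the extension that matches the part's content type):
int error = 0;
byte[] data = LoadRaw(context, Uri.parse("content://mms/part/2"), error);
File outDir = new File(Environment.getExternalStorageDirectory(), "MmsAttachments"); // placeholder folder
if (!outDir.exists()) {
    outDir.mkdirs();
}
File outFile = new File(outDir, "attachment.3gp"); // placeholder name and extension
FileOutputStream fos = null;
try {
    fos = new FileOutputStream(outFile);
    fos.write(data);
    fos.flush();
} catch (IOException e) {
    // handle the write error
} finally {
    if (fos != null) {
        try { fos.close(); } catch (IOException ignored) { }
    }
}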
P.S. This procedure has been tested a few times and it worked, but I can't rule out some unhandled exception cases that may produce error states. IMHO it can be improved, too...
I would like to use an arbitrary InputStream as a data source for a MediaPlayer object.
The reason for this is that the InputStream I am using is in fact an authorized HTTPS connection to a media resource on a remote server. Passing the URL in that case will obviously not work as an authentication is required. I can however do the authentication separately and get an InputStream to the resource - problem is what do I do once I have it?
I thought about the option of using a named pipe and passing its FileDescriptor to the setDataSource method of MediaPlayer. Is there a way to create named pipes in Android (without using the NDK)?
Any other suggestion is most welcome.
I think I have found a solution. I would appreciate it if others who are interested would try this on their own and report the results with their device models and SDK version.
I have seen similar posts which direct to this but I thought I would post it anyway since it is newer and seems to work on newer versions of the SDK - so far it works on my Nexus One running Android 2.3.6.
The solution relies on buffering the input stream to a local file (I have this file on external storage, but it will probably be possible to place it on internal storage as well) and providing that file's descriptor to the MediaPlayer instance.
The following runs in a doInBackground method of some AsyncTask that does AudioPlayback:
@Override
protected
Void doInBackground(LibraryItem... params)
{
...
MediaPlayer player = new MediaPlayer();
setListeners(player);
try {
_remoteStream = getMyInputStreamSomehow();
File tempFile = File.createTempFile(...);
tempFile.deleteOnExit();
_localInStream = new FileInputStream(tempFile);
_localOutStream = new FileOutputStream(tempFile);
int buffered = bufferMedia(
_remoteStream, _localOutStream, BUFFER_TARGET_SIZE // = 128KB for instance
);
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setDataSource(_localInStream.getFD());
player.prepareAsync();
int streamed = 0;
while (buffered >= 0) {
buffered = bufferMedia(
_remoteStream, _localOutStream, BUFFER_TARGET_SIZE
);
}
}
catch (Exception exception) {
// Handle errors as you see fit
}
return null;
}
The bufferMedia method buffers nBytes bytes or until the end of input is reached:
private
int bufferMedia(InputStream inStream, OutputStream outStream, int nBytes)
throws IOException
{
final int BUFFER_SIZE = 8 * (1 << 10);
byte[] buffer = new byte[BUFFER_SIZE]; // TODO: Do static allocation instead
int buffered = 0, read = -1;
while (buffered < nBytes) {
read = inStream.read(buffer);
if (read == -1) {
break;
}
outStream.write(buffer, 0, read);
outStream.flush();
buffered += read;
}
if (read == -1 && buffered == 0) {
return -1;
}
return buffered;
}
The setListeners method sets handlers for various MediaPlayer events. The most important one is the OnCompletionListener, which is invoked when playback is complete. In case of buffer underrun (due to, say, a temporarily slow network connection) the player will reach the end of the local file and transition to the PlaybackCompleted state. I identify those situations by comparing the position of _localInStream against the size of the input stream: if the position is smaller, playback is not really complete and I reset the MediaPlayer:
private
void setListeners(MediaPlayer player)
{
// Set some other listeners as well
player.setOnSeekCompleteListener(
new MediaPlayer.OnSeekCompleteListener()
{
@Override
public
void onSeekComplete(MediaPlayer mp)
{
mp.start();
}
}
);
player.setOnCompletionListener(
new MediaPlayer.OnCompletionListener()
{
@Override
public
void onCompletion(MediaPlayer mp)
{
try {
long bytePosition = _localInStream.getChannel().position();
int timePosition = mp.getCurrentPosition();
int duration = mp.getDuration();
if (bytePosition < _track.size) {
mp.reset();
mp.setDataSource(_localInStream.getFD());
mp.prepare();
mp.seekTo(timePosition);
} else {
mp.release();
}
} catch (IOException exception) {
// Handle errors as you see fit
}
}
}
);
}
Another solution would be to start a proxy HTTP server on localhost. The media player would connect to this server with setDataSource(Context context, Uri uri). This solution works better than the previous one and does not cause playback to glitch.
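To sketch that idea a little further (just an outline, assuming the remote stream can be relayed as-is; getMyInputStreamSomehow() is the same placeholder used above, the MIME type and path are illustrative, and a real proxy would also need to honor Range requests so that seeking works):
private void playViaLocalProxy(Context context) throws IOException {
    final ServerSocket serverSocket = new ServerSocket(0, 1, InetAddress.getByName("127.0.0.1"));
    final int port = serverSocket.getLocalPort();
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                Socket client = serverSocket.accept();
                InputStream remote = getMyInputStreamSomehow();   // the authenticated stream
                OutputStream toPlayer = client.getOutputStream();
                // Answer the player's request with a bare-bones HTTP response.
                toPlayer.write(("HTTP/1.1 200 OK\r\n"
                        + "Content-Type: audio/mpeg\r\n"          // use the real MIME type of your media
                        + "Connection: close\r\n\r\n").getBytes("US-ASCII"));
                byte[] buffer = new byte[8 * 1024];
                int read;
                while ((read = remote.read(buffer)) != -1) {
                    toPlayer.write(buffer, 0, read);
                }
                toPlayer.close();
                client.close();
            } catch (IOException e) {
                // the player disconnected or the remote stream failed
            }
        }
    }).start();
    MediaPlayer player = new MediaPlayer();
    player.setAudioStreamType(AudioManager.STREAM_MUSIC);
    player.setDataSource(context, Uri.parse("http://127.0.0.1:" + port + "/stream"));
    player.prepareAsync();
}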
I am developing an app in which I have to implement live TV streaming. My Google searches have led me to believe that live streaming was not possible before Android 2.1.
Is that right?
I found code for streaming music with MediaPlayer, and I can set the stream type with the call below:
mp.setAudioStreamType(2);
But I want to know: is code like that sufficient for streaming, saving the file as in the method below?
private void setDataSource(String path) throws IOException {
if (!URLUtil.isNetworkUrl(path)) {
mp.setDataSource(path);
} else {
Log.i("enter the setdata","enter the setdata");
URL url = new URL(path);
URLConnection cn = url.openConnection();
cn.connect();
InputStream stream = cn.getInputStream();
if (stream == null)
throw new RuntimeException("stream is null");
File temp = File.createTempFile("mediaplayertmp", "dat");
String tempPath = temp.getAbsolutePath();
FileOutputStream out = new FileOutputStream(temp);
byte buf[] = new byte[128];
do {
int numread = stream.read(buf);
if (numread <= 0)
break;
out.write(buf, 0, numread);
} while (true);
mp.setDataSource(tempPath);
try {
stream.close();
Log.i("exit the setdata","exit the setdata");
}
catch (IOException ex) {
Log.e(TAG, "error: " + ex.getMessage(), ex);
}
}
}
Is there any extra stuff needed for live TV streaming?
Addressing "Is it sufficient": absolutely not.
You're saving all the data from the URL to the device, then playing it back. This works if you can guarantee it's a small clip, but 'live tv streaming' implies we're talking about a stream of unknown length sent at a real-time rate.
The impact of this is:
An N-minute long program will take N minutes to stream to the device before playback starts.
A long broadcast has the potential to fill up all available storage.
The MediaPlayer.setDataSource(FileDescriptor fd) method should read data from any source you can get a FileDescriptor for, including sockets.
The exact details of how to use this will vary based on the protocol you're using, but essentially you need to read data from the broadcast source, transcode it to a suitable form, and pipe it to the fd.
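A rough sketch of that last step using a pipe (ParcelFileDescriptor.createPipe() needs API 9+; openBroadcastSource() is a placeholder for however you obtain the transcoded data, and whether MediaPlayer accepts a non-seekable descriptor depends heavily on the container format, which is why the transcoding step matters):
private void playFromPipe() throws IOException {
    ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
    ParcelFileDescriptor readSide = pipe[0];
    final ParcelFileDescriptor writeSide = pipe[1];
    new Thread(new Runnable() {
        @Override
        public void run() {
            OutputStream out = new ParcelFileDescriptor.AutoCloseOutputStream(writeSide);
            try {
                InputStream in = openBroadcastSource();   // placeholder: your socket/transcoder output
                byte[] buf = new byte[8 * 1024];
                int len;
                while ((len = in.read(buf)) != -1) {
                    out.write(buf, 0, len);
                }
            } catch (IOException e) {
                // the source or the pipe was closed
            } finally {
                try { out.close(); } catch (IOException ignored) { }
            }
        }
    }).start();
    MediaPlayer mp = new MediaPlayer();
    mp.setDataSource(readSide.getFileDescriptor());
    mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer player) {
            player.start();
        }
    });
    mp.prepareAsync();   // prefer async preparation for a live source
}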