I am developing an app in which I have to implement live TV streaming. My Google searching has led me to believe that live streaming is not possible before Android 2.1. Is that right?
I have code that streams music with MediaPlayer, where I set the audio stream type like this:
mp.setAudioStreamType(AudioManager.STREAM_MUSIC); // STREAM_MUSIC == 2
But I want to know whether it is sufficient for streaming to just save the file as in the method below:
private void setDataSource(String path) throws IOException {
    if (!URLUtil.isNetworkUrl(path)) {
        mp.setDataSource(path);
    } else {
        Log.i("enter the setdata", "enter the setdata");
        URL url = new URL(path);
        URLConnection cn = url.openConnection();
        cn.connect();
        InputStream stream = cn.getInputStream();
        if (stream == null)
            throw new RuntimeException("stream is null");
        // Copy the entire network stream into a temp file before playback
        File temp = File.createTempFile("mediaplayertmp", "dat");
        String tempPath = temp.getAbsolutePath();
        FileOutputStream out = new FileOutputStream(temp);
        byte[] buf = new byte[128];
        do {
            int numread = stream.read(buf);
            if (numread <= 0)
                break;
            out.write(buf, 0, numread);
        } while (true);
        out.close(); // flush and close the temp file before handing it to MediaPlayer
        mp.setDataSource(tempPath);
        try {
            stream.close();
            Log.i("exit the setdata", "exit the setdata");
        } catch (IOException ex) {
            Log.e(TAG, "error: " + ex.getMessage(), ex);
        }
    }
}
Is there any extra stuff needed for live TV streaming?
Addressing "Is it sufficient": absolutely not.
You're saving all the data from the URL to the device, then playing it back. That works if you can guarantee it's a small clip, but "live TV streaming" implies a stream of unknown length sent at a real-time rate.
The impact of this is:
- An N-minute program will take N minutes to stream to the device before playback starts.
- A long broadcast has the potential to fill up all available storage.
The MediaPlayer.setDataSource(FileDescriptor fd) method should read data from any source you can get a FileDescriptor for, including sockets.
The exact details will vary with the protocol you're using, but essentially you need to read data from the broadcast source, transcode it into a form MediaPlayer can play, and pipe it to the fd.
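For illustration only (this is not code from the question, and not a guaranteed recipe), here is a minimal sketch of the pipe idea using ParcelFileDescriptor.createPipe(). The playFromStream name is mine, and broadcastStream stands in for whatever transcoded InputStream your protocol yields; note that some MediaPlayer builds insist on a seekable descriptor (see the LocalSocket discussion further down), so treat this as a starting point:
// Sketch only (helper name is mine): pump bytes from a hypothetical
// 'broadcastStream' into one end of a pipe and hand the read end to MediaPlayer.
private void playFromStream(final InputStream broadcastStream, MediaPlayer mp)
        throws IOException {
    final ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
    final OutputStream sink = new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]);
    new Thread(new Runnable() {
        @Override
        public void run() {
            byte[] buf = new byte[8192];
            int n;
            try {
                while ((n = broadcastStream.read(buf)) > 0) {
                    sink.write(buf, 0, n); // blocks until MediaPlayer drains the pipe
                }
            } catch (IOException ignored) {
            } finally {
                try { sink.close(); } catch (IOException ignored) { }
            }
        }
    }).start();
    mp.setDataSource(pipe[0].getFileDescriptor());
}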
UPDATE:
I found this post, which details exactly the same problem I am seeing. It turns out that because I use a pipe approach in my DocumentsProvider to stream content from Dropbox, ExoPlayer doesn't know the size of the file ahead of time, and so by default it was not saving the content to the cache.
So I ended up doing what I presume the author did: I created a custom CacheDataSource for these situations that alters the DataSpec.flags variable in that class's open() method:
@Override
public long open(DataSpec dataSpec) throws IOException {
    try {
        key = cacheKeyFactory.buildCacheKey(dataSpec);
        uri = dataSpec.uri;
        actualUri = getRedirectedUriOrDefault(cache, key, /* defaultUri= */ uri);
        httpMethod = dataSpec.httpMethod;
        if (!dataSpec.isFlagSet(DataSpec.FLAG_ALLOW_CACHING_UNKNOWN_LENGTH)) { // <-- update here
            flags = (dataSpec.flags | DataSpec.FLAG_ALLOW_CACHING_UNKNOWN_LENGTH);
        } else {
            flags = dataSpec.flags;
        }
        readPosition = dataSpec.position;
        // ... the rest of the stock open() implementation is unchanged ...
Not the optimal solution, and I also chimed in on the other post asking for a more supported way to indicate that this flag should be set.
But at least now my streamed files are being saved in the cache.
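For completeness, here is a minimal sketch of how such a subclass might be wired in. The ForceCachingDataSource name and its (Cache, DataSource) constructor are my own inventions, mirroring the simplest constructor of the stock CacheDataSource; verify against the ExoPlayer2 version you use:
// Sketch (hypothetical class names): a DataSource.Factory that hands ExoPlayer
// the customized CacheDataSource subclass instead of the stock one.
public final class ForceCachingDataSourceFactory implements DataSource.Factory {
    private final Cache cache;
    private final DataSource.Factory upstreamFactory;

    public ForceCachingDataSourceFactory(Cache cache, DataSource.Factory upstreamFactory) {
        this.cache = cache;
        this.upstreamFactory = upstreamFactory;
    }

    @Override
    public DataSource createDataSource() {
        // ForceCachingDataSource is the hypothetical subclass whose open()
        // forces FLAG_ALLOW_CACHING_UNKNOWN_LENGTH, as shown above.
        return new ForceCachingDataSource(cache, upstreamFactory.createDataSource());
    }
}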
I am implementing a custom CacheDataSourceFactory for ExoPlayer2, in order to cache videos streamed to ExoPlayer.
I have reviewed several posts here; this one was helpful in getting the general approach right for caching a video into the directory of my choice.
I noticed that when handling a URI that resolves to my custom DocumentsProvider, the Cache defined by the CacheDataSourceFactory only ends up holding what looks like a "pointer" or "index" file ("cached_content_index.exi"). Looking inside that file I see the URI of the video streamed by my custom DocumentsProvider, but the actual video is not in the cache.
Here is the relevant portion of my provider; it's quite straightforward:
// Return a descriptor that will stream the file
Timber.d("In openDocument of DropboxProvider for Id: %s, streaming from source", documentId);
ParcelFileDescriptor[] pipe;
try {
    pipe = ParcelFileDescriptor.createPipe();
    // Get an input stream for the pipe
    DbxDownloader downloader = mDbxClient.files().download(fileMetadata.getPathLower(), fileMetadata.getRev());
    new TransferThread(downloader.getInputStream(),
            new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]), signal, fileMetadata.getSize()).start();
    return pipe[0];
} catch (DbxException dbe) {
    Timber.d("Got DbxException when streaming content: %s", dbe.getMessage());
} catch (IOException ioe) {
    Timber.d("Got IOException when streaming content: %s", ioe.getMessage());
} catch (Exception e) {
    Timber.d("Got Exception when streaming content: %s", e.getMessage());
}
return null;
And the TransferThread:
private static class TransferThread extends Thread {
    final InputStream in;
    final OutputStream out;
    final CancellationSignal signal;
    final long size;

    TransferThread(InputStream in, OutputStream out, CancellationSignal signal, long size) {
        this.in = in;
        this.out = out;
        this.signal = signal;
        this.size = size;
    }

    @Override
    public void run() {
        int biteSize = (8 * 1024);
        if (size <= (biteSize * 8)) {
            biteSize = Math.max(((int) (size / (biteSize * 2))) * (biteSize * 2), biteSize);
        }
        Timber.d("TransferThread: File size is: %s, buffer biteSize set to: %d",
                InTouchUtils.getFormattedFileSize(size), biteSize);
        byte[] buf = new byte[biteSize];
        int len;
        try {
            while (((len = in.read(buf)) >= 0) && (signal == null || !signal.isCanceled())) {
                out.write(buf, 0, len);
            }
        } catch (IOException e) {
            // When Glide is used to request a URI that this provider resolves,
            // Glide closes the stream out from under us once it has fetched enough
            // bytes to render a single frame as an image (if the request is for a
            // video), so we swallow that exception here, only logging the error if
            // it isn't that EPIPE (broken pipe due to one end being closed) exception.
            if (!(e.getMessage().contains("EPIPE"))) {
                Timber.d("TransferThread: Got IOException transferring file: %s", e.getMessage());
            }
        } finally {
            try {
                if (in != null) {
                    in.close();
                }
                if (out != null) {
                    out.flush();
                    out.close();
                }
                Timber.d("TransferThread: Finished streaming file.");
            } catch (IOException ioe) {
                Timber.d("TransferThread: Got IOException closing file: %s", ioe.getMessage());
            }
        }
    }
}
Again, ExoPlayer seems quite happy with the ParcelFileDescriptor it receives from the DocumentsProvider in this case: it takes the bytes streamed to it and plays the video. I am just not seeing the video file end up in the cache.
I also tried streaming a video from my Google Drive (which uses the out-of-the-box documents provider from the Storage Access Framework), and this time the video did wind up in the cache.
Since both use the same MediaSource instance, there must be something the Google Docs provider does that tells ExoPlayer to place the streamed video in the cache, which my custom Dropbox DocumentsProvider is not doing.
Does anyone know how to get to the source code of the DocumentsProvider that ships with the Storage Access Framework and manages access to Google Docs? I'd like to see what its openDocument() method is doing.
Is the fact that the Dropbox provider uses a pipe in its ParcelFileDescriptor something ExoPlayer doesn't handle?
Other ideas?
I am using spydroid from https://github.com/fyhertz/spydroid-ipcamera.
The requirement is that streaming should be sent and received on the device, and the RTSP stream should be viewable on the local network, e.g. in VLC Media Player.
The issue I am facing is that when I change the resolution to e.g. 640x480, I get a black screen while streaming live. The default demo supports 320x240, which works fine. I have also changed the bitrate and framerate to match the 640x480 resolution, but still get no result.
Any help would be appreciated.
You might be using the old library that the SpyDroid demo uses.
I had the same issue, and I resolved it the following way.
Steps:
1.) Include the libstreaming library, as it is the latest version and supports Lollipop and above.
2.) Find the H263Stream class and change the following method:
From
#SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
createCamera();
updateCamera();
try {
if (mQuality.resX>=640) {
// Using the MediaCodec API with the buffer method for high resolutions is too slow
mMode = MODE_MEDIARECORDER_API;
}
EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
} catch (Exception e) {
// Fallback on the old streaming method using the MediaRecorder API
Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
mMode = MODE_MEDIARECORDER_API;
return testH264();
}
}
To
#SuppressLint("NewApi")
private MP4Config testMediaCodecAPI() throws RuntimeException, IOException {
createCamera();
updateCamera();
try {
if (mQuality.resX>=1080) {
// Using the MediaCodec API with the buffer method for high resolutions is too slow
mMode = MODE_MEDIARECORDER_API;
}
EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
return new MP4Config(debugger.getB64SPS(), debugger.getB64PPS());
} catch (Exception e) {
// Fallback on the old streaming method using the MediaRecorder API
Log.e(TAG,"Resolution not supported with the MediaCodec API, we fallback on the old streamign method.");
mMode = MODE_MEDIARECORDER_API;
return testH264();
}
}
- The only difference is that the resolution threshold changes from "640" to "1080".
- I don't know the exact reason, but the above change worked for me.
- Let me know if anything else breaks.
In my case the problem was in MediaRecorder:
file: VideoStream.java
method: encodeWithMediaRecorder
MediaRecorder doesn't handle a ParcelFileDescriptor correctly, so I created a local file to save the stream from MediaRecorder:
mMediaRecorder.setOutputFile(this.tmpFileToStream);
//mMediaRecorder.setOutputFile(fd); //disabled
and then I run a new thread to copy bytes from tmpFileToStream to mParcelWrite:
@Override
public void run() {
    FileDescriptor fd = mParcelWrite.getFileDescriptor();
    try {
        InputStream isS = new FileInputStream(tmpFileToStream); // read from the local file
        FileOutputStream outputStream = new FileOutputStream(fd);
        while (!Thread.interrupted()) {
            int content;
            // Copy whatever has been written so far, then sleep briefly and poll
            // again, since MediaRecorder keeps appending to the file.
            while ((content = isS.read()) != -1) {
                outputStream.write(content);
            }
            Thread.sleep(10);
        }
    } catch (Exception e) {
        Log.e(TAG, "E.. " + e.getMessage());
    }
}
/**
 * Video encoding is done by a MediaRecorder.
 */
protected void encodeWithMediaRecorder() throws IOException, ConfNotSupportedException {
    Log.d(TAG, "Video encoded using the MediaRecorder API");
    Log.d(TAG, "Res: " + mRequestedQuality.resX + " x " + mRequestedQuality.resY
            + " frame " + mRequestedQuality.framerate);

    // We need a local socket to forward data output by the camera to the packetizer
    createSockets();

    // Reopens the camera if needed
    destroyCamera();
    createCamera();

    // The camera must be unlocked before the MediaRecorder can use it
    unlockCamera();

    this.tmpFileToStream = this.getOutputMediaFile(MEDIA_TYPE_VIDEO);
    Log.d(TAG, "Video record to " + this.tmpFileToStream.getAbsolutePath());

    try {
        mMediaRecorder = new MediaRecorder();
        mMediaRecorder.setCamera(mCamera);
        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        // mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        mMediaRecorder.setVideoEncoder(mVideoEncoder);
        mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
        mMediaRecorder.setVideoSize(mRequestedQuality.resX, mRequestedQuality.resY);
        mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);

        // The bandwidth actually consumed is often above what was requested
        mMediaRecorder.setVideoEncodingBitRate((int) (mRequestedQuality.bitrate * 0.8));

        // libstreaming normally writes the output of the camera to a local socket
        // instead of a file: that one little trick makes streaming feasible quite
        // simply, since data from the camera can then be manipulated at the other
        // end of the socket.
        FileDescriptor fd = null;
        if (sPipeApi == PIPE_API_PFD) {
            fd = mParcelWrite.getFileDescriptor();
        } else {
            fd = mSender.getFileDescriptor();
        }

        mMediaRecorder.setOutputFile(this.tmpFileToStream); // save locally, then read and stream that
        // mMediaRecorder.setOutputFile(fd); //disabled

        // Copy bytes from the local file to mParcelWrite
        if (tInput == null) {
            this.tInput = new Thread(this);
            this.tInput.start();
        }

        mMediaRecorder.prepare();
        mMediaRecorder.start();
    } catch (Exception e) {
        StringWriter sw = new StringWriter();
        PrintWriter pw = new PrintWriter(sw);
        e.printStackTrace(pw);
        Log.d(TAG, "Error stack " + sw.toString());
        throw new ConfNotSupportedException(e.getMessage());
    }

    InputStream is = null;
    if (sPipeApi == PIPE_API_PFD) {
        is = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
    } else {
        is = mReceiver.getInputStream();
    }

    // This will skip the MPEG4 header; if this step fails we can't stream anything :(
    try {
        byte[] buffer = new byte[4];
        // Skip all atoms preceding the mdat atom
        while (!Thread.interrupted()) {
            while (is.read() != 'm');
            is.read(buffer, 0, 3);
            if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
        }
    } catch (IOException e) {
        Log.e(TAG, "Couldn't skip the mp4 header :/");
        stop();
        throw e;
    }

    // The packetizer encapsulates the bit stream in an RTP stream and sends it over the network
    mPacketizer.setInputStream(is);
    mPacketizer.start();

    mStreaming = true;
}
That's an unusual solution, but it works for other resolutions too.
I'm working on an application which is supposed to run on devices from API 8 to the latest.
I'm currently dealing with MediaPlayer. The code lives in a fragment and is simply:
MediaPlayer mediaPlayer = null;
// Parentheses around the assignment are required; without them != binds first.
if ((mediaPlayer = MediaPlayer.create(getActivity(), myAudioFileUri)) != null) {
    . . .
}
This code works perfectly on Android 4.4.2: MediaPlayer.create() returns a valid value and I can use MediaPlayer without problems.
Unfortunately, MediaPlayer.create() returns null on Android 2.3.7.
That is my problem, and I found nothing on the Internet explaining why this Android version would cause trouble, nor any difference in how the API should be used.
Both tests were done on the GenyMotion emulator, as I don't have such an old Android device.
Edit:
I verified using the adb shell that the problem really comes from the mp3 file's permissions: if I "chmod 777 myfile.mp3", I can successfully play it.
My problem now is knowing how to change permissions on Android 2.3.
The code used to download the file from my remote server and save it locally is the following:
private Uri downloadFileFromURL(URL url, String fileName) {
    try {
        URLConnection conn = url.openConnection();
        HttpURLConnection httpConnection =
                conn instanceof HttpURLConnection ? (HttpURLConnection) conn : null;
        int responseCode = httpConnection.getResponseCode();
        if (responseCode == HttpURLConnection.HTTP_OK) {
            int len, length = 0;
            byte[] buf = new byte[8192];
            InputStream is = httpConnection.getInputStream();
            File file = new File(getActivity().getApplicationContext().getFilesDir().getParentFile().getPath(), fileName);
            OutputStream os = new FileOutputStream(file);
            try {
                while ((len = is.read(buf, 0, buf.length)) > 0) {
                    os.write(buf, 0, len);
                    length += len;
                }
                os.flush();
            } finally {
                is.close();
                os.close();
            }
            String chmodString = "chmod 777 "
                    + getActivity().getApplicationContext().getFilesDir().getParentFile().getPath()
                    + "/" + fileName;
            Process sh = Runtime.getRuntime().exec("su", null, new File("/system/bin/"));
            OutputStream osChgPerms = sh.getOutputStream();
            osChgPerms.write((chmodString).getBytes("ASCII"));
            osChgPerms.flush();
            osChgPerms.close();
            try {
                sh.waitFor();
            } catch (InterruptedException e) {
                Log.d("2ndGuide", "InterruptedException." + e);
            }
            return Uri.fromFile(file);
        }
    } catch (IOException e) {
        Log.d("2ndGuide", "IO Exception." + e);
    }
    return null;
}
But osChgPerms.write((chmodString).getBytes("ASCII")); generates an IOException: broken pipe.
I suppose I didn't understand how to execute the command. What's wrong?
Regards,
I can point you to two possible reasons, though I'm not sure they will solve your issue:
1. Android can only allocate a certain number of MediaPlayer objects; you need to release each MediaPlayer object by calling mediaPlayer.release().
2. Android supports only 8- and 16-bit linear PCM, so check your audio file. More: Supported Media Formats
So in fact the problem clearly comes from the fact that media files must be readable by everybody in order to be playable by the media player.
This behaviour only occurs on pre-HONEYCOMB devices.
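A sketch of what that implies for the download code in the question: instead of shelling out to chmod, the file can be created world-readable with Context.MODE_WORLD_READABLE, which exists precisely for this pre-Honeycomb pattern (it was deprecated in API 17). Note that openFileOutput() writes into getFilesDir() itself rather than its parent directory:
// Sketch: create the downloaded file world-readable so MediaPlayer can open it
// on pre-Honeycomb devices. MODE_WORLD_READABLE is deprecated on API 17+.
OutputStream os = getActivity().getApplicationContext()
        .openFileOutput(fileName, Context.MODE_WORLD_READABLE);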
I made a simple test to have MediaPlayer play some live streaming data via a LocalSocket.
class IOLoop extends Thread {
    @Override
    public void run() {
        try {
            MediaPlayer mPlayer = new MediaPlayer();
            mPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            System.out.println("----======MediaPlayer()============-- ");
            LocalSocket receiver = new LocalSocket();
            System.out.println("----======new LocalSocket()============-- ");
            FileDescriptor fd = receiver.getFileDescriptor();
            System.out.println("----fd============-- ");
            mPlayer.setDataSource(fd); // <-- error
            mPlayer.prepare();
            System.out.println("----=========mPlayer set===============-- ");
        } catch (IOException e) {
            //
        }
    }
}

IOLoop io00 = new IOLoop();
io00.start();
This code fails with an IllegalArgumentException:
02-14 05:16:46.418 20424-20436/com.example.app I/System.out﹕ ----fd============--
02-14 05:16:46.426 20424-20436/com.example.app W/dalvikvm﹕ threadid=10: thread exiting with uncaught exception (group=0xa61ea908)
02-14 05:16:46.426 20424-20436/com.example.app E/AndroidRuntime﹕ FATAL EXCEPTION: Thread-197
java.lang.IllegalArgumentException
at android.media.MediaPlayer.setDataSource(Native Method)
at android.media.MediaPlayer.setDataSource(MediaPlayer.java:976)
at com.example.app.MainActivity$1IOLoop.run(MainActivity.java:51)
So I googled.
Basically, they say a LocalSocket FileDescriptor is not seekable, so it is not adequate as a data source.
However, the Android Developer guide on Media Playback (http://developer.android.com/guide/topics/media/mediaplayer.html) clearly states:
The Android multimedia framework includes support for playing variety of common media types, so that you can easily integrate audio, video and images into your applications. You can play audio or video from media files stored in your application's resources (raw resources), from standalone files in the filesystem, or from a data stream arriving over a network connection, all using MediaPlayer APIs.
So it's a strange situation.
Also, there are voice chat apps like LINE, etc. What is the workaround?
Any thoughts? Thank you.
EDIT:
I found some similar topics:
Can I use MediaPlayer play video from stream line
How do you play Android InputStream on MediaPlayer?
Play AAC buffer on Android
https://code.google.com/p/aacdecoder-android/
What a mess..
EDIT2:
This is a really good project illustrating this area:
https://github.com/fyhertz/libstreaming
I think MediaCodec is the way to go instead of MediaRecorder, etc.
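For reference, a minimal sketch of the MediaCodec direction mentioned above; the MIME type, sample rate and channel count are illustrative, and feeding the input/output buffers is omitted:
// Sketch: configure a MediaCodec AAC decoder. Buffer handling
// (queueInputBuffer / dequeueOutputBuffer) is omitted.
// Note that createDecoderByType() throws IOException.
MediaFormat format = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 2);
MediaCodec codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
codec.configure(format, /* surface */ null, /* crypto */ null, /* flags */ 0);
codec.start();
// ... queue encoded frames in, dequeue PCM out, then:
codec.stop();
codec.release();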
Don't use MediaPlayer. Just take the microphone input and send it to the client at the receiving end of the voice chat. Note that the snippets below use the desktop javax.sound.sampled API (AudioFormat, TargetDataLine, etc.) rather than Android's audio classes. For example, to send the voice:
protected String doInBackground() throws Exception {
    AudioFormat af = new AudioFormat(8000.0f, 8, 1, true, false);
    DataLine.Info info = new DataLine.Info(TargetDataLine.class, af);
    TargetDataLine microphone = (TargetDataLine) AudioSystem.getLine(info);
    microphone.open(af);
    Socket conn = new Socket(SERVER, 3000);
    microphone.start();
    DataOutputStream dos = new DataOutputStream(conn.getOutputStream());
    int bytesRead = 0;
    byte[] soundData = new byte[1];
    Thread inThread = new Thread(new SoundReceiver(conn));
    inThread.start();
    while (bytesRead != -1) {
        bytesRead = microphone.read(soundData, 0, soundData.length);
        if (bytesRead >= 0) {
            dos.write(soundData, 0, bytesRead);
        }
    }
    return null;
}
To receive the voice:
public SoundReceiver(Socket conn) throws Exception {
    connection = conn;
    soundIn = new DataInputStream(connection.getInputStream());
    AudioFormat af = new AudioFormat(8000.0f, 8, 1, true, false);
    DataLine.Info info = new DataLine.Info(SourceDataLine.class, af);
    inSpeaker = (SourceDataLine) AudioSystem.getLine(info);
    inSpeaker.open(af);
}

public void run() {
    int bytesRead = 0;
    byte[] inSound = new byte[1];
    inSpeaker.start();
    while (bytesRead != -1) {
        try {
            bytesRead = soundIn.read(inSound, 0, inSound.length);
        } catch (Exception e) {
        }
        if (bytesRead >= 0) {
            inSpeaker.write(inSound, 0, bytesRead);
        }
    }
}
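On Android itself, the analogous capture class is AudioRecord (android.media). A rough sketch of the sending side under the same 8 kHz mono assumptions, with SERVER and port 3000 carried over from the snippet above:
// Sketch: capture 16-bit 8 kHz mono PCM with AudioRecord and push it to a socket.
int rate = 8000;
int minBuf = AudioRecord.getMinBufferSize(rate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf);
Socket conn = new Socket(SERVER, 3000);
OutputStream out = conn.getOutputStream();
rec.startRecording();
byte[] buf = new byte[minBuf];
int n;
while ((n = rec.read(buf, 0, buf.length)) > 0) {
    out.write(buf, 0, n); // raw PCM; add framing/compression as needed
}
rec.stop();
rec.release();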
When using the version of MediaPlayer.setDataSource() that takes a FileDescriptor, MediaPlayer expects the file descriptor to be a regular, seekable file. A socket is not seekable.
The "data stream arriving over a network connection" bit you quoted from the documentation refers to HTTP/RTSP streaming, so you can either point it at a remote server URL or implement a local server and then do something like mPlayer.setDataSource("http://localhost:12345"), as in the sketch below.
From my Android app I try to download from Windows Azure blob storage using the following URL: http://iclyps.blob.core.windows.net/broadcasts/23_6.mp4
The resulting file is corrupt when I download it from within my app. The same error occurs when I download it using the default browser or Chrome, and also with the Easy Downloader app. Only a download from my PC, or with Firefox Beta on the Android device (or emulator), retrieves the file correctly.
I use the following code (snippet):
try {
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    // set up some things on the connection
    urlConnection.setRequestMethod("GET");
    urlConnection.setDoOutput(true);
    // and connect!
    urlConnection.connect();
    bis = new BufferedInputStream(urlConnection.getInputStream(), BUFSIZE);
    bos = new BufferedOutputStream(
            context.openFileOutput(TMPFILE, Context.MODE_PRIVATE), BUFSIZE);
    /*
     * Read bytes to the buffer in chunks of BUFSIZE bytes until there is
     * nothing more to read. Each chunk is written to the output file.
     */
    byte[] buf = new byte[BUFSIZE];
    int nBytes = 0;
    int tBytes = 0;
    while ((nBytes = bis.read(buf, 0, BUFSIZE)) > 0) {
        bos.write(buf, 0, nBytes);
        tBytes += nBytes;
    }
    if (tBytes == 0) throw new Exception("no bytes received");
    bos.flush();
    MobyLog.d(TAG, "download succeeded: #bytes = " + Integer.toString(tBytes));
    return true;
} catch (Exception e) {
    MobyLog.e(TAG, "download failed: " + e);
    context.deleteFile(TMPFILE); // remove a possibly present partial file
    return false;
} finally {
    if (bis != null) try { bis.close(); } catch (IOException e) { MobyLog.e(TAG, "bis close exception: " + e); }
    if (bos != null) try { bos.close(); } catch (IOException e) { MobyLog.e(TAG, "bos close exception: " + e); }
}
Analyzing the files shows that the first part (about 700 KB) of the original file is repeated a number of times in the corrupted files, resulting in an invalid mp4 file.
Putting the file on another web server (Apache/IIS) and downloading it from there results in a correct download.
Has anyone experienced a similar problem downloading from Azure? Can someone provide a solution?
Cheers,
Harald...
Have you tried using the azure-sdk-for-java in your Android app?
Our scenario is slightly different in that we use the SDK to pull and push images between blob storage and a custom Android app, but the fundamentals should be the same.
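For what it's worth, a rough sketch of a blob download with that SDK; the container and blob names echo the question's URL, connectionString and context are assumed to exist, and the calls should be verified against the SDK version you actually pull in:
// Sketch: download a block blob to app-private storage using the Azure
// Storage SDK for Java. Exception handling is omitted.
CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
CloudBlobClient client = account.createCloudBlobClient();
CloudBlobContainer container = client.getContainerReference("broadcasts");
CloudBlockBlob blob = container.getBlockBlobReference("23_6.mp4");
OutputStream os = context.openFileOutput("23_6.mp4", Context.MODE_PRIVATE);
try {
    blob.download(os);
} finally {
    os.close();
}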