I have an Android app which plays audio from a server via HTTP streaming. The server supports range headers and responds based on range requests.
Can the Android MediaPlayer send requests with a Range header, the way the Chrome player does?
We can use setDataSource to add header parameters, but is there any way to get the current byte position that MediaPlayer is playing, or how can we do a current-time-to-byte calculation?
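For illustration, a minimal sketch assuming a constant-bitrate stream: setDataSource(Context, Uri, Map) (available since API 14) passes custom request headers, and the byte offset can only be estimated from the playback position, since MediaPlayer does not expose which byte it is currently reading. The URL and bitrate below are placeholders, not values from the question.
Map<String, String> headers = new HashMap<String, String>();
headers.put("Range", "bytes=0-");                                   // custom request header
mediaPlayer.setDataSource(context, Uri.parse(streamUrl), headers);  // overload available since API 14
mediaPlayer.prepare();
mediaPlayer.start();

// Rough time-to-byte estimate for a constant-bitrate stream:
// bytes ≈ (position in seconds) * (bitrate in bytes per second)
int bitrateBps = 128 * 1000;                                        // assumed 128 kbit/s CBR stream
long positionMs = mediaPlayer.getCurrentPosition();
long approxByteOffset = (positionMs / 1000) * (bitrateBps / 8);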
Thanks
I use the ExoPlayer library for my HLS live broadcast.
I have started a live stream with Wirecast. The broadcast is in its second hour, but when I test the stream with my Android application, it always starts at 0. How do I get the real duration on the server with ExoPlayer?
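For reference, a minimal sketch assuming ExoPlayer 2's Player API (the question does not say which ExoPlayer version is in use): a live HLS window usually reports no fixed duration, and position 0 is the start of the sliding live window rather than the start of the broadcast.
if (player.isCurrentWindowLive()) {
    long durationMs = player.getDuration();   // often C.TIME_UNSET for live streams
    player.seekToDefaultPosition();           // jump to the live edge of the window
}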
We have an Android radio app which plays various live radio streams. It works pretty well with Google Chromecast as long as the stream is a "normal" MP3 stream. Playback on the Chromecast Styled Media Receiver is NOT working with SHOUTcast streams, e.g. http://46.105.118.14:13500.
After mRemoteMediaPlayer.load(...) I see in my Android LogCat a result status code = 1.
In Chrome debug console I see the following Load metadata error:
[673.080s] [cast.receiver.MediaManager] Load metadata error
cast_receiver.js:18
ib cast_receiver.js:18
gb.Gb cast_receiver.js:18
B.log cast_receiver.js:13
E cast_receiver.js:15
Z.pa cast_receiver.js:71
Eb cast_receiver.js:23
Cb cast_receiver.js:24
(anonymous function) cast_receiver.js:21
Is it possible to play SHOUTcast live streams with Google Chromecast (Styled Media Receiver or Custom Receiver)? If yes, can you give me some hints or point me to an example?
Append /; after the port of the stream URL so that you get the raw stream data, e.g. http://46.105.118.14:13500/;
I use the LocalCast Android app with this trick to make my Chromecast play SHOUTcast radio.
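On the sender side this is just the usual load call with the modified URL; here is a minimal sketch assuming the Cast SDK v2 RemoteMediaPlayer from the question (the content type is an assumption for an MP3 SHOUTcast stream):
MediaMetadata metadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MUSIC_TRACK);
metadata.putString(MediaMetadata.KEY_TITLE, "SHOUTcast stream");
MediaInfo mediaInfo = new MediaInfo.Builder("http://46.105.118.14:13500/;")  // note the trailing /;
        .setContentType("audio/mpeg")                  // assumed MP3 stream
        .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
        .setMetadata(metadata)
        .build();
mRemoteMediaPlayer.load(mApiClient, mediaInfo, true);  // true = autoplay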
I am working on an application which plays HLS streams from a server. I have used VideoView to play the stream, but now I need the stream data for some background processing. Is it possible to get the stream data that is currently playing in the VideoView as an InputStream?
I'm creating an Android application for live video streaming between two Android phones. I've already established a socket connection between the devices. I'm capturing video on one device and sending the stream to the other; for now I just want to receive the stream on the other device and save it to a file. I'm recording with MediaRecorder on one device, so to stream to the receiver I'm using a ParcelFileDescriptor object as the output, set up as follows.
Client side code
// Record from camera and microphone and write the MPEG-4 output
// straight to the socket via the ParcelFileDescriptor (pfd).
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mediaRecorder.setOutputFile(pfd.getFileDescriptor());
Receiver side code
pfd = ParcelFileDescriptor.fromSocket(s);
InputStream in = new FileInputStream(pfd.getFileDescriptor());
DataInputStream clientData = new DataInputStream(in);
OutputStream newDatabase = new FileOutputStream(file);
// Use a fixed-size buffer: in.available() often returns 0 right after the
// connection opens, which would make the buffer zero-length and end the
// loop before anything is written.
byte[] buffer = new byte[4096];
int length;
while ((length = in.read(buffer)) > 0)
{
    newDatabase.write(buffer, 0, length);
}
newDatabase.close();
The video file is being created on the receiver-side phone, but it isn't receiving any bytes. Do I have to decode the incoming stream on the receiver side, since the video stream was encoded while recording? If so, how can I decode the stream that is received? I found solutions like MediaExtractor and MediaCodec, but will these work with live video capture? Also, I'm testing on Android 2.3.6 (Gingerbread).
Is it possible to decode the video stream with MediaCodec on 2.3.6, or is some other method available?
The video file is being created on the receiver side mobile, but it's not able to receive any bytes.
If I understand you right, you are getting no data from the socket. That is a separate problem, which has nothing to do with the video format, decoding or encoding.
To debug your sockets, it may be helpful to use a separate application which just dumps the received data. Once the data looks fine, you can go to the next step - decoding the video.
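A minimal sketch of such a dump tool (port and output path are placeholders): it only counts and saves whatever arrives on the socket, so you can confirm that data is flowing at all before worrying about the video format.
// Accept one connection and dump everything it sends to a file,
// logging the running byte count.
ServerSocket serverSocket = new ServerSocket(8000);      // placeholder port
Socket socket = serverSocket.accept();
InputStream in = socket.getInputStream();
FileOutputStream dump = new FileOutputStream("/sdcard/socket_dump.bin");
byte[] buffer = new byte[4096];
long total = 0;
int read;
while ((read = in.read(buffer)) != -1) {
    dump.write(buffer, 0, read);
    total += read;
    Log.d("SocketDump", "received " + total + " bytes");
}
dump.close();
socket.close();
serverSocket.close();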
The second part of the problem is the video format. You are using MP4, which is not usable for streaming. Here is more info about the format structure. You can use MP4 to record a video into a local file and then transfer the whole file over the socket somewhere, but true realtime streaming cannot be done because of the non-seekable nature of the socket (as described in the linked article). There is a block of metadata at the beginning of the file which acts as a "table of contents", and without it the rest of the data is just junk. The problem is, you can assemble the "table of contents" only after you have all the contents. But by that moment the data has already been sent through the socket, and you cannot insert anything at its beginning.
There are a few workarounds, but that's just for your future research - I haven't used them myself yet.
The most intuitive way would be to switch from MP4 to MPEG-TS, a container designed for streaming. Take a look at the hidden constant in MediaRecorder.OutputFormat with the value 8.
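A minimal sketch of that change (the raw value 8 corresponds to MPEG-2 TS; newer SDKs expose it publicly as MediaRecorder.OutputFormat.MPEG_2_TS, but whether the recorder actually supports it depends on the device and Android version):
// On older SDKs the constant is hidden, so the raw value 8 is used directly.
final int OUTPUT_FORMAT_MPEG2TS = 8;                             // MediaRecorder.OutputFormat.MPEG_2_TS on newer SDKs
mediaRecorder.setOutputFormat(OUTPUT_FORMAT_MPEG2TS);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);  // TS typically carries H.264
mediaRecorder.setOutputFile(pfd.getFileDescriptor());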
Another option is to pack the raw H.264 data into RTP/RTCP packets, which is again a protocol designed for streaming. Your application would then also be able to stream to any device that supports this protocol (for example a PC running VLC). For further research, take a look at the Spydroid IP camera, which does exactly that.