I use the ExoPlayer library for my HLS live broadcast.
I have started a live stream with Wirecast. The broadcast is in its second hour, but when I test the stream with my Android application, playback always starts at 0. How do I get the real duration on the server with ExoPlayer?
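A note and a minimal sketch, assuming ExoPlayer 2.x: with HLS live, the player only sees the sliding playlist window, so the reported duration is the window length rather than the full two-hour broadcast, and position 0 is the start of that window. seekToDefaultPosition() jumps to the live edge; the absolute broadcast time is only recoverable if the playlist carries it (e.g. via EXT-X-PROGRAM-DATE-TIME).

import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.Timeline;

// Sketch assuming ExoPlayer 2.x APIs: inspect the live window, then
// jump to the live edge instead of starting at position 0.
void logLiveWindowAndSeek(SimpleExoPlayer player) {
    Timeline timeline = player.getCurrentTimeline();
    if (!timeline.isEmpty()) {
        Timeline.Window window = new Timeline.Window();
        timeline.getWindow(player.getCurrentWindowIndex(), window);
        // For live streams isDynamic is true and getDurationMs() is the
        // length of the sliding DVR window, not the whole broadcast.
        android.util.Log.d("Live", "dynamic=" + window.isDynamic
                + " windowDurationMs=" + window.getDurationMs()
                + " defaultPositionMs=" + window.getDefaultPositionMs());
    }
    player.seekToDefaultPosition(); // start at the live edge
}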
I have an Android app which plays audio from a server over HTTP streaming. The server supports range headers and serves responses based on range requests.
Can the Android MediaPlayer make requests with a Range header, like the Chrome player does?
We can use setDataSource to add header parameters, but is there any way to get the current byte position MediaPlayer is playing, or how can we do a current-time-to-byte calculation?
Thanks
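For reference, a sketch assuming API 14+ (the Range value and the 128 kbps bitrate are illustrative assumptions): setDataSource can carry a Range header, and since MediaPlayer exposes no byte-level position, the byte offset of a constant-bitrate stream can only be approximated from getCurrentPosition().

import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;
import java.util.HashMap;
import java.util.Map;

void playWithRange(Context context, String streamUrl) throws Exception {
    Map<String, String> headers = new HashMap<String, String>();
    headers.put("Range", "bytes=0-"); // ask the server for a ranged response
    MediaPlayer player = new MediaPlayer();
    player.setDataSource(context, Uri.parse(streamUrl), headers);
    player.prepare();
    player.start();

    // Rough time-to-byte estimate for a CBR stream (128 kbps assumed):
    long bytesPerSecond = 128000 / 8;
    long approxByteOffset = player.getCurrentPosition() / 1000L * bytesPerSecond;
    android.util.Log.d("Stream", "approx byte offset: " + approxByteOffset);
}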
I need to stream audio from an external Bluetooth device and video from the camera to a Wowza server, so that I can then access the live stream through a web app.
I've been able to successfully send other streams to Wowza using the GoCoder library, but as far as I can tell, this library only sends streams that come from the device's camera and mic.
Does anyone have a good suggestion for implementing this?
In the GoCoder Android SDK, the setAudioSource method of WZAudioSource allows you to specify an audio input source other than the default. Here's the relevant API doc for this method:
public void setAudioSource(int audioSource)
Sets the actively configured input device for capturing audio.
Parameters:
audioSource - An identifier for the active audio source. Possible values are those listed at MediaRecorder.AudioSource. The default value is MediaRecorder.AudioSource.CAMCORDER. Note that setting this while audio is actively being captured will have no effect until a new capture session is started. Setting this to an invalid value will cause an error to occur at session begin.
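A usage sketch under assumptions (the WZAudioSource instance, the SCO routing behavior, and the choice of VOICE_COMMUNICATION are not confirmed by the doc quoted above, so treat this as a starting point): bring up Bluetooth SCO so the headset becomes the platform's communication input, then point GoCoder at a source that follows that routing before starting the capture session.

import android.content.Context;
import android.media.AudioManager;
import android.media.MediaRecorder;

// wzAudioSource is an assumed, already-configured WZAudioSource instance.
void useBluetoothMic(Context context, WZAudioSource wzAudioSource) {
    AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    am.startBluetoothSco();     // bring up the SCO audio link
    am.setBluetoothScoOn(true); // route capture through the Bluetooth headset
    // VOICE_COMMUNICATION is typically routed to the active SCO device;
    // per the doc above, set this before starting a new capture session.
    wzAudioSource.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
}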
We have an Android radio app which plays various live radio streams. It works pretty well with Google Chromecast as long as the stream is a "normal" MP3 stream. Playback on the Chromecast Styled Media Receiver is NOT working with SHOUTcast streams, e.g. http://46.105.118.14:13500.
After mRemoteMediaPlayer.load(...) I see a result status code of 1 in my Android LogCat.
In the Chrome debug console I see the following Load metadata error:
[673.080s] [cast.receiver.MediaManager] Load metadata error
cast_receiver.js:18
ib cast_receiver.js:18
gb.Gb cast_receiver.js:18
B.log cast_receiver.js:13
E cast_receiver.js:15
Z.pa cast_receiver.js:71
Eb cast_receiver.js:23
Cb cast_receiver.js:24
(anonymous function) cast_receiver.js:21
Is it possible to play SHOUTcast live streams with Google Chromecast (Styled Media Receiver or Custom Receiver)? If yes, can you give me some hints or point me to an example?
Append a /; after the port of the stream URL so that you get the raw stream data, e.g. http://46.105.118.14:13500/;
I use the LocalCast Android app with this trick to make my Chromecast play SHOUTcast radio.
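A sender-side sketch, assuming the Cast SDK v2 API the question already uses (mRemoteMediaPlayer plus a connected GoogleApiClient): load the URL with the trailing /; and declare it a live MP3 stream.

import com.google.android.gms.cast.MediaInfo;
import com.google.android.gms.cast.RemoteMediaPlayer;
import com.google.android.gms.common.api.GoogleApiClient;

// Assumes apiClient is already connected and the receiver app is launched.
void castShoutcast(GoogleApiClient apiClient, RemoteMediaPlayer mRemoteMediaPlayer) {
    MediaInfo mediaInfo = new MediaInfo.Builder("http://46.105.118.14:13500/;")
            .setContentType("audio/mpeg")              // SHOUTcast serves raw MP3
            .setStreamType(MediaInfo.STREAM_TYPE_LIVE) // live, not seekable media
            .build();
    mRemoteMediaPlayer.load(apiClient, mediaInfo, true /* autoplay */);
}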
I have an Android application that plays HLS (HTTP Live Streaming) videos using VideoView.
I am using a local HTTP proxy to forward HTTP requests from the VideoView to the main HLS server, because my stream's transport segments are encrypted.
Flow of my application:
1. Prepare the VideoView with the local proxy URL, e.g. "http://localhost:9878/index.m3u8".
2. The VideoView sends requests for the M3U8 and TS segments to my proxy.
3. The proxy forwards the M3U8 and TS requests from the VideoView to the HLS server.
4. The proxy checks for transport stream requests and, before responding to the VideoView, decrypts the transport stream and sends it on (see the sketch after this list).
5. The VideoView plays the video stream.
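A sketch of step 4 under assumptions: the segments are AES-128-CBC encrypted with a key and IV known to the proxy (in real HLS these come from the #EXT-X-KEY tag), and the proxy decrypts the TS bytes while streaming them to the VideoView.

import java.io.InputStream;
import java.io.OutputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Key, IV, and cipher mode are assumptions for illustration.
static void pipeDecrypted(InputStream encryptedTs, OutputStream toVideoView,
                          byte[] key, byte[] iv) throws Exception {
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
    cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"),
            new IvParameterSpec(iv));
    InputStream in = new CipherInputStream(encryptedTs, cipher);
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {
        toVideoView.write(buf, 0, n); // plaintext TS flows to the player
    }
    in.close();
}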
This is working properly, but sometimes I get the following error:
output buffer is smaller than decoded data size Out Length
When I get this error in LogCat, my video becomes garbage (green video).
I usually see this issue when the video stream's bitrate increases. Is there any workaround for this issue?
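One workaround to try, as a sketch (the bandwidth cap is an assumed value to tune per device, and this assumes the proxy also serves the master playlist): since the garbage frames appear when the bitrate jumps, have the proxy rewrite the master playlist and drop variants above what the device's decoder handles reliably.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// maxBandwidth is an assumed per-device cap, e.g. 1500000 (1.5 Mbps).
static String filterMasterPlaylist(String playlist, int maxBandwidth) {
    StringBuilder out = new StringBuilder();
    String[] lines = playlist.split("\n");
    for (int i = 0; i < lines.length; i++) {
        String line = lines[i];
        if (line.startsWith("#EXT-X-STREAM-INF")) {
            Matcher m = Pattern.compile("BANDWIDTH=(\\d+)").matcher(line);
            if (m.find() && Integer.parseInt(m.group(1)) > maxBandwidth) {
                i++; // also skip the variant URI on the following line
                continue;
            }
        }
        out.append(line).append('\n');
    }
    return out.toString();
}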
I am building an Android IP phone application for Android 2.1.
The application provides a simple IP phone function. My program consists of a listener thread which receives commands and polls the user to start the call.
The AudioTrack class receives audio data and plays it back,
while the app records audio data with AudioRecord and streams it out to the other side.
When only one user is streaming to the other, the sound quality is good. However, when the receiver side also starts recording and streaming, both sides hear weird sounds and loud noise, though both sides can still hear what the other says.
Is it unsuitable to use the AudioTrack and AudioRecord classes on the same side? I cannot figure out the problem. Can anyone suggest a solution?
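If the noise is acoustic echo (each side's speaker output leaking back into its microphone, which matches the symptom appearing only once both sides record and play simultaneously), one mitigation on later Android versions is the platform echo canceler. A sketch, assuming API 16+ (not available on Android 2.1, where a headset or a software AEC is the usual workaround):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.media.audiofx.AcousticEchoCanceler;

// API 16+ assumed; the 8 kHz sample rate is a telephony-style assumption.
AudioRecord createEchoCancelledRecorder() {
    int rate = 8000;
    int bufSize = AudioRecord.getMinBufferSize(rate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord recorder = new AudioRecord(
            MediaRecorder.AudioSource.VOICE_COMMUNICATION, // AEC-friendly source
            rate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, bufSize);
    if (AcousticEchoCanceler.isAvailable()) {
        AcousticEchoCanceler aec =
                AcousticEchoCanceler.create(recorder.getAudioSessionId());
        if (aec != null) aec.setEnabled(true); // attach AEC to this session
    }
    return recorder;
}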