Does anyone host an Icecast 2.3.2+ server with a chunked-encoding MP3 stream that I can test against?
I'd like to check whether small chunk lengths cause any stream stops in the Android MediaPlayer.
Icecast up to and including version 2.4.1 does not support HTTP chunked encoding at all. Version 2.5.0 will support chunked encoding for HTTP PUT requests (source client side), but there is currently no point in supporting it for GET requests.
You might have confused this with the metadata hack introduced by Shoutcast for MP3 streams. There, the actual encoded audio data stream is interrupted at a fixed, so-called metadata interval, and metadata is injected.
A player capable of handling such a stream has to signal this to the server through an HTTP header. If supported, the streaming server informs the client, via the response HTTP headers, about the metadata interval and other parameters. The player must then remove this injected data from the stream upon reception, before handing it to a decoder for playback.
Please note that this hack is only necessary for streams that don't have a container with inherent metadata handling. Opus and Ogg/Vorbis streams send metadata natively inside the stream, without the need for such hacks.
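For reference, here is a minimal sketch of how a client negotiates and strips that injected metadata, assuming a plain Java HttpURLConnection and a hypothetical stream URL:

import java.io.EOFException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class IcyMetadataReader {
    public static void main(String[] args) throws Exception {
        // Hypothetical mount point; replace with a real Shoutcast/Icecast stream.
        URL url = new URL("http://example.com:8000/stream");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Signal to the server that we can handle injected metadata.
        conn.setRequestProperty("Icy-MetaData", "1");
        InputStream in = conn.getInputStream();
        // If supported, the server announces the metadata interval in its response.
        int metaInt = conn.getHeaderFieldInt("icy-metaint", -1);
        if (metaInt <= 0) {
            return; // no metadata injection; the stream is pure audio data
        }
        byte[] audio = new byte[metaInt];
        while (true) {
            readFully(in, audio, metaInt);    // metaInt bytes of raw audio data
            int lengthByte = in.read();       // next byte: metadata length / 16
            if (lengthByte < 0) break;        // end of stream
            int metaLen = lengthByte * 16;    // 0 means no metadata this time
            if (metaLen > 0) {
                byte[] meta = new byte[metaLen];
                readFully(in, meta, metaLen); // e.g. StreamTitle='...';
                System.out.println(new String(meta, "UTF-8").trim());
            }
            // hand the audio buffer to the decoder here, without the metadata
        }
    }

    private static void readFully(InputStream in, byte[] buf, int len) throws Exception {
        int off = 0;
        while (off < len) {
            int n = in.read(buf, off, len - off);
            if (n < 0) throw new EOFException();
            off += n;
        }
    }
}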
I need an Android app to live-stream an RTSP URL using VideoView or MediaPlayer, with digest authentication. I don't know how to use digest authentication together with an RTSP URL. Is it possible to use digest authentication with MediaPlayer? Any help would be appreciated.
It's broken at present, which is a major problem from my point of view, since there also appears to be no way to connect a stream directly to MediaPlayer (that is, to have some "shim" do that work for you, which might otherwise work, and which would also make it possible to tunnel RTSP over an SSL connection to a remote site, resolving the "wide-open video" issue at the same time).
The issue is that while MediaPlayer will issue a digest for authentication on RTSP requests, what it sends is broken: specifically, there is an EXTRA SPACE at the end of the URI it passes back:
D13.Denninger.Net.51291 > 192.168.4.211.rtsp: Flags [P.], cksum 0x069b (correct), seq 166:526, ack 143, win 256, length 360: RTSP, length: 360
DESCRIBE rtsp://192.168.4.211:554/cam/realmonitor?channel=1&subtype=0 RTSP/1.0
Accept: application/sdp
Authorization: Digest nonce="a2732278fba530ed26e2a278a866fa13", username="karl", uri="rtsp://192.168.4.211:554/cam/realmonitor?channel=1&subtype=0 ", response="311b3d4ea28e643ed0d7e61820d43588"
User-Agent: stagefright/1.2 (Linux;Android 6.0)
CSeq: 2
That space is NOT THERE in the actual URI passed to the Android MediaPlayer code, and it causes the authentication to fail.
There's an AOSP bug filed against this (to which I contributed the above trace, with more details), but until and unless Google fixes it, the only alternative is to find some other player that (1) works with RTSP and either (2) correctly handles digest authentication or (2a) can be attached to a socket, allowing you to write a shim, either remotely or locally, to handle the digest work yourself.
I've yet to find one but would love a pointer if someone has resolved this.
I do not know exactly how to resolve the problem of RTSP mixed with digest authentication.
But for digest authentication over HTTP using URLConnection, there is an implementation that might be helpful for you:
https://gist.github.com/slightfoot/5624590
Theoretically, after you have passed through the authentication process, you could get the data from the stream server and render it.
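As a rough sketch of what such an implementation computes (the credentials and realm here are hypothetical, the nonce is borrowed from the trace above, and the qop extension is omitted for brevity), the RFC 2617 digest response works like this:

import java.security.MessageDigest;

public class DigestAuthSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical values; a real client takes realm and nonce from the
        // server's 401 response (WWW-Authenticate challenge).
        String user = "karl", password = "secret", realm = "camera";
        String nonce = "a2732278fba530ed26e2a278a866fa13";
        String method = "GET", uri = "/cam/realmonitor?channel=1&subtype=0";

        String ha1 = md5(user + ":" + realm + ":" + password);
        String ha2 = md5(method + ":" + uri);
        String response = md5(ha1 + ":" + nonce + ":" + ha2); // no-qop variant

        // Retry the request with this header attached:
        System.out.println("Authorization: Digest username=\"" + user
                + "\", realm=\"" + realm + "\", nonce=\"" + nonce
                + "\", uri=\"" + uri + "\", response=\"" + response + "\"");
    }

    private static String md5(String s) throws Exception {
        byte[] digest = MessageDigest.getInstance("MD5").digest(s.getBytes("UTF-8"));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) sb.append(String.format("%02x", b));
        return sb.toString();
    }
}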
I've just started to build an internet radio receiver application for Android, which works fine for .mp3 and .pls extension URLs.
I find that there are many streaming stations given only by IP address and port number (//101.102.103.104:8080, for example).
My question is: if I receive a large number of bytes from such a URL, how can I determine the type of the audio stream (is it WAV or MP3 or PLS or something else?) in order to apply a decoder?
Thanks
File-name extension is meaningless in HTTP. Sniffing content type from raw data is difficult, not always reliable, and inefficient. The correct way to do this is to look at the Content-Type response header. Here are the typical headers for a stream:
Cache-Control:no-cache
Content-Type:audio/mpeg
Expires:Mon, 26 Jul 1997 05:00:00 GMT
icy-br:256
icy-genre:Drum and Bass Jungle
icy-name:Drum and Bass - Digitally Imported Premium
icy-notice1:<BR>This stream requires Winamp<BR>
icy-notice2:SHOUTcast Distributed Network Audio Server/Linux v1.9.8<BR>
icy-pub:0
icy-url:http://www.di.fm
Pragma:no-cache
Server:Icecast 2.3.3-kh7
The content type audio/mpeg indicates MP3 (or some other MPEG audio stream, but I only ever see MP3). There are many others, such as audio/aacp or audio/ogg.
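A minimal sketch of checking that header before choosing a decoder, using HttpURLConnection and a placeholder address in the style of the question:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class StreamTypeCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder stream address; replace with the station's IP and port.
        URL url = new URL("http://101.102.103.104:8080/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String contentType = conn.getContentType(); // e.g. "audio/mpeg"
        System.out.println("Content-Type: " + contentType);
        InputStream in = conn.getInputStream();
        if ("audio/mpeg".equals(contentType)) {
            // feed `in` to an MP3 decoder
        } else if ("audio/aacp".equals(contentType)) {
            // feed `in` to an AAC+ decoder
        }
        in.close();
        conn.disconnect();
    }
}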
I have an RTSP stream on my RTSP server on localhost.
I would like to play this file with the Android MediaPlayer class.
If I do setDataSource(rtsp://localhost/file.sdp), it works!!
My problem is: if I copy the file to my HTTP server and I do
setDataSource(http://localhost/file.sdp)
it does not work!! I receive an I/O exception.
String filePath = "http://localhost/file.sdp";
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setDataSource(filePath);
mediaPlayer.prepare(); // the I/O exception surfaces here
mediaPlayer.start();
If I play this file with the VLC application, it works.
RTSP and HTTP are different protocols. An HTTP server is not going to serve the data in the same way. It's going to send HTTP headers, etc. VLC may be somehow smart enough to infer the protocol based on the data it receives, but Android's NuPlayer is probably not so sophisticated.
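In other words, keep the rtsp:// URL as the data source. A minimal sketch, using the same URL from the question:

MediaPlayer mediaPlayer = new MediaPlayer();
// The URL scheme is what selects MediaPlayer's streaming engine, so point it
// at the RTSP server itself rather than at an HTTP copy of the .sdp file.
mediaPlayer.setDataSource("rtsp://localhost/file.sdp");
mediaPlayer.prepare();
mediaPlayer.start();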
RTSP "Real Time Streaming Protocol" is for streaming media server. You can watch live video broadcasting from remote server on your computer/mobile device through RTSP protocol. This protocol only handle playback of media files. Below are some features of RTSP:
This works on TCP connection
RTSP requests are all sent on the same TCP connection
This protocol has very less end-to-end delay
This is also called "true streaming"
No file is downloaded to user's system
Play movie in real time
Can do live broadcast
Some firewall blocks this protocol on user's machine
HTTP "Hypertext Transfer Protocol" is for transferring files (text, graphic images, sound, video, and other multimedia files) on the World Wide Web. HTTP protocol communicate between Web pages (Contain text, graphic images, sound, video and other multimedia) hosted on remote server and the user's browsers on their system. We can watch streaming video through HTTP protocol. Below are some features of HTTP:
This works on TCP connection
HTTP will typically send each request on a separate TCP
This protocol has high end-to-end delay as compared to RTSP
Serve content from standard web server
This support progressive download from a web server
File is downloaded to user's system but can start playing before completely downloaded
This works on all firewall because it uses standard HTTP protocol
SDP "Session Description Protocol" consists of a set of communications end points along with a series of interactions among them. SDP contains information about streaming media. It contains mainly three parts about media - Session description, Time description and Media description. SDP not itself is a file type but it is a protocol and responsible for steaming media. HTTP and RTSP both support SDP.
I am trying to play a raw live-TV MPEG2-TS stream via the Google TV media player.
The stream is unbounded (live TV), so there is no Content-Length.
The stream is accessed via a URL that looks like this: http:///livetv?channum=X
This was tested with VLC as a client and worked great. However, using GTV is another story.
The stream response contains the header Transfer-Encoding: chunked.
Attempting to play that stream in the GTV media player causes the following error:
I/AVAPIMediaPlayer(142): Found HTTP success. Connection is HTTP/1.1, code was 206
I/AVAPIMediaPlayer(142): Found content type video/mpeg
W/AVAPIMediaPlayer(142): Error, reached end of headers before finding required fields.
Looking at this file, gtv_curl_transfer_engine.cpp, it seems that v3 has removed the support for Transfer-Encoding and only supports/requires a Content-Length.
The previous version of the same file (GTV v2 gtv_curl_transfer_engine.cpp) supported it, but it was removed in the current version.
What was the rationale for removing the support, and how would one work around it?
I was thinking about a set of temp files and chaining MediaPlayer instances for playback, but I would rather limit file-system interactions, given the nature of the stream...
From my interactions with Google, there is no plan to change this behavior.
The course of action is to provide the videos in HTTP Live Streaming format (m3u8).
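For reference, a minimal (hypothetical) live m3u8 media playlist looks like the sketch below; the segment names are placeholders, the server keeps appending segments, and a live playlist omits the closing EXT-X-ENDLIST tag:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts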
I am trying to build a client on Android that will receive RTP streams and play them.
I have searched on Stack Overflow and Google, and found that the MediaPlayer class can be used for this. But MediaPlayer is used when a URL or a file is the data source.
In my scenario, my streaming server sends RTP streams to a particular port on my client.
So, is there any way to get MediaPlayer to play this stream without writing it to a file?
Have you tried it out?
Your 'data source' for MediaPlayer is your RTSP link: rtsp://127.0.0.0:550/mystream.3gp
I have done this with a VideoView; MediaPlayer is just an abstraction of this, so it shouldn't be too different.
MediaPlayer will parse the RTSP URL and start the Android RTSP engine, which will communicate with the RTP server and establish the relevant ports to transfer the data.
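A minimal sketch of that VideoView approach, assuming a VideoView with id video_view in your layout and using the placeholder URL from above:

import android.net.Uri;
import android.widget.VideoView;

// Inside an Activity's onCreate(), after setContentView():
VideoView videoView = (VideoView) findViewById(R.id.video_view);
videoView.setVideoURI(Uri.parse("rtsp://127.0.0.0:550/mystream.3gp"));
videoView.start(); // the platform's RTSP engine negotiates the session for you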
The method described above works; you likely won't need to specify the port, since RTSP defaults to port 554 (not 550).
rtsp://se.rv.er.ip/mystream
Essentially an RTSP conversation occurs between the client and server (DESCRIBE, SETUP, PLAY) which will contain the information a .sdp file for an RTP broadcast would have.
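For a feel of that conversation, here is a minimal, abridged sketch of the first step, using the placeholder URL from above (the SETUP and PLAY steps follow the same request/response pattern):

DESCRIBE rtsp://se.rv.er.ip/mystream RTSP/1.0
CSeq: 1
Accept: application/sdp

RTSP/1.0 200 OK
CSeq: 1
Content-Type: application/sdp
Content-Length: 460

(the SDP body describing the RTP media follows here)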
You cannot receive an RTP stream unless the server sending it supports RTSP.
If you need more details, I'd recommend just running wireshark on a streaming client or server. I found that helped supplement reading the RFCs.
Good luck!