I need an Android app to live-stream an RTSP URL using VideoView or MediaPlayer with digest authentication. I don't know how to use digest authentication together with an RTSP URL. Is it possible to use digest authentication with MediaPlayer? Any help would be appreciated.
It's broken at present, which is a major problem from my point of view, since there also appears to be no way to connect a stream directly to MediaPlayer (that is, to have some "shim" do that work for you, which might otherwise work -- and which would also make it possible to tunnel RTSP over an SSL connection to a remote site, resolving the "wide-open video" issue at the same time).
The issue is that while MediaPlayer will send a digest Authorization header on RTSP requests, what it sends is broken -- specifically, there's an EXTRA SPACE at the end of the URI it passes back:
D13.Denninger.Net.51291 > 192.168.4.211.rtsp: Flags [P.], cksum 0x069b (correct), seq 166:526, ack 143, win 256, length 360: RTSP, length: 360
DESCRIBE rtsp://192.168.4.211:554/cam/realmonitor?channel=1&subtype=0 RTSP/1.0
Accept: application/sdp
Authorization: Digest nonce="a2732278fba530ed26e2a278a866fa13", username="karl", uri="rtsp://192.168.4.211:554/cam/realmonitor?channel=1&subtype=0 ", response="311b3d4ea28e643ed0d7e61820d43588"
User-Agent: stagefright/1.2 (Linux;Android 6.0)
CSeq: 2
That space is NOT THERE in the actual URI passed to the Android MediaPlayer code, and it causes the authentication to fail.
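For reference, the digest "response" value is just nested MD5 hashes over the credentials, nonce, method, and URI, so any stray byte in the URI changes the hash the server computes. A minimal sketch of the RFC 2617 computation (without qop; the realm and credentials are placeholders):

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class RtspDigest {

    // Hex-encode an MD5 hash of the given string.
    static String md5Hex(String s) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5")
                .digest(s.getBytes(StandardCharsets.ISO_8859_1));
        StringBuilder sb = new StringBuilder();
        for (byte b : d) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // RFC 2617 digest without qop: response = MD5(HA1 ":" nonce ":" HA2).
    // Because the URI participates in HA2 and must also match the request
    // line, the trailing space stagefright appends makes the server reject
    // the authentication.
    static String digestResponse(String user, String realm, String password,
                                 String method, String uri, String nonce) throws Exception {
        String ha1 = md5Hex(user + ":" + realm + ":" + password);
        String ha2 = md5Hex(method + ":" + uri); // e.g. "DESCRIBE:rtsp://..."
        return md5Hex(ha1 + ":" + nonce + ":" + ha2);
    }
}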
There's an AOSP bug filed against this (to which I contributed the above trace with more details), but until and unless Google fixes it, the only alternative is to find some other player that (1) works with RTSP and either (2) correctly handles digest authentication or (2a) can be attached to a socket, allowing you to write a shim, either remote or local, to handle the digest work yourself.
I've yet to find one but would love a pointer if someone has resolved this.
I do not know exactly how to resolve the problem of RTSP combined with digest authentication.
However, digest authentication with HTTP can be done using URLConnection.
There is this implementation that might be helpful for you.
https://gist.github.com/slightfoot/5624590
Theoretically, after you have passed through the authentication process, you could get the data from the stream server and render it.
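As a complement to that gist, here is a minimal sketch of the plain-JDK route (the URL and credentials are placeholders): a desktop HttpURLConnection will answer a digest challenge by itself once a default Authenticator supplies the credentials. Note that Android's HttpURLConnection generally does not handle digest challenges, which is why a manual implementation like the gist is needed there.

import java.io.InputStream;
import java.net.Authenticator;
import java.net.HttpURLConnection;
import java.net.PasswordAuthentication;
import java.net.URL;

public class DigestHttpExample {
    public static void main(String[] args) throws Exception {
        // The JDK retries the request with a digest Authorization header
        // after receiving the server's 401 challenge.
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication("user", "secret".toCharArray());
            }
        });
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://camera.example.com/stream").openConnection();
        try (InputStream in = conn.getInputStream()) {
            System.out.println("HTTP " + conn.getResponseCode());
        }
    }
}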
We are struggling to get our server set up to stream videos with react-native-video on Android. All the files to be streamed are in .MP4 format and vary in size from 50-100 MB. I am not able to provide the URL to reproduce the issue or to share any of the MP4 files. I have tried the Android player and ExoPlayer, and I am not seeking within the video at this point.
For small videos there doesn't seem to be a problem: the server receives an HTTP request with the following headers from the player:
{
"name":"test.mp4",
"User-Agent":"stagefright/1.2 (Linux;Android 10)",
"Connection":"close",
"X-Forwarded-Proto":"https",
"X-Forwarded-For":**********,
"Host":**********,
"Accept-Encoding":"gzip",
"ssl_session_id":**********,
}
And we serve the test.mp4 file. In this case everything works on the client side (the video plays fine), even though we get the following error on the server side because react-native-video sends the same identical HTTP request again. Why is it sending multiple requests?
Failed to handle request.: java.io.IOException: Broken pipe
....
Failed to send error response: java.lang.IllegalStateException: UT010019: Response already commited
For both identical requests we return a response with the file content as the body, status 200 OK, and the following headers:
{
"Server": "nginx",
"Date": <date>,
"Content-Length": <length in bytes>,
"Connection": "keep-alive",
"Expires": 0,
"Cache-Control": "no-cache, no-store, max-age=0, must-revalidate",
"Pragma": "no-cache",
"Content-Disposition": "attachment; filename="test.mp4",
}
What type of headers is react-native-video expecting?
The situation changes when files get larger (~100 MB). At some point react-native-video starts to send range requests -- when does that happen? The first request has no Range header and is identical to the earlier ones. As usual we send the same response and produce the same errors. From there on, however, the player starts to send repetitive range requests similar to the one described above, but with a Range header like this:
{
...,
"Range":"bytes=1752325-"
}
All the following requests follow the same pattern: bytes=<start-bytes>-, which means requesting bytes from some value to the end. The value <start-bytes> increases with every new request (sometimes there are multiple requests with the same value). Why is it requesting the same bytes over and over again rather than requesting specific ranges?
Our server supports range requests, so we respond with valid file data, status 206 PARTIAL CONTENT, and the same headers with the following addition (values are examples):
{
....,
"Content-Range": "bytes 0-11546/115461173"
}
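For reference, an open-ended request like bytes=1752325- asks for everything from that offset to the end of the file, so the matching 206 must describe exactly that span, and the 200 responses should advertise Accept-Ranges: bytes so the player knows seeking by range is possible. A small helper illustrating the arithmetic (a sketch, not tied to any particular server framework):

// Given an open-ended Range header ("bytes=START-") and the file length,
// build the Content-Range value for the 206 reply. The body must then carry
// exactly (fileLength - start) bytes, and Content-Length must match that count.
static String contentRangeFor(String rangeHeader, long fileLength) {
    long start = Long.parseLong(
            rangeHeader.substring("bytes=".length(), rangeHeader.length() - 1));
    long end = fileLength - 1; // an open-ended range runs to the last byte
    return "bytes " + start + "-" + end + "/" + fileLength;
}

// contentRangeFor("bytes=1752325-", 115461173) -> "bytes 1752325-115461172/115461173"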
What type of response is react-native-video expecting?
With those range requests the server also produces the errors described above, and the video frequently freezes on the client side for minutes. After some time the video starts to play fine and the errors stop appearing on the server side.
There are several questions in the text above that I am confused about. I also could not find any documentation describing how the communication with the player works: what kind of requests it sends and what responses it expects.
Is there any documentation describing the protocol, or could you hint at anything I might be doing wrong here?
"mp4" does not tell the player very much about the file. The player needs information about the codecs, codec features, file layout, and a dozen or so other things. With mp4 this metadata may be at the beginning or end of the file. Its also possible the file you point to is not actually an mp4 so the player needs to check for that as well.
So how does the player get this information? It starts to download the file. Once it has a few bytes, the player may need to cancel the request because the data it needs was not at that location in the file. But there is no cancel mechanism in HTTP, so it just terminates the TCP session; hence java.io.IOException: Broken pipe. It may appear that the player is downloading the same data multiple times, but it isn't, because each request was canceled before it finished. As the player scans the file, it can leap-frog over large blocks, using what it has learned so far to seek out the metadata it needs.
To give the player the best chance of finding the metadata quickly, make sure it is at the start of the file. This is called "fast start". Searching Google for "fast start mp4" will tell you how to do that.
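For example, if you have ffmpeg available, a remux with the +faststart flag moves the metadata (the moov atom) to the front of the file without re-encoding:

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4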
I am streaming from Wowza to mobile (Android), using the Vitamio streaming library. The actual flow is: when my app first hits Wowza, the live stream is not up, but after a while it comes up. I then want to hit Wowza again to check whether the live stream is up or not.
All I want is for my app to hit Wowza every 5 seconds to check whether the stream is up.
It sounds like you just need to query your Wowza server to see if your stream is active or not. If so, then you can use a Wowza REST API command to query Incoming Streams (you need at least version 4.2+).
For example, to query application "live" for all Incoming Streams, you can send the following command via HTTP:
curl -X GET --header 'Accept:application/json; charset=utf-8' http://localhost:8087/v2/servers/_defaultServer_/vhosts/_defaultVHost_/applications/live/instances/_definst_
You would get a response similar to:
{"serverName":"_defaultServer_","incomingStreams":[{"sourceIp":"<hidden>","isPTZEnabled":false,"applicationInstance":"_definst_","name":"wowzademo.stream","isRecordingSet":false,"isStreamManagerStream":true,"isPublishedToVOD":false,"isConnected":true,"ptzPollingInterval":2000}],"outgoingStreams":[],"recorders":[],"streamGroups":[],"name":"_definst_"}
Which shows that I have an Incoming Stream called "wowzademo.stream" (among other things). You can also change your response to be xml instead of json, if you prefer.
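To poll that endpoint every 5 seconds from the client side, something like the following sketch would do. It assumes the localhost URL from the curl example, no authentication, and a naive check of the JSON body for a connected incoming stream; on Android you would run this off the main thread anyway, so a scheduled executor works fine:

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class StreamPoller {
    static final String STATUS_URL = "http://localhost:8087/v2/servers/_defaultServer_"
            + "/vhosts/_defaultVHost_/applications/live/instances/_definst_";

    public static void main(String[] args) {
        ScheduledExecutorService poller = Executors.newSingleThreadScheduledExecutor();
        poller.scheduleWithFixedDelay(() -> {
            try {
                HttpURLConnection conn = (HttpURLConnection) new URL(STATUS_URL).openConnection();
                conn.setRequestProperty("Accept", "application/json; charset=utf-8");
                try (Scanner s = new Scanner(conn.getInputStream()).useDelimiter("\\A")) {
                    String body = s.hasNext() ? s.next() : "";
                    // Crude check: a live source shows up as a connected incoming stream.
                    System.out.println(body.contains("\"isConnected\":true")
                            ? "stream is up" : "no stream yet");
                }
            } catch (IOException e) {
                System.out.println("query failed: " + e.getMessage());
            }
        }, 0, 5, TimeUnit.SECONDS);
    }
}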
Update
In response to your comment, I would add:
make sure that you are on at least version 4.2; I would actually recommend using version 4.3, since there were some REST API fixes in that version;
If you are on version 4.3, then test with disabling authentication by setting <AuthenticationMethod> and <DocumentationServerAuthenticationMethod> to none under the Root/Server/RESTInterface container of the conf/Server.xml file;
Make sure that you add the restUserHTTPHeaders property to have the value "Access-Control-Allow-Origin:*|Access-Control-Allow-Methods:OPTIONS,GET,PUT,DELETE,POST|Access-Control-Allow-Headers:Content-Type".
If the above still does not work for you, try enabling the debug properties by setting <DiagnosticURLEnable> to true, and adding <debugEnable> (set to true/Boolean type) property in the RESTInterface container. You can view the logs generated by checking the access logs in the logs/ directory.
I have not been able to get the Chromecast to connect and play a SHOUTcast stream. It just returns a SERVICE_MISSING error. I have seen a post about adding a / to the end of the URL, but this makes no difference for me. I posted in the Winamp forums; they think it is related to the headers at the beginning of the stream. I am currently using the Default Media Receiver.
Any tricks to get this working?
SHOUTcast servers use a non-standard status line in their response. A normal status line:
HTTP/1.1 200 OK
SHOUTcast status line:
ICY 200 OK
Because of this one difference, many clients fail to handle the response from the server. The solution is to simply not use SHOUTcast. Use Icecast or another server that returns proper HTTP responses.
Once you do get the server compatibility issue solved, you may have another problem of codec compatibility. I am not sure of the Chromecast codecs... you may have to experiment in this area to see what is available to you. Even if you do get the right codec, you have to have a compatible container. For example, a lot of browsers support HE-AAC, but not when wrapped in the ADTS stream wrapper which is often used with SHOUTcast and Icecast servers.
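If replacing the server is not an option, one workaround sometimes used is a tiny relay in front of it that rewrites the status line before the Chromecast sees it. A minimal, single-connection sketch (host and ports are placeholders; a real relay would also forward the client's own request line):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Forwards one canned GET to the SHOUTcast server, rewrites the non-standard
// "ICY" status token to "HTTP/1.1", then pipes the stream through unchanged.
public class IcyRelay {
    public static void main(String[] args) throws IOException {
        try (ServerSocket listener = new ServerSocket(8080)) {
            while (true) {
                try (Socket client = listener.accept();
                     Socket origin = new Socket("shoutcast.example.com", 8000)) {
                    OutputStream toOrigin = origin.getOutputStream();
                    toOrigin.write("GET /; HTTP/1.0\r\n\r\n".getBytes("ISO-8859-1"));
                    toOrigin.flush();

                    InputStream fromOrigin = origin.getInputStream();
                    OutputStream toClient = client.getOutputStream();

                    // Read the status line and normalize "ICY 200 OK".
                    ByteArrayOutputStream line = new ByteArrayOutputStream();
                    int b;
                    while ((b = fromOrigin.read()) != -1 && b != '\n') line.write(b);
                    String status = line.toString("ISO-8859-1").trim()
                            .replaceFirst("^ICY", "HTTP/1.1");
                    toClient.write((status + "\r\n").getBytes("ISO-8859-1"));

                    // Everything after the status line passes through untouched.
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = fromOrigin.read(buf)) != -1) toClient.write(buf, 0, n);
                } catch (IOException perConnection) {
                    // drop this connection, keep listening
                }
            }
        }
    }
}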
Append the two characters /; after the port of the stream URL so you get the raw stream data, e.g. http://46.105.118.14:13500/;
This worked for me with the Default Media Receiver. See answer here.
If you append /;stream/1 it downloads for me
I am trying to play a raw live-TV MPEG-2 TS stream via the Google TV media player.
The stream is unbounded (live TV), so there is no Content-Length.
The stream is accessed via a URL that looks like this: http:///livetv?channum=X
This was tested with VLC as a client and worked great. However, using GTV is another story.
The stream response contains the header Transfer-Encoding: chunked.
Attempting to play that stream in the GTV media player causes the following error:
I/AVAPIMediaPlayer(142): Found HTTP success. Connection is HTTP/1.1, code was 206
I/AVAPIMediaPlayer(142): Found content type video/mpeg
W/AVAPIMediaPlayer(142): Error, reached end of headers before finding required fields.
Looking at this file, gtv_curl_transfer_engine.cpp, it seems that v3 has removed support for Transfer-Encoding and now only supports / requires a Content-Length.
The previous version of the same file (GTV v2 gtv_curl_transfer_engine.cpp) supported it, but that support was removed in the current version.
What was the rationale for removing the support, and how would one work around it?
I was thinking about a set of temp files and chaining MediaPlayer instances for playback, but I would rather limit file-system interactions given the nature of the stream...
From my interactions with Google, there is no plan to change this behavior.
The course of action is to provide the videos in HTTP Live Streaming format (m3u8).
I am trying to build a client on Android that will receive RTP streams and play them.
I have searched Stack Overflow and Google, and found that the MediaPlayer class can be used for this. But MediaPlayer is used when a URL or a file is the data source.
In my scenario, my streaming server sends RTP streams to a particular port on my client.
So, is there any way to get MediaPlayer to play this stream without writing it to a file?
Have you tried it out?
Your 'data source' for MediaPlayer is your RTSP link - rtsp://127.0.0.0:550/mystream.3gp
I have done this with a VideoView; MediaPlayer is just an abstraction of this, so it shouldn't be too different.
MediaPlayer will parse the RTSP URL and start the Android RTSP engine, which will communicate with the RTP server and establish the relevant ports to transfer the data.
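A minimal sketch of that flow, assuming an existing SurfaceView whose SurfaceHolder is passed in (the RTSP link is the placeholder from above):

import java.io.IOException;
import android.media.MediaPlayer;
import android.util.Log;
import android.view.SurfaceHolder;

// Hand the RTSP link straight to MediaPlayer and start playback once the
// RTSP handshake has completed.
void playRtsp(SurfaceHolder surfaceHolder) {
    try {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource("rtsp://127.0.0.0:550/mystream.3gp"); // placeholder link from above
        player.setDisplay(surfaceHolder);                 // render into the existing SurfaceView
        player.setOnPreparedListener(MediaPlayer::start); // fires once the RTSP setup is done
        player.prepareAsync();                            // keeps the network setup off the UI thread
    } catch (IOException e) {
        Log.e("Rtsp", "cannot open stream", e);
    }
}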
The method described above works - you need to supply the rtsp link (you likely won't need the port since RTSP defaults to port 554, not 550).
rtsp://se.rv.er.ip/mystream
Essentially an RTSP conversation occurs between the client and the server (DESCRIBE, SETUP, PLAY), which carries the information a .sdp file for an RTP broadcast would contain; a schematic of that exchange follows.
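Schematically, the requests look like this (paths, track ID, client ports, and session ID are all illustrative; the server answers each with RTSP/1.0 200 OK plus its own headers):

DESCRIBE rtsp://se.rv.er.ip/mystream RTSP/1.0
CSeq: 1

SETUP rtsp://se.rv.er.ip/mystream/trackID=0 RTSP/1.0
CSeq: 2
Transport: RTP/AVP;unicast;client_port=50000-50001

PLAY rtsp://se.rv.er.ip/mystream RTSP/1.0
CSeq: 3
Session: 12345678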
You cannot receive an RTP stream unless the server sending it supports RTSP.
If you need more details, I'd recommend just running wireshark on a streaming client or server. I found that helped supplement reading the RFCs.
Good luck!