I have an RTSP stream on my RTSP server on localhost.
I would like to play this file with the Android MediaPlayer class.
If I do setDataSource("rtsp://localhost/file.sdp"), it works!
My problem is: if I copy the file to my HTTP server and do
setDataSource("http://localhost/file.sdp"),
it does not work! I receive an I/O exception.
String filePath = "http://localhost/file.sdp";
mediaPlayer.setDataSource(filePath);
mediaPlayer.prepare(); // fails here with an IOException
mediaPlayer.start();
If I play this file with the VLC application, it works.
RTSP and HTTP are different protocols, and an HTTP server is not going to serve the data in the same way: it sends HTTP headers and then the raw bytes of the .sdp file, which is only a text description of the session, not the media itself. VLC may be smart enough to parse what it receives and open the stream the SDP describes, but Android's NuPlayer is probably not so sophisticated.
RTSP (Real Time Streaming Protocol) is a protocol for streaming media servers. You can watch live video broadcasts from a remote server on your computer or mobile device over RTSP. The protocol only handles the control of media playback; the media itself usually travels over RTP. Below are some features of RTSP:
It works over a TCP connection
RTSP requests are all sent on the same TCP connection
It has very low end-to-end delay
It is also called "true streaming"
No file is downloaded to the user's system
It plays media in real time
It can carry live broadcasts
Some firewalls block the protocol on the user's machine
HTTP (Hypertext Transfer Protocol) is for transferring files (text, graphic images, sound, video, and other multimedia) on the World Wide Web. It is the protocol browsers use to fetch web pages and their media from remote servers, and we can also watch streaming video over it. Below are some features of HTTP:
It works over a TCP connection
HTTP typically sends each request on a separate TCP connection
It has a higher end-to-end delay than RTSP
Content is served from a standard web server
It supports progressive download from a web server (see the example after this list)
The file is downloaded to the user's system, but playback can start before the download completes
It works through virtually all firewalls because it uses the standard HTTP port
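For illustration, a progressive download is just an ordinary HTTP exchange (host and length invented here); the player starts decoding the body as it arrives:

GET /file.mp4 HTTP/1.1
Host: example.com

HTTP/1.1 200 OK
Content-Type: video/mp4
Content-Length: 1048576

(MP4 bytes follow; playback can begin long before all of them arrive)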
SDP (Session Description Protocol) describes a set of communication endpoints and the media sessions between them. An SDP description carries information about the streaming media in three main parts: the session description, the time description, and the media description. SDP is not itself a transport; it only describes a session, and both HTTP and RTSP can carry an SDP description.
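For illustration, a minimal SDP description might look like this (all values invented): the v=, o=, s= and c= lines form the session description, t= the time description, and m= plus a= the media description.

v=0
o=- 0 0 IN IP4 127.0.0.1
s=Example Stream
c=IN IP4 127.0.0.1
t=0 0
m=audio 5004 RTP/AVP 96
a=rtpmap:96 opus/48000/2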
Related
I'm trying to make a live stream from a Raspberry Pi to Android over the internet.
I searched the web, and I'm able to stream from the Raspberry Pi and read the stream on the mobile device when it is directly connected to the Pi.
But if I want to make it work online, I'm missing something about how to "pipe" this stream through another server.
So mainly I want to find out how to post the stream to a server, and how to retrieve it on a mobile device in real time.
I already checked the following:
Http Live Streaming with the Apache web server
https://raspberrypi.stackexchange.com/questions/7446/how-can-i-stream-h-264-video-from-the-raspberry-pi-camera-module-via-a-web-serve
https://docs.peer5.com/guides/setting-up-hls-live-streaming-server-using-nginx/
You have to forward your port so the stream is reachable from outside your network, for example via port forwarding on your router or by relaying through an external web server. There are tutorials you can find with these keywords:
raspberry pi streaming port forwarding tutorial
Some of the useful links are:
https://www.raspberrypi.org/forums/viewtopic.php?t=56149
https://iot.stackexchange.com/a/1562
https://raspberrypi.stackexchange.com/questions/53954/how-to-connect-to-raspberry-pi-outside-of-local-network-without-port-forwarding
https://raspberrypi.stackexchange.com/a/71493
Especially these:
https://jacobsalmela.com/2014/05/31/raspberry-pi-webcam-using-mjpg-streamer-over-internet/
https://videos.cctvcamerapros.com/raspberry-pi/ip-camera-raspberry-pi-youtube-live-video-streaming-server.html
I'm an Android developer but a noob at Node.js. I want to create a live radio application, so I'm using Android for the client app and Node.js for the server.
Steps:
Android: use MediaRecorder to record video and transfer it to the Node.js server.
Node.js server: receive the stream and forward it to other clients or save it in the database.
I read the MediaRecorder APIs and found the native way to send streaming video to a server:
Socket socket = new Socket("xxx.xxx.x.xxx", 8890);
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
mediaRecorder.setOutputFile(pfd.getFileDescriptor()); // the recording is written straight to the socket
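Put in context, the full wiring would look roughly like this (address, sources, and formats are placeholders; error handling omitted):

Socket socket = new Socket("192.168.1.10", 8890);    // placeholder server address
ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setOutputFile(pfd.getFileDescriptor());     // bytes now flow to the socket
recorder.prepare();
recorder.start();

One caveat I've read about: MP4/3GPP containers normally seek back to finalize their headers, so writing them to a non-seekable socket can produce a stream the receiver cannot play back as-is.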
But this is a raw (native) TCP socket, which does not suit a Node.js Socket.IO server.
I used Socket.IO to set up the socket server and socket.io-client on the Android side to connect to it, but it doesn't support transferring the video stream from Android to the server. Then I found socket.io-stream, which does support stream transfer but has no Android API. So what should I do, or what should I use, to transfer streaming video from Android to a Node.js server? Do I need to build my own Node.js socket library to finish this work?
Any help will be appreciated. Thanks in advance.
I am trying to play a raw live-TV MPEG-2 TS stream via the Google TV media player.
The stream is unbounded (live TV), so there is no Content-Length.
The stream is accessed via a URL that looks like this: http:///livetv?channum=X
This was tested with VLC as a client and worked great. However, using GTV is another story.
The stream's response headers contain Transfer-Encoding: chunked.
Attempting to play that stream in the GTV media player causes the following error:
I/AVAPIMediaPlayer(142): Found HTTP success. Connection is HTTP/1.1, code was 206
I/AVAPIMediaPlayer(142): Found content type video/mpeg
W/AVAPIMediaPlayer(142): Error, reached end of headers before finding required fields.
Looking at the file gtv_curl_transfer_engine.cpp, it seems that v3 has removed support for Transfer-Encoding and only supports/requires a Content-Length.
The previous version of the same file (GTV v2 gtv_curl_transfer_engine.cpp) supported it, but it was removed in the current version.
What was the rationale for removing the support, and how would one work around it?
I was thinking about a set of temp files and chaining MediaPlayer instances for playback, but I would rather limit file-system interactions given the nature of the stream...
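Roughly what I had in mind (segment files, a context, and API level 16+ setNextMediaPlayer assumed):

// Write incoming chunks to temp segment files, then chain players:
MediaPlayer current = MediaPlayer.create(context, Uri.fromFile(segment0));
MediaPlayer next = MediaPlayer.create(context, Uri.fromFile(segment1));
current.setNextMediaPlayer(next); // API 16+: 'next' starts when 'current' completes
current.start();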
From my interactions with Google, there is no plan to change this behavior.
The recommended course of action is to provide the videos in HTTP Live Streaming (HLS) format (m3u8).
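For reference, a minimal live HLS playlist looks something like this (segment names and durations invented); a live playlist is re-fetched periodically, the media sequence number advances, and there is no #EXT-X-ENDLIST tag:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:10.0,
segment42.ts
#EXTINF:10.0,
segment43.ts
#EXTINF:10.0,
segment44.ts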
Does anyone host an Icecast 2.3.2+ server with a chunked-encoding MP3 stream that I can test against?
I'd like to test whether small chunk lengths cause stream stalls in the Android MediaPlayer.
Icecast up to and including version 2.4.1 does not support HTTP chunked encoding at all. Version 2.5.0 will support chunked encoding for HTTP PUT requests (source client side), but there is currently no point in supporting it for GET requests.
You might have confused this with the metadata hack introduced by Shoutcast for MP3 streams, where the actual encoded audio data stream is interrupted at a fixed, so-called metadata interval and metadata is injected.
A player capable of handling such a stream has to signal this to the server through an HTTP request header; if supported, the streaming server informs the client in the response headers about the metadata interval and other parameters. The player must then remove the injected data from the stream on reception, before handing it to a decoder for playback.
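As a rough sketch (the URL is a placeholder, and the de-facto ICY header conventions are assumed), a client that strips the injected metadata looks something like this:

import java.io.DataInputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class IcyClient {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com:8000/stream.mp3"); // placeholder stream URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Icy-MetaData", "1"); // signal metadata support to the server
        int metaInt = conn.getHeaderFieldInt("icy-metaint", -1);
        if (metaInt <= 0) throw new IllegalStateException("server offered no metadata interval");
        DataInputStream in = new DataInputStream(conn.getInputStream());
        byte[] audio = new byte[metaInt];
        while (true) {
            in.readFully(audio);                      // metaInt bytes of plain MP3 data
            int metaLen = in.readUnsignedByte() * 16; // length byte counts 16-byte units
            byte[] meta = new byte[metaLen];
            in.readFully(meta); // e.g. "StreamTitle='...';", zero-padded; must not reach the decoder
        }
    }
}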
Please note that this hack is only necessary for streams that don't have a container with inherent metadata handling. Opus and Ogg/Vorbis streams send metadata natively inside the stream, with no need for such hacks.
I am trying to build a client on Android that will receive RTP streams and play them.
I have searched Stack Overflow and Google and found that the MediaPlayer class can be used for this. But MediaPlayer expects a URL or a file as its data source.
In my scenario, my streaming server sends RTP streams to a particular port on my client.
So, is there any way to get MediaPlayer to play this stream without writing it to a file?
Have you tried it out?
Your 'data source' for MediaPlayer is your RTSP link - rtsp://127.0.0.0:550/mystream.3gp
I have done this with a VideoView; that is just a wrapper around MediaPlayer, so it shouldn't be too different.
MediaPlayer will parse the RTSP URL and start the Android RTSP engine, which communicates with the RTP server and establishes the relevant ports to transfer the data.
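A minimal sketch, assuming a reachable RTSP URL (prepareAsync avoids blocking the UI thread while the session is set up):

MediaPlayer player = new MediaPlayer();
player.setDataSource("rtsp://127.0.0.1:554/mystream.3gp"); // placeholder RTSP URL
player.setOnPreparedListener(mp -> mp.start());            // start once the RTSP session is negotiated
player.prepareAsync();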
The method described above works - you need to supply the rtsp link (you likely won't need the port since RTSP defaults to port 554, not 550).
rtsp://se.rv.er.ip/mystream
Essentially, an RTSP conversation occurs between the client and server (DESCRIBE, SETUP, PLAY), and it carries the same information a .sdp file for an RTP broadcast would.
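Abbreviated, that conversation looks like this (track and session identifiers invented, most headers omitted):

DESCRIBE rtsp://se.rv.er.ip/mystream RTSP/1.0
CSeq: 1

RTSP/1.0 200 OK
CSeq: 1
Content-Type: application/sdp
(SDP body describing the RTP media follows)

SETUP rtsp://se.rv.er.ip/mystream/track1 RTSP/1.0
CSeq: 2
Transport: RTP/AVP;unicast;client_port=5000-5001

PLAY rtsp://se.rv.er.ip/mystream RTSP/1.0
CSeq: 3
Session: 12345678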
You cannot receive the RTP stream this way unless the server sending it also supports RTSP.
If you need more details, I'd recommend just running wireshark on a streaming client or server. I found that helped supplement reading the RFCs.
Good luck!