I'd like to write an Android application that streams the camera to a PC (H.263, MPEG-4). I found some libraries: sipandroid and jlibrtp.
SIPandroid:
RTP packets are streamed (Wireshark captures them fine on the PC), but VLC can't play the stream.
jlibrtp:
The API is murky, and the stream is not played correctly by VLC.
Maybe these libraries can be adapted to make camera streaming work, or are there other libraries with a clean API and samples?
Thanks for your answer.
VLC has built-in support for RTP, and as @Lukas said, the network interfaces are likely the problem on the VLC side. If you stream everything to one port and listen on that port, you will at least get something. You can also inspect the RTP packets to check that they are well formed.
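One way to get VLC listening on the right port is to describe the stream in a small SDP file and open that file with VLC. This is a minimal sketch; the addresses, port, and payload type are placeholders for whatever your app actually sends:

```
v=0
o=- 0 0 IN IP4 192.168.1.50
s=Android camera test
c=IN IP4 192.168.1.100
t=0 0
m=video 5006 RTP/AVP 96
a=rtpmap:96 H263-1998/90000
```

Here `c=` names the address VLC receives on, and `m=` names the UDP port and dynamic payload type; if the declared codec matches what the phone sends, VLC should at least attempt to decode it.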
VLC itself uses the LiveMedia library, so you may be able to use that.
I would like some advice about the best way to stream a video-only live stream from a server to:
Android (>4.0 is ok)
PC with web-browser
iOS
I would like to keep latency as low as 1/2 second.
I can use:
Flash: works on PC, but not on iOS and not on Android (works only on some tablets)
HLS: not good because of latency
proprietary library: it should work, but I would have to implement it everywhere
RTSP: works only on Android
Any other way? Is a proprietary library the way to go?
I'm working on Linux, but I'm mainly interested in "use this technology" and not "use this code".
Not sure, but you can try HTTP streaming of MP4/3GP files from a web server. Both Android and iOS support HTTP streaming, but you need to implement progressive download.
Please specify on which OS you want to implement your server.
For Windows, you can use the following binary to relocate the moov atom to the beginning of the media file, which enables progressive download:
http://notboring.org/devblog/2009/07/qt-faststartexe-binary-for-windows/
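Assuming it behaves like the stock qt-faststart tool it is built from, usage should be a single command (the file names here are just examples):

```
qt-faststart.exe input.mp4 output.mp4
```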
Let us know your progress.
You can run FFmpeg's ffserver for live broadcast. It gives you various options, which you enable or disable in its configuration file at /etc/ffserver.conf.
Detailed documentation is at
http://ffmpeg.org/ffserver.html
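A minimal ffserver.conf sketch for a single video-only stream; the feed and stream names, bind address, and video settings below are placeholder values, not recommendations:

```
Port 8090
BindAddress 0.0.0.0
MaxClients 100

<Feed camera.ffm>
  File /tmp/camera.ffm
  FileMaxSize 5M
</Feed>

<Stream live.mpg>
  Feed camera.ffm
  Format mpeg
  VideoFrameRate 25
  VideoSize 352x288
  NoAudio
</Stream>
```

You would then publish into the feed with an ffmpeg command along the lines of `ffmpeg -i <source> http://localhost:8090/camera.ffm`, and clients would play http://localhost:8090/live.mpg.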
RTSP might be the way to go, but that half-second latency might be hard to achieve.
For video only, and if you don't buffer at all, this may work; for iOS anyway:
https://github.com/mooncatventures-group/FFPlayer-tests
Android supports RTSP, but its implementation is not very good.
You can compile FFmpeg for Android and write a simple player using OpenGL. I can't share the code because we did it for a client, but it's not too difficult.
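A rough sketch of what the Java side of such a player might look like. Everything here is hypothetical: the class, library name, and native methods don't come from any published library; the actual decoding would live in native FFmpeg code behind JNI, with frames drawn via OpenGL:

```java
// Hypothetical JNI facade for an FFmpeg-backed player. The native methods
// below are illustrative only; you would implement them in C against
// libavformat/libavcodec and render the decoded frames with OpenGL ES.
public class FFmpegPlayer {
    static {
        System.loadLibrary("ffmpegplayer"); // your compiled FFmpeg + JNI glue
    }

    // Opens the URL and returns an opaque native handle, or 0 on failure.
    private native long nativeOpen(String url);

    // Decodes the next frame into an RGB buffer; returns false at end of stream.
    private native boolean nativeDecodeFrame(long handle, int[] rgbPixels);

    // Releases the demuxer/decoder state held by the handle.
    private native void nativeClose(long handle);
}
```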
I want to stream video recorded on my Android phone to a network media server.
The first problem is that when MediaRecorder's output is set to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data in order to produce a valid output stream.
The question is how to proceed from there.
How can I output that stream as an RTMP stream?
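For context, a minimal sketch of the recorder-to-socket setup the question describes, using standard Android APIs (ParcelFileDescriptor.createPipe and MediaRecorder); the encoder settings are placeholders, and the mdat fix-up itself is left to the caller as noted in the comments:

```java
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;
import java.io.IOException;

public class PipeRecorder {
    /**
     * Routes MediaRecorder output into a pipe instead of a file, so a worker
     * thread can read the raw MP4 stream, patch the missing mdat/moov sizes,
     * and forward the fixed stream over the network.
     */
    public static ParcelFileDescriptor startRecording() throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(pipe[1].getFileDescriptor()); // write side of the pipe
        recorder.prepare();
        recorder.start();

        // The caller reads from the returned read side, preprocesses the
        // stream, and hands the result to an RTMP (or RTSP) client library.
        return pipe[0];
    }
}
```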
First, let's unwind your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few side libraries to add support, but these may not be full implementations or have other undesirable side effects and bugs that cause them to fail to meet your needs.
The common alternative in this case is to use RTSP. It provides a comparable session format that has its own RFC, and its packet structure when combined with RTP is very similar (sans some details) to your desired protocol. You could perform the necessary fixups here to transmute RTP/RTSP into RTMP, but as mentioned, such effort is currently outside the development scope of your application.
So, let's assume you would like to use RTMP (invalidating this thread) and that the above-linked library does not meet your needs.
You could, for example, follow this tutorial for recording and playback using Livu, Wowza, and Adobe Flash Player, talking with the Livu developer(s) about licensing their client. Or, you could use this client library and its full Android recorder example to build your client.
To summarize:
RTSP
This thread, using Darwin Media Server, Windows Media Services, or VLC
RTMP
This library
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player
This client library and this example recorder
Best of luck with your application. I admit that I have a less than comprehensive understanding of all of these libraries, but these appear to be the standard solutions in this space at the time of this writing.
Edit:
According to the OP, walking the RTMP library set:
This library: He couldn't make the library demos work. More importantly, RTMP functionality is incomplete.
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player: This has a long tutorial on how to consume video, but its tutorial on publication is potentially terse and insufficient.
This client library and this example recorder: The given example only covers audio publication. More work is needed to make this complete.
In short: more work is needed. Other answers, and improvements upon these examples, are what's needed here.
If you are using a web browser on the Android device, you can use WebRTC for video capture and server-side recording, e.g. with Web Call Server 4.
Thus the full path would be:
Android Chrome [WebRTC] > WCS4 > recording
So you don't need RTMP protocol here.
If you are using a standalone RTMP app, you can use any RTMP server for video recording. As far as I know, Wowza supports H.264+Speex recording.
I want to know: is it mandatory to use a streaming server like Darwin, Wowza, or VLC to stream RTSP live video? I am receiving an RTSP link from my client, and it tends to change every time. I can successfully play it in the VLC player, but on the phone I can't see anything. I tried playing a sample link with a .3gp extension and it worked fine, but my links don't have an extension; they look like this: rtsp://122.166.229.151:1950/1346a0cf0ef7c2. Please help me. If it's compulsory to use an extension or a server, I will continue working in that direction.
A streaming server (as you describe) isn't strictly necessary; as long as you can pull RTSP from whatever your source is, you should be able to see it. Most IP cameras have onboard RTSP servers (although I wouldn't put too many connections on one). If you can see it in VLC, the phone should be able to consume it as well, provided the codec used to encode is one the Android device supports (in most cases, if you're doing H.264 Baseline 3.0 with AAC, you should be good to go).
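For reference, a minimal sketch of playing such a link with the stock VideoView; the activity, layout name, and view id are placeholders, and the URL is the one from the question:

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class PlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.player); // layout containing a VideoView (placeholder name)

        VideoView video = (VideoView) findViewById(R.id.video_view); // placeholder id
        video.setVideoURI(Uri.parse("rtsp://122.166.229.151:1950/1346a0cf0ef7c2"));
        video.setMediaController(new MediaController(this));
        video.start(); // playback begins once the stream is prepared
    }
}
```

No file extension is needed: MediaPlayer picks its handling from the rtsp:// scheme of the URI, not from an extension.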
A streaming server like Wowza can make that stream available to a wider audience than pulling directly from the source device, but if you're not intending to broadcast to a wide audience, it's not required for streaming to Android devices.
Newer versions of Android (Gingerbread and later) are also able to consume Apple HTTP Live Streaming.
I'm working on a voice-call project between Android and PC. I use the JMF library for the PC client and the normal Android API to create a voice call between them. I use JMF because it supports the RTP protocol. My problem is that the PC client can understand the packets sent from the Android one, but not vice versa.
I customized code from the SipDroid application and saw that only two codecs are used: PCMA and PCMU. I'm not well versed in audio/video codecs, so my question is whether the JMF library supports those codecs (PCMA and PCMU). I searched the Internet, and some people say that PCMU/PCMA is the same as ULAW/ALAW, but I'm not sure that's right.
Does anyone have experience on this?
JMF supports u-law, which in RTP is called PCMU.
See here.
And yes, PCMU/PCMA is the same as u-law/a-law.
See here.
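For reference, a sketch of how the matching format is constructed in JMF; AudioFormat and its ULAW_RTP constant are real JMF API, while the class around them is just a minimal usage example:

```java
import javax.media.format.AudioFormat;

public class PcmuFormatExample {
    public static void main(String[] args) {
        // PCMU over RTP is 8 kHz, 8-bit, mono u-law; JMF calls this encoding ULAW_RTP.
        AudioFormat pcmu = new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1);
        System.out.println(pcmu);

        // When configuring a Processor for RTP output, set this format on the
        // audio track so JMF emits RTP payload type 0 (PCMU, per RFC 3551).
    }
}
```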
I am working on an Android app to play a video stream over RTSP, served by the Darwin Streaming Server.
My problem is that the RTSP stream cannot be played using Android's VideoView/MediaPlayer via some specific WiFi hotspots, e.g. at my workplace. I searched around and found that Darwin Streaming Server uses UDP ports 6970-6999 for media data, so the firewall may be the problem; but the same stream can be played using VLC on a PC via the same WiFi hotspot.
What's the difference between the mechanisms that VLC and Android's built-in media framework (OpenCore) use? Is it possible for me to write my own RTSP client from live555's openRTSP source on Android? Any help will be appreciated.
Bolton
I've used Wireshark to capture my network traffic, and I think I now know the difference:
When I use the Android emulator, I can see the client keeps sending UDP requests on ports 6970 and 6971 but gets no response. When using VLC, the RTP data is transferred over TCP via port 554.
So I think the problem is caused by the firewall.
As you stated in your answer, VLC switches to interleaved RTP over RTSP when UDP fails.
This is why VLC continues to work.
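Concretely, the fallback is negotiated in the Transport header of the RTSP SETUP request. A sketch, with placeholder URL and CSeq:

```
SETUP rtsp://example.com/stream/trackID=1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/TCP;unicast;interleaved=0-1
```

Compare the UDP form, `Transport: RTP/AVP;unicast;client_port=6970-6971`, which is what the firewall was blocking; with interleaving, the RTP packets travel inside the existing TCP connection on port 554.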
You can use my library at https://net7mma.codeplex.com/ if you can use .NET, or you can use it as a reference for your own development.