Audio/Video Conferencing Application in Android

I have to develop an Android application for audio/video conferencing. What is the most efficient way of implementing this? During my research I came across Android's SIP API. Can it be used for implementing audio as well as video conferencing? And if yes, what should I use to stream the video in real time? Should I use an RTSP library for this?
Please guide me.
Thanks,
Rupesh

OK, for my practical project I used Spydroid, which uses the RTSP protocol without SDP. You can customize it for audio-only use. I would prefer Spydroid because it is pure Java: it reads the camera packets, writes them to a local (Linux) socket, and the RTSP server reads them back from there (a rough sketch of that socket trick follows below).
On the other hand, if I am not wrong, SIP stacks rely on C/C++ code too.
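
For reference, a minimal sketch of the LocalSocket technique described above, the same trick Spydroid and libstreaming use. It is illustrative only: the socket name is arbitrary, camera/preview setup and error handling are omitted, and packetize() is a hypothetical placeholder for an RTP packetizer.

    import android.media.MediaRecorder;
    import android.net.LocalServerSocket;
    import android.net.LocalSocket;
    import android.net.LocalSocketAddress;
    import java.io.InputStream;

    // Hedged sketch of the Spydroid/libstreaming socket trick: MediaRecorder
    // writes into one end of a local socket pair, and we read the encoded
    // stream back from the other end to packetize it ourselves.
    public class SocketRecorderSketch {
        public static void streamCamera() throws Exception {
            LocalServerSocket server = new LocalServerSocket("camera_pipe"); // name is arbitrary
            LocalSocket receiver = new LocalSocket();
            receiver.connect(new LocalSocketAddress("camera_pipe"));
            LocalSocket sender = server.accept();   // MediaRecorder writes into this end

            MediaRecorder recorder = new MediaRecorder();
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setOutputFile(sender.getFileDescriptor()); // socket, not a file path
            // Most devices also require a preview Surface via setPreviewDisplay().
            recorder.prepare();
            recorder.start();

            // Read the encoded bytes back and hand them to an RTP packetizer.
            InputStream in = receiver.getInputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) > 0) {
                // packetize(buf, n); // hypothetical: wrap into RTP and send over the network
            }
        }
    }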

Related

Android and Nginx Rtmp Module solution

Folks,
I have an Android app that streams video to a Wowza server. Right now I am using libstreaming (https://github.com/fyhertz/libstreaming) in the Android app to livestream audio and video to Wowza.
It works fine, but I am building an open-source solution and I would like to stop using Wowza (since it is a paid product) and start using nginx-rtmp-module (https://github.com/arut/nginx-rtmp-module). The problem is that libstreaming does not support the RTMP protocol, and, as much as I researched, I still couldn't find a good solution on the Android side to livestream to nginx.
Does anybody know a solution for that? Has anybody already implemented it? Thanks in advance!
You can probably use ffmpeg to convert RTP into RTMP on the server side,
e.g. Pipe UDP input to FFMPEG
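
As a rough illustration of that approach (the hostname, the SDP file, the nginx-rtmp application name "live", and the stream key are all placeholders; recent ffmpeg builds also need the -protocol_whitelist flag to accept an SDP input):

    # Ingest the RTP session described by stream.sdp and repackage it as
    # RTMP/FLV for nginx-rtmp (which expects H.264 video and AAC audio).
    ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
           -c copy -f flv rtmp://your-server/live/stream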

Can Android natively play UDP/RTP/RTSP streams?

I am a bit confused: is it possible to receive UDP/RTSP streams with the Android SDK? I've searched for solutions, but apparently they just forward the request to the Android native player or to VLC. I would like to play back a video feed in a SurfaceView, for example.
Is it possible to receive streams without using third-party APIs like FFmpeg?
Yes, you can use e.g. the android.media.MediaPlayer class to do this; RTSP is one of the formats it handles natively. See http://developer.android.com/guide/topics/media/mediaplayer.html for more information about how to do it.
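
A minimal sketch of that approach, assuming a placeholder RTSP URL and a SurfaceView already present in the layout (error handling trimmed for brevity):

    import android.media.MediaPlayer;
    import android.view.SurfaceHolder;

    // Plays an RTSP stream into a SurfaceView via the platform MediaPlayer.
    public class RtspPlayback implements SurfaceHolder.Callback {
        private MediaPlayer player;

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                player = new MediaPlayer();
                player.setDisplay(holder);                         // render into the SurfaceView
                player.setDataSource("rtsp://example.com/stream"); // placeholder URL
                player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    @Override
                    public void onPrepared(MediaPlayer mp) {
                        mp.start();                                // start once the stream is ready
                    }
                });
                player.prepareAsync();                             // don't block the UI thread
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            if (player != null) {
                player.release();
                player = null;
            }
        }
    }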

Advice about streaming live video to Android/iOS/PC

I would like some advice about the best way to stream a video-only live feed from a server to:
Android (>4.0 is OK)
PC with a web browser
iOS
I would like to keep latency as low as half a second.
I can use:
Flash: works on PC, but not on iOS and not on Android (works only on some tablets)
HLS: not good because of latency
proprietary library: it should work, but I would have to implement it everywhere
RTSP: works only on Android
Any other way? Is a proprietary library the way to go?
I'm working on Linux, but I'm mainly interested in "use this technology" rather than "use this code".
Not sure, but you can try HTTP streaming of MP4/3GP files from a web server. Both Android and iOS support HTTP streaming, but you need to implement progressive download.
Please specify on which OS you want to implement your server.
For Windows, you can use the following binary to relocate the moov atom to the beginning of the media file, which enables progressive download:
http://notboring.org/devblog/2009/07/qt-faststartexe-binary-for-windows/
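
The usage is a single command line (file names below are placeholders; the same qt-faststart tool also ships with the FFmpeg sources):

    # Rewrite the file so the moov atom comes before the mdat atom,
    # letting playback start before the download completes.
    qt-faststart.exe input.mp4 output-faststart.mp4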
Let us know your progress.
You can set up FFmpeg's ffserver for live broadcast. It gives you various options; enable or disable them in its configuration file, located at /etc/ffserver.conf.
You can find detailed documentation at
http://ffmpeg.org/ffserver.html
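
For illustration, a minimal ffserver.conf might look like the fragment below. The port, feed name, and encoding settings are assumptions rather than a tested configuration, and note that ffserver was later removed from FFmpeg (as of version 4.0).

    # Illustrative /etc/ffserver.conf fragment -- not a tested configuration
    Port 8090
    BindAddress 0.0.0.0
    MaxHTTPConnections 200
    MaxClients 100

    # ffmpeg pushes the live input into this feed
    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 50M
    </Feed>

    # clients pull this stream over HTTP
    <Stream live.flv>
    Feed feed1.ffm
    Format flv
    VideoFrameRate 25
    VideoSize 640x480
    NoAudio
    </Stream>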
RTSP might be the way to go, but that half-second latency might be hard to achieve.
For video only, and if you don't buffer at all, this may work for iOS anyway:
https://github.com/mooncatventures-group/FFPlayer-tests
Android supports RTSP, but its implementation is not very good.
You can compile FFmpeg for Android and write a simple player using OpenGL. I can't share the code because we did it for a client, but it's not too difficult.

Android: how to stream video

I am programming an Android client that should show video from a web camera in real time. The issue is that I receive this stream over the RTMP protocol, and it seems that nobody knows an easy way to handle that.
On Stack Overflow I have found only an unsolved question about it, How to stream over RTMP on Android?, but maybe by now someone knows the answer. All help would be appreciated.
P.S.
I want to support Android 2.3 and above.
You may want to use the VLC for Android sources, or use an intermediary server (ffmpeg, avconv) to convert RTMP to RTSP.
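
A sketch of the intermediary-server route (the URLs are placeholders, and it assumes an RTSP server on the relay that accepts ffmpeg's ANNOUNCE/RECORD publishing, which not every server does):

    # Pull the RTMP feed and re-publish it over RTSP so a stock
    # Android MediaPlayer can play it without RTMP support.
    ffmpeg -i rtmp://camera-server/live/stream -c copy \
           -f rtsp rtsp://relay-server:8554/stream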

Convert video Input Stream to RTMP

I want to stream a video recording from my Android phone to a network media server.
The first problem is that when setting the MediaRecorder output to a socket, the stream is missing the mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data in order to produce a valid output stream.
The question is how to proceed from there.
How can I output that stream as RTMP?
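
For context, the local preprocessing the question refers to usually amounts to skipping the unfinished MP4/3GP header and starting at the 'mdat' box, as Spydroid-style streamers do. A hedged Java sketch (the class and method names are made up, and a real implementation would also parse the frames that follow):

    import java.io.IOException;
    import java.io.InputStream;

    public class MdatSkipper {
        // Slide a 4-byte window over the stream until the ASCII tag 'mdat'
        // appears; everything after it is the encoded payload to packetize.
        public static void skipToMdat(InputStream in) throws IOException {
            byte[] w = new byte[4];
            while (!(w[0] == 'm' && w[1] == 'd' && w[2] == 'a' && w[3] == 't')) {
                System.arraycopy(w, 1, w, 0, 3);
                int b = in.read();
                if (b < 0) throw new IOException("no mdat box found");
                w[3] = (byte) b;
            }
            // From here, in.read(...) yields the encoded frames (e.g. H.264 NAL units).
        }
    }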
First, let's unwind your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few side libraries to add support, but these may not be full implementations or have other undesirable side effects and bugs that cause them to fail to meet your needs.
The common alternative in this case is to use RTSP. It provides a comparable session format that has its own RFC, and its packet structure when combined with RTP is very similar (sans some details) to your desired protocol. You could perform the necessary fixups here to transmute RTP/RTSP into RTMP, but as mentioned, such effort is currently outside the development scope of your application.
So, let's assume you would like to use RTMP (invalidating this thread) and that the above-linked library does not meet your needs.
You could, for example, follow this tutorial for recording and playback using Livu, Wowza, and Adobe Flash Player, talking with the Livu developer(s) about licensing their client. Or, you could use this client library and its full Android recorder example to build your client.
To summarize:
RTSP
This thread, using Darwin Media Server, Windows Media Services, or VLC
RTMP
This library
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player
This client library and this example recorder
Best of luck with your application. I admit that I have a less than comprehensive understanding of all of these libraries, but these appear to be the standard solutions in this space at the time of this writing.
Edit:
According to the OP, walking the RTMP library set:
This library: He couldn't make the library demos work. More importantly, RTMP functionality is incomplete.
This thread and this tutorial, using Livu, Wowza, and Adobe Flash Player: This has a long tutorial on how to consume video, but its tutorial on publication is potentially terse and insufficient.
This client library and this example recorder: The given example only covers audio publication. More work is needed to make this complete.
In short: more work is needed. Other answers, and improvements upon these examples, are what's needed here.
If you are using a web browser on the Android device, you can use WebRTC for video capture and server-side recording, e.g. with Web Call Server 4.
Thus the full path would be:
Android Chrome [WebRTC] > WCS4 > recording
So you don't need the RTMP protocol here.
If you are using a standalone RTMP app, you can use any RTMP server for video recording. As far as I know, Wowza supports H.264+Speex recording.
