I am looking to transmit audio in real time from an Android application I am working on to a server, in a way similar to how a baby monitor functions (one-way listening).
I created a test app that uses SIP to initiate a VoIP call between our client and server applications. The problem is that now I need a way to do this on non-SIP-enabled devices. I have tried recording the audio from the device microphone into a buffer, then sending the buffer in chunks to the server via HTTP requests and re-assembling the audio for playback, with poor results.
Does anyone have any suggestions for streaming real-time audio from an Android device to a server application for processing? SIP works so well, but I don't have time to implement a SIP stack on all of our unsupported devices.
XMPP/Jingle (aka GTalk) is the usual alternative. There are C libraries, as well as some support in Java via the Smack libraries. (The Smack Jingle support is old and doesn't work well, but IIRC someone is working on a new version.)
There are plenty of good sources on how to receive audio data on mobile phones. I can find (almost) nothing on streaming audio data TO a server in a standardized way (i.e., using a standard protocol such as RTP).
What is the standard way to stream audio data from Android or iOS to a server?
Some additional info:
The closest solutions I found are:
Objective C: Send audio data in RTP packet via socket, where the accepted answer does send audio data, but not wrapped in RTP
Using ffserver, which can listen on a port and receive streamed audio for further processing; but it has been discontinued.
I can write all of that functionality myself, i.e., wrap the audio data in RTP, RTSP, RTMP, whatever, and write a server that receives the stream, decodes it and does the processing, but that's days of work for something that seems to be a standard task and thus should already exist.
Furthermore, I firmly believe that you shouldn't reinvent the wheel. If there's a good, established API for something, from Apple, Google, or a third party, writing the functionality myself is a bad idea, particularly when networking, and thus security, is involved.
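To make the scope concrete, here is a minimal sketch of just the RTP-wrapping step, assuming the fixed 12-byte RFC 3550 header with no extensions or CSRC list; the payload type, SSRC and samples-per-frame values are placeholders that would have to match the actual codec and be negotiated out of band (e.g. via SDP):

```java
import java.nio.ByteBuffer;

// Minimal RTP packetizer sketch (RFC 3550 fixed header only).
// Payload type, SSRC and samples-per-frame are placeholders.
public class RtpPacketizer {
    private final int payloadType;   // 7-bit payload type
    private final int ssrc;          // stream identifier
    private int sequence;            // 16-bit, wraps around
    private long timestamp;          // 32-bit media clock

    public RtpPacketizer(int payloadType, int ssrc) {
        this.payloadType = payloadType;
        this.ssrc = ssrc;
    }

    /** Wraps one frame of encoded audio in an RTP packet. */
    public byte[] packetize(byte[] payload, int samplesPerFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                        // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F));        // M=0, PT
        buf.putShort((short) (sequence++ & 0xFFFF)); // sequence number
        buf.putInt((int) timestamp);                 // timestamp
        buf.putInt(ssrc);                            // SSRC
        buf.put(payload);
        timestamp += samplesPerFrame;                // advance media clock
        return buf.array();
    }
}
```

Each returned packet would then typically be sent as one UDP datagram; the server side (receiving, reordering, decoding) is the part that really adds up to days of work.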
I think I figured out at least one standard way. One likely answer to my question is simply: SIP.
There are (even native) SIP interfaces on both iOS and Android, there are (many) ready-made servers for this, and very likely there exist libraries or command-line clients to run on the server side and use for the further processing.
I haven't tried it, but this looks very standardized and is likely widely used for such problems.
I would like to stream video between two Android devices (Android to Android). There wouldn't be any server, so the streaming has to be direct between the devices. The devices would be on the same network, so they could communicate via Wi-Fi.
I've tried piping MediaRecorder to MediaPlayer via sockets, but I received many exceptions.
I also looked for a library, but I just want to stream video between the two devices directly.
Any solutions?
If your video is for real-time communication, e.g. a video chat or sharing some CCTV footage in real time with minimal delay, then a real-time video communication approach like WebRTC would be one additional possibility: this type of approach prioritises low latency over quality to ensure minimum delay. See here for the Android WebRTC documentation:
https://webrtc.org/native-code/android/
If the requirement is just to let one device act as a server for non-real-time videos, then the easiest approach may be to use one of the available HTTP server libraries or apps to let one device act as a server that the other can simply connect to via a browser or player. An example Android HTTP server that seems to get good reviews is:
https://play.google.com/store/apps/details?id=jp.ubi.common.http.server&hl=en
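For a rough idea of how little the HTTP-server approach involves, here is a sketch using the JDK's built-in com.sun.net.httpserver. On Android itself you would typically use a small library such as NanoHTTPD instead, but the shape is the same; the file path, context path and port here are placeholders:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: one device serves a video file over HTTP so another device can
// play it in a browser or media player. Path and port are placeholders.
public class VideoServer {
    public static HttpServer serve(Path video, int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/video", (HttpExchange ex) -> {
            byte[] data = Files.readAllBytes(video);   // fine for small files
            ex.getResponseHeaders().set("Content-Type", "video/mp4");
            ex.sendResponseHeaders(200, data.length);
            try (OutputStream os = ex.getResponseBody()) {
                os.write(data);
            }
        });
        server.start();                                // serves until stop()
        return server;
    }
}
```

The other device then just opens http://<server-ip>:<port>/video in a player; no custom client protocol is needed.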
I'm trying to develop a VoIP app which will combine the SIP and RTP protocols and will have a function that, in certain cases (for example, too low capacity), hangs up the connection, changes the RTP codec and reconnects. For SIP I used the Android SIP demo example, and this works pretty well. For RTP I thought about android.net.rtp, but I didn't find a method that can measure connection parameters. Can you suggest an RTP library that is easy to use, can be integrated with android.net.sip, and allows measuring parameters of RTP transmissions?
I would recommend using another SIP client library rather than the built-in one.
The built-in SIP client has a lot of issues and is a very basic/simple solution with lots of limitations (the most obvious one, as you mentioned, being that it doesn't include RTP by default).
You can find more in this thread.
I have a question: can I remotely control the VLC video player (play, pause, sound, maybe some video streaming, cam streaming) on my computer from my mobile phone?
Here is my plan:
1. VLC player on macOS
2. writing a TCP server (C++)
3. writing a client on the Android mobile phone side
(here I am considering writing it in C++ in order to reuse it on Android/iOS)
4. writing an application on Android with simple buttons that can remotely control this player...
Can this solution work properly?
Some additional questions:
1. Can such a solution work over a WAN (the Internet), not only a LAN (TCP socket communication)?
2. VLC has in its preferences Interface > Main Interfaces > RC,
as well as Lua HTTP, Lua Telnet, etc. (what are these for?)
3. I saw some applications on the Google Play Store that communicate via Lua HTTP?
I would prefer writing my own server/client plus a protocol for communication; this is for an undergraduate university project.
So my question is: if I write such a server, will it be possible to integrate it with VLC somehow, e.g. by adding it under Preferences > Interfaces, or must it be a separate program, or can it be written as a plugin or some add-on?
In summary, I need some help choosing the solution that will provide the most seamless interaction with VLC while still having my own server, client and protocol, so that the project isn't too easy (I saw in the documentation that there are apparently simple VLC commands over HTTP, which I assume could allow easy interaction with VLC).
I am also thinking about extending this project by enabling mouse control on macOS/Windows. What would I need for that?
The last part is to enable streaming video to the phone, and maybe in the opposite direction from the phone to the VLC player. Webcam capture streaming from the phone to VLC, and the opposite from the MacBook to the phone, would also be an interesting addition.
Thanks for any help.
PLEASE: if this question is too long, please concentrate on answering whether it is possible to do, and whether it can be integrated seamlessly enough that the end user doesn't have to spend many hours on configuration...
The best solution from my point of view:
- a preference screen for my plugin embedded in the VLC player settings
- entering a TCP port/host there (maybe using the current host's IP on the local network)
- on the mobile side, detecting and connecting to this host:port using the client, and it just works...
1. VLC player on macOS 2. writing a TCP server (C++) 3. writing a client on the Android mobile phone side (here I am considering writing it in C++ in order to reuse it on Android/iOS) 4. writing an application on Android with simple buttons that can remotely control this player... Can this solution work properly?
Yes, it is possible and it works perfectly. VLC has a built-in server, so you do not need another server app to control it; you just write a client-side app for Android or Windows/iOS. However, if you still want to write a server app, you can do so (I don't recommend it), but obviously the communication delay between the client app and VLC will be higher than usual.
1. Can such a solution work over a WAN (the Internet), not only a LAN (TCP socket communication)? 2. VLC has in its preferences Interface > Main Interfaces > RC,
as well as Lua HTTP, Lua Telnet, etc. (what are these for?) 3. I saw some applications on the Google Play Store that communicate via Lua HTTP?
Yes, it should be possible, but I haven't tried it myself.
I would prefer writing my own server/client plus a protocol for communication; this is for an undergraduate university project. So my question is: if I write such a server, will it be possible to integrate it with VLC somehow, e.g. by adding it under Preferences > Interfaces, or must it be a separate program, or can it be written as a plugin or some add-on?
As I said, you can write your own server app and integrate that server with VLC's server (web interface). Again, this method is not recommended.
If you still want to write a server app, instead of integrating with VLC's web interface, map keyboard shortcuts (for example, receive a stop request from your client app and raise a keyboard 'S' key event from your server app; the 'S' key is the shortcut for VLC's stop command. For more VLC shortcut keys, refer here).
VLC supports both transcoding and streaming. I suggest you write only a client app and integrate it with the VLC web interface; that is the best method. (For more info, there are many apps on the Play Store; try any one of them, or refer to the VLC forum.)
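To illustrate the client-only approach: VLC's Lua HTTP interface accepts plain GET requests against /requests/status.xml with a command parameter (e.g. pl_play, pl_pause, pl_stop), authenticated with HTTP Basic auth using an empty username and the password configured in VLC's preferences. A minimal Java sketch, where the host, port and password are assumptions you would adjust to your setup:

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Sketch of a client for VLC's Lua HTTP interface. Enable it under
// Preferences > Interface > Main Interfaces > Lua and set a password there.
// Host, port and password below are assumptions.
public class VlcRemote {
    private final String host;
    private final int port;
    private final String password;   // VLC uses an empty username + password

    public VlcRemote(String host, int port, String password) {
        this.host = host;
        this.port = port;
        this.password = password;
    }

    /** Builds the status request URL for a command such as "pl_pause". */
    public String commandUrl(String command) {
        return "http://" + host + ":" + port
                + "/requests/status.xml?command=" + command;
    }

    /** Basic-auth header value VLC expects (empty user, configured password). */
    public String authHeader() {
        String token = Base64.getEncoder()
                .encodeToString((":" + password).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }

    /** Sends one command to VLC; returns the HTTP status code. */
    public int send(String command) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(commandUrl(command)).openConnection();
        conn.setRequestProperty("Authorization", authHeader());
        return conn.getResponseCode();
    }
}
```

With something like this on the phone, the whole "server" side is VLC itself, which is why writing your own server process buys you little.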
I am trying to develop an Asterisk Android client. My preferred codec is GSM. I have downloaded the SIPDroid source code and some other helper projects, but since I am totally new to this area, I am not sure where to start.
Here is what I am trying to do to begin with:
Record sound
Convert that sound into GSM RTP packets
Play back that encoded sound
Stream those GSM RTP packets
Integrate a SIP session with the app
I have one Android device (an HTC Wildfire). Is it possible to test these steps from the emulator to my handset over a Wi-Fi network?
Please give me appropriate steps/an algorithm from which I can develop the client app.
It'd be great if someone gave me some tips on using the existing projects. Thanks.
I asked a friend with an Android phone to install SIPDroid, and it does support the GSM codec.
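For the packetization and streaming steps in the question (2 and 4), here is a minimal sketch, assuming GSM full-rate's static RTP payload type 3 and its 33-byte, 20 ms frames at 8 kHz per RFC 3551; the PCM-to-GSM encoding itself needs a codec library and is not shown, and the target host/port are whatever your Asterisk server uses:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Sketch: wrap GSM full-rate frames in RTP and stream them over UDP.
// GSM 06.10 has static RTP payload type 3 (RFC 3551); each frame is 33
// bytes and covers 160 samples (20 ms at 8 kHz), so the RTP timestamp
// advances by 160 per packet. Producing the GSM frames (the encoding
// itself) needs a codec library and is not shown here.
public class GsmRtpStreamer {
    static final int PT_GSM = 3;            // RFC 3551 static payload type
    static final int SAMPLES_PER_FRAME = 160;

    private int seq;
    private long ts;
    private final int ssrc;

    public GsmRtpStreamer(int ssrc) { this.ssrc = ssrc; }

    /** Builds one RTP packet around a single 33-byte GSM frame. */
    public byte[] packetize(byte[] gsmFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + gsmFrame.length);
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) PT_GSM);               // M=0, PT=3 (GSM)
        buf.putShort((short) (seq++ & 0xFFFF));
        buf.putInt((int) ts);
        buf.putInt(ssrc);
        buf.put(gsmFrame);
        ts += SAMPLES_PER_FRAME;              // one frame = 160 samples
        return buf.array();
    }

    /** Sends each encoded frame as one UDP datagram to the server. */
    public void stream(Iterable<byte[]> frames, InetAddress host, int port)
            throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            for (byte[] frame : frames) {
                byte[] pkt = packetize(frame);
                socket.send(new DatagramPacket(pkt, pkt.length, host, port));
            }
        }
    }
}
```

In a real client the SIP session (step 5) negotiates the codec and destination port first, and only then does RTP traffic like this start flowing; SIPDroid's media code is a good reference for how the pieces fit together.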