I'm trying to develop a VoIP app that combines SIP and RTP and includes a function that, under certain conditions (for example, too little bandwidth), can put a call on hold, change the RTP codec and then continue the call. For SIP I used the Android SIP demo example, and that works pretty well. For RTP I considered android.net.rtp, but I didn't find any method that measures the parameters of a connection. Can you suggest an RTP library that is easy to use, can be integrated with android.net.sip, and allows measuring the parameters of RTP transmissions?
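For reference, here is a minimal sketch of what the built-in android.net.rtp API (API level 12+) looks like; the addresses, port and codec are placeholders, and, as noted in the question, AudioStream exposes no jitter, packet-loss or RTCP statistics:

```java
import android.net.rtp.AudioCodec;
import android.net.rtp.AudioGroup;
import android.net.rtp.AudioStream;
import android.net.rtp.RtpStream;

import java.net.InetAddress;

public class BuiltInRtpSketch {
    // Addresses and ports are placeholders; in a real app they would come from
    // the SDP negotiated by the SIP layer.
    public static AudioStream startStream() throws Exception {
        AudioStream stream = new AudioStream(InetAddress.getByName("192.168.0.10"));
        stream.setCodec(AudioCodec.PCMU);      // e.g. switch to AudioCodec.AMR to change codecs
        stream.setMode(RtpStream.MODE_NORMAL);
        stream.associate(InetAddress.getByName("192.168.0.20"), 5004);

        AudioGroup group = new AudioGroup();
        group.setMode(AudioGroup.MODE_NORMAL);
        stream.join(group);                    // starts sending/receiving RTP

        // There is no getter for jitter, packet loss or round-trip time here,
        // which is exactly the limitation described above.
        return stream;
    }
}
```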
I would recommend using another SIP client library rather than the built-in one.
The built-in SIP client has a lot of issues; it is a very basic/simple solution with lots of limitations (the most obvious one, as you mentioned, being that it doesn't include RTP by default).
You can find more in this thread.
There are plenty of good sources on how to receive audio data on mobile phones, but I found (almost) none on streaming audio data TO a server in a standardized way (i.e., using a standard protocol such as RTP).
What is the standard way to stream audio data from Android or iOS to a server?
Some additional info:
The closest solutions I found are:
Objective c: Send audio data in rtp packet via socket, where the accepted answer does send audio data, but not inside RTP
Using ffserver, which can listen to a port and receive streamed audio for further processing. But it has been discontinued.
I could write all of that functionality myself, i.e., wrap the audio data in RTP, RTSP, RTMP or whatever, and write a server that receives the stream, decodes it and does the processing, but that's days of work for something that seems to be a standard task and thus should already exist.
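For concreteness, a rough sketch of what "wrapping the audio data in RTP" by hand amounts to: a 12-byte RTP header followed by the raw payload, sent as one UDP datagram. The PCMU payload type and the 20 ms / 160-sample framing are assumptions.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class RtpSender {
    private static final int PAYLOAD_TYPE_PCMU = 0;      // assumed codec: G.711 u-law
    private final DatagramSocket socket = new DatagramSocket();
    private final InetAddress dest;
    private final int port;
    private final int ssrc = (int) (Math.random() * Integer.MAX_VALUE);
    private short seq = 0;
    private int timestamp = 0;

    public RtpSender(String host, int port) throws Exception {
        this.dest = InetAddress.getByName(host);
        this.port = port;
    }

    public void send(byte[] audioFrame) throws Exception {
        ByteBuffer packet = ByteBuffer.allocate(12 + audioFrame.length);
        packet.put((byte) 0x80);               // V=2, no padding, no extension, 0 CSRCs
        packet.put((byte) PAYLOAD_TYPE_PCMU);  // marker bit 0 + payload type
        packet.putShort(seq++);                // sequence number
        packet.putInt(timestamp);              // timestamp in sampling-clock units
        packet.putInt(ssrc);                   // stream identifier
        packet.put(audioFrame);
        timestamp += 160;                      // 20 ms of 8 kHz audio per frame
        socket.send(new DatagramPacket(packet.array(), packet.position(), dest, port));
    }
}
```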
Furthermore, I firmly believe that you shouldn't reinvent the wheel. If there's a good, established API for something, whether from Apple, Google or a third party, writing the functionality myself is a bad idea, particularly when networking, and therefore security, is involved.
I think I figured it out, or at least one standard way: one likely answer to my question is simply SIP.
There are (even native) interfaces for SIP on both iOS and Android, there are (many) ready-made servers for this, and very likely there exist libraries or command-line clients to run on the server side for the further processing.
I haven't tried it, but this looks very standardised and is likely widely used for such problems.
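On the Android side, the built-in android.net.sip API is enough to place a basic audio call. A minimal sketch, with placeholder account details and no error handling:

```java
import android.content.Context;
import android.net.sip.SipAudioCall;
import android.net.sip.SipManager;
import android.net.sip.SipProfile;

public class SipCallSketch {
    public static void call(Context context) throws Exception {
        SipManager manager = SipManager.newInstance(context);

        // Placeholder credentials; a real app would load these from settings.
        SipProfile me = new SipProfile.Builder("alice", "sip.example.com")
                .setPassword("secret")
                .build();
        manager.open(me);  // register the profile so it can make calls

        SipAudioCall.Listener listener = new SipAudioCall.Listener() {
            @Override
            public void onCallEstablished(SipAudioCall call) {
                call.startAudio();  // the framework handles the RTP audio path internally
            }
        };
        manager.makeAudioCall(me.getUriString(), "sip:server@sip.example.com", listener, 30);
    }
}
```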
I am looking to transmit audio in real time from an Android application I am working on to a server, in a way similar to how a baby monitor functions (one-way listening).
I created a test app that uses SIP to initiate a VoIP call between our client and server applications. The problem is that now I need a way to do this on non-SIP-enabled devices. I have tried recording audio from the device microphone into a buffer, then sending the buffer in chunks to the server over HTTP and re-assembling the audio for playback, with poor results.
Does anyone have any suggestions for streaming real-time audio from an Android device to a server application for processing? SIP works so well, but I don't have time to implement a SIP stack on all of our unsupported devices.
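For illustration, the capture-and-send approach described above could look roughly like this, using AudioRecord and UDP datagrams instead of HTTP; the sample rate, chunk size and server address are assumptions:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class MicStreamer implements Runnable {
    private static final int SAMPLE_RATE = 8000;   // assumed narrowband rate
    private volatile boolean running = true;

    @Override
    public void run() {
        try {
            // Requires the RECORD_AUDIO permission.
            int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBuf * 4);
            DatagramSocket socket = new DatagramSocket();
            InetAddress server = InetAddress.getByName("192.168.0.10"); // placeholder

            byte[] chunk = new byte[320];          // 20 ms of 8 kHz 16-bit mono PCM
            recorder.startRecording();
            while (running) {
                int read = recorder.read(chunk, 0, chunk.length);
                if (read > 0) {
                    socket.send(new DatagramPacket(chunk, read, server, 5004));
                }
            }
            recorder.stop();
            recorder.release();
            socket.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void stop() {
        running = false;
    }
}
```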
XMPP/Jingle (a.k.a. GTalk) is the usual alternative. There are C libraries, as well as some support in Java via the Smack libraries. (The Smack Jingle support is old and doesn't work well, but IIRC someone is working on a new version.)
I am trying to create an Opus-based multicast server for an audio project I am working on; it will run on the O-Droid X (http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=g133999328931). At the moment I am unsure where to start with creating a multicast server on Linux or Android using the Opus codec. This is the first multicast audio server I have built from scratch, so any pointers would be greatly appreciated.
Ideally it would also be accessible through a web page and playable from there, so that a specific client-side app would not be needed.
Apparently Icecast does a lot of what you're looking for. It's open source (GPL) and supports Opus streams using the Ogg container format; you could have a peek for some general software architecture ideas. My SoundWire Android app (with a Windows/Linux server) does Opus streaming with low latency, but the network protocols are custom... I don't know of any established open protocols that can do low latency (by my definition, a 1-second delay is not low latency).
My approach was to build a conventional network server that sets up a normal unicast UDP socket for each client. Avoid TCP if you want low latency, although then you'll have to deal with the datagram nature of UDP in some way. With Opus the amount of data streamed per client is not excessive. I use multicast only for discovery (to auto-locate a server).
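A rough sketch of that per-client unicast approach follows; the port number and the "any datagram registers the client" handshake are assumptions, not SoundWire's actual protocol:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketAddress;
import java.util.Set;
import java.util.concurrent.CopyOnWriteArraySet;

public class UnicastAudioServer {
    private final DatagramSocket socket = new DatagramSocket(6000); // placeholder port
    private final Set<SocketAddress> clients = new CopyOnWriteArraySet<>();

    public UnicastAudioServer() throws Exception {
        // Background thread: any datagram a client sends registers its address.
        Thread registrar = new Thread(() -> {
            byte[] buf = new byte[64];
            while (true) {
                try {
                    DatagramPacket hello = new DatagramPacket(buf, buf.length);
                    socket.receive(hello);
                    clients.add(hello.getSocketAddress());
                } catch (Exception e) {
                    break;
                }
            }
        });
        registrar.setDaemon(true);
        registrar.start();
    }

    // Called once per encoded Opus frame (e.g. every 20 ms) by the capture/encode loop.
    public void broadcastFrame(byte[] opusFrame) throws Exception {
        for (SocketAddress client : clients) {
            socket.send(new DatagramPacket(opusFrame, opusFrame.length, client));
        }
    }
}
```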
I suggest you start with some open source server code and adapt it to your needs, bring in Opus (which is very easy to integrate), and choose a container format such as Ogg if it's suitable (search for Ogg Opus). If you want browser compatibility, you'll more or less be implementing part of a web server (HTTP etc.) and will have to give up on your low-latency goals.
As a general note, pending a reply to my comment: You will be disappointed to learn that multicast is pretty much useless. Outside of some unusual configurations which you will probably not encounter in the real world, multicast doesn't work over the Internet, as most routers are not configured to pass it. It's really only usable over local networks.
As far as making it accessible through a web page, you're pretty much out of luck. There is no native browser support for multicast, nor is support widespread for Opus, and most of the standard methods of extending the browser's capabilities (e.g., JavaScript and Flash) can't really help you much either. You might be able to implement it in a Java applet, but the number of user agents with working Java installations is rapidly shrinking (particularly after the recent Java exploit), and the resulting applet may end up requiring elevated privileges to use multicast anyway.
I have to develop an Android application for audio/video conferencing. What is the most efficient way of implementing this? During my research I came across Android's SIP API. Can it be used for audio as well as video conferencing? And if yes, what should I use to stream the video in real time? Should I use an RTSP library for this?
Please guide me.
Thanks,
Rupesh
OK, for my practical project I used Spydroid, which uses the RTSP protocol without SDP. You can customize it for audio-only use. I prefer Spydroid because it is pure Java: it reads the camera packets, writes them to a local (Linux) socket, and reads them back from there in its RTSP server.
On the other hand, if I'm not wrong, SIP implementations use C/C++ code too.
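For reference, the local-socket trick that Spydroid relies on looks roughly like this: point MediaRecorder at a local socket instead of a file, then read the encoded stream back from the other end and packetize it for RTSP/RTP. The socket name and encoder settings below are assumptions.

```java
import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

import java.io.InputStream;

public class LocalSocketCapture {
    public static InputStream start() throws Exception {
        LocalServerSocket server = new LocalServerSocket("audio_capture");

        LocalSocket receiver = new LocalSocket();
        receiver.connect(new LocalSocketAddress("audio_capture"));
        LocalSocket sender = server.accept();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setOutputFile(sender.getFileDescriptor()); // encoder writes into the socket
        recorder.prepare();
        recorder.start();

        // An RTSP/RTP server thread can now read the encoded stream from this
        // InputStream and packetize it.
        return receiver.getInputStream();
    }
}
```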
I've been contemplating (re)building an app on iPad for some time, where I would use Objective-C and DSMI to send MIDI signals to a host computer. This is not bad (I mean, except for actually writing the app).
Now I'm contemplating perhaps developing the app for Android tablets (TBA).
In Java, what options are available for MIDI message communication? I'm quite familiar with javax.sound.midi, but then I would need a virtual MIDI port to send messages to the host.
On the other hand, if the app were done in Adobe AIR, what options would I have available for communicating with MIDI?
Obviously another option is to send/receive messages over a TCP/IP socket to a Java host, and talk that way, but it sounds a tad cumbersome... or perhaps not? DSMI does use a host program, after all.
javax.sound.midi is not available on Android.
The only access to MIDI functionality on Android is through the JetPlayer class, and it's very much designed for use in games. Android will play a MIDI file, but it's unclear what code path it uses; there's no way to write to the MIDI hardware from memory.
In one app I've made, I needed to play notes dynamically based on GUI/user interaction and ended up having to use samples and pitch filters to create the notes.
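That kind of workaround can be done with SoundPool, whose playback-rate parameter spans 0.5x to 2.0x (one octave down to one octave up from the recorded sample); a rough sketch, where R.raw.piano_c4 is a placeholder resource:

```java
import android.content.Context;
import android.media.AudioManager;
import android.media.SoundPool;

public class SamplerSketch {
    private final SoundPool pool = new SoundPool(8, AudioManager.STREAM_MUSIC, 0);
    private int sampleId;

    public void load(Context context) {
        // R.raw.piano_c4 is a placeholder for a short recorded sample.
        sampleId = pool.load(context, R.raw.piano_c4, 1);
    }

    // Play the sample transposed by the given number of semitones (-12..+12).
    public void playNote(int semitones) {
        float rate = (float) Math.pow(2.0, semitones / 12.0);
        pool.play(sampleId, 1f, 1f, 1, 0, rate);
    }
}
```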
Sounds to me like what you need to do is port DSMI to Android. It's open source, and the iPhone library looks pretty simple; it shouldn't be difficult to port over.
EDIT:
After thinking about this for a second: you wouldn't gain anything by using javax.sound.midi or whatever MIDI functionality exists in AIR anyway. All you need to do is pass MIDI messages over a network link to another device that is responsible for communicating with the actual MIDI synth. This is exactly what DSMI does. Porting the iPhone libdsmi to Android is what you need to do; it's only 2 source files and 2 headers. One handles the MIDI message format, which is very simple and can pretty much be converted line by line to Java, and the other handles the network connection to the DSMI server, which will need to be rewritten to use Android semantics for creating network connections.
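To give a sense of how little data is involved (this is not DSMI's actual wire format, just a hypothetical illustration): a MIDI Note On is three bytes, so forwarding it over a plain TCP socket to a host that owns the real MIDI port is straightforward. Host, port and channel below are placeholders.

```java
import java.io.OutputStream;
import java.net.Socket;

public class MidiOverSocket {
    private final Socket socket;
    private final OutputStream out;

    public MidiOverSocket(String host, int port) throws Exception {
        socket = new Socket(host, port);
        out = socket.getOutputStream();
    }

    public void noteOn(int channel, int note, int velocity) throws Exception {
        out.write(new byte[] {
                (byte) (0x90 | (channel & 0x0F)),  // status byte: Note On + channel
                (byte) (note & 0x7F),              // key number 0-127
                (byte) (velocity & 0x7F)           // velocity 0-127
        });
        out.flush();
    }

    public void noteOff(int channel, int note) throws Exception {
        out.write(new byte[] {
                (byte) (0x80 | (channel & 0x0F)),  // status byte: Note Off + channel
                (byte) (note & 0x7F),
                (byte) 0x00
        });
        out.flush();
    }
}
```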