Creating an OPUS codec based multicast server (Android/Linux)

I am trying to create an OPUS-based multicast server for an audio project I am working on; it will run on the ODROID-X (http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=g133999328931). At the moment I am unsure where to start with creating a multicast server on Linux or Android using the OPUS codec. This is the first multicast audio server I have built from scratch, so any pointers would be greatly appreciated.
Making the stream accessible and playable through a web page would also be ideal, so that no specific app would be needed on the client side.

Apparently Icecast does a lot of what you're looking for. It's open source (GPL) and supports Opus streams in the Ogg container format, so you could have a peek at it for some general software architecture ideas. My SoundWire Android app (with a Win/Linux server) does Opus streaming with low latency, but the network protocols are custom... I don't know of any established open protocols that can do low latency (by my definition, 1 second of delay is not low latency).
My approach was to build a conventional network server that sets up a normal unicast UDP socket for each client. Avoid TCP if you want low latency, though you'll then have to deal with the datagram nature of UDP in some way. With Opus the amount of data streamed per client is not excessive. I use multicast only for discovery (to auto-locate a server).
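In case it helps, here is a minimal sketch of that discovery-plus-unicast pattern in plain Java; the multicast group address, the ports, and the "DISCOVER"/"OFFER" strings are placeholders I made up, not part of any protocol:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.MulticastSocket;

// Sketch: answer multicast discovery probes, then stream to each client over plain unicast UDP.
// The group address 239.255.0.42, the ports, and the "DISCOVER"/"OFFER" strings are placeholders.
public class DiscoveryServer {
    public static void main(String[] args) throws Exception {
        InetAddress group = InetAddress.getByName("239.255.0.42");
        try (MulticastSocket discovery = new MulticastSocket(48100);
             DatagramSocket stream = new DatagramSocket(48101)) {
            discovery.joinGroup(group);
            byte[] buf = new byte[256];
            while (true) {
                DatagramPacket probe = new DatagramPacket(buf, buf.length);
                discovery.receive(probe);                        // blocks until a client probes the group
                String msg = new String(probe.getData(), 0, probe.getLength(), "UTF-8");
                if (!msg.startsWith("DISCOVER")) continue;
                // Reply unicast so the client learns the server's address and audio port.
                byte[] offer = "OFFER 48101".getBytes("UTF-8");
                stream.send(new DatagramPacket(offer, offer.length,
                        probe.getAddress(), probe.getPort()));
                // From here on, Opus packets would be sent to probe.getAddress() over 'stream'.
            }
        }
    }
}
```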
I suggest you start with some open source server code and adapt it to your needs, bring in Opus (which is very easy to integrate), and choose a container format such as Ogg if it's suitable (search for "Ogg Opus"). If you want browser compatibility then you'll more or less be implementing part of a web server (HTTP etc.) and will have to give up on your low latency goals.
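For the Opus side on Android specifically, one option is to let MediaCodec produce the Opus frames and push each one out as a UDP datagram. This is only a sketch: it assumes a device whose MediaCodec has an Opus encoder (roughly Android 10+; otherwise you would wrap libopus via JNI), and the bitrate, channel count, and destination are assumptions.

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaRecorder;

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Sketch only: 48 kHz mono PCM from the mic -> MediaCodec Opus encoder -> one UDP datagram per frame.
// Requires an Opus encoder in MediaCodec (roughly API 29+). Error handling, the RECORD_AUDIO
// permission, real timestamps, and packet framing (RTP, Ogg, or your own) are left out.
public class OpusUdpSender {
    public void stream(InetAddress client, int port) throws Exception {
        int sampleRate = 48000;
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord mic = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf * 2);

        MediaFormat format = MediaFormat.createAudioFormat(
                MediaFormat.MIMETYPE_AUDIO_OPUS, sampleRate, 1);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 64_000);        // assumed bitrate
        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_OPUS);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        DatagramSocket socket = new DatagramSocket();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        mic.startRecording();
        encoder.start();
        while (true) {                                               // sketch: no shutdown handling
            int inIndex = encoder.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer in = encoder.getInputBuffer(inIndex);
                int read = mic.read(in, in.remaining());             // raw PCM into the codec
                encoder.queueInputBuffer(inIndex, 0, Math.max(read, 0),
                        System.nanoTime() / 1000, 0);
            }
            int outIndex = encoder.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                    ByteBuffer out = encoder.getOutputBuffer(outIndex);
                    byte[] packet = new byte[info.size];
                    out.get(packet);                                 // one encoded Opus frame
                    socket.send(new DatagramPacket(packet, packet.length, client, port));
                }
                encoder.releaseOutputBuffer(outIndex, false);
            }
        }
    }
}
```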

As a general note, pending a reply to my comment: You will be disappointed to learn that multicast is pretty much useless. Outside of some unusual configurations which you will probably not encounter in the real world, multicast doesn't work over the Internet, as most routers are not configured to pass it. It's really only usable over local networks.
As far as making it accessible through a web page, you're pretty much out of luck. There is no native browser support for multicast, nor is support for OPUS widespread, and most of the standard methods of extending the browser's capabilities (e.g., JavaScript and Flash) can't really help you much either. You might be able to implement it in a Java applet, but the number of user agents with working Java installations is shrinking rapidly (particularly after the recent Java exploit), and the resulting applet may end up requiring elevated privileges to use multicast anyway.

Related

Standard way of streaming audio data from a mobile device to a server

There are plenty of good sources on how to receive audio data on mobile phones. I find (almost) nothing on streaming audio data TO a server in a standardized way (i.e., using a standard protocol such as RTP).
What is the standard way to stream audio data from Android or iOS to a server?
Some additional info:
The closest solutions I found are:
Objective c: Send audio data in rtp packet via socket, where the accepted answer does send audio data, but not inside RTP
Using ffserver, which can listen on a port and receive streamed audio for further processing, but it has been discontinued.
I can write all of that functionality myself, i.e., wrap the audio data in RTP, RTSP, RTMP, whatever, and write a server that receives the stream, decodes it and does the processing, but that's days of work for something that seems to be a standard task and thus should already exist.
Furthermore, I firmly believe that you shouldn't reinvent the wheel. If there is a good, established API for something, whether from Apple, Google or a third party, writing the functionality myself is a bad idea, particularly when networking, and therefore security, is involved.
I think I figured it out, at least one standard way. One likely answer to my question is simply: SIP
There are (even native) SIP interfaces on both iOS and Android, there are (many) ready-made servers for this, and very likely there are libraries or command-line clients that can run on the server side and be used for the further processing.
I haven't tried it, but this looks very standardised and is likely widely used for exactly this kind of problem.
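For reference, the built-in android.net.sip API looks roughly like this for registering a profile and placing an audio call. The account details, intent action, and timeout below are placeholders, and a real app also needs the USE_SIP/INTERNET permissions declared in the manifest:

```java
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.net.sip.SipAudioCall;
import android.net.sip.SipManager;
import android.net.sip.SipProfile;

// Sketch of Android's built-in SIP API. Username, domain, password and the
// intent action are placeholders; real code needs permissions and error handling.
public class SipSketch {
    public void registerAndCall(Context context, String calleeUri) throws Exception {
        SipManager manager = SipManager.newInstance(context);

        SipProfile profile = new SipProfile.Builder("alice", "sip.example.com")
                .setPassword("secret")
                .build();

        // Register with the SIP server; incoming calls arrive via this broadcast intent.
        Intent intent = new Intent("com.example.INCOMING_CALL");
        PendingIntent pending = PendingIntent.getBroadcast(context, 0, intent, Intent.FILL_IN_DATA);
        manager.open(profile, pending, null);

        // Place an outgoing audio call; the listener gets callbacks such as onCallEstablished.
        SipAudioCall.Listener listener = new SipAudioCall.Listener() {
            @Override
            public void onCallEstablished(SipAudioCall call) {
                call.startAudio();   // RTP audio is handled internally by the platform
            }
        };
        manager.makeAudioCall(profile.getUriString(), calleeUri, listener, 30);
    }
}
```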

Stream video android-android

I would like to stream a video between two android devices (android-android). There wouldn't be any server, so the streaming has to be direct between devices. Devices would be in the same network so they could communicate via WiFi.
I've tried using MediaRecorder - MediaPlayer via sockets, but I've received many exceptions.
I also looked for a library, but I just want to stream a video between two devices directly.
Any solutions?
If your video is for real-time communication, e.g. a web chat or sharing some CCTV in real time with minimal delay, then a real-time video communication approach like WebRTC would be one additional possibility; this type of approach prioritises low latency over quality to ensure minimum delay. See here for the Android WebRTC documentation:
https://webrtc.org/native-code/android/
If the requirement is just to let one device act as a server for non-real-time videos, then the easiest approach may be to use one of the available HTTP server libraries or apps so that one device acts as a server the other can simply connect to via a browser or player. An example Android HTTP server app that seems to get good reviews is:
https://play.google.com/store/apps/details?id=jp.ubi.common.http.server&hl=en
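If you would rather embed the server in your own app than install a separate one, a small embedded HTTP server library such as NanoHTTPD can serve a video file to the other device's player or browser. A rough sketch, with the port and file path as assumptions:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

import fi.iki.elonen.NanoHTTPD;

// Minimal embedded HTTP server sketch using the NanoHTTPD library.
// The other device just opens http://<this-device-ip>:8080/ in a player or browser.
// Port 8080 and the file path are placeholders.
public class VideoHttpServer extends NanoHTTPD {
    private final File video;

    public VideoHttpServer(File video) {
        super(8080);
        this.video = video;
    }

    @Override
    public Response serve(IHTTPSession session) {
        try {
            // Chunked response so large files are not loaded into memory at once.
            return newChunkedResponse(Response.Status.OK, "video/mp4", new FileInputStream(video));
        } catch (IOException e) {
            return newFixedLengthResponse(Response.Status.NOT_FOUND, "text/plain", "Not found");
        }
    }

    public static void main(String[] args) throws IOException {
        new VideoHttpServer(new File("/sdcard/Movies/clip.mp4"))
                .start(NanoHTTPD.SOCKET_READ_TIMEOUT, false);
    }
}
```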

What is the most efficient way to implement HTTP Live Video Streaming in Android?

For the past month I have been searching the Internet for ways to record live video in an Android application and send it over to a server, but the more I research, the more confused I get.
First of all, I am looking for a streaming protocol that can also be used for iOS in the future, so I came to the conclusion that DASH (Dynamic Adaptive Streaming over HTTP) is the ideal solution.
In addition, the recent Android framework ExoPlayer supports this feature.
Furthermore, I do not wish to use a Live Streaming engine such as WOWZA.
Secondly, based on my research I also concluded that any HTTP server can be used to receive the "chunks" of data, but I must have a streaming server to be able to stream the video back to the users.
I believe this process is quite complex but I will not give up until I successfully make it work.
Lastly, my question is: what server and protocol should I use to achieve this? And how do I convert the video and send it to the server?
Looking at your questions re protocol and server:
A 'streaming protocol that can be used for iOS also in the future'
It probably depends what you mean by 'future'. At the moment Apple requires you to use HLS on iOS for any video delivered over a mobile (cellular) network that is longer than 10 minutes. DASH is establishing itself as the industry standard, so this may change and Apple may accept it too, but if you need something in the near future you may want to plan to support both DASH and HLS.
What server should you use for streaming
Streaming video is complex and the domain changes fast, so it really is good to use or build on a dedicated streaming server if you can. These will generally have mechanisms and/or well documented procedures for converting input videos to the different formats and bit rates you need, depending on your reach and user-experience goals. Reach determines the different encodings you need, since different browsers and devices support different encodings, and if you want your users to have a good experience and avoid buffering, you will also want multiple bit rate versions of each format; this is what allows DASH and HLS to provide adaptive bit rate (ABR) streaming, meaning clients can select the best bit rate at any given time depending on network conditions. Video manipulation, especially transcoding, is a CPU-intensive task, so another advantage of dedicated streaming server software is that it should be optimised as much as possible to reduce your server load.
If you do decide to go the streaming server route, then there are open source alternatives to Wowza, which you mention above, such as:
https://gstreamer.freedesktop.org
These have plugins that support ABR etc - if you search for 'GStreamer streaming server ABR' you will find some good blogs about setting this up.
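On the Android client side, since the question already mentions ExoPlayer, playing back the resulting DASH output is short. This sketch assumes the ExoPlayer 2.12+ style API and a placeholder manifest URL:

```java
import android.content.Context;
import android.net.Uri;

import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

// Sketch of playing a DASH (.mpd) stream with ExoPlayer 2.x.
// ExoPlayer picks the bit rate adaptively from the manifest; the URL is a placeholder.
public class DashPlaybackSketch {
    public SimpleExoPlayer play(Context context, PlayerView view) {
        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
        view.setPlayer(player);
        player.setMediaItem(MediaItem.fromUri(Uri.parse("https://example.com/live/stream.mpd")));
        player.prepare();
        player.play();
        return player;
    }
}
```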

How to integrate SIP with RTP into android?

I'm trying to develop a VoIP app that combines the SIP and RTP protocols and has a function that, in certain cases (for example, too little bandwidth), can put a connection on hold, change the RTP codec, and continue the connection. For SIP I used the Android SIP demo example and this works pretty well. For RTP I thought about android.net.rtp, but I didn't find a method that can measure the parameters of a connection. Can you suggest an RTP library that is easy to use, can be integrated with android.net.sip, and allows measuring parameters of RTP transmissions?
I would recommend using another SIP client library rather than the built-in one.
The built-in SIP client has a lot of issues and is a very basic/simple solution with lots of limitations (the most obvious one, as you mentioned, being that it doesn't include RTP by default).
You can find more in this thread.
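For completeness, here is roughly what the built-in android.net.rtp classes look like when wired up next to the SIP signalling. The addresses and ports are placeholders that would normally come from the SDP exchange, and note that AudioStream exposes essentially no transmission statistics, which is exactly the limitation described above:

```java
import android.content.Context;
import android.media.AudioManager;
import android.net.rtp.AudioCodec;
import android.net.rtp.AudioGroup;
import android.net.rtp.AudioStream;
import android.net.rtp.RtpStream;

import java.net.InetAddress;

// Sketch of the built-in android.net.rtp API. The remote address/port would normally
// come from the SDP negotiated by the SIP layer; values here are placeholders.
// Note: AudioStream exposes no RTCP/jitter/loss statistics, which is the main limitation.
public class RtpSketch {
    public AudioStream startAudio(Context context, InetAddress local,
                                  InetAddress remote, int remotePort) throws Exception {
        AudioManager audio = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        audio.setMode(AudioManager.MODE_IN_COMMUNICATION);

        AudioStream stream = new AudioStream(local);      // binds a local RTP port
        stream.setCodec(AudioCodec.AMR);                  // codec could be swapped here on re-negotiation
        stream.setMode(RtpStream.MODE_NORMAL);
        stream.associate(remote, remotePort);             // peer taken from the SIP/SDP exchange

        AudioGroup group = new AudioGroup();
        group.setMode(AudioGroup.MODE_NORMAL);
        stream.join(group);                               // starts sending/receiving RTP
        return stream;
    }
}
```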

MIDI on Android: Java and/or AIR libraries

I've been contemplating (re)building an app on iPad for some time, where I would use objective-C and DSMI to send MIDI signals to a host computer. This is not bad (I mean, except for actually writing the app).
Now I'm contemplating perhaps developing the app for Android tablets (TBA).
In Java, what options are available for MIDI message communication? I'm quite familiar with javax.sound.midi, but then I would need a virtual MIDI port to send messages to the host.
On the other hand, if the app were done in Adobe AIR, what options would I have available for communicating with MIDI?
Obviously another option is to send/receive messages over a TCP/IP socket to a Java host, and talk that way, but it sounds a tad cumbersome... or perhaps not? DSMI does use a host program, after all.
javax.sound.midi is not available on Android.
The only access to MIDI functionality on Android is through the JetPlayer class, and it's very much designed for use in games. Android will play a MIDI file, but it's unclear what code path it uses; there's no way to write to the MIDI hardware from memory.
In one app I've made I needed to play notes dynamically based on the GUI/user interaction and ended up having to use samples and pitch filters to create the notes.
Sounds to me like what you need to do is port DSMI to Android; it's open source, the iPhone library looks pretty simple, and it shouldn't be difficult to port over.
EDIT:
After thinking about this for a second, you wouldn't gain anything by using javax.sound.midi or whatever MIDI functionality exists in AIR anyway. All you need to do is pass MIDI messages over a network link to another device that is responsible for communicating with the actual MIDI synth device. This is exactly what DSMI does. Porting the iPhone libdsmi to Android is what you need to do; it's only two source files and two headers. One handles the MIDI message format, which is very simple and can pretty much just be converted line by line to Java; the other handles the network connection to the DSMI server, which will need to be rewritten to use Android semantics for creating network connections.
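To illustrate the message side, here is a rough Java sketch that packs a three-byte MIDI note-on and sends it over UDP to a host. The host address, port, and the bare-bytes framing are illustrative placeholders, not the actual DSMI wire format:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Illustrative only: sends a raw 3-byte MIDI message over UDP.
// The real DSMI protocol has its own framing/handshake; port 9000 and
// the lack of any header here are placeholders, not the DSMI wire format.
public class MidiOverUdp {
    private final DatagramSocket socket;
    private final InetAddress host;
    private final int port;

    public MidiOverUdp(String hostName, int port) throws Exception {
        this.socket = new DatagramSocket();
        this.host = InetAddress.getByName(hostName);
        this.port = port;
    }

    // channel 0-15, note and velocity 0-127
    public void noteOn(int channel, int note, int velocity) throws Exception {
        byte[] msg = {
                (byte) (0x90 | (channel & 0x0F)),   // status byte: note-on + channel
                (byte) (note & 0x7F),
                (byte) (velocity & 0x7F)
        };
        socket.send(new DatagramPacket(msg, msg.length, host, port));
    }

    public static void main(String[] args) throws Exception {
        MidiOverUdp midi = new MidiOverUdp("192.168.1.10", 9000); // host running the MIDI bridge
        midi.noteOn(0, 60, 100);                                  // middle C
    }
}
```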
