I have to build an Android application that streams video and audio to a desktop application through a server. Latency is important. I also have to make sure the Android stream can be controlled from the PC (the user should be able to switch the camera or turn off the microphone).
I thought about using WebRTC for communication, but it seems I'm going to have to write a signalling server myself to support the requirement mentioned above.
Is there a better way to implement this whole thing? Also, I can't find any good docs or libraries for Android streaming (no Retrofit analogues, obviously).
P.S. I'm thinking about using JavaFX via TornadoFX for the desktop application.
You certainly don't need to create your own signaling server. I would suggest using something like Kurento Media Server, or a derivative of Kurento like OpenVidu. It's open source and free, and has lots of great, active support via Google Groups. Depending on how much customization you need, one or the other might suit you better: OpenVidu allows less customization, since most of the stuff under the hood is already done for you, whereas Kurento lets you modify and customize almost everything, both under the hood and on the front end, starting from examples that can be changed at the code level.

I have used it extensively on past projects and think it meets most, if not all, of your requirements. Scaling can be a bit challenging, but it is still much easier than pure P2P WebRTC, since everything is relayed through a central server, and it is certainly doable depending on your requirements and implementation. Additionally, you can record, process and transcode video server-side.
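As for switching the camera or muting the microphone from the PC, that doesn't require anything special from the signaling server either: you can send small control messages to the phone and react to them there. A minimal sketch, assuming the org.webrtc Android library and message strings of my own invention, received over a WebRTC data channel:

```kotlin
import org.webrtc.AudioTrack
import org.webrtc.CameraVideoCapturer
import org.webrtc.DataChannel
import java.nio.charset.StandardCharsets

// Reacts to control messages sent by the desktop app over a data channel.
// `videoCapturer` and `localAudioTrack` come from your normal call setup.
class ControlChannelObserver(
    private val videoCapturer: CameraVideoCapturer,
    private val localAudioTrack: AudioTrack
) : DataChannel.Observer {

    override fun onMessage(buffer: DataChannel.Buffer) {
        val bytes = ByteArray(buffer.data.remaining())
        buffer.data.get(bytes)
        when (String(bytes, StandardCharsets.UTF_8)) {
            "switch-camera" -> videoCapturer.switchCamera(null)   // front <-> back, no renegotiation
            "mute-mic"      -> localAudioTrack.setEnabled(false)  // track stays up, just silent
            "unmute-mic"    -> localAudioTrack.setEnabled(true)
        }
    }

    override fun onBufferedAmountChange(previousAmount: Long) {}
    override fun onStateChange() {}
}
```

Wire it up with dataChannel.registerObserver(ControlChannelObserver(capturer, audioTrack)); the same messages could just as well travel through whatever signaling channel Kurento or OpenVidu gives you.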
I've been searching all week through various iOS and Android SDKs that can live stream video in real time (with low latency), and have run into many problems. Usually the example projects they provide don't work. The ones that do work are quite expensive. And the ones I find that do work don't offer a one-to-many option. The support from the companies I've talked to is awful and slow.
All I need for my project is to integrate one-to-many live video streaming (similar to Periscope).
Does anyone know of a quicker and easier way to accomplish this? Has anyone else run into these problems? I appreciate any help or guidance on the topic, as I'm tearing my hair out reviewing all of these supposed 'solutions'.
Thanks
I'm just learning mobile web development and thinking about a task:
Is there a way to make a video stream between iOS, Android and a browser? What architecture and technology should it use? I already read this question on SO, Peer-to-Peer video from iOS to Android?, but there is nothing about browsers.
If it can't be P2P and cross-platform at the same time, I thought I should use something like a Red5 server, or XMPP.
So I'm asking for your advice and opinions here. Any information would be valuable.
Yes, you can!
There is a new technology backed by Google called WebRTC.
It stands for "Web Real-Time Communication" and is an open-source project funded by Google.
It also supports native Android/iPhone applications.
I am working with it and have had about 60% success.
Video clarity is good, but audio is choppy.
You can find the source code Here
Discussion with the community Here
You can see a live demo Here
NOTE:
It is an ongoing project and is not stable yet; the Google team is working on it. Currently it works in the latest Chrome, Firefox and Opera. IE does not support it yet.
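On the native Android side, getting started looks roughly like this. A minimal sketch, assuming a current build of the org.webrtc Android library (the API has changed between releases):

```kotlin
import android.content.Context
import org.webrtc.PeerConnectionFactory

// One-time global init, then a factory you can reuse to create
// peer connections, audio/video sources and tracks.
fun createFactory(appContext: Context): PeerConnectionFactory {
    PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(appContext)
            .createInitializationOptions()
    )
    return PeerConnectionFactory.builder().createPeerConnectionFactory()
}
```

From there you create a PeerConnection, exchange SDP offers/answers and ICE candidates through your own signaling channel, and the browser and the phone can talk to each other directly.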
Yes, the open-source solution is WebRTC; please check the official website: webrtc.org
I am developing a mobile app targeting the iOS and Android platforms. The app will consist of:
A relatively simple 'user login/signup and listing of database items' type of interface, powered by an already-built web service from an existing web application.
A video capture and upload feature using native plugins.
I have done extensive research on PhoneGap over the last week, and have determined that even considering the well-documented issues and limitations of PhoneGap, it is well suited for the first part of my app.
However, given the limitations of the PhoneGap Capture API (org.apache.cordova.media-capture), it is not appropriate for capturing video for upload, mostly due to the lack of control over video specs. (On most devices, video captures will be enormous HD files that are not suitable for upload, even on Wi-Fi, and certainly not over 3G/4G.)
Given my resources and timeline, I've determined that building native apps in both Java and ObjC is impractical, at least for now. I have very little ObjC and Java experience, but I am fairly confident and eager to learn these languages if need be.
That said, I am considering 2 options:
The first, and probably most rational, is that I pick the platform with the greater market share of my existing user base (iOS), suck it up, and go native.
The second, and perhaps lofty option, is that I develop a hybrid app in PhoneGap, targeting both platforms, and circumvent the limitations of my video capture ability by building native plugins for Java and ObjC using PhoneGap's plugin API, thus reducing development time on the rest of the app, and using native code only where it's needed.
The requirements of my video capture plugin would be as follows:
Have complete control over the specs of the recorded video, most importantly resolution and bitrate (presumably with AVFoundation on iOS, and the like in the Android SDK); see the sketch after this list.
Control the user interface of the video capture functionality.
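To make the first requirement concrete, here is a minimal sketch of the kind of control the native layer gives you on Android, using MediaRecorder; the 640x480 / ~1 Mbps values are illustrative assumptions, not recommendations:

```kotlin
import android.media.MediaRecorder

// Configures a recorder with explicit resolution and bitrate, instead of the
// enormous device-default HD captures the Capture API hands back.
fun configureRecorder(outputPath: String): MediaRecorder =
    MediaRecorder().apply {
        setAudioSource(MediaRecorder.AudioSource.MIC)
        setVideoSource(MediaRecorder.VideoSource.CAMERA)
        setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
        setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
        setVideoEncoder(MediaRecorder.VideoEncoder.H264)
        setVideoSize(640, 480)             // keep files small enough to upload
        setVideoEncodingBitRate(1_000_000) // ~1 Mbps is 3G/4G-friendly
        setOutputFile(outputPath)
        prepare()                          // call start() once your UI is ready
    }
```

(AVFoundation's AVCaptureSession and AVAssetWriter expose the equivalent knobs on iOS.)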
Obviously, I am aware that these tasks are very possible on both platforms when developing against the native APIs, as is evident from existing apps in the market (Vine, etc.).
My real question is: what are the limitations and issues with extending native functionality via PhoneGap's plugin API? There are almost no examples of work done in PhoneGap with this level of native implementation of video capture. The one example I've found is this plugin, VideoCapturePlus, which, although I haven't been able to get it to work, seems promising.
I am especially interested to find out if anyone out there has successfully implemented native plugins in PhoneGap with this level of complexity, or if it is a rabbit hole I will wish I hadn't gone down.
I have essentially gotten to the bottom of this question, which I am sure others face and will face. I will address the topic in three parts:
Can I do [something] in PhoneGap (as opposed to in native iOS or Android SDK)?
This is a question I'd imagine many developers considering the PhoneGap framework find themselves asking, as did I. The short answer is YES, YOU PROBABLY CAN.
How do I do [something] in PhoneGap?
Plugins! Here's the rub: Cordova (PhoneGap), in all its brilliance, is extremely limited in the way it accesses native hardware features, especially when you get into video/photo/audio capture.
This is where plugins come in. There are thousands of PhoneGap plugins at this point. Many are as simple as four lines of Java and Objective-C to get over some simple thing that PhoneGap just won't do. Others are large projects with lifecycles of their own.
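For a sense of scale, here is a minimal sketch of what the native Android half of such a plugin looks like. The class and action names are hypothetical, and I'm writing it in Kotlin, which a Cordova Android project can use once the Kotlin Gradle plugin is enabled (plain Java works identically):

```kotlin
import org.apache.cordova.CallbackContext
import org.apache.cordova.CordovaPlugin
import org.json.JSONArray

// Native half of a hypothetical capture plugin. Cordova routes every
// cordova.exec() call from JS into execute() by action name.
class VideoCapturePlugin : CordovaPlugin() {

    override fun execute(
        action: String,
        args: JSONArray,
        callbackContext: CallbackContext
    ): Boolean = when (action) {
        "startCapture" -> {
            val maxSeconds = args.optInt(0, 30) // duration limit passed from JS
            // Launch your native capture Activity here, then report the result.
            callbackContext.success("captured (limit ${maxSeconds}s)")
            true
        }
        else -> false // unknown action: Cordova reports an "invalid action" error to JS
    }
}
```

The JS side would call it with cordova.exec(onSuccess, onError, "VideoCapturePlugin", "startCapture", [30]), where the service name matches what you register in plugin.xml.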
In my case, it became very clear very early on that I was not going to be able to build my app with the video capture functions that existed in PhoneGap. That said, I went shopping for plugins. By the end of my project, it had taken two plugins to extend video functionality and four more for the other small, various things I needed. In a few cases, the plugins I found did not have everything I needed, and I wound up contributing bits of code. If the projects are active on GitHub, I highly recommend this.
In the end, there were things I wanted to do that I could not find plugins for. I still have plans to build some of my own, but am not there yet. Bottom line: outside of gaming and other 3D-rendering applications, you'll be hard-pressed to find something that can't be achieved with a PhoneGap plugin.
For those interested specifically in video capture, these are the 2 plugins that got me over the hump in my project:
VideoCapturePlus
Video Screenshot
Should I build my app that does [something] in PhoneGap?
Of course, this question is for each developer to answer, depending on the goals of the app. In my case, a simple app that, among other things, captures and uploads photos and short videos was quite possible. There are certainly cases in which the parameters and goals of the app make native development the best option.
That said, for most solo devs or small teams with limited resources and little or no Java or ObjC experience, the answer to the question, "Should I consider PhoneGap?", would be [in my best Jim Halpert voice], "Absolutely you should". As an added bonus, I will say this: in my case, the HTML5 layouts and much of the front-end JS used for both the Android and iOS versions of my app are largely reusable for the mobile web version. Being able to maintain a single codebase for those three things offers efficiency that even better-resourced organizations shouldn't overlook.
I am the author of a MIDI-based musical application. In my application I generate a .midi file with a small lib I wrote and play it with MediaPlayer, and that's enough for that app. However, in a future app I plan to have more interactivity, and that's where I would probably need a streaming API.
As far as I know, Android lacks APIs for real-time MIDI synthesis (at least official ones). Still, I can see some apps that use MIDI in quite advanced ways. The question is how? Do they use the NDK to access Sonivox directly, or are there unofficial APIs for that after all? Any ideas?
Also, I'm very interested in whether Google is planning to improve MIDI support in future versions of Android (in case anybody from Google sees this :))
Thanks.
You should check out libpd, which is a native port of Pure Data for both Android and iOS. It will give you access to the system's MIDI drivers while still letting you prototype your software with very high-level tools.
Java has significant latency, so I think this should be done with the NDK. Check this question; it has a couple of hints. This was also reported as an Android issue (NDK support for low-latency audio); there might be some tips or info there too.
This is a simple but great sample application that successfully streams MIDI on Android: https://github.com/billthefarmer/mididriver
You will have to put your MIDI messages together manually, though (the example creates two MIDI messages, one to play a note and one to stop it). You can refer to the MIDI specification for further control of the MIDI channels. The one problem is that the default soundfonts on Android sound quite bad.
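To illustrate what "putting the messages together manually" means, here is a minimal sketch against that library; start()/write()/stop() are taken from the project's README, so double-check them against the version you actually pull in:

```kotlin
import org.billthefarmer.mididriver.MidiDriver

// Thin wrapper around the mididriver synth for raw MIDI channel messages.
class SimpleSynth {
    private val midi = MidiDriver()

    fun start() = midi.start()
    fun stop() = midi.stop()

    // Status byte 0x90 = note on; the low nibble carries the channel (0-15).
    fun noteOn(channel: Int, note: Int, velocity: Int) {
        midi.write(byteArrayOf((0x90 or channel).toByte(), note.toByte(), velocity.toByte()))
    }

    // Status byte 0x80 = note off.
    fun noteOff(channel: Int, note: Int) {
        midi.write(byteArrayOf((0x80 or channel).toByte(), note.toByte(), 0))
    }
}

// Usage: SimpleSynth().apply { start(); noteOn(0, 60, 100) } // middle C on channel 1
```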