I'm developing a WebRTC project. The goal is to have 8 random people on the same channel, sharing video and audio, with the possibility of being on different platforms (iOS, Android, PC, etc.).
So far, so good: I have finished the browser client and the server (using Socket.io and Node.js), and they are working fine.
The problem is that I used a WebRTC abstraction library for the browser client instead of the AppRTC libraries, and the library I used doesn't support native apps.
My question is: should I write a new mobile library based on the abstraction library I used (around 6k lines of code), try to find a way to connect peers running the browser abstraction library with peers using an Android/iOS abstraction library, or rewrite all the client-side code from the AppRTC samples?
The goal here is to have everything working as fast as possible and have the possibility to optimize later on.
A few notes on my project:
->The interface will be really simple: all the user has to do is click a button. I will then check for video and send them to a queue on the server (through Socket.io).
->The server will then find 7 other people to connect the peer to (a rough sketch of this matchmaking step follows these notes).
->All peers receive information from the server (either a channel or the other peers' client information) and set up a video and audio conference.
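The sketch of that matchmaking step is below. The server described above runs on Node.js + Socket.io, so this Java version only illustrates the grouping logic under assumed names (Matchmaker, enqueue); it is not the actual server code.

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Hypothetical sketch of the matchmaking step: peers are queued as they
// arrive, and once 8 are waiting they are grouped into a room. The real
// server is Node.js + Socket.io; only the grouping logic is shown here.
public class Matchmaker {
    static final int ROOM_SIZE = 8;
    private final Queue<String> waiting = new ArrayDeque<>(); // peer/socket ids

    // Called whenever a peer clicks the button and passes the video check.
    public synchronized List<String> enqueue(String peerId) {
        waiting.add(peerId);
        if (waiting.size() < ROOM_SIZE) {
            return null; // not enough peers yet, keep waiting
        }
        // Enough peers: pop 8 of them and hand back the new room membership.
        List<String> room = new ArrayList<>(ROOM_SIZE);
        for (int i = 0; i < ROOM_SIZE; i++) {
            room.add(waiting.poll());
        }
        return room; // the caller would now notify each member over Socket.io
    }
}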
Related
I see a lot of tutorials on the Internet about Android-to-browser or browser-to-browser WebRTC applications. Is it possible to build a native Android-to-Android video chat app using WebRTC?
Well, for establishing a connection between the devices before the call via a peer-to-peer WebRTC solution, you need STUN/TURN/ICE servers.
They establish the route for communication between the devices.
Once the route is established, the devices communicate directly, without a server participating in passing the media streams.
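For a concrete idea of how those servers are handed to a native peer connection, here is a minimal sketch assuming the org.webrtc (google-webrtc) Android library mentioned further down; the STUN/TURN URLs and credentials are placeholders.

import java.util.ArrayList;
import java.util.List;
import org.webrtc.PeerConnection;

// Sketch: building the ICE server list (STUN/TURN) for a native
// PeerConnection with the org.webrtc Android library.
public class IceConfig {
    public static PeerConnection.RTCConfiguration build() {
        List<PeerConnection.IceServer> iceServers = new ArrayList<>();
        // STUN server: lets the device discover its public address.
        iceServers.add(PeerConnection.IceServer
                .builder("stun:stun.l.google.com:19302")
                .createIceServer());
        // TURN server: relays media when no direct route can be established.
        iceServers.add(PeerConnection.IceServer
                .builder("turn:turn.example.com:3478")  // placeholder URL
                .setUsername("user")                    // placeholder credentials
                .setPassword("pass")
                .createIceServer());
        // The configuration is later passed to createPeerConnection(...).
        return new PeerConnection.RTCConfiguration(iceServers);
    }
}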
To make it easier for you, you can look at or try some existing solutions, like ConnectyCube.
They have peer-to-peer WebRTC solution for Android already implemented.
So, maybe there is no need to reinvent the wheel.
There is an official Android sample project AppRTCMobile provided here - https://webrtc.org/native-code/android/. However, the build process is tedious and the total download size exceeds 20 GB. The recommended way is to use the following dependency in your project.
implementation 'org.webrtc:google-webrtc:1.0.+'
However, for video chat functionality you will need to refer to the AppRTCMobile source code. There is a clone of this project on GitHub, updated for Oreo and ready to import into Android Studio. Check out this link.
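As a rough sketch of what bootstrapping that dependency looks like, the following assumes the org.webrtc API shipped in recent google-webrtc versions; class and constructor details vary between releases, so treat it as an outline rather than a drop-in snippet.

import android.content.Context;
import org.webrtc.DefaultVideoDecoderFactory;
import org.webrtc.DefaultVideoEncoderFactory;
import org.webrtc.EglBase;
import org.webrtc.PeerConnectionFactory;

// Sketch of initializing the WebRTC library and creating the factory that
// produces PeerConnection, audio, and video objects.
public class WebRtcBootstrap {
    public static PeerConnectionFactory create(Context appContext) {
        // One-time global initialization of the native library.
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions
                        .builder(appContext)
                        .createInitializationOptions());

        EglBase eglBase = EglBase.create();
        return PeerConnectionFactory.builder()
                .setVideoEncoderFactory(
                        new DefaultVideoEncoderFactory(
                                eglBase.getEglBaseContext(),
                                /* enableIntelVp8Encoder= */ true,
                                /* enableH264HighProfile= */ true))
                .setVideoDecoderFactory(
                        new DefaultVideoDecoderFactory(eglBase.getEglBaseContext()))
                .createPeerConnectionFactory();
    }
}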
WebRTC uses the ICE protocol for creating a connection between two peers, and DTLS-SRTP for secure data exchange between them.
Both ICE and DTLS-SRTP are protocols that can be implemented on any device, no matter what platform. You implement, or use an existing implementation of, ICE and DTLS-SRTP in your Android apps and they can communicate with each other.
When you read tutorials about implementing WebRTC for communication between an Android app and a browser, the Android app there already has implementations of both ICE and DTLS-SRTP. So that Android app can also communicate with another Android app that has a similar implementation.
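To make that concrete, here is a hedged sketch of the offer side of the handshake using the org.webrtc API; sendToOtherPeer(...) stands in for whatever signaling channel you use (Socket.io, WebSocket, etc.) and is not a real library call.

import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnection;
import org.webrtc.SdpObserver;
import org.webrtc.SessionDescription;

// Sketch of the offer/answer handshake between two native peers. The other
// peer receives the offer over signaling, sets it as its remote description,
// and answers in the same way.
public class CallStarter {
    public static void startCall(PeerConnection pc) {
        pc.createOffer(new SdpObserver() {
            @Override public void onCreateSuccess(SessionDescription offer) {
                // Apply the offer locally, then ship it to the remote peer.
                pc.setLocalDescription(this, offer);
                sendToOtherPeer(offer); // hypothetical signaling call
            }
            @Override public void onSetSuccess() {}
            @Override public void onCreateFailure(String error) {}
            @Override public void onSetFailure(String error) {}
        }, new MediaConstraints());
    }

    private static void sendToOtherPeer(SessionDescription sdp) {
        // Placeholder: serialize sdp.type and sdp.description and send them
        // over your signaling channel.
    }
}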
In addition to @tahlil's great answer, you can also use a number of open source SDKs out there that have already taken on the burden of bundling the WebRTC libraries and offer simple APIs for you to integrate real-time communications in your native app. One example of such an SDK is the RestComm Android SDK.
See https://github.com/Mobicents/restcomm-android-sdk and http://www.telestax.com/restcomm-client-android-sdk-beta-2-is-out/
Do you need a browser to use WebRTC? Or can you use it in a mobile app? And if you can, and you are working on Android, how do you put the HTML code inside without using WebView (since it doesn't support WebRTC)?
You do not need a browser to use WebRTC. Google has sample applications for Android and iOS. These are built using native code, which means there is no HTML; you use Java or Objective-C to handle the same API.
WebRTC is not a browser technology (though it is well designed for browsers), but a collection of technologies for P2P delivery of video/audio/files/messages: codecs, APIs, routers, etc.
You can even use WebRTC to transmit video/audio/messages/files between two servers (still peer-to-peer, isn't it?). Practically, you can use any device with access to a local network or the Internet and write any program to make it work with WebRTC.
WebRTC defines protocols for sending video, audio, and data, and protocols for two endpoints to connect to one another (ICE with STUN and TURN). While the JavaScript bindings are part of the WebRTC standard, some mobile native SDKs implement many of the protocols of WebRTC but do not strictly present the JavaScript bindings.
I am working on an Android application with P2P video chat functionality, just like Skype. While researching on Google, I found some libraries, but nothing for native Android.
I decided to go with WebRTC using the PubNub API. How can I create a native Android video chat client using their libraries?
I found one example of a native video chat client:
https://github.com/pchab/AndroidRTC
This demo application requires a URL with IP:PORT, so I am confused about what the server should look like.
Can anyone help me?
As I understand it, you need a signaling server which allows peers to discover each other, exchange session descriptions to set up media ports, and share everything else used for the initial handshake. You can find more information here: https://www.webrtc-experiment.com/docs/WebRTC-Signaling-Concepts.html. There are a lot of open source implementations, e.g. https://janus.conf.meetecho.com/.
Hope this helps.
@Alexey Osminin and @PubNub are right: you need a signaling protocol service (PubNub) and you need a hosted WebRTC solution for the audio/video streams.
Your best bet is to start with this awesome blog post, BUILDING AN ANDROID WEBRTC VIDEO CHAT APP, by Kevin Gleason, who did the AndroidRTC and WebRTC research for PubNub as an intern.
PubNub & WebRTC
There is a lot of confusion around what PubNub offers in the WebRTC arena and we have compiled everything you need to know into a single knowledge base article.
We're trying to build an internal system which will provide us with simple chat/video features using WebRTC. We have successfully deployed the Peer.js client and server, which work great in the browser.
However, we can't seem to find a fairly simple Android/iOS client/SDK to make it compatible with our PeerJS server.
We've tried looking into AppRTC and got the Android client running with their server; however, we can't understand how to connect it to our PeerJS server, since that Android client (and server implementation) seems really complicated (we're not Java experts).
I've also looked into EasyRTC; however, it seems that they've pulled back their native SDKs, even though their technology stack looks really close to ours.
My question is: has anyone got an Android/iOS client working with WebRTC running against a Node.js server? What are the possible workarounds to get this up and running natively on Android?
We're looking for a fairly simple Android SDK (links to libraries/sample projects) which could work with a PeerJS server.
Edit: We could build a signaling server (on NodeJS) ourselves, but how can we build the Android/iOS clients from then on?
I'm offering a bounty of 200 rep to whoever can answer our questions.
I don't know PeerJS, but it seems to use WebSockets. If that's the case, you have to implement WebSocket client functionality in your native clients (plus the various PeerJS internal connection/signaling protocols).
For native-to-native signaling it's really simple, because we only have to exchange SDP and ICE candidate messages between clients (via WebSocket or any other messaging mechanism); see the sketch below.
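As an illustration of that native-to-native approach, here is a minimal sketch of a signaling channel over a WebSocket. It assumes the OkHttp client library and a made-up JSON message format; the URL and the handling code are placeholders, not the PeerJS protocol.

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.WebSocket;
import okhttp3.WebSocketListener;
import org.json.JSONException;
import org.json.JSONObject;

// Sketch: exchanging SDP and ICE candidate messages over a plain WebSocket.
public class SignalingChannel extends WebSocketListener {
    private WebSocket socket;

    public void connect() {
        OkHttpClient client = new OkHttpClient();
        Request request = new Request.Builder()
                .url("ws://signaling.example.com/ws") // placeholder URL
                .build();
        socket = client.newWebSocket(request, this);
    }

    // Send a local SDP offer/answer or ICE candidate to the other peer.
    public void send(String type, String payload) throws JSONException {
        JSONObject msg = new JSONObject();
        msg.put("type", type);       // e.g. "offer", "answer", "candidate"
        msg.put("payload", payload); // SDP text or serialized candidate
        socket.send(msg.toString());
    }

    @Override public void onMessage(WebSocket webSocket, String text) {
        // Parse the remote peer's message and feed it to the PeerConnection
        // (setRemoteDescription / addIceCandidate) elsewhere in the app.
    }

    @Override public void onFailure(WebSocket webSocket, Throwable t, Response response) {
        // Handle errors / reconnection here.
    }
}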
To connect to the PeerJS server, obviously, we need a PeerJS client implemented in Java or C (Objective-C).
I'll be surprised if such an implementation or SDK exists.
I don't know the current state of the AppRTC source, but in its old version it hosted a WebView to run a kind of HTML+JavaScript signaling client.
That is, you could host a WebView and reuse your PeerJS client in Android/iOS apps.
You can have a look at the Crosswalk project, by following the tutorial "Porting Android app from Web App for WebRTC using PeerJS library". I have done it and it worked perfectly.
There is another option, which is using http://phonegap.com/ to port from a web app to Android/iOS, but I could not make it work even just with the "getUserMedia" API.
Hope this helps.
I found an example here: https://github.com/pchab/AndroidRTC1
On the server they used Node.js + socket.io + AngularJS.
On the client they used libjingle_peerconnection + a socket.io client.
Not sure if this is the correct Stack Exchange website but here goes..
A client has asked me to look into the possibility of having an iOS or Android app for typing in information, storing it in a SQLite database, and then syncing with the main desktop application when plugged in by cable or via some other syncing technology.
The desktop application is a Windows one written currently in Delphi 7.
Are there any APIs to sync data from a SQLite database on iOS/Android that Delphi can use?
If not, would it be better if the desktop application were written in C#, as it's a newer language that can consume the APIs more easily?
For unidirectional sync (device to desktop), I would start with a simple web service (HTTP based). The newer WebSocket standard, which also starts from HTTP, is a little more complicated but would allow for bidirectional communication.
The devices can HTTP POST database changes to the server, and the desktop client can pull new data (using HTTP GET) or receive push notifications, for example via a WebSocket client.
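As a minimal sketch of that device-to-server POST, assuming a placeholder endpoint and a JSON payload the app builds from its SQLite change log:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch: the mobile app serializes changed rows (read from its SQLite
// database) as JSON and posts them to a sync endpoint on the server.
public class SyncClient {
    public static int postChanges(String json) throws Exception {
        URL url = new URL("https://sync.example.com/api/changes"); // placeholder
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        int status = conn.getResponseCode(); // e.g. 200 on success
        conn.disconnect();
        return status;
    }
}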
For desktop to device you could also check out Apple Push Notifications and the corresponding Android technology.
For high availability I recommend a cloud-based solution like Amazon Elastic Compute Cloud (EC2), Google App Engine (GAE) or Azure.
This question is rather broad; there are many things you could do here. There are so many technologies to choose from that it's blinding.
The prevailing technique for transferring information from mobile devices is REST (over HTTP).
You could also whip something up rather swiftly in Node.js or WCF to create a service that collects information from the mobile devices.
I'm not sure what Delphi libraries are available, but perhaps by including REST in your search terms you may have more luck. You may want to take a look at https://stackoverflow.com/questions/3959851/using-rest-with-delphi for a start.
The mobile side may be more tricky. If you are developing for multiple devices, you may wish to explore the cross-platform development framework PhoneGap, as there seem to be a few projects that aim to sync local databases to the cloud. How you tie these together will be an interesting task.
Good luck!!!