Build WebRTC for Android application - Video and Audio streaming

I am going through WebRTC for Android. I need to develop a video and audio chat application. I have read a lot about WebRTC but am confused about where to start, and I did not find a proper link. Many people said to refer to the site below:
https://webrtc.org/reference/getting-started
But I could not find that page at all. Please help me get started building WebRTC for Android.
Note: I want open-source code. I don't want any licensed (proprietary) libraries.
Thank you for the help.

Check this, and you can start by studying this sample Android app.
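To give a feel for what such a sample does under the hood, here is a minimal sketch of bootstrapping the open-source org.webrtc Android library and creating a local audio track. It assumes a prebuilt org.webrtc artifact is on the classpath; the class name and track id are illustrative, not part of any sample:

    // Minimal sketch using the open-source org.webrtc Android library.
    // Assumption: a prebuilt org.webrtc artifact is on the classpath;
    // the track id "audio0" is an arbitrary illustrative value.
    import org.webrtc.AudioSource;
    import org.webrtc.AudioTrack;
    import org.webrtc.MediaConstraints;
    import org.webrtc.PeerConnectionFactory;

    public class WebRtcBootstrap {
        public static AudioTrack createLocalAudioTrack(android.content.Context context) {
            // One-time global initialization of the native WebRTC stack.
            PeerConnectionFactory.initialize(
                    PeerConnectionFactory.InitializationOptions.builder(context)
                            .createInitializationOptions());

            // The factory is the entry point for sources, tracks and peer connections.
            PeerConnectionFactory factory = PeerConnectionFactory.builder()
                    .createPeerConnectionFactory();

            // Capture microphone audio and wrap it in a track that can later be
            // added to a PeerConnection.
            AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
            return factory.createAudioTrack("audio0", audioSource);
        }
    }

The same factory is later used to create video sources and the PeerConnection itself.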

Related

How to implement WebRTC on android for app to app calling?

I am really new to WebRTC. What I need is to implement app-to-app voice calling (not video calling) in my Android app, where users of my app are matched with each other randomly over WebRTC. I implemented the AppRTC SDK in Android Studio and made an app. With this app I can create or join a room and then establish a peer-to-peer connection (a 1-to-1 voice call within the same room). But how do I implement random calling? I just want to know how I can achieve it. Thank you.
I've been playing with WebRTC quite a lot recently and created a video-chat roulette as a sample for our WebRTC wrapper; you might want to take a look at it: https://github.com/netguru/videochatguru-android
WebRTC can be problematic in many cases because it lacks good documentation. Hope this helps.
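On the random-calling part specifically: that logic normally lives in the signaling server rather than in WebRTC itself. One common approach is a matchmaking queue that pairs two waiting users into a freshly generated room. A rough server-side sketch, with all names hypothetical:

    import java.util.Queue;
    import java.util.UUID;
    import java.util.concurrent.ConcurrentLinkedQueue;

    // Hypothetical matchmaking component for a signaling server: a caller who
    // asks for a random partner waits in a queue until a second caller arrives,
    // then both are told to join the same freshly generated room.
    public class RandomMatcher {
        private final Queue<String> waiting = new ConcurrentLinkedQueue<>();

        /** Returns a room id once two users are paired, or null while waiting. */
        public synchronized String requestMatch(String userId) {
            String partner = waiting.poll();
            if (partner == null) {
                waiting.add(userId);         // nobody waiting: enqueue this caller
                return null;
            }
            String roomId = UUID.randomUUID().toString();
            notifyJoinRoom(partner, roomId); // tell the waiting user which room to join
            return roomId;                   // the current caller joins the same room
        }

        private void notifyJoinRoom(String userId, String roomId) {
            // Push the room id over your signaling channel (WebSocket, FCM, ...).
        }
    }

Once both users have the room id, the existing 1-to-1 AppRTC flow takes over unchanged.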
Use EasyRTC, which is built on WebRTC. I have personally used it in one of our projects for audio/video and chat communication. Use this link: https://demo.easyrtc.com/demos/index.html
There is only one challenge I am facing right now: how to make it work on iOS.

Unity Speech to text with Azure

I want to implement Speech-to-text functionality using Azure on a Unity project that will be deployed to Android.
I have tried this example:
https://learn.microsoft.com/en-us/azure/cognitive-services/speech/getstarted/getstartedcsharpdesktop
But I'm not sure how I'm supposed to implement this in my Unity project. Can anyone point me in the right direction?
Based on my understanding, you are developing an Android app using Unity. Considering the compatibility of the client library with Android, I suggest that the best way is to follow the other tutorial, which uses the REST API from C#, to implement this in your Unity project: first implement a feature that records a speech audio file, then pass that file as the request body of the REST API call.
For real-time streaming, please refer to the Speech WebSocket Protocol.
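As an illustration of that record-then-POST flow, here is a rough sketch in plain Java (in Unity it would be C#, e.g. with UnityWebRequest, but the request shape is the same). The region, endpoint path and header names follow the Azure Speech-to-text REST documentation as I understand it and should be double-checked against the current docs:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Scanner;

    // Assumed endpoint shape for the Azure Speech-to-text REST API; verify the
    // region ("westus" here) and query parameters against your subscription.
    public class AzureSpeechRest {
        public static String recognize(String wavPath, String subscriptionKey) throws Exception {
            URL url = new URL("https://westus.stt.speech.microsoft.com"
                    + "/speech/recognition/conversation/cognitiveservices/v1?language=en-US");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Ocp-Apim-Subscription-Key", subscriptionKey);
            conn.setRequestProperty("Content-Type",
                    "audio/wav; codecs=audio/pcm; samplerate=16000");

            // The recorded speech file is sent as the raw request body.
            try (OutputStream out = conn.getOutputStream()) {
                out.write(Files.readAllBytes(Paths.get(wavPath)));
            }

            // The service responds with a JSON document containing the transcript.
            try (Scanner s = new Scanner(conn.getInputStream()).useDelimiter("\\A")) {
                return s.hasNext() ? s.next() : "";
            }
        }
    }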
Hope this helps. If you have any concerns, please feel free to let me know.

How to build a simple Native WebRTC Android application that can do basic video call?

Recently I've been struggling a lot with WebRTC. I was able to build a very simple WebRTC web application based on the WebRTC codelab, which uses a simple signaling server (basically step 8 in the codelab tutorial).
My next target is to build a native Android application that does the same thing: it should be able to make a video call to the web application through the same simple signaling server. I am very new to WebRTC and could not find any good tutorial or guide for building a simple native Android application.
I've searched for similar questions on Stack Overflow, but most of them are outdated and do not provide the answers I need.
I'm hoping someone in the Stack Overflow community who knows a good source or tutorial on building a simple, basic native WebRTC Android application can share their knowledge. Thank you so much.
I suggest you build the AppRTCMobile target in WebRTC (see https://webrtc.org/native-code/android for details on how to build), then deploy your own instance of AppRTC (https://github.com/webrtc/apprtc) if you want full control over the signaling. Otherwise you can just use the instance publicly available at https://appr.tc.
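For orientation, once the app has a PeerConnection, call setup boils down to the usual offer/answer exchange over the signaling server. A hedged sketch of the offer side with the org.webrtc API, where sendToSignalingServer is a stand-in for your own channel (e.g. the codelab's Socket.IO server):

    import org.webrtc.MediaConstraints;
    import org.webrtc.PeerConnection;
    import org.webrtc.SdpObserver;
    import org.webrtc.SessionDescription;

    // Sketch of the offer side of the handshake; `peerConnection` is assumed to
    // come from PeerConnectionFactory.createPeerConnection(...), and
    // sendToSignalingServer is a hypothetical hook into your signaling channel.
    public class OfferFlow {
        public static void createAndSendOffer(final PeerConnection peerConnection) {
            peerConnection.createOffer(new SdpObserver() {
                @Override public void onCreateSuccess(SessionDescription offer) {
                    // Apply the offer locally, then ship it to the remote peer.
                    peerConnection.setLocalDescription(this, offer);
                    sendToSignalingServer(offer.description);
                }
                @Override public void onSetSuccess() { /* local description applied */ }
                @Override public void onCreateFailure(String error) { /* log/report */ }
                @Override public void onSetFailure(String error) { /* log/report */ }
            }, new MediaConstraints());
        }

        private static void sendToSignalingServer(String sdp) {
            // Hypothetical: relay the SDP over the same signaling server the
            // web application already uses.
        }
    }

The answering side mirrors this with setRemoteDescription and createAnswer, and ICE candidates are exchanged over the same channel.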

Screen sharing in native Android application using webrtc

I searched many documents but didn't find an exact solution to my problem. I want to implement audio calling and screen sharing in a native Android application using WebRTC, without any third-party SDK.
I found one demo example, AppRTC, but it supports only audio calls. How can I implement screen sharing too?
This answer may be irrelevant for the OP, since the question is very old.
Anyway, for anyone searching for something similar in the future, check this commit in the WebRTC repo. It adds a screen capturer for Android.
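To sketch how that capturer is wired up (method names as in recent versions of the org.webrtc library; the resolution, thread name and track id are arbitrary): permissionData is the result Intent from MediaProjectionManager.createScreenCaptureIntent(), obtained in onActivityResult.

    import android.content.Context;
    import android.content.Intent;
    import android.media.projection.MediaProjection;

    import org.webrtc.EglBase;
    import org.webrtc.PeerConnectionFactory;
    import org.webrtc.ScreenCapturerAndroid;
    import org.webrtc.SurfaceTextureHelper;
    import org.webrtc.VideoSource;
    import org.webrtc.VideoTrack;

    // Sketch of creating a screen-share video track with ScreenCapturerAndroid.
    // EGL context handling and foreground-service requirements are omitted.
    public class ScreenShare {
        public static VideoTrack createScreenTrack(Context context,
                                                   PeerConnectionFactory factory,
                                                   Intent permissionData,
                                                   EglBase.Context eglContext) {
            ScreenCapturerAndroid capturer = new ScreenCapturerAndroid(
                    permissionData, new MediaProjection.Callback() {
                        @Override public void onStop() { /* user revoked capture */ }
                    });

            // isScreencast = true tells WebRTC to tune encoding for screen content.
            VideoSource source = factory.createVideoSource(capturer.isScreencast());
            SurfaceTextureHelper helper =
                    SurfaceTextureHelper.create("ScreenCapture", eglContext);
            capturer.initialize(helper, context, source.getCapturerObserver());
            capturer.startCapture(1280, 720, 30); // width, height, fps

            return factory.createVideoTrack("screen0", source);
        }
    }

The returned track is then added to the PeerConnection like any camera track.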

Making an app that streams videos to BlackBerry PlayBook?

I am trying to work on an app that streams videos from a website, so it's like watching those videos in the app rather than on the website. How would you do this? Can someone point me to a tutorial or explain it to me?
Well, that depends on what framework you are developing for. You need to provide more info if you want real answers.
For example, if you are using the native SDK to develop for the PlayBook (C/C++), info regarding streaming video can be found here:
https://bdsc.webapps.blackberry.com/native/documentation/video_playback_overview_1935223_11.html
It may not be about streaming specifically, but it's a start.
If you are developing an AIR application (Flex), then that's a library question. Just off the top of my head, you would probably use the NetStream class. Documentation can be found here:
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/NetStream.html
Again, I'd need more info to give you a good answer.
