How to use WebRTC in native Android (2019)

How can I set up WebRTC in Kotlin for Android Studio? I couldn't find a working solution. Please provide detailed info.

Many of the examples online use the old WebRTC API for Android; there have been many changes in the past few years. The following example is in Java, but it should be similar in Kotlin.
To start off, you need to request camera and audio permissions at runtime (a minimal sketch follows the ICE-server snippet below). Then set up your views using findViewById, and add your ICE servers to a list:
List<PeerConnection.IceServer> peerIceServers = new ArrayList<>();
peerIceServers.add(PeerConnection.IceServer.builder("stun:stun1.l.google.com:19302").createIceServer());
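For that permission step, here is a minimal sketch, assuming the code runs inside an Activity; PERMISSION_REQUEST_CODE is an arbitrary constant of your own:
// Imports assumed: android.Manifest, android.content.pm.PackageManager,
// androidx.core.app.ActivityCompat, androidx.core.content.ContextCompat.
String[] permissions = {Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO};
boolean missing = false;
for (String permission : permissions) {
    if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
        missing = true;
    }
}
if (missing) {
    ActivityCompat.requestPermissions(this, permissions, PERMISSION_REQUEST_CODE);
}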
Then initialize your peer connection factory:
// eglBase should have been created earlier with EglBase.create().
DefaultVideoEncoderFactory defaultVideoEncoderFactory =
        new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(), true, true);
DefaultVideoDecoderFactory defaultVideoDecoderFactory =
        new DefaultVideoDecoderFactory(eglBase.getEglBaseContext());

PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(this)
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);

PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
factory = PeerConnectionFactory.builder()
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .setOptions(options)
        .createPeerConnectionFactory();
Then you can initialize your camera, audio, and signalling client.
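Here is a minimal sketch of the camera and audio setup (the signalling client is app-specific and omitted), assuming the factory and eglBase created above; the track IDs and capture parameters are arbitrary placeholders:
// Pick a front-facing camera with the Camera2 API.
Camera2Enumerator enumerator = new Camera2Enumerator(this);
VideoCapturer videoCapturer = null;
for (String deviceName : enumerator.getDeviceNames()) {
    if (enumerator.isFrontFacing(deviceName)) {
        videoCapturer = enumerator.createCapturer(deviceName, null);
        break;
    }
}

// Wire the capturer to a VideoSource and wrap it in a VideoTrack.
SurfaceTextureHelper surfaceTextureHelper =
        SurfaceTextureHelper.create("CaptureThread", eglBase.getEglBaseContext());
VideoSource videoSource = factory.createVideoSource(videoCapturer.isScreencast());
videoCapturer.initialize(surfaceTextureHelper, this, videoSource.getCapturerObserver());
videoCapturer.startCapture(1280, 720, 30);
VideoTrack localVideoTrack = factory.createVideoTrack("local_video", videoSource);

// Local audio track.
AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
AudioTrack localAudioTrack = factory.createAudioTrack("local_audio", audioSource);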

It's a bit late now, but there are plenty of tutorials for WebRTC on Android these days.
You need to follow the steps below:
1. Create and initialize a PeerConnectionFactory.
2. Create a VideoCapturer instance that uses the device's camera.
3. Create a VideoSource from the capturer.
4. Create a VideoTrack from the source.
5. Create a video renderer using a SurfaceViewRenderer view and add it to the VideoTrack instance (see the sketch after the code below).
6. Initialize peer connections.
7. Start streaming video.
private fun initializePeerConnectionFactory() {
    // Initialize PeerConnectionFactory globals.
    val initializationOptions = PeerConnectionFactory.InitializationOptions.builder(this)
        .createInitializationOptions()
    PeerConnectionFactory.initialize(initializationOptions)

    // Create a new PeerConnectionFactory instance, using hardware encoder and decoder.
    val options = PeerConnectionFactory.Options()
    val defaultVideoEncoderFactory = DefaultVideoEncoderFactory(
        rootEglBase?.eglBaseContext, /* enableIntelVp8Encoder = */ true, /* enableH264HighProfile = */ true)
    val defaultVideoDecoderFactory = DefaultVideoDecoderFactory(rootEglBase?.eglBaseContext)
    factory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .createPeerConnectionFactory()
}
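To keep the examples in one language, here is a minimal sketch of step 5 in Java (the Kotlin translation is mechanical); it assumes a videoTrack from step 4, the rootEglBase from above, and a SurfaceViewRenderer with the placeholder id local_view in your layout:
// Attach the local VideoTrack to a SurfaceViewRenderer declared in the layout.
SurfaceViewRenderer localView = findViewById(R.id.local_view);
localView.init(rootEglBase.getEglBaseContext(), null); // Must be initialized before use.
localView.setMirror(true); // Mirror the front-camera preview.
videoTrack.addSink(localView);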
A full demo is available here, but in Java:
Example

Initialize PeerConnectionFactory with latest webrtc dependency

I am using the following WebRTC dependency in my Android app:
implementation 'org.webrtc:google-webrtc:1.0.+'
How do I initialize PeerConnectionFactory? I am doing it in the manner below, but it gives a compilation error.
private void initializePeerConnectionFactory() {
    PeerConnectionFactory.initializeAndroidGlobals(this, true, true, true);
    factory = new PeerConnectionFactory(null);
    factory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());
}
But it's not working.
As of today (25 Jan 2023), use the latest version of the dependency, as the older ones had security vulnerabilities and Google Play does not accept them. You can initialize PeerConnectionFactory as in the official example:
https://webrtc.googlesource.com/src/+/refs/heads/main/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java?autodive=0%2F%2F
Using org.webrtc:google-webrtc:1.0.32006, which I believe to be the latest version of WebRTC available, you initialise the peer connection factory using the following code:
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(getApplicationContext())
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);

// Create a new PeerConnectionFactory instance, using hardware encoder and decoder.
PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
DefaultVideoEncoderFactory defaultVideoEncoderFactory = new DefaultVideoEncoderFactory(
        rootEglBase.getEglBaseContext(), /* enableIntelVp8Encoder = */ true, /* enableH264HighProfile = */ true);
DefaultVideoDecoderFactory defaultVideoDecoderFactory =
        new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext());
factory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .createPeerConnectionFactory();
Hope this helps!

Android Ant Media WebRTC: switch camera to screen sharing and vice versa

I'm using the webrtc-android-framework module provided by the official Ant Media website. I was able to make a connection, and I can see the video published on the other side without any issues. However, I'm unable to switch from the camera to screen sharing.
I'm using the code below to switch from camera capture to screen sharing.
public void MakeScreenCaptureReady() {
    final EglBase.Context eglBaseContext = eglBase.getEglBaseContext();
    PeerConnectionFactory peerConnectionFactory = peerConnectionClient.factory;

    // Create AudioSource.
    AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
    this.audioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);

    surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);

    // Create VideoCapturer.
    videoCapturer = createScreenCapturer();
    VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
    localVideoTrack = peerConnectionFactory.createVideoTrack("118", videoSource);
    videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
    videoCapturer.startCapture(720, 1280, 30);

    peerConnectionClient.setLocalVideoTrack(localVideoTrack);
    // true for taking ownership and replacing the existing track
    peerConnectionClient.localVideoSender.setTrack(localVideoTrack, true);
}
It buffers the screen-share video for 2-3 seconds, then stops and throws a source error at the subscriber's end; basically, no further chunks are available on the server to buffer.
I have already obtained the required screen-sharing permission before hitting the code above:
startActivityForResult(
mMediaProjectionManager!!.createScreenCaptureIntent(),
SCREEN_RECORD_REQUEST_CODE
)
This is the code I'm using to call the above method in onActivityResult:
intent.putExtra(CallActivity.EXTRA_SCREENCAPTURE, true)
webRTCClient.setMediaProjectionParams(resultCode, data)
webRTCClient.MakeScreenCaptureReady()
How do I achieve switching between camera and screen capture? Any help is much appreciated. Thanks!
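One approach worth trying (a sketch, not Ant Media specific): instead of creating a brand-new VideoSource and VideoTrack, stop the camera capturer and point the screen capturer at the existing source's CapturerObserver, so the already-negotiated track keeps flowing without renegotiation. The names cameraCapturer and videoSource below stand for the objects from your camera setup:
// Stop the camera capturer currently feeding videoSource.
try {
    cameraCapturer.stopCapture();
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}

// Reuse the existing VideoSource so the published track stays the same.
VideoCapturer screenCapturer = createScreenCapturer();
screenCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
screenCapturer.startCapture(720, 1280, 30);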

Android WebRTC Local AudioTrack is not streaming audio

I am developing an Android app with an audio-call feature using WebRTC. I have written the code with the latest build available for Android. When I receive a call from the client, I am able to connect successfully. The problem is, I can hear the client's audio but my audio is not streaming to the client. Below is the code.
factory = PeerConnectionFactory.builder()
        .setAudioDeviceModule(adm)
        .setOptions(options)
        .createPeerConnectionFactory();
peerConnection = createPeerConnection(factory);
if (peerConnection != null) {
    remoteAudioSource = factory.createAudioSource(new MediaConstraints());
    remoteAudioTrack = factory.createAudioTrack("5555", remoteAudioSource);

    // Local audio track
    List<String> mediaStreamLabels = Collections.singletonList("ARDAMS");
    peerConnection.addTrack(createAudioTrack(), mediaStreamLabels);
    peerConnection.setAudioRecording(true);
    peerConnection.setAudioPlayout(true);
}

private AudioTrack createAudioTrack() {
    localAudioSource = factory.createAudioSource(new MediaConstraints());
    localAudioTrack = factory.createAudioTrack(AUDIO_TRACK_ID, localAudioSource);
    localAudioTrack.setEnabled(true); // Enable audio
    return localAudioTrack;
}
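The adm passed to setAudioDeviceModule() isn't shown in the question; for reference, a typical construction looks like this (a sketch, assuming org.webrtc.audio.JavaAudioDeviceModule and that the RECORD_AUDIO permission was granted before the factory is built, since recording silently produces no audio without it):
// Build an audio device module for recording and playout; release() it when done.
JavaAudioDeviceModule adm = JavaAudioDeviceModule.builder(getApplicationContext())
        .setUseHardwareAcousticEchoCanceler(true)
        .setUseHardwareNoiseSuppressor(true)
        .createAudioDeviceModule();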

How to detect WebRTC supported video codecs on native Android

We have a native Android app that uses WebRTC, and we need to find out what video codecs are supported by the host device. (VP8 is always supported but H.264 is subject to the device having a compatible chipset.)
The idea is to create an offer and get the supported video codecs from the SDP. We can do this in a web app as follows:
const pc = new RTCPeerConnection();
if (pc.addTransceiver) {
  pc.addTransceiver('video');
  pc.addTransceiver('audio');
}
pc.createOffer(...);
Is there a way to do something similar on Android? It's important that we don't need to request camera access to create the offer.
Create a VideoEncoderFactory object and call getSupportedCodecs(). This will return a list of codecs that can be used. Be sure to create the PeerConnectionFactory first.
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(this)
                .setEnableVideoHwAcceleration(true)
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);
VideoEncoderFactory videoEncoderFactory =
        new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(), true, true);
for (VideoCodecInfo codec : videoEncoderFactory.getSupportedCodecs()) {
    Log.d("Codecs", "Supported codec: " + codec.name);
}
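If you want to mirror the web approach exactly and read the codecs from an offer without touching the camera, newer library builds with Unified Plan support let you add transceivers directly. A sketch, assuming a peerConnection created with sdpSemantics set to SdpSemantics.UNIFIED_PLAN:
// Add transceivers without capturing anything, then create the offer.
peerConnection.addTransceiver(MediaStreamTrack.MediaType.MEDIA_TYPE_VIDEO);
peerConnection.addTransceiver(MediaStreamTrack.MediaType.MEDIA_TYPE_AUDIO);
peerConnection.createOffer(new SdpObserver() {
    @Override
    public void onCreateSuccess(SessionDescription sdp) {
        // The m=video section of sdp.description lists the negotiable video codecs.
    }
    @Override
    public void onSetSuccess() {}
    @Override
    public void onCreateFailure(String error) {}
    @Override
    public void onSetFailure(String error) {}
}, new MediaConstraints());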
I think this is what you are looking for:
private static void codecs() {
    MediaCodecInfo[] codecInfos = new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos();
    for (MediaCodecInfo codecInfo : codecInfos) {
        Log.i("Codec", codecInfo.getName());
        for (String supportedType : codecInfo.getSupportedTypes()) {
            Log.i("Codec", supportedType);
        }
    }
}
You can check the example at https://developer.android.com/reference/android/media/MediaCodecInfo.html

Android screen sharing using WebRTC

I have heard about screen sharing on desktop using WebRTC, but there doesn't seem to be much information about it for Android.
My question is:
1. Is it possible to use WebRTC for screen sharing on Android? I mean, can I cast the current screen to the other phone's screen?
2. If 1 is yes, how can I achieve it?
Thanks.
It is possible!
It can be done using the directions below.
I've used ScreenShareRTC in conjunction with ProjectRTC to stream the contents of the screen to a browser with decent quality and fairly low latency (~100 ms).
I've added an example below that shows how to configure a screen share as a video source and add it as a track on a stream.
Get the VideoCapturer
@TargetApi(21)
private VideoCapturer createScreenCapturer() {
    if (mMediaProjectionPermissionResultCode != Activity.RESULT_OK) {
        report("User didn't give permission to capture the screen.");
        return null;
    }
    return new ScreenCapturerAndroid(
            mMediaProjectionPermissionResultData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    report("User revoked permission to capture the screen.");
                }
            });
}
Initialize the capturer and add the tracks to the local media stream
private void initScreenCaptureStream() {
    mLocalMediaStream = factory.createLocalMediaStream("ARDAMS");

    // These constraints are a leftover from the older API; recent versions take
    // the resolution and frame rate in startCapture() instead.
    MediaConstraints videoConstraints = new MediaConstraints();
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxHeight", Integer.toString(mPeerConnParams.videoHeight)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxWidth", Integer.toString(mPeerConnParams.videoWidth)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("maxFrameRate", Integer.toString(mPeerConnParams.videoFps)));
    videoConstraints.mandatory.add(new MediaConstraints.KeyValuePair("minFrameRate", Integer.toString(mPeerConnParams.videoFps)));

    mVideoSource = factory.createVideoSource(videoCapturer);
    videoCapturer.startCapture(mPeerConnParams.videoWidth, mPeerConnParams.videoHeight, mPeerConnParams.videoFps);

    VideoTrack localVideoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, mVideoSource);
    localVideoTrack.setEnabled(true);
    mLocalMediaStream.addTrack(localVideoTrack);

    AudioSource audioSource = factory.createAudioSource(new MediaConstraints());
    mLocalMediaStream.addTrack(factory.createAudioTrack("ARDAMSa0", audioSource));

    mListener.onStatusChanged("STREAMING");
}
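Note that the snippet above is written against an older version of the library, where createVideoSource() took the capturer directly. On recent google-webrtc builds the wiring looks like this instead (a sketch, assuming the same videoCapturer and an eglBase created with EglBase.create()):
// Recent API: create the source standalone, then wire the capturer to it.
SurfaceTextureHelper helper =
        SurfaceTextureHelper.create("ScreenCaptureThread", eglBase.getEglBaseContext());
VideoSource videoSource = factory.createVideoSource(/* isScreencast = */ true);
videoCapturer.initialize(helper, context, videoSource.getCapturerObserver());
videoCapturer.startCapture(mPeerConnParams.videoWidth, mPeerConnParams.videoHeight, mPeerConnParams.videoFps);
VideoTrack videoTrack = factory.createVideoTrack(VIDEO_TRACK_ID, videoSource);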
For more information, this might be a good place to start. It's an Android project that connects to a ProjectRTC signalling server and shares the screen as video. I found it very helpful!
Android screen sharing project(Android client - Java)
https://github.com/Jeffiano/ScreenShareRTC
ProjectRTC(Node server)
https://github.com/pchab/ProjectRTC
