Initialize PeerConnectionFactory with latest webrtc dependency - android

I am using the below WebRTC dependency in my Android app:
implementation 'org.webrtc:google-webrtc:1.0.+'
How do I initialize PeerConnectionFactory? I am doing it in the manner below, but it gives a compilation error.
private void initializePeerConnectionFactory() {
    PeerConnectionFactory.initializeAndroidGlobals(this, true, true, true);
    factory = new PeerConnectionFactory(null);
    factory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());
}
But it's not working.

As of today (25 Jan 2023), use the latest version of the dependency, as the older ones had security vulnerabilities and Google Play does not accept them. You can initialize PeerConnectionFactory as in the official example:
https://webrtc.googlesource.com/src/+/refs/heads/main/examples/androidapp/src/org/appspot/apprtc/PeerConnectionClient.java?autodive=0%2F%2F

Using org.webrtc:google-webrtc:1.0.32006, which I believe to be the latest version of WebRTC available, you initialise the peer connection factory using the following code:
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(getApplicationContext())
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);

// Create a new PeerConnectionFactory instance - using hardware encoder and decoder.
PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
DefaultVideoEncoderFactory defaultVideoEncoderFactory = new DefaultVideoEncoderFactory(
        rootEglBase.getEglBaseContext(), /* enableIntelVp8Encoder */ true, /* enableH264HighProfile */ true);
DefaultVideoDecoderFactory defaultVideoDecoderFactory =
        new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext());
factory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .createPeerConnectionFactory();
Hope this helps!

Related

Xamarin.Android Mqtt Client with TLS Unauthorized result

I'm basically new to mobile development and have been stuck for a few weeks because I am unable to connect a Xamarin Android app to MQTT over TLS. Maybe I am using the wrong library, or is it a Xamarin error? My certificate is a .crt file from m21.cloudmqtt.com:
https://crt.sh/?id=5253106089
First I was using System.Net MQTT, but it is currently unable to work over TLS. So currently I am using MQTTnet with (for the moment) the default certificate from m21.cloudmqtt.com, which I have stored in the Assets folder. I tested this locally with and without a certificate and it works fine.
The MQTTnet client with the cert from a local folder is like this, and it works as it should:
var caCert = new X509Certificate2("C:/pathtocert.crt");
var source = new CancellationTokenSource().Token;
var token = source;
var factory = new MqttFactory();
var mqttClient = factory.CreateMqttClient();
var mqttOptions = new MqttClientOptionsBuilder()
    .WithTcpServer(server, port)
    .WithClientId(clientId)
    .WithCredentials(username, pswd)
    .WithTls(new MqttClientOptionsBuilderTlsParameters
    {
        UseTls = true,
        Certificates = new List<X509Certificate> { caCert }
    })
    .Build();
mqttClient.ConnectAsync(mqttOptions, token).Wait(token);
I used the same client code as above, and to get the certificate from the Android Assets folder I used:
using (var assetStream = await Xamarin.Essentials.FileSystem.OpenAppPackageFileAsync("filename.crt"))
using (var memStream = new MemoryStream())
{
    assetStream.CopyTo(memStream);
    caCert = new X509Certificate(memStream.ToArray());
}
I don't understand why it's not working. For now it is also okay if the certificate isn't used, but it needs to use TLS. I tried that too, and I still get an unauthorized error.
var mqttOptions = new MqttClientOptionsBuilder()
    .WithTcpServer(server, port)
    .WithClientId(clientId)
    .WithCredentials(username, pswd)
    .WithTls(new MqttClientOptionsBuilderTlsParameters
    {
        UseTls = true,
    })
    .Build();
Thanks in advance.

How to use WebRTC in native Android 2019

How can I set up WebRTC in Kotlin for Android Studio? I couldn't find a working solution. Please provide detailed info.
Many of the examples online use the old WebRTC API for Android; there have been many changes in the past few years. The following example is in Java, but it should be similar in Kotlin.
To start off, you need to request camera and audio permissions. Then perhaps set up your views using findViewById, and add your ICE servers to a list:
List<PeerConnection.IceServer> peerIceServers = new ArrayList<>();
peerIceServers.add(PeerConnection.IceServer.builder("stun:stun1.l.google.com:19302").createIceServer());
Then initialize your peer connection factory:
DefaultVideoEncoderFactory defaultVideoEncoderFactory =
        new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(), true, true);
DefaultVideoDecoderFactory defaultVideoDecoderFactory =
        new DefaultVideoDecoderFactory(eglBase.getEglBaseContext());
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(this)
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);
PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
factory = PeerConnectionFactory.builder()
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .setOptions(options)
        .createPeerConnectionFactory();
Then you can initialize the camera, the audio, and your signalling client. Looking at a full example in Java may help.
This answer comes late, but there are now many tutorials for WebRTC on Android. You need to follow the steps below:
1. Create and initialize PeerConnectionFactory
2. Create a VideoCapturer instance which uses the camera of the device
3. Create a VideoSource from the capturer
4. Create a VideoTrack from the source
5. Create a video renderer using a SurfaceViewRenderer view and add it to the VideoTrack instance
6. Initialize peer connections
7. Start streaming video
private fun initializePeerConnectionFactory() {
    // Initialize PeerConnectionFactory globals.
    val initializationOptions = InitializationOptions.builder(this).createInitializationOptions()
    PeerConnectionFactory.initialize(initializationOptions)
    // Create a new PeerConnectionFactory instance - using hardware encoder and decoder.
    val options = PeerConnectionFactory.Options()
    val defaultVideoEncoderFactory = DefaultVideoEncoderFactory(
        rootEglBase?.eglBaseContext, /* enableIntelVp8Encoder */ true, /* enableH264HighProfile */ true)
    val defaultVideoDecoderFactory = DefaultVideoDecoderFactory(rootEglBase?.eglBaseContext)
    factory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .createPeerConnectionFactory()
}
A full demo is available here, but in Java:
Example

How to detect WebRTC supported video codecs on native Android

We have a native Android app that uses WebRTC, and we need to find out what video codecs are supported by the host device. (VP8 is always supported but H.264 is subject to the device having a compatible chipset.)
The idea is to create an offer and get the supported video codecs from the SDP. We can do this in a web app as follows:
const pc = new RTCPeerConnection();
if (pc.addTransceiver) {
    pc.addTransceiver('video');
    pc.addTransceiver('audio');
}
pc.createOffer(...);
Is there a way to do something similar on Android? It's important that we don't need to request camera access to create the offer.
Create a VideoEncoderFactory object and call getSupportedCodecs(). This will return a list of codecs that can be used. Be sure to create the PeerConnectionFactory first.
PeerConnectionFactory.InitializationOptions initializationOptions =
        PeerConnectionFactory.InitializationOptions.builder(this)
                .setEnableVideoHwAcceleration(true)
                .createInitializationOptions();
PeerConnectionFactory.initialize(initializationOptions);
VideoEncoderFactory videoEncoderFactory =
        new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(), true, true);
for (int i = 0; i < videoEncoderFactory.getSupportedCodecs().length; i++) {
    Log.d("Codecs", "Supported codecs: " + videoEncoderFactory.getSupportedCodecs()[i].name);
}
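If you do go the SDP route from the question (inspecting an offer created without camera access), the codec names can also be read straight from the offer's a=rtpmap: lines. Below is a minimal, framework-free Java sketch of that parsing step; the SDP string in main is a made-up example, not real WebRTC output:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

class SdpCodecParser {
    // Extract unique codec names from the a=rtpmap: lines of an SDP blob.
    // Each rtpmap line looks like: a=rtpmap:<payload> <codec>/<clockrate>[/<channels>]
    static List<String> codecNames(String sdp) {
        Set<String> names = new LinkedHashSet<>();
        for (String line : sdp.split("\r?\n")) {
            if (line.startsWith("a=rtpmap:")) {
                // strip "a=rtpmap:<payload> ", then take the part before '/'
                String afterColon = line.substring("a=rtpmap:".length());
                int space = afterColon.indexOf(' ');
                if (space < 0) continue;
                String codec = afterColon.substring(space + 1);
                int slash = codec.indexOf('/');
                if (slash >= 0) codec = codec.substring(0, slash);
                names.add(codec);
            }
        }
        return new ArrayList<>(names);
    }

    public static void main(String[] args) {
        String sdp = "v=0\n"
                + "m=video 9 UDP/TLS/RTP/SAVPF 96 98\n"
                + "a=rtpmap:96 VP8/90000\n"
                + "a=rtpmap:98 H264/90000\n";
        System.out.println(codecNames(sdp)); // prints [VP8, H264]
    }
}
```

The same parsing works on the local description string a real PeerConnection gives you after createOffer.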
I think this is what you are looking for:
private static void codecs() {
    MediaCodecInfo[] codecInfos = new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos();
    for (MediaCodecInfo codecInfo : codecInfos) {
        Log.i("Codec", codecInfo.getName());
        for (String supportedType : codecInfo.getSupportedTypes()) {
            Log.i("Codec", supportedType);
        }
    }
}
You can check the example at https://developer.android.com/reference/android/media/MediaCodecInfo.html

Android Google Exoplayer 2.2 HLS and DASH streaming cache

I'm trying to cache HLS and DASH streaming video. I have tried many solutions, but none work with ExoPlayer v2.2. Many issues redirect to the links below, but there is no proper solution:
https://github.com/google/ExoPlayer/issues/420 and Using cache in ExoPlayer.
In one solution, the 'ExtractorSampleSource' class is not found in ExoPlayer 2.2:
OkHttpClient okHttpClient = new OkHttpClient.Builder()
        .cache(new okhttp3.Cache(context.getCacheDir(), 1024000))
        .build();
OkHttpDataSource okHttpDataSource = new OkHttpDataSource(okHttpClient, "android", null);
OkHttpDataSource ok2 = new OkHttpDataSource(okHttpClient, "android", null);
HttpDataSource dataSource = new CacheDataSource(context, okHttpDataSource, ok2);
ExtractorSampleSource sampleSource = new ExtractorSampleSource(
        uri,
        dataSource,
        allocator,
        buffer_segment_count * buffer_segment_size,
        new Mp4Extractor(), new Mp3Extractor());
In another solution I got the same kind of error: the 'DefaultUriDataSource' class is not found in v2.2:
DataSource dataSource = new DefaultUriDataSource(context, null, new OkHttpDataSource(getClient(context), userAgent, null, null/*, CacheControl.FORCE_CACHE*/));
All the solutions are one to two years old and do not support the latest version of ExoPlayer (v2.2). Does anyone have an idea, a sample, or any solution for caching HLS and DASH streams?

I used the below buildDataSourceFactory, and it stores the cache:
DataSource.Factory buildDataSourceFactory(boolean cache) {
    if (!cache) {
        return new DefaultDataSourceFactory(context, BANDWIDTH_METER,
                buildHttpDataSourceFactory(BANDWIDTH_METER));
    } else {
        return new DataSource.Factory() {
            @Override
            public DataSource createDataSource() {
                LeastRecentlyUsedCacheEvictor evictor = new LeastRecentlyUsedCacheEvictor(100 * 1024 * 1024);
                SimpleCache simpleCache = new SimpleCache(new File(context.getCacheDir(), "media_cache"), evictor);
                return new CacheDataSource(simpleCache,
                        buildCachedHttpDataSourceFactory(BANDWIDTH_METER).createDataSource(),
                        new FileDataSource(), new CacheDataSink(simpleCache, 10 * 1024 * 1024),
                        CacheDataSource.FLAG_BLOCK_ON_CACHE | CacheDataSource.FLAG_IGNORE_CACHE_ON_ERROR, null);
            }
        };
    }
}
private DataSource.Factory buildCachedHttpDataSourceFactory(DefaultBandwidthMeter bandwidthMeter) {
    return new DefaultDataSourceFactory(context, bandwidthMeter, buildHttpDataSourceFactory(bandwidthMeter));
}
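For intuition, the job of LeastRecentlyUsedCacheEvictor above is to keep the cache's total size under a byte limit by discarding the least-recently-used entries first. Here is a framework-free sketch of that policy; the class and method names are made up for illustration, not ExoPlayer's API:

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

class LruByteCache {
    private final long maxBytes;
    private long usedBytes = 0;
    // Access-ordered LinkedHashMap: iteration starts at the least-recently-used key.
    private final LinkedHashMap<String, byte[]> entries =
            new LinkedHashMap<>(16, 0.75f, true);

    LruByteCache(long maxBytes) {
        this.maxBytes = maxBytes;
    }

    void put(String key, byte[] data) {
        byte[] old = entries.put(key, data);
        if (old != null) usedBytes -= old.length;
        usedBytes += data.length;
        evictIfNeeded();
    }

    byte[] get(String key) {
        return entries.get(key); // also marks the entry as recently used
    }

    long usedBytes() {
        return usedBytes;
    }

    private void evictIfNeeded() {
        // Drop least-recently-used entries until we are back under the limit.
        Iterator<Map.Entry<String, byte[]>> it = entries.entrySet().iterator();
        while (usedBytes > maxBytes && it.hasNext()) {
            Map.Entry<String, byte[]> lru = it.next();
            usedBytes -= lru.getValue().length;
            it.remove();
        }
    }
}
```

The real evictor works on cache spans on disk rather than in-memory byte arrays, but the eviction order is the same idea.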

android exoplayer custom datasource

I am developing a custom DataSource object to use in ExoPlayer. I am having problems understanding how to connect it to SampleSource objects so that data requests from the underlying ExoPlayer components go through my DataSource object. Has anybody got this working? I appreciate any comments. Thanks.
A SampleSource (or ChunkSource) takes the upstream DataSource object in its constructor; that is how you connect a DataSource to a SampleSource (or ChunkSource). Let me take HLS as an example to illustrate how to inject your custom DataSource into a SampleSource.
In https://github.com/google/ExoPlayer/blob/master/demo/src/main/java/com/google/android/exoplayer/demo/player/HlsRendererBuilder.java
Existing code:
DataSource dataSource = new UriDataSource(userAgent, bandwidthMeter);
HlsChunkSource chunkSource = new HlsChunkSource(dataSource, url, manifest, bandwidthMeter, null,
        HlsChunkSource.ADAPTIVE_MODE_SPLICE);
HlsSampleSource sampleSource = new HlsSampleSource(chunkSource, true, 3);
Let's assume you implement a CustomDataSource class. The new code will look like this:
CustomDataSource dataSource = new CustomDataSource(<your arguments here>);
HlsChunkSource chunkSource = new HlsChunkSource(dataSource, url, manifest, bandwidthMeter, null,
        HlsChunkSource.ADAPTIVE_MODE_SPLICE);
HlsSampleSource sampleSource = new HlsSampleSource(chunkSource, true, 3);
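To make the contract concrete: a DataSource-style object opens a resource, serves sequential read() calls until end of stream, and closes. The interface below is a hypothetical, framework-free stand-in (ExoPlayer's real DataSource has a similar open/read/close shape but different signatures), with an in-memory implementation playing the role of your custom transport:

```java
import java.io.IOException;

class CustomSourceDemo {
    // Hypothetical simplification of the DataSource contract:
    // open a resource, read it in chunks, close it.
    interface SimpleDataSource {
        long open(String uri) throws IOException;                 // returns resolved length
        int read(byte[] buffer, int offset, int max) throws IOException; // -1 at end of stream
        void close() throws IOException;
    }

    // A custom source serving from an in-memory byte array, standing in for
    // whatever transport (file, HTTP, decryption, ...) your real DataSource wraps.
    static class ByteArraySource implements SimpleDataSource {
        private final byte[] data;
        private int pos = -1;

        ByteArraySource(byte[] data) {
            this.data = data;
        }

        @Override
        public long open(String uri) {
            pos = 0;
            return data.length;
        }

        @Override
        public int read(byte[] buffer, int offset, int max) {
            if (pos >= data.length) return -1; // end of stream
            int n = Math.min(max, data.length - pos);
            System.arraycopy(data, pos, buffer, offset, n);
            pos += n;
            return n;
        }

        @Override
        public void close() {
            pos = -1;
        }
    }
}
```

The chunk source you pass your DataSource to is the component that issues these open/read/close calls as it fetches media segments, which is why handing it your object in the constructor is all the wiring you need.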
