How to Implement a Custom ExpandedControllerActivity in Android Using the Cast v3 SDK

I am using Xamarin.Android to develop my current Android app, and I plan to make the application Chromecast-compatible so users can stream videos. To do this I am following the tutorial in the Google Cast docs. However, the Xamarin Cast libraries seem to be missing the ExpandedControllerActivity implementation. It should reside under Android.Gms.Cast.Framework.Media.Widget, but I can only find ControlButtonsContainer and MiniControllerFragment.
The only option I can see is to implement a custom ExpandedControllerActivity myself, but I lack the knowledge to do so (I couldn't even find the Java implementation of ExpandedControllerActivity). Any guidance on getting this done is much appreciated; it doesn't have to be in C#, plain old Java will do.
NOTE: This is my first question on Stack Overflow, so if you are down-voting it, please state the exact reason.

The Java binding for the cast-framework component in the Xamarin Google Play Services components (v10.0.1.0 or v10.0.2.0) doesn't seem to generate ExpandedControllerActivity. I have created an issue on GitHub with a suggested fix.
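In the meantime, you can roll your own expanded controller around UIMediaController, which is the class the stock ExpandedControllerActivity uses internally to wire widgets to the remote media session. Below is a minimal sketch in plain Java, assuming the Cast v3 framework APIs; the layout, view, and drawable ids are hypothetical placeholders you would replace with your own:

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.widget.ImageView;
import android.widget.SeekBar;
import android.widget.TextView;
import com.google.android.gms.cast.framework.media.uicontroller.UIMediaController;

// Minimal sketch of a custom expanded controller. UIMediaController performs
// the view binding that the prebuilt ExpandedControllerActivity does internally.
// All R.* identifiers are placeholders for your own resources.
public class CustomExpandedControllerActivity extends AppCompatActivity {

    private UIMediaController uiMediaController;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_expanded_controls); // your own layout

        // Binds standard widgets to the current Cast media session.
        uiMediaController = new UIMediaController(this);

        SeekBar seekBar = (SeekBar) findViewById(R.id.seek_bar);
        uiMediaController.bindSeekBar(seekBar);

        TextView duration = (TextView) findViewById(R.id.duration);
        uiMediaController.bindTextViewToStreamDuration(duration);

        ImageView playPause = (ImageView) findViewById(R.id.button_play_pause);
        uiMediaController.bindImageViewToPlayPauseToggle(
                playPause,
                getResources().getDrawable(R.drawable.ic_play),
                getResources().getDrawable(R.drawable.ic_pause),
                getResources().getDrawable(R.drawable.ic_stop),
                null,   // optional loading indicator view
                false); // do not hide the button while loading
    }

    @Override
    protected void onDestroy() {
        uiMediaController.dispose(); // release the session bindings
        super.onDestroy();
    }
}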

Related

How to use simulcast with the WebRTC Android SDK?

(I am new to Stack Overflow, so apologies for my poor English.)
I am using the WebRTC Android SDK to create a chat app, and our team decided to use simulcast to deal with the varying capabilities of participants. However, looking at the SDK APIs, I cannot find a way to use simulcast.
I use the SDK in the officially recommended manner:
implementation 'org.webrtc:google-webrtc:1.0.+'
(The concrete version is 1.0.28513.)
I have googled a lot and found code fragments like this:
RtpTransceiver.RtpTransceiverInit transceiverInit =
        new RtpTransceiver.RtpTransceiverInit(
                peerConnectionParameters.transDirection,
                mediaStreamLabels
                /*, encodings */); // cannot create Encoding instances
...
peerConnection.addTransceiver(MediaStreamTrack.MediaType.MEDIA_TYPE_VIDEO, transceiverInit);
However, when I try to create Encoding instances, I find that the Encoding constructor cannot be accessed: it is package-private.
By the way, I have tried using reflection to forcibly create Encoding instances, but, as might be expected, that causes an error when this line executes:
peerConnection.addTransceiver(MediaStreamTrack.MediaType.MEDIA_TYPE_VIDEO, transceiverInit);
I have also gone through the SDK APIs and cannot find any other way to set up simulcast.
So how should I use simulcast with the WebRTC Android SDK?
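For what it's worth, newer libwebrtc Android builds made the RtpParameters.Encoding(String rid, boolean active, Double scaleResolutionDownBy) constructor public, so if you can move past 1.0.28513 simulcast can be configured without reflection. A minimal sketch, assuming such a build; the rid values and stream id are placeholders:

import java.util.Arrays;
import java.util.List;
import org.webrtc.MediaStreamTrack;
import org.webrtc.PeerConnection;
import org.webrtc.RtpParameters;
import org.webrtc.RtpTransceiver;

// Hedged sketch: assumes a libwebrtc build in which the
// RtpParameters.Encoding(String rid, boolean active, Double scaleResolutionDownBy)
// constructor is public (it is package-private in 1.0.28513, as noted above).
public class SimulcastHelper {
    public static RtpTransceiver addSimulcastVideoTransceiver(PeerConnection pc) {
        // Three simulcast layers identified by rid: full, half, quarter resolution.
        List<RtpParameters.Encoding> encodings = Arrays.asList(
                new RtpParameters.Encoding("h", true, 1.0),
                new RtpParameters.Encoding("m", true, 2.0),
                new RtpParameters.Encoding("l", true, 4.0));

        RtpTransceiver.RtpTransceiverInit init = new RtpTransceiver.RtpTransceiverInit(
                RtpTransceiver.RtpTransceiverDirection.SEND_ONLY,
                Arrays.asList("stream1"), // placeholder stream id
                encodings);

        return pc.addTransceiver(MediaStreamTrack.MediaType.MEDIA_TYPE_VIDEO, init);
    }
}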

How to use Google Cloud Text-to-Speech on Android

I want to use Google Cloud Text-to-Speech in my Android app.
I found sample code, but I don't know how to use it in my app.
I ran this sample and it worked:
https://github.com/changemyminds/Google-Cloud-TTS-Android
How to use it depends entirely on your needs. You can borrow some patterns from that example. You can also read the official Google documentation and use the Java examples there; most important is the Android TTS reference, with Java and Kotlin samples.
You can also search for something like "Android TTS tutorials" to get a better understanding of the topic.
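For orientation, the core of the official Java sample for the Cloud Text-to-Speech client (google-cloud-texttospeech) looks like this. Note that on Android you would normally route this call through your own backend rather than bundle service-account credentials in the APK:

import com.google.cloud.texttospeech.v1.AudioConfig;
import com.google.cloud.texttospeech.v1.AudioEncoding;
import com.google.cloud.texttospeech.v1.SsmlVoiceGender;
import com.google.cloud.texttospeech.v1.SynthesisInput;
import com.google.cloud.texttospeech.v1.SynthesizeSpeechResponse;
import com.google.cloud.texttospeech.v1.TextToSpeechClient;
import com.google.cloud.texttospeech.v1.VoiceSelectionParams;
import com.google.protobuf.ByteString;

// Sketch of the standard Cloud TTS Java client flow. Authentication setup is
// omitted; do not embed service-account keys in a shipped app.
public class TtsSketch {
    public static ByteString synthesize(String text) throws Exception {
        try (TextToSpeechClient client = TextToSpeechClient.create()) {
            SynthesisInput input = SynthesisInput.newBuilder().setText(text).build();
            VoiceSelectionParams voice = VoiceSelectionParams.newBuilder()
                    .setLanguageCode("en-US")
                    .setSsmlGender(SsmlVoiceGender.NEUTRAL)
                    .build();
            AudioConfig audioConfig = AudioConfig.newBuilder()
                    .setAudioEncoding(AudioEncoding.MP3)
                    .build();
            SynthesizeSpeechResponse response =
                    client.synthesizeSpeech(input, voice, audioConfig);
            // e.g. write the bytes to a file and play it with MediaPlayer
            return response.getAudioContent();
        }
    }
}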
I know this is a very late response, but if you have hit this issue, the following may help those facing the same problems. If app size isn't a concern and the dependency doesn't create Gradle/Maven conflicts, you can try https://github.com/changemyminds/Google-Cloud-TTS-Android. If that repository gives you errors, as it did for me, try https://github.com/ivso0001/GoogleCloudTextToSpeech instead; it is much simpler, so you shouldn't face big issues. It helped me too.

Does react-native-touch-id support Android?

I can't find anything saying one way or the other. Looking through everything, there's no reason that it shouldn't. However, I am new to developing on both Android and iOS, so I'm not entirely sure of the different methods available and how they link with react-native and react-native-touch-id.
The only thing I can find online is something from two years ago stating that react-native-touch-id does not support Android "yet". I also found another node package called react-native-touch-id-android, which I am having trouble building.
I've been able to use React Native Touch ID for both iOS and Android on the latest version of RN. Someone forked that repo to also allow the device passcode as a fallback:
https://github.com/tradle/react-native-local-auth

How to use WebRTC in Android Studio for live video streaming

I have spent three weeks searching for how to create live video calling for my Android app (using Android Studio), but I can't find exactly what I'm looking for. I don't want to use something like QuickBlox or Sinch, because this is my final-year project and I have to do it programmatically. I found that WebRTC can be used on Android, but unfortunately I didn't understand how to use it.
So please, can anyone help me with anything?
This is unfortunately something that's still hard. The team is working on it, though: see https://bugs.chromium.org/p/webrtc/issues/detail?id=6328 for progress.
There's also https://bugs.chromium.org/p/webrtc/issues/detail?id=6804, which has resulted in a bot archiving .aar builds at https://build.chromium.org/p/client.webrtc.fyi/builders/Android%20Archive; that should make it easier to consume the library.
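Once you have the library (the prebuilt artifact is nowadays published as org.webrtc:google-webrtc), getting a first PeerConnection up looks roughly like this. A hedged sketch, assuming the builder-style API from recent releases; signaling, ICE servers, and media tracks are left out:

import java.util.ArrayList;
import android.content.Context;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;

// Hedged sketch: minimal libwebrtc bootstrap on Android with the builder-style
// API from recent google-webrtc releases. Signaling and media tracks omitted.
public class WebRtcBootstrap {
    public static PeerConnectionFactory createFactory(Context context) {
        PeerConnectionFactory.InitializationOptions options =
                PeerConnectionFactory.InitializationOptions.builder(context)
                        .createInitializationOptions();
        PeerConnectionFactory.initialize(options);
        return PeerConnectionFactory.builder().createPeerConnectionFactory();
    }

    public static PeerConnection createPeerConnection(
            PeerConnectionFactory factory, PeerConnection.Observer observer) {
        // Empty ICE server list here; add STUN/TURN servers for real deployments.
        PeerConnection.RTCConfiguration config =
                new PeerConnection.RTCConfiguration(new ArrayList<PeerConnection.IceServer>());
        return factory.createPeerConnection(config, observer);
    }
}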

How to open an XBRL document in Android

How can I open an XBRL document in an Android app? Is any third-party library available, or is support already built into Android?
It seems nothing Android-specific is available; at least, I couldn't find anything. Since Android development is done in Java, you could try integrating an existing Java XBRL implementation into your project; usually that should work.
http://www.xbrlapi.org/
looks promising. If you're successful, please post a short review here, as I can imagine this concerns several other people as well.
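As a starting point, an XBRL instance document is plain XML, so even without a dedicated library you can read it with Android's built-in XML stack. A minimal sketch; the xbrli namespace URI is the standard one from the XBRL 2.1 specification:

import java.io.InputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

// Hedged sketch: reads an XBRL instance as namespace-aware XML using the
// platform's DOM parser, no third-party library required.
public class XbrlReader {
    public static Document parseXbrl(InputStream in) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document doc = dbf.newDocumentBuilder().parse(in);
        // Contexts carry the reporting entity and period for each fact.
        NodeList contexts = doc.getElementsByTagNameNS(
                "http://www.xbrl.org/2003/instance", "context");
        System.out.println("contexts: " + contexts.getLength());
        return doc;
    }
}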
