Managing AndroidPermissionRequest and MediaPermissionRequest with GeckoView on Android

I am currently having a problem managing permission requests with GeckoView. The built-in Android WebView is not an option for me because the website I want to open is not compatible with Chrome; it can only be opened in Firefox, so GeckoView is my alternative.
The problem I have is granting permission to use the microphone and record audio, because the website I am trying to open in GeckoView records audio (voice collection).
I'm new to Android and GeckoView, which is why the guide I use is this project: https://searchfox.org/mozilla-central/source/mobile/android/geckoview_example/src/main/java/org/mozilla/geckoview_example/GeckoViewActivity.java
I was able to show the permission request and accept it, but it seems my application doesn't store the permission result. I am currently testing against https://www.onlinemictest.com
This is my PermissionDelegate:
private class ExamplePermissionDelegate implements GeckoSession.PermissionDelegate {
    public int androidPermissionRequestCode = 1;

    @Override
    public void onAndroidPermissionsRequest(GeckoSession session, String[] permissions, Callback callback) {
        if (ContextCompat.checkSelfPermission(MainActivity.this,
                Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            Log.i(TAG, "Android Permission Needed");
            requestPermissions(permissions, androidPermissionRequestCode);
            callback = new ExamplePermissionCallback();
            callback.grant();
        } else {
            Log.i(TAG, "Android Permission Granted");
            callback.grant();
        }
    }

    @Override
    public void onContentPermissionRequest(GeckoSession session, String uri, int type, String access, Callback callback) {
        Log.i(TAG, "Content Permission Needed");
    }

    @Override
    public void onMediaPermissionRequest(GeckoSession session, String uri, MediaSource[] video, MediaSource[] audio, MediaCallback callback) {
        Log.i(TAG, "Media Permission Needed");
    }
}
And this is my PermissionDelegate.Callback:
public class ExamplePermissionCallback implements GeckoSession.PermissionDelegate.Callback {
    @Override
    public void grant() {
        int permission = ContextCompat.checkSelfPermission(MainActivity.this,
                Manifest.permission.RECORD_AUDIO);
        if (permission != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(MainActivity.this,
                    new String[]{Manifest.permission.RECORD_AUDIO},
                    RECORD_REQUEST_CODE);
        }
    }

    @Override
    public void reject() {
    }
}
The log shows 'Android Permission Granted', and after that it shows the log line I added, 'Media Permission Needed', but the website says 'Waiting for microphone'.
I also checked the application on my phone, and it already has the microphone permission.

GeckoView has two levels of permissions:
- The Android-level permission, which Android grants to your app and which you seem to be requesting correctly.
- The content-level permission, which GeckoView grants to the specific web page and which is not granted in your example.
In short, just because your app has permission to use the microphone, that doesn't mean every web page you open in GeckoView will have access to it.
When a page requests a media permission, you get an onMediaPermissionRequest callback, which you need to accept by calling callback.grant; an example is here: https://searchfox.org/mozilla-central/rev/3483fb259b4edbe4594cfcc3911db97d5441b67d/mobile/android/geckoview_example/src/main/java/org/mozilla/geckoview_example/BasicGeckoViewPrompt.java#927
The audio argument of onMediaPermissionRequest contains the list of all the audio sources (most likely you'll only have one, the microphone), which you can use to grant the request for the right audio source:
@Override
public void onMediaPermissionRequest(GeckoSession session, String uri,
                                     MediaSource[] video, MediaSource[] audio,
                                     MediaCallback callback) {
    // Find out which audio source is the microphone
    final int MICROPHONE_INDEX = ...;
    // Grant the request
    callback.grant(null, audio[MICROPHONE_INDEX]);
}
Note that if you also need video, you can do the same thing with the video argument and the appropriate video source.
To figure out which source is the microphone, you can use this snippet as an example (look for SOURCE_MICROPHONE): https://searchfox.org/mozilla-central/source/mobile/android/geckoview_example/src/main/java/org/mozilla/geckoview_example/GeckoViewActivity.java#1201-1223
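As an illustration, here is a minimal sketch of that lookup, assuming the public source field and the SOURCE_MICROPHONE constant on GeckoSession.PermissionDelegate.MediaSource (verify both against the GeckoView version you build against):

@Override
public void onMediaPermissionRequest(GeckoSession session, String uri,
                                     MediaSource[] video, MediaSource[] audio,
                                     MediaCallback callback) {
    // Look for the microphone among the offered audio sources.
    MediaSource microphone = null;
    if (audio != null) {
        for (MediaSource source : audio) {
            if (source.source == MediaSource.SOURCE_MICROPHONE) {
                microphone = source;
                break;
            }
        }
    }
    if (microphone != null) {
        // Grant audio only; pass a video source instead of null if the page also needs video.
        callback.grant(null, microphone);
    } else {
        // No microphone was offered, so deny the request.
        callback.reject();
    }
}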

Related

Detect when a screenshot was taken without requiring storage permission on Android

I want to know when a screenshot has been taken of our application, but I do not care about the metadata, nor do I want a reference to the actual image.
The current implementation I'm looking to improve declares a custom ContentObserver, which subscribes to MediaStore.Images.Media.EXTERNAL_CONTENT_URI
mContentResolver.registerContentObserver(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, true, this)
This enables the Observer to be notified when content has changed in this URI.
@Override
synchronized public void onChange(boolean selfChange, Uri uri) {
    super.onChange(selfChange, uri);
    if (uri.toString().startsWith(MediaStore.Images.Media.EXTERNAL_CONTENT_URI.toString())) {
        try {
            checkPermissionAndProcess(uri);
        } catch (Exception e) {
            Timber.w(e, "onChange error : %s", e.toString());
        }
    } else {
        Timber.v("[Finish] not EXTERNAL_CONTENT_URI");
    }
}
At this point, we need to check whether we have Manifest.permission.READ_EXTERNAL_STORAGE and, if not, request it. If the user grants this permission, we can verify whether the URI change is a screenshot with a series of checks; if we don't have the permission, a SecurityException is thrown.
private boolean matchPath(String path) {
    return (path.toLowerCase().contains("screenshots/") && !path.contains(FILE_POSTFIX));
}
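For concreteness, the checkPermissionAndProcess helper mentioned above might look roughly like the following sketch, based on the steps the question describes (permission check, query the DATA column, run matchPath). It is not the actual implementation, and mContext is a placeholder for whatever Context the observer holds:

private void checkPermissionAndProcess(Uri uri) {
    // Without READ_EXTERNAL_STORAGE the query below throws a SecurityException.
    if (ContextCompat.checkSelfPermission(mContext, Manifest.permission.READ_EXTERNAL_STORAGE)
            != PackageManager.PERMISSION_GRANTED) {
        // Request the permission from the hosting Activity, or bail out.
        return;
    }
    try (Cursor cursor = mContext.getContentResolver().query(
            uri, new String[] { MediaStore.Images.Media.DATA }, null, null, null)) {
        if (cursor != null && cursor.moveToFirst()) {
            String path = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATA));
            if (path != null && matchPath(path)) {
                // Treat this change as a screenshot.
            }
        }
    }
}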
My understanding of a ContentObserver observing MediaStore.Images.Media.EXTERNAL_CONTENT_URI is that any change external to the application's storage can invoke the onChange callback.
Is there any way around this without asking the user for storage access, which feels icky?
On Android 7.1.1 and Android 13 devices, I have tried using the camera while the app is in the background to trigger the onChange of the ContentObserver, and I only see it invoked when I take a screenshot. I assume that just observing changes in MediaStore.Images.Media.EXTERNAL_CONTENT_URI is naive, as other apps storing images could also result in changes here.

Blazor MAUI - Camera and Microphone Android permissions

I am trying to show a live stream from the camera and microphone in a <video> HTML element.
In the Blazor file (sample.razor) I invoke a method to request and show the video:
protected override async Task OnAfterRenderAsync(bool firstRender)
{
    await base.OnAfterRenderAsync(firstRender);
    if (firstRender)
    {
        await JSRuntime.InvokeVoidAsync("requestMediaAndShow");
    }
}
In the JavaScript file (sample.js) I request the stream and assign it to the video HTML element:
// Create request options
let options = {
    audio: true,
    video: true
};

// Request user media
navigator.mediaDevices
    .getUserMedia(options)
    .then(gotLocalStream)
    .catch(logError);
But when I request the stream, I catch an error like "NotAllowedError: Permission denied".
AndroidManifest.xml contains:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
MainActivity.cs contains:
public class MainActivity : MauiAppCompatActivity
{
    protected override void OnCreate(Bundle savedInstanceState)
    {
        base.OnCreate(savedInstanceState);
        ActivityCompat.RequestPermissions(this,
            new[] { Manifest.Permission.Camera, Manifest.Permission.RecordAudio, Manifest.Permission.ModifyAudioSettings }, 0);
    }
}
Any ideas how to request the audio and video stream from JavaScript in a BlazorWebView natively on Android?
PS. On the website and natively on the Windows platform it works great and no extra permissions are required.
Although the Android permissions seem to be granted OK, I suspected the website permissions were not. I'm not sure if the author of this question is the same person, but an issue was also opened on the .NET MAUI repo about this.
While we are looking into making this work out of the box, another helpful user has posted the following workaround for now.
Implement your own handler like so
public class MauiWebChromeClient : WebChromeClient
{
    public override void OnPermissionRequest(PermissionRequest request)
    {
        request.Grant(request.GetResources());
    }
}

public class MauiBlazorWebViewHandler : BlazorWebViewHandler
{
    protected override WebChromeClient GetWebChromeClient()
    {
        return new MauiWebChromeClient();
    }
}
And register this handler in your MauiProgram.cs:
builder.ConfigureMauiHandlers(handlers =>
{
    handlers.AddHandler<IBlazorWebView, MauiBlazorWebViewHandler>();
});
This will automatically grant all the requested permissions from the web side of things.
I think you are missing some parts of the permission checks. You should read this section about the runtime permissions introduced in Android 6.0; for example, I cannot see any override of OnRequestPermissionsResult in your MainActivity:
https://learn.microsoft.com/en-us/xamarin/android/app-fundamentals/permissions?tabs=macos#runtime-permission-checks-in-android-60

How can I get the permissions of my Android / iOS application from my WebView in javascript?

I would like to know how I can obtain or read the permissions that the user granted to my application from a WebView in JavaScript.
For example, my WebView uses the camera, but if the user denied access to the camera, how can I know from JavaScript in my WebView that use of the camera is denied, so I don't show that option to the user (because it would break my app)?
PS: Preferably without modifications on the iOS / Android side.
Thanks.
What you need is a JavaScript Interface:
webView.addJavascriptInterface(new JavaScriptInterface(this), "myApplication");
JavaScriptInterface.java:
public class JavaScriptInterface {
    private WeakReference<Activity> mActivityWeakReference;

    public JavaScriptInterface(Activity activity) {
        this.mActivityWeakReference = new WeakReference<>(activity);
    }

    @JavascriptInterface
    public void requestCameraPermission() {
        if (mActivityWeakReference.get() != null) {
            ActivityCompat.requestPermissions(mActivityWeakReference.get(), ....);
        }
    }
}
And then in your JavaScript code:
window.myApplication = function AndroidClass(){};
window.myApplication.requestCameraPermission();
More info about implementation here.
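Since the question asks about reading the permission state rather than only requesting it, the same interface could also expose a read-only check. This is a sketch; hasCameraPermission is a hypothetical method name that is not part of the answer above:

@JavascriptInterface
public boolean hasCameraPermission() {
    Activity activity = mActivityWeakReference.get();
    // Returns true only when the Android camera permission has already been granted.
    return activity != null
            && ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
                    == PackageManager.PERMISSION_GRANTED;
}

The page could then call window.myApplication.hasCameraPermission() from JavaScript and hide the camera option when it returns false.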

Automatically show the currently talking user in OpenTok

I'm currently working on an Android app that lets users group-chat with each other via the OpenTok API. I want to add a feature that automatically detects which user is talking right now, shows their video to the others, and minimizes the other users' videos until someone else talks.
I cannot find such a feature in OpenTok, so I was wondering if there's a workaround.
private void joinVideoCall(String sessionId, String sessionToken) {
    session = new Session.Builder(activity, OPENTOK_API_KEY, sessionId).build();
    session.setSessionListener(this);
    session.connect(sessionToken);
}

@Override
public void onConnected(Session session) {
    publisher = new Publisher.Builder(activity).build();
    publisher.setPublisherListener(this);
    publisherView.addView(publisher.getView());
    session.publish(publisher);
}

@Override
public void onStreamReceived(Session session, Stream stream) {
    subscriber = new Subscriber.Builder(activity, stream).build();
    session.subscribe(subscriber);
    subscriberView.addView(subscriber.getView());
}

...
In order to do that, you'll need to use a custom audio driver that detects the audio levels.
Take a look at this sample: https://github.com/opentok/opentok-android-sdk-samples/tree/master/Custom-Audio-Driver
Also take a look at the API documentation: https://tokbox.com/developer/sdks/android/reference/com/opentok/android/BaseAudioDevice.html
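As a lighter-weight alternative to a full custom audio driver, newer OpenTok Android SDK versions report per-subscriber audio levels. The following is only a sketch: it assumes SubscriberKit.setAudioLevelListener is available in your SDK version, and showAsActiveSpeaker is a hypothetical helper standing in for your own layout logic:

// Switch the highlighted view to whichever subscriber is currently loud enough.
private static final float TALKING_THRESHOLD = 0.2f; // audio level is reported in the 0.0 - 1.0 range

private void watchAudioLevel(final Subscriber subscriber) {
    subscriber.setAudioLevelListener(new SubscriberKit.AudioLevelListener() {
        @Override
        public void onAudioLevelUpdated(SubscriberKit subscriberKit, float audioLevel) {
            if (audioLevel > TALKING_THRESHOLD) {
                showAsActiveSpeaker(subscriberKit); // hypothetical helper: enlarge this user's view
            }
        }
    });
}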

How to cast media to default media receiver of the Chromecast device?

I am trying to cast media to the default media receiver of a Chromecast from an Android app, but it doesn't cast at all. The following is the code snippet used to find the routes:
MediaRouteSelector selector = new MediaRouteSelector.Builder()
        .addControlCategory(CastMediaControlIntent
                .categoryForRemotePlayback(CastMediaControlIntent.DEFAULT_MEDIA_RECEIVER_APPLICATION_ID))
        .build();
Then it shows the Chromecast device on the Wi-Fi network, and the following code runs when the device is selected:
MediaRouter.RouteInfo route = adapter.getItem(position).routeInfo;
// select the route for usage
route.select();

// send the play control request with the video uri
route.sendControlRequest(
        new Intent(MediaControlIntent.ACTION_PLAY)
                .setDataAndType(videoUri, "video/mp4")
                .addCategory(MediaControlIntent.CATEGORY_REMOTE_PLAYBACK),
        new MediaRouter.ControlRequestCallback() {
            @Override
            public void onError(String error, Bundle data) {
                super.onError(error, data);
            }

            @Override
            public void onResult(Bundle data) {
                super.onResult(data);
            }
        });
It can't cast the media to the device. Any suggestions?
It seems like you are not using the Cast SDK but the Media Route Provider directly. I don't see any session being set up; you might want to look at the democastplayer sample code that is distributed with the Android SDK (under the SDK folder, go to extras/google/google_play_service/samples/cast/democastplayer). In that sample, look at the MrpCastPlayerActivity class.
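For comparison, here is a rough sketch of the same flow with the newer Cast Application Framework (play-services-cast-framework), where CastContext manages the session. The video URL is a placeholder, and this assumes an OptionsProvider configured with the default media receiver app ID and an already established cast session:

// Load a video on the default media receiver through the current cast session.
CastSession castSession = CastContext.getSharedInstance(context)
        .getSessionManager()
        .getCurrentCastSession();
if (castSession != null && castSession.isConnected()) {
    MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/video.mp4") // placeholder URL
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("video/mp4")
            .build();
    RemoteMediaClient remoteMediaClient = castSession.getRemoteMediaClient();
    if (remoteMediaClient != null) {
        remoteMediaClient.load(new MediaLoadRequestData.Builder()
                .setMediaInfo(mediaInfo)
                .setAutoplay(true)
                .build());
    }
}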
