I created a chat application that uses the Twilio SDK. Everything works fine, but after a video call ends I get this notification, which doesn't go away whatever I do. Once it appears, initiating another video call no longer works. Maybe the SDK is still holding on to a resource such as the microphone, which would prevent the app from starting a new video call; the Android system shows that the microphone is still being used. This is happening on my Redmi 9C. I have already searched everywhere on the net but found no solution; some sources state it's a bug on certain devices, but there must be something that can be done to resolve it. Help would be appreciated, thank you.
This is the code that is used to disconnect from a room:
private fun killAllVideoProcess() {
    if (localVideoTrack != null) {
        if (localParticipant != null) {
            localParticipant!!.unpublishTrack(localVideoTrack!!)
        }
        localVideoTrack!!.release()
        localVideoTrack = null
    }
    if (room != null && room!!.state != Room.State.DISCONNECTED) {
        room!!.disconnect()
        disconnectedFromOnDestroy = true
    }
    if (localAudioTrack != null) {
        localAudioTrack!!.release()
        localAudioTrack = null
    }
    // Redundant: the video track was already released and nulled above.
    if (localVideoTrack != null) {
        localVideoTrack!!.release()
        localVideoTrack = null
    }
} // killAllVideoProcess ends
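Nothing below is from the original post; it is only a hedged sketch of the kind of audio-session cleanup that is sometimes paired with a teardown like the one above when Android keeps reporting the microphone as in use. The helper name is made up, and the exact calls depend on how the call configured the AudioManager in the first place:

import android.content.Context
import android.media.AudioManager

// Hypothetical helper: reset the audio session the call was using.
private fun resetCallAudio(context: Context) {
    val audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    audioManager.mode = AudioManager.MODE_NORMAL   // leave "in communication" mode
    audioManager.isSpeakerphoneOn = false
    audioManager.isMicrophoneMute = false
    // If audio focus was requested for the call, abandon it here using the same
    // listener/focus request that was used to acquire it.
}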
I am developing an accessibility service for Android. The service opens another app that contains a RecyclerView, and I want to click on one of the RecyclerView's elements with performAction(AccessibilityNodeInfo.ACTION_CLICK), but it is not working. I know there are a few similar questions, but none of them works for me. I also checked the official documentation for the class that provides the performAction method: https://developer.android.com/reference/android/view/accessibility/AccessibilityNodeInfo
This is my code:
@Override
public void onAccessibilityEvent(AccessibilityEvent event) {
    AccessibilityNodeInfo source = event.getSource();
    if (source != null) {
        List<AccessibilityNodeInfo> list = source.findAccessibilityNodeInfosByText("mystring");
        list.get(0).performAction(AccessibilityNodeInfo.ACTION_CLICK);
    }
}
This is my configuration xml file:
<accessibility-service xmlns...
    android:accessibilityFeedbackType="feedbackGeneric"
    android:accessibilityFlags="flagDefault"
    android:canPerformGestures="true"
    android:canRetrieveWindowContent="true"
I think I misunderstood something, but I don't know what it could be. Any help is appreciated.
The simple answer is that while finding the node by text is fine, that particular node is not the node with the desired onClick handler. The solution is to call
list.get(0).getParent().performAction(AccessibilityNodeInfo.ACTION_CLICK)
The clarifying discussion is below.
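As a more general variant of the same idea (not from the original answer, just a sketch using the standard AccessibilityNodeInfo API), you can walk up from the matched node until you reach an ancestor that reports isClickable, since the text match often lands on a TextView inside the clickable row; the helper name is mine:

import android.view.accessibility.AccessibilityNodeInfo

// Walk up from the matched node to the nearest ancestor that is actually clickable,
// then click that ancestor. Returns false if no clickable ancestor exists.
fun clickNearestClickableAncestor(node: AccessibilityNodeInfo): Boolean {
    var current: AccessibilityNodeInfo? = node
    while (current != null && !current.isClickable) {
        current = current.parent
    }
    return current?.performAction(AccessibilityNodeInfo.ACTION_CLICK) ?: false
}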
I think .performAction(AccessibilityNodeInfo.ACTION_CLICK) is right, but there might be some other concerns. Sorry for posting this as an answer, but a comment is too small.
Are you sure onAccessibilityEvent is being called? I don't think that is the right event, but I can't be sure. Maybe put a log in there to ensure it's called when you expect it to be.
Also, searching from the event's source might restrict your search; instead of event.getSource(), try using rootInActiveWindow (I use Kotlin, where it's a property; see https://developer.android.com/reference/android/accessibilityservice/AccessibilityService#getRootInActiveWindow(int))
EDIT: 28 March 2022
I have run this code on my own accessibility service and it does click the button, but it's very prone to an infinite loop, because the click triggers new accessibility events.
var ranOnce = false // prevent overflow

override fun onAccessibilityEvent(event: AccessibilityEvent?) {
    if (event == null) return
    if (event.eventType == TYPE_WINDOW_STATE_CHANGED) return
    if (event.source != null && !ranOnce) {
        val nodeList = rootInActiveWindow.findAccessibilityNodeInfosByText("Menu")
        //event.source.findAccessibilityNodeInfosByText("Menu") // <-- always nothing in list
        Log.d("onAccessibilityEvent", "List of nodes: $nodeList")
        if (nodeList.size > 0) {
            Log.d("onAccessibilityEvent", "Node info: ${nodeList[0]}")
            ranOnce = true
            nodeList[0].performAction(AccessibilityNodeInfo.ACTION_CLICK) // <-- caused an infinite loop!
        } else {
            Log.d("onAccessibilityEvent", "No nodes found")
        }
    } else {
        Log.d("onAccessibilityEvent", "event.source is null!")
    }
}
How can I know that my app was opened by Google Assistant, rather than launched normally?
I don't need App Actions. I just want to know that my app was opened with "OK Google -> Open appname" instead of by tapping the icon or resuming it from recents.
Is there an intent/any data in the bundle that I can check for that?
This is my intent when I do "Open appname"
Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10000000 pkg=com.xelion.android cmp=com.xelion.android/.activity.InitializationActivity (has extras) }
And it has extras, but I don't know what they contain:
Bundle[mParcelledData.dataSize=220]
EDIT:
I found out that this is the flag set when opening with Google Assistant:
intent.flags == 0x10000000
But my problem is that this flag is also present when I install the app from my machine or update it. Any idea how to avoid that?
EDIT2:
I have also tried:
private fun getReferrerCompatible(activity: Activity): Uri? {
    val intent = activity.intent
    val referrerUri: Uri? = intent.getParcelableExtra(Intent.EXTRA_REFERRER)
    if (referrerUri != null) {
        return referrerUri
    }
    // REFERRER_NAME is the string extra "android.intent.extra.REFERRER_NAME"
    val referrer = intent.getStringExtra(REFERRER_NAME)
    if (referrer != null) {
        // Try parsing the referrer URL; if it's invalid, return null
        try {
            return Uri.parse(referrer)
        } catch (e: ParseException) {
            return null
        }
    }
    return null
}
But I still get null as the referrer.
I am checking intent.extras?.get(KEY_REF_NAME) == REG_G_ASSISTANT and calling getReferrerCompatible() from onCreate. Should it be later, like in onResume?
When opened through Google Assistant, the android.intent.extra.REFERRER_NAME extra will be android-app://com.google.android.googlequicksearchbox/https/www.google.com
val KEY_REF_NAME = "android.intent.extra.REFERRER_NAME"
val REG_G_ASSISTANT = "android-app://com.google.android.googlequicksearchbox/https/www.google.com"

if (intent.extras?.get(KEY_REF_NAME) == REG_G_ASSISTANT) {
    // APP OPENED THROUGH GOOGLE ASSISTANT
} else {
    // APP OPENED THROUGH DEFAULT LAUNCHER
}
Based on the response that theapache64 gave and this link:
https://github.com/allegro/slinger/blob/master/slinger/src/main/java/pl/allegro/android/slinger/ReferrerMangler.java
Because the extra was returning null on Android 10, and since my min SDK is 23 (so I do not need to implement logic for below M), I ended up with the following code:
val REG_G_ASSISTANT = "com.google.android.googlequicksearchbox"

if (referrer != null && referrer.toString().contains(REG_G_ASSISTANT)) {
    // code to do
}
This is Kotlin, inside an Activity; the equivalent of referrer in Java would be:
activity.getReferrer();
If you are running on an OS below API 23, the referrer can be obtained like this:
val KEY_REF_NAME = "android.intent.extra.REFERRER_NAME"
intent.extras?.get(KEY_REF_NAME)
Given that theapache64 tested on a OnePlus 6, I assume this should work up to API level 28 (Pie) on some devices. But to be sure, I would recommend using activity.getReferrer().
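Putting the two approaches together, here is a minimal Kotlin sketch; the function name and the API-level branch are mine, and with a min SDK of 23 the fallback branch is never taken and is shown only for completeness:

import android.app.Activity
import android.net.Uri
import android.os.Build

private const val ASSISTANT_PACKAGE = "com.google.android.googlequicksearchbox"

fun launchedFromAssistant(activity: Activity): Boolean {
    // Activity.getReferrer() exists from API 22 onwards; on older devices fall back
    // to the string extra discussed earlier in this thread.
    val referrer: Uri? = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP_MR1) {
        activity.referrer
    } else {
        activity.intent.extras?.getString("android.intent.extra.REFERRER_NAME")?.let(Uri::parse)
    }
    return referrer?.toString()?.contains(ASSISTANT_PACKAGE) == true
}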
I'm using SignalR. I can receive messages and show a notification when the app is open or in the background, but when the application is closed I can't receive messages. Does anyone have a solution for this?
I have the following code in MainActivity OnCreate:
ActivityChat.connection.received(json -> runOnUiThread(new Runnable() {
    @RequiresApi(api = Build.VERSION_CODES.O)
    @Override
    public void run() {
        JsonObject jsonObject = json.getAsJsonObject();
        if (jsonObject != null && jsonObject.has("A")) {
            jsonArray = jsonObject.getAsJsonArray("A");
            String method = jsonObject.get("M").getAsString();
            //PushNotifications(method, jsonArray);
            if (method.equals("addNewMessage")) {
                if (jsonArray != null && jsonArray.size() != 0) {
                    if (jsonArray.get(2).getAsString().equals(driverID)) {
                        if (TransportClass.showCaseView != null && !TransportClass.showCaseView.isShowing()) {
                            Notificate();
                        }
                    }
                }
            }
        }
    }
}));
This is by design, but not by the design of the SignalR client itself; it is a design consideration when building a mobile app on either native iOS or Android. When the app isn't in the foreground, it generally isn't running in the background either. AFAIK, in most cases the app is only running while you have it open on the phone. That is when the SignalR connection is active: the moment you close the app, you also close the connection.
You'd want to use something like push notifications here to tell the user to open the app; once they do, the real-time connection will be there again.
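For example, here is a minimal Kotlin sketch of the notification side, assuming Firebase Cloud Messaging is wired into the project and the backend that hosts the SignalR hub sends a data push when the recipient has no active connection; the service name, channel id, and payload keys below are placeholders, not part of the SignalR setup above (and the service still has to be declared in the manifest):

import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context
import android.os.Build
import androidx.core.app.NotificationCompat
import com.google.firebase.messaging.FirebaseMessagingService
import com.google.firebase.messaging.RemoteMessage

class ChatPushService : FirebaseMessagingService() {

    override fun onMessageReceived(remoteMessage: RemoteMessage) {
        // Delivered even when the app process is not running, unlike the SignalR connection.
        val title = remoteMessage.data["title"] ?: "New message"
        val body = remoteMessage.data["body"] ?: ""

        val manager = getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            manager.createNotificationChannel(
                NotificationChannel("chat", "Chat", NotificationManager.IMPORTANCE_HIGH)
            )
        }
        val notification = NotificationCompat.Builder(this, "chat")
            .setSmallIcon(android.R.drawable.ic_dialog_email)
            .setContentTitle(title)
            .setContentText(body)
            .setAutoCancel(true)
            .build()
        manager.notify(1, notification)
    }
}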
I am using TokBox for an Android project. I need to add a button that turns the flash light on in torch mode.
The TokBox Publisher object already provides a swapCamera() method that switches between all the available cameras of the device, but I couldn't find any API to change the flash light mode of the currently selected camera.
I tried getting an instance of android.hardware.Camera myself to change it manually, but it didn't work: I got a "java.lang.RuntimeException: Fail to connect to camera service" exception, because the Camera object is being used by TokBox and is not released.
I could find no way to access the Camera instance that TokBox is using either, and that API has been deprecated since Android API level 21 anyway.
Can anyone suggest a way to change the camera parameters? I have access to the View that the video is being previewed on.
I needed to stop the stream to be able to start the camera app to take a picture, and I found code to release the camera and attach it again. Maybe you can use this code to release the camera, turn on the light, and then attach the camera again.
The following code releases the camera:
public void ReleaseCamera()
{
    if (_myPub != null) {
        _myPub.PublishVideo = false;
        BaseVideoCapturer bvc = _myPub.Capturer;
        if (bvc != null) {
            bvc.StopCapture();
            bvc.Destroy();
        }
    }
}
And this code attaches the camera again:
public void AttachCamera()
{
    if (_myPub != null) {
        BaseVideoCapturer bvc = _myPub.Capturer;
        if (bvc != null) {
            if (bvc.IsCaptureStarted == false) {
                bvc.Init();
                bvc.StartCapture();
                _myPub.PublishVideo = true;
            }
        }
    }
}
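The answer above is Xamarin/C#. As a rough Kotlin sketch of the same release-then-toggle idea, Camera2's CameraManager.setTorchMode() can switch the torch while no app (including the TokBox capturer) holds the camera; the function name is mine, the call needs API 23+, and the torch may turn off again once the capturer reopens the camera:

import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Toggle the torch between ReleaseCamera() and AttachCamera().
fun setTorch(context: Context, enable: Boolean) {
    val cameraManager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    // Pick the first back-facing camera that actually has a flash unit.
    val cameraId = cameraManager.cameraIdList.firstOrNull { id ->
        val chars = cameraManager.getCameraCharacteristics(id)
        chars.get(CameraCharacteristics.FLASH_INFO_AVAILABLE) == true &&
            chars.get(CameraCharacteristics.LENS_FACING) == CameraMetadata.LENS_FACING_BACK
    } ?: return
    // Only succeeds while the camera is not opened by any app.
    cameraManager.setTorchMode(cameraId, enable)
}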
The torch light will only work with the back camera, so if you are publishing video with the front camera it will freeze the TokBox video.
if publisher.cameraPosition == .back {
    if let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo), device.hasTorch {
        do {
            try device.lockForConfiguration()
            let torchOn = !device.isTorchActive
            try device.setTorchModeOnWithLevel(1.0)
            device.torchMode = torchOn ? .on : .off
            device.unlockForConfiguration()
        } catch {
            print("error")
        }
    }
}
I've made a BLE (Bluetooth 4.0 LE) app.
This app sends byte data to a BT device.
It worked when I tested it while developing this function, but as soon as I send (write), the device gets disconnected.
Why does it disconnect?
This happens especially on LG smartphones.
Please help me.
//////////////
public static void Send_Data(byte[] data) {
    if (mByteCharacteristic != null) {
        mByteCharacteristic.setValue(data);
        mByteCharacteristic.setWriteType(BluetoothGattCharacteristic.WRITE_TYPE_NO_RESPONSE);
        if (bluetoothGatt != null) {
            bluetoothGatt.writeCharacteristic(mByteCharacteristic);
        }
    }
}
///////////////
Make sure that:
(mByteCharacteristic.getProperties() & BluetoothGattCharacteristic.PROPERTY_WRITE_NO_RESPONSE) != 0x0
Otherwise, try omitting the setWriteType() call. By default the characteristic should be using the correct write type.
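For illustration, here is a minimal Kotlin sketch of that check; the function name mirrors the question's Send_Data, and everything else is standard BluetoothGatt API:

import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCharacteristic

// Choose the write type based on what the characteristic actually supports.
fun sendData(gatt: BluetoothGatt?, characteristic: BluetoothGattCharacteristic?, data: ByteArray) {
    if (gatt == null || characteristic == null) return
    characteristic.setValue(data)
    characteristic.writeType =
        if (characteristic.properties and BluetoothGattCharacteristic.PROPERTY_WRITE_NO_RESPONSE != 0) {
            // The peripheral advertises "write without response", so this type is safe to use.
            BluetoothGattCharacteristic.WRITE_TYPE_NO_RESPONSE
        } else {
            // Otherwise stick to the default write type, which waits for a response.
            BluetoothGattCharacteristic.WRITE_TYPE_DEFAULT
        }
    gatt.writeCharacteristic(characteristic)
}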