I want to make my own accessibility service using the APIs, but when some buttons are clicked, no event of type VIEW_CLICKED is logged.
In accessibility.xml, I set android:accessibilityFlags="flagDefault|flagIncludeNotImportantViews".
I'm not sure why, when I click the Accept and Continue button in Chrome (as part of the first-run activity), no click event is logged.
Any help would be greatly appreciated.
You need to specify android:accessibilityEventTypes. To intercept clicks you need to include the typeViewClicked option.
Something like this:
android:accessibilityEventTypes="typeViewClicked"
For more information, you can check the Android documentation.
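As a rough sketch of the service side (the class name and log tag here are placeholders, not from the question; the important part is that TYPE_VIEW_CLICKED events are only delivered when typeViewClicked, or typeAllMask, is listed in android:accessibilityEventTypes):

import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;

public class ClickLoggerService extends AccessibilityService {

    private static final String TAG = "ClickLogger";

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Only fires for clicks if typeViewClicked is part of
        // android:accessibilityEventTypes in accessibility.xml.
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_CLICKED) {
            Log.d(TAG, "Clicked: " + event.getPackageName() + " / " + event.getClassName());
        }
    }

    @Override
    public void onInterrupt() {
        // Required override; nothing to clean up in this sketch.
    }
}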
I have a simple Apache Flex view based application that runs on Android as follow:
<f:MyView xmlns:fx="http://ns.adobe.com/mxml/2009"
          xmlns:s="library://ns.adobe.com/flex/spark"
          activate="activateHandler(event)"
          deactivate="deactivateHandler(event)"/>
I expect activateHandler() to execute only once, when the view is activated. However, after I pop and then push the same view back, the number of activateHandler() executions increases by the number of pop-and-push operations I performed. Why does this happen, and how can I force it to behave as expected (i.e., execute only once)?
Expanding on #JileniBouguima's answer, changing activate to creationComplete will resolve this because of how those events work. Your expectation that activateHandler() executes only once is a little off; according to the Activate event documentation, activate fires
every time your application gains operating system focus and becomes active.
By contrast, creationComplete fires once per object, once the component is created.
I am not sure exactly what code is in the activateHandler and deactivateHandler handlers, but in Flex it is standard practice to remove an event listener once you no longer need it. I am assuming that whenever you pop and push the same view, the listener is being added and removed again. I can help you more if you share the handler code.
Change activate to creationComplete.
Is there a way in Calabash Android where I can send my app to the background? In other words, simulate the device/hardware Home button?
Secondly, can the app be brought back into the foreground?
This can be done in the following way:
Then /^I go home$/ do
  system "#{default_device.adb_command} shell input keyevent KEYCODE_HOME"
end
P.S. You can also add sleep <some_value_in_seconds> after the system "..." line if necessary.
As far as I know, that action does not currently exist. You can find (mostly) all of the available operations here: https://github.com/calabash/calabash-android/tree/master/ruby-gem/lib/calabash-android. Most of the interesting options are in the operations.rb file. The performAction method would be the one most likely to help you, as it has 'go_back' and 'press_menu' actions, but currently no 'go_home'. When you're in a calabash console, you can type performAction 'list_actions' to see all possible actions. I'm not sure if it's a reasonable workaround, but you could try something like this:
until (query "*").empty? do
  performAction 'go_back'
end
This will simply press the back button until you've arrived at the home screen. If you'd like to get back to your app, however, you'd need to re-run start_test_server_in_background, as you will not be able to get any query information from the home screen. Anyway, good luck and I hope I could help out at least a little!
Hi!
I'm trying to develop an app that will do one of these two options:
1. Whenever a user selects text in any app (using the regular copy/paste flow), there will be another option besides copy/cut that opens my app. (Preferred.)
2. My app listens for a copy-text event in another app and shows a notification to the user; when they press it, it opens my app.
I haven't found a way to do either of these options. Some claim that I can listen to the copy activity, but I couldn't understand how to do it (Android: How to listen to long-click events in any text area in other applications?).
I'm not looking for anyone to write the code for me, of course; just a pointer in the right direction if this is possible.
Thanks in advance,
Shahar
After a short investigation, I don't think it is possible. You may look at this guide, but it says copy/paste in Android is implemented using ClipboardManager, which is part of the system, and it doesn't provide any hooks or interfaces to intercept copy/paste events in another app.
Sure, you could hack deeper into this mechanism on a rooted device, but that will not work for all users.
Very late response, but you can create a custom text selection action: an activity with an intent filter for the android.intent.action.PROCESS_TEXT action. Whenever a user selects text, a text selection menu appears, and your action will be available in it too.
When the user taps your custom text selection action, the activity configured with android.intent.action.PROCESS_TEXT is opened for them.
Check this article for implementation steps: Custom text selection action.
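As a rough sketch of the activity side (assuming the manifest declares an intent filter for android.intent.action.PROCESS_TEXT with mimeType "text/plain"; requires API 23+, and the class name and log tag are placeholders):

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.util.Log;

public class ProcessTextActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The selected text is delivered in EXTRA_PROCESS_TEXT (API 23+).
        CharSequence selected = getIntent()
                .getCharSequenceExtra(Intent.EXTRA_PROCESS_TEXT);
        Log.d("ProcessText", "User selected: " + selected);
        // Handle the text here (open your UI, store it, etc.), then finish.
        finish();
    }
}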
I'm starting to develop an app and I'm stuck trying to open a "window" when the screen is touched in the widget.
In the SDK code examples, we can see this in the Event widget: when you tap the screen, a "window" is opened, and you can see the events there.
I followed the code to see how I can do this:
In NotificationWidgetExtension (SmartExtensionUtils), in the onTouch event:
Intent intent = new Intent(Widget.Intents.WIDGET_ENTER_NEXT_LEVEL_INTENT);
sendToHostApp(intent);
In WidgetExtension:
protected void sendToHostApp(final Intent intent) {
    intent.putExtra(Widget.Intents.EXTRA_AEA_PACKAGE_NAME, mContext.getPackageName());
    intent.setPackage(mHostAppPackageName);
    mContext.sendBroadcast(intent, Registration.HOSTAPP_PERMISSION);
}
I'm trying to replicate this but haven't been successful. Is there any place I can read about it, or can someone help me with this?
I think you are looking at three different APIs - the Widget API, the Notification API and the Control API.
The onTouch code that you reference is part of the SDK's utility classes, and is there to help you if you have made an extension that implements both the Notification API and the Widget API. When you are in the widget view on the watch and have overridden the NotificationWidgetExtension class, the onTouch event will be delivered to the mentioned method. It will basically show the first available (and unread) notification.
You mention "window" etc., and I am guessing that you want to create an application on the SmartWatch. This is done through the Control API. Take a look at the SampleControlExtension contained in the SDK. Check this answer to get information on how to start a Control extension from your own extension. For example, if you create a Widget + Control extension, you could start the Control extension in the onTouch method of your widget.
I'm having a problem here with Android + Java + cloud. My onClick event only works once.
The event only works again when I go back to the previous screen and then re-enter the screen where I do the cloud searches...
I think you are only supposed to call setContentView() once; calling it again replaces the entire view hierarchy, so the listeners you set up are lost. Since you are doing it in your click listener, that might be why it is failing after the first time.
Edit: What does it do when it fails? Force close? Nothing? Something unexpected?
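For illustration, here is a rough sketch of the restructuring I mean: setContentView() is called once in onCreate(), and the click listener only updates views. The layout and view IDs (R.layout.search_screen, R.id.search_button, R.id.result_text) are placeholders, not from your code.

import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.widget.TextView;

public class SearchActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.search_screen); // inflate the layout once

        final TextView result = (TextView) findViewById(R.id.result_text);
        findViewById(R.id.search_button).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                // Don't call setContentView() here; just update the existing
                // views (or start the cloud search and update them in its callback).
                result.setText("Searching...");
            }
        });
    }
}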