I'm currently writing documentation for an app, and was curious about the proper way to reference an on-screen user interaction.
i.e.: To advance to the settings screen, tap/touch/click the settings icon.
Since Android is available on so many form factors, including TV, is it 'tap', 'touch', 'click', or something else entirely that encompasses everything? I've checked some other apps' docs and they all vary.
Thanks in advance.
The SDK documentation (agreed, this is for developers, not end users) seems to use the word touch.
See, for example, the Handling UI Events section, in which you'll find (quoting):
This is called when the user either touches the item (when in touch mode), or focuses upon the item with the navigation-keys or trackball and presses the suitable "enter" key or presses down on the trackball.
Or:
For a touch-capable device, once the user touches the screen, the device will enter touch mode.
I am setting up a function that will deliver sounds based on the item currently selected by the TalkBack accessibility service. My issue is that I cannot find a way to actually ask Android what TalkBack has currently selected.
You can use the Android Text-to-Speech (TTS) engine provided by the system. Using it, you can play audio for a given piece of text, e.g. on a new message event or whenever the user clicks on the text.
// The enclosing Activity must implement TextToSpeech.OnInitListener
// (hence the second `this`); only call speak() after onInit()
// reports TextToSpeech.SUCCESS, or nothing will be spoken.
TextToSpeech tts = new TextToSpeech(this, this);
String text = "message_text_string";
tts.speak(text, TextToSpeech.QUEUE_FLUSH, null);
I don't know the specific answer for Android (perhaps you can find one), but in general you can't know where a screen reader user is currently reading.
You must distinguish between the system focus and the current reading position. They aren't the same.
When the system focus is moved, for example programmatically or when pressing Tab on a keyboard, the reading cursor in principle follows.
The converse isn't true: when the user reads the page with the arrow keys, or by swiping on mobile devices, the reading position moves but the system focus stays in place.
You can find out where the system focus is at any time (for example, document.activeElement in JavaScript), but there is generally no equivalent for finding out where the reading cursor is.
Philosophically, this is fine as it is. To take a parallel: you have no way to know exactly where on the screen a sighted user is looking (except with very specific hardware).
For your particular case, the best you can do is probably to rely on the system focus to decide whether or not to play a sound.
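That said, Android does expose TalkBack's reading cursor to an AccessibilityService: moving accessibility focus fires a TYPE_VIEW_ACCESSIBILITY_FOCUSED event (API 16+). Below is a minimal sketch under that assumption; the service class name is a placeholder, and the service must be declared (and enabled by the user) with permission to retrieve window content.

```java
// Sketch: observe where TalkBack's accessibility focus currently is.
// Assumes API 16+ and a manifest-declared accessibility service with
// canRetrieveWindowContent enabled; SoundFeedbackService is a placeholder name.
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class SoundFeedbackService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Fired when TalkBack moves its reading cursor to a new item.
        if (event.getEventType() == AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
            AccessibilityNodeInfo node = event.getSource();
            if (node != null) {
                // Decide which sound to play based on the focused node.
                CharSequence label = node.getText();
                node.recycle();
            }
        }
    }

    @Override
    public void onInterrupt() { }
}
```

This only tells you which node holds accessibility focus, not where in the node's text the user is reading, which matches the limitation described above.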
First of all, I would like to tell you what I want to do: I want to get some iOS behavior on my Android device. Of course, it's just a little detail.
The behavior I'm talking about works like this on iOS:
1. The user doesn't use the device.
2. After a short time, the device dims its screen.
3. Now the user has to tap somewhere on the display to reactivate the device.
4. THIS IS THE BEHAVIOR I WANT: If the user taps on the screen, the screen will just become active again. The tap itself will NOT cause any other action.
On Android, it's almost the same behavior, except for step 4: if the user taps on the screen to prevent standby, the tap will already trigger actions in the app, on the home screen, or wherever you are.
I decided to develop a small Accessibility Service. This service will show an overlay when the device is inactive and dims its display. Clicking on the overlay will just close it. The overlay itself is no problem and it's already working.
My problem is: I don't know how to find out when the display is dimmed because of inactivity.
My ideas are:
Listen to the Intent.ACTION_SCREEN_OFF event (https://developer.android.com/reference/android/content/Intent).
--> It's working, but it's too late :-( The screen is already completely off by the time the event is fired.
Check whether the device is inactive/idle.
--> Is there any way to get the status of the whole device? I haven't found anything about that.
Or maybe somebody has entirely different ideas?
Thanks for your help.
Greetings
Patrick
You can keep the window's screen on and, using a TimerTask, manually dim the window's brightness by some percentage every, say, 5 seconds.
Then, when the user taps the overlay, restore the brightness.
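The dimming loop described above can be sketched in plain Java. Android's window brightness is a float in [0, 1], applied via WindowManager.LayoutParams.screenBrightness (shown here only as a comment, since this sketch runs outside the framework); the step factor, floor, and 5-second period are assumptions.

```java
import java.util.Timer;
import java.util.TimerTask;

public class DimmerSketch {
    // Multiply brightness by DIM_FACTOR each tick, never going below FLOOR.
    public static final float DIM_FACTOR = 0.8f;
    public static final float FLOOR = 0.05f;

    public static float dimStep(float brightness) {
        return Math.max(FLOOR, brightness * DIM_FACTOR);
    }

    private float brightness = 1.0f;

    public void startDimming(Timer timer) {
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override public void run() {
                brightness = dimStep(brightness);
                // On Android you would now apply the value, e.g.:
                // WindowManager.LayoutParams lp = getWindow().getAttributes();
                // lp.screenBrightness = brightness;
                // getWindow().setAttributes(lp);
            }
        }, 5000, 5000); // every 5 seconds, as the answer suggests
    }

    public void restoreBrightness() {
        brightness = 1.0f; // the user tapped the overlay
    }
}
```

Keeping a floor above zero means the screen never looks fully off, so the overlay tap still has something visible to wake.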
Big Picture:
The user is on their device's home screen with an overlay from my application in an arbitrary corner. The overlay is a small icon with no functionality other than its presence (it can register when it has been touched).
Is it possible to know when the user has tapped OUTSIDE the overlay? I can tell when the user touches the overlay itself, but I would like to know when the user has touched the screen but not the overlay.
It does not matter what is being touched, just whether the screen is being touched at all.
Also, in the same scenario, is it possible to know that the user has clicked a button? For instance, the user opens the contacts application (or the camera, or any application); is there a way to detect that action? I don't care which application/button is clicked, just that one was clicked.
Just trying to learn what is possible, so please, no need to write out code - maybe just some pointers in the right direction. Thanks for the input.
For the first scenario, you can try chat heads (like Facebook Messenger's) using an IntentService.
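For detecting taps outside the overlay specifically, the framework mechanism is WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH: the overlay's window then receives a MotionEvent with ACTION_OUTSIDE when the screen is touched anywhere else. A sketch under the assumptions that you hold a Service context, have the "draw over other apps" permission, and run on API 26+ (TYPE_APPLICATION_OVERLAY; older releases used TYPE_PHONE):

```java
// Sketch: an overlay window that is notified about taps outside itself.
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;
import android.widget.ImageView;

public class OverlayTouchWatcher {
    public void attach(Context ctx) {
        WindowManager wm = (WindowManager) ctx.getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                // NOT_TOUCH_MODAL lets everything else keep receiving touches;
                // WATCH_OUTSIDE_TOUCH delivers ACTION_OUTSIDE to this window.
                WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                        | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
                PixelFormat.TRANSLUCENT);

        ImageView icon = new ImageView(ctx);
        icon.setOnTouchListener(new View.OnTouchListener() {
            @Override public boolean onTouch(View v, MotionEvent ev) {
                if (ev.getAction() == MotionEvent.ACTION_OUTSIDE) {
                    // The screen was touched somewhere outside the overlay.
                    return false;
                }
                // The overlay itself was touched.
                return true;
            }
        });
        wm.addView(icon, lp);
    }
}
```

Note that ACTION_OUTSIDE only tells you *that* a touch happened, not which button or app received it; observing other apps' clicks would require the accessibility APIs instead.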
I have been an Android user for more than 3 years now, and I am used to entering a screen (Activity), changing some settings, and pressing Back. The changes are saved automatically.
Now I am developing an app, and I wanted to use the Discard | Done buttons in the ActionBar. This Activity is a settings Activity, so the user changes some things and then presses Done. However, I am now confused: what should I do if the user presses Back? Should I also save the settings, or should I discard them?
To my mind, keep only the DISCARD button, and save the settings when the Back button is pressed.
For me, the system Back button should offer the same functionality as the DISCARD button. The DONE button shouldn't be disregarded - it's still common (in both mobile and desktop apps) to confirm an action, or to actively save/send information in a form.
Removing the DONE button, as Sporniket suggests, means there are two negative interactions (both equating to a cancel) and no confirmation to the user of a save action - as a user, I'd be left wondering what to do to save the information I'd entered or changed.
Using system Back as the default save action is counter-intuitive; the system Back button navigates backwards through the activity stack, and most users associate it with an "exit", not a save.
If you do decide to continue with your implementation (DISCARD only), then ensure you have some visual feedback that lets the user know the information has been saved, and that helps train (reassure) them that in your app, system Back saves their changes. This could be achieved by showing a Crouton when the user presses Back, displaying a message that the data has been saved.
-- Edit:
I should add that my recommendations above apply where the DISCARD/DONE pattern is appropriate. You mentioned in your question that you're used to changing settings, pressing Back, and having the changes saved automatically; I would suppose those are screens that are predominantly toggles, rather than content being edited.
Roman Nurik's post here offers more guidelines, and even mentions a way in which system Back saves information by default. In this instance, he describes having DONE replace the up affordance, and hiding the DISCARD button in the overflow menu, citing the use case where the user is unlikely to want to discard information. (IMHO, I disagree with him - I think that if there's a visible DONE or Save action, then system Back should discard, for the reasons stated above. That said, at least it's some guidance for the pattern, with usage guidelines from one of its proponents.)
Overall, I think this could be better answered if you gave more context about the information the user will be editing on this screen.
Is there a way to register a receiver for an app running in the background for when a user presses a key? Kind of like "ACTION_USER_PRESENT", but for any keys pressed on the screen.
MORE DETAIL: My app is running as a service in the background. The user opens the phone and presses keys, e.g. while searching for something online on their Droid. Can I capture those key presses in the background?
To detect whether a user is using the device, you could also use whether the screen is on or off as an approximation (assuming a screen timeout is set). This blog entry shows how to capture the screen-on and screen-off events (I haven't done it myself, though).
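A detail worth knowing for this approach: ACTION_SCREEN_ON and ACTION_SCREEN_OFF cannot be declared in the manifest; they are only delivered to receivers registered at runtime, typically from a long-running Service. A minimal sketch (the class name and the register helper are placeholders):

```java
// Sketch: track screen on/off as a proxy for user activity.
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

public class ScreenStateReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // Screen went off: treat the user as inactive.
        } else if (Intent.ACTION_SCREEN_ON.equals(intent.getAction())) {
            // Screen came back on: the user may be active again.
        }
    }

    // Call this from your Service's onCreate(), and unregister in onDestroy().
    public static ScreenStateReceiver register(Context ctx) {
        ScreenStateReceiver receiver = new ScreenStateReceiver();
        IntentFilter filter = new IntentFilter(Intent.ACTION_SCREEN_ON);
        filter.addAction(Intent.ACTION_SCREEN_OFF);
        ctx.registerReceiver(receiver, filter);
        return receiver;
    }
}
```

This still won't give you the individual key presses asked about (those are not observable from a background service for security reasons), only a coarse active/inactive signal.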
In the Android HCI Extractor ( http://code.google.com/p/android-hci-extractor/ ), we traverse the GUI and install some event filters (using listeners) on the app's top view.
Maybe if you can reach a top-level view, you could listen for all the events on that view. Let's try ;-)
The tool is an open-source prototype, and it is very easy to integrate and use. The tutorials show that only a few lines of code are needed: http://www.catedrasaes.org/trac/wiki/MIM
I hope it helps you!!