XML attribute usage "android:focusableInTouchMode" - android

Please tell me why the following XML attribute is used.
I looked up the documentation on developer.android.com but could not understand anything.
android:focusableInTouchMode

This blog post can help you to understand the meaning of touch mode.
The most relevant part :
The touch mode is a state of the view hierarchy that depends solely on the user interaction with the phone. By itself, the touch mode is something very easy to understand as it simply indicates whether the last user interaction was performed with the touch screen. For example, if you are using a G1 phone, selecting a widget with the trackball will take you out of touch mode; however, if you touch a button on the screen with your finger, you will enter touch mode. When the user is not in touch mode, we talk about the trackball mode, navigation mode or keyboard navigation, so do not be surprised if you encounter these terms. Finally, there is only one API directly related to touch mode, View.isInTouchMode().
So android:focusableInTouchMode="true" means that the view can get the focus when the phone is in touch mode.
Typically, an EditText is focusable in touch mode, while a Button is not.
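As a rough Java sketch of what the attribute does (the view ids here are hypothetical), android:focusableInTouchMode="true" corresponds to View.setFocusableInTouchMode(true), and View.isInTouchMode() is the API mentioned in the quote:

    // Inside an Activity's onCreate(), after setContentView(); the ids are hypothetical.
    EditText nameField = (EditText) findViewById(R.id.name_field);   // focusable in touch mode by default
    Button sendButton = (Button) findViewById(R.id.send_button);     // not focusable in touch mode by default

    // Programmatic equivalent of android:focusableInTouchMode="true".
    // Use it sparingly, because it changes the platform's normal focus behavior.
    sendButton.setFocusableInTouchMode(true);

    // The only API directly related to touch mode.
    if (nameField.isInTouchMode()) {
        nameField.requestFocus();   // works in touch mode because an EditText is focusable in touch mode
    }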

Passing click events from floating app to covered app

I have a floating app which works perfectly.
I am using OnTouchListener to catch events since I need to use the GestureDetector for swipes etc.
My only problem is that sometimes I wish to ignore certain events on the view.
In this case the view is invisible but not "gone" because I need it to accept certain gestures but not others.
I can't seem to do that.
Returning false from onTouch simply doesn't work.
I checked this by experiment: I disabled the GestureDetector and always returned false just to see what would happen, and nothing got through to the app below.
Is it even possible to pass a click through to a covered app?
For security reasons it's not possible to record a click and pass it to the app below (allowing that would essentially make it possible to build a keylogger).
The best you can do is keep your floating window small enough to receive the touches it needs while not covering too much of the screen below.
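For what it's worth, here is a minimal sketch of that approach, assuming API 26+ and the SYSTEM_ALERT_WINDOW permission; the class name and the placeholder view are hypothetical. The point is simply that a WRAP_CONTENT overlay window only ever receives touches that land on it, and everything else goes straight to the app below:

    import android.app.Service;
    import android.content.Intent;
    import android.graphics.PixelFormat;
    import android.os.IBinder;
    import android.view.Gravity;
    import android.view.View;
    import android.view.WindowManager;
    import android.widget.ImageView;

    // Hypothetical service showing a small floating bubble. Because the window is
    // only WRAP_CONTENT in size, touches outside it never reach your OnTouchListener;
    // they go straight to whatever app is underneath.
    public class FloatingBubbleService extends Service {

        private View bubbleView;
        private WindowManager windowManager;

        @Override
        public void onCreate() {
            super.onCreate();
            windowManager = (WindowManager) getSystemService(WINDOW_SERVICE);

            // Placeholder view; a real app would inflate its own bubble layout here.
            bubbleView = new ImageView(this);

            WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.WRAP_CONTENT,               // no wider than the bubble
                    WindowManager.LayoutParams.WRAP_CONTENT,               // no taller than the bubble
                    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,   // API 26+, needs SYSTEM_ALERT_WINDOW
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                            | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,  // touches outside the window pass below
                    PixelFormat.TRANSLUCENT);
            params.gravity = Gravity.TOP | Gravity.START;

            windowManager.addView(bubbleView, params);
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            windowManager.removeView(bubbleView);
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }
    }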

Android accessibility: skip nav links are not working

I have added "skip to main content" links to the header of my web app. They work as expected on Windows and macOS, and even on iPhone, but they do not work with Android Chrome/TalkBack.
When I checked further, these skip nav links do not work even on webaim.org. The code I referred to: https://www.bignerdranch.com/blog/web-accessibility-skip-navigation-links/
Could someone please explain why Android Chrome has this issue? Is it a browser behavior?
Let's analyze what is happening in this scenario.
1) TalkBack sets ACCESSIBILITY_FOCUS on the element
This is very important: notice that this says ACCESSIBILITY_FOCUS, not FOCUS. For a hidden skip nav link to become visible it needs to get FOCUS, since making the element visible (or shifting it on screen) via the :focus pseudo selector is a very common way to implement this pattern. It's very important that such a control receives FOCUS, which it never does with TalkBack.
2) The user double taps to click the element they just heard get focus.
When the user activates the control, a physical click event is sent to the middle of the on-screen focus rectangle for the control, similar to actually touching the screen.
3) The browser sees a physical click event occur on the page.
TalkBack has essentially sent a blind mouse-down event to an area of the page that contains nothing, or perhaps another control overlaid with this invisible element. Either way, the thing the browser was asked to click was never "visible", because it never properly obtained FOCUS, only ACCESSIBILITY_FOCUS, so the control is not there to be clicked.

What is focusable and focusableInTouchMode

I am new to Android development. Please help me out and, if possible, let me know what clipChildren, baseAligned, and baseAlignedChildrenIndex are. These are the things I have doubts about when creating an application.
The touch mode is something very easy to understand as it simply indicates whether the last user interaction was performed with the touch screen. For example, if you are using a G1 phone, selecting a widget with the trackball will take you out of touch mode;
In touch mode, there is no focus and no selection. Any selected item in a list or in a grid becomes unselected as soon as the user enters touch mode. Similarly, any focused widgets become unfocused when the user enters touch mode.
Now that you know focus doesn't exist in touch mode, I must explain that it's not entirely true. Focus can exist in touch mode but in a very special way we call focusable in touch mode. This special mode was created for widgets that receive text input, like EditText or, when filtering is enabled, ListView.
Focusable in touch mode is a property that you can set yourself either from code or XML. However, it should be used sparingly and only in very specific situations as it breaks consistency with Android normal behavior. A game is a good example of an application that can make good use of the focusable in touch mode property. MapView, if used in fullscreen as in Google Maps, is another good example of where you can use focusable in touch mode correctly.
For more detail, see the Android Developers blog post: http://android-developers.blogspot.co.at/2008/12/touch-mode.html

Multi-Touch Soft Keyboard Functionality In Android

I am developing a keyboard in Android (a braille keyboard, specifically). For those of you who do not know, braille is composed of six dots, and combinations of these dots form symbols. I wrote a keyboard using the KeyboardView class with the basic listener, but the problem is this: the default listener only allows for one key to be pressed at a time. To form the letter l requires three dots (dots 1, 2 and 3). I have to press each key individually, followed by the spacebar to write the letter. I would like the keyboard to function in a different way; all dots can be pressed simultaneously, and when the dots are released, the letter is written.
Am I right in thinking this is impossible with the listener bundled with KeyboardView? If I rewrite the onTouch routine in the KeyboardView, would this latter functionality be possible? Does onTouch detect multiple ACTION_DOWN events simultaneously? I was thinking that if this is possible, I can determine the x-y coordinates of the press, loop through the keys on the keyboard, and determine which keys are being pressed. If this is not possible, will I have to go a step below this and write onTouch for a basic view?
Thank you so much.
If you want to make some quick tests to a keyboard, like a new key layout or a new predictive text algorithm, KeyboardView is fine. If you're doing anything more complicated KeyboardView just doesn't have any real flexibility. None of the major keyboards use it (and I know that for sure, I wrote one of them and have talked to engineers at the others).
You could rewrite onTouch or put a touch listener on the existing keyboard view, but then you're going to have to do a lot of work inside their framework, such as hit testing to determine which key is touched. At that point it's probably easier to just write your own from scratch. I almost guarantee you will at some point anyway if you go all the way to a commercial product.
You can use onTouch to get multiple fingers down, but it won't be on ACTION_DOWN. You get an ACTION_DOWN for the first finger down, then an ACTION_POINTER_DOWN for each additional finger. This is how it works for any view. Then any motion on any finger will generate an ACTION_MOVE. When fingers are released you get an ACTION_POINTER_UP for all fingers until the last, which gets an ACTION_UP.
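To illustrate the pointer events described above, here is a rough sketch of a custom view that tracks a braille chord and commits it when the last finger lifts. getDotAt() and commitCharacter() are hypothetical helpers you would have to implement yourself (hit testing against your dot layout, and looking the chord up in a braille table):

    import android.content.Context;
    import android.util.AttributeSet;
    import android.util.SparseIntArray;
    import android.view.MotionEvent;
    import android.view.View;

    // Rough sketch of a multi-touch braille keyboard view. Each finger that goes
    // down adds a dot to the current chord; the chord is committed as a character
    // when the last finger is lifted.
    public class BrailleKeyboardView extends View {

        // Maps pointer id -> dot index, so we know which dot each finger is holding.
        private final SparseIntArray pressedDots = new SparseIntArray();

        public BrailleKeyboardView(Context context, AttributeSet attrs) {
            super(context, attrs);
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            int index = event.getActionIndex();

            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:          // first finger down
                case MotionEvent.ACTION_POINTER_DOWN:  // every additional finger down
                    int dot = getDotAt(event.getX(index), event.getY(index));
                    if (dot != -1) {
                        pressedDots.put(event.getPointerId(index), dot);
                    }
                    return true;

                case MotionEvent.ACTION_POINTER_UP:    // a finger lifted, others remain
                    return true;                       // keep the dot in the chord

                case MotionEvent.ACTION_UP:            // last finger lifted
                    commitCharacter(pressedDots);      // translate the chord into a letter
                    pressedDots.clear();
                    return true;
            }
            return super.onTouchEvent(event);
        }

        // Hypothetical: map an (x, y) position onto one of six dot regions laid out
        // in a 2x3 grid, or return -1 if the touch is outside the keyboard.
        private int getDotAt(float x, float y) {
            if (x < 0 || y < 0 || x >= getWidth() || y >= getHeight()) {
                return -1;
            }
            int column = x < getWidth() / 2f ? 0 : 1;
            int row = Math.min(2, (int) (y / (getHeight() / 3f)));
            return row * 2 + column;   // dots 0..5
        }

        // Hypothetical: look the chord up in a braille table and deliver the result,
        // e.g. via InputConnection.commitText() from the owning InputMethodService.
        private void commitCharacter(SparseIntArray dots) {
        }
    }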

Like voice over for Android

Good day to all!
I'm trying to implement a VoiceOver-like feature for Android (as on the iPhone), but for a specific application, not the entire operating system.
Imagine a screen with six equally sized buttons that occupy the entire activity.
When I "walk" my finger across the screen, I want to give focus to the button under it, capture the event when the button gains focus, and release the focus again as the finger leaves.
In short: as I drag my finger across the screen, if it is over a button, that button gets focus; if I keep dragging, focus moves to the next button without my lifting the finger off the screen.
Can anyone help me? Sorry for bad English.
I don't think you can use the Android Button class for this; instead, create a custom view, draw six rectangles, and write an onTouchEvent method that determines what sound to play based on where the user's finger is. See the Sudokuv4 example at http://pragprog.com/book/eband3/hello-android for some code you can use.
Well, you have to know the positions of the buttons. You can use the basic view functions to get them (getLeft(), and so on...).
After that you have to implement an OnTouchListener for the activity. In it, check where the event's x and y coordinates are and give focus to the corresponding view; once the pointer moves off that view, clear its focus.
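Here is a rough sketch of that idea, with hypothetical layout and id names. I've intercepted the events in Activity.dispatchTouchEvent() rather than a plain listener so the buttons cannot consume the gesture first, and it assumes each button has android:focusableInTouchMode="true" (see the first question above) so it can hold focus while the finger is down:

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import android.view.View;
    import android.view.ViewGroup;

    // Hypothetical activity: R.layout.activity_six_buttons contains a layout
    // (R.id.button_grid) whose direct children are the six buttons, each with
    // android:focusableInTouchMode="true" so it can take focus in touch mode.
    public class SixButtonActivity extends Activity {

        private ViewGroup buttonGrid;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_six_buttons);
            buttonGrid = (ViewGroup) findViewById(R.id.button_grid);
        }

        // Called for every touch event before it is dispatched to the views,
        // so the buttons cannot swallow the gesture before we see it.
        @Override
        public boolean dispatchTouchEvent(MotionEvent ev) {
            int action = ev.getActionMasked();
            if (action == MotionEvent.ACTION_DOWN || action == MotionEvent.ACTION_MOVE) {
                for (int i = 0; i < buttonGrid.getChildCount(); i++) {
                    View child = buttonGrid.getChildAt(i);
                    int[] location = new int[2];
                    child.getLocationOnScreen(location);   // screen coords, same space as getRawX()/getRawY()
                    boolean underFinger = ev.getRawX() >= location[0]
                            && ev.getRawX() < location[0] + child.getWidth()
                            && ev.getRawY() >= location[1]
                            && ev.getRawY() < location[1] + child.getHeight();
                    if (underFinger && !child.isFocused()) {
                        child.requestFocus();   // the previously focused button loses focus automatically
                    }
                }
                return true;   // consume the gesture ourselves
            }
            return super.dispatchTouchEvent(ev);
        }
    }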
