Appcelerator: Proper Way Automatically Dismiss Keyboard - android

I'm writing an app using Titanium. I want to be able to automatically dismiss the keyboard anytime something outside of the text field is clicked. I have yet to find an elegant solution for this issue.
A couple of things I've thought about, though I'm still looking for a better solution:
1. Assign event listeners to essentially everything else in the view and dismiss the keyboard (using textField.blur()). I want to avoid this since it results in a LOT of code just to dismiss the keyboard. It's also not very maintainable: if I add anything else to the view, I'll have to add a click listener to that object as well.
2. Create a large transparent view that takes up the entire screen, place it directly beneath the text field, and add a single click listener to it that dismisses the keyboard. This is a better solution than #1, but still isn't great because I've had a lot of trouble getting zIndexes to work properly. It's also inefficient for my purposes because I've got views with a specific width and height that encapsulate text fields; I've used these for the sake of code simplicity and I re-use them throughout my application.
I've tried adding a listener for the "blur" event on the text field, but it doesn't seem to fire reliably.
That's about it; I'm sort of at a loss. The zIndexing also behaves strangely on the iPhone, and I haven't tried Android yet. Also, as I mentioned above, many of the text fields I use are encapsulated within small views with set widths and heights, so I think that will affect how zIndexes behave.
So the root question is: What's the best way to dismiss a keyboard whenever anything outside the text field that's in focus is clicked?

If I'm correct, the click event propagates through all views and windows, so your option #1 could be modified: listen for clicks on the bottom-most layer (the view or the window), check the event's source, and then decide whether to dismiss the keyboard.

Related

Customizing AlertDialog.Builder with multiple Buttons and EditTexts programmatically

The issue I've been running into is creating an AlertDialog popup that includes multiple EditTexts and Buttons while also including the usual positive and negative buttons, plus an additional '+' Button that creates another 'set' of EditTexts and Buttons below the previous one.
A visual example.
I would also like to do this programmatically, if possible. The problem is setView(): it seems that method is meant to take only a single Button, EditText, or other view to show on the screen. Since I need more than one, and calling the method more than once just overrides the previous call, I've become stuck. The only thing I can find that might help is ViewGroups, but I haven't been able to figure out how to work with them.
Any information on this would be immensely helpful!
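For reference, here is a minimal sketch of the ViewGroup idea mentioned above: build a single vertical LinearLayout in code, add each 'set' of EditTexts and Buttons to it, and pass that one layout to setView() exactly once. The class and helper names (CommentSetDialog, buildRow) are illustrative, not from the question.

import android.app.Activity;
import android.app.AlertDialog;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.LinearLayout;

public class CommentSetDialog {

    // Builds one 'set': an EditText plus a Button on a single row.
    private static LinearLayout buildRow(Activity activity) {
        LinearLayout row = new LinearLayout(activity);
        row.setOrientation(LinearLayout.HORIZONTAL);
        row.addView(new EditText(activity));
        row.addView(new Button(activity));
        return row;
    }

    public static void show(final Activity activity) {
        // One vertical container holds every row; setView() is called exactly once.
        final LinearLayout container = new LinearLayout(activity);
        container.setOrientation(LinearLayout.VERTICAL);
        container.addView(buildRow(activity));

        Button plus = new Button(activity);
        plus.setText("+");
        plus.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                // Insert a new set just above the '+' button itself.
                container.addView(buildRow(activity), container.getChildCount() - 1);
            }
        });
        container.addView(plus);

        new AlertDialog.Builder(activity)
                .setTitle("Comments")
                .setView(container)
                .setPositiveButton("OK", null)
                .setNegativeButton("Cancel", null)
                .show();
    }
}

If the number of sets can grow large, wrapping the container in a ScrollView before passing it to setView() keeps the dialog usable.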

Codename One - AutoCompleteTF bad behavior

Another day, another bug...
I have three AutoCompleteTextFields with the filter overridden to get completion from my REST service, but my big problem is that the suggestion popups of those fields are click-through. So when I've already completed one of them, any click on the suggestion popup of another will trigger the underlying ACTF, which is already filled and so shows its own popup, making it impossible to select the item from the other ACTF's suggestion popup.
The two screenshots here show the situation; the ACTFs are the text fields hinted "Partenaire", "Contact..." and the already autocompleted one underneath.
On the second screenshot, I've tried to select the item over the third ACTF, so the first four results are from the third ACTF and the last four are from the "Partenaire" ACTF.
Is there a way to override something like onShow() for the popup and its hiding equivalent, so I could disable the other ACTFs when I type in one of them?
I think it's a good way to solve the problem, but I am open to any other idea :)
I forgot to mention it, but the problem occurs on Android and in the simulator; iOS doesn't have this problem.
Check that your UI has scrolling set up correctly: only one container in the hierarchy should be scrollable on the Y axis. By default the Form's content pane is scrollable on Y (unless it uses a border layout).
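As a rough illustration of that advice (a sketch only, with hints borrowed from the question's screenshots), keep the content pane as the single Y-scrollable container and make sure the container holding the AutoCompleteTextFields is not scrollable itself:

import com.codename1.ui.AutoCompleteTextField;
import com.codename1.ui.Container;
import com.codename1.ui.Form;
import com.codename1.ui.layouts.BoxLayout;

public class PartnerFormBuilder {

    public static Form buildForm() {
        // The content pane is the only container allowed to scroll on Y.
        Form form = new Form("Partenaire", BoxLayout.y());
        form.getContentPane().setScrollableY(true);

        // The fields live in a plain, non-scrollable container so the
        // suggestion popups don't have to fight a second scrollable ancestor.
        Container fields = new Container(BoxLayout.y());
        fields.setScrollableY(false);
        fields.add(new AutoCompleteTextField());   // "Partenaire"
        fields.add(new AutoCompleteTextField());   // "Contact..."

        form.add(fields);
        return form;
    }
}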

Showing an interactive floating layout during calls

Background
There are some nice apps out there that show some layout on top, while the user is making a call or answering one (like "current caller id").
I need to create an app with the ability to show something on top during a call, and allow it to be interactive.
The problem
Using a BroadcastReceiver, a foreground service and the SYSTEM_ALERT_WINDOW permission, I've succeeded in showing something on the screen during calls.
As long as the content being shown is static, I have no problems.
However, I've noticed that when I try to make the content interactive, I face some problems:
Everything is jumpy, and this includes not only animations but also setting visibility to visible/gone. I hate to think how it would behave when I need to make things draggable.
Not sure if this is the reason, but using the SlidingDrawer makes the entire width belong to the SlidingDrawer, and you cannot click through it. This means that if it sits at the bottom, you can't touch the "answer" button when someone calls you.
The question
What is the reason for those problems?
How can I fix them and be able to show things right?
How do other apps handle it correctly?
EDIT: about the SlidingDrawer, it seems to have terrible bugs regarding its location and size, and its content area cannot be touched through even when it isn't shown and the user can see through it. I still don't know why, or how to fix it, and I also don't know why things are so jumpy compared to normal apps (probably because of overdraw, but it's really, really slow).
Maybe this question should be more general: how to make a floating window like AirCalc's, one that can be moved easily yet still stays fast.
For the dragging functionality, I've tried to get the layoutParams of the root view that I show (which is of type WindowManager.LayoutParams), update it and set it again, but for some reason it didn't do anything. I wonder what I'm doing wrong.
EDIT: it seems that I should be using windowManager.updateViewLayout for updating the layoutParams. Using this post, I've made it perfectly draggable.
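For reference, a minimal sketch of that dragging approach, assuming the floating view was already added with windowManager.addView(floatingView, params); the helper class name is illustrative. The point is that the updated params are pushed back through windowManager.updateViewLayout() rather than through setLayoutParams():

import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Attaches drag handling to a floating view that was previously added with
// windowManager.addView(floatingView, params).
public final class FloatingDragHelper {

    public static void makeDraggable(final WindowManager windowManager,
                                     final View floatingView,
                                     final WindowManager.LayoutParams params) {
        floatingView.setOnTouchListener(new View.OnTouchListener() {
            private int startX, startY;
            private float touchX, touchY;

            @Override
            public boolean onTouch(View v, MotionEvent event) {
                switch (event.getAction()) {
                    case MotionEvent.ACTION_DOWN:
                        // Remember where the window and the finger started.
                        startX = params.x;
                        startY = params.y;
                        touchX = event.getRawX();
                        touchY = event.getRawY();
                        return true;
                    case MotionEvent.ACTION_MOVE:
                        // Move the window by the finger's offset and apply it
                        // through updateViewLayout(), which actually moves it.
                        params.x = startX + (int) (event.getRawX() - touchX);
                        params.y = startY + (int) (event.getRawY() - touchY);
                        windowManager.updateViewLayout(floatingView, params);
                        return true;
                }
                return false;
            }
        });
    }

    private FloatingDragHelper() {}
}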
OK, I've come to some conclusions about this. First, to answer my original questions:
The jumpiness is probably caused by overdraw and the views I've used. I wanted to try out libraries that could replace the SlidingDrawer, but each had a different problem; using simple views proved that the idea works in general.
In the case of visibility changes, it was jumpy because the view couldn't fit within the current WindowManager.LayoutParams size.
SlidingDrawer does have issues, since it takes up the whole size it gets whether closed or opened.
Now to the rest of the issues:
Unable to drag? Instead of the regular setLayoutParams, use windowManager.updateViewLayout (as in the sketch above).
Unable to touch outside of the view? Set its minimal size according to your needs. You can also set the window flags so that touch events pass through.
Want to listen to call events? You can use the BroadcastReceiver to trigger showing the app, but I suspect that hanging up the call might sometimes cause the intent to be received late. I think you can use TelephonyManager and listen to events there, from the service you run in the foreground (which I created just to make sure the app won't be killed in the middle); a sketch follows below.
If anyone else has questions, I can help.
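A minimal sketch of that TelephonyManager approach, assuming a foreground service along the lines described above; the service name is illustrative and the READ_PHONE_STATE permission is required:

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;

// Service that listens for call state changes directly via TelephonyManager,
// instead of relying only on the broadcast Intent (which can arrive late).
public class CallOverlayService extends Service {

    private TelephonyManager telephonyManager;

    private final PhoneStateListener callListener = new PhoneStateListener() {
        @Override
        public void onCallStateChanged(int state, String incomingNumber) {
            switch (state) {
                case TelephonyManager.CALL_STATE_RINGING:
                case TelephonyManager.CALL_STATE_OFFHOOK:
                    // Show the floating view here.
                    break;
                case TelephonyManager.CALL_STATE_IDLE:
                    // Hide / remove the floating view here.
                    break;
            }
        }
    };

    @Override
    public void onCreate() {
        super.onCreate();
        telephonyManager = (TelephonyManager) getSystemService(TELEPHONY_SERVICE);
        telephonyManager.listen(callListener, PhoneStateListener.LISTEN_CALL_STATE);
    }

    @Override
    public void onDestroy() {
        // Stop listening when the service goes away.
        telephonyManager.listen(callListener, PhoneStateListener.LISTEN_NONE);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}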

Properly Scrolling an EditText into view when focused

I've got an EditText, which is ultimately inside of a ScrollView. I've implemented a comment feature which takes you to a new activity, and automatically places focus in the edit text so that the user can immediately start writing his comment.
Unfortunately, it doesn't quite scroll the edittext into view, as you can see in the screenshot below:
I would like to see something more like this, where the EditText comes completely into view (see below). I already looked at android:windowSoftInputMode, and it seems like the default values should work ... and indeed, it does mostly work because it does scroll, just not enough.
So is there anything I can do to get the desired behavior? Thanks!
is your min SDK 3?
check this
Hope you have tried android:windowSoftInputMode="adjustPan" and check this.
I would give this a go.
You could also try to set this programmatically in the onCreate() method of your activity:
getActivity().getWindow().setSoftInputMode(WindowManager.LayoutParams.SOFT_INPUT_ADJUST_PAN);
"SOFT_INPUT_ADJUST_PAN" adjustment option is set to have a window pan when an input method is shown, so it doesn't need to deal with resizing but just panned by the framework to ensure the current input focus is visible.

View-specific IME?

I have a custom view for which I want the user to be able to enter characters from an app-defined set of characters. To do this, as I understand it, I need to write an input method service. The user not only needs to install it, but then needs to enable the IME in Settings > Language & keyboard, and then select the custom IME for use in the view.
This seems really crazy. I want this IME to be used for just one view in one application. I don't want it to be available system-wide or force the user to make global setting changes.
The only alternative I can see is defining my own in-app custom view and simulating an IME (probably a full-screen one) when the view gains focus. Isn't there anything better?
I do not think IMEs are conceived for that kind of task. Their purpose is to allow user input in a standardized way that can be used across multiple applications from different vendors.
My strategy would be similar to what you are thinking:
prevent the soft keyboard from appearing,
intercept the menu button key press to show your own instead,
add a custom layout (probably a GridView or a TableLayout inside a RelativeLayout with bottom gravity),
use an OnItemClickListener
send the required KeyEvents to the root View. If the characters are invented, the KeyCodes do not even need to relate to ASCII characters; you just intercept the code and use it at will.
Sorry I can't give you an option as you asked, but this alternative does not seem to be much more work than creating a whole new IME.
Edit: upon reading the related question, it makes sense to use android.inputmethodservice.KeyboardView instead of reinventing the wheel with the GridView.
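A minimal sketch of that in-app KeyboardView route, assuming a hypothetical res/xml/symbol_keyboard.xml keyboard definition and a KeyboardView placed in the activity layout (both placeholders, not from the answer); each key's code is forwarded to the focused view as a KeyEvent, matching the strategy above:

import android.app.Activity;
import android.inputmethodservice.Keyboard;
import android.inputmethodservice.KeyboardView;
import android.os.Bundle;
import android.os.SystemClock;
import android.view.KeyEvent;

public class SymbolInputActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_symbol_input);   // hypothetical layout

        KeyboardView keyboardView = (KeyboardView) findViewById(R.id.keyboard_view);
        keyboardView.setKeyboard(new Keyboard(this, R.xml.symbol_keyboard));
        keyboardView.setPreviewEnabled(false);
        keyboardView.setOnKeyboardActionListener(new KeyboardView.OnKeyboardActionListener() {
            @Override
            public void onKey(int primaryCode, int[] keyCodes) {
                // Forward the code as a down/up KeyEvent pair; the codes come
                // from the XML keyboard definition and only need to mean
                // something to the view that consumes them, not to the system.
                long now = SystemClock.uptimeMillis();
                SymbolInputActivity.this.dispatchKeyEvent(
                        new KeyEvent(now, now, KeyEvent.ACTION_DOWN, primaryCode, 0));
                SymbolInputActivity.this.dispatchKeyEvent(
                        new KeyEvent(now, now, KeyEvent.ACTION_UP, primaryCode, 0));
            }

            // The remaining callbacks are not needed for this sketch.
            @Override public void onPress(int primaryCode) {}
            @Override public void onRelease(int primaryCode) {}
            @Override public void onText(CharSequence text) {}
            @Override public void swipeLeft() {}
            @Override public void swipeRight() {}
            @Override public void swipeDown() {}
            @Override public void swipeUp() {}
        });
    }
}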
