I have a SeekBar and I want to save its state to a database when the progress is changed.
I am wondering which event I should put my code in: onProgressChanged() or onStopTrackingTouch()?
I am going to disagree with both mbaird and jqpubliq, for one simple reason: they assume the user is using a touchscreen.
Most Android devices have touchscreens. Not all will. For example, there are firms developing Android set-top boxes (think Android equivalents of Roku or Boxee Box). Most televisions are not touchscreens.
Now, if you want your application to only be usable with a touchscreen -- and you have set the appropriate <uses-configuration> elements in your manifest -- onStopTrackingTouch() may be reliable for detecting a progress change.
Personally, I would update the database on neither onProgressChanged() nor onStopTrackingTouch(), but at the point when the user does something positive to indicate they want to persist the current screen's contents -- pressing the BACK button, clicking a Save button, etc. But I certainly would not rely on onStopTrackingTouch() unless you are developing a touchscreen-only app.
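If you take the save-on-explicit-action approach, a minimal sketch might look like this (the Save button, the R.id values, and the saveProgressToDatabase() helper are all hypothetical names, not anything from the SDK):

    final SeekBar seekBar = (SeekBar) findViewById(R.id.seek_bar);   // hypothetical id
    Button saveButton = (Button) findViewById(R.id.save_button);     // hypothetical id

    saveButton.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            // Persist only when the user explicitly commits the screen.
            saveProgressToDatabase(seekBar.getProgress());            // hypothetical helper
        }
    });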
I would use onProgressChanged if you need to update any elements of the UI as the user is sliding the progress bar.
I would wait for onStopTrackingTouch to actually update the database.
Unless you have reason to believe that the application will often crash in the middle of the gesture and you need to save where the user was at that time, I would recommend onStopTrackingTouch.
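A minimal sketch of that split (update the UI in onProgressChanged(), persist in onStopTrackingTouch()), assuming a hypothetical progressLabel TextView and a hypothetical saveProgressToDatabase() helper:

    seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
        @Override
        public void onProgressChanged(SeekBar bar, int progress, boolean fromUser) {
            // Fires on every change, including mid-drag and programmatic ones:
            // fine for updating the UI, too chatty for database writes.
            progressLabel.setText(String.valueOf(progress));
        }

        @Override
        public void onStartTrackingTouch(SeekBar bar) {
            // No-op for this use case.
        }

        @Override
        public void onStopTrackingTouch(SeekBar bar) {
            // Fires once when the touch gesture ends (touchscreens only).
            saveProgressToDatabase(bar.getProgress());
        }
    });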
Say I have several dialog fragments that are shown in response to messages and events that can arrive in any order. Normally, the last dialog shown will be on top. Is there a way to show a dialog fragment under an existing one, or change their z-order after they are shown?
It should be pretty rare for my app to show more than one dialog at a time, but it could happen. There is one particular dialog that should always be on top whenever it's visible.
A dialog creates an application sub-window. Android's window manager (WindowManagerService) automatically computes a window's z-order based on its type and stores it in the WindowState's mLayer field. Internal Android classes have access to this field and sometimes change a window's z-order, but this API is not exposed in the Android SDK. So it seems that the only way to affect a dialog's z-order is to recreate it.
Everything I wrote above is just a result of a brief investigation of Android's source code so I may be wrong. And maybe there's some hacky way to do what you want using reflection and accessing private fields and methods. But I'm not sure it's a good idea to try and do it. In my opinion it would be better to have just a single dialog or even activity, and manage fragments within it.
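That said, if you do go the recreate route, a rough sketch with a DialogFragment could look like this (bringDialogToFront() is just an illustrative helper, and re-showing a dismissed fragment instance may need extra care with fragment state):

    // "Raise" a dialog by dismissing and re-showing it, since the SDK exposes
    // no way to change an existing window's z-order.
    void bringDialogToFront(FragmentManager fm, String tag) {
        DialogFragment dialog = (DialogFragment) fm.findFragmentByTag(tag);
        if (dialog == null) {
            return;
        }
        dialog.dismiss();
        fm.executePendingTransactions();   // make sure the removal has been applied
        dialog.show(fm, tag);              // re-adding puts its window back on top
    }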
Our Android app currently has a large number of dialog and alert boxes. I'd like to switch these to toasts, but there's a problem - some of them require the user to choose whether to view more info or just dismiss the popup. It doesn't look like there's a way to do this with toasts.
Is there any existing Android library that supports tappable toasts (i.e., you tap it and it triggers a function call to a listener in the app, sort of like a notification)? If not, is there a recommended alternative for this "tap-here-to-do-something-otherwise-I'll-just-vanish-in-a-few-seconds" UI pattern, or should I just roll my own fragment class for it?
I had to do something similar in my app, so I wrote DropViewNotification, a boilerplate class that makes this happen by animating the so-called notification onto the screen. It doesn't do automatic dismissal, as it is only meant to act as a building block.
It accepts any kind of view, to keep it versatile, since I needed to put at least two or three obvious views into it (TextView, ProgressBar, ImageView). You can switch its content on the fly if you want to. The animation can also be customized, both for showing and dismissing the notification and for the main content.
In real life you might want to consider adding a controller class to handle the display of the content, auto dismissal, and so on (see the sketch below). I hope it's of some use to you.
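A rough sketch of such a controller (this is not part of DropViewNotification, just an illustration of the idea; all names are made up): it shows the view, forwards taps to a callback, and auto-dismisses after a delay unless the user tapped it first.

    public class TappableNotificationController {
        private final Handler handler = new Handler();
        private final View notificationView;   // whatever view you drop in

        private final Runnable dismissRunnable = new Runnable() {
            @Override
            public void run() {
                notificationView.setVisibility(View.GONE);
            }
        };

        public TappableNotificationController(final View notificationView, final Runnable onTap) {
            this.notificationView = notificationView;
            notificationView.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    handler.removeCallbacks(dismissRunnable);  // cancel the auto-dismiss
                    notificationView.setVisibility(View.GONE);
                    onTap.run();                               // e.g. open the details screen
                }
            });
        }

        public void show(long autoDismissMillis) {
            notificationView.setVisibility(View.VISIBLE);
            handler.removeCallbacks(dismissRunnable);
            handler.postDelayed(dismissRunnable, autoDismissMillis);
        }
    }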
I have several widgets in a view, each needing its own ActionMode. I see that the ActionMode does not dismiss automatically when the user taps outside the action bar. Thus, it is easily possible for the user to start an ActionMode for one control, then tap (long-click, in my case) another control and stack a second ActionMode on top of the first. This wreaks havoc on the program logic.
I can keep track of the current ActionMode with an activity-level member variable and dismiss the current one if a new one is needed. However, this is making my code messy to read and maintain. Further, I'd prefer to dismiss it immediately when the user taps anything outside the action bar.
Any suggestions on a good way to handle this?
I was looking for a solution to this problem some time ago, and as far as I know you can't track it without saving the current action-mode state in a variable yourself. However, I don't think that one variable with a proper name will make your code messy.
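For what it's worth, a sketch of that single variable (the names are illustrative, and the modes are assumed to be started from the Activity):

    // In the Activity:
    private ActionMode currentActionMode;

    private void startExclusiveActionMode(ActionMode.Callback callback) {
        if (currentActionMode != null) {
            currentActionMode.finish();            // close the previous contextual bar
        }
        currentActionMode = startActionMode(callback);
    }

    // In each ActionMode.Callback:
    @Override
    public void onDestroyActionMode(ActionMode mode) {
        if (mode == currentActionMode) {
            currentActionMode = null;              // the bar has been dismissed
        }
    }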
I'm writing an app using Titanium. I want to be able to automatically dismiss the keyboard anytime something outside of the text field is clicked. I have yet to find an elegant solution for this issue.
A couple of things I've thought about, though I'm still looking for a better solution:
1. Assign event listeners to basically everything else present in the view, and dismiss the keyboard (using textField.blur()). I want to avoid this since it results in a LOT of code just to dismiss the keyboard. Also, if I end up adding anything else to the view, I'll have to add a click listener to that object as well, so it's not very maintainable.
2. Create a large transparent view that takes up the entire screen. Place it directly beneath the text field and add a single click listener to it that dismisses the keyboard. This is better than #1, but still isn't great because I've had a lot of trouble getting z-indexes to work properly. It's also inefficient for my purposes because I've got views with a specific width and height that encapsulate text fields; I've used these for the sake of code simplicity and I re-use them throughout my application.
I've tried adding a listener for the "blur" event for the text field but that doesn't seem to get fired appropriately.
That's about it. I'm sort of at a loss. The z-indexing also behaves strangely on the iPhone, and I haven't tried it on Android yet. Also, as I mentioned above, many of the text fields I use are encapsulated within small views with set widths/heights, so I think that will affect how the z-indexes behave.
So the root question is: What's the best way to dismiss a keyboard whenever anything outside the text field that's in focus is clicked?
If I'm correct, the click event propagates through all views and windows, so your #1 option could be modified: listen for clicks on the bottom-most layer (view or window), check the event source, and then decide what to do.
When the system enters touch mode, I'd like to know which widget will lose focus. When the system leaves touch mode, I'd also like to know which widget will get focus. Overriding onFocusChange() didn't satisfy me, since it can't tell me about touch-mode changes: focus changes can happen in any mode (touch, trackball, key navigation, etc.).
The SDK only seems to offer View.isInTouchMode(), which reports the current state. So, is it possible to detect touch-mode changes?
Long shot, but you probably need to maintain the state manually. Keep a flag, say isTouchMode, which you set every time any of the widgets is touched and unset when something gets focus.
Use ViewTreeObserver.addOnTouchModeChangeListener(). It will tell you when the mode changes.
http://developer.android.com/reference/android/view/ViewTreeObserver.html
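A minimal sketch, registering on the activity's content view (any attached view's ViewTreeObserver will do):

    View root = findViewById(android.R.id.content);
    root.getViewTreeObserver().addOnTouchModeChangeListener(
            new ViewTreeObserver.OnTouchModeChangeListener() {
                @Override
                public void onTouchModeChanged(boolean isInTouchMode) {
                    // true  -> the user just touched the screen (entering touch mode)
                    // false -> the user navigated with a key or trackball
                    Log.d("TouchMode", "isInTouchMode=" + isInTouchMode);
                }
            });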