I'm looking for an example of how to use MotionEventCompat in Android. I'm targeting API level 10, which doesn't let me detect whether a finger is 'hovering' over or 'dragging' onto a view. I need to detect this, preferably from the view itself. Here are some code snippets showing how I'm trying to use it:
**my class:**
import android.support.v4.view.MotionEventCompat;
public class GridButton extends View
overriding onTouchEvent:
@Override
public boolean onTouchEvent(MotionEvent event) {
    super.onTouchEvent(event);
    switch (event.getAction() & MotionEventCompat.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN: {
            set_active(true);
            return true;
        }
        case MotionEventCompat.ACTION_HOVER_ENTER: {
            set_active(true);
            break;
        }
    }
    return false;
}
I based the MotionEventCompat.ACTION_MASK usage on an example I found somewhere, but the hover case never triggers set_active().
Any help on using this would be appreciated. There's very little about this on the web.
Hover events are sent when the device supports a mouse or touchpad. When the cursor hovers over a view these events are sent to onGenericMotionEvent, not onTouchEvent. They won't help you detect a finger that isn't touching the surface of a capacitive touchscreen or a finger that touched down in a different position and then slid over the view in question. They will never be sent on an API 10 (Android 2.3) device.
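For illustration, on an API 14+ device with a hover-capable pointer (mouse, touchpad, or a hover-reporting stylus), the view would receive them roughly like this; this is a minimal sketch and will do nothing on API 10:

@Override
public boolean onGenericMotionEvent(MotionEvent event) {
    // Hover events arrive here, not in onTouchEvent
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_HOVER_ENTER:
            set_active(true);
            return true;
        case MotionEvent.ACTION_HOVER_EXIT:
            set_active(false);
            return true;
    }
    return super.onGenericMotionEvent(event);
}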
Related
I'm working on a project for blind people, and there are a lot of problems I need to fix when the user activates TalkBack on their phone.
I'm creating a soft keyboard for blind users: they tap with a single finger on the circles ("Braille cell dots") to generate a Braille code, and then type the character/number/symbol they want as it is represented in Braille.
My problem now is the Touch to Explore feature of TalkBack: the user can't make a single tap on the screen because that action is now handled by TalkBack, and having to double-tap on each dot is not good for my app!
How can I receive a single tap even if the Touch to Explore feature of TalkBack is enabled?
Based on this answer, which solved my problem: since single touch events are converted to hover events when Touch to Explore is enabled, I've overridden onHoverEvent to call onTouchEvent, and it works fine:
@Override
public boolean onHoverEvent(MotionEvent event) {
    // Move the AccessibilityManager lookup to the constructor
    AccessibilityManager am = (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
    if (am.isTouchExplorationEnabled()) {
        return onTouchEvent(event);
    } else {
        return super.onHoverEvent(event);
    }
}
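For reference, caching the AccessibilityManager in the constructor (as the comment above suggests) could look roughly like this; the class and field names are just an illustration:

private final AccessibilityManager accessibilityManager;

public BrailleKeyboardView(Context context, AttributeSet attrs) {
    super(context, attrs);
    // Look the service up once instead of on every hover event
    accessibilityManager =
            (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
}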
And handle the hover actions in onTouchEvent:
@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_HOVER_ENTER:
            // Your code
            break;
        case MotionEvent.ACTION_MOVE:
        case MotionEvent.ACTION_HOVER_MOVE:
            // Your code
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_HOVER_EXIT:
            // Your code
            break;
    }
    return true;
}
I'll edit my question to be cleaner :)
You don't. It's a terrible idea. Come up with gestures and mechanisms that fit within what TalkBack allows. If you could point to a specific feature or mechanism of your app that doesn't work with TalkBack, I could recommend an alternative. What gesture is it that's not working?
How do I intercept touch events in Android while ensuring the existing touch workflow is not impacted? Basically I want to add a touch visualizer that shows where the user is touching the screen, so that if the user is trying to scroll the table view, the visualizer is shown as the finger drags but the table view still scrolls with ease.
In iOS, the sendEvent method of UIWindow does exactly this. Not sure if Android has anything similar.
Thanks
Override Activity.dispatchTouchEvent() and do your touch handling there. Always return super.dispatchTouchEvent() to make sure it gets handled the normal way after your logic executes.
@Override
public boolean dispatchTouchEvent(MotionEvent event) {
    /* your code here */
    return super.dispatchTouchEvent(event);
}
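For example, a touch visualizer could be driven from there roughly like this; TouchVisualizerView and its showTouchAt()/hideTouch() methods are hypothetical and stand in for whatever overlay view you draw on top of your layout:

@Override
public boolean dispatchTouchEvent(MotionEvent event) {
    // Forward the touch position to the overlay, then let the event
    // continue through the normal dispatch chain so scrolling still works.
    switch (event.getActionMasked()) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_MOVE:
            touchVisualizerView.showTouchAt(event.getX(), event.getY());
            break;
        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_CANCEL:
            touchVisualizerView.hideTouch();
            break;
    }
    return super.dispatchTouchEvent(event);
}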
Have a look at this link: http://developer.android.com/reference/android/view/View.OnTouchListener.html. It is pretty straightforward, and there are lots of examples of it if you Google it.
yourview.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return false;
    }
});
I'm new to programming for Google Glass. I really like the cards possibilities (https://glass-python-starter-demo.appspot.com/) and it's great that I can submit HTML.
Because the project I'm working on (for which I would like to use Glass) is an online platform, I would like to be able to work from the website. As I found out, I can detect (from the user agent) that Glass is being used (http://www.googleglass.gs/quick-tip-google-glass-web-browser-user-agent/)
So, now my question in two parts.
1) Can I create an Android app that's actually a 'browser without a toolbar', so that I can go directly to a webpage, but that functions as an app so that I can have it in the menu (after: "ok Glass")?
2) Can I use voice commands in the above app?
PS:
I know this is a beginner question, but (with the exception of the Glass Cards option) it's hard to find a starting point for this.
So if I understand you correctly, you want to do the following:
1) Create an app that is basically just a browser.
2) Have the link that opens be to your website where the app is actually contained.
3) Use voice commands to control this app.
All of this to bypass having to code in Java/XML, so you can code in the language you know better and run it online, with just one line of code creating the browser in the actual app.
Yes, you can create an app that is just a browser.
Yes, you can have it link to your website and then have it stay there.
I think it will be much harder for you to interact with the website. The default browser controls are: tap to click, hold two fingers down and move your head to move the "mouse" around the current screen, and slide up and down to scroll.
If you want to control the app, you'd have to implement your own GestureDetector, override the default actions taken each time you do something, and then for each action send something to your website to let it know what you just did.
You could use voice recognition for controls as well.
Either way, you'll need a GestureDetector to override the default controls for the browser if you decide to use the standard one. Here is what you'd need for that:
At the very top, you'll have
private GestureDetector gestureDetector;
Then in your onCreate method, you'll need to create your gestureDetector. I like to have a method to create it farther down. It is just cleaner that way for me.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    gestureDetector = createGestureDetector(this);
}
And then the method to actually create the GestureDetector:
private GestureDetector createGestureDetector(Context context) {
    GestureDetector gestureDetectorTemp = new GestureDetector(context, new GestureDetector.OnGestureListener() {
        @Override
        public boolean onDown(MotionEvent motionEvent) {
            return false;
        }

        @Override
        public void onShowPress(MotionEvent motionEvent) {
            // returns void, so nothing to return here
        }

        @Override
        public boolean onSingleTapUp(MotionEvent motionEvent) {
            // communicate to the website that you tapped, and have it handle the tap
            Log.v("APP_TAG", "Someone tapped me!");
            return false;
        }

        @Override
        public boolean onScroll(MotionEvent motionEvent, MotionEvent motionEvent2, float distanceX, float distanceY) {
            return false;
        }

        @Override
        public void onLongPress(MotionEvent motionEvent) {
            // returns void, so nothing to return here
        }

        @Override
        public boolean onFling(MotionEvent motionEvent, MotionEvent motionEvent2, float v, float v2) { // fling = a single slide
            return false;
        }
    });
    return gestureDetectorTemp;
}
Notice that each overridden method that returns a boolean has a "return false" in it. The boolean you are returning represents whether or not the event is consumed.
In other words, if you look at onSingleTapUp, we have a Log.v that will print out "Someone tapped me!" when you tap the touchpad. By returning false, you are letting whatever else was going to occur based on a tap (in the case of a browser, a "click" of the mouse) still occur. If you return true, nothing else will occur; the event won't be reported to the other default handlers. So to nullify all of the default controls of the browser, you would just change all of those return statements to "return true", indicating that the event was consumed and that no further action is necessary!
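One thing to watch out for: the GestureDetector only fires its callbacks if you feed it motion events. Depending on how the events reach your Activity, that usually means overriding onTouchEvent and/or onGenericMotionEvent and passing the event along, roughly like this (a sketch, assuming the gestureDetector field from above):

@Override
public boolean onGenericMotionEvent(MotionEvent event) {
    // On Glass, touchpad events may arrive here rather than in onTouchEvent
    if (gestureDetector != null) {
        return gestureDetector.onTouchEvent(event);
    }
    return super.onGenericMotionEvent(event);
}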
I hope that helped a bit. If your goal is to completely bypass the entire Android coding platform, develop in the browser, and just link the app to the browser, I don't think you'll be able to do it completely, just because of the nature of Glass. There is no touch screen with lots of different buttons and a keyboard, etc. You'll need to do at least some Glass development to get it to work.
I am seeing some unexpected behavior when using a WebView in Android 4.4 (build target 18). On one page in particular we have some edit text fields, and to get the soft keyboard to pop up appropriately, we had to use a code snippet similar to the ones described here:
https://code.google.com/p/android/issues/detail?id=7189
webview.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_UP:
                if (!v.hasFocus()) {
                    v.requestFocus();
                }
                break;
        }
        return false;
    }
});
This worked as expected up through version 4.3, but beginning with 4.4 the code above causes an undesired effect: after scrolling down the page, the content of the WebView snaps back to the top of the page when the touch completes.
Has anyone else experienced this new behavior, or figured out a workaround? The only thing I have come up with so far would be to subclass WebView for the pages that allow edits (so the code above can be removed from the read-only pages that now have the scrolling issue). Of course, if a page both scrolls and has edit fields, this would not work.
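For reference, the subclass idea would look roughly like this (an untested sketch; EditableWebView is just an illustrative name):

public class EditableWebView extends WebView {

    public EditableWebView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Request focus on touch so the soft keyboard can appear for form fields
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_UP:
                if (!hasFocus()) {
                    requestFocus();
                }
                break;
        }
        return super.onTouchEvent(event);
    }
}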
Thanks in advance!
I'm developing an app designed to capture writing on a canvas. The app is designed for use with the HTC Flyer (Android 2.3.3).
This device already has Scribbler installed, so I have disabled "Auto launch Scribbler mode" but left "Pen history for each app" checked.
In my tests, I have found the app can detect my fingers on the touchscreen but not the stylus. I pressed various combinations of buttons on the stylus to no avail.
I have based the code on TouchPaint from Android Developers: http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/TouchPaint.html
I did not use the package declaration from that sample code:
com.example.android.apis.graphics;
In my Eclipse IDE, the following was reported as a problem, with a suggestion that I should remove the @Override annotation:
@Override
public boolean onHoverEvent(MotionEvent event) {
    return onTouchOrHoverEvent(event, false /*isTouch*/);
}
So I did.
I have added the following to the manifest.
<uses-configuration android:reqTouchScreen="stylus"/>
<uses-configuration android:reqTouchScreen="finger"/>
The app can detect my finger movements on the touchscreen but never my stylus. Why?
I also noted that in the Android Developers guide, MotionEvent has a getToolType method, but I cannot see it in my "IntelliSense" (content assist) in Eclipse.
http://developer.android.com/reference/android/view/MotionEvent.html#getToolType%28int%29
The getToolType method is not available in my Android code. I thought I could use this method to check the type of the touch input, e.g. a finger or a stylus.
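For reference, on API 14+ (where getToolType is available) the check would look roughly like this; it will not compile against the API level my Flyer is running:

@Override
public boolean onTouchEvent(MotionEvent event) {
    // getToolType(int) requires API level 14 or higher
    int toolType = event.getToolType(0);
    if (toolType == MotionEvent.TOOL_TYPE_STYLUS) {
        Log.i("SignName", "Stylus input");
    } else if (toolType == MotionEvent.TOOL_TYPE_FINGER) {
        Log.i("SignName", "Finger input");
    }
    return super.onTouchEvent(event);
}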
I also added an OnTouchListener to the PaintView (based on the TouchPaint code).
this.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return touchSurface(v, event);
    }
});
touchSurface code
private boolean touchSurface(View v, MotionEvent event) {
    boolean complete = true;
    int pAction = event.getAction();
    int pActionIndex = event.getActionIndex();
    Log.i("SignName", "touchSurface event fired.");
    Log.i("SignName", "Pointer Action: " + pAction + ", pActionIndex: " + pActionIndex);
    return complete;
}
When I use my finger, the above event is fired. When I use a stylus, it's not fired. Why?
I wonder whether this problem is isolated to the HTC Flyer (because its Scribbler app overrides my app's settings) or whether my code is wrong.
Can you please help me?
(Update: 27th April 2012)
I found what the problem was: it was the HTC Flyer's dedicated stylus that caused the confusion.
I thought that if this stylus didn't work, no other stylus would work either. However, I tried a different stylus and it worked.
Thanks for your help, though.
First off, you should update your Flyer to Honeycomb (Android 3.2). Also, this example is specific to ICS (Android 4), but you can run it by using a compatibility library and making some minor changes to the code; more information is available at http://htcdev.com