I want to show a menu item only if the device supports stylus input.
Unfortunately, I've found no way to check whether the device or the display supports stylus/S Pen input.
Edit:
I can distinguish between stylus and finger once a MotionEvent is triggered, using event.getToolType().
If the tool type is TOOL_TYPE_STYLUS, I can be sure the device supports a stylus.
If not, I can guess based on whether the pressure is > 0 (see "How to detect if the screen is capacitive or resistive in an Android device?").
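That runtime check looks roughly like this (just a sketch of the approach described above; the pressure fallback is only a guess):
@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS) {
        // Definitely a stylus
    } else if (event.getPressure() > 0) {
        // Only a guess - see the linked question about capacitive vs. resistive screens
    }
    return super.onTouchEvent(event);
}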
But I would like to know this in the onCreate method of my activity.
The following is apparently not supported and does not work for me:
Configuration config = getResources().getConfiguration();
if (config.touchscreen == Configuration.TOUCHSCREEN_STYLUS)
    Toast.makeText(this, "TOUCHSCREEN_STYLUS", Toast.LENGTH_SHORT).show();
You can detect the S Pen and other styluses pretty reliably via InputManager:
boolean sPen = false;
if (Build.VERSION.SDK_INT > 15) {
    // android.hardware.input.InputManager is available from API 16 on
    InputManager inputManager = (InputManager) getSystemService(INPUT_SERVICE);
    for (int deviceId : inputManager.getInputDeviceIds()) {
        if (inputManager.getInputDevice(deviceId).getName().toLowerCase().contains("pen")) {
            sPen = true;
        }
    }
}
Devices usually register with descriptive names, for example "Bluetooth Mouse", "Logitech USB Keyboard" or "E_Pen".
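As a usage sketch tying this back to the original question (the menu resource and item id below are hypothetical, and sPen is assumed to be kept as a field), the flag could then drive the menu item's visibility:
@Override
public boolean onCreateOptionsMenu(Menu menu) {
    getMenuInflater().inflate(R.menu.main, menu);            // hypothetical menu resource
    menu.findItem(R.id.action_stylus_only).setVisible(sPen); // hypothetical item id
    return true;
}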
Here you go (from the Android documentation) - it seems to be supported only in Android 4.0 and up, though.
http://developer.android.com/about/versions/android-4.0.html
Android now provides APIs for receiving input from a stylus input
device such as a digitizer tablet peripheral or a stylus-enabled touch
screen.
Stylus input operates in a similar manner to touch or mouse input.
When the stylus is in contact with the digitizer, applications receive
touch events just like they would when a finger is used to touch the
display. When the stylus is hovering above the digitizer, applications
receive hover events just like they would when a mouse pointer was
being moved across the display when no buttons are pressed.
Your application can distinguish between finger, mouse, stylus and
eraser input by querying the "tool type" associated with each pointer
in a MotionEvent using getToolType(). The currently defined tool types
are: TOOL_TYPE_UNKNOWN, TOOL_TYPE_FINGER, TOOL_TYPE_MOUSE,
TOOL_TYPE_STYLUS, and TOOL_TYPE_ERASER. By querying the tool type,
your application can choose to handle stylus input in different ways
from finger or mouse input.
Your application can also query which mouse or stylus buttons are
pressed by querying the "button state" of a MotionEvent using
getButtonState(). The currently defined button states are:
BUTTON_PRIMARY, BUTTON_SECONDARY, BUTTON_TERTIARY, BUTTON_BACK, and
BUTTON_FORWARD. For convenience, the back and forward mouse buttons
are automatically mapped to the KEYCODE_BACK and KEYCODE_FORWARD keys.
Your application can handle these keys to support mouse button based
back and forward navigation.
In addition to precisely measuring the position and pressure of a
contact, some stylus input devices also report the distance between
the stylus tip and the digitizer, the stylus tilt angle, and the
stylus orientation angle. Your application can query this information
using getAxisValue() with the axis codes AXIS_DISTANCE, AXIS_TILT, and
AXIS_ORIENTATION.
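To illustrate the quoted APIs, here is a rough sketch of my own (not from the documentation; the log tag is arbitrary) of reading the tool type, button state and stylus axes from hover events in a view or activity:
@Override
public boolean onGenericMotionEvent(MotionEvent event) {
    // Hover events arrive here while the stylus floats above the digitizer
    if (event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS
            && event.getActionMasked() == MotionEvent.ACTION_HOVER_MOVE) {
        boolean primaryPressed = (event.getButtonState() & MotionEvent.BUTTON_PRIMARY) != 0;
        float distance = event.getAxisValue(MotionEvent.AXIS_DISTANCE);
        float tilt = event.getAxisValue(MotionEvent.AXIS_TILT);
        Log.d("StylusHover", "primary=" + primaryPressed
                + " distance=" + distance + " tilt=" + tilt);
    }
    return super.onGenericMotionEvent(event);
}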
The answer from @Aaron Gillion looks like it works, but trusting the detection to a device name doesn't seem 100% reliable.
I checked the standard Android InputDevice class and found an interesting method, supportsSource, which tells you whether a device supports e.g. SOURCE_STYLUS. Here is an example:
// Note: InputDevice.supportsSource() was added in API 21; SOURCE_BLUETOOTH_STYLUS in API 23
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
    InputManager im = (InputManager) context.getSystemService(Context.INPUT_SERVICE);
    for (int id : im.getInputDeviceIds()) {
        InputDevice inputDevice = im.getInputDevice(id);
        if (inputDevice.supportsSource(InputDevice.SOURCE_STYLUS)
                || inputDevice.supportsSource(InputDevice.SOURCE_BLUETOOTH_STYLUS)) {
            return true;
        }
    }
}
return false;
If you need to support a lower SDK (supportsSource itself was only added in API 21), you can inline the supportsSource method, which is ridiculously simple:
public boolean supportsSource(int source) {
return (mSources & source) == source;
}
And the constants:
SOURCE_BLUETOOTH_STYLUS = 0x00008000 | SOURCE_STYLUS (added in API 23)
SOURCE_STYLUS = 0x00004000 | SOURCE_CLASS_POINTER (added in API 14)
SOURCE_CLASS_POINTER = 0x00000002 (added in API 9)
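Since mSources is a private field, an inlined check on older API levels can use the public getSources() bitmask instead; a minimal sketch (the helper name is just an example):
// Equivalent of supportsSource(), usable below API 21
private static boolean deviceSupportsSource(InputDevice device, int source) {
    return (device.getSources() & source) == source;
}
// e.g. inside the loop over input device ids:
// if (deviceSupportsSource(inputDevice, InputDevice.SOURCE_STYLUS)) return true;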
I've tested the code on a couple of devices and it detects the stylus correctly. I would appreciate it if you could share your results of detecting the stylus on non-Samsung devices.
I don't think there is any way to detect stylus support specifically. I would assume that if the device has touch capability, it can also respond to a stylus. Android doesn't specifically support stylus input, as far as I know.
Related
I have a C# Xamarin Android app that hosts a ReactJS app in a WebView.
When using this app on a touchscreen Android device, it appears that tapping the screen is occasionally ignored.
What appears to be going on is that the tap is interpreted as a tiny drag event, because there was some small directional movement during the tap.
Looking at the Android logs for failed taps, I noticed output like the following:
adb -d logcat -s CustomFrequencyManagerService
06-19 13:35:49.225 2945 9989 D CustomFrequencyManagerService: acquireDVFSLockLocked : type : DVFS_MIN_LIMIT frequency : 839000 uid : 1000 pid : 2945 pkgName : GESTURE_DETECTED#CPU_MIN#49
06-19 13:35:49.781 2945 2945 D CustomFrequencyManagerService: releaseDVFSLockLocked : Getting Lock type frm List : DVFS_MIN_LIMIT frequency : 839000 uid : 1000 pid : 2945 tag : GESTURE_DETECTED#CPU_MIN#49
Note the GESTURE_DETECTED part of the log entry.
However for successful taps, CustomFrequencyManagerService has no output in the log.
Looking at this from the reactjs app perspective:
I noticed that the failed taps emit the following events:
touchstart
touchend
While the normal successful events are:
touchstart
touchend
mousedown
blur
mouseup
click
I could potentially change the ReactJS app to respond directly to touch events instead of click events, but I was wondering whether there is a way (hopefully programmatically, from the Android app) to alter the sensitivity that determines what is interpreted as a drag as opposed to a click?
By installing an IOnTouchListener on the Android.WebKit.WebView
_webView.SetOnTouchListener(new GestureIgnoreTouchListener());
I was able to see at what movement threshold a click turned into a drag.
public class GestureIgnoreTouchListener : Java.Lang.Object, Android.Views.View.IOnTouchListener
{
    float _x;
    float _y;

    public bool OnTouch(Android.Views.View v, MotionEvent e)
    {
        if (e.Action == MotionEventActions.Down)
        {
            _x = e.RawX;
            _y = e.RawY;
            return false;
        }
        if (e.Action == MotionEventActions.Up)
        {
            var diffX = e.RawX - _x;
            var diffY = e.RawY - _y;
            var distance = Math.Sqrt(Math.Pow(diffX, 2) + Math.Pow(diffY, 2));
            // Observed:
            // if the distance is 10 or less, this is interpreted as a click;
            // if the distance is 12 or greater, no click is emitted.
            Console.WriteLine(distance);
            return false;
        }
        return false;
    }
}
Ideally, if the distance is between 10 and 50, I would like this to be considered a click, not a drag. I could possibly create a synthetic click event in this case, but I'm hoping I can somehow influence whatever Android code is responsible for interpreting this as a drag.
There are two approaches I've seen people use for this situation. The first you already mentioned: tell React to use tap events instead of click events when you're in a touch-screen environment.
The second is to take into account what Android refers to as "touch slop":
From https://developer.android.com/training/gestures/movement.html:
Because finger-based touch isn't always the most precise form of
interaction, detecting touch events is often based more on movement
than on simple contact. To help apps distinguish between
movement-based gestures (such as a swipe) and non-movement gestures
(such as a single tap), Android includes the notion of "touch slop".
Touch slop refers to the distance in pixels a user's touch can wander
before the gesture is interpreted as a movement-based gesture. For
more discussion of this topic, see Managing Touch Events in a
ViewGroup.
Google provides an example of one way to deal with it in a different context here: https://developer.android.com/training/gestures/viewgroup#vc which you could probably adapt to your situation.
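To make that concrete, here is a rough Java sketch (the same API is available through the Xamarin bindings; downX/downY are assumed to have been recorded on the ACTION_DOWN event):
// Illustrative only: compare the pointer's travel against the system touch slop
int touchSlop = ViewConfiguration.get(context).getScaledTouchSlop();
float dx = event.getRawX() - downX;
float dy = event.getRawY() - downY;
if (dx * dx + dy * dy <= (float) touchSlop * touchSlop) {
    // Within the slop: treat the gesture as a tap/click
} else {
    // Beyond the slop: treat it as a drag
}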
I use Unity 3D with Samsung Gear VR. I have a working OVRPlayerController in my scene, but I am having difficulties mapping the Oculus tap, swipe and return button.
I have tried with something like:
if (Input.GetMouseButtonDown(0))
{
Debug.Log("input detected");
}
And with this I detect the tap encircled in red
I have also tried something like:
if (OVRInput.Get(OVRInput.Button.PrimaryThumbstick))
{
Debug.Log("Input detected");
}
Or :
if (OVRInput.Get(OVRInput.Button.One))
{
Debug.Log("Input detected");
}
But nothing seems to work. Is there any documentation that explains how the input I encircled in yellow is mapped on the Samsung Gear VR? Does anyone have experience with this, or can maybe point me to some useful documentation on this matter?
Cheers
My project settings for input:
For swipes, you can get the current Vector2 on the touchpad (either the HMD touchpad or the connected remote controller's touchpad) by using:
CurrentVector = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
As for the back button, it is accessible using OVRInput.Button.Back with either Get, GetDown or GetUp. But be aware that a two-second-long back press is reserved for the Universal Menu and should not have any implications inside your game or app.
I'm trying to make an app that responds to the Cardboard.SDK.Tilted flag in some Update() method.
When running in the Unity Player, pressing the Esc button sets Cardboard.SDK.Tilted to true, so all is good there.
But when I build the app for Android, Cardboard.SDK.Tilted stays false when I tilt the device. Other VR apps with tilt actions work fine on my phone. Is there any other option I have to enable before building for Android to make this work?
I'm using Unity v5.3.3f1 and Cardboard SDK v0.6; the devices I've tried are an Xperia Z2, a Samsung Galaxy S3 and an iPhone 6.
EDIT:
So, I've tried putting this code into both Update() and LateUpdate() methods:
if (Cardboard.SDK.Tilted) {
print("tilted, next scene");
NextScene ();
}
When the device is tilted, a new scene should be loaded.
But as I've said, it only works in the Unity Player, using the Esc button to trigger the tilt; on a real device nothing happens - the Cardboard.SDK.Tilted variable is never set to true.
I've seen on https://recordnotfound.com/cardboard-unity-googlesamples-6780/issues that there was an issue about the discontinuation of Tilt in v0.6; is it possible that this is no longer supported? It's strange, though, that it works in the Unity Player but not on a real device.
I can verify that the Cardboard.SDK.Tilted flag does not appear to be working as it did in previous versions of the SDK. The Escape button triggers it in the debugger, but the tilt action does not trigger it in builds.
However, you can implement an equivalent quite simply with Input.acceleration:
float angle = 80.0f / 360.0f * 2.0f * Mathf.PI;
bool isTilted = Mathf.Abs(Input.acceleration.normalized.x) > Mathf.Sin(angle);
if (Cardboard.SDK.Tilted || isTilted)
{
//Action here
}
If the device's acceleration is entirely due to gravity, the angle float is the angle of the device from horizontal. Attempting to normalize a Vector3 that is too small sets it to zero, so small vectors shouldn't trip the conditional. Pre-calculate your sine to save a cycle.
I'm updating my application based on a huge user request. My app turns the screen ON after something happens, and now I'm integrating "pocket mode" functionality. So basically, if the user has the phone in his/her pocket, I would like to detect this via the proximity sensor and act on it. But I'm experiencing a lot of trouble.
So I'm registering the sensor and everything as usual. One thing I would like to point out is that I'm asking the PowerManager object for a PROXIMITY_SCREEN_OFF_WAKE_LOCK. That means the screen will automatically go off whenever something near is detected.
powerManager.newWakeLock(PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK, "ProximityScreenOff");
Basically, when the device is on the table and I move my finger to the sensor, the screen turns off as expected.
The problem starts when my activity launches and I'm already holding my finger on the sensor (or the phone is in a pocket - it's the same). The sensor doesn't detect anything that is already near the phone. If I move my finger away a little, the screen will turn ON again.
Is there anything I could do to get my desired behavior - that is, turning the screen OFF when the phone is already in a pocket?
You can manage this using a boolean variable.
That is, only act on the "far" event in onSensorChanged if a "near" event has been seen first.
boolean isNearCalled = false;

@Override
public void onSensorChanged(SensorEvent event) {
    float distance = event.values[0];
    if (distance < event.sensor.getMaximumRange()) {
        // "near" reported by the proximity sensor - remember it
        isNearCalled = true;
    } else if (isNearCalled) {
        // "far" reported after a previous "near" - do stuff like turning the screen off
        isNearCalled = false;
    }
}
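For completeness, a rough sketch of registering such a listener (assuming the activity implements SensorEventListener; not part of the original code):
// In onCreate()
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
if (proximity != null) {
    sensorManager.registerListener(this, proximity, SensorManager.SENSOR_DELAY_NORMAL);
}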
Can anyone elaborate on which mobile browsers support multitouch, in particular the ability to press multiple "buttons" (or joysticks) at the same time? It is required for a game I'm making, and I would like to know so I can switch to a native app instead if necessary, though I'd prefer not to.
If the answer is that it's generally not supported, does anyone happen to know a library/framework that can create screens from XML or a similar format, like HTML, in a cross-platform and cross-resolution way?
// get the coordinates of the first 3 touches
function touch(e) {
    var touch1 = {x: e.changedTouches[0].clientX, y: e.changedTouches[0].clientY};
    var touch2 = {x: e.changedTouches[1].clientX, y: e.changedTouches[1].clientY};
    var touch3 = {x: e.changedTouches[2].clientX, y: e.changedTouches[2].clientY};
}
This way you can get the coordinates of each touch.
Multitouch is supported on Android 3.0 (Honeycomb) and higher, and on iOS 4 and higher.