I use Unity 3D with the Samsung Gear VR. I have a working OVRPlayerController in my scene, but I'm having difficulty mapping the Gear VR tap, swipe and back button.
I have tried with something like:
if (Input.GetMouseButtonDown(0))
{
    Debug.Log("input detected");
}
And with this I detect the tap encircled in red
I have tried also something like:
if (OVRInput.Get(OVRInput.Button.PrimaryThumbstick))
{
    Debug.Log("Input detected");
}
Or :
if (OVRInput.Get(OVRInput.Button.One))
{
    Debug.Log("Input detected");
}
But nothing seems to work. Is there any documentation that explains how the input I encircled in yellow is mapped on the Samsung Gear VR? Does anyone have experience with this, or can maybe guide me to some useful documentation on this matter?
Cheers
My project settings for input:
For swipes, you can get the current Vector2 on the touchpad (either the HMD touchpad and/or the incoming remote controller's touchpad) by using:
CurrentVector = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
As for the back button, it is accessible as OVRInput.Button.Back with Get, GetDown or GetUp. But be aware that a long back press (about 2 seconds) is reserved for the Universal Menu and should not have any effect inside your game or app.
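Putting the pieces together, a minimal polling sketch might look like the following (hedged: it assumes an OVRManager is in the scene so OVRInput is updated each frame, and the `(0,0)` check is only a rough "no finger on the pad" heuristic):

```csharp
void Update()
{
    // Tap: on Gear VR the touchpad tap also arrives as mouse button 0
    // (confirmed by the Input.GetMouseButtonDown test above).
    if (Input.GetMouseButtonDown(0))
        Debug.Log("Tap");

    // Swipe: poll the touchpad position and classify deltas across frames.
    Vector2 pad = OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);
    if (pad != Vector2.zero)
        Debug.Log("Touchpad at " + pad);

    // Back button: short presses only; a ~2 s press is reserved
    // for the Universal Menu.
    if (OVRInput.GetDown(OVRInput.Button.Back))
        Debug.Log("Back pressed");
}
```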
Related
I'm using Unity 3D to develop a 3D defense game.
I made a monster that can draw aggro from intruders on the map, keeping them occupied while remaining vulnerable to other attacks. I want the monster to move after it has been touched once.
Process: touch the monster → touch the destination → monster moves to the destination
I want the monster to be able to move across the entire map, but to draw the intruders' aggro
only while it's on the road. I'm currently using a raycast to make this happen, and I'm stuck.
It works just fine in the Unity editor, but in a build running on a phone the monster doesn't recognize the touch, or doesn't recognize the point where I touched.
// cam: Camera, ClickedPosition: RaycastHit (out), whatCanbeClickedOn: LayerMask
if (IsClickMonster)
{
    if (Input.GetMouseButtonDown(0))
    {
        Ray MoveClick = cam.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(MoveClick, out ClickedPosition, float.MaxValue, whatCanbeClickedOn)
            && ClickedPosition.collider.gameObject.CompareTag("monster") == false)
        {
            Debug.Log("Monster is Moving!");
            clickEffect.transform.position = new Vector3(ClickedPosition.point.x, ClickedPosition.point.y + 1.0f, ClickedPosition.point.z);
            StopCoroutine("MonsterMove");
            IsMoving = true;
            StartCoroutine("MonsterMove");
        }
        else
        {
            clickEffect.SetActive(false);
            StopCoroutine("MonsterMove");
            IsClickMonster = false;
            return;
        }
    }
}
else
{
    return;
}
↑ Picture of the Skull Soldier moving. The white dot (clickEffect) is where the player touched (ClickedPosition).
Physics.Raycast(MoveClick, out ClickedPosition, float.MaxValue, whatCanbeClickedOn)
Even though I'm using the "whatCanbeClickedOn" layer mask, I wonder what I can do to prevent this from happening in the mobile build.
In my experience with Android, when something doesn't work well on the device it's usually one of two things: either there's an error or warning at runtime that just happened not to trigger in the Unity editor, or the phone can't handle the game (too many polygons or lights, say) and starts doing odd things. Try checking the warnings in Unity, and of course see if any errors appear too. That's my best bet.
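It may also be worth ruling out mouse emulation itself, since it's one variable that differs between editor and device. A hedged variant of the asker's block that reads the touch explicitly (same cam, ClickedPosition and whatCanbeClickedOn fields assumed; the move logic is elided):

```csharp
if (IsClickMonster && Input.touchCount > 0)
{
    Touch touch = Input.GetTouch(0);
    if (touch.phase == TouchPhase.Began)
    {
        // Use the touch position directly instead of Input.mousePosition.
        Ray moveClick = cam.ScreenPointToRay(touch.position);
        if (Physics.Raycast(moveClick, out ClickedPosition, float.MaxValue, whatCanbeClickedOn)
            && !ClickedPosition.collider.CompareTag("monster"))
        {
            // same move logic as in the original block
        }
    }
}
```

If this behaves the same as the mouse-emulation version, the problem is more likely in the raycast setup (camera reference, layer mask values in the build) than in the input path.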
I am using an input processor for touch input on a Flappy Bird-like game. This works fine on my Droid Turbo and a couple of other newer phones. But with my two older tablets, a Xoom and a Verizon tablet, touchDown occasionally doesn't fire. I should mention that the FPS is 60 throughout gameplay. Also, I use an InputMultiplexer which adds both the playerInput and the HUD/play stages. Could this just be a problem with older Android? Any fix? I'm sure it's not my code, given that it works on newer phones.
EDIT
I tried using Gdx.input.isTouched like so:
if (Gdx.input.isTouched()) {
    if (!touched) {
        jump();
    }
    touched = true;
} else {
    touched = false;
}
But it gives me the same results as the input processor :\
This is not a problem with the jump method, as of right now it just prints "touched" to the console.
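The isTouched polling above is a rising-edge detector: jump() should fire only on the frame the touch starts. That pattern is framework-free and can be isolated for testing (plain Java sketch; the class name TouchEdge is made up here):

```java
// Fires once per press: update() returns true only on the
// transition from "not touched" to "touched".
class TouchEdge {
    private boolean wasTouched = false;

    public boolean update(boolean isTouched) {
        boolean fired = isTouched && !wasTouched;
        wasTouched = isTouched;
        return fired;
    }
}
```

If this logic checks out but touches are still missed on the old tablets, the dropped events are happening below your code, in delivery of the touch itself.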
It is an issue with the viewport; different phones have different screen sizes. Please check the link below, which should help:
http://stackoverflow.com/questions/39810169/libgdx-text-not-rendering-properly-on-larger-screens/39946652#39946652
I'm trying to make an app that responds to the Cardboard.SDK.Tilted flag in some Update() method.
When running in the Unity Player, pressing the Esc button sets Cardboard.SDK.Tilted to true, so here all is good.
But when I build the app for Android, Cardboard.SDK.Tilted stays false when I tilt the device. Other VR apps with tilt actions work fine on my phone. Is there any other option I have to enable before building for Android to make this work?
I'm using Unity v5.3.3f1 and a Cardboard SDK v0.6, the devices I've tried on are Xperia Z2, Samsung Galaxy S3 and iPhone 6.
EDIT:
So, I've tried putting this code into both Update() and LateUpdate() methods:
if (Cardboard.SDK.Tilted) {
    print("tilted, next scene");
    NextScene();
}
When the screen is tilted, new scene should be loaded.
But as I've said, it only works in the Unity Player by pressing the Esc button to trigger the tilt; on a real device nothing happens, and the Cardboard.SDK.Tilted variable is never set to true.
I've seen on https://recordnotfound.com/cardboard-unity-googlesamples-6780/issues that there was an issue about the discontinuation of tilt in v0.6. Is it possible that this is no longer supported? It's strange, though, that it works in the Unity Player but not on a real device.
I can verify that the Cardboard.SDK.Tilted flag does not appear to be working as in previous versions of the SDK. The Escape key triggers it in the editor, but the tilt action does not trigger it in builds.
However, you can implement an equivalent quite simply with Input.acceleration:
float angle = 80.0f / 360.0f * 2.0f * Mathf.PI;
bool isTilted = Mathf.Abs(Input.acceleration.normalized.x) > Mathf.Sin(angle);
if (Cardboard.SDK.Tilted || isTilted)
{
    // Action here
}
If the device's acceleration is entirely due to gravity, the angle float is the angle of the device from horizontal. Attempting to normalize a Vector3 that is too small sets it to zero, so small vectors shouldn't trip the conditional. Pre-calculate your sine to save a cycle.
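As a framework-free sanity check of that threshold math (plain Java here; isTilted and the 1e-5 cutoff mimicking Unity's behavior of normalizing tiny vectors to zero are assumptions of this sketch):

```java
class TiltCheck {
    // True when the gravity vector lies within ~10 degrees of the device's
    // x axis, i.e. |normalized x| > sin(80 degrees) ~= 0.985.
    static boolean isTilted(double ax, double ay, double az) {
        double angle = 80.0 / 360.0 * 2.0 * Math.PI;
        double mag = Math.sqrt(ax * ax + ay * ay + az * az);
        if (mag < 1e-5) return false; // tiny vectors normalize to zero
        return Math.abs(ax / mag) > Math.sin(angle);
    }
}
```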
I want to show a menu item only if the device supports stylus input.
Unfortunately, I've found no way to check whether the device or the display supports a stylus/S Pen for input.
Edit:
I can distinguish between stylus and finger after a MotionEvent is triggered, using event.getToolType().
If the tool type is TOOL_TYPE_STYLUS, I can be sure that the device supports a stylus.
If not, I can guess based on whether the pressure is > 0 (relating to: how to detect if the screen is capacitive or resistive in an Android device?).
But I would like to know this in the onCreate method of my activity.
The following is apparently not supported and does not work for me:
Configuration config = getResources().getConfiguration();
if (config.touchscreen == Configuration.TOUCHSCREEN_STYLUS)
Toast.makeText(this, "TOUCHSCREEN_STYLUS", Toast.LENGTH_SHORT).show();
You can detect the S Pen and other styluses pretty reliably via InputManager:
boolean sPen = false;
if (Build.VERSION.SDK_INT > 15) {
    InputManager inptmgr = (InputManager) getSystemService(INPUT_SERVICE);
    for (int id : inptmgr.getInputDeviceIds()) {
        InputDevice device = inptmgr.getInputDevice(id); // may be null if just disconnected
        if (device != null && device.getName().toLowerCase().contains("pen")) sPen = true;
    }
}
Devices will usually register with an appropriate name, for example "Bluetooth Mouse", "Logitech USB Keyboard" or "E_Pen".
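The heuristic boils down to a substring check on the device name; isolated as plain Java (the method name looksLikeStylus is made up for this sketch), the trade-off is visible: it matches "E_Pen"-style names but gives a false negative for any vendor that omits "pen":

```java
class StylusName {
    // Name-based heuristic for stylus-like input devices.
    static boolean looksLikeStylus(String deviceName) {
        return deviceName != null && deviceName.toLowerCase().contains("pen");
    }
}
```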
Here you go (from the Android documentation) - seems to only be supported in 4.0 and up though.
http://developer.android.com/about/versions/android-4.0.html
Android now provides APIs for receiving input from a stylus input
device such as a digitizer tablet peripheral or a stylus-enabled touch
screen.
Stylus input operates in a similar manner to touch or mouse input.
When the stylus is in contact with the digitizer, applications receive
touch events just like they would when a finger is used to touch the
display. When the stylus is hovering above the digitizer, applications
receive hover events just like they would when a mouse pointer was
being moved across the display when no buttons are pressed.
Your application can distinguish between finger, mouse, stylus and
eraser input by querying the "tool type" associated with each pointer
in a MotionEvent using getToolType(). The currently defined tool types
are: TOOL_TYPE_UNKNOWN, TOOL_TYPE_FINGER, TOOL_TYPE_MOUSE,
TOOL_TYPE_STYLUS, and TOOL_TYPE_ERASER. By querying the tool type,
your application can choose to handle stylus input in different ways
from finger or mouse input.
Your application can also query which mouse or stylus buttons are
pressed by querying the "button state" of a MotionEvent using
getButtonState(). The currently defined button states are:
BUTTON_PRIMARY, BUTTON_SECONDARY, BUTTON_TERTIARY, BUTTON_BACK, and
BUTTON_FORWARD. For convenience, the back and forward mouse buttons
are automatically mapped to the KEYCODE_BACK and KEYCODE_FORWARD keys.
Your application can handle these keys to support mouse button based
back and forward navigation.
In addition to precisely measuring the position and pressure of a
contact, some stylus input devices also report the distance between
the stylus tip and the digitizer, the stylus tilt angle, and the
stylus orientation angle. Your application can query this information
using getAxisValue() with the axis codes AXIS_DISTANCE, AXIS_TILT, and
AXIS_ORIENTATION.
Aaron Gillion's answer seems to work, but trusting detection to a device name doesn't look 100% reliable.
I checked the standard Android InputDevice and found an interesting method, supportsSource, which tells you whether the device supports e.g. SOURCE_STYLUS. Here's an example:
if (Build.VERSION_CODES.JELLY_BEAN <= Build.VERSION.SDK_INT) {
    InputManager im = (InputManager) context.getSystemService(INPUT_SERVICE);
    for (int id : im.getInputDeviceIds()) {
        InputDevice inputDevice = im.getInputDevice(id); // may be null
        // Note: supportsSource() itself was only added in API 21;
        // below that, inline the check as shown further down.
        if (inputDevice != null && (
                inputDevice.supportsSource(InputDevice.SOURCE_STYLUS) ||
                inputDevice.supportsSource(InputDevice.SOURCE_BLUETOOTH_STYLUS)))
            return true;
    }
}
return false;
If you target a lower SDK, you can inline the supportsSource method, which is ridiculously simple:
public boolean supportsSource(int source) {
    return (mSources & source) == source;
}
And the constants:
added in 23 - SOURCE_BLUETOOTH_STYLUS = 0x00008000 | SOURCE_STYLUS
added in 14 - SOURCE_STYLUS = 0x00004000 | SOURCE_CLASS_POINTER
added in 9 - SOURCE_CLASS_POINTER = 0x00000002
I've tested the code on a couple of devices and it detects the stylus correctly. I would appreciate it if you could share your results of detecting the stylus on non-Samsung devices.
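With those constants, the inlined check is pure bit arithmetic and can be verified offline (plain Java; the constant values are copied from the list above, and the class name SourceBits is made up):

```java
class SourceBits {
    static final int SOURCE_CLASS_POINTER    = 0x00000002;
    static final int SOURCE_STYLUS           = 0x00004000 | SOURCE_CLASS_POINTER;
    static final int SOURCE_BLUETOOTH_STYLUS = 0x00008000 | SOURCE_STYLUS;

    // Same logic as InputDevice.supportsSource: every bit of `source`
    // must be present in the device's source bitmask.
    static boolean supportsSource(int sources, int source) {
        return (sources & source) == source;
    }
}
```

Because SOURCE_BLUETOOTH_STYLUS is a superset of SOURCE_STYLUS, any Bluetooth stylus also passes the plain stylus check.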
I don't think there is any way to detect stylus support as such. I would assume that if the device has touch capability, it can also respond to a stylus. Android doesn't specifically support stylus input, as far as I know.
I am working on an app that will let a user take quick, click-and-forget snapshots. Most of the app is done except for getting the camera to work the way I would like. Right now the camera works, but I can't seem to find a way to disable the shutter sound, and I can't find a way to disable displaying the preview. I was able to cover the preview up with a control, but I would rather just not display it at all if possible.
To sum things up, these are the items that I would like to disable while utilizing the built in Camera controls.
Shutter sound
Camera screen display
Image preview onPictureTaken
Does anyone know of a resource that could point me in the right direction? I would greatly appreciate it. I have been following CommonsWare's example from this sample fairly closely.
Thank you.
This is actually a property in the build.prop of the phone. I'm unsure if it's possible to change it, unless you completely override it and use your own camera code, using what is available in the SDK.
Take a look at this:
CameraService.cpp
. . .
CameraService::Client::Client(const sp<CameraService>& cameraService,
        const sp<ICameraClient>& cameraClient,
        const sp<CameraHardwareInterface>& hardware,
        int cameraId, int cameraFacing, int clientPid) {
    mPreviewCallbackFlag = FRAME_CALLBACK_FLAG_NOOP;
    mOrientation = getOrientation(0, mCameraFacing == CAMERA_FACING_FRONT);
    mOrientationChanged = false;
    cameraService->setCameraBusy(cameraId);
    cameraService->loadSound();
    LOG1("Client::Client X (pid %d)", callingPid);
}
void CameraService::loadSound() {
    Mutex::Autolock lock(mSoundLock);
    LOG1("CameraService::loadSound ref=%d", mSoundRef);
    if (mSoundRef++) return;

    mSoundPlayer[SOUND_SHUTTER] = newMediaPlayer("/system/media/audio/ui/camera_click.ogg");
    mSoundPlayer[SOUND_RECORDING] = newMediaPlayer("/system/media/audio/ui/VideoRecord.ogg");
}
As you can see, the click sound is started without your interaction.
This is the service used in the Gingerbread source code.
The reason they DON'T allow this is that it is illegal in some countries. The only way to achieve what you want is to use a custom ROM.
Update
If what is being said here: http://androidforums.com/t-mobile-g1/6371-camera-shutter-sound-effect-off.html
still applies, then you could write a timer that turns the sound off (silent mode) for a couple of seconds and then turns it back on each time you take a picture.
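A hedged sketch of that workaround (Android Java; camera and jpegCallback are assumed fields, STREAM_SYSTEM is an assumption since which stream the shutter uses varies by device, and setStreamMute() is deprecated from API 23 in favour of adjustStreamVolume()):

```java
// Mute the system stream around takePicture(), then restore it shortly after.
final AudioManager audio = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
audio.setStreamMute(AudioManager.STREAM_SYSTEM, true);
camera.takePicture(null, null, jpegCallback);
new Handler().postDelayed(new Runnable() {
    @Override
    public void run() {
        audio.setStreamMute(AudioManager.STREAM_SYSTEM, false);
    }
}, 2000);
```

Note this mutes every system sound for those two seconds, not just the shutter, so it is very much a last resort.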
You may use the data from the preview callback and save it as a picture on some trigger, such as a button with an onClick listener. You could compress the image to JPEG or PNG. This way there is no ShutterCallback to implement, and therefore you can play any sound you want, or none, when taking a picture.
You can effectively hide the preview surface by giving it dimensions of 1 pixel in the XML file (I found an example that said 0, but for some reason that was giving me errors).
It may be illegal to have a silent shutter in some places, but the US doesn't appear to be such a place, as my HTC One gives me an option to silence it. In fact, since Android 4.2 you can do this:
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info); // fill in info for the camera you opened
if (info.canDisableShutterSound) {
    camera.enableShutterSound(false);
}