I have a simple scene in Unity which has an input field in it. When I run the scene on my Android device and press the input field, the Android keyboard does not show. I am connecting via USB to my laptop using the Unity Remote 5 app.
Here is my code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using TMPro;

public class InputNumber : MonoBehaviour {

    public InputField input;

    // Use this for initialization
    void Start () {
        if (input)
        {
            TouchScreenKeyboard.Open("", TouchScreenKeyboardType.Default, false, false, true);
        }
        input.keyboardType = TouchScreenKeyboardType.NumberPad;
    }

    // Update is called once per frame
    void Update () {
    }
}
There is no need to call the TouchScreenKeyboard.Open() method. The native keyboard will not show up while you are running the scene in the Unity Remote app, but it will show up on touching the input field once you build and run the app from File > Build Settings > Build or File > Build and Run.
When using the InputField component, you do not need TouchScreenKeyboard.Open to open the keyboard manually. Once the InputField is tapped, it opens the keyboard itself. Remove the unnecessary TouchScreenKeyboard.Open code.
I am connecting via USB to my laptop using Unity Remote 5 app.
That's the problem.
The InputField component will only open the keyboard when you build and run the program on the device. Unity Remote 5 is only used to detect touches on the screen and read sensors such as GPS and the accelerometer while you work in the Editor. For the features supported by Unity Remote 5, see this post.
Also, TouchScreenKeyboard.Open will not work in the Editor either. You have to build and run on the mobile device for it to work, but TouchScreenKeyboard.Open is not needed here. Just build the game and deploy it to your device, and the keyboard should open when you click on the InputField.
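For reference, a minimal sketch of the script with the manual keyboard call removed (using ContentType.IntegerNumber here as one way to request a numeric keyboard; keeping the keyboardType assignment from the question works as well):

using UnityEngine;
using UnityEngine.UI;

public class InputNumber : MonoBehaviour {

    public InputField input;

    void Start () {
        // No TouchScreenKeyboard.Open call: on a device build, the InputField
        // opens the native keyboard by itself when tapped.
        if (input) {
            // One way to request a numeric keyboard; assigning
            // input.keyboardType = TouchScreenKeyboardType.NumberPad also works.
            input.contentType = InputField.ContentType.IntegerNumber;
        }
    }
}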
You need the Cross-Platform-Input asset from the Unity Standard Assets pack in the Asset Store. It is free, and once imported into your project it will work on its own with the text field. Just import it and try your phone again.
Then you won't need:
if (input)
{
    TouchScreenKeyboard.Open("", TouchScreenKeyboardType.Default, false, false, true);
}
input.keyboardType = TouchScreenKeyboardType.NumberPad;
Unity's mobile input will simply open the keyboard when you tap the field; no extra coding is needed.
I'm trying to create a react-native app with three.js using the expo-gl and expo-three frameworks.
Following is the list of imports:
import { ExpoWebGLRenderingContext, GLView } from 'expo-gl';
import ExpoTHREE, { Renderer, TextureLoader } from 'expo-three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader';
import * as React from 'react';
import {
  AmbientLight,
  HemisphereLight,
  BoxBufferGeometry,
  Fog,
  GridHelper,
  Mesh,
  MeshStandardMaterial,
  PerspectiveCamera,
  PointLight,
  Scene,
  SpotLight,
  Camera,
  InstancedMesh,
} from 'three';
import OrbitControlsView from 'expo-three-orbit-controls';
Apart from the basic scene, camera and light setup, I'm trying to load a glb model using the ExpoTHREE.loadAsync method as below...
const loadGlb = async () => {
  const obj = await ExpoTHREE.loadAsync(
    [require('./assets/suzanne.glb')],
    null,
    null
  ).then((e) => {
    scene.add(e.scene);
    e.scene.traverse((f) => {
      if (f.isMesh) { f.material = new THREE.MeshNormalMaterial(); }
    });
  }).catch((err) => { console.log(err); });
};
loadGlb();
Using the ref: https://www.npmjs.com/package/expo-three
The model loads when I run the code in the desktop browser but not on my Android phone using the Expo app. Please let me know what I am doing wrong.
You can access the app here https://expo.io/#praful124/expo3
I had the same problem and ended up running into the fact that it does not support textures inside the glb file. But if you find a solution, I will be very happy if you share it.
I'm using Gluon to develop JavaFX applications for Android, iPhone (and desktop). When I deploy a test application to my Android phone (Marshmallow 6.0), I cannot long-press text to access the menu from which you can copy text (the context menu).
(Copying text is just an example of what you can do with a context menu; this is not a question about how to copy text on long press specifically on Android.)
This was possible on an iPhone 6 when I tested there.
How can I detect whether the device/operating system has a default context menu in Java?
On Desktop there is a default ContextMenu that is created and installed in TextFieldBehavior (private API). If you don't set your own custom context menu, that will be the one used when a ContextMenuEvent is fired (with a right click event for instance).
On mobile, both Android and iOS have a ContextMenu as well.
On iOS, it uses a native TextField (UITextField). When the long press event happens, it triggers the default context menu (on my iPad I can see a small magnifying glass, and after that the context menu shows up).
On Android, the JavaFX TextField has a custom skin, but shares the same private TextFieldBehavior as the desktop version. The problem in this case is the missing right click event that would trigger the ContextMenuEvent event.
That's why you have to fire a ContextMenuEvent manually, as described in this question.
Conclusion: so far, this is basically required only on Android:
TextField textField = new TextField();
addPressAndHoldHandler(textField, Duration.seconds(1), event -> {
    Bounds bounds = textField.localToScreen(textField.getBoundsInLocal());
    textField.fireEvent(new ContextMenuEvent(ContextMenuEvent.CONTEXT_MENU_REQUESTED,
            0, 0, bounds.getMinX() + 10, bounds.getMaxY() + 10, false, null));
});
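For reference, a sketch of the addPressAndHoldHandler helper used above, adapted from the press-and-hold pattern in the linked question (the helper and the Wrapper holder are illustrative, not Gluon API): a PauseTransition fires the handler once the press has been held for the given duration.

// Uses javafx.animation.PauseTransition, javafx.scene.Node,
// javafx.scene.input.MouseEvent, javafx.event.EventHandler, javafx.util.Duration.
private void addPressAndHoldHandler(Node node, Duration holdTime,
        EventHandler<MouseEvent> handler) {
    Wrapper<MouseEvent> eventWrapper = new Wrapper<>();
    PauseTransition holdTimer = new PauseTransition(holdTime);
    holdTimer.setOnFinished(event -> handler.handle(eventWrapper.content));

    node.addEventHandler(MouseEvent.MOUSE_PRESSED, event -> {
        eventWrapper.content = event;
        holdTimer.playFromStart();
    });
    // Cancel the timer if the press ends or turns into a drag.
    node.addEventHandler(MouseEvent.MOUSE_RELEASED, event -> holdTimer.stop());
    node.addEventHandler(MouseEvent.DRAG_DETECTED, event -> holdTimer.stop());
}

private static class Wrapper<T> { T content; }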
I'm new to mobile testing, and I am currently researching automation frameworks for mobile testing.
I've started to look into Appium and created some tests for the demo app I've made (one for iOS and the other for Android).
I've managed to write a test for each platform, but I was wondering how difficult it might be to write one generic test that runs on both platforms with minimal adjustments?
Thanks
It is possible, but you have to keep the same labels for each component across platforms. For example, to click on a button, locate it by its name instead of through XPath.
WebElement button = driver.findElement(By.name("my button"));
button.click();
More info on finding elements in the Appium docs:
http://appium.wikia.com/wiki/Finding_Elements
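For instance, a minimal sketch with the Appium Java client (driver setup omitted; the accessibility id "my button" is illustrative), where a single locator works on both platforms when the element carries the same id. Note that MobileBy was renamed AppiumBy in java-client 8:

import io.appium.java_client.AppiumDriver;
import io.appium.java_client.MobileBy;
import org.openqa.selenium.WebElement;

public class CrossPlatformClick {
    static void tapMyButton(AppiumDriver driver) {
        // MobileBy.AccessibilityId maps to the content-description on Android
        // and the accessibility identifier on iOS, so one locator covers both.
        WebElement button = driver.findElement(MobileBy.AccessibilityId("my button"));
        button.click();
    }
}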
I built an automation framework from scratch which does exactly the same thing, i.e. one code base whose tests run on both Android and iOS based on which device and app you give the test. This is how I went about doing it (I used a Java+Appium+Cucumber framework).
Following the Page Object pattern is a good practice for writing automation code.
That being said, you will have all the resource ids for Android and accessibility ids for iOS in two separate files under a folder named, say, "ObjectRepository". These files usually have the extension *.properties (each is called a properties file).
Say you have a Login button that you want to interact with on Android and iOS; you will have two files:
File 1) "androidObject.properties" which has:
Login.LoginButton=loginAndroidBtn
File 2) "iOSObject.properties" which has:
Login.LoginButton=loginiOSBtn
NOTE: In the key/value pairs above, the key "Login.LoginButton" is the same in both files; the values are the resource id and the accessibility id of the Login button in your Android and iOS applications respectively.
In your code you would do the following:
if (IS_ANDROID) {
    // Look up the platform-specific locator stored under the "Login.LoginButton"
    // key in the loaded properties file (objectRepository is a
    // java.util.Properties instance; the name is illustrative).
    DRIVER.findElementById(objectRepository.getProperty("Login.LoginButton")).click();
} else {
    DRIVER.findElementByAccessibilityId(objectRepository.getProperty("Login.LoginButton")).click();
}
In another file you would set what IS_ANDROID and IS_IOS mean. You may do something like this:
public static DeviceConfig DEVICE_CONFIG;

private void setPlatform() {
    if (DEVICE_CONFIG.platformName.equals("Android")) {
        IS_ANDROID = true;
    } else if (DEVICE_CONFIG.platformName.equals("iOS")) {
        IS_IOS = true;
    }
}
This way you can have one code base and run Android and iOS seamlessly.
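A minimal sketch (file names taken from the answer above, class name illustrative) of loading the platform-specific object repository with java.util.Properties:

import java.io.FileInputStream;
import java.util.Properties;

public class ObjectRepository {
    public static Properties load(boolean isAndroid) throws Exception {
        String file = isAndroid
                ? "ObjectRepository/androidObject.properties"
                : "ObjectRepository/iOSObject.properties";
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream(file)) {
            // After this, keys like Login.LoginButton map to the
            // platform-specific id for the running platform.
            props.load(in);
        }
        return props;
    }
}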
I wrote an application using PyQt for Windows/Linux and I want to port it to Android. The application is simple, with two levels of GUI, and I do not want to implement touch controls; I want to control the application using a USB keyboard.
I am using this project:
android-python27
to interpret my PyQt code for Android, but it doesn't work. I suppose that I have to modify the code, but I do not know how. I only found one sample for using android-python27:
import android, time

droid = android.Android()
while 1:
    droid.makeToast("Hello from Python 2.7 for Android")
    time.sleep(5)
My application is being started in this way:
app = QtGui.QApplication(sys.argv)
app.Encoding(QtGui.QApplication.UnicodeUTF8)
ui = MainMenu()

class MainMenu(QtGui.QWidget):  # main menu
    def __init__(self):
        super(MainMenu, self).__init__()
        i = 0
        self.UI(self)

    def UI(self, Form):
        ...
What should I change to adapt the code to android-python27?
I have an address search field in my app. When this field gets focus, I want the keyboard to open as in the following image.
It works fine for iOS when the keyboard type is set to Titanium.UI.KEYBOARD_NUMBERS_PUNCTUATION, as in the following code:
var search = Titanium.UI.createSearchBar({
    barColor: '#c8c8c8',
    autocorrect: true,
    hintText: 'enter address',
    height: '43dp',
    top: '75dp',
    autocapitalization: Titanium.UI.TEXT_AUTOCAPITALIZATION_WORDS,
    keyboardType: Titanium.UI.KEYBOARD_NUMBERS_PUNCTUATION
});
However, on Android it appears as in the following image.
I am using Titanium mobile SDK 1.7.5
You should probably add:
softKeyboardOnFocus : Titanium.UI.Android.SOFT_KEYBOARD_SHOW_ON_FOCUS
Unfortunately, it may be overridden by the system. Try it on another Android version (3.0, for example) if the problem persists.
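For example, a sketch of the search bar from the question with that property added (untested; softKeyboardOnFocus lives in the Titanium.UI.Android namespace, so it is ignored on iOS):

var search = Titanium.UI.createSearchBar({
    barColor: '#c8c8c8',
    autocorrect: true,
    hintText: 'enter address',
    height: '43dp',
    top: '75dp',
    autocapitalization: Titanium.UI.TEXT_AUTOCAPITALIZATION_WORDS,
    keyboardType: Titanium.UI.KEYBOARD_NUMBERS_PUNCTUATION,
    // Android-only hint: ask the soft keyboard to show when the field gains focus.
    softKeyboardOnFocus: Titanium.UI.Android.SOFT_KEYBOARD_SHOW_ON_FOCUS
});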