While writing a simple Android application in Qt, I ran into an uncomfortable dilemma:
I have a QWidget subclass called PlotView, in which I have reimplemented the event() function:
bool PlotView::event(QEvent *event)
{
    if (event->type() == QEvent::Gesture) {
        emit gestureEvent(static_cast<QGestureEvent*>(event));
        return true;
    }
    return QWidget::event(event); // Line A
}
I also have the following lines in the class constructor:
this->grabGesture(Qt::SwipeGesture);
this->grabGesture(Qt::PanGesture);
this->grabGesture(Qt::PinchGesture);
this->setAttribute(Qt::WA_AcceptTouchEvents);
What I find very peculiar is that when I run the application like this, it does not recognize any gestures. However, when I remove the last line of the function (Line A above), the gestures are suddenly recognized, but the widget is no longer painted.
Some specs: I am using Qt 5.2.0, deploying to a Samsung Galaxy Note 10.1 (GT-N8010) running Android 4.1.2.
Does anyone have any suggestions as to how I could get this to run with the widget being painted and gestures being recognized at the same time?
It seems adding the lines
this->grabGesture(Qt::SwipeGesture);
this->grabGesture(Qt::PanGesture);
this->grabGesture(Qt::PinchGesture);
this->setAttribute(Qt::WA_AcceptTouchEvents);
to the parent class constructor fixed the problem.
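For clarity, here is a minimal sketch of what that looks like, assuming "parent class" means the widget that contains the PlotView (the class name PlotContainer is purely illustrative):
#include <QWidget>

class PlotContainer : public QWidget
{
public:
    explicit PlotContainer(QWidget *parent = nullptr)
        : QWidget(parent)
    {
        // Grabbing the gestures on the containing widget as well lets
        // gesture recognition and normal painting coexist.
        grabGesture(Qt::SwipeGesture);
        grabGesture(Qt::PanGesture);
        grabGesture(Qt::PinchGesture);
        setAttribute(Qt::WA_AcceptTouchEvents);
    }
};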
I'm trying to work around a bug in a material class in the Qt3D Extras module:
https://bugreports.qt.io/browse/QTBUG-109574. I'm trying to accomplish this running on my Nokia G20 smartphone, using Qt 6.2.4.
I believe the problem is that the wrong QTechnique is being used at runtime. The implementation of QPhongMaterial contains a few different techniques with graphics API filters for OpenGL and OpenGL ES, versions 2 and 3, as seen here: https://github.com/qt/qt3d/blob/6.2.4/src/extras/defaults/qdiffusespecularmaterial.cpp.
To try to influence the QTechnique choice, I've tried to override the rendering API and the surface format to be compatible with my smartphone:
QSurfaceFormat surfaceFormat;
surfaceFormat.setMajorVersion(2);
surfaceFormat.setMinorVersion(0);
surfaceFormat.setProfile(QSurfaceFormat::OpenGLContextProfile::NoProfile);
qputenv("QSG_RHI_BACKEND", "gles2");
I've also tried to write my own class derived from QMaterial, but that is a separate issue: with it I get a black screen in my viewport, which I can't debug because the Android GPU Inspector doesn't support my device.
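For reference, this is roughly the structure I have in mind for such a material — a minimal sketch of a QMaterial subclass whose single technique is filtered to OpenGL ES 2.0 (the class name and shader paths are placeholders):
#include <Qt3DCore/QNode>
#include <Qt3DRender/QMaterial>
#include <Qt3DRender/QEffect>
#include <Qt3DRender/QTechnique>
#include <Qt3DRender/QRenderPass>
#include <Qt3DRender/QShaderProgram>
#include <Qt3DRender/QFilterKey>
#include <Qt3DRender/QGraphicsApiFilter>
#include <QUrl>

class Es2Material : public Qt3DRender::QMaterial
{
public:
    explicit Es2Material(Qt3DCore::QNode *parent = nullptr)
        : Qt3DRender::QMaterial(parent)
    {
        auto *effect = new Qt3DRender::QEffect(this);
        auto *technique = new Qt3DRender::QTechnique();
        auto *pass = new Qt3DRender::QRenderPass();
        auto *shader = new Qt3DRender::QShaderProgram();
        auto *filterKey = new Qt3DRender::QFilterKey(this);

        // Restrict this technique to OpenGL ES 2.0, the API the phone supports.
        technique->graphicsApiFilter()->setApi(Qt3DRender::QGraphicsApiFilter::OpenGLES);
        technique->graphicsApiFilter()->setMajorVersion(2);
        technique->graphicsApiFilter()->setMinorVersion(0);
        technique->graphicsApiFilter()->setProfile(Qt3DRender::QGraphicsApiFilter::NoProfile);

        // Match the default forward renderer's technique filter.
        filterKey->setName(QStringLiteral("renderingStyle"));
        filterKey->setValue(QStringLiteral("forward"));
        technique->addFilterKey(filterKey);

        // Placeholder ES 2.0 shaders shipped with the application.
        shader->setVertexShaderCode(Qt3DRender::QShaderProgram::loadSource(
            QUrl(QStringLiteral("qrc:/shaders/es2/simple.vert"))));
        shader->setFragmentShaderCode(Qt3DRender::QShaderProgram::loadSource(
            QUrl(QStringLiteral("qrc:/shaders/es2/simple.frag"))));

        pass->setShaderProgram(shader);
        technique->addRenderPass(pass);
        effect->addTechnique(technique);
        setEffect(effect);
    }
};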
My Cardboard-like VR viewer has a button that works by touching the screen. I created an app in Unity3D, and this trigger mechanic worked like a charm at first. Now, all of a sudden (I think all I added was an explosion particle effect), the touch function has stopped working completely. I have tried things like removing the explosion from my scene again, but nothing seems to work. Another curious thing is that I can't close the app in the normal way anymore: normally in VR apps you have an X button in the top left of the screen, but tapping it doesn't do anything anymore either (it used to work!). The app still runs and doesn't crash, but no interaction is possible. I looked at the debug logs via adb and found no errors there. The app works as it used to when I start it inside the Unity Editor.
Has anyone encountered a similar error, or does anyone have an idea what the problem might be? I'm using Unity Daydream Preview 5.4.2f2.
Edit: I forgot to mention I was using GvrViewer.Instance.Triggered to check if the screen was touched.
For anyone having the same problem: I worked around it by also checking whether a touch had just ended. In my Player : MonoBehaviour I used:
void Update()
{
    // GvrViewer.Instance.Triggered stopped firing, so a touch that has just
    // ended is also treated as a trigger press.
    if (GvrViewer.Instance.Triggered ||
        (Input.touchCount > 0 && Input.touches[0].phase == TouchPhase.Ended))
    {
        // Do stuff.
    }
}
I developed a simple JavaFX application to be ported to the Android environment; however, I can't type any characters into the TextField. I guess it's a bug. How can I fix it?
The problem is not present on a Galaxy S5 with Android 5.0.1, but on a Galaxy Tab 4 with Android 5.0.2 it doesn't work: I type, but nothing is displayed.
I tried with a normal TextField and the problem persists; I have also added the properties.
Another strange thing is that the space bar and the Del button are recognized, but the text is not.
The example code is very simple:
Rectangle2D visualBounds = Screen.getPrimary().getVisualBounds();
double width = visualBounds.getWidth();
double height = visualBounds.getHeight();

TextField tt = new TextField();
tt.setTranslateY(-150);

StackPane stackPane = new StackPane();
stackPane.getChildren().addAll(tt);

BorderPane borderpane = new BorderPane(); // declared here for completeness
borderpane.setCenter(stackPane);

Scene scene = new Scene(borderpane, width, height);
stage.setScene(scene);
stage.show();
Assuming that CustomTextField is just a custom TextField, this is a known issue, not related to the CustomTextField itself, given that it works on other devices.
If you debug it:
./adb logcat -v threadtime
you will surely find an exception that explains the issue: a StackOverflow exception.
On older devices it can be solved by creating a java.custom.properties file and including this property in it:
monocle.stackSize=128000
You may also include this one:
monocle.platform=Android
(it will soon be included by default in the next version)
Put the file at the root of your classpath, e.g. in the folder src/android/resources of your project.
Build and deploy the project on your mobile and check again.
I was trying to make something in Adobe Flash with AIR for Android. I simply made a square, converted it to a Symbol (called 'hello'), and entered this code:
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
hello.addEventListener(TouchEvent.TOUCH_TAP, tap);
function tap(event:TouchEvent):void
{
hello.x+=15;
}
but nothing happened. I even used Code Snippets, and also tested this on my phone (ALCATEL OneTouch Idol Mini), and it also said that there are no errors. What have I done wrong?
You set everything up correctly but forgot one important thing: checking whether the system supports touch events.
if(Multitouch.supportsTouchEvent)
The following problem seems unique to Android 2.1; it happens both on an emulator and on a Nexus. The same example works fine on the other platforms I've tested (1.5, 1.6 and 2.0 emulators).
I've created a gestureListener as described in this post.
The difference is that I've added the listener to a TextView which also has a context menu registered, i.e. something like the following:
onCreate(...) {
    ...
    // Layout contains a large TextView on which I want to add a context menu
    tv = (TextView) findViewById(R.id.text_view);
    registerForContextMenu(tv);

    // Create the gestureListener according to the above-mentioned post.
    gestureListener = ...

    // Set the listener on the TextView.
    tv.setOnTouchListener(gestureListener);
    ...
}
When testing it, the correct gesture is recognized all right, but every other time it also causes the context menu to open.
As the same example works on non-2.1 platforms, I have a feeling it is not my code that is the problem...
Thankful for any suggestions.
Update:
It seems that the return value is flipped somewhere. If I let onFling() return the "wrong" value, i.e. true when the event is skipped and false when it was consumed, it works correctly on 2.1. But of course, that doesn't work on the other platforms. Seems like it's time for an ugly workaround...
Thanks for the link, steelbytes. I implemented the cancel-and-return-false solution from the last comment (Dec 27, 2010), but just for my onFling event, and it appears to work on 1.6 as well as 2.x devices.