I've been looking for a way to detect finger orientation, together with a simplified representation of the touch area as an ellipse, on a mobile touch device. I chose Android rather than iOS because I found three useful methods in the Android reference for the MotionEvent class: getTouchMajor(), getTouchMinor(), and getOrientation().
But when I call these three methods in my app, they sadly do not return the values I want. The orientation stays at 0 no matter what, while getTouchMajor() and getTouchMinor() are exactly the same every time.
So my question is: am I doing something wrong, or are these methods just not implemented yet?
(I've tried the functions on several different devices including: Nexus 5, Nexus 7, and HTC One)
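For reference, this is roughly how I'm reading the values inside onTouchEvent() (a simplified sketch of my actual code, logging only; TouchEllipseView is just an illustrative name):

import android.content.Context;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

public class TouchEllipseView extends View {
    public TouchEllipseView(Context context) { super(context); }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Size and orientation of the touch ellipse for the first pointer.
        float major = event.getTouchMajor(0);   // major axis length, in pixels
        float minor = event.getTouchMinor(0);   // minor axis length, in pixels
        float angle = event.getOrientation(0);  // radians, clockwise from vertical
        Log.d("TouchEllipse", "major=" + major + " minor=" + minor + " orientation=" + angle);
        return true;
    }
}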
As we work together I know you already know this answer, but I wanted to post it here so that fellow Stack Overflow users can find it as well.
We have found comments suggesting that the touchscreen drivers on most devices simply do not provide this data to the system.
We originally tested: Samsung Galaxy S2, HTC One, Nexus 5 (by LG), Nexus 7 (by Asus), and Samsung Galaxy Tab 3.
Unlike the others, the Samsung Galaxy Tab 3 gave different values for getTouchMajor() and getTouchMinor(), but they were always in the fixed relationship getTouchMajor() = getTouchMinor() * 3, and getOrientation() was always 0, as on all the other devices.
About 2 months later we discovered that the Google Nexus 10 can show an ellipse with a direction line, when you activate Input: Pointer Location under developer settings.
The initial conclusion was that most devices do not support getTouchMajor(), getTouchMinor(), or getOrientation(), which could be a limitation of their capacitive touch screens.
But seeing the Nexus 10 track the ellipse and its orientation gives hope for new interaction design.
It indicates to me that some devices do deliver meaningful values for getTouchMinor() and getTouchMajor() as well as the orientation (or historical versions of the same functions).
I have not had the chance to code anything for the device myself but it seems plausible.
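One way to check at runtime whether a device even claims to report these axes is to query the InputDevice motion ranges; a null (or zero-width) range is a strong hint, though not a guarantee, that the driver never supplies the data. A minimal sketch, called with the event from onTouchEvent() (TouchAxisSupport and reportsAxis are just illustrative names):

import android.util.Log;
import android.view.InputDevice;
import android.view.MotionEvent;

public final class TouchAxisSupport {
    // Returns true if the device that generated this event advertises the given axis,
    // e.g. MotionEvent.AXIS_TOUCH_MAJOR, AXIS_TOUCH_MINOR or AXIS_ORIENTATION.
    public static boolean reportsAxis(MotionEvent event, int axis) {
        InputDevice device = event.getDevice();
        if (device == null) {
            return false;
        }
        InputDevice.MotionRange range = device.getMotionRange(axis, event.getSource());
        boolean supported = range != null && range.getRange() > 0f;
        Log.d("TouchAxisSupport", "axis " + axis + " supported=" + supported);
        return supported;
    }
}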
Related
I used to extract accelerometer data through the Android SDK on a Samsung Galaxy S4,
and I encountered a strange problem when I changed platform to the HTC One M8.
Here is a description of the experiment:
I move the device in a straight line from place A to place B (along the device's negative axis), then read the accelerometer data and compute the displacement between A and B.
The curve of the accelerometer data on the S4 is correct:
it contains two peaks with opposite signs, shaped like an 'S' lying on its side.
But when I use the M8, it gives me a curve that is obviously wrong; it looks like a 'W'.
P.S. The motion and the program are exactly the same on both devices.
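For reference, the data is collected with a plain SensorEventListener and integrated twice to get displacement, roughly like this (a simplified sketch of my setup; class and variable names are illustrative):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class DisplacementTracker implements SensorEventListener {
    private long lastTimestampNs = 0;
    private float velocity = 0f;      // m/s along the device's y axis
    private float displacement = 0f;  // metres

    public void start(SensorManager sensorManager) {
        // TYPE_LINEAR_ACCELERATION already has gravity removed; with TYPE_ACCELEROMETER
        // you would have to subtract gravity yourself.
        Sensor sensor = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
        sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // timestamps are in ns
            float ay = event.values[1];                             // acceleration along y, m/s^2
            velocity += ay * dt;            // first integration: acceleration -> velocity
            displacement += velocity * dt;  // second integration: velocity -> displacement
        }
        lastTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}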
Can anyone give me a reason for the difference?
Is the g-sensor on the M8 the problem?
I'm really stuck on it.
Thanks.
From what I understand from trying to use the Moves app, this is due to lack of support for reading the accelerometer while the device's display is off.
I am looking for a way to change that now and if I find something I will update here.
Update:
It seems this could just be due to new hardware not being supported yet. This quote leads me to believe that: "...are the low-powered, always listening 'Smart Sensors.' Accelerometers are nothing new, but HTC's can be used by apps all day long without significant drain on the battery (as they don't fire up the processor etc). As HTC's opened up the API for these -- dubbed HTC's Smart Sensor Hub -- app developers will be able to hook into this information directly."
I have an OpenGL ES 2 app running on Android. I have tested on a few devices:
Samsung Galaxy S2
LG Optimus G
HTC One X
Kindle Fire
Kindle Fire HD
And the app runs as expected. However, there is a lingering issue on my Samsung Galaxy S3. In my demo, I render a bunch of spheres. I can also pan the camera around by touching and dragging my finger on the screen.
What I notice is "ghosting" when I move the camera. It's difficult to describe, but I can see the previous outlines of the sphere as I move the camera. And, I can continue to see the previous outlines as the camera moves. I don't see all the previous outlines -- only the last few (it's difficult to quantify things here). And, I only see the outlines within the sphere -- as far as I can tell, the previous outlines cease to exist outside of the sphere.
However, once the camera stops, the outlines catch up and disappear within ~1s. Simply put, when things are stationary, everything renders correctly.
I recently had some texturing issues (related to mipmapping) and I solved them the other day. The problem and solution are outlined here:
Black Artifacts on Android in OpenGL ES 2
Could my texturing fix be related to this? I realize that I'm leaving out A LOT of details, but I'm wondering if the symptoms are enough to go on? Any ideas?
Thanks.
Additional details:
The ghosting does not show up when taking a screenshot using the NDK.
A photo of the problem:
A temporary workaround: on your phone, under Developer options, check the box for "Disable hardware overlays."
I'm not yet sure if there's a way to force this behavior programmatically while your app is running.
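There's no public API that I know of to toggle that setting from code. If you also want to rule out stale framebuffer contents on the rendering side, make sure every frame starts with a full clear. A minimal sketch, not confirmed as the cause here (ClearingRenderer is just an illustrative name):

import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class ClearingRenderer implements GLSurfaceView.Renderer {
    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        GLES20.glClearColor(0f, 0f, 0f, 1f);
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        // Clear both colour and depth every frame so nothing from the
        // previous frame can show through while the camera moves.
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        // ... draw the spheres here ...
    }
}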
At glowscript.org there are various demo programs, written in JavaScript or CoffeeScript, that involve very little code.
For example, the one-line program box() creates a 3D cube that can be rotated and zoomed, thanks to many defaults (which can be overridden), including basic lighting (two distant lights and some ambient lighting).
Problem:
These programs run fine in many browsers on Windows, Mac, and Linux, but in Firefox on the Samsung Galaxy S3 they are very dark. The appearance indicates that ambient light works (increasing it makes the scene bright) but the distant lights don't work (no difference with them on or off). I've tried running some WebGL demos found on the web and they look fine.
Can anyone think of where I should look for the problem? Why should the behavior be so different between desktop/laptop behavior and what happens on the Galaxy S3?
I fixed the problem on the Galaxy smartphone and added the following to the GlowScript help:
"Most tablets and smart phones do not yet support WebGL, though this is likely to change. On the Samsung Galaxy S3 smartphone, Firefox and Opera do run GlowScript programs, though animations are slow, transparency is buggy, and currently there is no way to zoom and rotate. There are reports that GlowScript also works on the Sony Experia smartphone."
The problem was that the Galaxy shader compiler does not handle for loops correctly. In the fragment shader there was a for loop over the various lights (up to 8 lights). Variables set in the for loop were often set to zero instead of to the specified value. The solution consisted of replacing the loop with a straight-line structure like this, where LP(i) and LC(i) are light positions and colors:
if (light_count == 0) return;
calc_color(LP(0), LC(0));
if (light_count == 1) return;
calc_color(LP(1), LC(1));
if (light_count == 2) return;
etc.
Yuck. Fortunately we only have to support a finite number of lights.
I have an application which uses Android ViewPager (7 views, pretty complex with images and animations). This application runs fantastically on my Asus Eee Pad. It is just gorgeous.
I also have a Galaxy Tab 2 10.1 which can barely run the app... even though it's brand new and theoretically more powerful than my Eee Pad. I would like to mention that both devices run Android 4.0.3 and their screen sizes are identical (1280x800). The application runs in landscape mode.
The application does not use sql, or internet access, just animations on different views and the swipe effect of the ViewPager.
I've checked, I think, everything: the heap, lint, basically all the available profiling tools, but nothing seems to improve my UI performance on the Galaxy Tab 2... I wonder if there is ultimately some hardware difference, where the Asus uses the GPU to render the UI while the Galaxy Tab 2 uses its CPU...
I was wondering, finally, if any of you have noticed similar issues on the Galaxy Tab or other tablets when comparing your app's behavior across multiple targets...
Cheers!
Paul
--- added September 6th ---
Well, it seems that, even though the Galaxy tablet should use hardware acceleration, it does not. When I force it in each XML layout with android:layerType="hardware" and add
v.setLayerType(View.LAYER_TYPE_HARDWARE, null);
ObjectAnimator oaAlpha = ObjectAnimator.ofFloat(v, "alpha",0f, 1f);
it finally works better... even if it is still not as smooth as on the Asus.
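For completeness, the pattern I ended up with enables the hardware layer only for the duration of the animation and resets it afterwards; a sketch (FadeHelper is just an illustrative name, the listener is the only addition to the two lines above):

import android.animation.Animator;
import android.animation.AnimatorListenerAdapter;
import android.animation.ObjectAnimator;
import android.view.View;

public final class FadeHelper {
    public static void fadeIn(final View v) {
        // Render the view into a hardware layer while it animates...
        v.setLayerType(View.LAYER_TYPE_HARDWARE, null);
        ObjectAnimator oaAlpha = ObjectAnimator.ofFloat(v, "alpha", 0f, 1f);
        oaAlpha.addListener(new AnimatorListenerAdapter() {
            @Override
            public void onAnimationEnd(Animator animation) {
                // ...and drop the layer again when the animation finishes,
                // so it does not keep consuming GPU memory.
                v.setLayerType(View.LAYER_TYPE_NONE, null);
            }
        });
        oaAlpha.start();
    }
}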
Finally, I discovered - and this is really odd - that the performance issue was caused by the font size being too large!?!
It was a sad experience... but for some stupid reason the Galaxy Tab 2 has major issues rendering text drawn in very large font sizes, and it seems specific to the Galaxy Tab (so I imagine it's hardware related). So just redesign your app with smaller fonts, and don't forget to complain to Samsung.
I have an app on the Android Market called Speed Anatomy. It has been working and stable for months. Before Android 3.2 came out with its pixel-scaling feature, my (canvas-based) app appeared un-scaled on some tablets, with a large black border, so I ended up implementing pixel scaling myself; it therefore doesn't use the new 3.2 mechanism. This was fairly easy:
public void setSurfaceSize(int width, int height) {
    synchronized (mSurfaceHolder) {
        matrix.reset();
        // Scale my fixed 320x480 coordinate space to fit the actual surface.
        float canvasScaleFactorY = height / (480.0f * scale);
        float canvasScaleFactorX = width / (320.0f * scale);
        float canvasScaleFactor = Math.min(canvasScaleFactorY, canvasScaleFactorX);
        matrix.postScale(canvasScaleFactor, canvasScaleFactor);
        ...

public void doDraw(Canvas canvas) {
    canvas.setMatrix(matrix);
    ...

boolean doTouchEvent(MotionEvent event) {
    float[] xy = new float[2];
    xy[0] = event.getX();
    xy[1] = event.getY();
    matrixI.set(matrix);
    matrixI.invert(matrixI);
    // Map the touch coordinates back to the original unscaled coordinate space.
    matrixI.mapPoints(xy);
    ...
This works with all devices I have tested, including most tablets, the simulators on 3.0, 3.1 etc.
Last week I received an e-mail from a user saying he had a Sony S tablet with Android 3.2 and the touch coordinates were off by a few centimeters. As I don't have an actual tablet, I went to my local Staples store, where they have multiple Android tablets on display. I loaded my app on an Eee Transformer with Android 3.2 and a Galaxy Tab 10.1 with Android 3.1, and they both ran my app flawlessly.
So I figured the user had made a mistake and used an older version of my app (although he specifically told me he had the latest version of my app from the Android Market).
Yesterday night I was at a concert and there was a booth from Canadian carrier Telus with some Galaxy Tabs 10.1 on display. As I was waiting for some friends and had some time to kill, I loaded my apps on it to do one last test.
To my surprise, the touch detection was all off! Basically it acted as if the scaling was only done in doDraw(): the app was full screen and pixel scaled (except for text, which gets rendered at hi-res with my method), but the touch coordinates were not scaled. That is, I had to touch the screen within the top-left 320x480 corner or the touches would register outside the screen.
So I have two Galaxy Tabs 10.1 with Android 3.1: touches scale properly on one but not the other. I also have a user who reports problems on a Sony S with Android 3.2, while it works correctly for me on a Transformer with 3.2. There seems to be a problem with the matrix operations on the devices that don't work. Either it's an intermittent problem or there is some other factor I haven't thought of. Oh, I just thought of something: it's probable that the second GT 10.1 was a 3G model, since it was outside on the street while the others were Wi-Fi (I don't know about the user's), but I can't see that affecting matrix transformations, right?
I've never had problems with Android Fragmentation before this, in fact it has been easier than dealing with the different iOS versions.
Any clue what could be causing this and how I can fix it?
There is a free version of the app if you want to try it. Let me know if it works on your device.
EDIT: I just thought of something else. Is it possible that the Android Market is randomly serving an older version of my app once in a while? I'm not sure, but I think this bug may have existed in a version that was briefly up on the Market. It was fixed and updated more than a month ago, however.
After weeks of struggle, I believe I have found the source of the problem in the method:
void android.graphics.Matrix.mapPoints(float[] dst, float[] src)
Since: API Level 1
Apply this matrix to the array of 2D points specified by src, and write the transformed points into the array of points specified by dst. The two arrays represent their "points" as pairs of floats [x, y].
Parameters
dst  The array of dst points (x,y pairs)
src  The array of src points (x,y pairs)
On some devices, this method cannot take the same array for src and dst. I myself have never been able to reproduce the bug; I debugged it blindly by changing things in my code, making releases, and asking some users who had reported the problem whether it was fixed.
Changing every instance of:
matrix.mapPoints(xy,xy);
to
float[] tempxy = new float[2];
tempxy[0] = xy[0]; tempxy[1] = xy[1];
matrix.mapPoints(xy,tempxy);
seems to have fixed things.
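To keep the workaround in one place, I wrapped it in a small helper so no call site can accidentally pass the same array twice (MatrixCompat and mapPointsSafe are just illustrative names). Calls like matrix.mapPoints(xy, xy) then become MatrixCompat.mapPointsSafe(matrix, xy):

import android.graphics.Matrix;

public final class MatrixCompat {
    // Maps the points in place, but always hands Matrix.mapPoints() a separate
    // source array, since some devices mis-handle src == dst.
    public static void mapPointsSafe(Matrix matrix, float[] xy) {
        float[] src = new float[xy.length];
        System.arraycopy(xy, 0, src, 0, xy.length);
        matrix.mapPoints(xy, src);
    }
}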
I found another dev reported the issue here:
http://androidforums.com/developer-101/172225-android-graphics-matrix-mappoints-fail.html
This dev mentions it only happening on older versions of Android, whereas to me it looks like a regression that appears when users upgrade to Gingerbread. It doesn't affect all Gingerbread devices, though, as my Galaxy SII X was never affected. I was unable to reproduce it on the Android emulator.
Another hint that Gingerbread is affected is that lately, with its rising popularity, users have been reporting the issue more often, and the active installation count of my app started to go down. After I implemented the fix, it started going up again.
I hope this helps someone.
Matrix invert_m = new Matrix();
matrixI.invert(invert_m);
invert_m.mapPoints(xy);
You need a new Matrix instance to hold the inverse.