First of all, sorry for my English. I'm programming an augmented reality browser for Android following the steps in Raghav Sood's book, Pro Android Augmented Reality. The thing is that when I launch the example from chapter 9 (named Pro Android AR 9 and located at https://github.com/RaghavSood/ProAndroidAugmentedReality) on my devices, it works or fails depending on the kind of device.
On my Sony Ericsson Xperia Arc with Android 4.0.4, the browser implemented by Raghav works really well. When I test the augmented reality browser on an Asus Eee Pad Transformer or a Samsung Galaxy Tab, both of them with Android 4.0.3, the readings from the sensors seem to be wrong. Holding the tablet in landscape mode in front of me, if I focus on one spot (Wikipedia, Twitter, or local markers) and turn to my right, the spots move down; if I turn to my left, the spots move up. If I want the spots to move left, I have to tilt the device so its screen faces the ground. Likewise, if I tilt the device so its screen faces up, the spots move right.
It seems like the program is confusing the sensor axes. Does anyone know how to solve the problem? Thanks in advance.
I made it work. I don't know if it's a good solution, but it seems to work fine on tablets.
Changing the axis remapping to:
SensorManager.remapCoordinateSystem(temp, SensorManager.AXIS_X, SensorManager.AXIS_Y, rotation);
in SensorActivity seems to fix the axis malfunction. Another problem was that the text associated with each POI appeared rotated on tablets in landscape mode. To avoid this, I applied a conditional rotation adjustment in the set method of the PaintablePosition class:
if (drawObj.toString().toLowerCase().contains("paintableboxedtext")) rotation = rotation + 270;
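For context, this is roughly where the remapping call sits in a sensor listener. A minimal sketch, not the book's actual SensorActivity; the array names temp and rotation follow the snippet above, everything else is assumed:

// inside a class implementing SensorEventListener:
private final float[] temp = new float[9];     // rotation matrix computed from the sensors
private final float[] rotation = new float[9]; // remapped matrix actually used for drawing

@Override
public void onSensorChanged(SensorEvent evt) {
    // However temp is produced (e.g. SensorManager.getRotationMatrix from
    // accelerometer + magnetometer readings), remap it before projecting POIs:
    SensorManager.remapCoordinateSystem(temp,
            SensorManager.AXIS_X, SensorManager.AXIS_Y, rotation);
    // ... use "rotation" to place the markers on screen ...
}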
Launching the GVRDemo scene built with the Unity5.4.2f2-GVR13 installation package, I'm experiencing unsteady tracking on my Samsung Galaxy S7. As you can see in the attached video, it's not caused by a low frame rate or by any of my code, since the only thing I've changed in the scene is swapping the cubeRoom object for a textured sphere object, which better visualizes the issue:
https://youtu.be/_NRQNbtdpuI
It doesn't matter if I change the quality setting from Fantastic to Fastest.
As you can see, the frame rate doesn't drop around the stuttering moments, so it's not about CPU/GPU performance.
When I test the Google Street View app, for example, there's no such issue. Is that because it's been written natively for Android?
On the other hand, I've noticed that games like VR Fantasy have a tracking system that behaves differently: more smoothly, due to a delay in reaction to the device's movement (it looks nice, but causes nausea after 5 seconds). This makes me believe there is an issue with Google VR tracking.
Is anyone experiencing the same thing? What might be the reason for it?
I am also testing a simple VR app made with Unity (5.6b9), and I'm finding that Android performance is rather poor. This is the case on cheap phones (Moto G, $150) as well as fancy phones (Nexus 5X, Asus ZenFone 3) and even expensive phones (Samsung S6).
I'm particularly puzzled by how poor the performance of a VERY simple Unity VR app is (empty scene, a cube and a sphere, no special lighting, single-pass rendering...). The Samsung S6 performs very well with native Gear VR apps and photos/videos. All the phones perform very well with things like Street View or YouTube.
The same Unity app running on iOS outperforms all the Android phones by a wide margin.
Are there some tricks we're not aware of for getting performance out of Android?
I'm using Unity5.4.2f2-GVR13 too, and I think it's just random.
Charging up your phone and controller, and rebooting the phone seems to help a lot.
I used to extract accelerometer data through the Android SDK on a Samsung Galaxy S4, and I encountered a strange problem when I changed platforms to the HTC One M8.
Here is a description of the experiment:
I move the device in a straight line from place A to place B (along the device's negative direction), read the accelerometer data, and compute the displacement between A and B.
The accelerometer curve on the S4 is correct: it contains two peaks with opposite signs, shaped like an 'S' lying on its side.
But when I use the M8, it gives me a curve that is obviously wrong; it looks like a 'W'.
P.S. The motion and the program are exactly the same on both devices.
Can anyone give me a reason for the difference? Is the g-sensor on the M8 the problem? I'm really stuck on it. Thanks.
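For reference, the computation described amounts to double-integrating the acceleration samples over time. A minimal sketch of that idea (assumed for illustration, not the asker's actual code):

// inside a SensorEventListener registered for the accelerometer:
private float velocity = 0f;      // integrated once
private float displacement = 0f;  // integrated twice
private long lastNs = 0;

@Override
public void onSensorChanged(SensorEvent event) {
    if (lastNs != 0) {
        float dt = (event.timestamp - lastNs) * 1e-9f; // nanoseconds to seconds
        float a = event.values[0]; // acceleration along the axis of motion
        velocity += a * dt;            // acceleration -> velocity
        displacement += velocity * dt; // velocity -> displacement
    }
    lastNs = event.timestamp;
}

Note that any constant bias in the samples is integrated twice and therefore grows quadratically in the displacement, so a sensor with different noise or calibration characteristics can produce a very different-looking curve from the same motion.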
From what I understand after trying to use the Moves app, this is due to lack of support for the accelerometer when the device's display is off.
I am looking for a way to change that now, and if I find something I will update here.
Update:
It seems this could just be due to new hardware not being supported yet. This quote leads me to believe so: "...are the low-powered, always listening 'Smart Sensors.' Accelerometers are nothing new, but HTC's can be used by apps all day long without significant drain on the battery (as they don't fire up the processor, etc.). As HTC's opened up the API for these -- dubbed HTC's Smart Sensor Hub -- app developers will be able to hook into this information directly."
I have an OpenGL ES 2 app running on Android. I have tested on a few devices:
Samsung Galaxy S2
LG Optimus G
HTC One X
Kindle Fire
Kindle Fire HD
And the app runs as expected. However, there is a lingering issue on my Samsung Galaxy S3. In my demo, I render a bunch of spheres. I can also pan the camera around by touching and dragging my finger on the screen.
What I notice is "ghosting" when I move the camera. It's difficult to describe, but I can see the previous outlines of the sphere as I move the camera. And, I can continue to see the previous outlines as the camera moves. I don't see all the previous outlines -- only the last few (it's difficult to quantify things here). And, I only see the outlines within the sphere -- as far as I can tell, the previous outlines cease to exist outside of the sphere.
However, once the camera stops, the outlines catch up and disappear within ~1s. Simply put, when things are stationary, everything renders correctly.
I recently had some texturing issues (related to mipmapping) and I solved them the other day. The problem and solution are outlined here:
Black Artifacts on Android in OpenGL ES 2
Could my texturing fix be related to this? I realize that I'm leaving out A LOT of details, but are the symptoms enough to go on? Any ideas?
Thanks.
Additional details:
The ghosting does not show up when taking a screenshot using the NDK.
A photo of the problem: [image omitted]
A temporary workaround: on your phone, under Developer options, check the box to "Disable hardware overlays."
I'm not yet sure if there's a way to force this behavior when running your app.
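That would also square with the screenshot detail above: a GL read-back captures the framebuffer before the display controller composites hardware overlays, so overlay artifacts never appear in it. A minimal Java sketch of such a read-back, assuming a GLES20 renderer (the screenshot code mentioned above uses the NDK, so this is only an approximation):

// Must run on the GL thread, after the frame has been drawn.
// Uses java.nio.ByteBuffer/ByteOrder and android.opengl.GLES20.
ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);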
At glowscript.org there are various demo programs, written in JavaScript or CoffeeScript, that involve very little code.
For example, the one-line program box() creates a 3D cube that can be rotated and zoomed, thanks to many defaults (which can be overridden), including basic lighting (two distant lights and some ambient lighting).
Problem:
These programs run fine in many browsers on Windows, Mac, and Linux, but in Firefox on the Samsung Galaxy S3 they are very dark. The appearance indicates that ambient light works (increasing it makes the scene bright) but the distant lights don't work (no difference with them on or off). I've tried running some WebGL demos found on the web and they look fine.
Can anyone think of where I should look for the problem? Why should the behavior on the Galaxy S3 be so different from what happens on desktops and laptops?
I fixed the problem on the Galaxy smartphone and added the following to the GlowScript help:
"Most tablets and smart phones do not yet support WebGL, though this is likely to change. On the Samsung Galaxy S3 smartphone, Firefox and Opera do run GlowScript programs, though animations are slow, transparency is buggy, and currently there is no way to zoom and rotate. There are reports that GlowScript also works on the Sony Experia smartphone."
The problem was that the Galaxy's shader compiler does not handle for loops correctly. In the fragment shader there was a for loop over the various lights (up to 8 lights). Variables set in the loop were often set to zero instead of to the specified value. The solution consisted of replacing the loop with a straight-line structure like this, where LP(i) and LC(i) are light positions and colors:
if (light_count == 0) return;
calc_color(LP(0), LC(0));
if (light_count == 1) return;
calc_color(LP(1), LC(1));
if (light_count == 2) return;
etc., continuing the pattern up to the eighth light (index 7).
Yuck. Fortunately we only have to support a finite number of lights.
I have a small application that uses compass readings on my device (an Xperia Neo V). When testing it, I got very satisfying results when pointing in the desired direction. But then I tested it on a Samsung Galaxy S3, running the same application on my Neo V at the same time, and the two devices did not show exactly the same direction. There was a drift of about 10 degrees.
So, as an electronics engineer, I know that different digital compass sensors may give different values due to their design, but I couldn't find an efficient solution to that yet. Any ideas?
I would try some experiments using rotations in different directions on both devices in the same environment, to look for any correlation in the results. Possibly there's a constant offset that should be removed. A real, dedicated compass could help to estimate the noise level.
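If the discrepancy does turn out to be a constant offset, one way to estimate it is to collect simultaneous headings from both devices and average the angular differences. A minimal sketch (a hypothetical helper, not a tested solution); the circular mean avoids skew from the wrap-around at 0/360 degrees:

static float headingOffsetDegrees(float[] a, float[] b) {
    // a[i] and b[i] are simultaneous headings, in degrees, from the two devices
    double sumSin = 0, sumCos = 0;
    for (int i = 0; i < a.length; i++) {
        double d = Math.toRadians(a[i] - b[i]); // per-sample angular difference
        sumSin += Math.sin(d);
        sumCos += Math.cos(d);
    }
    // Circular mean of the differences
    return (float) Math.toDegrees(Math.atan2(sumSin, sumCos));
}

The estimated offset can then be subtracted from one device's readings before comparing them.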