I want to load an object using OpenGL ES and view it while rotating the device, as if a real object were floating in the real world.
For example, when I turn my head to the right, the object on the screen should move left.
I'm using the Android magnetic field and accelerometer sensors, but the result is far from good.
What's the general way of building a VR-like app such as Google Cardboard?
Is there a better way? An example would be highly appreciated, thank you!
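One common reason raw magnetometer/accelerometer orientation feels bad is sensor noise. A minimal sketch of the usual fix, a low-pass filter applied to each raw sample before computing the rotation matrix (class name and ALPHA value are my own illustrative choices, not from any framework):

```java
// Minimal sketch: smooth raw accelerometer/magnetometer samples before
// feeding them to SensorManager.getRotationMatrix(...). ALPHA is a tuning
// parameter; smaller values smooth more but add lag.
public class LowPassFilter {
    private static final float ALPHA = 0.15f; // smoothing factor, tune per device
    private float[] state;

    // Returns the smoothed sample; the first call just copies the input.
    public float[] filter(float[] input) {
        if (state == null) {
            state = input.clone();
            return state.clone();
        }
        for (int i = 0; i < input.length; i++) {
            state[i] += ALPHA * (input[i] - state[i]);
        }
        return state.clone();
    }
}
```

On Android you would pass the two filtered vectors to SensorManager.getRotationMatrix(...) and feed the result into your OpenGL view matrix; note that the TYPE_ROTATION_VECTOR sensor does this fusion in hardware/firmware and is usually much steadier than fusing the two raw sensors yourself.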
I am making an Android app in Unity, using my own VR engine. It is small, and I have got looking around working perfectly. The only problem is that I cannot get my eyes to focus on the objects in front of them: I get double vision, where my left eye sees objects too far to the left and my right eye too far to the right. I have tried pointing the eye cameras slightly inwards, and moving them based on a raycast to find out where they are looking.
I am guessing it could be something to do with pointing the eyes outwards. My rig is a Nintendo Labo headset with an Android phone inside (making do with what I've got ;)). Unfortunately the phone and lenses don't quite line up, but this doesn't seem to affect one of my other projects; or perhaps I need to distort my camera in a special way.
Honestly, I have no idea! Some help from an expert, or anyone who is slightly clued up on the subject, would be greatly appreciated :D
It turns out I literally just needed to point the cameras outwards rather than inwards.
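For anyone hitting the same thing: a sketch of the usual parallel-camera stereo setup, in plain vector math rather than Unity's API (class and method names are illustrative). Each eye sits half the interpupillary distance (IPD) from the head along the head's right vector, and both eyes face the same forward direction, parallel rather than toed-in:

```java
// Sketch: parallel stereo eye placement. Vectors are {x, y, z} arrays.
public class StereoRig {
    // Returns {leftEye, rightEye} positions. 'right' is the head's unit
    // right vector; ipd is the interpupillary distance in metres.
    public static float[][] eyePositions(float[] head, float[] right, float ipd) {
        float h = ipd / 2f;
        float[] left  = { head[0] - right[0] * h, head[1] - right[1] * h, head[2] - right[2] * h };
        float[] rght  = { head[0] + right[0] * h, head[1] + right[1] * h, head[2] + right[2] * h };
        return new float[][] { left, rght };
    }
}
```

With both eye cameras sharing the head's forward direction, convergence is left to the viewer's own eyes through the lenses, which is why toeing the cameras in produced double vision.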
I tried to achieve this using Google's Cloud Anchors, but they have a limitation of 24 hours (after that, the cloud anchors become invalid).
Another way is to create a replica of the environment in Unity, but that would be too lengthy a process.
Please suggest any other ways or ideas: https://www.insidernavigation.com/#solution - how did they achieve it?
And how can the common coordinate system be saved in the cloud or locally?
Current versions of ARCore and ARKit have limited persistence capabilities. So a workaround - which I think is what they use on the site you linked - is to use images/QR codes to localise the device at a real-world position, and then use the device's SLAM capabilities to track the device's movement and pose.
For example, you can have a QR code or image that represents position 1,1 facing north in the real world. Conveniently, you can use ARCore/ARKit's image tracking to detect that image. When that specific image is tracked by the device, you can confidently determine that the device is at position 1,1 (or close to it). You then use that information to plot a dot on a map at 1,1.
As you move, you can track the deltas in the AR camera's pose (position and rotation) to determine whether you moved forward, turned, etc. You can then use these deltas to update the position of that dot on your map.
There is intrinsic drift in this, as SLAM isn't perfect, but the AR frameworks should have some way to compensate for it using feature detection, or the user can re-localise by looking at another QR/image target.
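A minimal sketch of the localise-then-track-deltas idea above, with one simplifying assumption: the AR tracking frame is aligned with the world frame at the moment the image target is recognised, so camera-pose deltas can be added straight onto the known anchor position. The class and method names are invented for illustration; they are not part of ARCore or ARKit:

```java
// Sketch: keep a map dot in the world frame by applying SLAM pose deltas
// on top of the last known QR/image anchor position.
public class MapTracker {
    private double worldX, worldY;      // current estimate on the map
    private double lastCamX, lastCamY;  // previous AR camera position
    private boolean localized = false;

    // Called when a known image target is detected; (anchorX, anchorY) is
    // that target's surveyed real-world position.
    public void localize(double anchorX, double anchorY, double camX, double camY) {
        worldX = anchorX;
        worldY = anchorY;
        lastCamX = camX;
        lastCamY = camY;
        localized = true;
    }

    // Called every AR frame with the tracked camera position.
    public void onCameraPose(double camX, double camY) {
        if (!localized) return;
        worldX += camX - lastCamX; // apply the SLAM delta to the map dot
        worldY += camY - lastCamY;
        lastCamX = camX;
        lastCamY = camY;
    }

    public double getWorldX() { return worldX; }
    public double getWorldY() { return worldY; }
}
```

In practice you would also carry the heading offset between the two frames (recovered from the tracked image's orientation) and rotate each delta by it before adding; re-running localize(...) at every new target is what caps the accumulated drift.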
As far as I know, this kind of virtual positioning system has not been introduced in Google ARCore yet. In the link you provided, they are using iBeacons for positioning.
Yup, I believe it could be possible. Currently, most of the developed approaches have their limitations. I am working on finding another way using a fusion of Cloud Anchors with iBeacons.
I want to develop a car racing game in which I want to rotate the car object (not an image) clockwise at the time of car selection. After that, I want to change the car object's colour when the user drags the screen to the left, like image sliding in Android. I don't know of any third-party API used in game development that would make this easier. I think the rotating car must be an animation, but as far as I have found, Android only supports animations defined in XML.
Please look into OpenGL on Android; I think it's the only reasonable way for you to complete your task, and to program any game in general.
This link will help you start: http://www.learnopengles.com/android-lesson-one-getting-started/
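To give a feel for the turntable rotation you describe: in a GLSurfaceView.Renderer you would typically advance an angle each frame and hand it to the model matrix (e.g. via android.opengl.Matrix.rotateM). A plain-Java sketch of that bookkeeping so the math is visible; the class name is illustrative:

```java
// Sketch: frame-rate-independent turntable rotation around the Y axis.
public class TurntableRotation {
    private float angleDeg; // current rotation around Y

    // Advance by degreesPerSecond scaled by the frame time, wrapping at 360.
    public float advance(float degreesPerSecond, float dtSeconds) {
        angleDeg = (angleDeg + degreesPerSecond * dtSeconds) % 360f;
        return angleDeg;
    }

    // 4x4 column-major Y-rotation matrix, same layout android.opengl.Matrix uses.
    public float[] rotationMatrixY() {
        double r = Math.toRadians(angleDeg);
        float c = (float) Math.cos(r), s = (float) Math.sin(r);
        return new float[] {
             c, 0, -s, 0,
             0, 1,  0, 0,
             s, 0,  c, 0,
             0, 0,  0, 1
        };
    }
}
```

For the colour change on drag, the usual approach is a touch listener on the GLSurfaceView that detects the horizontal swipe and swaps the material/texture uniform before the next draw.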
Can anyone help me with sample code for a compass needle that points in the direction in which the phone is tilted?
I am trying to develop an application to rotate a bitmap in OpenGL from accelerometer values.
OK, so thanks to the comments above, we know we do not have to use OpenGL; and while you could use it, I personally believe you can make life simpler by using a custom View.
Now, in traditional Stack Overflow fashion, I am not going to give you the code for this, but an extremely large leg-up. There is a thermometer example available here: http://mindtherobot.com/blog/272/android-custom-ui-making-a-vintage-thermometer/.
Why have I sent you here?
It contains an example that renders a dial exceedingly close to a compass; with a few minor tweaks it could easily become a compass in terms of design. You will just need to remove the temperature-related code and use the accelerometers instead.
It is a good intro to custom views and will show you how to get started.
I made a clock after following the tutorial just as another possibility to inspire you.
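To bridge the gap between the thermometer tutorial and the tilt question: the only new piece is turning accelerometer readings into a needle angle. A sketch, assuming the phone is held roughly flat so tilting shifts gravity into the X/Y axes (class name and sign convention are my own; check them against your device's axes):

```java
// Sketch: accelerometer tilt direction -> needle angle for a custom View.
public class TiltNeedle {
    // ax, ay are SensorEvent.values[0] and values[1]. Returns degrees,
    // 0 when tilted toward +Y, increasing clockwise (by this convention).
    public static float needleAngleDegrees(float ax, float ay) {
        return (float) Math.toDegrees(Math.atan2(-ax, ay));
    }
}
```

Inside your custom View's onDraw you would then call canvas.rotate(angle, centerX, centerY) before drawing the needle bitmap, exactly as the thermometer example rotates its hand.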
I am trying to display an .md2 model on top of the camera preview on my Android phone. I don't need to use the accelerometers or anything. If anyone could even just point me in the right direction as to how to set up an OpenGL overlay, that would be fantastic. If you are able to provide code that shows how to do this, that would be even better! It would be greatly appreciated.
I'm not able to provide code until later this week, but you might want to check out a library called min3d, because I believe they already have a parser written for .md2 files. Then I believe that if you use a GLSurfaceView, the background can be set to be transparent, and you can put a view of the camera behind it. Are you trying to get some kind of augmented reality effect? There are android specific libraries for that too, but they're pretty laggy (at least on my Motorola Droid).
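A configuration sketch of the transparent-GLSurfaceView approach mentioned above, using the standard Android APIs (the helper class name is mine). The key points are requesting an EGL config with an alpha channel, making the surface translucent, and clearing to a fully transparent colour so the camera preview behind it shows through:

```java
// Sketch: configure a GLSurfaceView so it can sit transparently over a
// camera preview view in the same layout.
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;

public class OverlaySetup {
    public static void configure(GLSurfaceView glView) {
        // Request an RGBA8888 surface so the background can carry alpha.
        glView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
        glView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
        glView.setZOrderOnTop(true); // draw above the camera preview surface
        // In the Renderer's onDrawFrame, clear with zero alpha, e.g.:
        // gl.glClearColor(0f, 0f, 0f, 0f);
    }
}
```

Call this before setRenderer(...), and place the camera preview view under the GLSurfaceView in your layout; the model then appears floating over the live camera image.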