I want to provide an augmented reality service in my app using the user's location. For example, if the user frames a monument with the device's camera, a description of it should be displayed.
How can I implement this in an Android app?
What framework do I need to install?
Where can I find a few examples showing the basic functions?
EDIT
Rather than displaying information about the monuments framed by the device, I could simply show the direction in which certain points of interest are located. But, given a certain direction (e.g. north), how can I determine what lies in that direction within a certain radius?
I think this is a whole field of study...
For example, on Android you implement location with the LocationManager.
For the monument part, you could use iBeacon for Android, for example.
Briefly, what you're looking for is an IPS (Indoor Positioning System). I dare say this is not the place to ask for the design of a whole app.
Good luck.
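To address the edited question (what lies in a given direction within a radius), one common approach is pure geometry, no SDK needed: compute the great-circle distance and initial bearing from the user to each point of interest, then keep the POIs whose distance and bearing fall inside the radius and a sector around the heading. A minimal sketch in Java; the sector width and any coordinates used with it are made-up illustration values:

```java
import java.util.ArrayList;
import java.util.List;

// Filter points of interest that lie within a given radius and roughly
// in a given compass direction from the user. Plain math, no Android APIs.
public class PoiFilter {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Great-circle distance between two lat/lon points (haversine formula).
    static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // Initial bearing from point 1 to point 2, degrees clockwise from north.
    static double bearingDegrees(double lat1, double lon1, double lat2, double lon2) {
        double phi1 = Math.toRadians(lat1), phi2 = Math.toRadians(lat2);
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(phi2);
        double x = Math.cos(phi1) * Math.sin(phi2)
                 - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    // Keep POIs ({lat, lon} pairs) within radiusM whose bearing is within
    // halfSectorDeg of the given heading.
    static List<double[]> inDirection(double lat, double lon, double headingDeg,
                                      double radiusM, double halfSectorDeg,
                                      List<double[]> pois) {
        List<double[]> hits = new ArrayList<>();
        for (double[] p : pois) {
            double d = distanceMeters(lat, lon, p[0], p[1]);
            double b = bearingDegrees(lat, lon, p[0], p[1]);
            // Signed angular difference folded into (-180, 180].
            double diff = Math.abs(((b - headingDeg + 540.0) % 360.0) - 180.0);
            if (d <= radiusM && diff <= halfSectorDeg) hits.add(p);
        }
        return hits;
    }
}
```

The current heading would come from the compass or rotation-vector sensor, and the POI list from whatever data source you use.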
I have found the solution to the question above by myself. I'm using Metaio for Android! It is a powerful tool which provides a lot of examples about augmented reality!
Related
I am developing an augmented-reality app to be used both on Google's Project Tango tablet and on ordinary Android devices. The AR on the normal devices is being powered by Vuforia, so its libraries are available in the development of the app.
While the Tango's capabilities offer a unique opportunity to create a marker-free AR system, the Pose data has significant drift that makes it difficult to justify Tango development due to the data's instability.
When Vuforia was being researched for eventual inclusion into the app, I came across its Extended Tracking capabilities. It uses some advanced Computer Vision to provide tentative information on the device's location without having the AR marker onscreen. I tried out the demo, and it actually works great. Fairly accurate within reason, and minimal drift (especially when compared to the Tango's pose data!)
I would like to implement this extended tracking feature into the Tango version of the app, but after viewing the documentation it appears that the only way to take advantage of the extended tracking feature is to activate it while viewing an AR marker, and then the capability takes over once the marker disappears from view.
Is there any way to activate this Extended Tracking feature without requiring an AR marker to source its original position, and simply use it to stabilize and correct error in the Tango's pose data? This seems like the most realistic solution to the drift problem that I've come up with yet, and I'd really like to be able to take advantage of this technology.
This is my first answer on Stack Overflow, so I hope it can help!
I have asked myself the same question about Vuforia, as it can often be more stable with extended tracking than with a marker. When far from a marker and/or viewing it at an angle, for example, tracking can be unstable; if I then cover up the marker, forcing extended tracking, it works better! I've not come across a way to use just extended tracking, but I haven't looked very far.
My suggestion is that you look into using a UDT (user-defined target). In the Vuforia examples you can find how to use UDTs. They are made so that the user can take a photo of whatever they like as a target, but what you could maybe do is take this photo automatically, without user input, and then use this UDT and the extended tracking from the created target.
A suggestion I thought useful. Personally I find the tracking of the Tango amazing and much better than Vuforia's extended tracking (to be expected with the extra sensors), but I suppose it all depends on the environment.
Good luck, I hope this suggestion can work,
Beau
I want to make a simple location-based app for Android. What are my (free) options? I looked at SDKs like Wikitude and Vuforia, but they are too much for my needs. I do not need image recognition and such... I just need the camera view to show the direction and distance to some coordinate (location).
Try the Junaio and Metaio augmented reality frameworks. They are easy to work with. Also take a look at this augmented reality framework comparison table; you can choose GPS-enabled frameworks from it.
I suggest you use the APIs called Junaio or Metaio. I worked with them just a few days ago. They are very nice to work with, and the results are also very good, with nice efficiency. Just try them. Here is the link.
My task is to develop an application for Android to be used by tourists. Basic use case: I am walking through the old part of some town, I start my app and point the camera at some place, and an old building that is already gone is shown in its place as it looked before.
The first direction I explored was location-based recognition. I tried some frameworks like Wikitude, Metaio and DroidAR. None of these fully met my needs, because (in my opinion) none of them was using the newest tools that should make this task easier and more robust, like the new Google Play Services Location API. I don't know if I could do better myself, but I would prefer not to write my own solution.
I am now thinking about exploring marker-based recognition, but it would require additional work to place markers at the desired places, and I don't believe the user would be at the right angle and distance to such a marker. I have seen some video that used some sort of edge detection, but none of the frameworks I tried had this feature.
Do you know of some direction, technology or idea that I could explore and that may lead to a successful solution?
Augmented reality transfers the real coordinate system to the camera coordinate system. In location-based AR, the real coordinate system is the geographic coordinate system. We convert the GPS coordinate (latitude, longitude, altitude) to a navigation coordinate (East, North, Up), then transfer the navigation coordinate to the camera coordinate and display it on the camera view.
I have just created a demo for you, not using any SDK:
https://github.com/dat-ng/ar-location-based-android
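As a rough sketch of the GPS → ENU step described above, here is a flat-earth approximation that is fine over the short distances a camera view covers (any coordinates used with it are illustrative):

```java
// Convert a GPS coordinate to local ENU (East, North, Up) offsets in meters,
// relative to a reference point, using a small-area flat-earth approximation.
public class GeoToEnu {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Returns {east, north, up} in meters relative to the reference point.
    static double[] toEnu(double refLat, double refLon, double refAlt,
                          double lat, double lon, double alt) {
        double dLat = Math.toRadians(lat - refLat);
        double dLon = Math.toRadians(lon - refLon);
        // One radian of longitude shrinks with latitude, hence the cosine.
        double east  = dLon * EARTH_RADIUS_M * Math.cos(Math.toRadians(refLat));
        double north = dLat * EARTH_RADIUS_M;
        double up    = alt - refAlt;
        return new double[]{east, north, up};
    }
}
```

The resulting ENU vector can then be rotated into the camera frame using the device's rotation matrix (e.g. from SensorManager.getRotationMatrix) and projected onto the camera view.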
I personally recommend you use Wikitude, because I have created an AR app for Android using the Wikitude SDK.
Here I'm also providing a link to an app which has been developed by Wikitude itself.
See the link below:
https://play.google.com/store/apps/details?id=com.wikitude&hl=en
This app will give you a brief idea of exploring place details using the Wikitude SDK. The SDK has free as well as paid licenses. It is well documented and very easy to implement, and they provide very good sample practices for beginners.
Refer to this link:
http://www.wikitude.com/products/wikitude-augmented-reality-sdk-mobile/wikitude-sdk-android/
I hope this will take you on track.
You already have some great ideas for your app. I guess these links will help you learn more.
See links below:
http://net.educause.edu/ir/library/pdf/ERB1101.pdf
http://www.adristorical-lands.eu/index.php/sq/augmented-reality-app
Hope this will help you to go further in your project. Thank you.
As far as I understand, Vuforia is a good starting point for developing AR applications on the Android platform.
The docs for simple virtual buttons are quite good, but how would one combine them with location-based data?
For Example:
On the application level, both markers and location-based data should be used, so one would need e.g. Vuforia plus another component for integrating location-based data.
To get a deeper insight of what should be possible, here is an example:
You walk through a landscape where the phone can
1.) recognize its position and show location-based points on the screen, and
2.) recognize objects in your view and perform actions upon "touching" them (virtual buttons, I learned).
So my final question is:
Do you know of examples of frameworks or demo apps where such a task is accomplished by tying Vuforia together with a location-based AR product/framework XYZ?
Please excuse me if I am not as precise as needed; I searched SO, but (as far as I saw) there are no such questions already.
For location-based AR like Wikitude, Layar, etc., which draw POIs on the camera view based on location, you have to get the POIs from different APIs like OpenStreetMap, Twitter, etc., and after parsing, put this data on the camera view. Sensor values and GPS values are used.
To start, you can go through the Mixare code, which is open source.
You can implement this code with the Vuforia camera view.
Follow my answer here for details. Good luck.
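As an illustration of the "put this data on the camera view" step: once you have the device azimuth from the sensors and the bearing to a POI from the GPS coordinates, the horizontal screen position is roughly a linear mapping across the camera's horizontal field of view. A hedged sketch; the FOV and screen width used with it are made-up example values:

```java
// Map a POI's compass bearing to a horizontal pixel position on the
// camera preview, given the device azimuth and horizontal field of view.
public class PoiProjector {
    // Returns the x pixel position, or -1 if the POI is outside the view.
    static double screenX(double azimuthDeg, double bearingDeg,
                          double hFovDeg, int screenWidthPx) {
        // Signed angular difference folded into (-180, 180].
        double diff = ((bearingDeg - azimuthDeg + 540.0) % 360.0) - 180.0;
        if (Math.abs(diff) > hFovDeg / 2.0) return -1;   // off-screen
        // Screen center = device heading; screen edges = +/- hFov/2.
        return (diff / hFovDeg + 0.5) * screenWidthPx;
    }
}
```

A POI dead ahead lands in the middle of the preview; one at the edge of the FOV lands at the screen border.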
Hello, I want to ask which is the best sensor to use for an augmented reality application. My augmented reality app uses the mobile's camera and finds points of interest in the live view. I want to detect when a POI is in the field of view of my camera. I have read a lot of articles and want to decide which option is best. Here are my choices:
1) Compass with accelerometer
2) Rotation vector
I thought that the only solution was 1), but now I think that 2) is simpler to implement and more accurate than the first. Thanks in advance!
Mostly, you need to use multiple sensors. I've written a POI app, and use the accelerometer, GPS, compass and orientation sensors to get the current device position and field of view.
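Whichever sensors supply the orientation, the field-of-view detection itself boils down to comparing angles: a POI is on screen when the difference between its bearing and the device azimuth is within half the horizontal FOV, and its elevation angle is within half the vertical FOV of the device pitch. A minimal sketch; the FOV values used with it are made-up examples:

```java
// Decide whether a POI is inside the camera's field of view, given the
// device azimuth/pitch (from the fused sensors) and the POI's bearing,
// horizontal distance, and altitude difference relative to the device.
public class FovCheck {
    // Signed angular difference folded into (-180, 180].
    static double angleDiff(double a, double b) {
        return ((a - b + 540.0) % 360.0) - 180.0;
    }

    static boolean inView(double azimuthDeg, double pitchDeg,
                          double poiBearingDeg, double poiDistanceM,
                          double poiAltDiffM,
                          double hFovDeg, double vFovDeg) {
        double h = Math.abs(angleDiff(poiBearingDeg, azimuthDeg));
        // Elevation of the POI above the horizon, as seen from the device.
        double elevation = Math.toDegrees(Math.atan2(poiAltDiffM, poiDistanceM));
        double v = Math.abs(angleDiff(elevation, pitchDeg));
        return h <= hFovDeg / 2.0 && v <= vFovDeg / 2.0;
    }
}
```

Note the pitch sign convention here assumes positive pitch means tilting the camera up; Android's sensor conventions differ, so adjust the sign when wiring in real sensor values.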
You should probably start from the basics of Augmented Reality, before attempting an app like a POI one, as they are considerably complex.
Might I humbly recommend my book, Pro Android Augmented Reality?
Even if you don't get a copy, you can still pick up the source code from its GitHub repo. Chapter 9 contains code for a POI example app that shows nearby tweets and Wikipedia articles.