We are developing an app using AIR for Android. The app requires movement of the device in a single plane: the user holds the device and rotates in place, keeping the device stationary relative to himself or herself. Since the device is held fixed relative to the user, we cannot use the accelerometer. Is there any other way to get a value corresponding to this movement using ActionScript, i.e. AIR for Android?
Project Tango has a Motion Tracking API. I'm curious: what is the best way to track motion in a similar way (i.e. track the position and orientation of the user's device in full six degrees of freedom) on standard Android and iOS devices, using any kind of third-party SDKs and/or physical additions (like markers or beacons)?
You might be interested in visual odometry.
From this documentation:
Tango implements Motion Tracking using visual-inertial odometry, or VIO, to estimate where a device is relative to where it started.
Standard visual odometry uses camera images to determine a change in position by looking at the relative position of different features in those images. For example, if you took a photo of a building from far away and then took another photo from closer up, it would be possible to calculate the distance the camera moved based on the change in size and position of the building in the photos.
Visual-inertial odometry supplements visual odometry with inertial motion sensors capable of tracking a device's rotation and acceleration. This allows a Tango device to estimate both its orientation and movement within a 3D space with even greater accuracy. Unlike GPS, Motion Tracking using VIO works indoors.
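The building example can be made concrete with the pinhole camera model: the apparent size of an object scales inversely with its distance, so two apparent sizes of the same feature give two distances, and their difference is how far the camera moved toward it. Here is a toy sketch of that arithmetic in Java (the focal length and pixel sizes are made-up illustrative numbers, not anything from the Tango API):

```java
/**
 * Toy illustration of the visual-odometry intuition described above:
 * estimating how far a camera moved toward an object from the change
 * in the object's apparent size. All numbers are made-up examples.
 */
public class OdometryIntuition {

    // Pinhole model: apparentSize = focalLengthPx * realSize / distance,
    // so distance = focalLengthPx * realSize / apparentSize.
    static double distanceTo(double focalLengthPx, double realSizeM, double apparentSizePx) {
        return focalLengthPx * realSizeM / apparentSizePx;
    }

    public static void main(String[] args) {
        double focalLengthPx = 1000.0;  // assumed camera focal length in pixels
        double buildingHeightM = 30.0;  // assumed real height of the building

        double d1 = distanceTo(focalLengthPx, buildingHeightM, 150.0); // far photo: 150 px tall
        double d2 = distanceTo(focalLengthPx, buildingHeightM, 300.0); // near photo: 300 px tall

        // Positive value = camera moved toward the building between the photos.
        System.out.printf("Camera moved about %.1f m closer%n", d1 - d2);
    }
}
```

Real visual odometry tracks many features at once and solves for full 6-DOF motion, but this is the core geometric relationship it exploits.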
I want to develop an Android app that captures hand gestures without touching the screen. What kind of sensors will I have to use? I don't want camera-based detection; that consumes too much battery. Would the proximity sensor be of any help? And how do I even access these sensors? I need help here with respect to coding.
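For reference, the proximity sensor is exposed through Android's standard SensorManager API, though on most hardware it only reports a coarse near/far value, so by itself it can detect little more than a hand passing over the phone. A minimal access sketch (class and log-tag names are just placeholders):

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

public class ProximityActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor proximity;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager.registerListener(this, proximity, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Unregister to save battery while the activity is not visible.
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Many devices report only two values: 0 (near) or the sensor's maximum range (far).
        boolean near = event.values[0] < proximity.getMaximumRange();
        Log.d("Proximity", near ? "hand near the screen" : "nothing near");
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```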
Hi, I am making an app that uses the camera, and I need to detect which direction the phone is moving (left, right, up, or down). Every sensor I have tried only reports rotation around an axis.
So if I hold the camera and move the phone right/left/up/down, how can I detect that?
You can use the linear accelerometer to detect which direction the user is moving. This page has all the details you will need. There are a few sample projects you can have a look at as well.
http://developer.android.com/guide/topics/sensors/sensors_motion.html#sensors-motion-accel
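As a rough sketch of that approach: a listener on TYPE_LINEAR_ACCELERATION (an accelerometer stream with gravity already removed) can classify the dominant axis of a push. The 2 m/s² threshold below is an arbitrary assumption you would tune on a real device:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

/** Classifies the start of a phone movement as left/right/up/down. */
public class MoveDirectionListener implements SensorEventListener {

    private static final float THRESHOLD = 2.0f; // m/s^2, tune experimentally

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;

        float x = event.values[0]; // + = right, - = left (device coordinates)
        float y = event.values[1]; // + = up,    - = down

        if (Math.abs(x) < THRESHOLD && Math.abs(y) < THRESHOLD) return; // ignore noise

        if (Math.abs(x) > Math.abs(y)) {
            Log.d("Move", x > 0 ? "right" : "left");
        } else {
            Log.d("Move", y > 0 ? "up" : "down");
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Register it via SensorManager.registerListener() with the Sensor.TYPE_LINEAR_ACCELERATION sensor. Keep in mind this reports acceleration, not velocity: a push followed by a stop produces two opposite spikes, so a real app would react only to the first spike of a gesture.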
I am creating a project in Android.
It has one pre-requirement: play a video by moving the Android device.
I have implemented the accelerometer sensor, but if I move the device on a flat surface (up, down, left, or right), no event is fired; events are only raised when the device rotates in some direction.
Is it possible to detect the device moving on a flat surface?
Thanks in advance.
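For what it's worth, the raw accelerometer does respond to sliding on a flat surface, but its signal is dominated by gravity, which is why only tilting seems to register. The Android sensor guide describes isolating the motion component with a low-pass filter on gravity; here is a minimal sketch of that technique (the alpha constant and the 1.5 m/s² threshold are illustrative values to tune):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.Log;

/** Separates gravity from linear motion in raw accelerometer data. */
public class FlatSurfaceMotionListener implements SensorEventListener {

    private final float[] gravity = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;

        final float alpha = 0.8f; // low-pass filter constant (illustrative)

        // Slowly track the near-constant gravity component on each axis...
        for (int i = 0; i < 3; i++) {
            gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i];
        }

        // ...and subtract it to get the acceleration caused by the user's push.
        float linX = event.values[0] - gravity[0];
        float linY = event.values[1] - gravity[1];

        if (Math.abs(linX) > 1.5f || Math.abs(linY) > 1.5f) { // threshold is a guess
            Log.d("FlatMove", "device pushed: x=" + linX + " y=" + linY);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

On devices that expose Sensor.TYPE_LINEAR_ACCELERATION, the system performs this separation for you.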
I'm trying to develop an app that can recognize an active object (for example, a memory) that touches the smartphone display. Before I start developing, I need to know whether there are objects that a touch-screen display can recognize. Which devices can be recognized by a smartphone display? I'm interested in this for both iPhone and Android phones.
I found this app, and you can see that a card can interact with the mobile device. Now I'm asking whether anyone knows how to build this kind of app for an iPhone or an Android phone.
Does anyone know how to do that? Is there a library (iOS or Android) for recognizing objects placed on the display?
Volumique is the company that develops the Monopoly card technology you are talking about. However, I will suggest two things.
For Android devices you can use NFC. It is similar to what you are doing now, but you only need to bring the object close to the screen; there is no need to actually touch it.
For iOS, there is no NFC or RFID technology available. However, you can develop hardware that has capacitive contacts arranged in a pattern over it, so that when you bring it close to the iOS screen, the touch controller recognizes the pattern of the contacts and reports it to your code, which can then identify the object.

The capacitive touch screens used in iPhones are essentially an array of capacitors arranged in a grid. When you touch the screen with a finger, you change the capacitance of one or two of those capacitors, and the controller works out the location of the change. If instead you change the capacitance of, say, five or six sensors at the same time in a particular arrangement, such as a pentagon, you can write software that checks whether the locations whose capacitance was changed by the external object form that pentagon shape and, if so, reports that the object is, for example, a $5 card. This is one way I can think of doing it.
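The same pattern-matching idea can be sketched on the software side for Android, where simultaneous contact points arrive through MotionEvent: if the object's conductive feet register as touches, you can check how many points landed and whether their layout matches a known shape. The five-point pentagon check and the pixel tolerance below are illustrative assumptions:

```java
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

/** Checks whether the current touches match a 5-point (pentagon-like) token. */
public class TokenTouchListener implements View.OnTouchListener {

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getPointerCount() == 5) { // token assumed to have five conductive feet
            float cx = 0, cy = 0;
            for (int i = 0; i < 5; i++) {
                cx += event.getX(i);
                cy += event.getY(i);
            }
            cx /= 5; cy /= 5;

            // A regular pentagon puts every foot at the same distance from the center.
            float min = Float.MAX_VALUE, max = 0;
            for (int i = 0; i < 5; i++) {
                float d = (float) Math.hypot(event.getX(i) - cx, event.getY(i) - cy);
                min = Math.min(min, d);
                max = Math.max(max, d);
            }
            if (max - min < 30) { // pixel tolerance, tune for your token
                Log.d("Token", "pentagon token detected at " + cx + "," + cy);
                return true;
            }
        }
        return false;
    }
}
```

Note that screens typically track a limited number of simultaneous points (often ten), which bounds how many feet a token can have.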