I have a very creative requirement. I am not sure if this is feasible, but it would certainly spice up my app if it is.
Premise: On Android phones, if the screen is covered by a hand (not touching, just close to the screen), or if the phone is held to the ear during a call, the screen locks or basically blacks out. So there must be some tech to recognize that my hand is near the screen.
Problem: I have an image in my app. If the user points at the image without touching the screen, just as an extension of the premise, I want to be able to tell that the user is pointing at the image, and change the image. Is this possible?
UPDATE: An example use:
Say I want to build a fun app where touching an image takes you somewhere else. For example, I have two doors, one leading to a car and one to a lion. Just as the user is about to touch door 1, the door should show a message asking "are you sure?", and then actually touching it takes you to the other place. Kind of a rudimentary example, but I hope you get the point.
The feature you are talking about is the proximity sensor. See Sensor and SensorEvent.values for Sensor.TYPE_PROXIMITY.
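If it helps, here is a minimal sketch of reading that sensor with the standard SensorManager API (the activity name is just for illustration):

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class ProximityDemoActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor proximity;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager.registerListener(this, proximity, SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // values[0] is the distance in cm.
        float distance = event.values[0];
        if (distance < proximity.getMaximumRange()) {
            // Something (a hand, an ear) is near the screen.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Note that on many devices the proximity sensor is effectively binary: it reports either 0 (near) or its maximum range (far) rather than a continuous distance.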
You can get the distance of the hand from the screen, but you won't really know where the hand is in the XY coordinate system. So you won't be able to figure out whether the user is pointing at the "car door" or the "lion door".
You could make this work on a phone whose front camera has an angle wide enough to see the whole screen. You'd have to write the software for recognizing hand movements and translate those into screen actions.
Why not just use touch, if I may ask?
Related
I want to detect where the user is looking on the screen and how long they look at a specific part of it. Even better would be detecting which widget (TextView, EditText, etc.) the user is looking at at the moment.
I haven't found anything that works yet. Can we do this with any Android sensors, or in any other way?
I would like to create an Android app for viewing images.
The idea is that users keep their tablet flat on the table (for the sake of simplicity, only X and Y for now) and scroll a picture (one that is too big to fit the screen) by moving the tablet (yes, this app has to use tablet movement; sorry, no fingers allowed :) ).
I managed to get a basic framework implemented (implementing sensor listeners is easy), but I'm not sure how to translate "LINEAR_ACCELERATION" to pixels. I'm pretty sure it can be done (for example, check "photo sphere" or "panorama" apps that move content exactly as you move your phone), but I can't find any working prototype online.
Where can I see how that kind of "magic" is done in the real world?
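For what it's worth, the naive translation is double integration: acceleration integrated over time gives velocity, velocity integrated gives displacement, and meters convert to pixels via the screen DPI. A rough sketch of that idea (the class and field names are mine; note that raw integration drifts badly within seconds, which is why panorama-style apps fuse the gyroscope and camera instead):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.util.DisplayMetrics;

// Rough sketch: double-integrate TYPE_LINEAR_ACCELERATION into an
// on-screen offset. Drifts quickly; real apps fuse other sensor data.
public class MotionToPixels implements SensorEventListener {
    private final float metersToPixels; // pixels per meter on this screen
    private long lastTimestamp;         // nanoseconds
    private float vx, vy;               // velocity in m/s
    public float offsetX, offsetY;      // accumulated offset in pixels

    public MotionToPixels(DisplayMetrics metrics) {
        // xdpi = pixels per inch; 1 m = 39.37 in
        metersToPixels = metrics.xdpi * 39.37f;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_LINEAR_ACCELERATION) return;
        if (lastTimestamp != 0) {
            float dt = (event.timestamp - lastTimestamp) * 1e-9f; // ns -> s
            vx += event.values[0] * dt;          // integrate acceleration
            vy += event.values[1] * dt;
            offsetX += vx * dt * metersToPixels; // integrate velocity
            offsetY += vy * dt * metersToPixels;
        }
        lastTimestamp = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```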
Is it possible for an Android app to run in the background, listen for specific triggers, and modify the visible application content on the screen (in both system and third-party apps)?
For example: a two-finger tap should produce a ripple effect on the screen; if the temperature is high, the screen turns more reddish; if I say "do a barrel roll", the entire UI does a barrel roll like the Google Easter egg. And this should happen whether the user is on the home screen, in Settings, or in their Instagram.
The best working example I can give is the built-in "Magnification Gestures" feature provided by Android. A triple tap anywhere zooms everything except the keyboard and the navigation bar. And it doesn't zoom as a flat image; the touch points are preserved.
Is this possible to do with or without root? Do I need a framework like Xposed?
Thanks.
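For what it's worth, drawing over other apps without root is possible via an accessibility overlay window, so something like the "reddish screen" effect is at least feasible; actually transforming other apps' UI the way Magnification does is, as far as I know, not open to third-party apps without something like Xposed. A hedged sketch of the overlay part (the service name is made up for illustration; it must still be declared in the manifest and enabled by the user):

```java
import android.accessibilityservice.AccessibilityService;
import android.graphics.PixelFormat;
import android.view.View;
import android.view.WindowManager;
import android.view.accessibility.AccessibilityEvent;

// Sketch: an AccessibilityService that draws a translucent tint
// over whatever app is visible, without root (API 22+).
public class ScreenTintService extends AccessibilityService {

    @Override
    protected void onServiceConnected() {
        WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
        View tint = new View(this);
        tint.setBackgroundColor(0x22FF0000); // faint red overlay

        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_ACCESSIBILITY_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
                PixelFormat.TRANSLUCENT);
        wm.addView(tint, lp);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }
}
```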
I hope most of you have used the Panorama option in the Camera app. It asks/forces the user to move the device in one direction: after the 1st image is taken, you move on toward the 2nd, the app shows you a TARGET BOX, and when you reach that box the next image is captured, and so on.
Panorama example on the Samsung Galaxy S2
I am interested to know how we can achieve this functionality, so that we can set a target for the user to move the device toward. Which sensors can help achieve this, and which factors are involved?
Just a simple example: say we make a simple game in which an arrow points toward the target box, and when we get the target box in focus we get some score, etc.
Please guide me on this, thanks.
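For reference, the usual building block for this kind of "move the device toward a target" interaction is the fused TYPE_ROTATION_VECTOR sensor (accelerometer + gyroscope + magnetometer), from which you can get the device azimuth and compare it against a target heading. A minimal sketch (the class name and target angle are illustrative):

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: track how far the device heading is from a target heading
// using the fused rotation vector sensor.
public class TargetHeadingTracker implements SensorEventListener {
    private static final float TARGET_AZIMUTH_DEG = 90f; // illustrative: due east
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuthDeg = (float) Math.toDegrees(orientation[0]); // -180..180
        float delta = azimuthDeg - TARGET_AZIMUTH_DEG;
        // Normalize to -180..180 so the arrow takes the shorter way around.
        while (delta > 180f) delta -= 360f;
        while (delta < -180f) delta += 360f;
        if (Math.abs(delta) < 5f) {
            // Target box is "in focus": capture the frame / award score.
        } else {
            // Rotate the guide arrow by `delta` degrees.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```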
I want to create some sort of graphical arrow, or possibly draw an arrow over a compass, to show the user what direction the wind is coming from. This would obviously change with the orientation of the person's handset.
My application can tell me what direction (in degrees) the wind is coming from.
My question is, what is the best way to implement something like this?
Thanks
In Eclipse, create a new Android project and select "Create project from existing sample". Choose a target Android version and then ApiDemos. There you will find a Compass application and many other examples that can help you draw your screen.
I guess the best would be if your wind arrow were in 3D or simulated 3D, so that it doesn't matter how the user is holding the device: they would always look at the wind arrow from an elevated virtual vantage point.
In the same ApiDemos there is also a "Sensors" demo which draws the physical orientation of the device.
Draw a compass, draw the wind arrow accordingly.
If the device knows its orientation, rotate the whole thing so that N on the compass points to actual North.
Then ask users whether they are happy with this setup; if not, find out why and improve, etc. But start with something dead simple, like the above.
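To make the "rotate the whole thing" step concrete, here is a minimal sketch of a custom View, assuming you already have the device azimuth from an orientation listener and the wind bearing from your data (the class and field names below are placeholders):

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.view.View;

// Sketch of "draw compass, rotate the whole thing": the rose is
// counter-rotated by the device heading so N points at true North,
// then the wind arrow is drawn at its own bearing.
public class WindCompassView extends View {
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private float deviceAzimuthDeg; // from an orientation sensor listener (assumed)
    private float windFromDeg;      // from your weather data (assumed)

    public WindCompassView(Context context) {
        super(context);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(6f);
    }

    public void update(float deviceAzimuth, float windFrom) {
        deviceAzimuthDeg = deviceAzimuth;
        windFromDeg = windFrom;
        invalidate(); // redraw with the new angles
    }

    @Override
    protected void onDraw(Canvas canvas) {
        float cx = getWidth() / 2f;
        float cy = getHeight() / 2f;
        float r = Math.min(cx, cy) * 0.8f;

        canvas.save();
        // Counter-rotate by the device heading so compass N stays at true North.
        canvas.rotate(-deviceAzimuthDeg, cx, cy);

        canvas.drawCircle(cx, cy, r, paint);        // compass rim
        canvas.drawLine(cx, cy, cx, cy - r, paint); // N marker

        // The wind arrow gets its own bearing relative to the rose.
        canvas.rotate(windFromDeg, cx, cy);
        canvas.drawLine(cx, cy + r * 0.6f, cx, cy - r * 0.6f, paint);            // shaft
        canvas.drawLine(cx, cy - r * 0.6f, cx - r * 0.1f, cy - r * 0.4f, paint); // head
        canvas.drawLine(cx, cy - r * 0.6f, cx + r * 0.1f, cy - r * 0.4f, paint);

        canvas.restore();
    }
}
```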