I have a requirement: at the end of the day, I have to count how many times the user put a hand on the Android screen.
After some googling I found that Android has a proximity sensor, which senses hand movement without touching the screen. The proximity sensor is located somewhere near the top of the screen and detects objects close to the phone, e.g. for turning the display off while answering or cancelling calls.
But I want to capture the user's hand presses on the Android screen. What would be a better solution? Or can my purpose be fully served by the proximity sensor alone?
The proximity sensor is very coarse; it only tells you whether the device is close to something as big as a head.
You need to implement hooks for everything the user may touch and increment your counter there; the counter should be saved and restored when the app goes to the background and returns, as in the sketch below.
Note that you can only know about touches within your own app.
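A minimal sketch of that idea, assuming a single-Activity app (the preference key name is made up for illustration): override dispatchTouchEvent() to count finger presses, and persist the counter across background/foreground transitions in SharedPreferences.

```java
import android.app.Activity;
import android.content.SharedPreferences;
import android.view.MotionEvent;

public class TouchCountingActivity extends Activity {

    private int touchCount;

    @Override
    public boolean dispatchTouchEvent(MotionEvent ev) {
        // Count each new finger press that reaches this Activity.
        if (ev.getActionMasked() == MotionEvent.ACTION_DOWN) {
            touchCount++;
        }
        return super.dispatchTouchEvent(ev);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Save the counter when the app goes to the background.
        SharedPreferences prefs = getPreferences(MODE_PRIVATE);
        prefs.edit().putInt("touch_count", touchCount).apply(); // key name is illustrative
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Restore the counter when the app returns to the foreground.
        touchCount = getPreferences(MODE_PRIVATE).getInt("touch_count", 0);
    }
}
```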
Related question: in Unity3D, how can I turn off the screen without entering sleep mode, so that the application continues to work?
I don't believe it's possible to have Unity run in the background.
I don't know if this is still common practice, but in the past many people simply drew a fully black screen to conserve power.
See: https://forum.unity.com/threads/application-runinbackground-is-not-working-on-android.117723/
Have you looked at - https://docs.unity3d.com/ScriptReference/MonoBehaviour.OnApplicationFocus.html
OnApplicationFocus is called when the application loses or gains focus. Alt-tabbing or Cmd-tabbing can take focus away from the Unity application to another desktop application. This causes the GameObjects to receive an OnApplicationFocus call with the argument set to false. When the user switches back to the Unity application, the GameObjects receive an OnApplicationFocus call with the argument set to true.
You could implement this on the GameObjects you want to keep behaving the way you like.
Is it possible to determine if a (non-rooted) device is in use at the moment, even if my app is not in the foreground? Precisely, "in use" means the user made touch events in the last 5 seconds or the display is on.
If so, what specific rights are required?
Thanks
AFAIK, the Android security model will not allow you to record touches if your app is not in the foreground.
There are some crude workarounds, like overlaying a transparent screen to record touches. Not sure whether these still work, though.
"in use" means the user made touch events in the last 5 seconds
In Android, that's not practical, short of writing your own custom ROM.
or display is on
In Android, you can find out whether the device is in an "interactive" mode or not. This does not strictly align with screen-on/screen-off, as the whole notion of screen-on/screen-off has largely fallen by the wayside.
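A minimal sketch of that interactive check, assuming API 20+ (on older releases the deprecated PowerManager.isScreenOn() plays the same role):

```java
import android.content.Context;
import android.os.PowerManager;

// Returns true while the device is in "interactive" mode, the closest
// public approximation of "the display is on" on modern Android.
static boolean isDeviceInteractive(Context context) {
    PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
    return pm.isInteractive();
}
```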
For the research I am doing, I need some way to track user touches while people use their phones on a general daily basis. The users will be fully aware of what is being recorded. Any method to do this would be great.
What have I tried so far?
Method 1.
Create a service with a transparent overlay view.
Problem: Due to obvious security flaws, this has been prevented starting with ICS. Input touches on the transparent view are not passed through to whatever is behind it, so the user cannot interact with the phone normally. I tried various approaches, defining the overlay view as type phone or type system alert, and switching between them during program execution.
Method 2.
A view covering 1% of the screen, created with the watch-outside-touch model (see the sketch after this method).
Problem: Same problem as before. The outside-touch event only reports that a touch happened outside the view, without even the initial x, y coordinates.
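For reference, a hedged sketch of that Method 2 overlay: a tiny view added from a service with FLAG_WATCH_OUTSIDE_TOUCH (requires the SYSTEM_ALERT_WINDOW permission; newer Android versions would use TYPE_APPLICATION_OVERLAY instead of TYPE_SYSTEM_ALERT). As described above, the ACTION_OUTSIDE event arrives without usable coordinates.

```java
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

// Inside a Service (e.g. in onCreate()): add a 1x1 overlay that is
// notified of touches landing outside of it.
WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        1, 1,
        WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL
                | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);
params.gravity = Gravity.TOP | Gravity.LEFT;

View overlay = new View(this);
overlay.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_OUTSIDE) {
            // A touch happened somewhere outside the overlay; as the
            // question notes, you learn only that it occurred, not where.
        }
        return false;
    }
});
wm.addView(overlay, params);
```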
There are other methods I tried, but those are the highlights. Currently, I am thinking about other options:
Option 1 - The pointer location option in developer options: In Settings there is a pointer location option that I could utilize. When that option is on, all the information about touches is shown in the top panel. If I could access that data afterwards, that would be fine too, despite the fact that there will be drawings on the screen while the user is using the phone. I traced the ICS source code and found the core class that makes that tracking possible, and recreated that class as a service. Logcat shows all the touch information when I run it from my app. The only problem is the same as in Method 1: I cannot track anything outside the current app. So, if the system logs the tracking info when the pointer option is turned on, how will I be able to get at that information later?
This option seems the easiest.
Option 2 - Android NDK: If the above method is not possible, can it be done using the NDK? A pointer in the right direction along this route would also be great.
Option 3 - Custom ROM: Do I really need to go for a custom ROM to do this? I am pretty sure that is the one 100% certain way to do it, but it seems very impractical for this particular research.
I would appreciate any suggestion to the path that I can follow.
Thank you all in advance.
You can use rooted phones and RepetiTouch for your research. https://play.google.com/store/apps/details?id=com.cygery.repetitouch.free
I'd like to create an application that utilizes the touch-screen as a "pad". There will be 3 small buttons in the bottom area of the touch-screen, and the rest will be used as a mouse movement area.
The first button will act as the "left-click" of a real mouse, the second one as "scroll", and the last one as "right-click".
When a user makes any movement (event "move", "up", "down" or "cancel") in that area, the real mouse pointer on the Windows desktop will also move.
The transmission media will be Bluetooth and Wi-Fi.
So, here are some questions:
1) Is it possible to utilize multi-touch in Froyo? An example for this case is when the user wants to select a block of text. With a real mouse, we just hold the left button and drag the pointer. In Android, this would be touching the first button while, at the same time, touching the "pad" area and making some movement.
2) How can I turn this application concept into a real application? (general idea or algorithms)
You might want to check out RemoteDroid. It's an open-source app which has most of the functionality you described.
http://code.google.com/p/remotedroid/
An app like this is going to have two main parts: an Android app which generates a series of movement vectors or movement data, and a program on your target operating system which receives this data and translates it into a software mouse. You will also need the Bluetooth stack necessary for that transfer (I get the feeling Wi-Fi won't give you the responsiveness you want without some serious optimization).
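On the desktop side, one hedged sketch (assuming plain TCP instead of Bluetooth, an arbitrary port, and a made-up one-delta-per-line "dx dy" protocol) is to read movement deltas from a socket and apply them with java.awt.Robot:

```java
import java.awt.MouseInfo;
import java.awt.Point;
import java.awt.Robot;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class MouseReceiver {
    public static void main(String[] args) throws Exception {
        Robot robot = new Robot();
        try (ServerSocket server = new ServerSocket(5555); // port is arbitrary
             Socket client = server.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()))) {
            String line;
            // Each line carries one movement delta: "dx dy".
            while ((line = in.readLine()) != null) {
                String[] parts = line.split(" ");
                int dx = Integer.parseInt(parts[0]);
                int dy = Integer.parseInt(parts[1]);
                // Move the software mouse relative to its current position.
                Point p = MouseInfo.getPointerInfo().getLocation();
                robot.mouseMove(p.x + dx, p.y + dy);
            }
        }
    }
}
```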
When it comes to the Android side of matters, I think you'll need to experiment to find the best way to capture those mouse movements. I'd think a speed-vector structure might be your best bet; it seems most similar to what I know of how mouse movement works.
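A minimal sketch of that capture, tracking deltas between successive ACTION_MOVE events on the "pad" view (the sendDelta() hook is hypothetical; wire it to whatever transport you choose):

```java
import android.view.MotionEvent;
import android.view.View;

// Attach to the "pad" area view; computes per-event movement deltas.
class PadTouchListener implements View.OnTouchListener {

    private float lastX, lastY;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getX();
                lastY = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:
                int dx = Math.round(event.getX() - lastX);
                int dy = Math.round(event.getY() - lastY);
                lastX = event.getX();
                lastY = event.getY();
                sendDelta(dx, dy); // hypothetical: forward over Bluetooth/Wi-Fi
                return true;
            default:
                return false;
        }
    }

    private void sendDelta(int dx, int dy) {
        // Placeholder: write "dx dy" to your socket or Bluetooth stream.
    }
}
```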
Basically, I am trying to write a benchmark application to test the responsiveness of different Android devices' touchscreens. I figured the best way to go about doing this is to write an application that counts the number of samples returned from the touchscreen whenever it is touched. Unfortunately, I have no idea if this is possible and so far haven't found anything relevant to this idea.
I will most likely be using the MotionEvent class to determine when the touch screen is pressed, but what classes are available to determine the samples returned?
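One hedged sketch of the counting, assuming the test area is a View with a touch listener: Android batches intermediate touch samples into a single MotionEvent, so each ACTION_MOVE can carry extra historical samples exposed through getHistorySize(). Counting both gives the actual sample rate:

```java
import android.view.MotionEvent;
import android.view.View;

// Counts touchscreen samples, including batched historical ones.
class SampleCounter implements View.OnTouchListener {

    private int samples;
    private long firstEventTime;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            samples = 0;
            firstEventTime = event.getEventTime();
        }
        // Each MotionEvent may batch several samples; count them all.
        samples += event.getHistorySize() + 1;
        if (event.getActionMasked() == MotionEvent.ACTION_UP) {
            long elapsedMs = event.getEventTime() - firstEventTime;
            if (elapsedMs > 0) {
                float samplesPerSecond = samples * 1000f / elapsedMs;
                // Feed samplesPerSecond into your graph or log it.
            }
        }
        return true;
    }
}
```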
This is how I imagine my app to function:
Start app
Brief description screen, then button to begin testing
User touches an area of the screen
I haven't really determined how to do the output yet, but either part of the screen updates a real-time graph, or the user just touches for some time and, after releasing their finger from the screen, the app outputs a new graph in a different activity.
Any help would be useful.
Thanks!