How can I programmatically get screen touch coordinates in Android, regardless of which application or game is open? The display of these coordinates can be turned on in the Developer options menu ("Pointer location"), which shows an example of the coordinates I need. But how can I access them from my own code, and which class is responsible for this? Is there a library or some other way to do it?
I tried using an AccessibilityService, but it doesn't expose this data.
Is there a way to get every single touched pixel during a touch event? I understand that Android can provide an ellipse describing the touch area, but I would like to access every single pixel that was touched, for research purposes.
I understand this is the same question as "How to get all the raw touch points at a particular instant for a tap?"
Has something changed since then that makes this data available on Android devices, or possibly on iOS?
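For reference, as far as I know the closest thing Android itself exposes is a per-pointer touch ellipse on MotionEvent rather than individual pixels; a minimal sketch of reading it in a custom view:

```java
import android.content.Context;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

// Minimal sketch: logs the touch ellipse Android reports for each pointer.
// Android exposes an ellipse (major/minor axis, orientation), not raw pixels.
public class TouchEllipseView extends View {

    public TouchEllipseView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        for (int i = 0; i < event.getPointerCount(); i++) {
            Log.d("TouchEllipse", "x=" + event.getX(i)
                    + " y=" + event.getY(i)
                    + " major=" + event.getTouchMajor(i)        // ellipse major axis (px)
                    + " minor=" + event.getTouchMinor(i)        // ellipse minor axis (px)
                    + " orientation=" + event.getOrientation(i) // radians
                    + " pressure=" + event.getPressure(i));
        }
        return true;
    }
}
```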
I am currently trying to draw a line on top of a TextView in an Android default screen, for example the Settings screen. It is possible to get the content of a screen if that screen belongs to our own application; I want to know how I can get the content of a device's default screen from my application. From going through a few discussions, I gathered that if we can get the View object of that screen, we can do this. So is it possible to get the View object of an Android default screen? Kindly help on how we can achieve this.
Thanks
So is it possible to get the View object of an Android default screen?
No. That app runs in another process. Its objects cannot magically transport themselves to your process.
I want to know how I can get the content of a device's default screen from my application
On Android 5.0+, with user permission, you can use the media projection APIs to take screenshots.
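A rough sketch of that flow, assuming an Activity on API 21+ (the sizes, names and request code below are placeholders, and newer Android versions additionally require running the capture from a foreground service):

```java
import android.app.Activity;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;

// Sketch of the media projection flow: ask the user for permission,
// then mirror the screen into an ImageReader to grab frames.
public class ScreenCaptureActivity extends Activity {
    private static final int REQUEST_CAPTURE = 1;
    private MediaProjectionManager projectionManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        projectionManager =
                (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
        // Shows the system "start capturing?" dialog to the user.
        startActivityForResult(projectionManager.createScreenCaptureIntent(), REQUEST_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != REQUEST_CAPTURE || resultCode != RESULT_OK) return;

        MediaProjection projection = projectionManager.getMediaProjection(resultCode, data);

        // Capture into an ImageReader; width/height/dpi are hard-coded here for brevity.
        ImageReader reader = ImageReader.newInstance(1080, 1920, PixelFormat.RGBA_8888, 2);
        projection.createVirtualDisplay("screenshot", 1080, 1920, 320,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), null, null);

        reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader r) {
                // r.acquireLatestImage() returns a frame of whatever is on screen.
            }
        }, null);
    }
}
```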
I'm trying to develop an app that can recognize an active object (for example, a memory) that touches the smartphone display. Before I start developing, I need to know whether there are any objects that a touch-screen display can recognize. Which kinds of objects can a smartphone display recognize? I'm interested in this for both iPhone and Android phones.
I found this app, and you can see that a card can interact with a mobile device; now I'm asking whether anyone knows how to build this kind of app for an iPhone or an Android phone.
Does anyone know how to do that? Is there a library (iOS or Android) for recognizing objects that I place on the display?
Volumique is the company that develops the Monopoly card technology you are talking about. However, I will suggest two things.
For Android devices you can use NFC. It's similar to what you are doing right now, except you only need to bring your object close to the device; there is no need to actually touch the screen.
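For example, reading a tag's ID when the object is brought near the device could look roughly like this (a sketch that assumes the NFC permission and a tag-dispatch intent filter are declared in the manifest):

```java
import android.app.Activity;
import android.content.Intent;
import android.nfc.NfcAdapter;
import android.nfc.Tag;
import android.os.Bundle;
import android.util.Log;

// Sketch: an Activity that receives NFC tag-discovered intents.
// The tag's ID can identify which physical object was presented.
public class NfcObjectActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        handleTag(getIntent());
    }

    @Override
    protected void onNewIntent(Intent intent) {
        super.onNewIntent(intent);
        handleTag(intent); // delivered here when using foreground dispatch
    }

    private void handleTag(Intent intent) {
        if (NfcAdapter.ACTION_TAG_DISCOVERED.equals(intent.getAction())
                || NfcAdapter.ACTION_NDEF_DISCOVERED.equals(intent.getAction())) {
            Tag tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
            if (tag != null) {
                Log.d("Nfc", "Tag id: " + bytesToHex(tag.getId()));
            }
        }
    }

    private static String bytesToHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }
}
```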
For iOS, there is no NFC or RFID technology available. However, you can build hardware with capacitive contact points arranged in a pattern over it, so that when you place it on the iOS screen, the touch controller recognizes the pattern of the contacts and reports it to your code, which can then identify the object. Capacitive touch screens like the iPhone's are basically an array of capacitors arranged in a grid. When you touch the screen with a finger, you change the capacitance of one or two of them, and the controller works out the location of the change. If instead you change the capacitance of, say, five or six sensors at the same time, in a particular arrangement such as a pentagon, then you can write software that checks whether the touched locations form that shape and, if they do, tells the user it is a $5 card (just an example). This is one way I can think of doing it.
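On Android, where you can read multi-touch data directly, a very rough sketch of this idea could look like the following; the five-contact threshold and the "token" interpretation are purely illustrative assumptions:

```java
import android.content.Context;
import android.graphics.PointF;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;

// Sketch: if a physical token presses several capacitive points at once,
// they show up as simultaneous pointers; their count and positions can be
// used to guess which token it is. Real matching would compare distances/angles.
public class TokenView extends View {

    public TokenView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int count = event.getPointerCount();
        if (count >= 5) { // e.g. a token with five contact points
            PointF[] points = new PointF[count];
            for (int i = 0; i < count; i++) {
                points[i] = new PointF(event.getX(i), event.getY(i));
            }
            // Hypothetical matcher: compare the point layout against known patterns.
            Log.d("Token", "Possible 5-point token near " + points[0].x + "," + points[0].y);
        }
        return true;
    }
}
```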
Thanks
I am building an Android app that uses the phone's camera. I know there is a way to build a custom camera view, but instead I am choosing to launch the camera app via an intent rather than build my own. I want to disable or hide the pause button while taking a video and have just the stop button. I looked up the Camera API Guide at www.developer.android.com, but it doesn't talk about how to do this. Does anyone know a way?
I don't think this is possible. Using an Intent is just a way of telling Android, "Hey, I'd like to take a video (or a photo, see a MapView, etc.). Can you do it for me?" It may trigger one or more apps listening for that Intent, depending on which apps the user has installed. Usually you can only set very basic options via the Intent, e.g. take a video or a picture, or tell the MapView at which position it should open. These options usually also appear inside the app during normal use. I have never seen a custom camera app without a pause button, or one where you could disable it from a menu, so the chance that such a special option can be set is close to zero.
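For what it's worth, the stock video-capture Intent only documents a handful of extras (quality, duration limit, size limit, output location); there is no extra for hiding parts of the recorder's UI such as the pause button. A minimal sketch:

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.provider.MediaStore;

// Sketch: launching the system camera app to record a video.
// Only a few basic options can be passed as extras; the recorder's own UI
// (including the pause button) cannot be customized from here.
public class RecordVideoActivity extends Activity {
    private static final int REQUEST_VIDEO = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
        intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);              // 0 = low, 1 = high
        intent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, 30);            // seconds
        intent.putExtra(MediaStore.EXTRA_SIZE_LIMIT, 10L * 1024 * 1024); // bytes
        startActivityForResult(intent, REQUEST_VIDEO);
    }
}
```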
I'm looking at the doc http://developer.android.com/reference/android/hardware/Camera.Parameters.html
and I'm not seeing anything that would let the user tap on a specific point in the camera preview and have that become the point the camera tries to focus on.
Is this simply missing, or am I overlooking how this can be done?
In Android 4.x this is possible with setFocusAreas. You'll have to check getMaxNumFocusAreas first to see whether the feature is supported on your device and how many areas you can use.
Then, you'll need to convert the user's touch coordinates to the coordinates used by the Camera.Area object (described here), and call setFocusAreas with the coordinates. From then on, calls to autoFocus will use that region for selecting focus.
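A rough sketch of that conversion, assuming an android.hardware.Camera instance and the preview view's width and height (rotation handling is omitted; the -1000..1000 range comes from the Camera.Area coordinate space):

```java
import android.graphics.Rect;
import android.hardware.Camera;
import java.util.Collections;

// Sketch: map a touch at (touchX, touchY) on a preview of size
// previewWidth x previewHeight into Camera.Area's -1000..1000 space
// and ask the camera to focus there.
public class TapToFocus {

    public static void focusAt(Camera camera, float touchX, float touchY,
                               int previewWidth, int previewHeight) {
        Camera.Parameters params = camera.getParameters();
        if (params.getMaxNumFocusAreas() <= 0) {
            return; // focus areas not supported on this device
        }

        // Scale view coordinates (0..width/height) into -1000..1000,
        // clamped so the 200x200 focus rect stays inside the valid range.
        int areaX = clamp((int) (touchX / previewWidth * 2000) - 1000, -900, 900);
        int areaY = clamp((int) (touchY / previewHeight * 2000) - 1000, -900, 900);

        Rect focusRect = new Rect(areaX - 100, areaY - 100, areaX + 100, areaY + 100);
        params.setFocusAreas(Collections.singletonList(new Camera.Area(focusRect, 1000)));
        camera.setParameters(params);

        camera.autoFocus(new Camera.AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean success, Camera cam) {
                // success tells you whether focus locked on the chosen area
            }
        });
    }

    private static int clamp(int value, int min, int max) {
        return Math.max(min, Math.min(max, value));
    }
}
```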