I am working on a project in which I have to measure the touch surface area. This works on both Android and iOS as long as the contact area is small (e.g. using the thumb). However, when the touch area increases (e.g. using the ball of the hand), the touch events are no longer passed to the application.
On my iPhone X (iOS 14.6), events were no longer passed to the app once UITouch.majorRadius exceeded 170. On my Android device (Redmi 9, Android 10), the cutoff was when MotionEvent.getPressure exceeded 0.44.
I couldn't find any documentation on this behavior, but I assume it's there to protect against erroneous inputs.
I looked in the settings of both devices, but I did not find a way to turn this behavior off.
Is there any way to still receive touch events when the touch area is large?
Are there other Android or iOS devices that don't show this behavior?
I would appreciate any help.
So I've actually done some work on touch with unusual contact areas. I was focusing on multitouch, but it's somewhat comparable. The quick answer is no, because natively to the hardware there is no such thing as a "touch event".
You have capacitance changes being detected. That is HEAVILY filtered by the drivers, which try to take capacitance differences and turn them into events. The OS does not deliver raw capacitance data to apps; it assumes you always want the filtered version. And if it did deliver the raw data, it would be very hardware-specific, and you'd have to reinterpret it into touch events yourself.
Here are a few things you're going to find out about touch:
1) Pressure on Android isn't what you should be looking at. Pressure is meant for things like styluses. You want getSize, which returns the normalized touch size (see the sketch after this list). Pressure is more about how hard someone is pushing, which really doesn't apply to finger touches these days.
2) Your results will vary GREATLY by hardware. Every sensor model behaves differently from every other.
3) The OS will confuse large touch areas and multitouch. Part of this is because when you make contact with a large area like the heel of your hand, the contact is not uniform throughout, which means the capacitances will differ, which will make the OS think it's seeing multiple fingers. When doing heavy multitouch you'll see the reverse as well (several nearby fingers look like one large touch). This is because the difference between the two, on a physical level, is hard to tell.
4) We were writing an app that enabled 10-finger multitouch actions on keyboards. We found that we missed high-order multitouch from women (especially Asian women) more than others; hand size greatly affected this, as did how much they hovered versus pressed down. The idea that there were physical capacitance differences in the skin was considered. We believed it was more due to touching the device more lightly, but we can't rule out actual physical differences.
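To make point 1 concrete, here's a minimal Kotlin sketch of reading the normalized size instead of pressure. The 0.5 threshold is an arbitrary placeholder; you'd have to calibrate it per device:

```kotlin
import android.app.Activity
import android.util.Log
import android.view.MotionEvent

class TouchSizeActivity : Activity() {
    override fun onTouchEvent(event: MotionEvent): Boolean {
        // getSize() returns a normalized 0..1 estimate of the contact
        // footprint; getPressure() is aimed at stylus-style hardware.
        val size = event.getSize(event.actionIndex)
        if (size > 0.5f) { // placeholder threshold, calibrate per device
            Log.d("TouchSize", "Large contact area: size=$size")
        }
        return true
    }
}
```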
Some of that is just a brain dump, but I think you'll need to look out for these things as you continue. I'm not sure exactly what you're trying to do, but best of luck.
I am developing an Android application that requires devices to be laid side by side and/or above and below each other.
I know I can use the Nearby API to detect devices "nearby", but I need something a little more fine-grained.
My app needs to be able to identify a device lying to the left, above, to the right, or below, while all devices are lying flat on a table (for instance).
I can find nothing on the web that describes this use case.
Is it possible?
UPDATE
My use case is that I want Android devices to be able to detect any number of "other devices" lying either to their left or right. The devices will be laid out horizontally with a "small" gap between each one.
In the same way that you might lay out children's lettered blocks to spell out a word or phrase, or numbered blocks to make a sum.
Not only should the line of devices be able to detect their immediate neighbours to their left and right, the two devices at either end should also be able to detect that they are the start and end (reading left to right) of the line.
Using proximity sensors is a likely way to solve your question. TYPE_PROXIMITY will give the distance to a nearby object, and TYPE_MAGNETIC_FIELD gives the geomagnetic field strength on the x/y/z axes.
For more, read Position Sensors.
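If it helps, a minimal sketch of listening to both sensors (standard SensorManager usage; either sensor may be absent on a given device):

```kotlin
import android.app.Activity
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle
import android.util.Log

class SensorProbeActivity : Activity(), SensorEventListener {
    private lateinit var sensorManager: SensorManager

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(SENSOR_SERVICE) as SensorManager
        // Register for both sensor types if the hardware provides them.
        listOf(Sensor.TYPE_PROXIMITY, Sensor.TYPE_MAGNETIC_FIELD).forEach { type ->
            sensorManager.getDefaultSensor(type)?.let { sensor ->
                sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_NORMAL)
            }
        }
    }

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_PROXIMITY ->
                Log.d("Sensors", "proximity=${event.values[0]} cm")
            Sensor.TYPE_MAGNETIC_FIELD ->
                Log.d("Sensors", "B=(${event.values[0]}, ${event.values[1]}, ${event.values[2]}) uT")
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```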
Make your own mock GPS (a local positioning system, to be exact). I don't have a link for this, but it's definitely possible; check out how GPS works to get the idea. Wi-Fi and Bluetooth are signals. But you know what else is a signal?
A: SOUND
Make each phone emit a loud beep in turn and measure the received audio strength on the others. This might work better than Wi-Fi/Bluetooth. Once you've measured relative distances between every pair of phones, it only takes a good algorithm to find the relative positions.
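A rough sketch of the listening side, measuring relative loudness with AudioRecord (requires the RECORD_AUDIO permission; turning loudness into distances and positions is the hard part and is not shown):

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import kotlin.math.sqrt

// Capture one buffer of microphone audio and return its RMS amplitude,
// a crude proxy for how loud (and thus roughly how close) a beep is.
fun measureLoudness(sampleRate: Int = 44100): Double {
    val bufSize = AudioRecord.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT
    )
    val recorder = AudioRecord(
        MediaRecorder.AudioSource.MIC, sampleRate,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize
    )
    val buffer = ShortArray(bufSize)
    recorder.startRecording()
    val read = recorder.read(buffer, 0, buffer.size)
    recorder.stop()
    recorder.release()
    var sumSquares = 0.0
    for (i in 0 until read) sumSquares += buffer[i].toDouble() * buffer[i]
    return sqrt(sumSquares / read)
}
```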
A possible alternative solution: use image processing. Get something like OpenCV for Android and set up one phone as a master. This will work only for a 2D layout.
Another "idea" - use the cameras. Stick A board on top of your surface with 4 QR codes in each corner. (This will help identify the edges and orientation of your phone). If you're looking for a 3D layout and the phones have sufficient in-between space, you could stick a QR behind every phone and show a QR on the screen of every phone.
All of these are possible solutions. Maybe you can use one of them on its own, maybe a combination. Who knows.
An idea, in case it's relevant for your use case:
Setup phase
Start your app on each device in "pairing mode".
Each device will show a QR code containing the key required for communicating with it (for example via Firebase) plus its screen details (size in pixels); see the sketch after these steps. It will also draw a rectangle at the screen boundaries.
A different phone, external to this layout, will run your app as a "master", taking a picture of the phones from above.
Now you need to write an algorithm to identify the screens and their locations, orientation and extract the QR codes for analysis. Not easy, but doable.
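A sketch of the pairing-QR generation, assuming the ZXing library and a made-up JSON payload format (rendering the resulting BitMatrix to the screen is omitted):

```kotlin
import com.google.zxing.BarcodeFormat
import com.google.zxing.common.BitMatrix
import com.google.zxing.qrcode.QRCodeWriter

// Hypothetical payload: a communication key plus the screen size in
// pixels, so the master phone can reconstruct each screen's extent.
fun pairingQr(deviceKey: String, widthPx: Int, heightPx: Int): BitMatrix {
    val payload = """{"key":"$deviceKey","w":$widthPx,"h":$heightPx}"""
    return QRCodeWriter().encode(payload, BarcodeFormat.QR_CODE, 400, 400)
}
```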
Interaction phase
Now all the phones (this should work with more than two phones) can collaborate, using their screens to show parts of the same movie, for example.
It seems not if you only have the two devices themselves. But if you have external sources (with known positions) of some signal (audio, vibration, BT or Wi-Fi radio, etc.) that the devices can detect with adequate accuracy, and the devices' clocks are synchronized, you can do this by comparing the signal's arrival time (or signal strength) on both devices, like in this picture:
Or, if you can add some sensors to one of the devices, you can create an "other device locator", for example like this sound locator.
UPDATE
In the updated formulation, the problem is still not fully solvable: it's possible to determine which two devices are at the edges, but you cannot determine which one is on the left and which is on the right. At least one device needs to know that it is, for example, the leftmost; then each device in turn generates a sound, the others receive it, and they determine their order from the differences in arrival time (at roughly 343 m/s, a 10 cm gap corresponds to only about 0.3 ms, so the timing must be precise). But an anchor point and time synchronization are necessary.
From my understanding of your use case, it is possible to find the number of devices surrounding a host device using the Nearby API and other techniques. But finding how many devices are on each side? I don't think that is possible with current mobile hardware and technology. Considering all the factors, magnetic sensors are about the only conceivable solution, and current phones don't have that capability.
The following points are based on the answers above.
Of TYPE_ACCELEROMETER, TYPE_LINEAR_ACCELERATION, TYPE_MAGNETIC_FIELD and TYPE_ORIENTATION, it is TYPE_MAGNETIC_FIELD (and the orientation values derived from it) that reacts to the magnetic field around the device (a compass reacts to a magnet). You can try an app using TYPE_MAGNETIC_FIELD and test how it reacts when another device is close to it (I think it will react).
But the point I am trying to make here is: if you put three devices on one side and four devices on the other, the MAGNETIC_FIELD sensor still only reads one combined field. So you can't identify how many devices are on each side unless you do some serious calculations.
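To illustrate: the sensor hands you a single field vector, so any arrangements with the same net field at the sensor look identical. A sketch (the baseline is a placeholder you'd have to calibrate):

```kotlin
import android.hardware.SensorEvent
import kotlin.math.sqrt

// Placeholder baseline; the ambient geomagnetic field is typically
// somewhere around 25..65 uT depending on where you are on Earth.
const val BASELINE_UT = 45f

// Returns how far the current reading deviates from the baseline.
// Three devices on one side and four on the other produce just one
// combined magnitude; the per-side counts are not recoverable.
fun fieldDeviation(event: SensorEvent): Float {
    val (x, y, z) = event.values
    return sqrt(x * x + y * y + z * z) - BASELINE_UT
}
```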
The second point: someone suggested the TYPE_PROXIMITY sensor, but it is not meant to serve this purpose. Current phones measure the proximity of an object in centimetres relative to the device's screen, and this sensor is typically used to determine whether a handset is being held up to a person's ear.
Another remote possibility is using location services: each device could determine its coordinates and share them with the host, e.g. over NFC. But the problem is that your use case puts the devices very close to each other, and such small distances are not measurable with location services.
To conclude, it is not possible to identify the number of devices on each side of a host device with current mobile hardware. It could be achieved with an external sensor that extends the phone's capabilities; for example, a phone case equipped with such sensors would open the window to other use cases and applications as well.
I can think of a way, but it may require a bit of work. First, check whether the two devices are lying flat by reading the device orientation, using the accelerometer or rotation vector to check pitch, roll, etc. (a sketch of the flat check follows below).
When you are sure that they are lying flat, send data from one device to the other using BT or Wi-Fi. The data should include the send time. Check the receive time on the other device, and also account for the latency of sending and receiving. If you can observe noticeable time differences for small distance differences between devices, it would be easy to estimate roughly how close they are. You could also ask users to hold their devices one metre (or some fixed distance) apart, to calibrate the travel time of the BT or Wi-Fi signal you send.
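A sketch of the "lying flat" check using the rotation vector sensor (the 10-degree tolerance is an arbitrary placeholder):

```kotlin
import android.hardware.SensorEvent
import android.hardware.SensorManager
import kotlin.math.abs

// Feed this from a TYPE_ROTATION_VECTOR SensorEvent. Pitch and roll
// near zero mean the device is lying flat on its back.
fun isLyingFlat(event: SensorEvent, toleranceDeg: Double = 10.0): Boolean {
    val rotation = FloatArray(9)
    val orientation = FloatArray(3) // azimuth, pitch, roll in radians
    SensorManager.getRotationMatrixFromVector(rotation, event.values)
    SensorManager.getOrientation(rotation, orientation)
    val pitch = Math.toDegrees(orientation[1].toDouble())
    val roll = Math.toDegrees(orientation[2].toDouble())
    return abs(pitch) < toleranceDeg && abs(roll) < toleranceDeg
}
```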
I've been trying to set up a "VR Box" VR headset with Google Cardboard. That headset does not have an input button, but I've made a QR code for a VR profile that accepts a magnetic button input. It kind of works: if you wave a magnet near the top left of the phone (a Samsung S5 Mini) as it sits in the viewer, it mostly triggers a click event, but not reliably.
Google clearly know that magnet input is not reliable and discourage it in the Cardboard help: https://support.google.com/cardboard/manufacturers/answer/6323710
My question is: does anyone know the metrics Google have used to decide that magnetic input is not reliable?
Magnets in Cardboard are a very neat idea, but one that cannot properly work on many devices.
The main reason is that phone manufacturers cut more and more corners these days. Gyroscopes are quite rare in today's phones; mostly flagship devices have them. Some major software has also disabled the use of gyroscopes in recent updates; for example, Chrome for Android stopped using the gyroscope as an orientation sensor.
It might be a surprise, because VR views work on budget devices, but it's true. What you're seeing is something called sensor fusion: it uses the accelerometer and magnetometer to simulate a gyroscope. This works fairly well, but with far less precision and some artifacts (try moving the phone quickly straight up and down in VR mode without rotating it: the view will rotate anyway).
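That fusion is exposed directly in the Android API; a minimal sketch of deriving orientation from accelerometer plus magnetometer readings, no gyroscope involved:

```kotlin
import android.hardware.SensorManager

// Combine the latest accelerometer and magnetometer readings into an
// orientation, i.e. the gyro-free sensor fusion described above. A
// strong magnet near the phone corrupts `geomagnetic` and, with it,
// the computed orientation, which is why head tracking degrades.
fun fuseOrientation(gravity: FloatArray, geomagnetic: FloatArray): FloatArray? {
    val rotation = FloatArray(9)
    val inclination = FloatArray(9)
    if (!SensorManager.getRotationMatrix(rotation, inclination, gravity, geomagnetic)) {
        return null // readings unusable (e.g. device in free fall)
    }
    val orientation = FloatArray(3)
    return SensorManager.getOrientation(rotation, orientation) // azimuth, pitch, roll
}
```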
And guess what happens if you put a big magnet near the magnetometer: it barely works at all.
TL;DR: adding a magnetic button makes the head tracking even worse.
What you should do is make your own touch input button. I have upgraded a few Cardboards this way; you have to press the screen somehow.
The best way is to get some anti-static foil and make a button out of it, so that when pressed it touches the screen and your finger at the same time. You could also run a very short, narrow copper wire from the screen to the outside and press that with your finger.
A simple hole for inserting your finger works too.
I have a requirement to count, at the end of the day, how many times the user put a hand on the Android screen.
After googling, I found that there is a proximity sensor in Android that senses hand movement without the screen being touched. The proximity sensor is located near the top of the screen and detects objects close to the phone, e.g. for answering and ending calls.
But I want to capture the user's hand actually pressing the Android screen. What would be a better solution? Or can my purpose be fully served by the proximity sensor alone?
The proximity sensor is very coarse; it is used to tell whether the device is close to something as big as a head.
You need to implement hooks for everything the user may touch and increment your counter, which should be saved and restored when the app goes to the background and returns (see the sketch below).
Note that you can only know about touches within your own app.
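A minimal sketch of that approach: count touch-down events at the Activity level (so you don't need a hook on every view) and persist the counter across background/foreground transitions. Names and the storage choice are placeholders:

```kotlin
import android.app.Activity
import android.view.MotionEvent

class TouchCountingActivity : Activity() {
    private var touchCount = 0

    // dispatchTouchEvent sees every touch in this Activity before any
    // view does, so per-view hooks are unnecessary.
    override fun dispatchTouchEvent(event: MotionEvent): Boolean {
        if (event.actionMasked == MotionEvent.ACTION_DOWN) {
            touchCount++
        }
        return super.dispatchTouchEvent(event)
    }

    override fun onPause() {
        super.onPause()
        // Save when going to background...
        getPreferences(MODE_PRIVATE).edit().putInt("touchCount", touchCount).apply()
    }

    override fun onResume() {
        super.onResume()
        // ...and restore when coming back.
        touchCount = getPreferences(MODE_PRIVATE).getInt("touchCount", 0)
    }
}
```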
I want to build a system where a tablet moves along a 1.5 metre rail with 10 stations. The user can move the tablet along the rail from one station to another. Hence a question:
How can the app recognize "the tablet has reached station #10"? The acceleration sensor cannot solve this. I thought of an NFC terminal at each station, but that is complicated and expensive.
Do you have any ideas how the tablet can recognize that it has moved? Is there, by the way, a possibility to suppress the Home and Power buttons?
Thanks, best regards
Maybe you can do it using the magnetic sensor and putting magnets on each station.
Or you can use the camera to detect QR codes or colored squares placed at each station.
You can't suppress the effect of the Home button.
For the Power button, you can try to intercept it by overriding the key-event handling (e.g. dispatchKeyEvent), though this is unlikely to work everywhere.
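A hedged sketch of that attempt; be aware that stock Android normally consumes the Power key before apps ever see it, so this may simply never fire:

```kotlin
import android.app.Activity
import android.view.KeyEvent

class KioskActivity : Activity() {
    override fun dispatchKeyEvent(event: KeyEvent): Boolean {
        // Best-effort only: most devices never deliver KEYCODE_POWER
        // to applications, so this branch may never be reached.
        if (event.keyCode == KeyEvent.KEYCODE_POWER) {
            return true // swallow the event if it does arrive
        }
        return super.dispatchKeyEvent(event)
    }
}
```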
I thought of an NFC terminal at each station, but that is complicated and expensive.
NFC tags cost well under 1 USD apiece. Assuming your tablets are NFC-capable, tags should work, provided you can position each tag in the proper spot, so that when the tablet is in the right place the tag sits underneath the tablet's NFC antenna.
Do you have any ideas how the tablet can recognize that it has moved?
NFC is likely to be the simplest and least-expensive option.
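A sketch of mapping scanned tags to stations, assuming NFC foreground dispatch has been set up so that tags arrive in onNewIntent; the tag IDs in the map are placeholders for the serial numbers of the tags you actually glue to the rail:

```kotlin
import android.app.Activity
import android.content.Intent
import android.nfc.NfcAdapter
import android.nfc.Tag

class RailActivity : Activity() {
    // Placeholder mapping from tag serial number (hex) to station index.
    private val stations = mapOf("04a1b2c3" to 1, "04d4e5f6" to 2)

    override fun onNewIntent(intent: Intent) {
        super.onNewIntent(intent)
        val tag: Tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG) ?: return
        val id = tag.id.joinToString("") { "%02x".format(it) }
        stations[id]?.let { station ->
            onStationReached(station) // the tablet is now over this station's tag
        }
    }

    private fun onStationReached(station: Int) { /* update UI, etc. */ }
}
```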
Is there, by the way, a possibility to suppress the Home and Power buttons?
Download the Android source code, modify it as you see fit, compile the results into a ROM mod, and install that on the tablets.