I want to build a system where a tablet moves along a 1.5 m rail with 10 stations. The user can move the tablet along the rail from one station to another. Hence my question:
How can the app recognize "Tablet has reached position #10"? The acceleration sensor cannot solve this. I thought of an NFC terminal on each station, but this is complicated and expensive.
Do you have any ideas how the tablet can recognize that it has moved? And is there, by the way, a possibility to suppress the Home and Power buttons?
Thanks, best regards
Maybe you can do it using the magnetic sensor and putting magnets on each station.
Or you can use the camera to detect QR codes or colored squares placed at each station.
You can't suppress the effect of the Home button.
For the Power button, try to intercept it by overriding the onKeyEvent method.
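A minimal sketch of the magnet idea above, assuming the tablet has a magnetometer. On Android the x/y/z values would come from a SensorEventListener registered for Sensor.TYPE_MAGNETIC_FIELD; the trigger threshold below is an invented example number and would need calibrating against your actual rail and magnets.

```java
// Sketch: deciding "tablet is over a station magnet" from magnetometer readings.
// On Android, x/y/z come from a SensorEventListener registered for
// Sensor.TYPE_MAGNETIC_FIELD; values are in microtesla (uT).
// The 150 uT threshold is a made-up example -- calibrate against your setup.
class StationDetector {
    static final double TRIGGER_UT = 150.0; // field strength meaning "magnet directly underneath"

    static double magnitude(double x, double y, double z) {
        return Math.sqrt(x * x + y * y + z * z);
    }

    static boolean overStation(double x, double y, double z) {
        return magnitude(x, y, z) > TRIGGER_UT;
    }

    public static void main(String[] args) {
        // Earth's field alone (~50 uT total): no station detected.
        System.out.println(overStation(20, 30, 35));
        // Strong field from a nearby magnet: station detected.
        System.out.println(overStation(200, 50, 80));
    }
}
```

Note that identical magnets only tell you "the tablet is over *some* station"; to know which one, you would need to count magnet transitions while moving, or make the stations distinguishable (e.g. different magnet arrangements).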
I thought of a NFC terminal on each station but this is complicated and expensive.
NFC tags cost well under 1 USD apiece. Assuming your tablets are NFC-capable, those should work, assuming you can position the tags in the proper spot, so that when the tablet is in the right place, the tag is underneath the place where the NFC antenna is on the tablet.
Do you have any ideas how the tablet can recognize it has moved?
NFC is likely to be the simplest and least-expensive option.
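With NFC tags, each tag has a stable unique ID, so resolving the station boils down to a lookup. A small sketch, assuming you record your real tag IDs once during installation; on Android the ID bytes would come from android.nfc.Tag#getId() in the tag-discovered intent, and the hex strings here are invented placeholders.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: resolving a station number from an NFC tag's unique ID.
// The hex UIDs below are invented placeholders -- use your real tags' IDs.
class StationRegistry {
    private final Map<String, Integer> stationByTagId = new HashMap<>();

    void register(String tagIdHex, int station) {
        stationByTagId.put(tagIdHex.toLowerCase(), station);
    }

    /** Returns the station number, or -1 for an unknown tag. */
    int stationFor(byte[] tagId) {
        StringBuilder hex = new StringBuilder();
        for (byte b : tagId) hex.append(String.format("%02x", b));
        return stationByTagId.getOrDefault(hex.toString(), -1);
    }

    public static void main(String[] args) {
        StationRegistry registry = new StationRegistry();
        registry.register("04a1b2c3", 1);   // placeholder UID for station 1
        registry.register("04d4e5f6", 10);  // placeholder UID for station 10
        System.out.println(registry.stationFor(
                new byte[]{0x04, (byte) 0xd4, (byte) 0xe5, (byte) 0xf6})); // 10
    }
}
```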
Is there by the way a possibility to suppress the home and power button?
Download the Android source code, modify it as you see fit, compile the results into a ROM mod, and install that on the tablets.
I am developing an Android application that requires devices to be laid side by side and/or above and below each other.
I know I can use the Nearby API to detect devices "nearby"; however, I need something a little more fine-grained.
My app needs to be able to identify whether a device is lying to the left, above, to the right, or below, while all devices are lying flat on a table (for instance).
I can find nothing on the web that describes this use case.
Is it possible?
UPDATE
My use case is that I want Android devices to be able to detect any number of "other devices" lying either to their left or right. The devices will be laid out horizontally with a "small" gap between each one.
In the same way that you might layout children's lettered blocks to spell out a word or phrase, or numbered blocks to make a sum.
Not only should the devices in the line be able to detect their immediate neighbours to their left and right, the two devices at either end should also be able to detect that they are the start and end (reading left to right) of the line.
Using position sensors is a likely way to solve your problem. TYPE_PROXIMITY gives the distance to a nearby object. TYPE_MAGNETIC_FIELD gives the geomagnetic field strength on the x/y/z axes.
For more read Position Sensors.
Make your own mock GPS (local positioning system, to be exact). I don't have a link for this, but it's definitely possible. Check out how GPS works to get an idea. Wi-Fi and Bluetooth are signals. But you know what else is a signal?
A: SOUND
Make each phone emit a loud beep in turn and measure the received audio strength on the others. This might work better than Wi-Fi/Bluetooth. Once you have measured relative distances between every pair of phones, it only takes a good algorithm to find their relative positions.
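The beep idea above can be sketched as follows, assuming the phones' clocks are synchronized: the one-way travel time of a beep gives the distance (speed of sound ~343 m/s), and sorting each phone's distance to a single reference phone at one end of the line recovers the left-to-right order. The phone IDs and timings are invented for illustration.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: acoustic ranging plus ordering along a line.
class BeepRanging {
    static final double SPEED_OF_SOUND_M_S = 343.0;

    /** Distance implied by emission and arrival timestamps (both in seconds). */
    static double distanceMeters(double emittedAt, double arrivedAt) {
        return (arrivedAt - emittedAt) * SPEED_OF_SOUND_M_S;
    }

    /** Orders phone IDs by their measured distance to the reference phone. */
    static List<String> orderByDistance(Map<String, Double> distanceById) {
        List<String> ids = new ArrayList<>(distanceById.keySet());
        ids.sort(Comparator.comparingDouble(distanceById::get));
        return ids;
    }

    public static void main(String[] args) {
        // A beep emitted at t=0 arriving 1 ms later is ~0.34 m away.
        System.out.println(distanceMeters(0.0, 0.001));

        Map<String, Double> d = new HashMap<>();
        d.put("phoneB", 0.40); // metres from the reference phone (invented)
        d.put("phoneA", 0.15);
        d.put("phoneC", 0.70);
        System.out.println(orderByDistance(d)); // [phoneA, phoneB, phoneC]
    }
}
```

In practice you would measure arrival times from the microphone stream (e.g. by cross-correlating against the known beep), which is where most of the engineering effort lies.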
A possible alternative solution: use image processing. Get something like OpenCV for Android and set up one phone as a master. This will work only for a 2D layout.
Another idea: use the cameras. Stick a board on top of your surface with 4 QR codes, one in each corner (this will help identify the edges and the orientation of each phone). If you're looking for a 3D layout and the phones have sufficient in-between space, you could stick a QR code behind every phone and show a QR code on the screen of every phone.
All of these are solutions. Maybe you can use individual ones, maybe a combination. Who knows.
An idea, in case it's relevant for your use case:
Setup phase
start your app on each device in "pairing mode".
Each device will show a QR code containing the key required for communicating with the device (for example via Firebase), and screen details: size in pixels. It will also draw a rectangle at the screen boundaries.
A different phone, external to this layout, will run your app as a "master", taking a picture of the phones from above.
Now you need to write an algorithm to identify the screens and their locations, orientation and extract the QR codes for analysis. Not easy, but doable.
Interaction phase
Now all the phones (this should work with more than two phones) can collaborate, e.g. combining their screens to show parts of the same movie.
Seems not, if you have only 2 devices. But if you have external sources (with known positions) of any signal (audio, vibration, BT or Wi-Fi radio, etc.) which can be detected by the devices with adequate accuracy, and the devices' time is synchronized, you can do this by comparing the time of signal start (or the signal strength) on both devices.
Or, if you can add some sensors to one of the devices, you can create an "other device locator", for example like this sound locator.
UPDATE
In the updated formulation, the problem is still not solvable: it's possible to determine which two devices are at the edges, but you cannot determine which one is on the left and which is on the right side. At least one device needs to know that it is, for example, the leftmost one. Then that device generates a sound, the others receive it and determine their order according to the differences in arrival time. But the anchor point and time synchronization are necessary.
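The "which two devices are at the edges" part can be sketched with plain pairwise distances: the two edge devices are simply the pair with the largest separation. Nothing in the distances, however, tells you which of them is the *left* one without an anchor, exactly as argued above. The distance matrix below is invented for illustration.

```java
// Sketch: from symmetric pairwise distance measurements, the two edge
// devices of a line are the pair that is farthest apart.
class EdgeFinder {
    /** Returns indices {i, j} of the two devices farthest apart. */
    static int[] edges(double[][] dist) {
        int bi = 0, bj = 1;
        for (int i = 0; i < dist.length; i++)
            for (int j = i + 1; j < dist.length; j++)
                if (dist[i][j] > dist[bi][bj]) { bi = i; bj = j; }
        return new int[]{bi, bj};
    }

    public static void main(String[] args) {
        // Three devices at positions 0, 1, 3 on a line (invented numbers).
        double[][] d = {
            {0, 1, 3},
            {1, 0, 2},
            {3, 2, 0},
        };
        int[] e = edges(d);
        System.out.println(e[0] + "," + e[1]); // 0,2
    }
}
```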
From my understanding of your use case, it is possible to find the number of devices surrounding a host device, using the Nearby API or other techniques. But finding how many devices are on each side? I don't think that is possible with current mobile hardware and technology. Considering all factors, magnetic sensors are the only halfway-plausible solution, and current phones have no such capability.
The following points are based on the answers above.
TYPE_ACCELEROMETER, TYPE_LINEAR_ACCELERATION, TYPE_MAGNETIC_FIELD and TYPE_ORIENTATION: these sensors react to the magnetic field around the device (a compass reacts to a magnet). You can try an app using TYPE_MAGNETIC_FIELD and test how it reacts when another device is close to it (I think it will react).
But the point I am trying to make here is: if you put three devices on one side and four devices on the other, the MAGNETIC_FIELD sensor still only reads the combined relative field. So we can't identify how many devices are on each side, unless you have made some serious calculations.
The second point is that someone suggested the TYPE_PROXIMITY sensor, but it is not meant to serve this purpose. Current phones measure the proximity of an object in cm relative to the view screen of the device. This sensor is typically used to determine whether a handset is being held up to a person's ear.
Another remote possibility is using location services: each device could determine its coordinates and communicate them to the host using NFC. But the problem is that your use case says the devices are very close to each other, so the gaps are not measurable using location services.
To conclude, it is not possible to identify the number of devices on each side of a host device with current mobile hardware. It could be achieved with an external sensor that extends the phone's capabilities, for example a phone case equipped with such sensing; that would open the window to other use cases and applications as well.
I can think of a way, but it may require a bit of work. First check whether the two devices are lying flat by getting the device orientation and using the accelerometer or rotation vector to check pitch, roll, etc.
When you are sure that they are lying flat, send data from one device to the other using BT or Wi-Fi. The data should include the send time. Check the receive time on the other device; you should also account for the latency of sending and receiving. If you can observe noticeable time differences (in ms) for small distance differences between devices, it would be easy to check approximately how close they are. You could also ask users to hold their devices one meter (or some fixed distance) apart to calibrate the travel time of the BT or Wi-Fi signal you send.
I've been trying to set up a "VR Box" VR headset with Google Cardboard. That headset does not have an input button, but I've made a QR code for a VR profile that accepts a magnetic button input. It kind of works - if you wave a magnet near the top left of the phone (a Samsung S5 Mini) as it sits in the viewer, it mostly triggers a click event. But not reliably.
Google clearly knows that magnet input is not reliable and discourages it in the Cardboard help: https://support.google.com/cardboard/manufacturers/answer/6323710
My question is: does anyone know the metrics Google have used to decide that magnetic input is not reliable?
Magnets in Cardboard are a very neat idea, but one that cannot properly work on many devices.
The main reason is that phone manufacturers cut costs more and more these days. Gyroscopes are quite rare in today's phones; mostly flagship devices have them. Some major software has also disabled the use of gyroscopes in recent updates; for example, Chrome for Android stopped using the gyroscope as an orientation sensor.
It might be a surprise, because VR views work on the budget devices, but it's true - what you're seeing is something called sensor fusion - it uses an accelerometer and magnetometer to simulate a gyroscope. This works fairly well, but with way less precision and some artifacts (try moving the phone in VR mode fast vertically without rotating it - the view will rotate).
And guess what happens if you put a big magnet near the magnetometer - it barely works at all.
TL;DR: adding a magnetic button makes the head tracking even worse.
What you should do is make your own input touch button. I have upgraded a few cardboards this way. You have to press the screen somehow.
The best way is to get some anti-electrostatic foil and make a button of it, so when pressed it will touch the screen and your finger at the same time. You could also use a very short and narrow copper cable from the screen to the outside and press it with your finger.
A simple hole for inserting your finger works too.
I would like to make a simple tile on the Microsoft Band 2 that can display the current skin temperature when the user taps the tile. One way to do this is to register a listener for BandSkinTemperatureEvent and constantly save the temperature, listen for ACTION_TILE_OPENED to detect the tap, then push the temperature back to the tile using TextBlockData. But that seems like an extremely roundabout way to display a sensor reading which essentially should already be available on the Band itself. Also, this won't work when the Android phone is not connected.
Is there a way to achieve this functionality without having to use the above method?
Short answer: No.
Longer answer: If you look at the specs of either Microsoft Band, they don't have much memory, and certainly not enough to allow running foreign code. So, in order to allow what you are asking for, Microsoft would need to create a new layout object. And, given that the skin temperature sensor is too close to the band's processor to ever be truly accurate, the chances of them doing that are almost zero.
I'd like to create an application that utilizes the touch screen as a "pad". There will be 3 small buttons in the bottom area of the touch screen, and the rest will be used as a mouse movement area.
The first button will act as the "left-click" of a real mouse, the second one as "scroll", and the last one as "right-click".
When a user makes any movement (event "move", "up", "down" or "cancel") in that area, the real mouse pointer on the Windows desktop will also move.
Transmission media will be Bluetooth and Wifi.
So, here are some questions:
1) Is it possible to utilize multi-touch in Froyo? An example for this case is when the user wants to select ("block") some text. With a real mouse, we just hold the left button and drag the pointer. On Android, this would be touching the first button while, at the same time, touching the "pad" area and making some movement.
2) How can I turn this application concept into a real application? (general ideas or algorithms)
You might want to check out RemoteDroid. It's an open-source app which has most of the functionality you described.
http://code.google.com/p/remotedroid/
An app like this is going to have two main parts: an Android app which generates a series of movement vectors or movement data, and a program on your target operating system which receives this data and translates it into software mouse movements. You will also need the Bluetooth stack necessary for that transfer (I get the feeling Wi-Fi won't give you the responsiveness you want without some serious optimization).
When it comes to the Android side of matters, I think you'll need to experiment to find the best way to capture those mouse movements. I'd think a speed-vector structure might be your best bet; it seems most similar to what I know of mouse movements.
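The "movement data" half could be sketched as a tiny fixed-size packet per touch event: an event type plus signed dx/dy deltas, which the desktop receiver decodes and applies to the cursor. The packet layout below is an invented example, not RemoteDroid's actual protocol.

```java
import java.nio.ByteBuffer;

// Sketch: a 5-byte movement packet (1 type byte + two signed 16-bit deltas)
// that the Android side streams to the PC over Bluetooth/Wi-Fi.
class MovePacket {
    static final byte MOVE = 0, LEFT_CLICK = 1, SCROLL = 2, RIGHT_CLICK = 3;

    static byte[] encode(byte type, short dx, short dy) {
        return ByteBuffer.allocate(5).put(type).putShort(dx).putShort(dy).array();
    }

    static short[] decodeDeltas(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);
        buf.get(); // skip the type byte
        return new short[]{buf.getShort(), buf.getShort()};
    }

    public static void main(String[] args) {
        byte[] p = encode(MOVE, (short) -12, (short) 7);
        short[] d = decodeDeltas(p);
        System.out.println(d[0] + "," + d[1]); // -12,7
    }
}
```

Sending deltas rather than absolute coordinates keeps the receiver independent of the phone's screen size and makes packet loss less harmful (the cursor just moves a little less).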
I am planning to develop an accelerometer-based mouse on the Android platform. The mobile device I plan to use is the HTC Nexus One. The cursor should move as the phone is moved in space. Will that be difficult compared to movement with respect to gravity?
This is hard to answer due to the way you have phrased the question.
What is it you are wanting to use the mouse for? If you are trying to move the mouse on a computer, you will need to also create a software package that the PC can run that has the ability to set the position of the mouse.
The accelerometers in phones detect, obviously, acceleration, usually in the x, y and z axes. If you lay your phone on the table, you will notice the phone reads 1 g. This is 1 g of acceleration due to gravity: even though the phone is not moving, you still measure it. You can detect the tilt of a phone by recording how this 1 g is split across the three axes. E.g. if you have equal g-force in the x and z axes and zero in y, then you can 'assume' the phone is being held at a 45-degree angle.
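The 45-degree example can be sketched with a single convention-free formula: the angle between the measured gravity vector and the z axis is how far the phone is tilted from lying flat. (Axis conventions vary between platforms; this sketch just assumes z points out of the screen.)

```java
// Sketch: tilt-from-gravity when the phone is static, so the accelerometer
// reading is (approximately) just the gravity vector.
class Tilt {
    /** Tilt from horizontal, in degrees, from raw accelerometer axes (m/s^2). */
    static double tiltDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(Math.hypot(ax, ay), az));
    }

    public static void main(String[] args) {
        System.out.println(tiltDegrees(0, 0, 9.81));     // 0.0  -- flat on the table
        System.out.println(tiltDegrees(6.94, 0, 6.94));  // 45.0 -- gravity split equally between x and z
        System.out.println(tiltDegrees(9.81, 0, 0));     // 90.0 -- standing on its edge
    }
}
```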
When the magnitude of the combined components is not equal to 1 g, you know your phone is actually accelerating. However, you need to know the position of your phone, and position is the second integral of acceleration: integrate acceleration over time to get velocity, then integrate velocity to get position (in each axis). The exact numerics are more than I can work out this morning, but the relationships are fairly simple to convert between, since they all share one constant: TIME!
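A minimal sketch of recovering position from acceleration (note that it is *integration*, not differentiation, that gets you there), assuming perfectly clean samples. In practice, sensor noise and bias make this estimate drift badly within seconds, which is a big part of why a bare accelerometer mouse is hard.

```java
// Sketch: naive Euler dead reckoning on one axis.
class DeadReckoning {
    double velocity = 0;  // m/s
    double position = 0;  // m

    void step(double accel, double dtSeconds) {
        velocity += accel * dtSeconds;   // integrate acceleration -> velocity
        position += velocity * dtSeconds; // integrate velocity -> position
    }

    public static void main(String[] args) {
        // Constant 1 m/s^2 for 1 s, in 1000 small steps: position -> ~0.5 m,
        // matching s = a*t^2/2 (plus a tiny Euler-integration error).
        DeadReckoning dr = new DeadReckoning();
        for (int i = 0; i < 1000; i++) dr.step(1.0, 0.001);
        System.out.println(dr.position); // ~0.5
    }
}
```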
Old question, but still relevant to newer hardware, so here goes...
Your biggest problem is the fact that an accelerometer alone can't tell the difference between acceleration due to motion and acceleration due to gravity and tilting. To isolate out motion, you need a second sensor. Your problem is very much like the one that people building Segway-like balancing robots face, and the solution is pretty much the same as well:
1. A gyroscope. I believe the Samsung Galaxy S phones have gyros, but I'm not sure whether they're "real" MEMS gyros, or just simulated somehow in a way that might not be up to the task.
2. The camera. This is an untested theory of mine, but if you could somehow either reflect enough light off the desk with the flash (on phones with LED flash), or perhaps used a mousepad with some glow-in-the-dark pattern, and you could force the camera to do low-res videocapture when it knows it's out of focus, you could probably do pattern-recognition on the blurry unfocused blobs well enough to determine whether the phone is moving or stationary, and possibly get some sense of velocity and/or direction. Combine the low-quality data from the realtime blurry camera video stream with the relatively high-res data from the accelerometers, and you might have something that works.
However, before you even bother with 1 or 2, make sure you're ready to tackle the bigger problem: emulation of a HID bluetooth mouse. It's possible (but might require a rooted phone), and at least one app in Android Market does it, but it's not a trivial task. You aren't going to solve THIS problem in an afternoon, and you should probably try to solve it at least well enough to emulate a fake mouse and convincingly pair it to a computer expecting a real bluetooth mouse before you even bother with the accelerometer problem. Both are high-risk, so don't try to completely finish one task before starting the other, but don't spend too much time on either until you've got a fairly good grip on the problem's scope and know what you're getting into.
There IS an alternative, if Bluetooth HID is too much... there are quite a few open-source projects that skip Bluetooth HID and just use Bluetooth as a serial port communicating with a server running on the PC (or tether directly via USB with adb). AFAIK, none of them have particularly good phone-as-mouse capabilities, unless you consider using the phone as a touchpad to be a "mouse".