I am planning to develop an accelerometer-based mouse on the Android platform. The mobile device I plan to use is the HTC Nexus One. The cursor should move as the phone is moved in space. Will that be difficult compared to movement with respect to gravity (tilt)?
This is hard to answer because of the way you have phrased the question.
What do you want to use the mouse for? If you are trying to move the mouse pointer on a computer, you will also need to create a software package for the PC that has the ability to set the position of the mouse.
The accelerometers in phones detect, obviously, acceleration, usually along the X, Y, and Z axes. If you lay your phone on the table, you will notice it reads 1g. That is 1g of acceleration due to gravity: even though the phone is not accelerating, the reading is still there. You can detect the roll of a phone by recording how that 1g is distributed across the three axes. For example, if you have equal g-force in the X and Z axes and zero in the Y, you can 'assume' the phone is being held at a 45-degree angle.
When the magnitude of the combined components is not equal to 1g, you know your phone is actually accelerating. However, you need to know the position of your phone. Due to a delightfully painful way the maths works, if you integrate the acceleration of your phone (in each axis) you get velocity, and if you integrate again you get position. The exact way you work out position from acceleration is more than I can think of in the morning, but the relationships are fairly simple to convert to/from, as long as you keep one thing constant for them all, which you can: TIME!
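As a minimal sketch of that double integration (the class and names here are mine, and in practice sensor noise and bias make the position estimate drift within seconds, so treat it as an illustration only):

    // Naive dead reckoning on one axis: integrate acceleration twice over time.
    // Drift from sensor noise and bias accumulates very quickly in practice.
    public class AxisIntegrator {
        private double velocity = 0.0; // m/s
        private double position = 0.0; // m

        /** accel in m/s^2, dt = seconds since the last sample. */
        public void addSample(double accel, double dt) {
            velocity += accel * dt;    // acceleration -> velocity
            position += velocity * dt; // velocity -> position
        }

        public double getPosition() {
            return position;
        }
    }

You would feed it one axis of SensorEvent.values along with the timestamp delta, and run one integrator per axis.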
Old question, but still relevant to newer hardware, so here goes...
Your biggest problem is the fact that an accelerometer alone can't tell the difference between acceleration due to motion and acceleration due to gravity and tilting. To isolate out motion, you need a second sensor. Your problem is very much like the one that people building Segway-like balancing robots face, and the solution is pretty much the same as well:
A gyroscope. I believe the Samsung Galaxy S phones have gyros, but I'm not sure whether they're "real" MEMS gyros, or just simulated somehow in a way that might not be up to the task.
The camera. This is an untested theory of mine, but if you could somehow either reflect enough light off the desk with the flash (on phones with an LED flash), or perhaps use a mousepad with a glow-in-the-dark pattern, and force the camera into low-res video capture even when it knows it's out of focus, you could probably do pattern recognition on the blurry unfocused blobs well enough to determine whether the phone is moving or stationary, and possibly get some sense of velocity and/or direction. Combine the low-quality data from the realtime blurry camera stream with the relatively high-res data from the accelerometers, and you might have something that works.
However, before you even bother with 1 or 2, make sure you're ready to tackle the bigger problem: emulation of a Bluetooth HID mouse. It's possible (but might require a rooted phone), and at least one app in Android Market does it, but it's not a trivial task. You aren't going to solve THIS problem in an afternoon, and you should probably solve it at least well enough to emulate a fake mouse and convincingly pair it to a computer expecting a real Bluetooth mouse before you even bother with the accelerometer problem. Both are high-risk, so don't try to completely finish one task before starting the other, but don't spend too much time on either until you've got a fairly good grip on the problem's scope and know what you're getting into.
There IS an alternative, if Bluetooth HID is too much... there are quite a few open source projects that skip Bluetooth HID and just use Bluetooth as a serial port communicating with a server running on the PC (or tether directly via USB with ADB). AFAIK, none of them have particularly good phone-as-mouse capabilities, unless you consider using the phone as a touchpad to be a "mouse".
I am working on a project for which I have to measure the touch surface area. This works on both Android and iOS as long as the surface area is small (e.g. using the thumb). However, when the touch area increases (e.g. using the ball of the hand), the touch events are no longer passed to the application.
On my iPhone X (software version 14.6), the events were no longer passed to the app when UITouch.majorRadius exceeded 170. And on my Android device (Redmi 9, Android version 10), when MotionEvent.getPressure exceeded 0.44.
I couldn't find any documentation on this behavior, but I assume it's there to protect against erroneous inputs.
I looked in the settings of both devices, but I did not find a way to turn this behavior off.
Is there any way to still receive touch events when the touch area is large?
Are there other Android or iOS devices that don't show this behavior?
I would appreciate any help.
So I've actually done some work on touch with unusual areas. I was focusing on multitouch, but it's somewhat comparable. The quick answer is no, because natively, to the hardware, there is no such thing as a "touch event".
You have capacitance changes being detected. That is HEAVILY filtered by the drivers, which try to take capacitance differences and turn them into events. The OS does not deliver raw capacitance data to the apps; it assumes you always want the filtered versions. And if it did deliver that, it would be very hardware specific, and you'd have to reinterpret it into touch events yourself.
Here are a few things you're going to find out about touch:
1) Pressure on Android isn't what you should be looking at. Pressure is meant for things like styluses. You want getSize, which returns the normalized size. Pressure is more about how hard someone is pushing, which really doesn't apply to finger touches these days (see the sketch after this list).
2) Your results will vary GREATLY by hardware. Every sensor differs from every other.
3) The OS will confuse large touch areas and multitouch. Part of this is because when you make contact with a large area, like the heel of your hand, the contact is not uniform throughout. That means the capacitances will differ, which will make it think you're seeing multiple fingers. When doing heavy multitouch, you'll see the reverse as well (several nearby fingers look like one large touch). This is because the difference between the two, on a physical level, is hard to tell.
4) We were writing an app enabling 10-finger multitouch actions on keyboards. We found that we missed high-level multitouch from women (especially Asian women) more than others; hand size greatly affected this, as does how much they hover versus press down. The idea that there were physical capacitance differences in the skin was considered. We believed it was more due to touching the device more lightly, but we can't rule out actual physical differences.
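To illustrate the getSize vs. getPressure distinction from point 1, here is a minimal sketch inside a custom View (the log tag and structure are my own):

    // Log the normalized touch size vs. pressure for every active pointer.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        for (int i = 0; i < event.getPointerCount(); i++) {
            float size = event.getSize(i);         // normalized touch area, 0..1
            float pressure = event.getPressure(i); // intended for stylus-like pressure
            Log.d("TouchDemo", "pointer=" + i + " size=" + size + " pressure=" + pressure);
        }
        return true; // consume the event so we keep receiving the rest of the gesture
    }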
Some of that is just a brain dump, but I think you'll need to know to look out for it as you continue. I'm not sure exactly what you're trying to do, but best of luck.
I am developing an Android application that requires devices to be laid side by side and/or above and below each other.
I know I can use the Nearby API to detect devices "nearby", however I need something a little more fine-grained.
My app needs to be able to identify a device lying either to the left, above, to the right, or below, while all devices are lying flat on a table (for instance).
I can find nothing on the web that describes this use case.
Is it possible?
UPDATE
My use case is that I want Android devices to be able to detect any number of "other devices" lying either to their left or right. The devices will be laid out horizontally with a "small" gap between each one.
In the same way that you might lay out children's lettered blocks to spell out a word or phrase, or numbered blocks to make a sum.
Not only should the devices in the line be able to detect their immediate neighbours to their left and right, the two devices at either end should also be able to detect that they are the start and end (reading left to right) of the line.
Using position sensors is a likely way to solve your question. TYPE_PROXIMITY gives the distance to a nearby object. TYPE_MAGNETIC_FIELD gives the geomagnetic field strength on the X/Y/Z axes.
For more, read Position Sensors.
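A minimal sketch of registering for both sensor types inside an Activity (the listener is assumed to be your own SensorEventListener):

    // Register for proximity and magnetic field updates.
    SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor proximity = sm.getDefaultSensor(Sensor.TYPE_PROXIMITY);
    Sensor magnetic  = sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    // getDefaultSensor() returns null if the device lacks that sensor.
    if (proximity != null) sm.registerListener(listener, proximity, SensorManager.SENSOR_DELAY_NORMAL);
    if (magnetic != null)  sm.registerListener(listener, magnetic,  SensorManager.SENSOR_DELAY_NORMAL);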
Make your own mock GPS (a local positioning system, to be exact). I don't have a link for this, but it's definitely possible. Check out how GPS works to get an idea. Wifi and Bluetooth are signals, but you know what else is a signal?
A: SOUND
Make each phone emit a loud beep in turn and measure the received audio strength on the others. This might work better than wifi/bluetooth. Once you measure relative distances between every pair of phones, it only takes a good algorithm to find their relative positions.
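As a rough sketch of the 'measure audio strength' half (requires the RECORD_AUDIO permission; all constants are my assumptions, and real ranging would also need synchronization and calibration):

    // Capture a short burst from the mic and compute its RMS amplitude,
    // as a crude proxy for how loud (and thus roughly how near) the beep is.
    int rate = 44100;
    int bufSize = AudioRecord.getMinBufferSize(rate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
    AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
    rec.startRecording();
    short[] buf = new short[bufSize];
    int n = rec.read(buf, 0, buf.length);
    double sum = 0;
    for (int i = 0; i < n; i++) sum += (double) buf[i] * buf[i];
    double rms = Math.sqrt(sum / Math.max(n, 1)); // louder = (roughly) closer
    rec.stop();
    rec.release();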
A possible alternative solution: use image processing. Get something like OpenCV for Android and set up one phone as a master. This will work only for a 2D layout.
Another "idea" - use the cameras. Stick A board on top of your surface with 4 QR codes in each corner. (This will help identify the edges and orientation of your phone). If you're looking for a 3D layout and the phones have sufficient in-between space, you could stick a QR behind every phone and show a QR on the screen of every phone.
All of these are solutions. Maybe you can use them individually, maybe in combination. Who knows.
An idea, in case it's relevant for your use case:
Setup phase
Start your app on each device in "pairing mode".
Each device will show a QR code containing the key required for communicating with the device (for example via Firebase) and its screen details (size in pixels). It will also draw a rectangle at the screen boundaries.
A different phone, external to this layout, will run your app as a "master", taking a picture of the phones from above.
Now you need to write an algorithm to identify the screens and their locations, orientation and extract the QR codes for analysis. Not easy, but doable.
Interaction phase
Now all the phones (this should work with more than two phones) can collaborate to show parts of the same movie, for example, across their screens.
Seems not, if you have only 2 devices. But if you have external sources (with known positions) of any signal (audio, vibration, BT or WiFi radio, etc.) which can be detected by the devices with adequate accuracy, and the devices' clocks are synchronized, you can do this by comparing the time of signal arrival (or the signal strength) on both devices, like in this picture:
Or, if you can add sensors to one of the devices, you can create an "other device locator", for example like this sound locator.
UPDATE
In the updated formulation the problem is still not solvable: it's possible to determine which two devices are at the edges, but you cannot determine which one is on the left and which is on the right side. At least one device needs to know that it is, for example, the leftmost; then one device generates a sound, the others receive it and determine their order according to the differences in arrival time. But an anchor point and time synchronization are necessary.
As I understand your use case, it is possible to find the number of devices surrounding the host device using the Nearby API and other techniques. But finding how many devices are on each side? I don't think that is possible with current mobile hardware and technology, because, considering all factors, magnetic sensors are the only remotely plausible solution, and current phones have no such capability.
The following points are based on the above answers.
Of the sensors mentioned in other answers (TYPE_ACCELEROMETER, TYPE_LINEAR_ACCELERATION, TYPE_MAGNETIC_FIELD, TYPE_ORIENTATION), it is TYPE_MAGNETIC_FIELD, and the TYPE_ORIENTATION derived from it, that react to the magnetic field around the device (a compass reacts to a magnet). You can try an app using TYPE_MAGNETIC_FIELD and test how it reacts when another device is brought close to it (I think it will react).
But the point I am trying to make here is: if you put three devices on one side and four devices on the other side, the MAGNETIC_FIELD sensor only reads the combined relative magnetic field. So we can't identify how many devices are on each side unless you do some serious calculations.
The second point is that someone suggested the TYPE_PROXIMITY sensor, but it is not meant to serve this purpose. Current phones measure the proximity of an object in cm relative to the view screen of the device. This sensor is typically used to determine whether a handset is being held up to a person's ear.
Another remote possibility is using the location sensor: it can identify coordinates relative to your device's coordinates, and you could communicate each device's coordinates to the host using NFC. But the problem is that your use case says those devices are very close to each other, so the distances are not measurable using location services.
To conclude, it is not possible to identify the number of devices on each side of a host device with current mobile hardware. It could be achieved with an external sensor that extends the phone's capability, for example a phone case equipped with such a capability; this would open a window to other use cases and applications as well.
I can think of a way, but it may require a bit of work. First check whether the two devices are lying flat by getting the device orientation and using the accelerometer or rotation vector to check pitch, roll, etc.
When you are sure that they are lying flat, send data from one device to another using BT or wifi. The data should include the send time. Check the receive time on the other device, and also account for the latency of sending and receiving data. If you get noticeable time differences (in ms) for small distance differences between devices, it would be easy to check approximately how close they are. You could also ask users to hold their devices one meter (or some fixed distance) apart to get a baseline time of travel for the BT or wifi signal you send.
I want to detect a specific pattern of motion on an Android mobile phone, e.g. if I do five sit-stands.
[Note: I am currently detecting motion, but the motion in all directions looks the same.]
What I need is:
I need to differentiate the motion downward, upward, forward and backward.
I need to find the height of the mobile phone from ground level (and the height of the person holding it).
Is there any sample project which has pattern motion detection implemented?
This isn't impossible, but it may not be extremely accurate, even though the accuracy of the accelerometers and gyroscopes in phones has improved a lot.
What your app will be doing is taking sensor data and doing a regression analysis.
1) You will need to build a model of data that you classify as five sit-stands. This could be done by asking the user to do five sit-stands, or by shipping the app with a more fine-tuned model built from data you've collected beforehand. There may be tricks you could use, such as shipping several models for people of different heights and asking the user to enter their height in the app, so you can pick the best model.
2) When run, your app will try to fit the data from the sensors (Android has great libraries for this) to the model that you've made. Hopefully, when the user performs five sit-stands, they will generate a set of motion data similar enough to your definition of five sit-stands that your algorithm accepts it as such.
A lot of the work here is assembling and classifying your model, and playing with it until you get an acceptable accuracy. Focus on what makes a sit-stand distinct from other up-and-down motions; for instance, there might be a telltale sign of extending the legs in the data, followed by a different shape for straightening up fully. Or, if you expect the phone to be in the pocket, you may not have much rotational motion, so you can reject test sets that register lots of change from the gyroscope.
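As a very small sketch of the 'fit the data to the model' step, here is a naive template comparison over a window of accelerometer magnitudes (the template, window length, and threshold are placeholders you would learn from recorded data; a real system would normalize and time-align first):

    // Compare a window of accelerometer magnitudes against a recorded template
    // using mean squared error.
    public class TemplateMatcher {
        private final double[] template; // magnitudes from a known sit-stand
        private final double threshold;  // tuned from training data

        public TemplateMatcher(double[] template, double threshold) {
            this.template = template;
            this.threshold = threshold;
        }

        public boolean matches(double[] window) {
            if (window.length != template.length) return false;
            double mse = 0;
            for (int i = 0; i < window.length; i++) {
                double d = window[i] - template[i];
                mse += d * d;
            }
            mse /= window.length;
            return mse < threshold; // small error = window resembles the template
        }
    }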
It is impossible. You can recognize downward and upward motion by comparing acceleration with the main gravity force, but how do you know whether your phone is in the back pocket as you rise, or just in your waving hand as you say hello? Was it 5 stand-ups or 5 hellos?
Forward and backward are even more unpredictable. What is forward for an upside-down phone? What is forward at all, from the phone's point of view?
And ground level, as well as height, is completely beyond measurement. The phone will move and produce accelerations in exactly the same way for a dwarf or a giant; it depends more on the person's behavior or stillness than on their height.
It's a topic of research, and probably I'm way too late to post it here, but I'm foraging the literature anyway, so what?
All kinds of machine learning approaches have been set on the issue; I'll mention some along the way. Andrew Ng's MOOC on machine learning gives you an entry point into the field and into Matlab/Octave that you can instantly put into practice, and it demystifies the monsters too ("support vector machine").
I'd like to detect whether somebody is drunk from phone acceleration and maybe angle, so I'm flirting with neural networks for the issue (they're good for basically every issue, if you can afford the hardware), since I don't want to assume pre-defined patterns to look for.
Your task, it seems, could be approached pattern-based. That approach has been applied to classify golf-play motions, dancing, behavioural everyday walking patterns, and, twice, drunk-driving detection, where one paper addresses the issue of finding a baseline for what actually is longitudinal motion as opposed to every other direction; that might contribute to finding the baselines you need, like what ground level is.
It is a dense shrub of aspects and approaches; below are just some more.
Lim e.a. 2009: Real-time End Point Detection Specialized for Acceleration Signal
He & Yin 2009: Activity Recognition from acceleration data Based on
Discrete Consine Transform and SVM
Dhoble e.a. 2012: Online Spatio-Temporal Pattern Recognition with Evolving Spiking Neural Networks utilising Address Event Representation, Rank Order, and Temporal Spike Learning
Panagiotakis e.a.: Temporal segmentation and seamless stitching of motion patterns for synthesizing novel animations of periodic dances
This one uses visual data, but walks you through a Matlab implementation of a neural network classifier:
Symeonidis 2000: Hand Gesture Recognition Using Neural Networks
I do not necessarily agree with Alex's response. This is possible (although maybe not as accurately as you would like) using the accelerometer, device rotation, and A LOT of trial and error and data mining.
The way I see this working is by defining a specific way that the user holds the device (or the device is locked and positioned on the user's body). As they go through the motions, the orientation combined with acceleration and time will determine what sort of motion is being performed. You will need to use classes like OrientationEventListener, SensorEventListener, SensorManager, Sensor, and various timers, e.g. Runnables or TimerTasks.
From there, you need to gather a lot of data. Observe, record, and study what the numbers are for specific actions, and then come up with a range of values that defines each movement and its sub-movements. What I mean by sub-movements is that a situp might have five parts:
1) Rest position where phone orientation is x-value at time x
2) Situp started where phone orientation is range of y-values at time y (greater than x)
3) Situp is at final position where phone orientation is range of z-values at time z (greater than y)
4) Situp is in rebound (the user is falling back down to the floor) where phone orientation is range of y-values at time v (greater than z)
5) Situp is back at rest position where phone orientation is x-value at time n (greatest and final time)
Add acceleration to this as well, because there are certain circumstances where acceleration can be assumed. For example, my hypothesis is that people perform the actual situp (steps 1-3 in my breakdown above) with faster acceleration than when they are falling back. In general, most people fall back more slowly because they cannot see what's behind them. That can also be used as an additional condition to determine the direction of the user. This is probably not true for all cases, however, which is why your data mining is necessary: I can also hypothesize that if someone has done many situps, the final situp is very slow, and then they just collapse back down to rest position from exhaustion. In that case the acceleration will be the opposite of my initial hypothesis.
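A minimal sketch of tracking those sub-movements as a state machine (the pitch thresholds and phase names are purely illustrative placeholders, not measured values):

    // Walk through the situp phases based on an estimated tilt angle.
    // The thresholds are made up; derive real ones from your recorded data.
    public class SitupTracker {
        private enum Phase { REST, RISING, TOP, FALLING }
        private Phase phase = Phase.REST;
        private int count = 0;

        /** pitchDegrees: phone tilt estimated from the orientation sensors. */
        public void onPitch(float pitchDegrees) {
            switch (phase) {
                case REST:    if (pitchDegrees > 20) phase = Phase.RISING;  break;
                case RISING:  if (pitchDegrees > 70) phase = Phase.TOP;     break;
                case TOP:     if (pitchDegrees < 60) phase = Phase.FALLING; break;
                case FALLING: if (pitchDegrees < 10) { phase = Phase.REST; count++; } break;
            }
        }

        public int getCount() { return count; }
    }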
Lastly, check out Motion Sensors: http://developer.android.com/guide/topics/sensors/sensors_motion.html
All in all, it is really a numbers game combined with your own "guesstimation". But you might be surprised at how well it works. Perhaps (hopefully) well enough for your purposes.
Good luck!
I'm working with Android sensor data. My application uses

    SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
and it had been working well until this morning, when the rotation matrix started producing a lot of noisy data (jumping from N to W within a second).
It's not a problem with my code, because it was working on Friday and no changes have been made since. I have used a compass app from the Market, and its compass is also giving random data.
I have tested my app on another tablet, and there it works well.
Does anyone know why this is happening? A problem with the sensor? Does it need calibration?
I've worked quite a lot with these electronic compasses on mobile phones, and it's quite possible that there is nothing wrong with your code or sensor.
Instead, it could very well be a problem with your environment. There are magnetic fields interfering with the earth's magnetic field all the time, from electrical equipment interference to the metal structure holding up a building. At the end of the day a compass is just a magnet: if you stand beside a large lump of metal, the compass will be attracted to it and point at it rather than at the magnetic north pole.
Try this:
Install GPS Status,
then turn off all filtering (Settings... GPS & sensors... sensor filtering... no filtering).
Do the calibration (the figure-of-8 wavy stuff) and then move the phone around your desk, near monitors, cables, etc. You'll see it go crazy; the information is completely unreliable. I found in the past that moving the phone a few inches to the right completely changed its reading. The same happens with a real compass. Strictly speaking there is no "problem": the device's compass is aligning itself with the strongest magnetic field. Even the magnetic content of nearby rocks can interfere with the compass.
As a further test I've just placed a real (orienteering) compass over my phone which has a compass app installed. The real compass is now pointing everywhere but magnetic North. The two devices are interfering with each other.
So my advice is: go somewhere in the open, like a park or field, away from any potential interference and power lines (if you have one, bring a real compass to check that the GPS Status app is pointing the right way), and see if your compass works as you'd expect.
Extra: The answer from @resus is also important when calibrating: rotate the phone a few times around each axis. It looks silly, but it does calibrate the compass properly.
Extra 2: Would it be possible/practical to use the compass bearing of your GPS? It would require that the device be moving (walking speed should be fine) but you would not need to worry about any interference. It should give an accurate reading provided your GPS signal is good.
Extra 3: Another thought just occurred to me: you could try applying a low-pass filter to the sensor. This means that sudden changes in the sensor reading are filtered out; have a look at this answer. And if that doesn't do a good job, there are lots of algorithms on the web for you to choose from.
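A minimal sketch of such a low-pass filter applied to three-component sensor values (ALPHA is a smoothing factor you would tune yourself):

    // Simple exponential low-pass filter: damps sudden jumps in sensor data.
    private static final float ALPHA = 0.15f; // smaller = smoother but laggier
    private final float[] filtered = new float[3];

    private float[] lowPass(float[] input) {
        for (int i = 0; i < input.length; i++) {
            filtered[i] += ALPHA * (input[i] - filtered[i]);
        }
        return filtered;
    }

You would call lowPass(event.values.clone()) in onSensorChanged() before using the values.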
If you definitely haven't changed anything in your code, and it still works fine on other devices, it would suggest a problem with that particular device.
While your app is running (i.e. the compass is in use), you should be able to wave it in a figure of 8 in order to automatically recalibrate the compass. You should also make sure you aren't standing next to any large lumps of metal etc. that might interfere with readings.
You can override the onAccuracyChanged() method of SensorEventListener to flash up a message to the user when the compass requires recalibration (probably when accuracy drops to SENSOR_STATUS_ACCURACY_LOW).
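For example, a minimal sketch of that callback inside an Activity (the Toast is just one way to surface the message):

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        if (sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD
                && accuracy <= SensorManager.SENSOR_STATUS_ACCURACY_LOW) {
            // Covers both ACCURACY_LOW and UNRELIABLE.
            Toast.makeText(this, "Compass needs calibration: wave your phone in a figure of 8",
                    Toast.LENGTH_LONG).show();
        }
    }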
In my experience of playing with the compass on android phones, they can be pretty unreliable...
If your application works on another tablet and other compass applications do not work on your device, this is probably due to bad calibration.
As said in the post above, to do the calibration, wave your device in a figure of 8. I just want to add that you should do it for EACH axis. This should fix your problem.
If it is not a calibration error, then, as some people have already answered, it is possible that the compass has gone through a strong magnetic field and is now demagnetized, so it is not working properly.
Where do you usually keep the tablet? Could it be that it was near big servers or magnets?
You should get the compass checked just in case; talk to Android's tech support.
Hope it helps.
I think the question was whether calibration could be done without asking the user to manipulate the compass. Not everybody calibrates the compass as shown in this video: https://support.google.com/maps/answer/6145351?hl=en and obviously you cannot do much more than advise the user to calibrate before using the program, or when you detect too many sudden changes.
For example, readings jumping left and right by 90 degrees in about 25 ms.
Anyway, I think it's good to give the app some seconds before it starts taking data, because at load time the sensor gives unstable values (too high and too low in a short time, without any movement).
Just guard the onSensorChanged() handler with a conditional, and start a thread in the onCreate() handler which sets a boolean to true after some seconds.
Then you start capturing data in the onSensorChanged() handler.
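A minimal sketch of that warm-up guard (the 3-second delay is an arbitrary choice):

    // Ignore sensor readings for the first few seconds after startup.
    private volatile boolean sensorReady = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        new Handler(Looper.getMainLooper())
                .postDelayed(() -> sensorReady = true, 3000); // 3 s warm-up
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (!sensorReady) return; // skip the unstable startup values
        // ... capture data here ...
    }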
This thread can also help with detecting the sensor accuracy, so you can pop up a warning: In Android can I programmatically detect that the compass is not yet calibrated?
I know because I am building a robot using the compass of a smartphone, and I'm having this experience. So, if you are making a robot, make sure to leave some space between the electronics/hardware and the smartphone. But remember this holds for any compass: magnetic fields are heavily distorted by metals.
Nowadays I have the luck of developing a robot with an HMC-5983 and an MPU-6050, which can be calibrated by using their libraries with Arduino.
That code is portable to other microcontrollers, but not so easily to smartphones; I guess the offsets needed for calibrating the compass, the gyro, and the accelerometer live somewhere in Android's internals and are not available in the SDK.
I answered earlier thinking that maybe calibration was only needed on some devices, but I have realized it must be as I said above.
So with robots it's possible, even easy; but on a smartphone, maybe custom firmware such as CyanogenMod would open the possibility of investigating how to set those offsets, and, more importantly, of running a program ported from an Arduino sketch (following its concept only) to obtain them first...
So, good luck! What is also true, for both devices (smartphone and my robot), is that you need to move them around for them to start working well, as I showed in the video in my previous answer; that helps on robots too.
Good luck, and have a lot of fun with these things; they are very powerful.
I'm making an application that works as a compass.
I'm using the accelerometer and magnetic field sensors to compute the azimuth angle through SensorManager.getOrientation().
I'm searching for something that can improve the magnetic field sensor's accuracy, since I'm getting its accuracy status as UNRELIABLE!
Does anyone know anything about this? I'm looking for something that can either be hardcoded or done physically, for instance just moving the phone around until it gets calibrated!
This is not a definitive answer (I don't know anything for sure), but my understanding from online posts is that waving the phone around in a figure of 8 a few times while the compass is in use is supposed to trigger automatic recalibration. This is what the Google Maps app suggests, for example. I don't know whether this depends on application functionality (something in Maps that detects the waving via the accelerometer and triggers a recalibration), on something in the Android stack, or on per-phone implementations. Try it and see!
E.g. this discussion: http://androidforums.com/epic-4g-support-troubleshooting/217317-cant-get-compass-calibrate.html
This reference appears to suggest this per-axis / figure-8 rotation process is built-in functionality: http://m.eclipsim.com/gpsstatus/
And here another article that claims this is built-in functionality, and that you don't even need to be running a compass-consuming app for the recalibration to work: http://www.ichimusai.org/2009/06/20/how-to-calibrate-the-htc-magic-compass/
Just a few points:
The figure-8 motion works sometimes and not others, and I have no idea why. They really need some code-based way to check whether the figure-8 motion worked (assuming the physical motion is actually required).
They also need a way to detect that calibration is required. I looked at the code for the accuracy output (the UNRELIABLE constant), and once they send it to you they will not send it again; so, for instance, if you calibrate but then come within a strong magnetic field, it will not be re-sent (not sure why they did that).
One not completely reliable way to detect ongoing issues is to use the magnetic sensor output directly: compute field = sqrt(x*x + y*y + z*z), check that the field falls between, say, 25 and 65, and ask the user to calibrate if it does not.
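A minimal sketch of that check inside onSensorChanged() (the 25-65 window is the rough range of Earth's field in microtesla, per the point above):

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            float x = event.values[0], y = event.values[1], z = event.values[2];
            double field = Math.sqrt(x * x + y * y + z * z); // magnitude in uT
            if (field < 25 || field > 65) {
                // Outside Earth's typical field strength: ask the user to calibrate.
            }
        }
    }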
The bottom line, after testing 18 phones, is that I would never depend on an Android-based compass with the current crop of phones; accuracy will always be in question.
I have also found that even if you are lucky and have a fairly reliable phone, you can never be sure it's calibrated without checking it against a real compass, which kind of defeats the purpose.
NOTE: On a lot of the misbehaving phones, we found that the sensor writes a calibration file and a tmp file with the same name. If you delete those files and reboot the phones, the calibration file is recreated with zeroed values, and the cold-start and general calibration problems resolve themselves.
The bad news is that they are stored in /data/misc and require root privileges to get at (thanks, Google & sensor manufacturers!), so even though I suspect this would solve a lot of problems for a lot of developers, it just is not viable for a marketplace app.
I am developing for Android, using Titanium Alloy as the development tool with the Titanium Geolocation module.
I have only tested 2 devices [Galaxy Note and S4] against a commercial magnetic compass. Following a calibration process [tilting along the 3 axes] and using 2 different compass apps plus the app I'm working on, the Android compass seems accurate enough for basic use; correlation was good enough for my purpose anyway. I also found the device's compass reading to be very sensitive to other magnetic and electrical field interference. An initial mistake I made was to use the compass while the device was in a protective cover with a magnetic closure [quite common on tablets]; that interference is particularly strong. I thus need to suggest to users of my app that they remove device covers, keep the device away from other electronics, and then do the standard calibration before initializing the app.
Another option is:
Go to the sensors menu: *#0*#
Then, if you see a red line in the Magnetic Sensor section and a "Need for calibration" notice, you should recalibrate your compass.
How?
According to those guys:
Turn the Samsung Galaxy S5 Mini around all of its axes until the red line in the black circle changes color from red to blue. You can also run through a motion that follows the shape of an 8. It may be that several attempts are needed to calibrate the compass...