I am trying to write a variable-brightness flashlight app using PWM (I might use it for communication later). For that I need fast switching of the camera LED (say 100–200 Hz), which is not possible through the Camera API's setParameters functionality (I guess the camera pipeline itself slows things down considerably).
Now, the LED itself is capable of switching rapidly, and there are apps doing something similar (HTC's flashlight, for example; unfortunately I couldn't find its source code), so it all comes down to controlling the LED without the camera.
Any thoughts or ideas?
I know this is 4 years later, but you'd need a lot more than 100–200 Hz for PWM to work properly without irritating the eye. You might get some control, but you won't be able to get 10% brightness without the pulses becoming noticeable, and even then the duration of those pulses is too long to fool the eye. Typically PWM is handled at the microsecond level, around 100 kHz. I would like this to be possible as well: if we could put, say, a 100 kHz carrier frequency on the flash, it would be possible to calculate the distance to a subject with dedicated pixels in the sensor, and to reject all ambient light through demodulation, if all pixels could be scanned fast enough. Sadly it's not possible, though.
Normally there'll be a PWM peripheral in the processor that handles the rapid switching for you, but that needs driver support; it won't be accessible to user applications. Here's a question which uses the driver to do it: Set brightness of flash in Android
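To make the duty-cycle arithmetic in this discussion concrete, here is a minimal sketch (the function name is my own, purely illustrative; it only computes the timing, since user apps can't actually drive the LED this fast):

```python
def pwm_times(freq_hz, duty):
    """Return (on_ms, off_ms) for one PWM period.

    freq_hz: switching frequency in Hz; duty: fraction of the period lit.
    """
    if not 0.0 <= duty <= 1.0:
        raise ValueError("duty must be between 0 and 1")
    period_ms = 1000.0 / freq_hz
    on_ms = period_ms * duty
    return on_ms, period_ms - on_ms

# At 200 Hz the whole period is 5 ms, so 10% brightness needs a 0.5 ms
# pulse -- far shorter than a setParameters round trip through the
# camera stack, which is why the approach in the question stalls.
```

At 100 kHz, by comparison, the period is only 10 µs, which is why this is normally left to a hardware PWM peripheral.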
I know that when I want to use any of Android's sensors, I have to register them and read their values via onSensorChanged().
If I am done with them, I unregister them to save power.
How come the step counter in my Galaxy S7's health app is able to count steps?
I assume it's not, e.g., the accelerometer, since the high power consumption of that sensor would drain my battery in hours.
I found a really great article on this because I wondered the same a while back.
https://www.explainthatstuff.com/how-pedometers-work.html
Here is the short version:
Modern pedometers work in a very similar way but are partly electronic. Open one up and you'll find a metal pendulum (a hammer with a weight on one end) wired into an electronic counting circuit by a thin spring. Normally the circuit is open and no electric current flows through it. As you take a step, the hammer swings across and touches a metal contact in the center, completing the circuit and allowing current to flow. The flow of current energizes the circuit and adds one to your step count. As you complete the step, the hammer swings back again (helped by the spring) and the circuit is broken, effectively resetting the pedometer ready for the next step. The pedometer shows a count of your steps on an LCD display; most will convert the step count to an approximate distance in miles or kilometers (or the number of calories you've burned off) at the push of a button. Note that in some pedometers, the hammer-pendulum circuit works the opposite way: it's normally closed and each step makes it open temporarily.
More sophisticated pedometers (including some of the really good ones made by Omron) work entirely electronically and, since they have no moving parts, tend to be longer-lasting, more reliable, and considerably more accurate. They dispense with the swinging pendulum-hammer and measure your steps with two or three accelerometers instead. These are microchips arranged at right angles that detect minute changes in force as you move your legs. Since accelerometers are often built into gadgets like cellphones, it's increasingly common to find these sorts of things offering to count your steps for you too (there are plenty of pedometer apps for the iPhone, for example). GPS satellite navigation devices can also figure out how far you've walked or run, but they do it by calculating from satellite signals rather than counting steps.
Some sensors, like the accelerometer and the light sensor, are running at all times in a low-power mode. The light sensor runs constantly to adjust the screen brightness according to ambient light, and the accelerometer runs constantly to detect screen rotation. I suspect this low-power mode has the sensors running at low frequencies to save battery.
I'm working with Android sensor data. My application uses
SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
and it had been working well until this morning, when the rotation matrix started returning very noisy data (jumping from N to W within a second).
It's not a problem with my code, because it was working on Friday and no changes have been made since. I have tried a compass app from the market, and the compass gives random data too.
I have tested my app on another tablet, and it is working well.
Does anyone know why this is happening? A problem with the sensor? Does it need calibration?
I've worked quite a lot with these electronic compasses on mobile phones, and it's quite possible that there is nothing wrong with your code or sensor.
Instead it could very well be a problem with your environment. There are magnetic fields interfering with the Earth's magnetic field all the time, from electrical equipment to the metal structure holding up a building. At the end of the day a compass is just a magnet: if you stand beside a large lump of metal, the compass will be attracted to it and point at it rather than at the magnetic north pole.
Try this:
Install GPS status
then turn off all filtering (settings... gps & sensors...sensor filtering... no filtering).
Do the calibration (the figure-of-8 wavy stuff) and then move the phone around your desk, near monitors, cables, etc. You'll see it go crazy; the information is completely unreliable. I found in the past that moving the phone a few inches to the right completely changed its reading. The same happens with a real compass. Strictly speaking there is no "problem": the device's compass is aligning itself with the strongest magnetic field. Even the magnetic content of nearby rocks can interfere with the compass.
As a further test I've just placed a real (orienteering) compass over my phone which has a compass app installed. The real compass is now pointing everywhere but magnetic North. The two devices are interfering with each other.
So my advice is: go somewhere in the open, like a park or field, away from any potential interference and power lines (if you have one, bring a real compass to check that the GPS Status app is pointing the right way), and see if your compass works as you'd expect.
Extra: The answer from #resus is also important when calibrating. Rotate the phone a few times in each axis. It looks silly, but it does calibrate it properly.
Extra 2: Would it be possible/practical to use the bearing from your GPS instead? It would require that the device be moving (walking speed should be fine), but you would not need to worry about any interference. It should give an accurate reading provided your GPS signal is good.
Extra 3: Another thought just occurred to me: you could try applying a low-pass filter to the sensor. This means that sudden changes in the sensor reading are filtered out; have a look at this answer. And if that doesn't do a good job, there are lots of algorithms on the web to choose from.
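A minimal version of the low-pass idea from Extra 3, as a generic Python sketch (alpha is a tuning constant you'd pick by experiment; on Android the update would run inside onSensorChanged()):

```python
def low_pass(samples, alpha=0.15):
    """Exponential low-pass filter: smaller alpha means heavier smoothing."""
    value = samples[0]
    out = []
    for s in samples:
        value = value + alpha * (s - value)  # move a fraction toward the new reading
        out.append(value)
    return out
```

Note that compass headings wrap at 360 degrees, so a real implementation should filter the sine and cosine of the heading (or the rotation vector components) rather than the raw angle.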
If you definitely haven't changed anything in your code, and it still works fine on other devices, it would suggest a problem with that particular device.
While your app is running (i.e. the compass is in use), you should be able to wave it in a figure of 8 in order to automatically recalibrate the compass. You should also make sure you aren't standing next to any large lumps of metal etc. that might interfere with readings.
You can override the onAccuracyChanged() method of SensorEventListener to flash up a message to the user when the compass requires recalibration (probably when accuracy drops to SENSOR_STATUS_ACCURACY_LOW).
In my experience of playing with the compass on android phones, they can be pretty unreliable...
If your application works on another tablet and other compass applications do not work on your device, this is probably due to bad calibration.
As said in the post above, to do the calibration, wave your device in a figure of 8. I just want to add that you should do it for EACH axis. This should fix your problem.
If it is not a calibration error then, as some people have already answered, it is possible that the compass has gone through a strong magnetic field and is now demagnetized, so it is not working properly.
Where do you usually keep the tablet? Could it be that it was near big servers or magnets?
You should check the compass just in case; talk to Android's tech support.
Hope it helps.
I think the question was whether calibration could be done without requiring anything from the user. As shown in this video, https://support.google.com/maps/answer/6145351?hl=en, you obviously cannot do much more than advise the user to calibrate before using the program, or when you detect too many sudden changes.
For example, readings jumping 90 degrees left and right within about 25 ms.
Anyway, I think it's good to give the app a few seconds before it starts taking data, because the sensor reports unstable values (spiking high and low in a short time without any movement) right after the app loads.
Just guard the onSensorChanged() handler with a conditional, and start a thread in the onCreate() handler that sets a boolean to true after a few seconds.
Then you start capturing data in the onSensorChanged() handler.
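The warm-up guard described above amounts to comparing timestamps. A minimal sketch (the class name and the 3-second default are my own; in the app the check would live inside onSensorChanged()):

```python
class SensorGate:
    """Discard sensor events until the sensor has had time to settle."""

    def __init__(self, start_time_s, warmup_s=3.0):
        # warmup_s is arbitrary; tune it for your device
        self.ready_at = start_time_s + warmup_s

    def accept(self, event_time_s):
        """Return True once the warm-up period has passed."""
        return event_time_s >= self.ready_at
```
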
This thread can also help you detect the sensor accuracy, so you can pop up a warning: In Android can I programmatically detect that the compass is not yet calibrated?
I know because I am building a robot using the compass of a smartphone, and I'm having this experience. So if you are making a robot, make sure to leave some space between the smartphone and the electronics and motors, but remember this applies to any compass: magnetic fields are heavily distorted by nearby metal.
Nowadays I have the luck of developing a robot with an HMC-5983 and an MPU-6050, which can be calibrated using their libraries with Arduino.
That code is portable to other microcontrollers, but not so easily to smartphones; I guess the offsets needed to calibrate the compass, the gyro, and the accelerometer live somewhere in Android's internals, not exposed through the SDK.
I answered earlier thinking that maybe calibration was only needed on some devices, but I realized it must be as I said above.
So with robots it's possible, even easy, but on a smartphone you'd probably need custom firmware such as CyanogenMod to have any chance of investigating how those offsets are set, and more importantly to run a program ported from the Arduino sketch (following its concept, at least) to obtain them in the first place...
So, good luck! What is also true is that both devices (the smartphone and my robot) need to be moved around before they start working well, as I showed in the video from my earlier answer, which is also helpful for robots.
Good luck and have fun with these things; they are very powerful.
Is the technology there for the camera of a smartphone to detect a flashing light and decode it as Morse code, at a maximum range of 100 m?
There's already at least one app in the iPhone App Store that does this over some unspecified distance. And the camera can detect luminance at a much greater distance, given enough contrast between the on and off light levels, a dot rate slow enough not to alias against the frame rate (remember Nyquist sampling), and maybe a tripod to keep the light centered on some small set of pixels. So the answer is probably yes.
I think it's possible in ideal conditions: clear air and no other light noise, like on a dark night in the mountains. The problem is that users would try to use it in the city, in discos, etc., where it would obviously fail.
If you can record a video of the light and easily visually decode it upon watching, then there's a fair chance you may be able to do so programmatically with enough work.
The first challenge would be finding the light in the background, especially if it's small and/or there's any movement of the camera or the source. You might actually be able to leverage some kinds of video compression technology to help filter out the movement.
The second question is whether the phone has enough horsepower, and your algorithm enough efficiency, to decode it in real time. For a slow enough signaling rate, the answer would be yes.
Finally there might be things you could do to make it easier. For example, if you could get the source to flash at exactly half the camera frame rate when it is on instead of being steady on, it might be easier to identify since it would be in every other frame. You can't synchronize that exactly (unless both devices make good use of GPS time), but might get close enough to be of help.
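The Nyquist constraint mentioned above puts a hard ceiling on the signalling rate. A one-liner makes the arithmetic explicit (assuming an idealized camera with a steady frame rate; the function name is mine):

```python
def max_dot_rate(frame_rate_hz, samples_per_state=2):
    """Fastest on/off rate a camera can follow without aliasing.

    Nyquist needs at least 2 samples per state; a practical decoder
    wants 3-4 for margin against jitter and rolling shutter.
    """
    return frame_rate_hz / samples_per_state

# A 30 fps camera can track at most 15 state changes per second,
# and more realistically 7-8 with a safety margin.
```
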
Yes, the technology is definitely there. I wrote an Android application for my "Advanced Internet Technology" class which does exactly what you describe.
The application still has problems with bright noise (when other light sources leave or enter the camera view while recording). The approach I'm using just takes the overall brightness changes to extract the Morse signal.
There are some more or less complicated algorithms in place to correct the auto-exposure problem (the image darkens shortly after the light is "turned on") and to detect the thresholds for the Morse signal's strength and speed.
Overall performance of the application is good. I tested it during the night in the mountains, and as long as the sending signal is strong enough there is no problem. In the library (with different light sources around) it was less accurate; I had to be careful not to have additional light sources at the edge of the camera view. The application required the length of a "short" Morse signal to be at least 300 ms.
A better approach would be to search the screen for the actual light source. For my project it turned out to be too much work, but with it you should get good detection in noisy environments.
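The brightness-threshold approach described in this answer can be sketched roughly as follows (the names and the 2x short/long cutoff are my own assumptions; a real decoder would also classify the off-gaps into letter and word spaces):

```python
def to_marks(brightness, threshold, frame_ms, short_ms=300):
    """Classify runs of bright frames into Morse dots and dashes.

    A run shorter than twice short_ms becomes '.', anything longer '-'.
    short_ms matches the minimum "short" signal length mentioned above.
    """
    marks = []
    run = 0
    for b in brightness + [0]:        # trailing 0 flushes the final run
        if b > threshold:
            run += 1
        elif run:
            marks.append('.' if run * frame_ms < 2 * short_ms else '-')
            run = 0
    return marks
```

For example, at 10 frames per second (frame_ms=100), a 3-frame flash reads as a dot and a 9-frame flash as a dash.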
I am planning to develop an accelerometer-based mouse on the Android platform. The mobile device I plan to use is the HTC Nexus One. The cursor should move as the phone is moved in space. Will that be difficult compared to movement with respect to gravity?
This is hard to answer due to the way you have phrased the question.
What is it you are wanting to use the mouse for? If you are trying to move the mouse on a computer, you will need to also create a software package that the PC can run that has the ability to set the position of the mouse.
The accelerometers in phones detect, obviously, acceleration, usually in the x, y, and z axes. If you lay your phone on the table, you will notice it reads 1 g. This is 1 g of acceleration due to gravity; even though the phone is not moving, you still measure it. You can detect the roll of the phone by looking at how this 1 g is split between the three axes: if you have equal g-force in the x and z axes and zero in y, then you can 'assume' the phone is being held at a 45-degree angle.
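The 45-degree example can be checked in a couple of lines (a sketch only; a real app would use all three axes and the SensorManager helper methods):

```python
import math

def tilt_degrees(gx, gz):
    """Tilt angle implied by the gravity components on two axes."""
    return math.degrees(math.atan2(gx, gz))

# Equal force on both axes gives the 45-degree case described above;
# all of gravity on one axis means the phone is flat.
```
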
When the magnitude of the components is not equal to 1 g, you know your phone is actually accelerating. However, you want the position of your phone, and due to the delightfully painful way the maths works, you get it by integrating the acceleration twice (in each axis): acceleration integrates to velocity, and velocity integrates to position. The exact numerical details are more than I can work out this morning, but the relationships are fairly simple to convert between, as long as you keep track of the one constant linking them all: TIME!
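Integrating twice can be sketched numerically like this (a naive rectangle rule; in practice accelerometer noise and bias make the position estimate drift within seconds, which is why a second sensor is needed):

```python
def integrate_twice(accel, dt):
    """Acceleration samples -> velocity -> position (rectangle rule)."""
    velocity = 0.0
    position = 0.0
    for a in accel:
        velocity += a * dt         # integrate acceleration to velocity
        position += velocity * dt  # integrate velocity to position
    return position

# A constant 2 m/s^2 for one second should land near 0.5 * a * t^2 = 1 m.
```
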
Old question, but still relevant to newer hardware, so here goes...
Your biggest problem is the fact that an accelerometer alone can't tell the difference between acceleration due to motion and acceleration due to gravity and tilting. To isolate out motion, you need a second sensor. Your problem is very much like the one that people building Segway-like balancing robots face, and the solution is pretty much the same as well:
A gyroscope. I believe the Samsung Galaxy S phones have gyros, but I'm not sure whether they're "real" MEMS gyros, or just simulated somehow in a way that might not be up to the task.
The camera. This is an untested theory of mine, but if you could somehow either reflect enough light off the desk with the flash (on phones with LED flash) or perhaps use a mousepad with some glow-in-the-dark pattern, and you could force the camera to do low-res video capture while it knows it's out of focus, you could probably do pattern recognition on the blurry unfocused blobs well enough to determine whether the phone is moving or stationary, and possibly get some sense of velocity and/or direction. Combine the low-quality data from the realtime blurry camera stream with the relatively high-res data from the accelerometers, and you might have something that works.
However, before you even bother with 1 or 2, make sure you're ready to tackle the bigger problem: emulation of a HID bluetooth mouse. It's possible (but might require a rooted phone), and at least one app in Android Market does it, but it's not a trivial task. You aren't going to solve THIS problem in an afternoon, and you should probably try to solve it at least well enough to emulate a fake mouse and convincingly pair it to a computer expecting a real bluetooth mouse before you even bother with the accelerometer problem. Both are high-risk, so don't try to completely finish one task before starting the other, but don't spend too much time on either until you've got a fairly good grip on the problem's scope and know what you're getting into.
There IS an alternative, if bluetooth HID is too much... there are quite a few open source projects that involve skipping bluetooth HID, and just using it as a serial port communicating with a server running on the PC (or tethered directly via usb with ADB). AFAIK, none of them have particularly good phone-as-mouse capabilities, unless you consider using the phone as a touchpad to be "mouse".
I'm trying to build a gadget that detects pistol shots using Android. It's part of a training aid for pistol shooters that shows how the shots are distributed in time, and I use an HTC Tattoo for testing.
I use the MediaRecorder and its getMaxAmplitude method to get the highest amplitude during the last 1/100 s, but it does not work as expected: speech gives me values from getMaxAmplitude in the range from 0 to about 25000, while the pistol shots (or shouting!) only reach about 15000. With a sampling frequency of 8 kHz there should be some samples with a considerably higher level.
Does anyone know how these things work? Are there filters that are applied before the max amplitude is registered? If so, is it in hardware or software?
Thanks,
/George
It seems there's an AGC (Automatic Gain Control) filter in place. You should also be able to identify the shot by its frequency characteristics. I would expect it to show up across most of the audible spectrum, but get a spectrum analyzer (there are a few on the app market, like SpectralView) and try identifying the event by its frequency "signature" and amplitude. If you clap your hands, what do you get for max amplitude? You could also try covering the phone with something that muffles the sound, like a few layers of cloth.
It seems the AGC is in the MediaRecorder. When I use AudioRecord instead, I can detect shots using the amplitude, even though it sometimes reacts to sounds other than shots. This is not a problem, since the shooter usually doesn't make any other noise while shooting.
But I will do some FFT too to get it perfect :-)
Sounds like you figured out your AGC problem. One further suggestion: I'm not sure the FFT is the right tool for the job. You might get better detection and lower CPU use with a sliding power estimator.
e.g.
signal => square => moving average => peak detection
All of the above can be implemented very efficiently using fixed point math, which fits well with mobile android platforms.
You can find more info by searching for "Parseval's theorem" and "CIC filter" (cascaded integrator-comb).
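The signal => square => moving average => peak detection pipeline might look like this (a minimal Python sketch; the names are mine, and grouping consecutive hits into a single shot event is left out):

```python
def loud_windows(signal, window, threshold):
    """Square, moving-average, and threshold a signal; return the start
    index of every window whose mean power exceeds the threshold."""
    power = [s * s for s in signal]
    hits = []
    acc = 0.0
    for i, p in enumerate(power):
        acc += p
        if i >= window:
            acc -= power[i - window]      # slide the window forward
        if i >= window - 1 and acc / window > threshold:
            hits.append(i - window + 1)
    return hits
```

On a phone you'd run this over samples pulled from AudioRecord; the same structure works with integer arithmetic, which is the fixed-point efficiency point made above.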
Sorry for the late response; I didn't see this question until I started searching for a different problem...
I have started an application to do what I think you're attempting: an audio-based lap timer (a button to start/stop recording, with loud audio noises marking laps). It's not finished, but it might provide you with a decent base to get started.
Right now it allows you to monitor the signal volume coming from the mic and set the ambient noise level. It's also under the new BSD license, so feel free to check out the code here: http://code.google.com/p/audio-timer/. It's set up to use the 1.5 API to include as many devices as possible.
It's not finished, in that it has two main issues:
Audio capture doesn't currently work on emulated devices, because of the unsupported frequency requested
The timer functionality doesn't work yet; I was focusing on getting audio capture working first.
I'm looking into the frequency support, but Android doesn't seem to offer a way to find out which sample rates are supported other than trial and error per device.
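The per-device trial and error reduces to walking a list of common rates. A sketch (the `supported` callback is a stand-in for whatever per-device check you use, e.g. whether AudioRecord.getMinBufferSize() returns an error for that rate; the rate list is my assumption, not a guarantee):

```python
# Sample rates commonly supported by Android devices, best first.
COMMON_RATES = [44100, 22050, 16000, 11025, 8000]

def pick_rate(supported, candidates=COMMON_RATES):
    """Return the first candidate rate the device accepts, else None."""
    for rate in candidates:
        if supported(rate):
            return rate
    return None
```
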
I also have, on my local dev machine, some extra code that creates a layout for the ListView items to display "lap" information. I got sidetracked by the frequency problem, though. But since the display and audio capture are pretty much done, using the system time to fill in the timing values should be relatively straightforward, and then it shouldn't be too difficult to add the ability to export the data table to a CSV file on the SD card.
Let me know if you want to join this project, or if you have any questions.