Android, Logging Sensor Data to a File

I would like to log sensor data (Accelerometer, Gyro, Magnetometer, GPS, etc.) and record a time history which can be accessed for future plotting and analysis (as well as some real-time calculations). I am a MATLAB programmer, so .csv files came to mind, but I've also seen a little about MySQL, which I'm not too familiar with.
My question is: which datalogging method is most appropriate? I'm not limited to the two mentioned above; they're simply what I've seen so far. Any suggestions or example code?
PS, I've also run across MicroLog4Android (http://code.google.com/p/microlog4android/), but I haven't come across any examples, and being new to Android, I can't tell whether it's meant for logging higher-speed data (10-20 Hz) or whether it's simply a program error log. It has some cool features (SMS sending, network storage, etc.) which make it attractive if it is indeed an appropriate tool.
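For what it's worth, at 10-20 Hz a plain .csv file is usually the simplest fit. A minimal sketch of the standard SDK pattern, a SensorEventListener appending one row per sample (the class name and file name here are illustrative, not from any library):

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    import java.io.FileWriter;
    import java.io.IOException;

    public class AccelLoggerActivity extends Activity implements SensorEventListener {

        private SensorManager sensorManager;
        private FileWriter writer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            try {
                // One row per sample: timestamp (ns since boot) plus x, y, z.
                writer = new FileWriter(getExternalFilesDir(null) + "/accel_log.csv");
                writer.write("timestamp_ns,x,y,z\n");
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
            // SENSOR_DELAY_GAME is roughly 50 Hz, comfortably above 10-20 Hz.
            sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            try {
                writer.write(event.timestamp + "," + event.values[0] + ","
                        + event.values[1] + "," + event.values[2] + "\n");
            } catch (IOException e) {
                // A real logger would buffer writes and surface errors properly.
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);
            try { writer.close(); } catch (IOException ignored) { }
        }
    }

The resulting file can then be pulled off the device and read straight into MATLAB (e.g. with csvread) for plotting and analysis.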

Related

Get Accelerometer, Gyro and Magnetometer at the same time on Android

I'm working on sensor fusion with the accelerometer, gyroscope and magnetic field sensors on Android. Thanks to SensorManager, I can be notified of each new value from these sensors.
In practice, at least on my Nexus 5 (I'm not sure about other Android devices), acceleration, rotation rate and magnetic field are sampled at the same time; this can be verified using event.timestamp.
On other systems (iOS, xSens, ...), the sensor SDK provides a single notification containing all three vectors at once.
Of course, when I receive acceleration(t), I can write a few lines of code with arrays to wait for rotationRate(t) and magneticField(t). But if there is a way to access these three vectors together directly, it would be very interesting to know!
Another question related to sensor data:
Is there any guidance from the Android team to device manufacturers to provide data in chronological order?
Thank you,
Thibaud
Short answer: no, Android doesn't provide a way to get all the sensor readings together as it reads them.
Furthermore, the behavior you've observed with SensorManager, namely that readings from different sensors happen to share a timestamp, suggesting they were read together, should not be relied upon. No documentation guarantees this behavior (it is likely a quirk of your test setup and update configuration), so relying on it could come back to bite you in a future update, and trying to take advantage of it is likely much harder to get right, and slower, than the approach I outline below.
Generally, unless all results are generated by the same sensor, it is impossible to get them all "at the same time". Furthermore, just about all the sensors are noisy, so you'd already need to do some smoothing if you read them as fast as possible.
So, what you could do is sample them pretty quickly, then at specific intervals, report the latest sample from all sensors (or some smoothed value that accounts for the delta between sample time and report time). This is a trivial amount of extra code, especially if you're already smoothing noisy sensor data.
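A minimal sketch of that pattern, assuming the three sensors from the question (the class name and the onFusedSample stub are illustrative; smoothing is left out):

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Handler;

    // Caches the most recent reading from each sensor and reports all three
    // together at a fixed interval, regardless of when each one arrived.
    public class FusionSampler implements SensorEventListener {

        private static final long REPORT_INTERVAL_MS = 50; // 20 Hz report rate

        private final float[] accel = new float[3];
        private final float[] gyro = new float[3];
        private final float[] magnet = new float[3];
        private final Handler handler = new Handler(); // create on the main thread

        public void start(SensorManager sm) {
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_FASTEST);
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                    SensorManager.SENSOR_DELAY_FASTEST);
            sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                    SensorManager.SENSOR_DELAY_FASTEST);
            handler.postDelayed(reporter, REPORT_INTERVAL_MS);
        }

        private final Runnable reporter = new Runnable() {
            @Override
            public void run() {
                // The three latest samples, delivered "together", once per interval.
                onFusedSample(accel.clone(), gyro.clone(), magnet.clone());
                handler.postDelayed(this, REPORT_INTERVAL_MS);
            }
        };

        @Override
        public void onSensorChanged(SensorEvent event) {
            switch (event.sensor.getType()) {
                case Sensor.TYPE_ACCELEROMETER:
                    System.arraycopy(event.values, 0, accel, 0, 3); break;
                case Sensor.TYPE_GYROSCOPE:
                    System.arraycopy(event.values, 0, gyro, 0, 3); break;
                case Sensor.TYPE_MAGNETIC_FIELD:
                    System.arraycopy(event.values, 0, magnet, 0, 3); break;
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        // Stub: replace with the smoothing/fusion of your choice.
        protected void onFusedSample(float[] a, float[] g, float[] m) { }
    }

The reporter fires at a fixed 20 Hz regardless of when each sensor last delivered, which is exactly the decoupling described above.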
There is a workaround for this particular problem. When multiple listeners are registered in the same activity, the timestamps for those events may be misleading. But you can add multiple fragments to that activity, each with its own context, and listen to each sensor from a separate fragment. With this approach, the sensor reading timestamps become reliable.
Or listen in parallel threads if you know about concurrency...

Real-time audio denoising using FFT on Android

I'm thinking of starting an Android project that records audio signals and does some processing to denoise them. My question is: since many (nearly all) denoising algorithms involve an FFT, is it possible for me to make this a real-time program? By real-time I mean the program records and processes at the same time, so that I don't have to wait when I finish recording.
I have made a sample project that applies a Fourier transform to the audio signal and implements a simple algorithm called sub-spectrum. But I found it difficult to run this algorithm in real time: after I press the 'stop' button, it takes a while to do the processing and save the file (I'm also wondering how commercial recorder apps record sound and save it at the same time). I know my FFT may not be the fastest, but I'd like to know whether I could achieve real-time performance if I fully optimized it or used the fastest FFT code available. Thanks a lot!
It sounds like you are talking about broadband denoising, so I'll address my answer to that. There are other kinds of denoising, from simple filtering to adaptive filtering to dynamic range expansion, and probably others.
I don't think anyone can answer this question with a simple yes or no. You will have to try it and see what can be done.
First off, there are a variety of FFT implementations of varying speed you could try, including FFTW. Some are faster than others, but at the end of the day they will all deliver comparable results.
This is one place where native C/C++ will outperform Java/Dalvik code, because it can truly take advantage of vector instructions. For that to work, you'll probably need to write some assembler, or find code that is already Android-optimized. I'm not aware of an Android-optimized FFT, but I'm sure one exists.
The real performance win will come from how you structure your overall denoising algorithm. All the denoising I'm familiar with is extremely processor-intensive and probably won't work on a phone in real time, although it might on a tablet. That's just a(n educated) guess, though.
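On the "record and process at the same time" part of the question: the usual structure is to read fixed-size buffers from AudioRecord on a worker thread and transform each block as it arrives, instead of processing the whole recording after 'stop'. A rough sketch, with the FFT/denoising step left as a placeholder (the class name and processBlock are illustrative):

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    // Requires the RECORD_AUDIO permission. Run on a worker thread.
    public class StreamingProcessor implements Runnable {

        private static final int SAMPLE_RATE = 44100;
        private static final int BLOCK_SIZE = 4096; // samples per FFT block
        private volatile boolean running = true;

        @Override
        public void run() {
            int minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    Math.max(minBuf, BLOCK_SIZE * 2)); // buffer size is in bytes

            short[] block = new short[BLOCK_SIZE];
            recorder.startRecording();
            while (running) {
                int read = recorder.read(block, 0, BLOCK_SIZE); // blocks until full
                if (read > 0) {
                    // Process this block (FFT, spectral modification, inverse FFT)
                    // while the hardware keeps capturing the next one. "Real time"
                    // just means this call finishes in under
                    // BLOCK_SIZE / SAMPLE_RATE seconds (~93 ms here).
                    processBlock(block, read);
                }
            }
            recorder.stop();
            recorder.release();
        }

        public void stop() { running = false; }

        // Placeholder: swap in the FFT-based denoising here.
        private void processBlock(short[] samples, int count) { }
    }

Whether a full denoiser fits inside that ~93 ms per-block budget is exactly the open question above.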

Controlling camera hardware in Android phone

I want to control the aperture, shutter speed and ISO on my android phone. Is there a way in which I can access the hardware features?
I won't say it's impossible to do this, but it IS effectively impossible to do it in a way that's generalizable to all -- or even many -- Android phones. If you stray from the official path defined by the Android API, you're pretty much on your own, and this is basically an embedded hardware development project.
Let's start with the basics: you need a schematic of the camera subsystem and datasheets for everything in the image pipeline. For every phone you intend to support. In some cases, you might find a few phones with more or less identical camera subsystems (particularly when you're talking about slightly-different carrier-specific models sold in the US), and occasionally you might get lucky enough to have a lot of similarity between the phone you care about and a Nexus phone.
This is no small feat. As far as I know, not even Nexus phones have official schematics released. Popular phones (especially Samsung and HTC) usually get teardowns published, so everyone knows the broad details (camera module, video-encoding chipset, etc.), but there's still a lot of guesswork involved in figuring out how it's all wired together.
Make no mistake -- this isn't casual hacking territory. If terms like I2C, SPI, MMC, and iDCT mean nothing to you, you aren't likely to get very far. If you don't literally understand how CMOS image sensors are read serially, and how Bayer arrays are used to produce RGB images, you're almost certainly in over your head.
That doesn't mean you should throw in the towel and give up... but it DOES mean that trying to hack the camera on a commercial Android phone probably isn't the best place to start. There's a lot of background knowledge you're going to need in order to pull off a project like this, and you really need to acquire that knowledge from a hardware platform that YOU control & have proper documentation for. Make no mistake... on the hierarchy of "hard" Android software projects, this ranks pretty close to the top of the list.
My suggestion (simplified and condensed a bit): buy a Raspberry Pi, and learn how to light up an LED from a GPIO pin. Then learn how to selectively light up 8 LEDs through a 74HC595 shift register. Then buy an SPI-addressed flash chip on a breakout board, and learn how to write to it. At some point, buy a video image sensor with a "serial" (fyi, "serial" != "rs232") interface from somebody like Sparkfun.com and learn how to read it one frame at a time, dumping the raw RGB data to flash. Learn how to use I2C to read and write the camera's control registers. At this point, you MIGHT be ready to tackle the camera in an Android phone for single photos.
If you're determined to start with an Android phone, at least stick to Nexus devices for now, and don't buy the phone (if you don't already own it) until you have the schematics, datasheets, and source code in your possession. Don't buy the phone thinking you'll be able to trace the schematic yourself. You won't. At least, not unless you're a grad student with one hell of a graduate-level electronics lab (with X-ray capabilities) at your disposal. Most of these chips and modules are micro-BGA. You aren't going to trace them with a multimeter, and every Android camera I'm aware of has most of its low-level driver logic hidden in loadable kernel modules whose source isn't available.
That said, I'd dearly love to see somebody pull a project like this off. :-)
Android has published online training which contains all the information you need:
You can find it here: Media APIs
However, there are limitations; not all hardware supports all kinds of parameters.
And if I recall correctly, you can't control the shutter speed and ISO.
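For concreteness, a sketch of roughly what the legacy android.hardware.Camera API does let you adjust (the class and method names here are illustrative, and support varies by device):

    import android.hardware.Camera;

    public final class LegacyCameraControls {

        // The legacy Camera API (pre-API 21) exposes exposure compensation and
        // white-balance modes, but no aperture, shutter speed, or ISO controls.
        public static void applyBrightestExposure(Camera camera) {
            Camera.Parameters params = camera.getParameters();
            int min = params.getMinExposureCompensation();
            int max = params.getMaxExposureCompensation();
            if (min != 0 || max != 0) {              // (0, 0) means unsupported
                params.setExposureCompensation(max); // push exposure as high as allowed
            }
            params.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_DAYLIGHT);
            camera.setParameters(params);
        }
    }

The newer camera2 API (API 21+) does add manual SENSOR_EXPOSURE_TIME and SENSOR_SENSITIVITY controls, but only on devices that declare the MANUAL_SENSOR capability.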

Acquiring Sensor Data on Android Platforms

To read sensor data on an Android platform (i.e. accelerometer, gyroscope, magnetometer, barometer, GPS), people on the internet talk about two ways to acquire such data:
The primary way: reading the data using the Android SDK, via Java.
The second way: reading the data using the Android NDK.
What about communicating with the sensors directly via SPI, I2C, or UART, without using the SDK or the NDK? I understand that I'd be burdened with understanding each sensor's communication protocol and reading specific registers, from which I could acquire the data more efficiently. Is this possible?
In theory it is possible, Walid. If you throw enough time and money at most technical problems, solutions become possible. But I would have to ask why anyone would want to do it that way?
It would be like saying "I'm pretty sure I can drive my car, inverted. I'll operate the accelerator and brake with my hands, and I'll add a couple of extra mirrors to reflect the windshield view down to me. And I'll steer with my legs. Don't ask me how I'll operate the horn!" It's just doing it at a goofy level.
You'd surely need details of the individual chips, which means you'd need to tear your XOOM apart - that kind of implementation info is not published. Not because it's a big secret, but because it keeps costs down when manufacturers don't publish info that virtually no consumer needs.
Bottom line: there are more productive uses of your energy and brainpower.
Peter

Differential GPS (dGPS) in Android

My goal is to use multiple Android devices to achieve a more accurate GPS position, but from the research I've done around the net, it looks daunting. In order to do the differential calculations, I'd need to be able to access each phone's raw GPS data, specifically the pseudorange of each satellite, in real time. Unfortunately, the Android API doesn't have any hooks for this data, only the processed resulting data. I did see, on another forum, that GPSmaster, who hangs out around here, claims to have pulled it off.
So two questions, really. First, since I'm a total noob here, is there a way to send users private messages, i.e., can I get GPSmaster's attention somehow? Second, does anyone know of a way to get access to the raw GPS output without going way below the API layer and having to recompile the OS?
