To read sensor data on an Android platform (e.g. accelerometer, gyroscope, magnetometer, barometer, GPS), people on the internet generally describe two ways to acquire it:
The first way: reading the data using the Android SDK, via Java.
The second way: reading the data using the Android NDK.
What about communicating with the sensors directly via SPI, I2C, or UART, without using the SDK or the NDK? I understand that I would bear the burden of learning each sensor's communication protocol and reading specific registers, but I could acquire the data more efficiently. Is this possible?
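For reference, the first (SDK) route I mean looks roughly like this; it's only a minimal sketch, and the activity name is just illustrative:

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    public class AccelActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private Sensor accelerometer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // SENSOR_DELAY_GAME is roughly 50 Hz on most devices
            sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float ax = event.values[0];  // acceleration along the device x axis, m/s^2
            float ay = event.values[1];
            float az = event.values[2];
            // use the values...
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }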
In theory it is possible, Walid. If you throw enough time and money at most technical problems, solutions become possible. But I have to ask: why would anyone want to do it that way?
It would be like saying "I'm pretty sure I can drive my car, inverted. I'll operate the accelerator and brake with my hands, and I'll add a couple of extra mirrors to reflect the windshield view down to me. And I'll steer with my legs. Don't ask me how I'll operate the horn!" It's just doing it at a goofy level.
You'd surely need details of the individual chips, which means you'd need to tear your XOOM apart - that kind of implementation info is not published. Not because it's a big secret, but because it keeps costs down when manufacturers don't publish info that the vast majority of consumers will never need.
Bottom line: there are more productive uses of your energy and brainpower.
Peter
First of all, I would like to say that beacons seem to be something great and useful; I have been enthusiastic about them ever since I first saw them.
Now I would like to try them out and make an Android app, but I'm confused about some things that I couldn't find clear answers to on the internet:
Are beacons available already?
How much does a beacon cost?
Does it need to be charged?
How much time can a beacon work without charging?
Do I need to set up every device to interact with it?
Can I use a beacon in a Unity app?
Are there any tutorials about using them?
I know the post is a little long, but I would be very glad to find answers here. Have a nice day! ;)
Are beacons available already?
Yes, check around the internet: Radius Networks, Kontakt.io, Estimote, etc. You could also hit Alibaba, or buy a Raspberry Pi and a Bluetooth dongle.
How much does a beacon cost?
Cheap. Small beacons can be as little as $1 each, but those have drawbacks like non-replaceable batteries and short range. Bluetooth 5 beacons can be more expensive, but they can cover a 1 km radius (personally I think that's pointless). Typically expect around $30 for the primary model from a top-tier beacon company. Buying in bulk is cheaper, but you might want to experiment with a few different kinds before you commit. Our company bought around 300 at one stage, and we may now need to replace them with a different manufacturer's.
Does it need to be charged?
Some have replaceable batteries, some can only be plugged in, and some are simply disposable; you need to keep track of that yourself.
How much time can a beacon work without charging?
It can't; you may be thinking of NFC here. A Bluetooth radio/antenna requires more power than you might think (though probably less than both of us think, to be honest), and it needs a dedicated power source both to transmit and to receive data.
Do I need to set up every device to interact with it?
No, you make an app that listens for them. Well, there are actually a lot of options, though not all of them give you straightforward detection/processing. Eddystone promotes the notion of "the Physical Web", which uses URLs sent by Eddystone beacons to show you the right content, and iPhones actually have more built-in support for some (mainly retail) use cases. Android is great because you can do so much in the background, and foreground services give you a lot more say about how and when you are stopped. You should also be aware that Android 4/5/6/7 all have different caveats around scanning/receiving, but most of the differences will (or should) be absorbed by any SDK you might use.
Can I use a beacon in a Unity app?
Certainly, just find a use case (AR/VR and a drone carrying a beacon as a dragon? :O).
Are there any tutorials about using them?
So many; google around, but I would recommend starting with Radius Networks' Android Beacon Library. (It uses AltBeacon by default, but is VERY easily changed to work with iBeacon and even Eddystone; it's also free, and these guys know their stuff.) You should also download several beacon apps, since behaviour is not consistent across devices, and different apps have different features you might want for debugging. Try Locate Beacon (by Radius), nRF Toolbox, and basically any other BLE app with a decent rating; it can be really helpful to cross-reference their readings when funny stuff starts happening.
A lot of people talk about beacons and managing them as if it's more complex than it is. You have an object that just shouts an ID every X milliseconds; you hear that and do something with it, once, every X seconds, or whatever you want.
I would also say you should get very familiar with the difference between BLE / Bluetooth Smart advertising and connection-oriented communication through a GATT server. With beacons you're essentially just listening to a peripheral device that advertises in a set format. As the developer, it is up to you to take this and make it meaningful for your user.
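To make the "hear an ID and do something with it" part concrete, here is a rough sketch using the Android Beacon Library mentioned above. This is the 2.x-style API; newer versions of the library replace bind()/unbind() with a simpler one, and the region name and log tag are just placeholders:

    import android.app.Activity;
    import android.os.Bundle;
    import android.os.RemoteException;
    import android.util.Log;

    import org.altbeacon.beacon.Beacon;
    import org.altbeacon.beacon.BeaconConsumer;
    import org.altbeacon.beacon.BeaconManager;
    import org.altbeacon.beacon.RangeNotifier;
    import org.altbeacon.beacon.Region;

    import java.util.Collection;

    public class BeaconActivity extends Activity implements BeaconConsumer {
        private BeaconManager beaconManager;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            beaconManager = BeaconManager.getInstanceForApplication(this);
            // The AltBeacon layout is parsed by default; add a BeaconParser
            // with an iBeacon/Eddystone layout here if you need one.
            beaconManager.bind(this);
        }

        @Override
        public void onBeaconServiceConnect() {
            beaconManager.addRangeNotifier(new RangeNotifier() {
                @Override
                public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
                    for (Beacon b : beacons) {
                        // "Shouts an ID every X ms" -> this is where you react to it
                        Log.d("Beacons", b.getId1() + " ~" + b.getDistance() + " m away");
                    }
                }
            });
            try {
                // null identifiers = match every beacon the configured parsers understand
                beaconManager.startRangingBeaconsInRegion(new Region("all-beacons", null, null, null));
            } catch (RemoteException e) {
                e.printStackTrace();
            }
        }

        @Override
        protected void onDestroy() {
            super.onDestroy();
            beaconManager.unbind(this);
        }
    }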
I want to control the aperture, shutter speed, and ISO on my Android phone. Is there a way I can access these hardware features?
I won't say it's impossible to do this, but it IS effectively impossible to do it in a way that's generalizable to all -- or even many -- Android phones. If you stray from the official path defined by the Android API, you're pretty much on your own, and this is basically an embedded hardware development project.
Let's start with the basics: you need a schematic of the camera subsystem and datasheets for everything in the image pipeline. For every phone you intend to support. In some cases, you might find a few phones with more or less identical camera subsystems (particularly when you're talking about slightly-different carrier-specific models sold in the US), and occasionally you might get lucky enough to have a lot of similarity between the phone you care about and a Nexus phone.
This is no small feat. As far as I know, not even Nexus phones have official schematics released. Popular phones (especially Samsung and HTC) usually get teardowns published, so everyone knows the broad details (camera module, video-encoding chipset, etc.), but there's still a lot of guesswork involved in figuring out how it's all wired together.
Make no mistake -- this isn't casual hacking territory. If terms like I2C, SPI, MMC, and iDCT mean nothing to you, you aren't likely to get very far. If you don't literally understand how CMOS image sensors are read out serially, and how Bayer arrays are used to produce RGB images, you're almost certainly in over your head.
That doesn't mean you should throw in the towel and give up... but it DOES mean that trying to hack the camera on a commercial Android phone probably isn't the best place to start. There's a lot of background knowledge you're going to need in order to pull off a project like this, and you really need to acquire that knowledge from a hardware platform that YOU control & have proper documentation for. Make no mistake... on the hierarchy of "hard" Android software projects, this ranks pretty close to the top of the list.
My suggestion (simplified and condensed a bit): buy a Raspberry Pi, and learn how to light up an LED from a GPIO pin. Then learn how to selectively light up 8 LEDs through a 74HC595 shift register. Then buy an SPI-addressed flash chip on a breakout board, and learn how to write to it. At some point, buy a video image sensor with a "serial" (fyi, "serial" != "RS-232") interface from somebody like Sparkfun.com and learn how to read it one frame at a time, dumping the raw RGB data to flash. Learn how to use I2C to read and write the camera's control registers. At this point, you MIGHT be ready to tackle the camera in an Android phone for single photos.
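For that very first step (lighting an LED from a GPIO pin), a minimal sketch in Java using the legacy sysfs interface might look like this. The pin number (BCM 17) is an assumption, it needs root or gpio-group permissions, and newer kernels push you toward libgpiod instead:

    import java.io.FileWriter;
    import java.io.IOException;

    // Blinks an LED wired (through a resistor) to BCM pin 17 on a Raspberry Pi,
    // using the legacy sysfs GPIO interface.
    public class Blink {
        private static void write(String path, String value) throws IOException {
            try (FileWriter w = new FileWriter(path)) {
                w.write(value);
            }
        }

        public static void main(String[] args) throws Exception {
            write("/sys/class/gpio/export", "17");            // expose gpio17 to userspace
            Thread.sleep(100);                                 // give the kernel/udev a moment
            write("/sys/class/gpio/gpio17/direction", "out");
            for (int i = 0; i < 10; i++) {
                write("/sys/class/gpio/gpio17/value", "1");    // LED on
                Thread.sleep(500);
                write("/sys/class/gpio/gpio17/value", "0");    // LED off
                Thread.sleep(500);
            }
            write("/sys/class/gpio/unexport", "17");
        }
    }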
If you're determined to start with an Android phone, at least stick to "Nexus" devices for now, and don't buy the phone (if you don't already own it) until you have the schematics, datasheets, and source code in your possession. Don't buy the phone thinking you'll be able to trace the schematic yourself. You won't. At least, not unless you're a grad student with one hell of a graduate-level electronics lab (with X-ray capabilities) at your disposal. Most of these chips and modules are micro-BGA. You aren't going to trace them with a multimeter, and every Android camera I'm aware of has most of its low-level driver logic hidden in loadable kernel modules whose source isn't available.
That said, I'd dearly love to see somebody pull a project like this off. :-)
Android has published online training that contains all the information you need:
You can find it here - Media APIs
However, there are limitations: not all hardware supports all kinds of parameters.
And if I recall correctly, you can't control the shutter speed and ISO.
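For what is exposed through the classic android.hardware.Camera API, the knobs look roughly like the sketch below. Exposure compensation is standardized, while true shutter speed and ISO are not (some vendors expose them only as undocumented string keys; the newer Camera2 API adds manual controls, but only on devices that declare the manual-sensor capability):

    import android.hardware.Camera;

    // Sketch using the older android.hardware.Camera API (deprecated on recent
    // Android versions). Only standardized parameters are shown.
    @SuppressWarnings("deprecation")
    public class ExposureDemo {
        public static void tweakExposure() {
            Camera camera = Camera.open();
            Camera.Parameters params = camera.getParameters();

            int min = params.getMinExposureCompensation();   // e.g. -12 steps
            int max = params.getMaxExposureCompensation();   // e.g. +12 steps
            if (min != 0 || max != 0) {
                params.setExposureCompensation(max);          // brightest the device allows
            }

            // Vendor-specific keys (sometimes "iso-values", "iso", ...) may show up
            // in the flattened parameter string, but nothing here is guaranteed.
            String allParams = params.flatten();

            camera.setParameters(params);
            camera.release();
        }
    }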
I am developing a project that is intended to use the GPS capabilities of an Android phone and a nearby station to compute positioning to a much more precise degree (cm), using RTK DGPS technology.
So far, I haven't been able to find anyone saying they actually managed to perform a similar task (apart from #GPSmaster, who doesn't explain how), and the API doesn't seem to offer any information from the GPS chip other than location and NMEA message updates. I need, if possible, pseudoranges and carrier phases.
I was wondering if:
It would be possible to find lower-level hooks on my phone using native code, or other lower-level snooping;
It would be possible to send RTCM corrections to the GPS chip present on one of these devices;
Any ideas?
Generally speaking, DGPS is a technique that improves real position accuracy by canceling out most of the atmospheric effects on the GPS signal. In a typical direct GPS measurement there is some random error in the ranges computed to the satellites due to atmospheric effects. This is why a GPS receiver that is left collecting data in a fixed location will seem to wander within an error ellipse. For two receiver stations in the same area, the atmospheric effects are almost identical, and they will wander in parallel within their similarly sized and oriented error ellipses. If one of the two receivers is at a known location, then the difference in their apparent GPS locations can be taken and plotted from the true location of the known station to find the true location of the unknown station.
Back in the day (circa 1992), when we had to accomplish DGPS by "post-processing", we used to take the raw NMEA data collected at the two stations, match up the times, compute the baseline vector, and apply it to the known point to find the unknown point. I think the NMEA data we were using was only recorded to the nearest 10 seconds. The math isn't really that hard.
I suspect that NMEA GPS messages [http://developer.android.com/reference/android/location/GpsStatus.NmeaListener.html ] from a tablet at a known point (with a clear sky view) could probably be sent over an internet socket to a smartphone (also with a clear sky view), which could then compute the difference and achieve a sub-meter relative location over a distance of a few km, even if the Internet transit times were ignored. This technique would probably still work even if the tablet and smartphone were both applying broadcast DGPS adjustments.
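A rough sketch of the sending half of that idea is below (the rover side would read the socket and difference the fixes). The class name is a placeholder, network I/O would need its own thread, and ACCESS_FINE_LOCATION is required:

    import android.content.Context;
    import android.location.GpsStatus;
    import android.location.LocationManager;

    import java.io.IOException;
    import java.io.OutputStream;

    // Forwards raw NMEA sentences from the reference device to an already-open
    // OutputStream (e.g. from a TCP socket). Error handling and threading omitted.
    public class NmeaForwarder {
        public static void start(Context context, final OutputStream out) {
            LocationManager lm =
                    (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);

            lm.addNmeaListener(new GpsStatus.NmeaListener() {
                @Override
                public void onNmeaReceived(long timestamp, String nmea) {
                    try {
                        out.write(nmea.getBytes());   // e.g. "$GPGGA,..." sentences
                    } catch (IOException ignored) { }
                }
            });
            // NMEA callbacks only arrive while GPS updates are running, e.g.:
            // lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, listener);
        }
    }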
With the advent of Android 7.1, the raw data from GPS chips will be available to developers. (http://gpsworld.com/google-to-provide-raw-gnss-measurements/)
Others seem to have done something similar to what you wish to accomplish (http://gpsworld.com/innovation-precise-positioning-using-raw-gps-measurements-from-android-smartphones/)
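Once those raw measurements are available, reading them looks roughly like this (a sketch only; it requires API 24+, ACCESS_FINE_LOCATION, and a chipset/driver that actually reports measurements):

    import android.content.Context;
    import android.location.GnssMeasurement;
    import android.location.GnssMeasurementsEvent;
    import android.location.LocationManager;
    import android.util.Log;

    // Logs per-satellite raw observables. Pseudoranges are reconstructed from the
    // received SV time and the receiver clock; the accumulated delta range (ADR)
    // is the carrier-phase observable, where the hardware supports it.
    public class RawGnssLogger {
        public static void register(Context context) {
            LocationManager lm =
                    (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);

            lm.registerGnssMeasurementsCallback(new GnssMeasurementsEvent.Callback() {
                @Override
                public void onGnssMeasurementsReceived(GnssMeasurementsEvent event) {
                    for (GnssMeasurement m : event.getMeasurements()) {
                        Log.d("GNSS", "svid=" + m.getSvid()
                                + " svTimeNs=" + m.getReceivedSvTimeNanos()
                                + " adrMeters=" + m.getAccumulatedDeltaRangeMeters());
                    }
                }
            });
        }
    }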
No, it is not practical to get any lower-level access to the GPS device from an Android application. There are several reasons for this:
The application has no means of accessing the GPS device other than through the Java-based API. Native code is forbidden from using most devices directly and usually needs a Java wrapper to tunnel through the sandbox to reach Android sensor devices. This is a core part of the security concept.
If native code did have lower-level access to the GPS device, it would have to cope with the protocols of several different manufacturers, which are currently abstracted away by the API. The best you could hope for is access to custom NMEA sentences, which may still have device-dependent caveats.
Even if lower-level access were possible, you would lose the integrated merging of other location sources such as WLAN and the cellphone carrier, which is presumably done in native code below the Java API but above the NMEA protocol.
You can use DGPS corrections in Europe via a custom application for SISnet, receiving correction signals from EGNOS augmentation satellites (http://egnos-portal.gsa.europa.eu/news/egnos-gets-invite-your-smartphone-11). It does, however, require a SISnet subscription (which isn't really open to the public yet) to obtain a username and password for connecting to their servers. There is an SDK published which you may find useful. Just remember that you are limited to C/A signals only (pseudoranges); you CANNOT get carrier-phase data (L1/L2) from the cheap chips inside smartphones. You'd need a precision GNSS receiver such as the Trimble BD910 (http://www.trimble.com/gnss-inertial/bd910.aspx?dtID=overview) to access the L1 carrier-phase signal for GPS & GLONASS. There are, however, cheaper chips that support SBAS, but none are yet installed natively in phones.
Umm. Your Android phone probably has such a crap GPS antenna that achieving cm accuracy is impossible, unless maybe you average the position for days. DGPS support is usually not documented, and not many chipsets support it. The last time I saw DGPS implemented, it involved hacking the actual GPS chip firmware to add support. Even getting A-GPS to work on a random chipset is iffy, since it might not support a documented way of feeding in the assistance data.
This is more a matter of the hardware implementation than the software implementation.
In reality, GPS is usually combined with Wi-Fi or 3G to assist in determining the current position.
RTCM corrections can be sent to your Android phone from an NTRIP provider (caster). You then need to apply them to the raw GPS data on your Android device yourself.
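The fetching side of that is just a small TCP client speaking the NTRIP v1 handshake; a sketch is below, with the caster host, mountpoint, and credentials all hypothetical. Note that stock Android gives you no documented way to push the resulting RTCM frames into the GPS chipset itself, so you would have to consume them in your own positioning code:

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.util.Base64;

    // Minimal NTRIP v1 client: connect to a caster, request a mountpoint,
    // then read the raw RTCM byte stream that follows the "ICY 200 OK" header.
    public class NtripClient {
        public static void main(String[] args) throws Exception {
            String auth = Base64.getEncoder()
                    .encodeToString("user:password".getBytes());

            try (Socket socket = new Socket("ntrip.example.com", 2101)) {
                OutputStream out = socket.getOutputStream();
                out.write(("GET /MOUNTPOINT HTTP/1.0\r\n"
                        + "User-Agent: NTRIP ExampleClient/1.0\r\n"
                        + "Authorization: Basic " + auth + "\r\n\r\n").getBytes());
                out.flush();

                InputStream in = socket.getInputStream();
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) > 0) {
                    // handleRtcm(buf, n);  // placeholder: hand the frames to your solver
                }
            }
        }
    }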
I would like to log sensor data (accelerometer, gyro, magnetometer, GPS, etc.) and record a time history which can be accessed for future plotting and analysis (as well as some real-time calculations). I am a MATLAB programmer, so .csv files came to mind, but I've also seen a little about MySQL, which I'm not too familiar with.
My question is, which datalogging method is most appropriate? I'm not limited to the two mentioned above, they're simply what I've seen so far. Any suggestions/ example codes?
PS: I've also run across MicroLog4Android (http://code.google.com/p/microlog4android/), but I haven't come across any examples, and being new to Android, I cannot tell if it's meant for logging higher-speed data (10-20 Hz) or if it's simply a program error log. It has some cool features (SMS sending, network storage, etc.) which make it attractive if it is indeed an appropriate tool.
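To be concrete, the CSV approach I'm imagining is roughly the sketch below (the file name, sensor choice, and rate are arbitrary; I don't know yet whether this is the idiomatic Android way):

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;

    // Appends one CSV row per accelerometer event; 10-20 Hz is well within what
    // a plain file append can handle.
    public class CsvSensorLogger implements SensorEventListener {
        private final FileWriter writer;

        public CsvSensorLogger(Context context) throws IOException {
            File file = new File(context.getExternalFilesDir(null), "accel_log.csv");
            writer = new FileWriter(file, true);
            writer.write("timestamp_ns,x,y,z\n");

            SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
            sm.registerListener(this,
                    sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            try {
                writer.write(event.timestamp + "," + event.values[0] + ","
                        + event.values[1] + "," + event.values[2] + "\n");
            } catch (IOException ignored) { }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        public void close() throws IOException {
            writer.close();
        }
    }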
I started doing some thinking about creating an Android application that can be used within a corporate building to determine what room you are in. Obviously, I'm thinking GPS and network locations wouldn't be accurate enough to accomplish this (not to mention the instability of the GPS signal indoors). I looked briefly into calculating distance via the accelerometer, but it is apparently too noisy and leaves a large margin for error. I've also considered some sort of triangulation from routers, but you run into issues with walls, microwaves, and various other things that can disturb the signal strength. Does anyone have any ideas or directions to try?
How about Bluetooth tags that emit a room ID? You could make money selling the emitters on a per-facility basis, and they could provide additional functionality such as:
(1) forming a piconet that ultimately updates an onsite server with the locations of everyone using the app, or
(2) tracking key equipment with a similar Bluetooth tag, etc.
You could get something similar to the link below in quantity from China, I'm sure:
http://www.engadget.com/2009/03/26/nio-bluetooth-security-tag-keeps-tabs-on-your-belongings/
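On the phone side, picking up such tags could be as simple as a BLE scan plus comparing RSSI per tag. A rough sketch follows (names are placeholders; it assumes API 21+ and the Bluetooth/location permissions, and how the room ID is encoded in the advertisement is up to you):

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.le.BluetoothLeScanner;
    import android.bluetooth.le.ScanCallback;
    import android.bluetooth.le.ScanResult;
    import android.util.Log;

    // Listens for BLE advertisements and logs the advertiser name and RSSI;
    // the strongest (least negative) RSSI is a crude "which room am I in" signal.
    public class RoomTagScanner {
        public static void start() {
            BluetoothLeScanner scanner =
                    BluetoothAdapter.getDefaultAdapter().getBluetoothLeScanner();

            scanner.startScan(new ScanCallback() {
                @Override
                public void onScanResult(int callbackType, ScanResult result) {
                    // A tag could carry its room id in the device name or in
                    // manufacturer-specific advertisement data.
                    String name = result.getDevice().getName();
                    Log.d("RoomTag", name + " rssi=" + result.getRssi());
                }
            });
        }
    }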
Perhaps it's not too much help, but these are the papers I have found on this subject:
RSSI-Based Indoor Localization and Tracking Using Sigma-Point Kalman Smoothers
Pedestrian Tracking with Shoe-Mounted Inertial Sensors
Enhancing the Performance of Pedometers Using a Single Accelerometer
I have no idea how these methods would perform in real-life applications or how to turn them into a nice Android app.
I am curious what other answers you will get.