I use the Google Location API to get the user's location every 5 seconds and it works fine. Then I need to know the address of that location. I know I can use Geocoder.getFromLocation, but as the documentation says, it is not always available, and it needs a network connection to work. Is there another reliable way to get the address offline?
That is practically impossible, because you need to have a lot of information about which address is tied to (approximately) which lat/long value.
However, one solution is to include such a database in your application. It seems MaxMind open-sourced theirs quite some time ago, but the best you can get there is cities, which is probably not what you are looking for.
So unless you are able to find a database with such information, this is not possible without an internet connection.
Is there another reliable way to get the address offline?
With Geocoder you are definitely not able to get this to work without its backend service, which in this case means a network connection.
If you want offline access to maps, they have to be stored on your device in order to be read. So you really have only one choice: download map data, process it (optimize its size for mobile devices) and create some structure in which you store the data for fast searching (most likely a quad tree or R-tree)1. It's not trivial at all. Or, assuming one exists, use an offline maps solution, but the data must still be stored on your device.
1 I have been working on a personal project focused on offline address searching, using OpenStreetMap as the data source, and I can tell you it's not easy. Since the raw data is stored in huge XML files (1-600 GB), I had to create a desktop conversion tool that processes the data and saves it in a quad tree that is serialised into a database (the tree is huge and cannot be held in memory, so I need to cache it and use semi-persistence). In other words, the tool prepares the data for use on the device. The final files are approximately 100 MB - 1 GB (depending on source size). I could tell you more, but this is not the right place.
I have done some performance tests and, just for illustration, compared the results with the official Geocoder API:
The tool is not finished, but it works as you can see above. I think the performance is not bad, and it can still be optimized (better statements and tree algorithms).
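To make the spatial-index idea more concrete, here is a heavily simplified Java sketch that buckets points into a coarse lat/long grid instead of a real quad tree or R-tree; all class and field names are made up for the example, and a production index would need far more (street interpolation, persistence, and so on):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simplified stand-in for a spatial index: entries are bucketed into
// ~0.01 degree grid cells so a lookup only scans nearby entries instead
// of the whole data set. A real quad tree or R-tree replaces the HashMap
// with a hierarchical structure, but the lookup idea is the same.
class OfflineAddressIndex {

    static class Entry {
        final double lat, lon;
        final String address;
        Entry(double lat, double lon, String address) {
            this.lat = lat; this.lon = lon; this.address = address;
        }
    }

    private static final double CELL = 0.01; // roughly 1 km per cell
    private final Map<Long, List<Entry>> grid = new HashMap<>();

    private static long key(long row, long col) {
        return row * 1_000_000L + col; // col magnitude stays well below 1,000,000
    }

    void add(double lat, double lon, String address) {
        long row = (long) Math.floor(lat / CELL);
        long col = (long) Math.floor(lon / CELL);
        grid.computeIfAbsent(key(row, col), k -> new ArrayList<>())
            .add(new Entry(lat, lon, address));
    }

    // Scan the cell containing the point plus its 8 neighbours and
    // return the closest stored address, or null if nothing is nearby.
    String nearest(double lat, double lon) {
        long row = (long) Math.floor(lat / CELL);
        long col = (long) Math.floor(lon / CELL);
        Entry best = null;
        double bestDist = Double.MAX_VALUE;
        for (long r = row - 1; r <= row + 1; r++) {
            for (long c = col - 1; c <= col + 1; c++) {
                List<Entry> cell = grid.get(key(r, c));
                if (cell == null) continue;
                for (Entry e : cell) {
                    double d = Math.hypot(e.lat - lat, e.lon - lon);
                    if (d < bestDist) {
                        bestDist = d;
                        best = e;
                    }
                }
            }
        }
        return best == null ? null : best.address;
    }
}
```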
After analyzing my Android application with a security tool, it has detected a high level vulnerability "File unsafe delete check". I have investigated about this, and it seems that the problem is that the application uses "file.delete()".
That function is considered unsafe because data could theoretically be retrieved with a tool that scans the whole storage device. So, if that way of deleting is "unsafe"... what is the "safe" way to delete files in Android (to avoid getting that "security error" that is supposedly a "high level" one)? What is the proper way to delete files in Android development?
I am getting the same security warning in 2 different applications, one made with native Java and the other one with Xamarin Forms. Thank you very much!
what is the "safe" way to delete files in Android?
There is none for the vast majority of Android devices. You use delete() on File.
That function is considered unsafe because data could theoretically be retrieved with a tool that scans the whole storage device
If the Android device happens to use a classic hard drive (spinning magnetic media), you can overwrite the data before deleting it. On any sort of flash media, that will be ineffective, as the physical location where the data is written can vary with each write operation ("wear leveling").
So, this really boils down to your objective:
If you feel that the user will be harmed if this data is available to be read, store it encrypted with a user-supplied passphrase.
If you are simply trying to avoid this warning, ask the developers of this "security tool" what they are expecting you to do. Or, find a better tool.
This is not an Android specific issue.
It has to do with how file systems work and with the physical storage medium itself.
When you delete a file, regardless of the API, what is actually deleted is the record in the file system's table.
The actual data on disk or flash storage remains.
There is a method for secure deletion:
Before deleting the file, overwrite its contents with garbage or zeros several times.
But, this method only works for magnetic media such as hard disks.
Android devices use NAND flash for storage.
Because the number of writes a NAND chip can take before it fails is a lot less than that of magnetic memory, these chips usually come with a mechanism that spreads out the write commands.
What this means is that even if you try to write random data or zeros over your file, there is no guarantee the actual data will be overwritten.
The writes may go to a different sector to avoid wear.
So, on the one hand, a single overwrite would be enough for flash storage; on the other hand, because of wear leveling it is impossible to do this reliably at the application level.
If you want to make your application secure, you must make sure to store sensitive data encrypted.
Then, even if someone tries to read the raw storage, they wouldn't be able to recover the data.
Don't store user credentials (like passwords) in regular files on Android.
Use Android accounts API and let the OS manage security.
If you still need file storage but want to protect the data, encrypt it in memory and then write to file.
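To illustrate that last point, here is a minimal sketch using the standard javax.crypto API (AES-GCM with a freshly generated key); in a real app the key would come from a user passphrase or the Android Keystore, and the file name and class names here are made up for the example:

```java
import java.io.FileOutputStream;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class EncryptedWriter {

    // Encrypts the given bytes in memory and writes IV + ciphertext to a file.
    public static void writeEncrypted(String path, byte[] plaintext, SecretKey key) throws Exception {
        byte[] iv = new byte[12];                       // 96-bit IV recommended for GCM
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);  // encryption happens in memory

        try (FileOutputStream out = new FileOutputStream(path)) {
            out.write(iv);                              // store the IV alongside the data
            out.write(ciphertext);
        }
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();               // in practice: Keystore or passphrase-derived
        // On Android, "secret.bin" would be a path in the app's internal storage.
        writeEncrypted("secret.bin", "sensitive data".getBytes("UTF-8"), key);
    }
}
```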
As said in the other answers, the first thing to consider, from a theoretical point of view, is whether there really is a need to store any sensitive information in files kept on the customer's side.
If that is really the case, encryption is the right way to guarantee proper security. Files would be protected not only against recovery after deletion but also during their lifetime on the device.
That said, in the case of a vulnerability assessment - i.e. a static analysis of the code - it is not easy for the tool to detect that you are deleting [via file.delete()] files that are encrypted, or that you are simply deleting files with nothing to hide.
In both of these cases the reported vulnerability is just a false positive. That is part of the game, because you can imagine it is quite complicated for an automated tool to understand whether something really "deserves" protection or not.
What you can do to get rid of the warning is add logic to empty the files before calling file.delete(). You have a sample here for this purpose. This will resolve the vulnerability detection you are experiencing.
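A minimal sketch of that idea - overwrite the contents with zeros, then delete. The method name is just illustrative, and as noted above this mainly satisfies static analysis tools rather than defeating wear leveling on flash storage:

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public class WipeDelete {

    // Overwrites the file with zeros before deleting it. On flash storage
    // the controller may still keep old copies of the data (wear leveling),
    // so this is best-effort rather than a guarantee.
    public static boolean wipeAndDelete(File file) throws IOException {
        if (file.exists() && file.isFile()) {
            try (RandomAccessFile raf = new RandomAccessFile(file, "rws")) {
                long length = raf.length();
                byte[] zeros = new byte[4096];
                raf.seek(0);
                long written = 0;
                while (written < length) {
                    int chunk = (int) Math.min(zeros.length, length - written);
                    raf.write(zeros, 0, chunk);
                    written += chunk;
                }
            }
        }
        return file.delete();
    }
}
```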
My app sometimes needs to sync with web servers and pull data into a mobile SQLite database for offline use, so the database keeps growing in size.
I want to know how professional apps like WhatsApp, Hike, Evernote, etc. manage their offline SQLite databases.
Please suggest steps to solve this problem.
PS: I am asking about managing the offline database (i.e. its growth in size after syncing); do not confuse this with syncing the database with web servers.
I do not know how large your data is. However, I think it should not be a problem to store reasonably large data in the internal storage of an application. The internal storage is shared among all applications, and hence it can grow until the storage is full.
In my opinion, the main problem here is the query time if you do not have proper indexing on your database tables. Otherwise, keeping the databases in your internal storage is completely fine, and I think you do not have to worry about the amount of data that can be stored in the internal storage of an application, as newer Android devices provide better storage capacity.
Hence, if your database is really big and does not fit into internal storage, you might consider keeping only the data that is used frequently and deleting the rest. This depends highly on the use case of your application.
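As a rough illustration of that kind of pruning on Android, here is a small sketch; the "messages" table and "last_accessed" column are invented for the example:

```java
import android.database.sqlite.SQLiteDatabase;

public class MessagePruner {

    // Deletes rows that have not been accessed for the given number of days.
    // "messages" and "last_accessed" (unix millis) are placeholders for
    // whatever your schema actually uses.
    public static int pruneOld(SQLiteDatabase db, int days) {
        long cutoff = System.currentTimeMillis() - days * 24L * 60L * 60L * 1000L;
        return db.delete("messages", "last_accessed < ?",
                new String[] { String.valueOf(cutoff) });
    }
}
```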
In one of the applications that I developed, I stored some large databases in the external memory and copied them into the internal memory whenever it was necessary. Copying the database from external storage into internal storage took some time (few seconds) though. However, once the database got copied I could run queries efficiently.
Let me know if you need any help or clarification for some points. I hope that helps you.
For maximum-size databases: AFAIK you don't want to lose what's on the device and force a reload.
Ensure you don't drop the database with each new release of your app when a simple ALTER TABLE ADD COLUMN will work (see the sketch below).
For whatever you do archive and remove from the device, give the user a way to load it back in the background.
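For illustration, here is a minimal SQLiteOpenHelper sketch that adds a column on upgrade instead of dropping the table; the database, table and column names are made up:

```java
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class ItemsDbHelper extends SQLiteOpenHelper {

    public ItemsDbHelper(Context context) {
        super(context, "items.db", null, /* version */ 2);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE items (_id INTEGER PRIMARY KEY, name TEXT)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // Migrate in place instead of DROP TABLE + recreate, so user data survives.
        if (oldVersion < 2) {
            db.execSQL("ALTER TABLE items ADD COLUMN synced_at INTEGER DEFAULT 0");
        }
    }
}
```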
There might be some apps / databases for which you can find documentation, but that is probably limited and the exception.
So, to know exactly what's going on, you need to create some snapshots of the databases. You can start with just one app, or do it with several right away, but without analysis you won't get a reliable answer.
The reasons might even be different for each app, as databases and app features naturally differ too.
Growth in size that outpaces the amount of incoming content might be related to cache tables or indexing for searches, but there may be other reasons too. Without verification and some basic information about the app, it's impossible to give you a detailed reason.
It's possible that the table names of a database already give some hints, but if the table names or even the fields are just meaningless strings, then you have to analyze the data inside, including the changes between snapshots.
The following link will help in understanding what exactly WhatsApp is using:
https://www.quora.com/How-is-the-Whatsapp-database-structured
I'm not really sure you have to keep all the data stored on the device all the time, but if you have a choice you can always use cloud services (like FCM, AWS) to store or back up most of the data. If you need to keep all the data on the device, then perhaps one way is to use caching mechanisms in your app.
For example, use LRU (Least Recently Used) caching to store the data you need on the device, while keeping the rest in the cloud and deleting what is unneeded from the device. If needed, you can always retrieve the data on demand (e.g. when the user pulls to refresh or performs some other action) and delete it whenever it is not being used.
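A rough sketch of that idea using Android's built-in android.util.LruCache; the key/value types and the loadFromCloud call are placeholders for this example:

```java
import android.util.LruCache;

public class RecordCache {

    // Keep roughly the 200 most recently used records in memory;
    // anything evicted has to be fetched again (e.g. from the cloud or a local DB).
    private final LruCache<String, String> cache = new LruCache<>(200);

    public String get(String id) {
        String value = cache.get(id);
        if (value == null) {
            value = loadFromCloud(id);   // placeholder for your real fetch logic
            if (value != null) {
                cache.put(id, value);
            }
        }
        return value;
    }

    private String loadFromCloud(String id) {
        // Hypothetical: call your FCM/AWS-backed service here.
        return null;
    }
}
```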
I am currently working on a (commercial) logistics project. We build a (partially) automated storage system in which the goods are stored randomly (think of nano-amazon). The positions of the objects are stored on the main computer and we are at the moment implementing the offsite backup via WAL (any objections?). One of our problems is that we have to operate during a power blackout and we can't produce enough energy for our computers for the worst case duration of the blackout which could be several hours. [This probably will never happen as we are in Germany, but there are some regulations we need to fulfill].
So my idea is to use a tablet [cheaper than a laptop] and send the WAL files to it so that the user can access the data during the blackout. But so far I have seen no server implementation for tablets (either Android or iOS). Isn't there one, or did I just not find it?
But maybe I'm also moving in the wrong direction. The database is rather small (<50,000 objects in the warehouse, each <1 kB), and the information we need during a blackout is just one table (object_id -> position_in_warehouse), so I am even thinking about writing this information into a file and using git to copy the changes to the tablet. We also only need to know which objects have been removed during the blackout, so that this information can easily be migrated back to the original DB.
Or do you have other ideas?
Does your time have any value to you? Discard the Android + PostgreSQL option right now.
Keep it simple. You can get a cheap laptop for practically nothing, especially second hand. Since you clearly don't care about it actually working as a backup option, that seems like a no-brainer. You can run a streaming replica with WAL archiving for fallback.
For your real fallback option, you're on the right track with writing out the data you require to a flat file and syncing just that. Remember to actually test it - you should actually use it occasionally and make sure it works.
BTW, for your WAL-streaming backups, I suggest PgBarman, which will manage retention and rotation for you. You should also do logical dumps, and remember to test your backups.
I don't think there's a port of Postgres to Android - to use WAL files you'd need a working server. Even if it were ported, you couldn't ship WAL files from an x86 server to an Android tablet - master and slave have to be on the same major version, OS and architecture.
You really should just periodically export your data from Postgres to a simple file (I'd recommend SQLite) and download it from a server. I suppose your tablets use WiFi, and this file would be something like 10 MB zip-compressed.
Alternatively you could use rsync to keep this file updated. Don't use git - it will keep all previous versions of this file on your tablet - it would grow rather fast.
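As a rough sketch of the export suggested above, a small Java utility could copy the one table the tablets need into a standalone SQLite file. This assumes the PostgreSQL JDBC driver and the xerial sqlite-jdbc driver are on the classpath; the connection strings, table and column names are just examples:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Copies object_id -> position from Postgres into a single SQLite file
// that can then be downloaded (or rsynced) to the tablets.
public class ExportToSqlite {

    public static void main(String[] args) throws Exception {
        try (Connection pg = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/warehouse", "user", "password");
             Connection lite = DriverManager.getConnection("jdbc:sqlite:positions.db")) {

            try (Statement s = lite.createStatement()) {
                s.execute("DROP TABLE IF EXISTS positions");
                s.execute("CREATE TABLE positions (object_id TEXT PRIMARY KEY, position TEXT)");
            }

            lite.setAutoCommit(false);  // one transaction keeps the SQLite import fast
            try (Statement read = pg.createStatement();
                 ResultSet rs = read.executeQuery("SELECT object_id, position FROM object_positions");
                 PreparedStatement write = lite.prepareStatement(
                         "INSERT INTO positions (object_id, position) VALUES (?, ?)")) {
                while (rs.next()) {
                    write.setString(1, rs.getString(1));
                    write.setString(2, rs.getString(2));
                    write.addBatch();
                }
                write.executeBatch();
            }
            lite.commit();
        }
    }
}
```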
I am new to Android Application Development and a new member at stackoverflow. I am currently trying to design a recipe application. I have decided upon the features of the app and the scope it will cover. The scope is very vast for me in terms of covering all the recipes from all over the world. I am to deal with a lot of data in this process.
I am currently trying to figure out a good and efficient way of handling the data in my app. So far, from what I have read in different forums, I believe I have two options in terms of database choice: 1) SQLite 2) a database on a remote server (MySQL/Postgres)
Following are some of the thoughts that have been going on in my mind when it comes to taking a decision between the two :
1) SQLite : This could be a good option but would be slow as it would need to access the file system. I could eliminate the slowness by performing DB data fetch tasks in the AsyncTask. But then there could be a limitation of the storage on different phones. Also I believe using SQLite would be easier as compared to using a remote DB.
2) Remote Database : The issue that I can see here is the slowness with multiple DB requests coming at the same time. Can I use threads here in some way to queue multiple requests and handle them one by one? Is there an efficient way to do this?
Also, I have one more question about the formatting of my data once I pull it out of the above DBs. Is there a way I could preserve the formatting of my data?
I would be more than thankful if someone could share their knowledgeable and expert comments on the above scenario. This is not homework for me, and I am not looking for any ready-made code solutions. I am just looking for hints/suggestions that would help me clear my thoughts and take a decision. I have been looking into this for some time now but was not able to find concrete information. I hope I will get some good advice here from experienced people who might have encountered a similar situation.
Thanks for reading this long post.
What about combining both approaches?
A local SQLite database that caches the most recently used recipes (evicting the least recently used), so you don't need the network all the time. The network is way slower than accessing the filesystem.
A remote database accessed via some HTTP interface where you can read/write the whole database. And if you want users to be able to add recipes for other users to see, you'll need an external database anyway.
SQLite : This could be a good option but would be slow as it would need to access the file system.
Accessing a local database is pretty fast, 5ms or so if it's just a simple read only query on a small database.
But then there could be a limitation of the storage on different phones
Depends on your definition of a huge database. It is okay if it is only 2 MB, which would be enough to store lots of text-only recipes.
Also I believe using SQLite would be easier as compared to using a remote DB.
Yes, Android has a nice built-in SQLite API but no remote database API. And you don't need to set up a database server & interface.
The issue that I can see here is the slowness with multiple DB requests coming at the same time.
A decent database server can handle thousands of requests; it depends on your server hardware & software. https://dba.stackexchange.com/ should have more info on that. The required performance depends on how many users you have / expect.
I'd suggest a simple REST interface to your database since it's pretty lightweight but does not expose your database directly to the web. There are tons of tutorials and books about creating such interfaces to databases. There are even hosted database services like nextDb that do most of the work for you.
Is there a way I could preserve the formatting of my data?
You could store HTML-formatted data in your database and display it in a WebView or a TextView (via Html#fromHtml()) - both can display formatted text.
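For instance, a small sketch of the TextView route (the HTML string would come from your database; Html.fromHtml(String) is deprecated on newer API levels in favour of the two-argument variant, but still works):

```java
import android.text.Html;
import android.widget.TextView;

public class RecipeRenderer {

    // Renders HTML stored in the database (e.g. "<b>Pasta</b> with <i>basil</i>")
    // as styled text inside a TextView.
    public static void showFormatted(TextView view, String htmlFromDb) {
        view.setText(Html.fromHtml(htmlFromDb));
    }
}
```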
Databases don't care what kind of text you store; for transfer over the internet you may need to encode the text so it does not interfere with the transport format (XML, JSON, ...).
A simple way is to integrate Parse into your app. They have a nice framework that easily integrates with iOS and Android. Their plan is freemium, so you'll be able to use up to 1 million API requests for no charge, and then it's 7 cents for every request after that.
You'll have 1 GB to store all your data sets / images, etc.
I don't use Parse for everything, but I HIGHLY recommend it for large data schemes because they do all the scaling for you. Check out the API; I think it would be worth your time.
I just started to work on a few of my own projects, and I'm using Parse again. I have to say it's improved a lot over the last 6-8 months. Especially with the Twitter and Facebook integration.
The key issue here is the size of the data - any significant database of recipes would be too large to store on the phone, IMHO, so you seem stuck with the remote database solution.
Rather than trying to access the remote database directly from Android, I suggest you use a go-between web application that processes requests from the app and returns the JSON objects you need.
It totally depends on your software requirements. If you need to deal with a small amount of data then you may choose SQLite, but for a huge amount of data it is better to use a remote DB.
SQLite: It works fine with small amounts of data, and in my experience the response time is good.
Remote DB: I think you can use a small server-side app to deliver the data to your client app. It will solve or reduce your thread-related issues and complexity.
I am a PHP/MySQL developer learning Android. I am creating an Android app that receives info from my PHP app to create list views of different products, each of which opens a web view of that product's details.
Currently my PHP CMS web application outputs XML lists for an iPhone app (it also, separately, outputs HTML). I have full control of the PHP app, so if there is a better way to output the data for the Android app, please let me know.
I have created code that reads the XML from the web and creates the list view. The list can be refreshed daily, so the data does not need to be read from the online XML every time the app starts.
So I was thinking of storing the retrieved data locally to improve my app's responsiveness. There may be up to 500 product descriptions to be stored at any given time, in up to 30 different XML lists. I am starting development with one XML list of about 30 products.
For best performance, should I store the product info in a SQLite DB, store the actual XML file in the cache/DB, or use some other method like the application cache?
I was also thinking of implementing the data update as a service - would this be a good idea?
The most efficient place to keep data is RAM. But if you want to cache it persistently, then the most efficient way is a database.
I recommend you store your data in an Android SQLite database.
You could also consider zipping your XML for faster network transfer and unzipping it with the java.util.zip package classes. You could even consider a simpler format for transmitting data, less verbose than XML, using a DataInput/OutputStream.
(I do that in one of my apps and it works great.)
Here are some details on the data input/output stream method:
Imagine a proprietary protocol for your data containing only what you need: no tags, no attributes, just raw values in order.
On the client side, get an input stream on your data using URL.getContent() and cast it to an InputStream.
Still on the client side, build a DataInputStream wrapping your input stream and read the data in order, using readInt, readDouble, readUTF, and so on (see the sketch after this list).
On the server side, in PHP, you need to find a way to write your data in a format compatible with the one expected by the client. I can't tell you much about PHP; I only program in Java.
The advantage of this technique is that you save bandwidth, as there is only data and no verbose decoration from XML. You should read the Java specs to understand how doubles, ints and strings are written to a data output stream. But it can be hard to get the data right when using two languages.
If PHP can't write the format in a suitable way, use XML; it will be much simpler. First try plain XML, then try sending a zipped or tarballed XML file.
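A rough client-side sketch of that reading step (the URL and the field order are invented for this example and must match exactly what the server writes):

```java
import java.io.DataInputStream;
import java.io.InputStream;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

public class ProductFeedReader {

    // Reads a hypothetical binary feed: an int count, then for each product
    // a UTF string name and a double price. The server must write the values
    // in exactly this order for the protocol to work.
    public static List<String> readProducts(String feedUrl) throws Exception {
        List<String> products = new ArrayList<>();
        try (InputStream raw = (InputStream) new URL(feedUrl).getContent();
             DataInputStream in = new DataInputStream(raw)) {
            int count = in.readInt();
            for (int i = 0; i < count; i++) {
                String name = in.readUTF();
                double price = in.readDouble();
                products.add(name + " - " + price);
            }
        }
        return products;
    }
}
```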
But all this is about the speed gain during the network transfer.
The second part of what you have to do is store each row of your list in an SQL table. Then you can retrieve it pretty fast using a CursorAdapter for your list view (it breaks the charming MVC model, but it is quite fast!).
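For illustration, a minimal SimpleCursorAdapter sketch (table and column names are placeholders; the query assumes a standard _id column, which CursorAdapter requires, and in real code the query should run off the main thread):

```java
import android.app.Activity;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.widget.ListView;
import android.widget.SimpleCursorAdapter;

public class ProductListBinder {

    // Binds the "name" column of the "products" table directly to a ListView.
    public static void bind(Activity activity, SQLiteDatabase db, ListView listView) {
        Cursor cursor = db.query("products",
                new String[] { "_id", "name" },   // CursorAdapter needs an _id column
                null, null, null, null, "name ASC");

        SimpleCursorAdapter adapter = new SimpleCursorAdapter(
                activity,
                android.R.layout.simple_list_item_1,
                cursor,
                new String[] { "name" },                  // DB columns to show
                new int[] { android.R.id.text1 },         // views to bind them to
                0);
        listView.setAdapter(adapter);
    }
}
```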
Sorry about this, but it became too long to write as a comment. This is not intended to be an answer to your question, because in my opinion Stéphane answered very well. The best solution is indeed to store the data in an SQLite database. Then you need to create the class to be used as a connection between the data, the database and the app. I don't want to take credit for what is said here already (I, too, voted it up).
I'm concerned about the other suggestion (the use of low-level raw streams for data manipulation, the listed steps in that answer). I strongly recommend you avoid creating your own proprietary protocol. It goes like this:
I need to exchange data.
I don't want to deal with the hassle of integrating external APIs into my code.
I know I can write two 5 minute routines to read and write the data back and forth.
Therefore, I just created my own proprietary format for exchanging data!
It makes me cry whenever I need to deal with unknown, obscure and arbitrary sequences of data blobs. It's always good to remember why we should not use unknown formats:
Reinventing the wheel is counter-productive. It may not seem so, but in the medium term it is. With standard formats you can adapt your project to other mediums (server-side, other platforms) easily.
Using off-the-shelf components helps you scale your code later.
Whenever you need to adapt your solution to other technologies and mediums, you'll work faster. Otherwise, you would probably end up with ad hoc code solutions that are not (easily) extensible and interoperable.
Using off-the-shelf components enables you to leverage advances in that particular technology. That's particularly important when you are using Android APIs, as they are frequently optimized for performance later down the road (see Android's Designing for Performance). Rolling your own standards may result in a performance penalty.
Unless you document your protocol, it's extremely easy to forget the format you created yourself. Just give it enough time and it will happen: you'll need to relearn or remember it. And if you do document it, you are just spending your brain's time on work a standard format would have saved you.
You may think you don't need to scale your work, but chances are you will most of the time.
When you do, you will wish you had learned how to easily and seamlessly integrate well-known formats.
The learning curve is needed anyway. In my experience, once you learn them, you actually integrate well-known formats faster than you could invent your own way of doing things.
Finally, trust your data to the geniuses who dedicate their lives to creating cohesive and intelligent standards. They know better!
Finally, if the purpose is to avoid the typical verbosity of XML, for whatever reason, you have several options. Right now I can think of CSV, but I'm no expert in data storage, so if you're not comfortable with it, I'm sure you can find good alternatives with plenty of ready-to-use APIs.
Good luck!