Out of Memory while parsing JSON data in Android?

Hi, I am parsing data from JSON. While parsing I get an Out Of Memory error from the Dalvik heap. How can I overcome this problem? I have searched the net and didn't find any solution. Please help me.
Thanks in advance.
03-06 08:19:00.618: E/dalvikvm-heap(1415): Out of memory on a 2506782-byte allocation.
JSONObject jsonObj1 = new JSONObject(jsonStr1);
ArrayList<String> matter1 = new ArrayList<String>();
String stat = jsonObj1.getString(TAG_stat);
String suc = jsonObj1.getString(TAG_success);
// Getting the JSON array node
matter = jsonObj1.getJSONArray(TAG_mdata);
// Looping through all entries
dbdata = new ArrayList<String>();
for (int i = 0; i < matter.length(); i++) {
    JSONObject j = matter.getJSONObject(i);
    String ddesc = j.getString(TAG_ddesc);
    String spd = j.getString(TAG_sped);
    String gzid = j.getString(TAG_geozid);
    String dev = j.getString(TAG_devid);
    matter1.add(stat);
    matter1.add(suc);
    Log.v("contact", "" + matter1);
    dbdata.add(ddesc);
    dbdata.add(spd);
    dbdata.add(gzid);
    dbdata.add(dev);
    dbdata1.add(dbdata);
}

One strange thing about this is that the crash only occurs every 2nd or 3rd time the app is run, leaving me to believe that the memory consumed by the app is not being garbage collected each time the app closes.
That is certainly possible, and if it is the case then it is probably due to a memory leak that can be traced back to something your application is doing. I think you should focus your initial efforts on investigating this aspect rather than on loading the file in chunks. (I am not familiar with the Android tool-chain, but I am sure it includes memory usage profilers or memory dump analysers.)
EDIT
In response to your follow-up comment: the fact that it works 2 times out of 3 suggests that your app ought to work roughly as-is. Admittedly, you don't have much leeway if the input file gets bigger.
A couple of ideas though:
Rather than reading the file into a String and running the JSON parser on the String, use a parser that can read directly from a stream (see the sketch below). Your current solution needs space for two complete copies of the data in memory while you are parsing.
If the file gets much bigger, you may need to think of a design that doesn't create a complete in-memory representation of the data.
I'm not sure that it is a good idea to read a JSON file in "chunks". This could present problems for parsing the JSON ... depending on exactly what you mean by reading in chunks.
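For illustration, a minimal sketch using android.util.JsonReader (API 11+), which parses straight off the InputStream so the raw JSON never has to be held in memory as one big String. The key names "mdata" and "ddesc" stand in for the asker's tags and are assumptions:

import android.util.JsonReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

// Reads only the fields it needs, one object at a time.
public List<String> parseStream(InputStream in) throws Exception {
    List<String> descriptions = new ArrayList<String>();
    JsonReader reader = new JsonReader(new InputStreamReader(in, "UTF-8"));
    try {
        reader.beginObject();                        // top-level { ... }
        while (reader.hasNext()) {
            if ("mdata".equals(reader.nextName())) { // assumed array key
                reader.beginArray();
                while (reader.hasNext()) {
                    reader.beginObject();
                    while (reader.hasNext()) {
                        if ("ddesc".equals(reader.nextName())) { // assumed field
                            descriptions.add(reader.nextString());
                        } else {
                            reader.skipValue();      // ignore everything else
                        }
                    }
                    reader.endObject();
                }
                reader.endArray();
            } else {
                reader.skipValue();
            }
        }
        reader.endObject();
    } finally {
        reader.close();
    }
    return descriptions;
}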

Use the Gson library. It will streamline the whole process.
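For a sense of what that looks like, here is a minimal Gson sketch. The Payload and Row classes and their field names are assumptions modelled on the JSON in the question, not a confirmed schema:

import com.google.gson.Gson;
import java.io.InputStreamReader;
import java.io.Reader;
import java.util.List;

// Hypothetical model classes mirroring the question's JSON.
class Row {
    String ddesc;
    String sped;
    String geozid;
    String devid;
}

class Payload {
    String stat;
    String success;
    List<Row> mdata;
}

// Gson binds the stream straight into objects, skipping the big String copy.
Reader reader = new InputStreamReader(inputStream, "UTF-8");
Payload payload = new Gson().fromJson(reader, Payload.class);
reader.close();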

Related

use stored JSON instead of Core Data

There are lots of tutorials out there describing how to fetch JSON objects from the web and map them to Core Data.
I'm currently working on an iOS app (later: Android as well) which loads JSON objects from the web and displays them to the user. In my opinion all this mapping from and to Core Data is overhead in this case; it would be much easier to save the JSON objects directly and use them as a "cache" in the app. Are there libraries or documented ways to fetch JSON objects, save them locally and retrieve them with a predefined identifier?
I would love to fetch e.g. 10 objects, show them to the user and save the data locally. The next time the user is on that list, the local data is shown and the JSON file is fetched again in the background to stay up to date. I guess this is a common use case, but I didn't find any tutorials or frameworks enabling exactly this.
You can simply use NSURLCache to cache the HTTP responses instead of saving the JSON yourself:
http://nshipster.com/nsurlcache/
There are many ways to implement this. You can implement the cache using either file storage or a database, depending on the complexity as well as the quantity of your data. If you're using files, you just need to store the JSON response and load it whenever the activity/fragment is created. What I have done sometimes is store the JSON response as a string in a file, and then retrieve it when the activity/fragment loads. Here's an example of reading and writing string files:
Writing files:
FileOutputStream outputStream = context.openFileOutput("myfilename", Context.MODE_PRIVATE);
String stringToBeSaved = myJSONObject.toString();
outputStream.write(stringToBeSaved.getBytes("UTF-8"));
outputStream.close();
Reading from files:
FileInputStream inputStream = context.openFileInput("myfilename");
BufferedReader reader = new BufferedReader(new InputStreamReader(inputStream, "UTF-8"));
StringBuilder temp = new StringBuilder();
int c;
while ((c = reader.read()) != -1) {
    temp.append((char) c);
}
reader.close();
You can convert this string to a JSONObject using:
JSONObject jsonObject = new JSONObject(temp.toString());
Or you can use the string according to your needs.

storing large data in android

I am pulling a large amount of JSON from a RESTful server. I use Google's GSON library to traverse this JSON and it works great. Now I want to save all of the JSON objects in my SQLite db; however, I want to make use of a transaction to add all of them at once. This is difficult if you don't have all the objects ready in one data structure. Since I am traversing the JSON one object at a time, I guess I would have to store it in a data structure such as an ArrayList or HashMap and then afterwards use a database transaction to do the inserts fast. However, storing a large amount of data (200,000 JSON objects) in an in-memory structure can take up a lot of memory and will probably run out as well. What would be the best way to get all of those JSON objects into my SQLite db while not using up a lot of memory; in other words, storing and inserting in a way that allows for a lot of recycling?
If you want to add a large amount of data in a single shot, it will take a lot of memory anyway. 200,000 large JSON objects take a certain amount of memory and you will not be able to change that.
You can keep this behaviour, but I think it's not a great solution because it creates huge memory consumption on both the Android device and the server. It will be better if you receive the data part by part and add it that way, but you need to have control over the server code.
If you are absolutely forced to keep this behaviour, maybe you should receive all the data at the same time, parse it into a huge JSON object, then make multiple transactions. Check that every transaction executed correctly, and put your database back into a good state if not. It's a really bad way to do it, IMHO, but I don't know all your constraints.
To finish: avoid receiving a large amount of data all at once. It is better to make multiple requests to get partial data sets, as sketched below. It will also make your app less network dependent: if you lose the network for 2 seconds, maybe only one request will fail, so you only have to retry that one request and receive a small part of the data again. With one huge request, if you lose the network you have to retry the entire request.
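A minimal sketch of that paged approach. It assumes the server supports some form of pagination; the "offset"/"limit" parameters are hypothetical, and httpGet()/insertPage() stand for your own helpers:

// Fetch the data set page by page instead of in one huge request.
int offset = 0;
final int LIMIT = 500;
boolean more = true;
while (more) {
    String json = httpGet("http://example.com/api/items?offset=" + offset
            + "&limit=" + LIMIT);         // hypothetical endpoint + helper
    JSONArray page = new JSONArray(json);
    insertPage(page);                     // parse + insert this page, then let it be GC'd
    more = (page.length() == LIMIT);      // a short page means we reached the end
    offset += LIMIT;
}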
I know this is not the best implementation of handling large JSON input in Android, but it certainly works great.
So my solution is pretty simple:
While parsing the JSON, make use of prepared statements and a db transaction to execute the inserts.
So in short: as soon as a JSON object is parsed, take its data, insert it into the db using the prepared statement inside the db transaction, then move on to the next object. (There is a sketch after this answer.)
I have just pulled 210 000 rows with 23 fields each (that's 4.6 million values) from a remote server via JSON and GSON, inserting all those values into my SQLite3 db on the device at the same time, in less than 5 minutes and without wasting any memory. Each iteration of parsing/inserting uses the same amount of memory and is cleaned up on the next iteration.
So yeah, there's the solution. Obviously this is not the best solution for commercial applications or tables with 1,000,000+ records, but it works great for my situation.
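A minimal sketch of that parse-one-insert-one loop, using Gson's streaming JsonReader and a compiled SQLiteStatement. The table and field names here are hypothetical:

import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import com.google.gson.stream.JsonReader;
import java.io.InputStream;
import java.io.InputStreamReader;

// Streams a JSON array and inserts each row as soon as it is parsed,
// inside one transaction; only one row is in memory at a time.
public void streamIntoDb(SQLiteDatabase db, InputStream in) throws Exception {
    JsonReader reader = new JsonReader(new InputStreamReader(in, "UTF-8"));
    SQLiteStatement insert =
            db.compileStatement("INSERT INTO rows (name, value) VALUES (?, ?)"); // hypothetical table
    db.beginTransaction();
    try {
        reader.beginArray();
        while (reader.hasNext()) {
            reader.beginObject();
            String name = "", value = "";
            while (reader.hasNext()) {
                String field = reader.nextName();
                if ("name".equals(field)) {
                    name = reader.nextString();
                } else if ("value".equals(field)) {
                    value = reader.nextString();
                } else {
                    reader.skipValue();
                }
            }
            reader.endObject();
            insert.bindString(1, name);
            insert.bindString(2, value);
            insert.executeInsert();
            insert.clearBindings();
        }
        reader.endArray();
        db.setTransactionSuccessful();   // commit only if everything parsed cleanly
    } finally {
        db.endTransaction();
        reader.close();
    }
}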
Have you ever tried to add a lot of data (and I really mean a lot, in my case 2600 rows of data) into the Android-internal database (SQLite)?
If so, you probably went down the same road as I did.
I tried a normal insert statement, which was way too slow (10 seconds in my emulator). Then I tried prepared statements. The time was better, but still unacceptable (6 seconds). After some frustrating hours of writing code and then throwing it away, I finally found a good solution.
The Android OS provides the InsertHelper as a fast way to do bulk inserts.
To give you an overview of the performance (measured with an emulator on a crap computer, trying to insert 2,600 rows of data):
Insert-Statement: 10 seconds
Prepared-Statements: 6 seconds
InsertHelper: 320 ms
You can speed up the insertion even more by temporarily disabling thread locks. This gains about 30% more performance. However, it's important to make sure that only one thread at a time is using the database while inserting, because the database is no longer thread-safe.
public void fillDatabase(HashMap<String, int[]> localData) {
    // The InsertHelper needs the db instance + the name of the target table
    InsertHelper ih = new InsertHelper(this.db, TABLE_NAME);
    Iterator<Entry<String, int[]>> it = localData.entrySet().iterator();
    final int firstExampleColumn = ih.getColumnIndex("firstExampleColumn");
    final int secondExampleColumn = ih.getColumnIndex("secondExampleColumn");
    final int thirdExampleColumn = ih.getColumnIndex("thirdExampleColumn");
    try {
        this.db.setLockingEnabled(false);
        while (it.hasNext()) {
            Entry<String, int[]> entry = it.next();
            int[] values = entry.getValue();
            ih.prepareForInsert();
            ih.bind(firstExampleColumn, entry.getKey());
            ih.bind(secondExampleColumn, values[0]);
            ih.bind(thirdExampleColumn, values[1]);
            ih.execute();
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (ih != null) {
            ih.close();
        }
        this.db.setLockingEnabled(true);
    }
}

What is the best way to store parsed json data

I would like to ask how to store JSON data. I have a JSON file, which I parse using the JSON library. Now I have the data from the file, but I want to store it and show it again later.
The question is: what's the best way to store the data? And is it even worth storing?
I'm thinking about an SQL database, because it's simple and widely used.
The official Android docs have a few examples; that's as far as I've searched, but if you have a better guide, let me know.
Thank you! :)
EDIT1:
OK, I have a JSON file with data which I can add to my app as a raw resource. The data won't change; it's a list of recipes, so I don't have to download it. I can read the data like this:
InputStream is = mContext.getResources().openRawResource(R.raw.package_01);
Writer writer = new StringWriter();
char[] buffer = new char[1024];
try {
    Reader reader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
    int n;
    while ((n = reader.read(buffer)) != -1) {
        writer.write(buffer, 0, n);
    }
    is.close();
} catch (IOException e) {
    e.printStackTrace();
}
and then I can parse the data through the JSON library like this:
try {
    // representing the [] JSON array
    JSONArray jsonArray = new JSONArray(writer.toString());
    if (jsonArray.length() > 0) { ... }
    ...
} catch (JSONException e) {
    e.printStackTrace();
}
I'm sending a HashMap to the ListView, which includes name and id. If the user clicks a ListView/GridView item, a new Activity is started which shows all the parsed data. I need to match the parsed data with the id.
There are about 200 recipes in the file. The data is parsed on start of the Activity, while the splash screen is displayed. I'm not sure it's a good idea to parse the data every time the app starts.
So, is it efficient to parse the data every time the app starts? And if so, how do I keep the parsed data "together"? Should I use a HashMap?
I hope I clarified my question :) Thanks for the help.
EDIT2:
So after I knew what to do, I tried the suggested solution of using a HashMap. The problem was that I got a failed binder transaction. I have images encoded in Base64, which means I have a very long String (example here). That is a lot of data, and there is a limit:
The Binder transaction buffer has a limited fixed size, currently 1Mb, which is shared by all transactions in progress for the process.
I've tried saving it to
Set<String> titles = new HashSet<String>();
and then to SharedPreferences, but the data for each recipe gets mixed up.
So, here it comes again: should I save the data to an SQLite database, or is there another effective option I can use?
Thank you!
It really depends on a number of things: how much data you have now, how much you will have later, and how complicated the data is. You could use something as simple as an array or HashMap, or something as complex as a database. You need to consider what you are trying to do and find the simplest solution. If you are trying to persist data, you could use SharedPreferences, a database, or internal/external storage (the options are outlined here).
Without more information it's hard to say exactly what to do. Keep it simple, though. If you are getting JSON from a web service, I'd use an ArrayList or HashMap to store the data rather than persisting it. It is simpler to implement and does the job.
EDIT:
To answer your question: yes, using a HashMap and parsing each time is fine. You only have 200 fields, and you don't have images, so the time it takes to parse is minimal. Regardless of how you store the data, there is going to be some level of "parsing" done. For example, if you store the data in a database, you still have to pull the data out and put it into a HashMap (see the sketch below).
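A minimal sketch of keeping the parsed recipes together in a HashMap keyed by id, reusing the writer from the earlier snippet. The "id" field name is an assumption about the recipe JSON:

// Parse once at startup and keep the recipes together, keyed by id.
HashMap<String, JSONObject> recipesById = new HashMap<String, JSONObject>();
JSONArray jsonArray = new JSONArray(writer.toString());
for (int i = 0; i < jsonArray.length(); i++) {
    JSONObject recipe = jsonArray.getJSONObject(i);
    recipesById.put(recipe.getString("id"), recipe);   // assumed "id" field
}
// Later, when a list item with a given id is clicked:
JSONObject selected = recipesById.get(clickedId);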

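Converting data from multiple pages into a single string for uploading using JSON in Android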
Converting multiple datas into a single string for uploading using json in android

I have to submit data from 30 pages to the server. The data from these 30 pages has to be made into a single string, and I have to upload that single string to the server using JSON.
Each page may contain many answers that may be plain text (values received from an EditText), from check boxes (yes or no), and so on. Please suggest a way to add all of this data into a single string and upload it using JSON.
Based on the comment, I suspect you believe you need to treat these "pages" as strings that you concatenate. What I think you're overlooking is that JSON is pretty versatile in how you add objects to it.
So, let's say you have the thing that you want to ship to your server and you call it
JSONObject myEntireFile = new JSONObject();
you can now add stuff to it at any time like this...
JSONObject page1 = new JSONObject();
myEntireFile.put("page1", page1);
Meanwhile you can put whatever you want IN page1 (because that's just another serialized container).
You can keep doing this until you're ready to send it, at which time you just call
myEntireFile.toString();
which will convert your object into one long, well-formatted JSON string that you can then upload or store for later use. A consolidated sketch follows.
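Putting the pieces together, a minimal sketch; the keys and values are placeholders for your real form data, and JSONException handling is omitted:

JSONObject myEntireFile = new JSONObject();
for (int pageNo = 1; pageNo <= 30; pageNo++) {
    JSONObject page = new JSONObject();
    // Values gathered from the widgets on this page (placeholders):
    page.put("answerText", "value from the EditText on this page");
    page.put("checkBoxYes", true);
    myEntireFile.put("page" + pageNo, page);
}
String payload = myEntireFile.toString();  // one well-formed string, ready to upload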

Can large String Arrays freeze my program?

I recently created a program that gets medium-to-large amounts of XML data and converts it into arrays of Strings, then displays the data.
The program works great, but it freezes while it is building the arrays (for around 16 seconds, depending on the size).
Is there any way I can optimize my program (alternatives to String arrays, etc.)?
3 optimizations that should help:
Threading
If the program freezes, it most likely means that you're not using a separate thread to process the large XML file, so your app has to wait for this task to finish before it can respond again.
Instead, create a new thread to process the XML and notify the main thread via a Handler when it's done, or use an AsyncTask, as in the sketch below. This is explained in more detail here.
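A minimal AsyncTask sketch; parseXml() and showData() are placeholders for your own parsing and display code:

new AsyncTask<String, Void, String[]>() {
    @Override
    protected String[] doInBackground(String... params) {
        return parseXml(params[0]);    // heavy work runs on a worker thread
    }

    @Override
    protected void onPostExecute(String[] result) {
        showData(result);              // back on the UI thread; safe to touch views
    }
}.execute(xmlSource);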
Data storage
Additionally, a local SQLite database might be more appropriate for storing large amounts of data, especially if you don't have to show it all at once. This can be achieved with the cursors provided by the platform.
Configuration changes
Finally, make sure that your data doesn't have to be reconstructed when a configuration change occurs (such as an orientation change). A persistent SQLite database can help with that, and so can these methods.
You can use SAX to process the XML as a stream, rather than parsing the whole file and generating a DOM in memory.
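A minimal SAX handler sketch; the element name "item" is an assumption, and each item is handled and discarded as the parser streams past it:

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;
import javax.xml.parsers.SAXParserFactory;

// Each <item> is handled as the parser streams past it; no DOM is built.
public class ItemHandler extends DefaultHandler {
    private final StringBuilder text = new StringBuilder();

    @Override
    public void startElement(String uri, String localName, String qName,
            Attributes attrs) {
        text.setLength(0);                 // reset the buffer for each element
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length);
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        if ("item".equals(qName)) {        // assumed element name
            handleItem(text.toString());   // process one item, then forget it
        }
    }

    private void handleItem(String value) { /* store or display the value */ }
}

// Usage:
SAXParserFactory.newInstance().newSAXParser()
        .parse(inputStream, new ItemHandler());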
If you find that you really are using too much memory, and you have a reason to keep the strings in memory rather than caching them on disk, there are certainly ways you can reduce the memory requirements. It's a sad fact that Java strings use a lot of space. They require two objects (the string itself and an underlying char array) and use two bytes per char. If your data is mostly 7-bit ASCII, you may be better off keeping it as a UTF-8 encoded byte stream, using 1 byte per character in the typical case.
A very effective scheme is to maintain an array of 32k byte buffers and append the UTF-8 representation of each new string onto the first empty space in one of those arrays. Your reference to the string becomes a simple integer: PTR = (buffer index * 32k) + (buffer offset). "PTR / 32k" yields the index of the desired byte buffer, and "PTR % 32k" yields the location within the buffer. Use either an initial length field or a null terminator to keep track of how long the string is (a sketch follows below). When you need to access one of the strings, don't allocate a new String object: unpack it into a mutable StringBuilder or work directly with the UTF-8 byte representation.
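A minimal sketch of that scheme, using a 2-byte length prefix rather than a null terminator (one of the two options mentioned above). It assumes individual strings are shorter than one buffer:

import java.io.UnsupportedEncodingException;
import java.util.ArrayList;
import java.util.List;

// Packs UTF-8 strings into 32 KB buffers; each string is referenced by one int.
public class StringPool {
    private static final int BUF_SIZE = 32 * 1024;
    private final List<byte[]> buffers = new ArrayList<byte[]>();
    private int offset = BUF_SIZE;               // forces allocation of the first buffer

    // Stores a string and returns its packed pointer.
    public int add(String s) throws UnsupportedEncodingException {
        byte[] utf8 = s.getBytes("UTF-8");
        if (offset + 2 + utf8.length > BUF_SIZE) {   // 2-byte length prefix
            buffers.add(new byte[BUF_SIZE]);
            offset = 0;
        }
        byte[] buf = buffers.get(buffers.size() - 1);
        int ptr = (buffers.size() - 1) * BUF_SIZE + offset;
        buf[offset]     = (byte) (utf8.length >> 8); // big-endian length prefix
        buf[offset + 1] = (byte) utf8.length;
        System.arraycopy(utf8, 0, buf, offset + 2, utf8.length);
        offset += 2 + utf8.length;
        return ptr;
    }

    // Unpacks a pointer back into a String only when it is actually needed.
    public String get(int ptr) throws UnsupportedEncodingException {
        byte[] buf = buffers.get(ptr / BUF_SIZE);
        int pos = ptr % BUF_SIZE;
        int len = ((buf[pos] & 0xFF) << 8) | (buf[pos + 1] & 0xFF);
        return new String(buf, pos + 2, len, "UTF-8");
    }
}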
The above approach is obviously a lot more work, but can save you between a factor of 2 and 6 in memory usage (depending on the length of your strings). However, you should beware of premature optimization. If your problem is with the processing time to parse your input, or is somewhere else in your program, you could find that you've done a lot of work to fix something that isn't your bottleneck and thus get no improvement at all.
