Json Object is giving OutOfMemory Exception - android

I'm using a JSON framework (Gson) to update the client-side data from the server. I have 30 POJO classes, and I receive the HTTP response from the server. I create the Reader object like this:
InputStream instream = entity.getContent();
reader = new InputStreamReader(instream, "UNICODE");
and pass it to Gson like this:
synchronizationResponse = gson.fromJson(reader, SynchronizationResponse.class);
But this line throws an OutOfMemoryError.
I wrote the response to a file and found that its size is around 12 MB.
So is there any way to split the response into multiple responses, so that I can read and write simultaneously and avoid the OOM error?
Looking for help.

Difficult to say, as long as we do not know what your SynchronizationResponse class looks like.
Whatever your content, if you can split it into sub-sections, do so and serialize those separately.

You can use a JsonReader to parse a large stream of JSON without running out of memory. The details are in the documentation, but basically you create a JsonReader:
JsonReader reader = new JsonReader(new InputStreamReader(instream, "UNICODE"));
and write a recursive descent parser using reader.nextName() to get keys, reader.nextString(), reader.nextInt(), etc. to get values, and reader.beginObject(), reader.endObject(), reader.beginArray(), reader.endArray() to mark object boundaries.
This allows you to read and process one piece at a time, at any granularity, without ever holding the entire structure in memory. If the data is long-lived, you can store it in a database or a file.
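As a concrete illustration of that recursive-descent pattern, here is a minimal sketch using Gson's streaming JsonReader (assuming Gson is on the classpath; the "name" field and the input shape are hypothetical, not from the question):

```java
import com.google.gson.stream.JsonReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class StreamingParse {

    // Reads a JSON array of objects one element at a time,
    // so the whole document is never held in memory at once.
    public static List<String> readNames(Reader in) throws IOException {
        List<String> names = new ArrayList<>();
        JsonReader reader = new JsonReader(in);
        reader.beginArray();                      // consume '['
        while (reader.hasNext()) {
            reader.beginObject();                 // consume '{'
            while (reader.hasNext()) {
                String key = reader.nextName();
                if (key.equals("name")) {
                    names.add(reader.nextString());
                } else {
                    reader.skipValue();           // ignore fields we don't need
                }
            }
            reader.endObject();                   // consume '}'
        }
        reader.endArray();                        // consume ']'
        reader.close();
        return names;
    }

    public static void main(String[] args) throws IOException {
        String json = "[{\"name\":\"a\",\"size\":1},{\"name\":\"b\",\"size\":2}]";
        System.out.println(readNames(new StringReader(json)));
    }
}
```

Instead of collecting values into a list, you could write each record to a database row as it is parsed, which keeps peak memory usage bounded by a single record.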

Related

use stored JSON instead of Core Data

There are lots of tutorials out there describing how to fetch JSON objects from the web and map them to Core Data.
I'm currently working on an iOS app (later: Android as well) which loads JSON objects from the web and displays them to the user. In my opinion, all this mapping to and from Core Data is overhead in this case; it would be much easier to save the JSON objects directly and use them as a "cache" in the app. Are there libraries or documented ways to fetch JSON objects, save them locally, and fetch them again with a predefined identifier?
I would love to fetch e.g. 10 objects, show them to the user, and save the data locally. The next time the user is on that list, the local data is shown, and in the background the JSON file is fetched again to stay up to date. I guess this is a common use case, but I didn't find any tutorials/frameworks enabling exactly this.
You can simply use NSURLCache to cache HTTP responses instead of saving the JSON yourself:
http://nshipster.com/nsurlcache/
There are many ways to implement this. You can implement a cache using either file storage or a database, depending on the complexity as well as the quantity of your data. If you're using files, you just need to store the JSON response and load it whenever the activity/fragment is created. What I have done sometimes is store the JSON response as a string in a file, and then retrieve it when the activity/fragment loads. Here's an example of reading and writing string files:
Writing files:
FileOutputStream outputStream = context.openFileOutput("myfilename", Context.MODE_PRIVATE);
String stringToBeSaved = myJSONObject.toString();
outputStream.write(stringToBeSaved.getBytes());
outputStream.close();
Reading from files:
FileInputStream inputStream = context.openFileInput("myfilename");
StringBuilder temp = new StringBuilder();
int c;
while ((c = inputStream.read()) != -1) {
    temp.append((char) c);
}
inputStream.close();
You can convert this string to a JSONObject using:
JSONObject jsonObject = new JSONObject(temp.toString());
Or you can use the string according to your needs.

What is the best way to store parsed json data

I would like to ask how to store JSON data. I have a JSON file which I parse using a JSON library, so now I have the data from the file, but I want to store it and show it again later.
The question is: what's the best way to store the data? And is it even worth storing?
I'm thinking about an SQL database, because it's simple and widely used.
The official Android docs have a few examples; I've searched so far, but if you have a better guide, let me know.
Thank you! :)
EDIT1:
OK, I have a JSON file with data which I can add to my app as a raw resource. That data won't change; it's a list of recipes, so I don't have to download it. I can read the data like this:
InputStream is = mContext.getResources().openRawResource(R.raw.package_01);
Writer writer = new StringWriter();
char[] buffer = new char[1024];
try {
    Reader reader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
    int n;
    while ((n = reader.read(buffer)) != -1) {
        writer.write(buffer, 0, n);
    }
    is.close();
} catch (IOException e) {
    // handle the exception
}
and then I can parse the data through the JSON library like this:
try {
    // representing a JSON array []
    JSONArray jsonArray = new JSONArray(writer.toString());
    if (jsonArray != null) { ... }
    ...
}
I'm sending a HashMap to a ListView; it includes a name and an id. When the user clicks a ListView/GridView item, a new Activity is started which shows all the parsed data. I need to match the parsed data with the id.
There are about 200 recipes in the file. The data is parsed on start of the Activity, while a splash screen is displayed. I'm not sure it's a good idea to parse the data every time the app starts.
So, is it efficient to parse the data every time the app starts? And if so, how do I keep the parsed data "together"? Should I use a HashMap?
I hope I clarified my question :) Thanks for the help.
EDIT2:
So after I knew what to do, I tried the suggested solution of using a HashMap. The problem was that I got a failed binder transaction. I have images encoded in Base64, which means I have a very long String, example here. That is a lot of data, and there is a limit:
The Binder transaction buffer has a limited fixed size, currently 1Mb, which is shared by all transactions in progress for the process.
I've tried saving it to
Set<String> titles = new HashSet<String>();
and then to SharedPreferences, but the data for each recipe gets mixed up.
**So, here it comes again: should I save the data to an SQLite database, or is there another effective option I can use?**
Thank you!
It really depends on a number of things: how much data you have now, how much you will have later, and how complicated the data is. You could use something as simple as an array or HashMap, or something as complex as a database. You need to consider what you are trying to do and find the simplest solution. If you are trying to persist data, you could use SharedPreferences, a database, or internal/external storage (options outlined here).
Without more information it's hard to say exactly what to do. Keep it simple, though. If you are getting JSON from a web service, I'd use an ArrayList or HashMap to store the data rather than persisting it. It is simpler to implement and does the job.
EDIT:
To answer your question: yes, using a HashMap and parsing each time is fine. You only have 200 entries, and you don't have images, so the time it takes to parse is minimal. Regardless of how you store the data, there is going to be some level of "parsing" done. For example, if you store the data in a database, you still have to pull the data out and put it into a HashMap.
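To sketch the suggested approach: parse the recipe array once at startup and index the entries by id, so a clicked list item can look up its recipe directly. A minimal example assuming Gson (2.8.6+ for JsonParser.parseString) is on the classpath; the "id" and "name" fields are hypothetical stand-ins for the recipe fields:

```java
import com.google.gson.JsonArray;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.util.HashMap;
import java.util.Map;

public class RecipeIndex {

    // Parses a JSON array of recipes once and indexes the entries by id,
    // so a clicked list item can look up its recipe in O(1).
    public static Map<Integer, JsonObject> indexById(String json) {
        Map<Integer, JsonObject> byId = new HashMap<>();
        JsonArray recipes = JsonParser.parseString(json).getAsJsonArray();
        for (JsonElement e : recipes) {
            JsonObject recipe = e.getAsJsonObject();
            byId.put(recipe.get("id").getAsInt(), recipe);
        }
        return byId;
    }

    public static void main(String[] args) {
        String json = "[{\"id\":1,\"name\":\"Soup\"},{\"id\":2,\"name\":\"Salad\"}]";
        Map<Integer, JsonObject> byId = indexById(json);
        // Look up the recipe the user tapped by its id.
        System.out.println(byId.get(2).get("name").getAsString());
    }
}
```

The map would live in the Application class or a singleton so it survives Activity changes; only the ids travel through the ListView adapter, avoiding the binder size limit mentioned in EDIT2.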

Split list information coming from tornado web-service

I have a Tornado web service that returns a list to an Android application, as below:
output = []
output.append(img_URL)
output.append(temp_folder_name)
self.finish(output)
First it assigns values to a list called "output" and then returns it.
My question is how I can split this data within the Android application. I have the following two lines of code in the Android application:
HttpEntity responseEntity = httpResponse.getEntity();
String transformedImageURL = EntityUtils.toString(responseEntity);
but when I try to output it (using a Toast), the Android application force-closes. Could you please suggest a better solution?
Thank you for your time.
The data between Python and Android will not translate well unless you use a message-passing convention. I would suggest rolling this data into, say, JSON, like this:
from json import dumps

output = []
output.append(img_URL)
output.append(temp_folder_name)
self.set_header("Content-Type", "application/json")
self.finish(dumps(output))
On the Android side of things, you can use a JSON library to parse the data.
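On the Android side, the body then arrives as a JSON array of two strings, which can be split in one call. A minimal sketch assuming Gson is on the classpath (the URL and folder name are made-up values):

```java
import com.google.gson.Gson;

public class TornadoResponse {

    // The Tornado handler serializes a Python list of two strings,
    // so the response body is a JSON array like ["<url>", "<folder>"].
    public static String[] split(String body) {
        return new Gson().fromJson(body, String[].class);
    }

    public static void main(String[] args) {
        String body = "[\"http://example.com/img.png\", \"tmp_abc\"]";
        String[] parts = split(body);
        String transformedImageURL = parts[0];  // first element of the list
        String tempFolderName = parts[1];       // second element of the list
        System.out.println(transformedImageURL + " / " + tempFolderName);
    }
}
```

Here `body` would be the string from `EntityUtils.toString(responseEntity)`; with the Content-Type header set to application/json, the payload is unambiguous on both ends.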

Compare large local gson against remote gson

When sending a large POJO I want to check for byte-level changes, not detailed differences in structure. Maybe serialize and hash the POJO in memory, but that can fail on an Android device. Any thoughts would be great. Should I traverse it DOM-style, maybe?
OUT
outputStreamWriter = new OutputStreamWriter(out, "UTF-8");
jsonWriter = new JsonWriter(outputStreamWriter);
jsonWriter.setIndent("\t");
jsonWriter.beginArray();
mygson.toJson(largeTestPojo, LargeTestPojo.class, jsonWriter);
jsonWriter.endArray();
jsonWriter.flush();
IN
InputStreamReader isr = new InputStreamReader(in, "UTF-8");
StatsTypeAdapterFactory stats = new StatsTypeAdapterFactory();
Gson gson = new GsonBuilder().registerTypeAdapterFactory(stats).create();
jsonReader = new JsonReader(isr);
jsonReader.beginArray();
largeTestPojo = gson.fromJson(jsonReader, LargeTestPojo.class);
jsonReader.endArray();
I want to check for bytes changes and not detail differences in structure
I'm not sure what that means.
Compare large local gson against remote gson
It's not clear to me where the comparison is to occur, whether the complete input and output contents are to be available in the same place or if something like a hash -- as mentioned -- is to be used to represent something on one end or the other.
If the complete input and output structures are available at the point of comparison, then...
I likely wouldn't use the Gson API (or any serialization API) for the task of data structure (or service message) value comparisons. I prefer to use serialization APIs like Gson only for serialization/deserialization tasks, keeping those layers of the system as thin and as simple as reasonable.
So, I'd probably prefer to deserialize into the target LargeTestPojo, and implement a compareTo and/or equals method for it, or just put the two data sets into a map, and compare from there.
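If a byte-level check is genuinely what's wanted, one option the question itself hints at is hashing the serialized bytes rather than holding two full copies in memory. A sketch using only the JDK's MessageDigest (the payload strings are hypothetical):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class JsonDigest {

    // Hashes the serialized JSON bytes, so even a large payload can be
    // compared by a fixed-size digest instead of a second in-memory copy.
    public static String sha256Hex(byte[] data) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(data);
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        String local  = "{\"a\":1}";
        String remote = "{\"a\":1}";
        boolean unchanged = sha256Hex(local.getBytes(StandardCharsets.UTF_8))
                .equals(sha256Hex(remote.getBytes(StandardCharsets.UTF_8)));
        System.out.println(unchanged);
    }
}
```

The caveat is the one raised above: byte hashing is sensitive to key ordering, whitespace, and indentation, so both sides must serialize with identical settings, otherwise a structural comparison on the deserialized POJOs (equals/compareTo) is the more robust route.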

Android: Parsing large JSON file

I'm creating an Android application which should parse JSON from a file or URL into a JSONArray and JSONObjects.
The problem is that my JSON is 3.3 MB, and when I use simple code, something like this (I can't access my real code now because I'm at work; I copied some code from a tutorial, so there might be some errors in it):
(assuming I already have my InputStream content)
InputStream content = entity.getContent();
BufferedReader reader = new BufferedReader(new InputStreamReader(content));
StringBuilder builder = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    builder.append(line);
}
String twitterfeed = builder.toString();
JSONArray jsonArray = new JSONArray(twitterfeed);
Log.i(ParseJSON.class.getName(),
        "Number of entries " + jsonArray.length());
for (int i = 0; i < jsonArray.length(); i++) {
    JSONObject jsonObject = jsonArray.getJSONObject(i);
    Log.i(ParseJSON.class.getName(), jsonObject.getString("text"));
}
When I run this code on my Android device, I get an OutOfMemory error when parsing the string into the JSONArray.
I logged some things and found that my total string is 17 MB (from a 3.3 MB JSON file?!). When I use a small JSON file, like a Twitter feed, the code works fine.
Once I have this 17 MB string in memory, I can't parse the JSON, because I run out of memory.
After a lot of research I found that Jackson might be my solution, because I understood that it can parse an InputStream directly. This should help, because then I don't need the 17 MB string in memory, which isn't the most efficient approach anyway, I guess... But I can't tell for sure whether this will really work, and I didn't get it running myself.
Does anyone know if this will really work, and where I can find a tutorial?
I found createJsonParser (public JsonParser createJsonParser(InputStream in)) and think this is my way to go, but I don't know how to implement it in my code and can't find an example. Does anyone know how this works?
You should use JSON streaming, either with Gson or Jackson. With Jackson you can also use a hybrid approach. This will reduce your memory consumption significantly, as only the portion of the JSON currently being parsed is loaded into memory.
https://sites.google.com/site/gson/gson-user-guide
http://jackson.codehaus.org/
A Jackson example: http://www.mkyong.com/java/jackson-streaming-api-to-read-and-write-json/
The hybrid streaming approach with Jackson is a good option. That way, you can advance a pointer through your raw JSON input (validating it as you go) and, when you detect an input chunk that needs to be processed, you can read it into a tree-hierarchy so that you can pull out any data you want from it.
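The same hybrid pattern also exists in Gson, which the answer above mentions as an alternative: stream the outer array yourself, but let the data binder materialize one element at a time. A sketch assuming Gson is on the classpath; the Entry class with a "text" field mirrors the tweet objects in the question but is otherwise made up:

```java
import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class HybridStream {

    // A tiny stand-in for one feed entry; only "text" is bound here.
    static class Entry {
        String text;
    }

    // Streams the outer array, but lets Gson bind one element at a time,
    // so only a single entry is ever fully materialized in memory.
    public static List<String> readTexts(Reader in) throws IOException {
        Gson gson = new Gson();
        List<String> texts = new ArrayList<>();
        JsonReader reader = new JsonReader(in);
        reader.beginArray();
        while (reader.hasNext()) {
            // fromJson(JsonReader, ...) consumes exactly one array element
            Entry entry = gson.fromJson(reader, Entry.class);
            texts.add(entry.text);
        }
        reader.endArray();
        reader.close();
        return texts;
    }

    public static void main(String[] args) throws IOException {
        String json = "[{\"text\":\"hello\"},{\"text\":\"world\"}]";
        System.out.println(readTexts(new StringReader(json)));
    }
}
```

Passing the HTTP entity's InputStream (wrapped in an InputStreamReader) instead of a StringReader avoids ever building the 17 MB string at all.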
