I have a problem when trying to use a Hashtable with a large data set. I have a text file that contains over 111,000 records, and when loading reaches about 75,000 an OutOfMemory exception is thrown. Does anyone have a solution for this?
The problem shows up at the line:
while ((text = reader.readLine()) != null);
and in the java.lang.String <init> inside readLine(), but I think the real cause is that the hashtable has grown too big to keep storing data. I tested it on my Xperia Neo and it failed, although it runs quite well on another Samsung device.
The SoundUnit object I keep in the hashtable has the structure below:
private String filename;
private int start;
private int end;
just those 3 fields inside the SoundUnit object.
Here is the piece of code I use to read the data from the text file into the hashtable:
reader = new BufferedReader(new InputStreamReader(
        new FileInputStream(unitSelectionFile), "UTF-8"));
String text;
// read first, then test: the original do-while dereferenced text before it was assigned
while ((text = reader.readLine()) != null) {
    if (text.length() != 0 && mainHash.get(text) == null) {
        mainHash.put(text, soundUnit); // soundUnit is built elsewhere from the line
    }
}
Yes, most likely the hashtable is simply too large. I did a test on a Galaxy Nexus, which is a rather new phone, and it ran out of memory after 230k entries, with unique keys and values about 8 characters long. I would suggest putting your data in a database instead.
Your reader should not be an issue, since BufferedReader only uses a small amount of memory.
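A minimal sketch of the database approach with Android's SQLite API might look like the following. The table and column names (sound_units, unit_key, start_pos, end_pos) are illustrative, not from the original code, and parsing the filename/start/end values out of each line is left out:
// Sketch only: names are illustrative; per-line parsing is elided.
SQLiteDatabase db = SQLiteDatabase.openOrCreateDatabase(
        new File(context.getFilesDir(), "units.db"), null);
db.execSQL("CREATE TABLE IF NOT EXISTS sound_units (unit_key TEXT PRIMARY KEY, "
        + "filename TEXT, start_pos INTEGER, end_pos INTEGER)");
db.beginTransaction(); // one transaction keeps ~111k inserts fast
try {
    String text;
    while ((text = reader.readLine()) != null) {
        if (text.length() == 0) continue;
        ContentValues values = new ContentValues();
        values.put("unit_key", text);
        // values.put("filename", ...); values.put("start_pos", ...); etc.
        db.insertWithOnConflict("sound_units", null, values,
                SQLiteDatabase.CONFLICT_IGNORE); // mirrors the get() == null check
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
Lookups then become a db.query() on unit_key instead of mainHash.get(), so nothing beyond the current row ever needs to stay in memory.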
You should get the InputStream from the response and pass it to an XML handler for parsing. This should do the trick, because the XML will then be parsed incrementally as the InputStream is read.
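If the payload really is XML, a minimal sketch of that streaming approach with the SDK's SAX support could look like this (the handler body is a placeholder):
// Sketch only: parse the stream incrementally; nothing accumulates in memory.
SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
parser.parse(inputStream, new DefaultHandler() {
    @Override
    public void startElement(String uri, String localName, String qName,
            Attributes attributes) {
        // handle one element at a time here
    }
});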
I am using Jsoup to print HTML in a mobile app, and I know this is the code that prints the meta data, but I want to know exactly what buffer.append is doing.
StringBuffer buffer = new StringBuffer();
// Get meta info
Elements metaElems = doc.select("meta");
buffer.append("META DATA\r\n");
for (Element metaElem : metaElems) {
    String name = metaElem.attr("name");
    String content = metaElem.attr("content");
    buffer.append("name [" + name + "] - content [" + content + "] \r\n");
}
Always include the specific class you're talking about with your code/question; to a reader, the buffer variable could be anything.
Judging from the context of your code, you're probably talking about an instance of the Java class StringBuffer; read more about it in its documentation.
In a nutshell, it's like a String, just mutable, which means you can change its contents after it has been instantiated, as your code snippet is doing with .append(..).
Just call buffer.toString() once you're done appending things to it, and continue using the String object the buffer returns.
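For example:
StringBuffer buffer = new StringBuffer();
buffer.append("Hello");      // contents can still change after construction
buffer.append(", world");
String result = buffer.toString(); // "Hello, world" as a plain String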
I'm creating an Android application which should parse JSON from a file or URL into a JSONArray and JSONObjects.
The problem is that my JSON is 3.3 MB, and when I use simple code, something like the snippet below, I run into trouble. (I can't access my real code now because I'm at work; I copied some code from a tutorial, so there might be some errors in it. Assume I already have my InputStream content.)
InputStream content = entity.getContent();
BufferedReader reader = new BufferedReader(new InputStreamReader(content));
StringBuilder builder = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    builder.append(line);
}
String twitterfeed = builder.toString();
JSONArray jsonArray = new JSONArray(twitterfeed);
Log.i(ParseJSON.class.getName(),
        "Number of entries " + jsonArray.length());
for (int i = 0; i < jsonArray.length(); i++) {
    JSONObject jsonObject = jsonArray.getJSONObject(i);
    Log.i(ParseJSON.class.getName(), jsonObject.getString("text"));
}
When I run this code on my Android device, I get an OutOfMemory error when parsing the string into the JSONArray.
I logged some things and found that my total string is 17 MB (from a 3.3 MB JSON file?!). When I use a small JSON file, like a Twitter feed, the code works fine.
Once I have this 17 MB string in memory I can't parse the JSON, because at that point I run out of memory.
After a lot of research I found that Jackson might be my solution, because I understood it can parse an InputStream directly. That should help, because then I wouldn't need the 17 MB string in memory, which isn't the most efficient way anyhow. But I couldn't work out whether this will really work, and I didn't get it running myself.
Does anyone know whether this will really work, and where I can find a tutorial?
I found createJsonParser -- public JsonParser createJsonParser(InputStream in) -- and think this is the way to go, but I don't know how to implement it in my code and can't find an example. Does anyone know how this works?
You should use JSON streaming, either with Gson or Jackson. With Jackson you can use a hybrid approach as well. This would reduce your memory consumption significantly, as only the portion of the JSON currently being parsed is loaded into memory.
https://sites.google.com/site/gson/gson-user-guide
http://jackson.codehaus.org/
A jackson example http://www.mkyong.com/java/jackson-streaming-api-to-read-and-write-json/
The hybrid streaming approach with Jackson is a good option. That way, you can advance a pointer through your raw JSON input (validating it as you go) and, when you detect an input chunk that needs to be processed, read it into a tree hierarchy so you can pull out any data you want from it.
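A minimal sketch of that hybrid approach, using the Jackson 1.x API the question mentions (the "text" field is borrowed from the snippet above; adjust to your schema):
// Sketch only: stream through a top-level JSON array, materializing one
// element at a time as a tree so only that element is ever in memory.
JsonFactory factory = new MappingJsonFactory();
JsonParser parser = factory.createJsonParser(inputStream);
if (parser.nextToken() == JsonToken.START_ARRAY) {
    while (parser.nextToken() != JsonToken.END_ARRAY) {
        JsonNode node = parser.readValueAsTree(); // one array element
        String text = node.get("text").getTextValue();
        // process this element, then let it be garbage collected
    }
}
parser.close();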
I have an application, and I am trying to set up a fairly large SQLite database (one table with roughly 5000 rows) in it. I have built the DB classes and everything, and my app worked when I tested it on a smaller scale (400 rows), but now when I try to import the full database I get an out of memory error that I can't seem to find a way around.
The database was initially in MySQL on my web server, and I couldn't convert it for some odd reason, but I managed to generate a text file with the queries to add all 5000 rows, which is 11.5 MB in size. I have this file in my assets folder, and I am trying this to load it into my DB:
public void onHandleIntent(Intent intent) {
    DBAdapter db = new DBAdapter(getApplicationContext());
    db.open();
    try {
        InputStream is = getAssets().open("verbs_sql.txt");
        db.executeSQL(convertStreamToString(is));
    } catch (IOException e) {
        e.printStackTrace(); // at minimum, log the failure instead of swallowing it
    }
    db.close();
    // Run main activity
    Intent i = new Intent(DatabaseReceiver.this, BaseActivity.class);
    i.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_EXCLUDE_FROM_RECENTS);
    DatabaseReceiver.this.startActivity(i);
}
public static String convertStreamToString(InputStream is) throws IOException {
    Writer writer = new StringWriter();
    char[] buffer = new char[2048];
    try {
        Reader reader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
        int n;
        while ((n = reader.read(buffer)) != -1) {
            writer.write(buffer, 0, n);
        }
    } finally {
        is.close();
    }
    return writer.toString();
}
}
The out of memory error occurs at the StringWriter(), so it looks like it's loading that big file into memory. How can I solve this? I have also tried looping through all 5000 rows, but after maybe 30 seconds I hit the out of memory error again.
I had the same problem. I tried many ways to solve it and failed; when I finally found the reason, it surprised me. In my case the cause of the error was that I was printing the entire response string to Logcat. That data was huge, and it ate up heap memory.
Take care of the following:
Remove Logcat printing of bulk data.
Try to use only one JSONArray for all operations under one response (reuse it for all).
Try to avoid ArrayList usage.
Insert each item into the database as it is iterated out of the JSONArray (see the sketch below). That is, don't follow the pattern where each item is turned into an object, added to an ArrayList, and the ArrayList is passed to the DB helper, which then iterates over the objects and inserts them.
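A minimal sketch of that last point (table, column, and field names are illustrative, not from the original post):
// Sketch only: insert each element as it is read, with no intermediate list.
db.beginTransaction(); // one transaction instead of one disk sync per insert
try {
    for (int i = 0; i < jsonArray.length(); i++) {
        JSONObject item = jsonArray.getJSONObject(i);
        ContentValues values = new ContentValues();
        values.put("name", item.getString("name")); // illustrative column/field
        db.insert("items", null, values);
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}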
SQLite databases are just files; when you run thousands of inserts on the phone, you hit the SD card over and over for file access.
What you'll want to do is create the SQLite database on your desktop and include the already-created database in the app. If you need to regularly update the information in the database, you could post it on a website and have the app download it; just make sure to only do large downloads over WiFi.
Check out this Tech Talk for more information.
Edit: See this for more information on creating an SQLite database on Windows and including it in your app.
I believe you could do this in the way you want. The problem with the code you posted is that you are trying to convert the entire file to a single string; I am fairly certain that would fail even on a desktop machine.
I believe you would have better luck if you read one line at a time and executed its SQL, then read the next line. You could also reduce the size of the file by compressing it with zip.
If I can find a couple of minutes, I will attach some code.
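A minimal sketch of that line-at-a-time approach, assuming each line of verbs_sql.txt holds one complete SQL statement and that your DBAdapter exposes its underlying SQLiteDatabase (both assumptions, since neither is shown in the question):
// Sketch only: execute the dump one statement per line instead of as one
// giant string; a single transaction keeps the 5000 inserts fast.
BufferedReader reader = new BufferedReader(
        new InputStreamReader(getAssets().open("verbs_sql.txt"), "UTF-8"));
db.beginTransaction();
try {
    String line;
    while ((line = reader.readLine()) != null) {
        line = line.trim();
        if (line.length() > 0) {
            db.execSQL(line); // one statement per line
        }
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
    reader.close();
}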
I have a client / server app written for Android, and I'm using the standard org.json package classes bundled with Android to do the parsing and creating.
I've been getting weird characters appearing on the server side right in the middle of the generated JSON strings, for example (not the full string, because it's big):
{!lo":"es_MX","id":2791884,"os":"8"}
As you can see, an exclamation mark (!) appears in place of a double quote. I also get other random characters appearing mid-string. It's very bizarre.
Here is the code which creates the JSON object...
JSONObject jsonObject = new JSONObject();
jsonObject.put("key", someValue);
Here is the code which sends it:
HttpPost type = new HttpPost(<server url here>);
List<NameValuePair> params = new ArrayList<NameValuePair>();
params.add(new BasicNameValuePair("v", jsonObject.toString()));
type.setEntity(new UrlEncodedFormEntity(params, HTTP.UTF_8));
httpClient.execute(type); // This is a DefaultHttpClient
I say random, but the exclamation mark in this exact position is consistent across many of the errors, though not every time: about 5 messages get this error among the tens of thousands sent per day. And it's usually not the contents of the values inserted into the JSON that are corrupted, but the characters (such as the quote character above) that define the structure of the message, which suggests to me that this isn't a character set issue.
Has anyone come across this?
It seems you are composing the string in one character encoding and decoding the received text in another, for example ISO-8859-1 versus UTF-8.
It looks like your sender is not properly setting the character set. Spanish has symbols not present in plain ASCII or most Windows encodings; you need to use UTF-8:
Content-Type: text/html; charset=utf-8
Without knowing which HTTP library you're using, it is not possible to give you an exact code snippet to fix the problem, but it should be easy enough to figure out.
You have not given enough information. A radical way to fix your problem is to simply replace every (!) character with ("):
string.replaceAll("!", "\"");
I guess it is a server-side issue.
I had a similar problem, so let me describe my environment in more detail. My server was returning data in JSON format, but my problem was connected with special characters like ąść. You should know that in this case json_encode() will return the string from the server as null.
I know, it sucks! So I added mysql_query('SET CHARACTER SET utf8'); before my selection of items from the database. This let me fetch strings with special diacritic letters from the server.
Now, on the app side, I was fetching the data from the server with a GET request. First I stored the result in an InputStream, then wrapped it in an InputStreamReader and appended it piece by piece to a StringBuilder. The appended text was converted with toString() into the finished string, which I passed to new JSONArray(readyString). However, I discovered that some parts of the JSON text contained weird characters, especially in the places with special letters like żóć. For example "description":"aaa" came through as "descriptionPffa":"aa"null:`.
I decided to try another way of converting the result. In the places where I converted data from the server, I used the method below. When I finally had the ByteArrayOutputStream contents as a byte array, I turned it into new String(byteArray), and then it somehow worked with new JSONArray(new String(byteArray))!
public class Streams {
    // Read the entire stream into a byte array, with no charset decoding.
    public static byte[] getBytes(InputStream is) {
        int len;
        int size = 1024;
        byte[] buf = new byte[size];
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            while ((len = is.read(buf, 0, size)) != -1) {
                bos.write(buf, 0, len);
            }
            buf = bos.toByteArray();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return buf;
    }
}
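Usage is then something like this (the explicit UTF-8 charset is an assumption about what the server sends):
byte[] bytes = Streams.getBytes(inputStream);
JSONArray array = new JSONArray(new String(bytes, "UTF-8")); // decode once, explicitly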
Print the JSON on the client (using Log.d or similar) and see if it contains weird characters before sending it to the server.
I'm using a JSON framework to update the client-side data from the server. I have 30 POJO classes, and I receive the HTTP response from the server. I create the Reader object using:
InputStream instream = entity.getContent();
reader = new InputStreamReader(instream, "UNICODE");
And I pass it in to deserialize the JSON like this:
synchronizationResponse = gson.fromJson(reader, SynchronizationResponse.class);
But this line throws an OutOfMemory exception.
I wrote the response to a file and found that the file is around 12 MB.
So is there any way to split the response into multiple responses, so that I can read and write simultaneously and avoid the OOM exception?
Looking for help.
Difficult to say, as long as we do not know what your SynchronizationResponse class looks like.
Whatever your content, if you can split it into sub-sections, do so and serialize those.
You can use a JsonReader to parse a large stream of JSON without running out of memory. The details are in the documentation, but basically you create a JsonReader:
JsonReader reader = new JsonReader(new InputStreamReader(instream, "UNICODE"));
and write a recursive descent parser using reader.nextName() to get keys, reader.nextString(), reader.nextInt(), etc. to get values, and reader.beginObject(), reader.endObject(), reader.beginArray(), reader.endArray() to mark object boundaries.
This allows you to read and process one piece at a time, at any granularity, without ever holding the entire structure in memory. If the data is long-lived, you can store it in a database or a file.
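For example, a minimal sketch for a large top-level array of objects (the field names "id" and "text" are illustrative):
// Sketch only: walk the array one token at a time with android.util.JsonReader.
JsonReader reader = new JsonReader(new InputStreamReader(instream, "UNICODE"));
try {
    reader.beginArray();
    while (reader.hasNext()) {
        reader.beginObject();
        long id = -1;
        String text = null;
        while (reader.hasNext()) {
            String name = reader.nextName();
            if (name.equals("id")) {
                id = reader.nextLong();
            } else if (name.equals("text")) {
                text = reader.nextString();
            } else {
                reader.skipValue(); // ignore fields we don't need
            }
        }
        reader.endObject();
        // persist or process this one record here, then let it be collected
    }
    reader.endArray();
} finally {
    reader.close();
}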