I have a RESTful WCF service that I am using to retrieve encoded photos and display them in Android (trying to, anyway). The problem I am having is that the InputStream, or possibly something else, stops reading the characters before the end.
The response is just an XML string; I intend to parse it myself, so no need to worry about that. What I need to know is what in the following code is stopping the input stream from reading all of the characters into my buffer.
HttpEntity responseEntity = response.getEntity();
char[] buffer = new char[(int)responseEntity.getContentLength()];
InputStream stream = responseEntity.getContent();
InputStreamReader reader = new InputStreamReader(stream);
reader.read(buffer);
stream.close();
Have you implemented the HttpURLConnection class? That could be causing this behavior.
http://developer.android.com/reference/java/net/HttpURLConnection.html
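In general, a single reader.read(buffer) call is not guaranteed to fill the buffer, even when the buffer is sized from getContentLength(). Purely as an illustration (not the asker's code), a minimal sketch that keeps reading until the stream reports end of data, reusing the same Apache HttpClient objects as in the question:
// Sketch only: read() may return fewer characters than the buffer holds,
// so loop until it returns -1.
HttpEntity responseEntity = response.getEntity();
InputStreamReader reader = new InputStreamReader(responseEntity.getContent(), "UTF-8");
StringBuilder xml = new StringBuilder();
char[] buffer = new char[4096];
int n;
while ((n = reader.read(buffer)) != -1) {
    xml.append(buffer, 0, n);
}
reader.close();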
I'm having a bit of a problem with decodeStream returning null. It seems to be a fairly common problem, but it's usually pinned down to one of two causes:
An OutOfMemory exception thrown by attempting to load a large bitmap in its entirety.
Attempting to use the same input stream twice.
However, I'm not doing either. The code to run it is simply
stream = new java.net.URL(url).openStream();
Bitmap image = BitmapFactory.decodeStream(stream);
stream.close();
with the URL set to here. image is null after this code is complete. This issue's been driving me completely insane: it works fine on PNGs but seems to fall apart on every BMP I can give it, so any help would be appreciated.
Ultimately, the answer was found here, using an InputStream returned by a BufferedHttpEntity. While it seems needlessly complex, I can only assume that simply getting a stream from the URL object directly doesn't return a stream of the appropriate type, and so it wasn't reading out all the data properly.
Cross-posting the code in case the question is erased:
private static InputStream fetch(String address) throws MalformedURLException, IOException {
    HttpGet httpRequest = new HttpGet(URI.create(address));
    HttpClient httpclient = new DefaultHttpClient();
    HttpResponse response = (HttpResponse) httpclient.execute(httpRequest);
    HttpEntity entity = response.getEntity();
    BufferedHttpEntity bufHttpEntity = new BufferedHttpEntity(entity);
    InputStream instream = bufHttpEntity.getContent();
    return instream;
}
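A hedged usage sketch on top of fetch(); the BMP URL is a placeholder, not the one from the question:
// Because the entity is buffered, BitmapFactory gets the complete image data.
InputStream in = fetch("http://example.com/image.bmp");  // hypothetical URL
Bitmap image = BitmapFactory.decodeStream(in);
in.close();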
I have an API with a big JSON response (download the file) and I don't know how to parse this data. I need to:
Save this data (the entire JSON input) to a text file or database. What is the best way to do this?
Load this data from the text file or database and create a JSONArray from the JSON tag "list" (the first tag).
The solution should be fast and support Android 2.3. What do you recommend? Any ideas?
My code:
HttpClient httpClient = new DefaultHttpClient();
HttpGet httpGet = new HttpGet(urls[0]);
HttpResponse httpResponse;
httpResponse = httpClient.execute(httpGet);
HttpEntity httpEntity = httpResponse.getEntity();
... and what next? ...
FYI:
EntityUtils throws an OutOfMemoryException.
EDIT:
I tried saving the data to a file like this:
InputStream inputStream = httpEntity.getContent();
FileOutputStream output = new FileOutputStream(Globals.fileNews);
int bufferSize = 1024;
byte[] buffer = new byte[bufferSize];
int len = 0;
while ((len = inputStream.read(buffer)) != -1) {
    output.write(buffer, 0, len);
}
output.close();
inputStream.close();
And it's OK. I load the data:
FileInputStream fis = new FileInputStream(Globals.fileNews);
StringBuffer fileContent = new StringBuffer("");
byte[] buffer = new byte[1024];
int n;
// append only the bytes actually read, not the whole buffer
while ((n = fis.read(buffer)) != -1) {
    fileContent.append(new String(buffer, 0, n));
}
But how do I convert the StringBuffer to a JSONObject? fileContent.toString() is not ideal; sometimes I get an OutOfMemoryException.
First of all: drop the Apache HttpClient. Google discourages its use:
Unfortunately, Apache HTTP Client does not, which is one of the many
reasons we discourage its use.
Source: developer.android.com
A good replacement is Google Volley. You have to build the JAR yourself, but it works like a charm. For my setups I use Google Volley with an OkHttp stack and GSON requests.
In your case you would write another Request that streams the response out to the SD card chunk by chunk; you don't buffer the whole string first. Then add some logic to open an input stream from the file you wrote and hand it to your JSON deserializer. Jackson and GSON can handle streams out of the box.
Of course, everything works with Android 2.3.
Don't, I repeat, don't try to dump the whole serialized payload into a single string or similar. That's almost a guaranteed OutOfMemoryException.
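If you go the streaming route, a minimal sketch with GSON's streaming API (com.google.gson.stream.JsonReader) reading the file from the question; Item is a hypothetical model class, and everything outside the "list" array is skipped:
// Sketch only: walk the saved file as a stream, one "list" element at a time,
// so the whole document never has to fit in memory.
Gson gson = new Gson();
JsonReader reader = new JsonReader(
        new InputStreamReader(new FileInputStream(Globals.fileNews), "UTF-8"));
reader.beginObject();                              // top-level JSON object
while (reader.hasNext()) {
    if ("list".equals(reader.nextName())) {
        reader.beginArray();                       // the "list" array
        while (reader.hasNext()) {
            Item item = gson.fromJson(reader, Item.class);  // one element at a time
            // ... handle item ...
        }
        reader.endArray();
    } else {
        reader.skipValue();                        // ignore fields we don't need
    }
}
reader.endObject();
reader.close();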
I parse an XML document from an HttpResponse.
Previously I initialized the parser with a String object created from the InputStream.
When I changed the setup so that the InputStream is used directly in the parser, I get OutOfMemoryExceptions.
The strange thing is that parsing the String worked without problems before, so I wonder why the InputStream should need more memory.
Previous code:
final byte[] encodedResponseBytes = IOUtils.toByteArray(httpResponse
.getEntity().getContent());
String message = new String(encodedResponseBytes);
parser.setInput(new StringReader(message));
New code:
InputStream stream = httpResponse
.getEntity().getContent();
parser.setInput(stream, null);
After changing the code as follows, I don't have the problem anymore:
InputStream stream = request.getResponseStream();
reader = new InputStreamReader(stream);
this.xmlParser.setInput(reader);
I have a REST service I can't alter, with methods for uploading an image encoded as a Base64 string.
The problem is that the images can go up to sizes of 5-10 MB, perhaps more. When I try to construct a Base64 representation of an image of this size on the device, I get an OutOfMemory exception.
I can, however, encode chunks of bytes at a time (3000, let's say), but this is useless as I would need the whole string to create an HttpGet/HttpPost object:
DefaultHttpClient client = new DefaultHttpClient();
HttpGet httpGet = new HttpGet("www.server.com/longString");
HttpResponse response = client.execute(httpGet);
Is there a way around this?
Edit: trying to use Heiko Rupp's suggestions plus the Android docs, I get an exception ("java.io.FileNotFoundException: http://www.google.com") at the following line: InputStream in = urlConnection.getInputStream();
try {
    URL url = new URL("http://www.google.com");
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    urlConnection.setDoOutput(true);
    urlConnection.setChunkedStreamingMode(0);
    OutputStream out = new BufferedOutputStream(urlConnection.getOutputStream());
    out.write("/translate".getBytes());
    InputStream in = urlConnection.getInputStream();
    BufferedReader r = new BufferedReader(new InputStreamReader(in));
    StringBuilder total = new StringBuilder();
    String line;
    while ((line = r.readLine()) != null) {
        total.append(line);
    }
    System.out.println("response:" + total);
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Am I missing something? The GET request that I need to execute looks like this:
"http://myRESTService.com/myMethod?params=LOOONG-String", so the idea was to connect to http://myRESTService.com/myMethod and then output a few characters of the long string at a time. Is this correct?
You should try to use URLConnection instead of the Apache HTTP client, as it does not require you to hold the whole object to send in memory; instead you can do something like:
pseudocode!
HttpUrlConnection con = restUrl.getConnection();
while (!done) {
    byte[] part = base64encode(partOfImage);
    con.write(part);
    partOfImage = nextPartOfImage();
}
con.flush();
con.close();
Also, in Android after 2.2, Google recommends URLConnection over the HTTP client. See the description of DefaultHttpClient.
The other thing you may want to look into is the amount of data to be sent. 10 MB plus Base64 will take quite a while to transfer over a mobile network (even with gzip compression, which URLConnection transparently enables if the server side accepts it).
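A more concrete sketch of the streaming idea, assuming the service accepts the Base64 body as a POST; the endpoint, file path, and use of android.util.Base64 are illustrative, not the service's actual API. The chunk size is a multiple of 3 so that the per-chunk Base64 output concatenates cleanly (this assumes each read() fills the chunk except the last one, which FileInputStream normally does):
// Sketch only: push the image through Base64 chunk by chunk,
// so the full encoded string never exists in memory.
URL url = new URL("http://myRESTService.com/myMethod");       // hypothetical endpoint
HttpURLConnection con = (HttpURLConnection) url.openConnection();
con.setDoOutput(true);                  // we are sending a request body (POST)
con.setChunkedStreamingMode(0);         // stream instead of buffering the whole body
OutputStream out = new BufferedOutputStream(con.getOutputStream());

FileInputStream image = new FileInputStream("/sdcard/photo.jpg");  // hypothetical file
byte[] chunk = new byte[3000];          // multiple of 3
int len;
while ((len = image.read(chunk)) != -1) {
    out.write(Base64.encode(chunk, 0, len, Base64.NO_WRAP));
}
out.flush();
image.close();

int status = con.getResponseCode();     // actually performs the request
// ... read the response if needed ...
con.disconnect();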
You must read the docs for this REST service; no such service will require you to send data that long in a GET request. Images are always sent as a POST. POST data goes at the end of the request and can be written iteratively.
I am trying to get compressed data from a server. The guy who programmed the server told me that he uses the ZLIB library on his iPhone and gzcompress on the server. I was trying to find any suitable way to get that data, but it fails with "java.io.IOException: unknown format (magic number 9c78)" while creating the GZIPInputStream object. Finally I reached the point where I had the data as a String. It was compressed, so I used this answer to decompress it: https://stackoverflow.com/a/6963668/419308 . But that code doesn't work: in.read() returns -1 at the beginning.
Does anyone have any idea why there's a -1? Or maybe a better way to get the compressed data?
EDIT:
I tried adding the file to the project and reading from that file; in.read() didn't return -1.
EDIT2: Following jJ's answer, I tried this code:
HttpGet request = new HttpGet( urlTeam );
HttpResponse response = new DefaultHttpClient().execute( request );
HttpEntity entity = response.getEntity();
InputStream stream = AndroidHttpClient.getUngzippedContent( entity );
InputStreamReader reader = new InputStreamReader( stream );
BufferedReader buffer = new BufferedReader( reader );
StringBuilder sb = new StringBuilder();
sb.delete( 0, sb.length() );
String input;
while ( ( input = buffer.readLine() ) != null )
{
sb.append( input );
}
But the response is still compressed (or unreadable).
On Android you should use either HttpURLConnection, which handles decompression (and HTTP encoding headers) for you from Gingerbread on, or AndroidHttpClient (for older Android versions), which has helper methods like getUngzippedContent and modifyRequestToAcceptGzipResponse.
This is a good summary: Android HTTP clients
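A minimal sketch of the AndroidHttpClient route described above; urlTeam is the URL from the question and the user-agent string is just a placeholder:
// Sketch only: ask for a gzip response explicitly and let the helper unwrap it.
AndroidHttpClient client = AndroidHttpClient.newInstance("my-app");   // hypothetical UA
HttpGet request = new HttpGet(urlTeam);
AndroidHttpClient.modifyRequestToAcceptGzipResponse(request);
HttpResponse response = client.execute(request);
InputStream stream = AndroidHttpClient.getUngzippedContent(response.getEntity());
BufferedReader buffer = new BufferedReader(new InputStreamReader(stream));
StringBuilder sb = new StringBuilder();
String input;
while ((input = buffer.readLine()) != null) {
    sb.append(input);
}
client.close();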