I have an application that uploads photos through a web service. In the past, I loaded the file into a stream, converted it to Base64, and posted the resulting string through the write() method of an OutputStreamWriter. Now the web service has changed: it expects multipart/form-data, and it does not expect Base64.
So somehow I need to post the characters of this file as-is, without conversion. I'm sure I'm close, but all I ever get is a content-length underflow or overflow. The odd thing is that in the debugger I can see that my buffer length is the same as the length of the string I'm posting. Here's what I'm doing; hopefully this is enough code:
// conn is my connection
OutputStreamWriter dataStream = new OutputStreamWriter(conn.getOutputStream());
// c is my file
int bytesRead = 0;
long bytesAvailable = c.length();
while (bytesAvailable > 0) {
    byte[] buffer = new byte[Math.min(12288, (int) bytesAvailable)];
    bytesRead = fileInputStream.read(buffer, 0, Math.min(12288, (int) bytesAvailable));
    // assign the string if needed.
    if (bytesRead > 0) {
        bytesAvailable = fileInputStream.available();
        // I've tried many encoding types here.
        String sTmp = new String(buffer, "ISO-8859-1");
        // HERE'S the issue. I can't just write the buffer,
        dataStream.write(sTmp);
        dataStream.flush();
    }
}
// Yes, there's more code, but this should be enough to show why I don't know what I'm doing!
Replace

OutputStreamWriter dataStream = new OutputStreamWriter(conn.getOutputStream());

with

DataOutputStream dataStream = new DataOutputStream(conn.getOutputStream());

and call dataStream.write(buffer, 0, bytesRead); directly. An OutputStreamWriter runs everything through a character encoding, which corrupts binary data and throws the content length off; a DataOutputStream writes the bytes untouched. Let me know how it behaves.
Edit: edited answer according to comment.
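As a sketch of the whole copy loop (plain java.io; conn.getOutputStream() is stood in for by a generic OutputStream here, and the multipart boundary writing is omitted), assuming the goal is just to move the file's bytes across untranslated:

```java
import java.io.*;

public class RawCopy {
    // Copy raw bytes from in to out with no character conversion.
    // Writing only `bytesRead` bytes avoids sending stale buffer
    // contents when read() returns less than a full buffer.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[12288];
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead);
        }
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {0, 1, 2, (byte) 0xFF};
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), out);
        System.out.println(out.size()); // prints 4
    }
}
```

In the real upload, `out` would be `new DataOutputStream(conn.getOutputStream())`, with the multipart headers and closing boundary written around the copy.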
Related
I'm using HttpURLConnection in Android to upload a large image to a server.
Everything works so far, but the sending is a little bit slow.
I am using the setChunkedStreamingMode() method so my older devices do not run out of memory while uploading the image. The chunk size is currently set to 4 MB.
However, I looked into the Network Monitor of Android and noticed that I have a lot of small peaks, which is not what I expected.
As far as I understood, the OutputStream should buffer bytes until the chunk size is reached, and then send the chunk to the server as a whole. Given that, I don't understand the little peaks in the Network Monitor.
I guess I am missing something here?
Here is my code:
private final String boundary = "obzorBOUNDARYrozbo";
private final String mMimeType = "multipart/form-data;boundary=" + boundary;
private final String lineEnd = "\r\n";
private final String hyphens = "--";
private final String encoding = "utf-8";
private final String connectionType = "Keep-Alive";
...
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
// The body is sent in chunks of the defined size
connection.setChunkedStreamingMode(Settings.__SendFileChunkSize);
// Allows us to handle the response in the InputStream
connection.setDoInput(true);
// Allows us to write to the OutputStream
connection.setDoOutput(true);
// Disable the cache
connection.setUseCaches(false);
// Sets the request method to POST
connection.setRequestMethod("POST");
// Writes all Cookies written in the CookieManager to the header
connection.setRequestProperty(
"Cookie",
TextUtils.join(";", cookieManager.getCookieStore().getCookies())
);
// The connection should be kept alive
connection.setRequestProperty("Connection", connectionType);
// Set the charset to UTF-8
connection.setRequestProperty("Charset", encoding);
// The type is defined to be a multipart request
connection.setRequestProperty("Content-Type", mMimeType);
connection.setConnectTimeout(1000);
...
BufferedInputStream bis = new BufferedInputStream(
new FileInputStream(file),
Settings.__SendFileBufferSize
);
// Check if fewer than maxBufferSize bytes are remaining
int bytesAvailable = bis.available();
// Do not read more bytes than maxBufferSize
int bufferSize = Math.min(bytesAvailable, Settings.__SendFileBufferSize);
byte[] buffer = new byte[bufferSize];
int bytesRead;
while ((bytesRead = bis.read(buffer, 0, bufferSize)) > 0) {
    // Write only what read() actually returned; it may be less than bufferSize
    dos.write(buffer, 0, bytesRead);
    bytesAvailable = bis.available();
    bufferSize = Math.min(bytesAvailable, Settings.__SendFileBufferSize);
}
bis.close();
bis = null;
buffer = null;
Here are my constants for the chunk size and the buffer size:
public static final int __SendFileChunkSize = 1024 * 1024 * 4;
public static final int __SendFileBufferSize = 1024 * 1024 * 4;
The image I want to upload is ~8 MB.
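One thing worth keeping in mind: the value passed to setChunkedStreamingMode() only controls the HTTP framing, not how the bytes travel on the wire. The OS still splits each chunk into ordinary TCP segments, which is one plausible explanation for the many small peaks in the monitor. A minimal sketch (a hypothetical helper, not HttpURLConnection internals) of how a single chunk is framed in HTTP/1.1 chunked transfer encoding:

```java
import java.nio.charset.StandardCharsets;

public class ChunkFrame {
    // Frame a payload as one HTTP/1.1 chunk: the size in hex, CRLF,
    // the payload, CRLF. HttpURLConnection does this internally when
    // setChunkedStreamingMode() is active.
    static byte[] frame(byte[] payload) {
        byte[] header = (Integer.toHexString(payload.length) + "\r\n")
                .getBytes(StandardCharsets.US_ASCII);
        byte[] out = new byte[header.length + payload.length + 2];
        System.arraycopy(header, 0, out, 0, header.length);
        System.arraycopy(payload, 0, out, header.length, payload.length);
        out[out.length - 2] = '\r';
        out[out.length - 1] = '\n';
        return out;
    }

    public static void main(String[] args) {
        byte[] framed = frame("abc".getBytes(StandardCharsets.US_ASCII));
        // Content is "3\r\nabc\r\n": hex length, CRLF, data, CRLF
        System.out.println(framed.length); // prints 8
    }
}
```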
In my Android application, the user can upload a 300 KB image.
I'm going to use this (Android Asynchronous Http Client), which I think is great; WhatsApp is reportedly one of its users.
In this library I can use RequestParams (which is provided by Apache, I think) and add either a file or a string to it (lots of other types too).
Here it is:
1- Adding a file, which is my image (I think as multipart/form-data):
RequestParams params = new RequestParams();
String contentType = RequestParams.APPLICATION_OCTET_STREAM;
params.put("my_image", new File(image_file_path), contentType); // here I added my image file directly without Base64-ing it
...
client.post(url, params, responseHandler);
2- Sending as a string (so it would be Base64-encoded):
File fileName = new File(image_file_path);
InputStream inputStream = new FileInputStream(fileName);
byte[] bytes;
byte[] buffer = new byte[8192];
int bytesRead;
ByteArrayOutputStream output = new ByteArrayOutputStream();
try {
while ((bytesRead = inputStream.read(buffer)) != -1) {
output.write(buffer, 0, bytesRead);
}
} catch (IOException e) {
e.printStackTrace();
}
bytes = output.toByteArray();
String encoded_image = Base64.encodeToString(bytes, Base64.DEFAULT);
// then add it to params :
params.add("my_image",encoded_image);
// And the rest is the same as above
So my question is:
Which one is better for the sake of speed and quality?
What are the differences?
NOTE:
I've read many answers to similar questions, but none of them actually answers this question, for example This One.
I don't know whether params.put() and params.add() would change the multipart encoding.
The Base64-encoded data transfers about 33% slower, as there are about 33% more bytes to send.
What you mean by quality I do not know; the uploaded images are byte-for-byte identical to the original either way.
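The size overhead is easy to verify: Base64 maps every 3 input bytes to 4 output characters, so a 300 KB image grows to roughly 400 KB before any multipart framing. A quick check with java.util.Base64 (used here only to demonstrate the arithmetic; the library's own encoder behaves the same way):

```java
import java.util.Base64;

public class Base64Overhead {
    // Base64-encoded length for n raw bytes: 4 chars per 3-byte group,
    // with the final group padded.
    static int encodedLength(int n) {
        return 4 * ((n + 2) / 3);
    }

    public static void main(String[] args) {
        int raw = 300 * 1024; // ~300 KB image
        String encoded = Base64.getEncoder().encodeToString(new byte[raw]);
        System.out.println(encoded.length());                        // prints 409600
        System.out.println(encoded.length() == encodedLength(raw));  // prints true
    }
}
```

That is a 33.3% increase over the 307200 raw bytes, which is where the transfer-speed difference comes from.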
I receive a file using the following code:
byte[] fileBytes;
....
JSONObject postJSON = new JSONObject();
postJSON.put("file_name", filename);
postJSON.put("client_id", clientID);
HttpPost post = new HttpPost(fileURL);
StringEntity se = new StringEntity( postJSON.toString(), "UTF-8");
se.setContentType(new BasicHeader(HTTP.CONTENT_TYPE, "application/json"));
post.setEntity(se);
response = httpClient.execute(post);
fileBytes = EntityUtils.toByteArray(response.getEntity());
Using the debugger, I see that the response gets an entity 27136 bytes in length, which is the correct length of the test file, but the fileBytes array is only 11470 bytes long. Can anyone tell me why this truncation is taking place? When I try to get other files, a similar truncation takes place, so it is not a function of the specific file or a specific file length.
Using the following code, I get 11997 bytes for the same file:
StringBuilder stringBuilder = new StringBuilder("");
stringBuilder.append(EntityUtils.toString(response.getEntity()));
fileBytes = stringBuilder.toString().getBytes();
Reading from an InputStream, I get 12288 bytes:
fileBytes = new byte[1024];
InputStream inputStream = response.getEntity().getContent();
int bytesRead = 0;
while(true){
bytesRead = inputStream.read(fileBytes);
if (bytesRead <= 0)
break;
....
}
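For reference, a self-contained version of that accumulation loop (the elided part above presumably does something similar), collecting every read into a ByteArrayOutputStream so no chunk is lost:

```java
import java.io.*;

public class ReadAll {
    // Drain an InputStream completely; each read's actual byte count
    // is appended, so the result does not depend on how the stream
    // chunks its data.
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int bytesRead;
        while ((bytesRead = in.read(buf)) > 0) {
            out.write(buf, 0, bytesRead);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[27136]; // same size as the test file above
        System.out.println(readAll(new ByteArrayInputStream(data)).length); // prints 27136
    }
}
```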
Changing the encoding to UTF-16 gets me an internal server error.
I also tried the following:
InputStream inputStream = response.getEntity().getContent();
ByteArrayOutputStream bos = new ByteArrayOutputStream(
    (int) response.getEntity().getContentLength());
int getByte;
while ((getByte = inputStream.read()) != -1) {
    bos.write(getByte);
}
bos.close();
This also gave me a file of 11470 bytes.
In all cases, the files are corrupted and cannot be opened. When compared in a binary file viewer, the first 11 bytes match, and then the files diverge. I could not find any pattern in the corrupted files.
OK, the answer is apparently that all of the above are fine. The problem was with the server, which was not configuring the data stream correctly: the Content-Type was text/plain for all files, rather than application/pdf and so on as appropriate.
My first clue was when we put a text file on the server, and it came over successfully. At that point I started working with the server side, and we figured it out pretty quickly.
Bottom line, if you are working on a server/client application, the problem might not be on your side.
I should have mentioned the various posts which helped me construct the versions collected above:
including this
and this
My apologies to various other helpful people whose posts I also looked at and up-voted.
I'm trying to upload a large file (~10 MB) to a server, and realized that on 2.3.4 the stream is written to memory before being sent to the server. I confirmed this behavior by looking at a heap memory dump; because of it, large files cause an OutOfMemory exception. I don't see the same behavior on a 4.2 device.
Following is the code I'm using:
URL url = new URL(uri);
connection = (HttpsURLConnection) url.openConnection();
connection.setDoOutput(true);
connection.setRequestMethod("PUT");
connection.setRequestProperty("content-type", "");
connection.setRequestProperty("Accept-Encoding", "");
connection.setFixedLengthStreamingMode((int)totalBytes);
out = new BufferedOutputStream(connection.getOutputStream());
fis = new BufferedInputStream(new FileInputStream(file));
byte[] buffer = new byte[32 * 1024];
int bytesRead = 0;
int totalBytesRead = 0;
while ((bytesRead = fis.read(buffer)) != -1)
{
totalBytesRead = totalBytesRead + bytesRead;
out.write(buffer, 0, bytesRead);// OOM Error
}
out.flush();
I filed a bug with google-android and they claim it's fixed in the Gingerbread release, but I'm still able to reproduce the issue on 2.3.4.
Link to the bug https://code.google.com/p/android/issues/detail?id=53946 and
https://code.google.com/p/android/issues/detail?id=3164
I ended up using HttpClient for Eclair, Froyo, and Gingerbread, and HttpURLConnection for Honeycomb and above.
I'm facing an OutOfMemory exception while converting a 1.8 MB image to bytes, encrypting them, and finally converting the result into a string (the length printed in the log is 1652328). I then append this string to some XML to post, and that is where the real problem arises: appending tags to this picture string using StringBuffer or StringBuilder, or adding it to a String, throws the OutOfMemory exception. How can I resolve this issue?
For small images this issue does not occur.
The piece of code below converts the picture at path to a String.
fis = new FileInputStream(path);
// Note: available() is not guaranteed to equal the full file length
buffer = new byte[fis.available()];
try {
    // read() may return fewer bytes than requested; a loop would be safer
    fis.read(buffer, 0, buffer.length);
    String byteString =
        com.mobile.android.components.Base64.encodeBytes(buffer);
    return byteString;
} catch (IOException ex) {
    ex.printStackTrace(); // at least log instead of silently swallowing the error
}
The above byteString is appended to xml post as follows.
StringBuilder pictureName = new StringBuilder();
pictureName.append(byteString ); //here array out of bound at StringBuilder.extendBuffer
..........
appending continues
UPDATED
In the appending above, the encoded byte stream is encrypted using an AES cipher and then appended to the StringBuilder.
Call bitmap.recycle(); as soon as you have converted the bitmap to a byte array. This will free the native object associated with this bitmap, and clear the reference to the pixel data.
Update
It's obvious that the chunk of memory read from the file stream is too large to handle. Avoid reading the whole file at once; do it piece by piece, and append the string to the XML without using an intermediate string object.
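A sketch of the piece-by-piece idea using java.util.Base64 (Java 8+, Android API 26+; the com.mobile.android.components.Base64 class used above may offer similar streaming helpers): the encoder wraps an OutputStream, so only a small transfer buffer ever sits on the heap rather than the whole image.

```java
import java.io.*;
import java.util.Base64;

public class StreamEncode {
    // Base64-encode an InputStream into an OutputStream in small
    // pieces instead of materializing the whole image as one byte[]
    // or String.
    static void encode(InputStream in, OutputStream sink) throws IOException {
        try (OutputStream enc = Base64.getEncoder().wrap(sink)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                enc.write(buf, 0, n);
            }
        } // closing the wrapper flushes the final padded group
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        encode(new ByteArrayInputStream("Man".getBytes("US-ASCII")), sink);
        System.out.println(sink.toString("US-ASCII")); // prints TWFu
    }
}
```

In a real upload, `sink` would be a FileOutputStream (or the connection's output stream), so the encoded data never needs to live in a String at all.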
Update 2
You could do something like this to avoid loading the whole XML file into memory while sending it to the server.
// Allow Inputs & Outputs
connection.setDoInput(true);
connection.setDoOutput(true);
connection.setUseCaches(false);
// Enable POST method
connection.setRequestMethod("POST");
outputStream = new DataOutputStream( connection.getOutputStream() );
// Read file
bytesRead = fileInputStream.read(buffer, 0, bufferSize);
while (bytesRead > 0)
{
outputStream.write(buffer, 0, bufferSize);
bytesAvailable = fileInputStream.available();
bufferSize = Math.min(bytesAvailable, maxBufferSize);
bytesRead = fileInputStream.read(buffer, 0, bufferSize);
}
Then write the boundary characters, flush, and close the streams.
Thanks everyone for the support.
I finally optimized my code to the maximum extent by using file operations.
For encoding I used Base64.encodeFileToFile(picturePath, encodedPicturePath);
I saved the encoded image in a file.
Then, for encryption, I used a CipherOutputStream with a FileOutputStream passed to its constructor, so encryption is also done through files.
The final step: with HttpPost, I used to send the entire encrypted data as a StringEntity, which was the final hurdle for the OutOfMemoryException. I changed the StringEntity to a FileEntity. This reduced the heap consumption of my application and thus improved the overall performance and upload capacity.
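The file-to-file encryption step can be sketched with the standard javax.crypto classes (key and IV handling simplified here purely for illustration; the real key management is whatever the app already uses):

```java
import java.io.*;
import javax.crypto.Cipher;
import javax.crypto.CipherOutputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class FileCipher {
    // Stream a file through AES encryption; only one 8 KB buffer is
    // ever held in memory, regardless of the file size.
    static void encrypt(File in, File out, byte[] key, byte[] iv) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"),
                new IvParameterSpec(iv));
        try (InputStream fis = new FileInputStream(in);
             OutputStream cos = new CipherOutputStream(new FileOutputStream(out), cipher)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = fis.read(buf)) != -1) {
                cos.write(buf, 0, n);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        File plain = File.createTempFile("plain", null);
        File enc = File.createTempFile("enc", null);
        try (Writer w = new FileWriter(plain)) { w.write("hello"); }
        // Demo-only all-zero key and IV; never do this in production.
        encrypt(plain, enc, new byte[16], new byte[16]);
        System.out.println(enc.length()); // 5 bytes pad to one 16-byte AES block
    }
}
```

The encrypted file can then be handed to the request as a FileEntity, as described above.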
Note:
Don't encrypt the encoded image in chunks; that changes the overall encoded data. Do it in a single piece.
Failures:
Before I used files for encoding, I chunked the whole picture and encoded it to a file; but when I decoded the encoded file, I failed to get back the original picture.
Regards, Sha