I am sending a POST request to a web service that returns data in XML format. When I invoke the web service via the browser, I can see the complete XML file. However, when I call the web service from my Android app (in the emulator), I only get partial XML data.
The size of the XML file being sent is approximately 14,500 bytes. I first store the response in a byte array whose size I calculate using the getContentLength() function. This method returns the correct size of the response.
However, my InputStreamReader reads only about 6,350 bytes of the actual data into the byte array. The remaining part of the byte array simply contains zeros.
I tried looking everywhere online but so far I haven't come across a solution. Yes, chunking is one option, but since I don't have access to the web-service code I can't implement that option.
I'd really appreciate it if someone could guide me here!
I don't understand why you need to use InputStreamReader (which can behave very badly if you are not careful) to read web-service data, especially when the response is only around 14.5 KB. Just use HttpURLConnection: connect to your web service and it will take care of the connection and the response for you, e.g. connection.getResponseMessage().
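For what it's worth, the usual cause of a "partial" read like this is treating a single read() call as if it filled the whole buffer; read() is free to return fewer bytes than requested. A minimal sketch that loops until the stream is exhausted (the URL is a placeholder):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Reads the entire response body, however many read() calls it takes.
static byte[] readAll(InputStream in) throws IOException {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buffer = new byte[4096];
    int n;
    // A single read() may return only part of the data; loop until -1 (end of stream).
    while ((n = in.read(buffer)) != -1) {
        out.write(buffer, 0, n);
    }
    return out.toByteArray();
}

// Usage with a hypothetical endpoint:
URL url = new URL("http://example.com/service");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
byte[] xml = readAll(conn.getInputStream());
conn.disconnect();

If the reported content length is ~14,500 bytes and you only see ~6,350, a loop like this almost always recovers the rest.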
I am trying out Parse as a possible backend for my app, but I have a concern when dealing with large files that I want to upload to it.
In the documentation it says I need to convert a file to a byte array and pass that into the ParseFile, like so:
byte[] data = "Working at Parse is great!".getBytes();
ParseFile file = new ParseFile("resume.txt", data);
However, a large file can obviously throw an OutOfMemoryError here, since loading the file into a byte array loads the whole thing into memory.
So my question is: how would I go about uploading a large file to Parse while avoiding an OOME?
It is very likely your users will end up crashing the app if they have full control of the upload process, since many users don't have a solid understanding of file sizes. I noticed that Parse has a nice wrapper class in their iOS API (PFFile) which is limited to 10 MB; it's a pity they haven't implemented the same for Android.
I think you are going to have to do this manually using their REST API, more specifically the files endpoint. You can easily use an HttpURLConnection in conjunction with a FileInputStream, which gives you more flexibility over the upload process using streams. I'm usually more comfortable doing things manually than using a wrapper where it's not exactly clear what's going on behind the scenes, but if you are not, here is a minimalistic example.
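A rough sketch, assuming Parse's documented REST files endpoint and header names (verify them against the current docs before relying on this):

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Streams a file to Parse's files endpoint without ever holding the whole
// file in memory.
static void uploadFile(File file, String appId, String restKey) throws Exception {
    URL url = new URL("https://api.parse.com/1/files/" + file.getName());
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("X-Parse-Application-Id", appId);
    conn.setRequestProperty("X-Parse-REST-API-Key", restKey);
    conn.setRequestProperty("Content-Type", "application/octet-stream");
    conn.setDoOutput(true);
    conn.setFixedLengthStreamingMode((int) file.length());  // prevents internal buffering
    InputStream in = new FileInputStream(file);
    OutputStream out = conn.getOutputStream();
    try {
        byte[] buffer = new byte[8192];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);  // only 8 KB in memory at any moment
        }
    } finally {
        in.close();
        out.close();
    }
    System.out.println("HTTP " + conn.getResponseCode());
    conn.disconnect();
}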
One possible solution is to read the large file in portions and upload each portion separately. You would need to do this in such a way that you can reassemble the file when you download it again.
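A sketch of that approach, assuming the standard ParseFile(name, byte[]) API; the piece size and naming scheme are placeholders, and largeFile stands in for your own java.io.File:

import java.io.FileInputStream;
import java.util.Arrays;
import com.parse.ParseFile;

// Split a large file into 1 MB pieces and save each as its own ParseFile.
// Reassembly on download relies on the numeric suffix in each piece's name.
FileInputStream in = new FileInputStream(largeFile);
byte[] chunk = new byte[1024 * 1024];
int n, index = 0;
while ((n = in.read(chunk)) != -1) {
    byte[] piece = Arrays.copyOf(chunk, n);  // trims the final, shorter chunk
    ParseFile part = new ParseFile("bigfile.part" + index, piece);
    part.save();  // or saveInBackground() with a callback
    index++;
}
in.close();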
There is no reason you cannot use the REST API and chunk the upload using normal HTTP/1.1 headers.
You should read up on chunked transfer encoding and figure out how to segment your overall byte array in Android if it is really that big.
In Android, on the sender side, you will have to segment the data yourself, explicitly managing I/O on both ends of the streams.
You should be able to hook the output stream up directly to the HTTP request object's output stream; the input side is just a read of a smaller block of your input file, so you can conserve memory.
Set up the HTTP connection.
Connect it.
Get the output stream for the HTTP request and pass it to the stream-copy utility that handles your chunking.
HTTP request headers:
> Transfer-Encoding: chunked
> Expect: 100-continue
In the HTTP response headers from Parse, you should see something indicating a "100 Continue", i.e. that the server is waiting for more data to be sent by the chunking process behind the output stream attached to the request.
Remember to close your streams and connections when you are done with the HTTP POST.
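A minimal sketch of that flow, where url and bigFile stand in for your own connection target and input file:

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;

HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setDoOutput(true);
conn.setChunkedStreamingMode(0);  // 0 = let HttpURLConnection pick the chunk size
                                  // and set Transfer-Encoding: chunked itself
OutputStream out = conn.getOutputStream();
InputStream in = new FileInputStream(bigFile);
byte[] buffer = new byte[16 * 1024];  // only this much of the file is in memory at once
int n;
while ((n = in.read(buffer)) != -1) {
    out.write(buffer, 0, n);
}
out.close();  // close streams when done, as noted above
in.close();
int status = conn.getResponseCode();  // reading the response completes the POST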
Note: if you have big files, you should really question what the big size is actually buying you. All types of media files allow you to trim the size prior to upload.
I send a request off to our server and receive a response from the server. The response contains information related to an image; the server returns the image as a Base64 string. It is taking way too long (30-40 seconds per image) to parse out the data so I can pull the information I need to save and view the image. I am looking for suggestions on better ways that might help to speed up the process. I am currently using a SAX parser, and the slowdown is when the .parse function is called.
I am polling my web service and receiving a response which is very long (it's supposed to be JSON), but because it's too long, the system truncates the end of it, and I receive JSON that looks like this:
{"object":["one","two", ....
with three dots instead of the correct JSON ending, which is of course no longer valid JSON.
There is nothing I can do about the length of the JSON. Is there anything I can do to receive all of it?
I think you are confusing what you can print in debug output, which can be truncated by Android's logging, with what you actually receive from the server. Either write the response out to a file (on the SD card) or just print out the length of the received data.
But there is almost no chance that an HTTP response is truncated in transit; think about downloading a huge file of several MB. The exception would be something very wrong on the server side, like a bad Content-Length.
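One detail worth knowing: logcat truncates long messages at roughly 4,000 characters, which produces exactly the "three dots" effect described above. A quick sketch for checking what you actually received (response stands in for your own string):

import android.util.Log;

String body = response;  // the full string you received from the server
Log.d("WS", "response length = " + body.length());  // trust this, not the printed text
// Print the body in slices small enough that logcat won't cut them off.
for (int i = 0; i < body.length(); i += 4000) {
    Log.d("WS", body.substring(i, Math.min(body.length(), i + 4000)));
}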
I have an issue with data coming from the internet: a payload that arrives with the content type application/octet-stream.
My question is: how do I decode it into readable values (such as 1, 4.321, -2.6013, etc.)? Or is there an application that can do this?
For example:
3ecccccd is 0.400 (see http://gregstoll.dyndns.org/~gregstoll/floattohex/), so some values are real numbers; 07d0 is 2.000.
HTTP's Content-Encoding header only applies to how data is encoded inside the HTTP message itself, not to how the data is encoded outside of HTTP. There is no Content-Encoding header present in your example, so HTTP is not encoding the data in any way; it is giving you the raw data as-is.
When HTTP's Content-Type header is application/octet-stream, that means the data (after decoding it based on the Content-Encoding, if any) is raw 8-bit data; the sender does not know the actual type of the data. Without a more meaningful Content-Type to tell you what the data actually represents, the only thing you can do is analyze the raw bytes and make educated guesses about what they might be, unless you know through other means what the data is supposed to be.
Usually, binary data formats have a header/signature at the front of the data to identify what the data is, so you might start with that.
Until you can identify the data type, you cannot know which bytes represent what kind of values, which endianness is used for multi-byte values, etc. In short, you need more information about the data you are downloading in order to know how to process it.
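Once you do know the layout, for instance that a field is a four-byte big-endian IEEE 754 float as in the asker's 3ecccccd example, decoding it in Java is straightforward:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// The four bytes 3e cc cc cd, read as a big-endian IEEE 754 float, give 0.4.
byte[] raw = { 0x3e, (byte) 0xcc, (byte) 0xcc, (byte) 0xcd };
float value = ByteBuffer.wrap(raw).order(ByteOrder.BIG_ENDIAN).getFloat();
System.out.println(value);  // prints 0.4

// Equivalent, starting from the hex representation:
float same = Float.intBitsToFloat(0x3ECCCCCD);  // also 0.4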
That content type does not indicate any encoding. All it means is that the data represents a stream of (8-bit) bytes. How they're encoded is an entirely separate matter. To discover the encoding, you can look at other headers (such as Content-Encoding), or else the encoding might be implicit, in which case you'll need to consult documentation. Then choose an appropriate decoding strategy based on that.
I'm trying to send an image as a Base64-encoded string to my PHP script via HttpGet, but as I kind of expected, I get a 414 Request-URI Too Long from my server.
Is there a way to post large strings with HttpGet?
Any help is greatly appreciated.
The URI limit depends on server settings, and it's not a good idea to send huge data via the GET method. And no, you can't use POST on a GET-only service.
The best option would be to alter your web service to receive POST requests; then you can send data as long as you want.
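For illustration, a minimal sketch of such a POST from Android, assuming a hypothetical PHP endpoint and form-parameter name:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

// Send the Base64 string in the POST body instead of the URL.
// The endpoint and the "image" parameter name are placeholders.
URL url = new URL("http://example.com/upload.php");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setDoOutput(true);
conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
String body = "image=" + URLEncoder.encode(base64Image, "UTF-8");
OutputStream out = conn.getOutputStream();
out.write(body.getBytes("UTF-8"));
out.close();
int status = conn.getResponseCode();  // no 414 here: the body has no URI length limit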
When the URI is too large (because you are using it to send an image...), all you can do is try to make it smaller by compressing the image, or, if you have access to the server, increase the limit....
One way or the other, use HTTP POST instead of HTTP GET.
You won't have the problem of the limited size (or if there is a limit, it's way bigger than the one for GET), and I can't believe that sending an image via HTTP GET is the usage intended by HTTP.
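If you also go the compression route mentioned above, a sketch using Android's Bitmap API; the quality setting (70 here) and the file path are placeholders to tune yourself:

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Base64;
import java.io.ByteArrayOutputStream;

// Recompress the image before Base64-encoding it, shrinking the payload.
Bitmap bitmap = BitmapFactory.decodeFile("/sdcard/photo.jpg");
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 70, baos);
String base64Image = Base64.encodeToString(baos.toByteArray(), Base64.NO_WRAP);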