I have an issue with data coming from the internet: a packet whose Content-Type is application/octet-stream.
My question is: how do I decode it into readable values (such as 1, 4.321, -2.6013, etc.)? Or is there an application that can do this?
For example:
3ecccccd is 0.400 (see http://gregstoll.dyndns.org/~gregstoll/floattohex/); some of the values are real numbers. 07d0 is 2.000.
HTTP's Content-Encoding header only applies to how data is encoded inside of the HTTP message itself, not how the data itself is encoded outside of HTTP. There is no Content-Encoding header present in your example, so HTTP is not encoding the data in any way, it is giving you the raw data as-is.
When HTTP's Content-Type header is application/octet-stream, that means the data (after decoding it based on the Content-Encoding, if any) is raw 8-bit data; the sender does not know what the actual type of the data is. Without a more meaningful Content-Type to tell you what the data actually represents, the only thing you can do is analyze the raw data and make educated guesses about what it might be, unless you know through other means what the data is supposed to be.
Usually, binary data formats have a header/signature at the front of the data to identify what the data is, so you might start with that.
Until you can identify the data type, you cannot know which bytes represent what kind of values, what endian is used for multi-byte values, etc. In short, you need more information about the data you are downloading in order to know how to process it.
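To see why the assumed type and byte order matter, here is a minimal Java sketch that reads the four example bytes from the question two different ways (the byte values come from the question; the interpretations are illustrative):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class InspectBytes {
    public static void main(String[] args) {
        // The four bytes 3E CC CC CD from the question.
        byte[] raw = {(byte) 0x3E, (byte) 0xCC, (byte) 0xCC, (byte) 0xCD};

        // Read as a big-endian IEEE 754 float.
        float f = ByteBuffer.wrap(raw).order(ByteOrder.BIG_ENDIAN).getFloat();

        // The same bytes read as a big-endian 32-bit integer give a
        // completely different value, which is why you must know the
        // intended type and endianness in advance.
        int i = ByteBuffer.wrap(raw).order(ByteOrder.BIG_ENDIAN).getInt();

        System.out.println(f); // 0.4
        System.out.println(i); // 1053609165
    }
}
```

The same four bytes are equally valid as a float, an int, part of a string, and so on; only out-of-band knowledge of the format tells you which reading is correct.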
That content type does not indicate any encoding. All it means is that the data represents a stream of (8-bit) bytes. How they're encoded is an entirely separate matter. To discover the encoding, you can look at other headers (such as Content-Encoding), or else the encoding might be implicit, in which case you'll need to consult documentation. Then choose an appropriate decoding strategy based on that.
Related
I want to be able to transfer blocks of my file instead of the complete file from my app, in order to transfer it more efficiently. What is the best way to do that?
Update
Here is an example of where I would need this approach: say I have a 4 GB file which I am trying to upload. If the network fails, the upload will stop and I will have to start from scratch. Instead, if I keep track of the blocks that have already been transferred, I can continue from the blocks that have yet to be transferred. This is especially important on flaky network connections.
May I know what kind of file you have? I am asking because, if it is something other than a text file, then I think you need a μ-law or A-law codec (G.711), or G.729.
I figured out one approach to do this. What you could do is divide the file into blocks using the approach mentioned here - How to break a file into pieces using Java? and convert each block into a Base64 string and then transfer the encoded string. The server responds with the chunk number after it receives it. This chunk number can then be saved on the device locally and the next chunk can then be sent to the server. After all the chunks have been transferred, the Base64 decoding can be done on the server side and the files can be merged.
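A sketch of the splitting-and-encoding step described above (the class name and chunk size are my own choices, not from the linked answer):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Base64;
import java.util.List;

public class ChunkEncoder {
    // Chunk size is a multiple of 3 so every chunk encodes to Base64
    // without padding; the server can then decode each chunk and simply
    // concatenate the results to rebuild the file.
    static final int CHUNK_SIZE = 3 * 256 * 1024; // 768 KiB, arbitrary

    static List<String> encodeChunks(String path) throws IOException {
        List<String> chunks = new ArrayList<>();
        try (InputStream in = new FileInputStream(path)) {
            byte[] buf = new byte[CHUNK_SIZE];
            int n;
            while ((n = readBlock(in, buf)) > 0) {
                chunks.add(Base64.getEncoder().encodeToString(Arrays.copyOf(buf, n)));
            }
        }
        return chunks;
    }

    // Fill the buffer as far as possible; a single read() may return early.
    static int readBlock(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n < 0) break;
            off += n;
        }
        return off;
    }
}
```

In a real app, each string in the returned list would be sent to the server one at a time, with the chunk index saved locally after each acknowledgement.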
I am trying out Parse for a possible backend for my app but I have a concern when dealing with large files that I want to upload to it.
In the documentation it says I need to convert a file to a byte array and pass that into the ParseFile, like so:
byte[] data = "Working at Parse is great!".getBytes();
ParseFile file = new ParseFile("resume.txt", data);
However, a large file can obviously cause an OutOfMemoryError here, since loading the file into a byte array loads the whole thing into memory.
So my question is: how would I go about uploading a large file to Parse while avoiding an OutOfMemoryError?
It is very likely your users will end up crashing the app if they have full control of the upload process, since many users don't have a solid understanding of file sizes. I noticed that Parse has a nice wrapper class in their iOS API (PFFile) which is limited to 10 MB; it's a pity they haven't implemented the same for Android.
I think you are going to have to do this manually using their REST API. Have a look at their REST API, more specifically the files endpoint. You can easily use an HttpURLConnection in conjunction with a FileInputStream, which gives you more flexibility over the upload process using streams. I'm usually more comfortable doing stuff manually rather than using a wrapper where it's not exactly clear what's going on behind the scenes, but if you are not, I can post a minimalistic example.
One possible solution is to read in portions of the large file and upload each portion separately. You would need to do this in such a way that you can reassemble the file when you download it again.
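A rough sketch of that idea, assuming some transport for each portion (the BlockSink interface, block size, and resume logic are hypothetical; a real implementation would POST each block and persist the last index the server acknowledged):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;

public class ResumableUpload {
    // Hypothetical transport: a real app would POST the block together
    // with its index so the server can reassemble the file in order.
    interface BlockSink {
        void send(int blockIndex, byte[] block) throws IOException;
    }

    // Re-reads the file from the start but only re-sends blocks the
    // server has not yet acknowledged (resumeFrom = first unsent index).
    static void upload(String path, int blockSize, int resumeFrom, BlockSink sink)
            throws IOException {
        try (InputStream in = new FileInputStream(path)) {
            byte[] buf = new byte[blockSize];
            int index = 0, n;
            while ((n = readBlock(in, buf)) > 0) {
                if (index >= resumeFrom) {
                    sink.send(index, Arrays.copyOf(buf, n));
                }
                index++;
            }
        }
    }

    // Fill the buffer as far as possible; a single read() may return early.
    static int readBlock(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n < 0) break;
            off += n;
        }
        return off;
    }
}
```

After a network failure, the app restarts `upload` with `resumeFrom` set to the last acknowledged index plus one, so only the remaining blocks go over the wire.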
There is no reason you cannot use the REST API and chunk the upload using normal HTTP/1.1 headers.
You should read up on chunked transfer encoding and figure out how to segment your overall byte array in Android, if it is really that big.
On the sender side in Android, you will have to segment the data yourself, explicitly managing I/O on both ends where there will be streams.
You should be able to hook the output stream up directly to the HTTP request object's output stream. The input side is just a read of a smaller block of your input file, so you can conserve memory.
Set up the HTTP connection.
Connect it.
Get the output stream for the HTTP connection request and pass that to the stream-copy utility that handles your chunking.
HTTP req headers:
> Transfer-Encoding: chunked
> Expect: 100-continue
In the HTTP response headers from Parse you should see
something indicating "100 Continue": the server is waiting for more data to be sent by the chunking process behind the output stream attached to the request.
Remember to CLOSE streams/connections when done with the http POST.
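Put together, a minimal sketch of that flow with HttpURLConnection (the URL is a placeholder, not a real Parse endpoint, and error handling is omitted):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkedPost {
    // POST a file with Transfer-Encoding: chunked so the whole file is
    // never held in memory at once.
    static int post(String urlString, String filePath) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        // 8 KiB chunks; the connection adds the Transfer-Encoding: chunked
        // header itself. Must be called before connecting.
        conn.setChunkedStreamingMode(8 * 1024);
        try (OutputStream out = conn.getOutputStream();
             InputStream in = new FileInputStream(filePath)) {
            byte[] buf = new byte[8 * 1024];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n); // each write is streamed to the socket
            }
        }
        int code = conn.getResponseCode();
        conn.disconnect();
        return code;
    }
}
```

The try-with-resources block takes care of closing the streams, which is the "remember to CLOSE" step above; `disconnect()` releases the connection itself.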
Note: if you have big files, you really should question whether you need the full size at all. All types of media files allow you to trim the size prior to upload.
I am writing an Android app which will need to exchange XML data with an HTTP server. I wonder which would be the better approach: send the whole file via POST, or read all the text from the file into a String and then send that String via POST? Is there any difference? If yes, which is the better option?
I would strongly recommend using POST. While sending the file content using GET is theoretically possible, in some cases you may encounter problems with URLs over 2000 characters in length. The RFC imposes no strict limit, but some clients and servers impose their own. Look at this question for more details.
With POST this wouldn't apply, and you can send (almost) any amount of data. To send the file, you would still need to read its content and send it in the POST body, though. Again, in practice most servers will not accept more than just under 2 GB, but that's a separate issue.
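As for whether it matters if you send the file directly or go through a String first: both end up as the same bytes in the POST body, provided the String round-trip uses the file's actual charset. A small sketch (the XML snippet is made up):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class BodyEquivalence {
    public static void main(String[] args) {
        // Pretend these bytes were read from the XML file on disk.
        byte[] fileBytes = "<data><value>1</value></data>".getBytes(StandardCharsets.UTF_8);

        // Approach 1: send the file's bytes directly.
        byte[] direct = fileBytes;

        // Approach 2: decode into a String, then re-encode for the POST
        // body. Identical on the wire *only* if the charset matches the
        // file's real encoding (UTF-8 here).
        String asString = new String(fileBytes, StandardCharsets.UTF_8);
        byte[] viaString = asString.getBytes(StandardCharsets.UTF_8);

        System.out.println(Arrays.equals(direct, viaString)); // true
    }
}
```

The practical difference is memory: streaming the file avoids holding a second full copy of the content as a String, which matters for large files.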
I'm building an Android client for a web service that accepts POST data. We're on the fence about which format to choose for the POST data. To me, the easiest way is to send it in URL-encoded format, but the server-side developer thinks JSON is better.
What are the pros and cons of using UrlEncoded / jsonEncoded / bsonEncoded format?
I would avoid XML-encoded data, but what about the others?
The answer to your question greatly depends on what kind of data you're going to send. If your data is mostly string values, numbers and the like, probably JSON would be your best solution.
Avoid URL-encoded data; use multipart instead. It takes a bit more work, but it's more secure (URL-encoded data is visible in the server logs) and you can send large files (images?) easily.
If you are sending maps (sets of key-value pairs) and arrays, JSON is probably the easiest to work with from a developer standpoint on both client and server. If you instead need to optimize bandwidth usage for large sets of non-media data, protobuf works well.
I am sending a POST request to a web service which returns data in XML format. When I invoke the web service via the browser I can see the complete XML file. However, when I call the web service from my Android app (in the emulator) I only get partial XML data.
The size of the XML file being sent is approximately 14500 bytes. I first store the response in a byte array whose size I calculate using the getContentLength() function. This method returns the correct size of the response.
However, my InputStreamReader reads approximately 6350 bytes of the actual data to the byte array. The remaining part of the byte array simply contains zeros.
I tried looking everywhere online, but so far I haven't come across a solution. Yes, chunking is one option, but since I don't have access to the web-service code I can't implement it.
Really appreciate if someone could guide me here!
I don't understand why you need to use an InputStreamReader (which can behave very wrongly if you are not careful, since it decodes bytes into characters) to read web-service data; in your case it is only around 14.5 KB. Just use HttpURLConnection, connect to your web service, and it will take care of the connection for you, including the response via connection.getResponseMessage().
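The symptom in the question (about 6350 of 14500 bytes, the rest zeros) is the classic sign of a different bug: a single read() call on a network stream returns only the bytes that have arrived so far, not the whole response, so you must loop until end-of-file. A minimal sketch of reading the full body:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFully {
    // read() on a socket stream may return after any number of bytes;
    // keep reading until it returns -1 to get the complete response.
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

Passing the InputStream from the HttpURLConnection through a loop like this (instead of a single read into a pre-sized array) should yield all 14500 bytes.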