How to decode a very large image on Android?

I am decoding an image (ARGB_8888) that is 8000x8000 pixels, so in uncompressed form it reaches
(1 + 1 + 1 + 1) * (8000 * 8000) = 256 MB
and crashes the app. I don't want to subsample the image, because I am converting the bitmap to a byte array and sending it in a PUT request. I have tried "decodeRegion", but I don't know how to stitch the data (i.e. byte arrays) back together, since each piece has header info at the start and just concatenating them doesn't help.

Use an HTTP client library that allows you to upload from a file or stream, so that you do not need to decode the image and hold it in memory. OkHttp has options for this; see its recipes for streaming a POST request, POSTing a file, and multipart POSTs. Those techniques should be adaptable to a PUT request.
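For example, a minimal sketch of a file-backed PUT with OkHttp (the URL, file path, and content type are assumptions):

OkHttpClient client = new OkHttpClient();

// RequestBody.create streams the file from disk, so the image is never
// decoded and memory use stays flat regardless of file size.
RequestBody body = RequestBody.create(
        MediaType.parse("image/jpeg"),      // assumption: match your image type
        new File("/path/to/image.jpg"));    // hypothetical path

Request request = new Request.Builder()
        .url("https://example.com/upload")  // hypothetical endpoint
        .put(body)
        .build();

try (Response response = client.newCall(request).execute()) {
    // check response.isSuccessful() before assuming the upload worked
}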

Why are you reading in a large image, decoding it, and then posting the byte array? That's the wrong way to do it.
If your API actually requires the decoded bytes, fix it. More likely it wants the file's raw data, in which case you just need to use any networking API that gives you an OutputStream, and read in the file's data 1 MB at a time, reading it from the File's InputStream and writing it to the socket's OutputStream.
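A minimal sketch of that copy loop, assuming an HttpURLConnection named conn and a File named imageFile:

conn.setRequestMethod("PUT");
conn.setDoOutput(true);
conn.setChunkedStreamingMode(0); // stream the body instead of buffering it all

try (InputStream in = new FileInputStream(imageFile);
     OutputStream out = conn.getOutputStream()) {
    byte[] buffer = new byte[1024 * 1024]; // 1 MB at a time, as described above
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}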

Related

Conversion of RAW10 (images/videos) to RGB in Android Studio

There are a few problems I am listing here.
I am using an Omnivision image sensor to get raw video and images. I have to convert the raw image to a bitmap, or the video to MJPEG.
I got the data via a Uri, then an InputStream, then a byte[] of size N x 1, where I got about a million values. I am not sure whether this is the right way to get the image. Then I tried to decode it using imcodes. I bitwise-shifted and added the values, but it took a lot of time and the app crashed. Instead, I reshaped the array into M x N and tried to display it as a bitmap, but the bitmap came out null. I tried demosaicing, but could not get it to work. I also tried decoding it as a bitmap directly, and the app crashed again.
Is there any way I could stream it directly in Android Studio? I need to convert this raw video into MJPEG format. I tried to stream it in Python just like a webcam, which gave a "can't grab frame" error and something to do with MSMF.
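For reference, the bit-shifting described above is typically the RAW10 unpacking step, where every 4 pixels are packed into 5 bytes. A minimal sketch, assuming a tightly packed buffer (the method name is hypothetical, and the result is still Bayer data that needs demosaicing before it can be viewed as RGB):

// RAW10 packs 4 pixels into 5 bytes: the first 4 bytes hold each pixel's
// high 8 bits, and the 5th byte holds the 2 low bits of all four pixels.
static short[] unpackRaw10(byte[] raw, int width, int height) {
    short[] pixels = new short[width * height];
    int out = 0;
    for (int in = 0; in + 4 < raw.length && out + 3 < pixels.length; in += 5) {
        int lowBits = raw[in + 4] & 0xFF;
        for (int i = 0; i < 4; i++) {
            int high = raw[in + i] & 0xFF;
            pixels[out++] = (short) ((high << 2) | ((lowBits >> (2 * i)) & 0x03));
        }
    }
    return pixels;
}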

NSData bytes difference and Java bytes difference

I need to send a .jpeg file to the server, and I convert the .jpeg to NSData like imagebytesss = UIImageJPEGRepresentation(anotherimg, 0.75); but Android sends the raw bytes taken directly from the stream, and those bytes differ from what iOS (NSData) produces.
Does anyone know the difference between NSData bytes and Android's, or how to convert my .jpeg to bytes in iOS the same way Java does?
Thanks,
UIImageJPEGRepresentation re-encodes the image with the specified compression quality, but as far as I understand you want to upload the binary image data unchanged.
To achieve this you could create an NSData object from a local URL like so:
let data = try? Data(contentsOf: url)
Swift 4 example with a local JPG file named 'coffee.jpg' in the main bundle:
if let url = Bundle.main.url(forResource: "coffee", withExtension: "jpg") {
    if let data = try? Data(contentsOf: url) {
        // upload it
    }
}
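For comparison, the Android side typically reads the file's raw bytes without re-encoding, so both platforms end up sending identical data. A minimal sketch in Java (the file path is a hypothetical example):

File file = new File(context.getFilesDir(), "photo.jpg"); // hypothetical path
byte[] imageBytes = new byte[(int) file.length()];
try (FileInputStream in = new FileInputStream(file)) {
    // Loop until the array is full; a single read() may return fewer bytes.
    int off = 0;
    while (off < imageBytes.length) {
        int n = in.read(imageBytes, off, imageBytes.length - off);
        if (n == -1) break; // unexpected end of file
        off += n;
    }
}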

Streaming upload of a base64 image using retrofit

I have an upstream server that accepts image submissions using rest. The submitted image is part of a JSON payload similar to this one
{
"name": "Blah.jpg",
"uploader": "user1",
"image": "<base64.....>"
}
Using this strategy works for small images but generates Out of Memory errors on larger images.
Is it possible to stream the base64 component of the image? Pass in something like an iterator that will be used to read chunks of the image, base64-encode them, and send them directly to the network?
Not with Gson or Moshi. Both of these libraries require strings to be in memory before emitting them to a stream.
I solved this with the following, in a class that extends okhttp3.RequestBody:
private void writeFile(File file, BufferedSink sink) throws IOException {
    // 3000 is a multiple of 3, so each full chunk Base64-encodes with no
    // '=' padding and the concatenated chunks remain valid Base64.
    byte[] buf = new byte[3000];
    try (FileInputStream fin = new FileInputStream(file)) {
        int len;
        while ((len = fin.read(buf)) > 0) {
            // Encode only the bytes actually read, so a short final read
            // does not append stale buffer contents.
            sink.write(Base64.encodeBase64(Arrays.copyOf(buf, len)));
        }
    }
}
It uses Apache Commons' org.apache.commons.codec.binary.Base64 (originally Android's android.util.Base64; see the edit below) to encode each buffered chunk of data.
I ended up writing the other JSON fields separately, with enough granularity that I could insert the file record exactly where I needed to.
EDIT:
As noted above, I ended up switching to Apache commons-codec, via compile 'commons-codec:commons-codec:1.5' in my build.gradle file.
I didn't have time to investigate why the Android SDK solution didn't work. I tried their Base64.encode(buf, Base64.NO_WRAP) as suggested elsewhere - supposedly equivalent to Apache Commons' encodeBase64(byte[]) - but this did not work, hence the switch.
The problem could have been on our backend, so don't rule out Android SDK's solution based on my post alone - I just wanted to add this note so readers can see the code snippet that actually worked for me in the end.
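For readers who want to see how the pieces fit together, here is a hedged sketch of such a RequestBody subclass. The class name and constructor are hypothetical; the JSON field names match the example payload above, and real code should JSON-escape the string fields:

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Arrays;
import okhttp3.MediaType;
import okhttp3.RequestBody;
import okio.BufferedSink;
import org.apache.commons.codec.binary.Base64;

public class StreamingJsonBody extends RequestBody {
    private final File imageFile;
    private final String name;
    private final String uploader;

    public StreamingJsonBody(File imageFile, String name, String uploader) {
        this.imageFile = imageFile;
        this.name = name;
        this.uploader = uploader;
    }

    @Override
    public MediaType contentType() {
        return MediaType.parse("application/json; charset=utf-8");
    }

    @Override
    public void writeTo(BufferedSink sink) throws IOException {
        // Write the JSON by hand so the image bytes never sit in memory at once.
        // Caution: name and uploader should be JSON-escaped in real code.
        sink.writeUtf8("{\"name\":\"" + name + "\",");
        sink.writeUtf8("\"uploader\":\"" + uploader + "\",");
        sink.writeUtf8("\"image\":\"");
        writeFile(imageFile, sink); // the chunked Base64 writer shown above
        sink.writeUtf8("\"}");
    }

    private void writeFile(File file, BufferedSink sink) throws IOException {
        byte[] buf = new byte[3000]; // multiple of 3: no mid-stream '=' padding
        try (FileInputStream fin = new FileInputStream(file)) {
            int len;
            while ((len = fin.read(buf)) > 0) {
                sink.write(Base64.encodeBase64(Arrays.copyOf(buf, len)));
            }
        }
    }
}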

Android : What is the difference between converting Bitmap to byte array and Bitmap.compress?

I am uploading an image (JPEG) from an Android phone to a server. I tried these two methods -
Method 1 :
int bytes = bitmap.getByteCount();
ByteBuffer byteBuffer = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(byteBuffer);
byte[] byteArray = byteBuffer.array();
outputStream.write(byteArray, 0, bytes);
Method 2 :
bitmap.compress(Bitmap.CompressFormat.JPEG,100,outputStream);
In method 1, I am converting the bitmap to a byte array and writing it to the stream. In method 2, I have called the compress function BUT given the quality as 100 (which means no loss, I guess).
I expected both to give the same result, BUT the results are very different. On the server, the following happened -
Method 1 (the uploaded file on the server):
A file of 3.8 MB was uploaded to the server. The uploaded file is unrecognizable and does not open in any image viewer.
Method 2 (the uploaded file on the server):
A JPEG file of 415 KB was uploaded to the server. The uploaded file was in JPEG format.
What is the difference between the two methods? How did the size differ so much even though I gave the compression quality as 100? Also, why was the file not recognizable by any image viewer in method 1?
I expected both to give the same result.
I have no idea why.
What is the difference between the two methods?
The second approach creates a JPEG file. The first one does not. The first one merely makes a copy of the bytes that form the decoded image to the supplied buffer. It does not do so in any particular file format, let alone JPEG.
How did the size differ so much even though I gave the compression quality as 100?
Because the first approach applies no compression. 100 for JPEG quality does not mean "not compressed".
Also why was the file not recognizable by any image viewer in method 1?
Because the bytes copied to the buffer are not being written in any particular file format, and certainly not JPEG. That buffer is not designed to be written to disk. Rather, that buffer is designed to be used only to re-create the bitmap later on (e.g., for a bitmap passed over IPC).
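To illustrate that last point, a minimal sketch of what the Method 1 buffer is actually good for: re-creating the Bitmap in-process, with no file format involved (bitmap is assumed to be the decoded image):

ByteBuffer buffer = ByteBuffer.allocate(bitmap.getByteCount());
bitmap.copyPixelsToBuffer(buffer);
buffer.rewind(); // copyPixelsToBuffer advances the position; reset before reading

// The raw pixels can rebuild an equivalent Bitmap, but they are not a JPEG or PNG.
Bitmap copy = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(),
        bitmap.getConfig());
copy.copyPixelsFromBuffer(buffer);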

Android change image size before sending via HTTP [duplicate]

This question already has answers here:
Get the image as Thumbnail
(2 answers)
Closed 9 years ago.
My app currently sends images from an Android device to a PHP script by converting the image into a byte array and then encoding it as base64. The base64 string is then sent in an HTTP request.
The problem is that if the image is big (like the ones taken by the Android camera), the transfer fails. What I want to do is change the image size before it goes through the conversion process.
How can I do this? I've tried to Google it but have had no luck so far.
If your image is big, you first need to scale it down to a smaller size, then encode it with the Base64 class, and then send it to your server.
To scale your image, read this: http://developer.sonymobile.com/2011/06/27/how-to-scale-images-for-your-android-application
or another post.
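A minimal sketch of the scaling step (the target width is an arbitrary assumption):

Bitmap original = BitmapFactory.decodeFile(imagePath); // imagePath is assumed
int targetWidth = 800; // assumption: pick whatever size your server needs
int targetHeight = original.getHeight() * targetWidth / original.getWidth();
Bitmap scaled = Bitmap.createScaledBitmap(original, targetWidth, targetHeight, true);
// Compress and Base64-encode `scaled` instead of the full-size bitmap.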
Use JPEG compression!
('Cause I'm not sure how sending up a base64-encoded byte array is going to save you any space.)
May I jump in and assume you've got a stage where you've converted your image into an array of pixels? If not, I'll assume there's no reason why the obvious conversion from bytes to integers representing pixels wouldn't apply. Then we'll convert it to a compressed JPEG.
final int[] pixels = yourpixels;
You'll also need width and height:
final int width = theWidth; etc...
Next, get hold of your output stream in your client:
final HttpURLConnection connection = doWhateverYouDoToOpenYourConnection();
final OutputStream httpOutputStream = connection.getOutputStream();
Now the crucial step is to use the compression methods of Android's bitmap library to stream the compressed image onto the HTTP output stream:
final Bitmap androidBitmap = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
androidBitmap.compress(Bitmap.CompressFormat.JPEG, YOUR_QUALITY_INT, httpOutputStream);
Start with something like YOUR_QUALITY_INT = 85 to see a significant reduction in file size without much visible degradation.
If this fails, create a scaled bitmap from a scale matrix (see the Bitmap.createBitmap documentation). This reduces the width and height of your bitmap on creation, which obviously reduces the request size.
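A minimal sketch of that fallback, assuming you simply halve each dimension:

final Matrix matrix = new Matrix();
matrix.postScale(0.5f, 0.5f); // assumption: halve width and height
final Bitmap smaller = Bitmap.createBitmap(androidBitmap, 0, 0,
        androidBitmap.getWidth(), androidBitmap.getHeight(), matrix, true);
// Compress `smaller` instead of `androidBitmap` to shrink the request further.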
Hope this helps.
