I'm playing with sending some images over a Bluetooth link to an embedded device. The device speaks PNG and decodes the images itself; however, some of the messages are getting too large to transfer over the wire at a reasonable speed / data size.
I'm currently using bitmap.compress(Bitmap.CompressFormat.PNG, 0, <ByteArrayOutputStream>) to get the byte array for the compressed data.
Is there any way I can make Bitmap.compress() be lossy with the colors? And for that matter, would Bitmap.compress() ever use a 256-colour mode on its own, given an image with an appropriate number of colors?
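(As far as I know, Bitmap.compress() will not pick a 256-colour palette on its own; what you can do is reduce colour depth yourself before compressing. A minimal sketch, assuming the source bitmap is already in memory:)

import android.graphics.Bitmap;
import java.io.ByteArrayOutputStream;

static byte[] toLossyPng(Bitmap original) {
    // Drop color precision first: RGB_565 keeps only 16 bits per pixel,
    // which often shrinks the resulting PNG considerably.
    Bitmap reduced = original.copy(Bitmap.Config.RGB_565, false);
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    reduced.compress(Bitmap.CompressFormat.PNG, 0, bos); // quality arg is ignored for PNG
    return bos.toByteArray();
}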
I need to send a picture from my app to the server, but before I send it I need to make some small modifications, like rotating it. My problem is knowing what quality to save it at before sending it to the server. I tried
bitmap.compress(CompressFormat.JPEG, 100, fileoutputstream);
file size before: 1.5 MB, file size after: 3 MB :(
so it seems the file is now bigger than before, and I'm not sure the quality is any better (the image was already compressed). Now, just doing
bitmap.compress(CompressFormat.JPEG, 80, fileoutputstream);
file size before: 1.5 MB, file size after: 0.3 MB
so surely with a quality of 80 I will get a lower-quality image than before. So what quality does Android use by default when a user takes a picture and saves it to the gallery? And how do I save the image with the same quality as before, without losing anything?
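(For reference, a minimal sketch of the rotate-then-recompress step being discussed; the quality value 85 is an arbitrary example, not a recommendation:)

import android.graphics.Bitmap;
import android.graphics.Matrix;
import java.io.FileOutputStream;
import java.io.IOException;

static void rotateAndSave(Bitmap src, String path) throws IOException {
    Matrix m = new Matrix();
    m.postRotate(90); // the small modification: rotate by 90 degrees
    Bitmap rotated = Bitmap.createBitmap(src, 0, 0, src.getWidth(), src.getHeight(), m, true);
    FileOutputStream fos = new FileOutputStream(path);
    rotated.compress(Bitmap.CompressFormat.JPEG, 85, fos); // recompression is lossy regardless of the value
    fos.close();
}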
For an app like Instagram/500px, what is an acceptable image quality to set before sending the picture to the server? 50? 75? 80? 100?
Every save to JPEG costs you some quality loss. Even if you use "100% quality", some lossy compression will occur.
How do I save the image with the same quality as before, without losing anything?
You can't save a JPEG without quality loss. Only very specialized software can do some tricks without recompression, and with very limited features, like rotation by 90 degrees.
so surely with a quality of 80 I will get a lower-quality image than before
So what quality does Android use by default?
The JPEG standard does not define quality in percent at all. If you use "80%" in one program, it can be the same as "50%" or "90%" in another; these are just numbers fed to a particular encoder.
JPEG and PNG are both compressed formats, but only JPEG is lossy: even a quality setting of 100 cannot guarantee a pixel-perfect copy.
For JPEG, in my case, I often use 90 for high quality and 70 for low quality. In OpenCV, the default value for CV_IMWRITE_JPEG_QUALITY is 95.
Hope this is helpful.
I am trying to write a content provider in order to send an image to another process via an ImageView.setImageURI() call. The Bitmap I want to send is in memory, and for performance reasons I would prefer to avoid saving it to disk, and even to avoid encoding it, since encoding to PNG can take up to 2-3 seconds and wouldn't make much sense anyway, given that the other end just decodes it again.
So, I am able to send the image as PNG correctly using a PipeHelper, but I cannot send it "raw" (an Android Bitmap as a byte array taken directly from memory). Is there any option to do that? Am I passing the wrong MIME type? Why does setImageBitmap() work with raw bitmaps while setImageURI() doesn't?
I know I could use a Parcelable Bitmap, but I want to overcome Parcelable's size constraints. Finally, is there any encoding faster than PNG that supports alpha, if the final size is not an issue?
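(As an aside: if both ends are under your control, raw pixels can be moved without any image encoder at all. A sketch, assuming the dimensions and pixel config are transmitted out of band; toRawBytes/fromRawBytes are made-up helper names:)

import android.graphics.Bitmap;
import java.nio.ByteBuffer;

// Sender: copy the raw pixels out of the Bitmap, no encoder involved.
static byte[] toRawBytes(Bitmap bmp) {
    ByteBuffer buf = ByteBuffer.allocate(bmp.getRowBytes() * bmp.getHeight());
    bmp.copyPixelsToBuffer(buf);
    return buf.array();
}

// Receiver: width, height and config must be sent alongside the pixels.
static Bitmap fromRawBytes(byte[] raw, int width, int height, Bitmap.Config config) {
    Bitmap bmp = Bitmap.createBitmap(width, height, config);
    bmp.copyPixelsFromBuffer(ByteBuffer.wrap(raw));
    return bmp;
}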
I am trying to use data from an Android picture. I do not like the JPEG format, since eventually I will use gray-scale data. The YUV format is fine with me, since the first part of it is the gray-scale plane.
From the Android development tutorial:
public final void takePicture (Camera.ShutterCallback shutter,
Camera.PictureCallback raw, Camera.PictureCallback postview,
Camera.PictureCallback jpeg)
Added in API level 5
Triggers an asynchronous image capture. The camera service will
initiate a series of callbacks to the application as the image capture
progresses. The shutter callback occurs after the image is captured.
This can be used to trigger a sound to let the user know that image
has been captured. The raw callback occurs when the raw image data is
available (NOTE: the data will be null if there is no raw image
callback buffer available or the raw image callback buffer is not
large enough to hold the raw image). The postview callback occurs when
a scaled, fully processed postview image is available (NOTE: not all
hardware supports this). The jpeg callback occurs when the compressed
image is available. If the application does not need a particular
callback, a null can be passed instead of a callback method.
It talks about "the raw image data". However, I can find no information anywhere about the format of this raw image data.
Do you have any idea about that?
I want to get the gray-scale data of the picture taken by the camera, with the data located in phone memory, so that no time is spent writing/reading image files or converting between image formats. Or maybe I have to sacrifice something to get it?
After some search, I think I found the answer:
From the Android tutorial:
"The raw callback occurs when the raw image data is available (NOTE:
the data will be null if there is no raw image callback buffer
available or the raw image callback buffer is not large enough to hold
the raw image)."
See this link (2011/05/10): Android: Raw image callback supported devices
Not all devices support the raw PictureCallback.
https://groups.google.com/forum/?fromgroups=#!topic/android-developers/ZRkeoCD2uyc (2009)
Google employee Dave Sparks said:
"The original intent was to return an uncompressed RGB565 frame, but
this proved to be impractical. " "I am inclined to deprecate that API
entirely and replace it with hooks for native signal processing. "
Many people report a similar problem. See:
http://code.google.com/p/android/issues/detail?id=10910
Since many image-processing algorithms are based on gray-scale images, I am hoping for gray-scale raw data in memory, produced by Android for each picture.
You may have some luck with getSupportedPictureFormats(). If it lists some YUV format, you can use setPictureFormat() and the desired resolution, and counterintuitively you will get the uncompressed high-quality image in the jpeg callback, from which the gray scale (a.k.a. luminance) can easily be extracted.
Most devices will only list JPEG as a valid choice, because they perform compression in hardware, on the camera side. Note that the data transfer from the camera to application RAM is often the bottleneck; if you can use the Stagefright hardware JPEG decoder, you will actually get the result faster.
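(A sketch of that approach, assuming the device actually reports an NV21 picture format, which many don't; requestNv21/extractLuminance are made-up helper names:)

import android.graphics.ImageFormat;
import android.hardware.Camera;
import java.util.Arrays;
import java.util.List;

// Ask for a YUV picture format if the device supports one.
static void requestNv21(Camera camera) {
    Camera.Parameters params = camera.getParameters();
    List<Integer> formats = params.getSupportedPictureFormats();
    if (formats.contains(ImageFormat.NV21)) {
        params.setPictureFormat(ImageFormat.NV21);
        camera.setParameters(params);
    }
}

// In NV21 the Y (luminance) plane comes first, one byte per pixel,
// so the gray-scale image is simply the first width*height bytes.
static byte[] extractLuminance(byte[] nv21, int width, int height) {
    return Arrays.copyOfRange(nv21, 0, width * height);
}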
The biggest problem with the raw callback is that on many phones developers get nothing returned at all.
If you are satisfied with just the YUV array, your camera-preview SurfaceView can implement Camera.PreviewCallback, and you can add the onPreviewFrame() method to your class. This method gives you direct access to the YUV array for every frame; you can grab it whenever you choose.
EDIT: I should specify that I was assuming you were building a custom camera application, in which you extend SurfaceView for a custom camera-preview surface. To follow my advice you will need to build a custom camera. If you need something quick, though, I suggest building a new bitmap from the JPEG data and implementing the gray-scale conversion yourself.
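(For illustration, a minimal sketch of wiring up that preview callback, assuming a custom preview SurfaceView whose SurfaceHolder is already available; error handling omitted:)

import android.hardware.Camera;
import android.view.SurfaceHolder;
import java.io.IOException;

static void startPreview(Camera camera, SurfaceHolder holder) throws IOException {
    camera.setPreviewDisplay(holder);
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // 'data' is the YUV array (NV21 by default on most devices)
        }
    });
    camera.startPreview();
}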
We need to downsample an image received from an InputStream. It is an image fetched from some URL, and it can be either pretty small or very large. To fit the image in memory we have to downsample it. First we retrieve the image size with the help of inJustDecodeBounds and calculate the necessary sample. Then we create the downsampled bitmap by passing this sample in BitmapFactory.Options.inSampleSize. This two-step decoding needs two calls to decodeStream() and works just fine.
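(For reference, that two-step decode looks roughly like this when the source can simply be reopened, e.g. a file; maxDim is a hypothetical target dimension:)

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.FileInputStream;
import java.io.IOException;

static Bitmap decodeSampled(String path, int maxDim) throws IOException {
    // Pass 1: read only the bounds, no pixel allocation.
    BitmapFactory.Options bounds = new BitmapFactory.Options();
    bounds.inJustDecodeBounds = true;
    FileInputStream in = new FileInputStream(path);
    BitmapFactory.decodeStream(in, null, bounds);
    in.close();

    // Largest power of two that keeps both dimensions at or above maxDim.
    int sample = 1;
    while (bounds.outWidth / (sample * 2) >= maxDim
            && bounds.outHeight / (sample * 2) >= maxDim) {
        sample *= 2;
    }

    // Pass 2: decode for real, at 1/sample of the original size.
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inSampleSize = sample;
    in = new FileInputStream(path);
    Bitmap bmp = BitmapFactory.decodeStream(in, null, opts);
    in.close();
    return bmp;
}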
This works just fine for files on the SD card, but in our case the input stream cannot be reset, so we can't call decodeStream() twice. Cloning the input stream is also not an option because of its potentially huge size. Alternatively, we could make two HTTP requests to the same URL: one to get the image size, and one to decode the actual image with downsampling, but this solution seems rather ugly.
Can we reuse a stream that cannot be reset? Or please suggest some known workarounds for this problem.
If you want to reuse the stream, it obviously must be saved to either RAM or the SD card, because a network InputStream (let's imagine it is not buffered) does not keep the downloaded data.
So the workaround, as said before, is to save the image directly to the SD card (maybe in some temp directory), since the image could be really huge.
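(A sketch of that temp-file workaround, spooling into the app's cache directory; spoolToCache is a made-up helper name:)

import android.content.Context;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Spool the network stream into the cache dir, so the two-pass
// decode can then reopen the file as many times as it needs.
static File spoolToCache(Context context, InputStream in) throws IOException {
    File tmp = File.createTempFile("img", null, context.getCacheDir());
    OutputStream out = new FileOutputStream(tmp);
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n);
    }
    out.close();
    return tmp; // caller should delete() this after decoding
}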
I would like to capture an image with the Android camera, but because the image may contain sensitive data I don't want it saved to the phone or SD card. Instead I would like a (compressed) Base64 string which would be sent to the server immediately.
In PhoneGap it seems files are saved to various places automatically.
Natively, I was never able to get the image stream: in onJpegPictureTaken() the byte[] parameter was always null.
Can anyone suggest a way?
See Camera.PreviewCallback.onPreviewFrame() and YuvImage.compressToJpeg() to get a byte array you can convert into a bitmap.
Note that YuvImage.compressToJpeg() is only available in SDK 8 or later, I think. For earlier versions you'll need to implement your own YUV decoder; there are several examples around, or I could provide you one.
Those two methods will let you get a camera picture in memory without ever persisting it to the SD card. Beware that bitmaps at most camera preview sizes will chew up memory quickly; you'll need to be careful to recycle the bitmaps, and you'll probably also have to scale them down a bit to do much with them while still fitting inside the native heap restrictions on most devices.
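(A sketch of that flow; it assumes the default NV21 preview format and android.util.Base64, which is also API 8+. InMemoryUploader and the quality value 80 are illustrative choices:)

import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.util.Base64;
import java.io.ByteArrayOutputStream;

class InMemoryUploader implements Camera.PreviewCallback {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        // Wrap the raw preview frame; assumes the default NV21 format.
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, jpeg);
        String base64 = Base64.encodeToString(jpeg.toByteArray(), Base64.NO_WRAP);
        // send 'base64' to the server here; nothing is written to disk
    }
}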
Good luck!