Android Universal Image Loader custom ImageDecoder with CMYK support

Background
We have just switched to the UIL library, which seems fantastic. Unfortunately we have to support CMYK images (against our will), so we have attempted to modify the existing ImageDecoder, BaseImageDecoder.
The code for this can be found here: http://pastebin.com/NqbSr0w3
We had an existing AsyncTask (http://pastebin.com/5aq6QrRd) that used an ImageMagick wrapper described in this SO post (Convert Image byte[] from CMYK to RGB?). This worked fine in our previous setup.
The Problem
The current decoder fails to load the cached image from the file system and this results in a decoding error. We have looked through the source code and believe we are using the right functions. We also thought that adding our extra level of decoding at this point in the process was ideal, as the image may have been resized and stored on the file system.
File cachedImageFile = ImageLoader.getInstance().getDiscCache().get(decodingInfo.getImageUri());
if (!cachedImageFile.exists()) {
    Log.v("App", "FILE DOES NOT EXIST");
    return null;
}
The check above always reports that the file does not exist.
The Question
Are we wrong to process our CMYK images at this point, and if not, why can't we get the image from the disc cache on the file system?
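For illustration only (not from the original post): assuming UIL 1.9.x, where ImageDecodingInfo exposes the downloader that the library has already wired up for the request, one way to get at the raw bytes inside a custom decoder is to read the stream the decoder is handed rather than probing the disc cache for a File directly. The helper name readImageBytes and the wrapper class are made up for illustration; getDownloader(), getImageUri(), and getExtraForDownloader() are the accessors BaseImageDecoder itself uses in that version.

import com.nostra13.universalimageloader.core.decode.ImageDecodingInfo;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical helper for a custom BaseImageDecoder subclass (UIL 1.9.x assumed):
// read the image bytes from the stream UIL resolves for this request (disc cache,
// file or network, depending on configuration) instead of asking the disc cache
// for a File that may not have been written yet.
public final class DecoderStreamHelper {

    static byte[] readImageBytes(ImageDecodingInfo decodingInfo) throws IOException {
        InputStream in = decodingInfo.getDownloader()
                .getStream(decodingInfo.getImageUri(), decodingInfo.getExtraForDownloader());
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            // These bytes can then be handed to the CMYK -> RGB conversion step.
            return out.toByteArray();
        } finally {
            in.close();
        }
    }
}

If the file really is expected to be in the disc cache, it is also worth checking that disc caching is enabled in the DisplayImageOptions used for the request (cacheOnDisc(true)); without it the disc cache entry is never written.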

Related

Why doesn't Glide rotate an image according to EXIF when loading it into an ImageView as a ByteArray?

I am using the Glide library on Android to load a JPG image into an ImageView. First I convert it to a ByteArray, and then I use the following code:
GlideApp.with(context)
    .load(selectedImageByteArray)
    .into(image_view)
However, when the selected image's EXIF orientation is "Rotate 270 CW", Glide does not rotate the image unless I use the following code:
GlideApp.with(context)
    .load(selectedImagePath)
    .into(image_view)
This way I pass the selected image path instead of a ByteArray. Why does this happen?
I attach an example (even here on Stack Overflow it is not rotated):
The short answer is that the data that interface uses (the EXIF orientation) is stored in the file, not in the image data itself, and the interface that reads it takes a path and nothing else. That is kind of annoying because, depending on the API level, a path is not always available; in some of the newer versions of Android it is not easy to get the actual path of the file. There are many Stack Overflow posts about this.
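Not part of the original answer, just a sketch under the same assumption (a file path is available): the orientation tag can be read with android.media.ExifInterface and the decoded bitmap rotated to match. The helper name rotateAccordingToExif is made up for illustration.

import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.media.ExifInterface;

import java.io.IOException;

// Hypothetical helper: read the orientation tag from the file path (the interface
// mentioned in the answer wants a path) and rotate the already-decoded bitmap to match.
class ExifRotation {

    static Bitmap rotateAccordingToExif(Bitmap bitmap, String imagePath) throws IOException {
        ExifInterface exif = new ExifInterface(imagePath);
        int orientation = exif.getAttributeInt(
                ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);

        int degrees;
        switch (orientation) {
            case ExifInterface.ORIENTATION_ROTATE_90:  degrees = 90;  break;
            case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
            case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
            default: return bitmap; // no rotation needed
        }

        Matrix matrix = new Matrix();
        matrix.postRotate(degrees);
        return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
    }
}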

Jpeg image from Android camera cannot be loaded as Qt Pixmap

In my Android application, I used an ImageReader surface to capture the image from the camera and dump it to a file. Here is the relevant code:
void saveImage(Image img) {
    // JPEG images from ImageReader have a single plane containing the encoded bytes
    ByteBuffer buf = img.getPlanes()[0].getBuffer();
    buf.rewind();
    byte[] data = new byte[buf.remaining()];
    buf.get(data);
    saveToFile(data);
}
The generated file, when viewed through avplay, seems to display the image just fine. However, when I load the same content using Qt's QPixmap::loadFromData on Ubuntu, the method fails. The errors I get are:
Corrupt JPEG data: premature end of data segment
Invalid JPEG file structure: two SOI markers
I am wondering if anyone has any insight into how to overcome this problem. I am not sure whether it is a problem with the Android MediaCodec class or whether the JPEG library Qt uses internally has a bug. Regards.

Activity automatically closes during image processing (Android)

I am developing an application that includes image processing (grayscale, B/W filter, object detection, color adjustment, level adjustment).
As you know, new mobile phones take high-quality images at large sizes. Due to memory limitations it is difficult to do the image processing, and OutOfMemoryError occurs frequently, so I've moved the image processing completely from the Java layer to JNI as follows:
The Mat is loaded in JNI from the source file path.
Processing...
The result Mat is stored on the SD card.
The result image is loaded into an inSampleSize-downsampled bitmap as a preview.
OutOfMemoryError no longer occurs with this approach, but sometimes, when an image has very large dimensions, the Activity closes automatically during processing without any exception, and the cause is not shown when debugging.
Why is this happening? And how can I fix it?
Excuse me for my English.
There are two things you could try:
First, check whether the problem happens while doing the image processing itself. If so, try to offload it from the main thread and do it in a separate thread (maybe an AsyncTask to start with).
Out-of-memory issues also happen easily when you set high-resolution images (bitmaps) on an ImageView. If you are doing this, you should lower the resolution before setting the image on the ImageView; a standard approach is sketched below.
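As a rough illustration of that second point (not from the original answer): the usual approach is the two-pass inJustDecodeBounds / inSampleSize decode. The helper name decodeSampledBitmap is illustrative.

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

class SampledDecode {

    // Decode a file into a bitmap no larger than roughly reqWidth x reqHeight,
    // using the standard two-pass inJustDecodeBounds / inSampleSize technique.
    static Bitmap decodeSampledBitmap(String path, int reqWidth, int reqHeight) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;          // first pass: read dimensions only
        BitmapFactory.decodeFile(path, options);

        int inSampleSize = 1;
        while (options.outWidth / (inSampleSize * 2) >= reqWidth
                && options.outHeight / (inSampleSize * 2) >= reqHeight) {
            inSampleSize *= 2;                      // power-of-two downsampling
        }

        options.inJustDecodeBounds = false;         // second pass: decode for real
        options.inSampleSize = inSampleSize;
        return BitmapFactory.decodeFile(path, options);
    }
}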

How to detect if JPG is in RGB (or CMYK) format?

I need a (really fast) way to check whether a JPG file is in RGB format (or any other format that Android can show).
At the moment I only find out whether a JPG file can be shown when I try to convert it to a Bitmap using BitmapFactory.
I think this should not be the fastest way, so I tried to get the information using ExifInterface. Unfortunately, ExifInterface (from Android) does not have any tag that indicates whether the JPG can be shown in Android (a color space tag or something similar).
So I think I have two options:
1) A fast way to get a bitmap from the JPG: any tips on how to do it?
2) Or try to read the EXIF tags myself, but without adding any other library to the project: I don't have any idea how to do that!
OK, so I did some looking around and I may have a solution for you, but it may require a little work. The link below is a pure Java library that I think you can use in your project, or at least modify and include a few classes from. I have not worked with it yet, but it looks like it will work:
http://commons.apache.org/proper/commons-imaging
final ImageInfo imageInfo = Imaging.getImageInfo(file);
if (imageInfo.getColorType() == ImageInfo.COLOR_TYPE_CMYK) {
    // CMYK JPEG - Android's BitmapFactory cannot decode this directly
} else {
    // RGB / grayscale - safe to decode as usual
}
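Not from the original answer, but since option 2 in the question asks about reading the header without an extra library: a JPEG's SOF segment records the number of color components (4 usually means CMYK or Adobe YCCK, 3 means YCbCr/RGB, 1 means grayscale), so a small parser can answer the question without decoding any pixels. A rough sketch, with an illustrative helper name:

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

class JpegColorCheck {

    // Walk the JPEG segment headers until the first SOF marker and read its
    // component count. 4 components almost always means CMYK (or YCCK).
    static boolean isProbablyCmyk(String path) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream(path));
        try {
            if (in.readUnsignedByte() != 0xFF || in.readUnsignedByte() != 0xD8) {
                throw new IOException("Not a JPEG file");
            }
            while (true) {
                if (in.readUnsignedByte() != 0xFF) continue;   // keep scanning for a marker prefix
                int type = in.readUnsignedByte();
                while (type == 0xFF) type = in.readUnsignedByte(); // skip fill bytes
                if (type == 0x01 || type == 0xD8 || type == 0xD9
                        || (type >= 0xD0 && type <= 0xD7)) {
                    continue;                                   // standalone markers, no length field
                }
                int length = in.readUnsignedShort();            // includes the two length bytes
                boolean isSof = (type >= 0xC0 && type <= 0xCF)
                        && type != 0xC4 && type != 0xC8 && type != 0xCC;
                if (isSof) {
                    in.skipBytes(5);                            // precision + height + width
                    return in.readUnsignedByte() == 4;          // number of components
                }
                in.skipBytes(length - 2);                       // jump to the next segment
            }
        } finally {
            in.close();
        }
    }
}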

Android: difference between BitmapFactory.decodeResource() and BitmapFactory.decodeFile() result?

I'm noticing a crash (in an external native library that does some image processing) when I pass it the pixel data returned from bitmap.getPixels().
If I package the image in the app, in the drawables folder and load the Bitmap with
BitmapFactory.decodeResource()
then grab the pixel data with
bitmap.getPixels()
there's no crash, and everything works as expected. However, if I load the same image from the file system with
BitmapFactory.decodeFile()
then grab the pixels with
bitmap.getPixels()
and hand that off, the native lib crashes.
Is there a difference between the way these two calls process the image into a Bitmap?
Reading the Android sources, there is one interesting difference: decodeFile() may call a different native bitmap decoder if the passed file is an asset, while decodeResource() never does this:
if (is instanceof AssetManager.AssetInputStream) {
    bm = nativeDecodeAsset(((AssetManager.AssetInputStream) is).getAssetInt(),
            outPadding, opts);
}
However, the crash is most likely a bug in your native code. Messing up the stack frame with bad pointers and/or buffer overruns typically results in weird crashes like this. Check all the native code that runs before the crash and see if you can spot any memory issues like that.
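One hypothetical sanity check (not from the original answer) before handing pixels to the native library: log the bitmap's config and size from both decode paths and normalize to ARGB_8888, so any difference between the resource and file variants shows up before the native code runs. The helper name pixelsForNative is made up for illustration.

import android.graphics.Bitmap;
import android.util.Log;

class NativePixelCheck {

    static int[] pixelsForNative(Bitmap bitmap) {
        // Log what each decode path actually produced.
        Log.d("BitmapCheck", "config=" + bitmap.getConfig()
                + " size=" + bitmap.getWidth() + "x" + bitmap.getHeight());

        // getPixels always returns ARGB ints, but copy to ARGB_8888 anyway so both
        // paths hand the native side a bitmap with the same config.
        Bitmap argb = bitmap.getConfig() == Bitmap.Config.ARGB_8888
                ? bitmap
                : bitmap.copy(Bitmap.Config.ARGB_8888, false);

        int[] pixels = new int[argb.getWidth() * argb.getHeight()];
        argb.getPixels(pixels, 0, argb.getWidth(), 0, 0, argb.getWidth(), argb.getHeight());
        return pixels;
    }
}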
