I am currently working on an Android application that should be able to detect a QR code in a picture taken with the camera. It is not necessary to decode the QR code, because it is only needed to calibrate the camera.
I am using OpenCV, and when I try to detect the QR code in the original downloaded picture of the QR code, it works fine. This is the code I used:
bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.qrcodemitzeugs);
Mat img = new Mat();
Utils.bitmapToMat(bitmap, img);

Mat points = new Mat();
QRCodeDetector detector = new QRCodeDetector();
boolean data = detector.detect(img, points);
But when I try the same code on a photo taken with my smartphone's camera, the QR code doesn't get detected. I searched for a solution and found that the contrast might not be high enough, so I converted the picture to binary using the following code:
Imgproc.cvtColor(img, img2, Imgproc.COLOR_BGR2GRAY);
Imgproc.threshold(img2, img2, 100, 255, Imgproc.THRESH_BINARY);
This returned the whole image in black and white, but the QR code still wasn't detected.
Did I do something wrong or is there a solution for this problem?
One of the images I used (I had to resize the picture before uploading it).
I solved my problem by resizing the Mat to a maximum of 1200x1200. Apparently the OpenCV QRCodeDetector can only handle Mats between roughly 85x85 and roughly 1200x1200 pixels, in which the QR code itself has to measure at least about 80x80 pixels. I tested this with the image of the original QR code, which had a size of 600x600, and resized it until the QR code was no longer detected.
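In code, the downscaling step looks roughly like this (the size limits come from my own tests, not from the OpenCV documentation):

// Downscale so the longest side is at most 1200 px before detection.
Mat img = new Mat();
Utils.bitmapToMat(bitmap, img);

int maxSide = Math.max(img.cols(), img.rows());
if (maxSide > 1200) {
    double scale = 1200.0 / maxSide;
    Imgproc.resize(img, img, new Size(), scale, scale, Imgproc.INTER_AREA);
}

QRCodeDetector detector = new QRCodeDetector();
Mat points = new Mat();
boolean found = detector.detect(img, points);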
I need a (really fast) way to check if a JPG file is in RGB format (or any other format that Android can show).
At the moment, I only find out whether a JPG file can be shown when I try to convert it to a Bitmap using BitmapFactory.
I don't think this is the fastest way, so I tried to get the information using ExifInterface. Unfortunately, Android's ExifInterface does not expose any tag that indicates whether the JPG can be shown in Android (a color space tag or something similar).
So I think I have two options:
1) Find a fast way to get a Bitmap from a JPG: any tips on how to do that?
2) Or try to read the Exif tags myself, without adding another library to the project: I have no idea how to do that!
OK, so I did some looking around and I may have a solution for you, but it may require a little work. The link below is a pure Java library that I think you can use in your project, or at least modify and include a few classes from. I have not worked with it yet, but it looks like it will work.
http://commons.apache.org/proper/commons-imaging
final ImageInfo imageInfo = Imaging.getImageInfo(file);

if (imageInfo.getColorType() == ImageInfo.COLOR_TYPE_CMYK) {
    // CMYK JPEGs typically cannot be decoded by BitmapFactory
} else {
    // any other color type should be fine to show
}
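Untested, but a self-contained check could look roughly like this (the helper name isLikelyDisplayable is mine, and treating only CMYK as undisplayable is an assumption):

import java.io.File;
import java.io.IOException;
import org.apache.commons.imaging.ImageInfo;
import org.apache.commons.imaging.ImageReadException;
import org.apache.commons.imaging.Imaging;

public final class JpegColorCheck {

    // Returns true if the JPEG's color type is one BitmapFactory should decode.
    public static boolean isLikelyDisplayable(File jpegFile) {
        try {
            ImageInfo info = Imaging.getImageInfo(jpegFile);
            return info.getColorType() != ImageInfo.COLOR_TYPE_CMYK;
        } catch (ImageReadException | IOException e) {
            // Metadata could not be read; be pessimistic and report "not displayable".
            return false;
        }
    }
}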
It seems to be the simplest thing in the world: taking a picture within your Android app using the default camera activity. However, there are many pitfalls, covered in several posts across Stack Overflow and the web, such as null Intents being passed back, the orientation of the picture being wrong, or OutOfMemoryErrors.
I'm looking for a solution that allows me to
start the camera activity via the camera intent,
retrieve the Uri of the photo, and
retrieve the correct orientation of the photo.
Moreover, I would like to avoid a device-configuration-specific implementation (manufacturer, model, OS version) as far as possible. So I'm wondering: what is the best way to achieve this?
UPDATE: January 2nd, 2014:
I tried really hard to avoid implementing different strategies based on the device manufacturer. Unfortunately, I did not get around it. Going through hundreds of posts and talking to several developers, nobody found a solution that works on all devices without implementing device-manufacturer-specific code.
After I posted my solution here on Stack Overflow, some developers asked me to publish my code on GitHub. So here it is now: AndroidCameraUtil on GitHub
The code was successfully tested on a wide variety of devices with Android API-Level >= 8. For a complete list, please see the Readme file on GitHub.
The CameraIntentHelperActivity provides the main functionality, which is described in more detail below.
Calling the default camera activity:
for Samsung and Sony devices: I call the camera activity with startActivityForResult. I only set the constant CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE and do NOT set any other intent extras.
for all other devices: I call the camera activity with startActivityForResult as before, but this time I additionally set the intent extra MediaStore.EXTRA_OUTPUT and provide a URI where I want the image to be stored.
In both cases I remember the time the camera activity was started; a minimal sketch of both calls follows below.
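The branching looks roughly like this (field and constant names here are illustrative, not the exact code from AndroidCameraUtil):

private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 1;

private Uri cameraPicUri;
private long cameraStartTime;

private void startCameraIntent() {
    String manufacturer = Build.MANUFACTURER.toLowerCase(Locale.ENGLISH);
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);

    // Only non-Samsung/non-Sony devices get a pre-defined output Uri.
    if (!manufacturer.contains("samsung") && !manufacturer.contains("sony")) {
        ContentValues values = new ContentValues();
        values.put(MediaStore.Images.Media.TITLE, System.currentTimeMillis() + ".jpg");
        cameraPicUri = getContentResolver()
                .insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
        intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
    }

    // Remember when the camera was started so the MediaStore result can be validated later.
    cameraStartTime = System.currentTimeMillis();
    startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);
}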
On camera activity result:
MediaStore: First, I try to read the photo that was captured from the MediaStore. Using a managedQuery on the MediaStore content, I retrieve the latest image taken, as well as its orientation property and its timestamp. If I find an image and it was not taken before the camera intent was called, it is the image I was looking for (a rough sketch of this query follows after this list). Otherwise, I dismiss the result and try one of the following approaches.
Intent extra: Second, I try to get an image Uri from intent.getData() of the returning intent. If this is not successful either, I continue with step 3.
Default photo Uri: If none of the above-mentioned steps worked, I use the image Uri I passed to the camera activity.
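The MediaStore lookup could look roughly like this (column selection and sort order are my assumptions, and ContentResolver.query stands in for managedQuery):

private Uri queryLatestPhoto(long cameraStartTime) {
    String[] projection = {
            MediaStore.Images.Media._ID,
            MediaStore.Images.Media.ORIENTATION,
            MediaStore.Images.Media.DATE_TAKEN
    };
    Cursor cursor = getContentResolver().query(
            MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
            projection, null, null,
            MediaStore.Images.Media.DATE_TAKEN + " DESC");
    if (cursor == null) {
        return null;
    }
    try {
        if (cursor.moveToFirst()) {
            long dateTaken = cursor.getLong(
                    cursor.getColumnIndexOrThrow(MediaStore.Images.Media.DATE_TAKEN));
            // Only accept the image if it was taken after the camera intent was started.
            if (dateTaken >= cameraStartTime) {
                long id = cursor.getLong(
                        cursor.getColumnIndexOrThrow(MediaStore.Images.Media._ID));
                return ContentUris.withAppendedId(
                        MediaStore.Images.Media.EXTERNAL_CONTENT_URI, id);
            }
        }
        return null;
    } finally {
        cursor.close();
    }
}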
At this point, I have retrieved the photo Uri and its orientation, which I pass to my UploadPhotoActivity.
Image processing
Please take a close look at my BitmapHelper class. It is based on the code described in detail in that tutorial.
Moreover, the shrinkBitmap method also rotates the image if required based on the orientation information extracted earlier.
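Conceptually, the rotation part boils down to something like this (the helper below is an illustration, not the actual BitmapHelper code):

// Rotates a bitmap by the orientation read from the MediaStore/Exif data.
static Bitmap rotateIfRequired(Bitmap bitmap, int rotateDegrees) {
    if (rotateDegrees == 0) {
        return bitmap;
    }
    Matrix matrix = new Matrix();
    matrix.postRotate(rotateDegrees);
    return Bitmap.createBitmap(bitmap, 0, 0,
            bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}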
I hope this is helpful to some of you.
I have tested this code with a Sony Xperia Go, Samsung Galaxy SII, Samsung Galaxy SIII mini, and a Samsung Galaxy Y; it worked on all devices!
But on the LG E400 (2.3.6) it didn't work, and you get duplicate pictures in the gallery. So I added manufacturer.contains("lge") in startCameraIntent(), and that fixed the problem:
if (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))) {
    String filename = System.currentTimeMillis() + ".jpg";
    ContentValues values = new ContentValues();
    values.put(MediaStore.Images.Media.TITLE, filename);
    cameraPicUri = getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
}
On a Galaxy S3 with CM 10.1, I get a NullPointerException in BitmapHelper:
bm = BitmapFactory.decodeFileDescriptor(fileDescriptor.getFileDescriptor(), null, options);
Subsequently, my UploadPhotoActivity fails at:
try {
    photo = BitmapHelper.readBitmap(this, cameraPicUri);
    if (photo != null) {
        photo = BitmapHelper.shrinkBitmap(photo, 600, rotateXDegrees);
        thumbnail = BitmapHelper.shrinkBitmap(photo, 100);
        ImageView imageView = (ImageView) findViewById(R.id.sustainable_action_photo);
        imageView.setImageBitmap(photo);
    } else {
        Log.e(TAG, "IMAGE ERROR 1");
    }
} catch (Exception e) {
    Log.e(TAG, "IMAGE ERROR 2");
    e.printStackTrace();
}
at the second log (IMAGE ERROR 2).
After a couple of tries the camera broke and I got a "Could not connect to camera" error.
Tested it on a Nexus 7 and it works perfectly.
Edit: Narrowed it down to this:
fileDescriptor = context.getContentResolver().openAssetFileDescriptor(selectedImage, "r");
Although selectedImage contains this:
file:///storage/emulated/0/DCIM/Camera/IMG_20131023_183343.jpg
The openAssetFileDescriptor call throws a FileNotFoundException. I checked the file system, and the image is not saved at this location. The cameraPicUri in TakePhotoActivity points to a non-existent image. I am currently checking where it all goes wrong.
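For reference, a defensive read that tolerates a Uri that was never written could look roughly like this (readBitmapSafely and the sample size are my own, not part of BitmapHelper):

static Bitmap readBitmapSafely(Context context, Uri uri) {
    AssetFileDescriptor fd = null;
    try {
        fd = context.getContentResolver().openAssetFileDescriptor(uri, "r");
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = 4; // downsample to reduce the risk of OutOfMemoryError
        return BitmapFactory.decodeFileDescriptor(fd.getFileDescriptor(), null, options);
    } catch (FileNotFoundException e) {
        return null; // the camera app never wrote a file at this Uri
    } finally {
        if (fd != null) {
            try { fd.close(); } catch (IOException ignored) { }
        }
    }
}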
Edit2: I figured out the error: since the device is a Samsung and reports itself as a Samsung device, your Samsung-specific fixes are applied. CyanogenMod does not need those fixes, though, and in the end the code breaks. Once you remove
(manufacturer.contains("samsung")) &&
it works. Since this is a custom ROM, you could not have planned for that, of course. I am trying to figure out a way to detect whether the device is running CyanogenMod and then include that in your code.
Thanks for a nice camera fix!
Edit3: I fixed it to run on CyanogenMod on the Galaxy S3 by changing your code to this:
Well, now it sometimes works, sometimes it does not. Strange.
if (getPackageManager().hasSystemFeature("com.cyanogenmod.android")
        || (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))))
I experienced some problems when using this with a Sony Xperia Z5.
I added this and it got a lot better:
if (buildType.contains("sony") && buildDevice.contains("e5823")) {
    setPreDefinedCameraUri = true;
}
But 4 times out of 22 it restarted the camera, and once it restarted twice. I restarted the app for every test.
Is there some way to get around this, or do I have to accept this result?
The thing is that if the camera restarts, I can press the back button twice and, boom, the image is there in my ImageView and saved.
I have developed an application that calls the ZXing library to invoke the camera.
I have now just been told that we need this working in the emulator for testing purposes.
I don't want to change the code to read from a file; I want to call the library, which will call the camera, and I want the camera to think it's looking at an image (if that makes sense).
I've seen a lot of posts about using webcams, but I don't want a live feed.
Does anyone know if this is at all possible?
It depends on how you're integrating with ZXing. If you are scanning barcodes via intent, then the webcam-style live feed may be your only option.
AFAIK there is a reader class that expects you to supply the images yourself. Something like this:
Reader reader = new MultiFormatReader();                             // if you are calling the reader in your own code
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source)); // 'source' is a LuminanceSource supplied by you
Result result = reader.decode(bitmap);
String text = result.getText();
Note that the above code is bits and pieces taken from here.
So it looks like you should be able to supply some image and have the barcode reader take a shot at it.
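For example, feeding an Android Bitmap to the reader could look roughly like this (assuming the ZXing core classes are on your classpath; the helper name is mine):

// Decode a barcode/QR code from an in-memory Bitmap instead of the camera feed.
static String decodeFromBitmap(Bitmap bmp) throws NotFoundException {
    int width = bmp.getWidth();
    int height = bmp.getHeight();
    int[] pixels = new int[width * height];
    bmp.getPixels(pixels, 0, width, 0, 0, width, height);

    LuminanceSource source = new RGBLuminanceSource(width, height, pixels);
    BinaryBitmap binaryBitmap = new BinaryBitmap(new HybridBinarizer(source));

    MultiFormatReader reader = new MultiFormatReader();
    return reader.decode(binaryBitmap).getText();
}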
Hello everyone,
I hope one of you can help me with this. I am using ZXing to decode a barcode image, but it throws com.google.zxing.NotFoundException, and I don't know why. The same image gets decoded fine via the Intent provided by ZXing, but not when I try to decode it from the image file.
The code that I am using is below:
mMultiFormatReader = new MultiFormatReader();
mMultiFormatReader.setHints(null);
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(new RGBLuminanceSource(path)));
Result result = mMultiFormatReader.decodeWithState(bitmap);
I don't think it's exactly the same image, since you can't have it scan a file by Intent. I assume you mean that you can scan the image off your screen fine, but the image itself does not decode.
That's just life, really. Some images simply won't decode. But you may try TRY_HARDER mode or use a different binarizer to see if that works.
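A sketch of what that could look like, assuming 'source' is the LuminanceSource you already build in your code (DecodeHintType and GlobalHistogramBinarizer come from the ZXing core library):

// Enable TRY_HARDER and swap HybridBinarizer for GlobalHistogramBinarizer.
Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);

MultiFormatReader reader = new MultiFormatReader();
reader.setHints(hints);

BinaryBitmap bitmap = new BinaryBitmap(new GlobalHistogramBinarizer(source));
Result result = reader.decodeWithState(bitmap); // still throws NotFoundException if nothing is found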