I need to detect the user's face and also compare faces to authenticate users in my application. For that I used the FaceDetector API to detect the face.
When I run my code it works without any errors, but it always reports the detected face count as zero.
public class AndroidFaceDetectorActivity extends Activity {
private static final int TAKE_PICTURE_CODE = 100;
private static final int MAX_FACES = 5;
private Bitmap cameraBitmap = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
((Button)findViewById(R.id.take_picture)).setOnClickListener(btnClick);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if(TAKE_PICTURE_CODE == requestCode){
processCameraImage(data);
}
}
private void openCamera(){
Intent intent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, TAKE_PICTURE_CODE);
}
private void processCameraImage(Intent intent){
setContentView(R.layout.detectlayout);
((Button)findViewById(R.id.detect_face)).setOnClickListener(btnClick);
ImageView imageView = (ImageView)findViewById(R.id.image_view);
cameraBitmap = (Bitmap)intent.getExtras().get("data");
imageView.setImageBitmap(cameraBitmap);
}
private void detectFaces(){
if(null != cameraBitmap){
Log.d("FACE_RECOGNITION","CHECK");
int width = cameraBitmap.getWidth();
int height = cameraBitmap.getHeight();
FaceDetector detector = new FaceDetector(width, height,AndroidFaceDetectorActivity.MAX_FACES);
Face[] faces = new Face[AndroidFaceDetectorActivity.MAX_FACES];
Bitmap bitmap565 = Bitmap.createBitmap(width, height, Config.RGB_565);
Paint ditherPaint = new Paint();
Paint drawPaint = new Paint();
ditherPaint.setDither(true);
drawPaint.setColor(Color.RED);
drawPaint.setStyle(Paint.Style.STROKE);
drawPaint.setStrokeWidth(2);
Canvas canvas = new Canvas();
canvas.setBitmap(bitmap565);
canvas.drawBitmap(cameraBitmap, 0, 0, ditherPaint);
int facesFound = detector.findFaces(bitmap565, faces);
PointF midPoint = new PointF();
float eyeDistance = 0.0f;
float confidence = 0.0f;
Log.i("FaceDetector", "Number of faces found: " + facesFound);
if(facesFound > 0)
{
for(int index=0; index<facesFound; ++index){
faces[index].getMidPoint(midPoint);
eyeDistance = faces[index].eyesDistance();
confidence = faces[index].confidence();
Log.i("FaceDetector",
"Confidence: " + confidence +
", Eye distance: " + eyeDistance +
", Mid Point: (" + midPoint.x + ", " + midPoint.y + ")");
canvas.drawRect((int)midPoint.x - eyeDistance ,
(int)midPoint.y - eyeDistance ,
(int)midPoint.x + eyeDistance,
(int)midPoint.y + eyeDistance, drawPaint);
}
}
String filepath = Environment.getExternalStorageDirectory() + "/facedetect" + System.currentTimeMillis() + ".jpg";
try {
FileOutputStream fos = new FileOutputStream(filepath);
bitmap565.compress(CompressFormat.JPEG, 90, fos);
fos.flush();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
ImageView imageView = (ImageView)findViewById(R.id.image_view);
imageView.setImageBitmap(bitmap565);
}
}
private View.OnClickListener btnClick = new View.OnClickListener() {
//@Override
public void onClick(View v) {
switch(v.getId()){
case R.id.take_picture: openCamera(); break;
case R.id.detect_face: detectFaces(); break;
}
}
};
}
What did I do wrong? Or is there another way to do this?
Thanks.
getExtras().get("data") for a MediaStore.ACTION_IMAGE_CAPTURE intent produces a very low-resolution bitmap (I believe it's 160x120 px), which can work as a thumbnail but is not enough for face detection to do its job.
Normally face detection is OK on medium-res images (e.g. 640x480 px) that you can receive from Camera.PreviewCallback(), but this way you need permissions and code that controls the camera in your app; you cannot use an intent for that.
Here is the official intro to face detection on Android: http://developer.android.com/guide/topics/media/camera.html#face-detection.
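For reference, a minimal sketch of that camera-level face detection (the android.hardware.Camera API described at the link above; it assumes you already have an open Camera instance with a running preview and that the device supports face detection):
if (camera.getParameters().getMaxNumDetectedFaces() > 0) {
    camera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
        @Override
        public void onFaceDetection(Camera.Face[] faces, Camera camera) {
            // each face.rect is in the camera driver's (-1000..1000) coordinate space
            Log.d("FaceDetect", "Faces in preview: " + faces.length);
        }
    });
    camera.startFaceDetection(); // must be called after startPreview()
}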
If you really prefer it this way, you may use getData() to find the captured image at its full resolution, and convert it into a bitmap, like
cameraBitmap = BitmapFactory.decodeFile(data.getData().getPath());
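If getData() comes back null, you can instead tell the camera app where to write the full-size file via MediaStore.EXTRA_OUTPUT and decode that file afterwards. A sketch of that variant (the file name is just a placeholder):
File photoFile = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "capture.jpg");
Uri photoUri = Uri.fromFile(photoFile); // on API 24+ you would use a FileProvider instead
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
intent.putExtra(MediaStore.EXTRA_OUTPUT, photoUri);
startActivityForResult(intent, TAKE_PICTURE_CODE);
// later, in onActivityResult():
cameraBitmap = BitmapFactory.decodeFile(photoFile.getAbsolutePath());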
Related
I am using the Camera in my app to take pictures of ID cards, and I have a rectangular overlay to which the image is cropped. The issue is that the image quality is reduced once the image is captured.
I am unable to figure out where exactly this happens. In the cutImage method I crop the image, but I don't think I am doing anything to the resolution of the image there.
Can anyone suggest where the quality might be going down?
takePicture is called when the user clicks to take the picture.
Once the picture is taken there is a 'use picture' button; that is when usePicture is called.
The cutImage method is used to crop the image based on the preview.
Any suggestions on how to stop the resolution from going down would be very helpful.
protected void takePicture() {
Log.e(TAG, "takePicture started");
if(null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
try {
ImageReader reader = ImageReader.newInstance(textureViewWidth, textureViewHeight, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getActivity().getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
takenPictureBytes = bytes;
Log.d(TAG, "takenPictureBytes length - " + takenPictureBytes.length);
} catch (Exception e) {
Log.d(TAG, " onImageAvailable exception ");
e.printStackTrace();
} finally {
if (image != null) {
Log.d(TAG, " image closing");
image.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Log.d(TAG, "takePicture - camera capture session");
switchPanels(true);
progress.setVisibility(View.GONE);
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
Log.d(TAG, "takePicture - onConfigured- camera access exception ");
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
Log.d(TAG, "takePicture - onConfigureFailed");
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
Log.d(TAG, "takePicture - CameraAccessException ");
e.printStackTrace();
}
}
private void usePicture() {
Log.d(TAG, "usePicture - started ");
if(null != takenPictureBytes ){
try{
String imagePath = null;
Bitmap bitmap = BitmapFactory.decodeByteArray(takenPictureBytes, 0, takenPictureBytes.length);
int bitmapByteCountUsePic = byteSizeOf(bitmap);
Matrix matrix = new Matrix();
matrix.postRotate(90);
Bitmap rotatedBitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
if (isFrameMode) {
float withRatio = (float) rotatedBitmap.getWidth() / (float) textureViewWidth;
float heightRatio = (float) rotatedBitmap.getHeight() / (float) textureViewHeight;
Bitmap newImage = cutImage(rotatedBitmap, (int) (photoFrameView.getWidth() * withRatio), (int) (photoFrameView.getHeight() * heightRatio), withRatio);
int bitmapByteCountNewImage = byteSizeOf(newImage);
imagePath = saveBitmap(newImage);
} else {
imagePath = saveBitmap(rotatedBitmap);
}
TakePhotoFragment.TakePhotoFragmentEvent takePhotoFragmentEvent = new TakePhotoFragment.TakePhotoFragmentEvent();
takePhotoFragmentEvent.setImagePath(imagePath);
// send rxjava
//pop backstack
RxBus.getInstance().post(takePhotoFragmentEvent);
getActivity().getSupportFragmentManager().popBackStack();
}catch (Exception e){
Log.d(TAG, "usePicture - exception ");
e.printStackTrace();
}
}else{
Log.d(TAG, "usePicture - takenPictureBytes is null");
DialogUtil.showErrorSnackBar(getView(), R.string.retake_photo );
}
}
public Bitmap cutImage(final Bitmap bitmap, final int pixepWidth, final int pixelsHeight, float widthRatio) {
int bitmapByteCountCutImage = byteSizeOf(bitmap);
Bitmap output = createBitmap(pixepWidth, pixelsHeight, Bitmap.Config.ARGB_8888);
Bitmap original = bitmap;
final Paint paint = new Paint();
Canvas canvas = new Canvas(output);
int padding = (int) ((float) getResources().getDimensionPixelSize(R.dimen.double_padding) * widthRatio);
Rect rect = new Rect(padding, (original.getHeight() - pixelsHeight) / 2, padding + pixepWidth, original.getHeight() - (original.getHeight() - pixelsHeight) / 2);
final RectF cutedRect = new RectF(0, 0, pixepWidth, pixelsHeight);
paint.setAntiAlias(true);
canvas.drawARGB(0, 0, 0, 0);
canvas.drawBitmap(original, rect, cutedRect, paint);
return output;
}
private String saveBitmap(Bitmap bitmap) {
File pictureFileDir = getDir();
if (!pictureFileDir.exists() && !pictureFileDir.mkdirs()) {
Toast.makeText(getActivity(), "Can't create directory to save image.", Toast.LENGTH_LONG).show();
return null;
}
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMddHHmmssSSS");
String date = dateFormat.format(new Date());
String photoFile = "Picture_" + date + ".jpg";
String filename = pictureFileDir.getPath() + File.separator + photoFile;
File pictureFile = new File(filename);
try {
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
FileOutputStream fos = new FileOutputStream(pictureFile);
fos.write(stream.toByteArray());
fos.close();
return pictureFile.getAbsolutePath();
} catch (Exception error) {
Log.d(TAG, "File" + filename + "not saved: " + error.getMessage());
}
return null;
}
You are changing the bitmap size/resolution in this code:
float withRatio = (float) rotatedBitmap.getWidth() / (float) textureViewWidth;
float heightRatio = (float) rotatedBitmap.getHeight() / (float) textureViewHeight;
Bitmap newImage = cutImage(rotatedBitmap, (int) (photoFrameView.getWidth() * withRatio), (int) (photoFrameView.getHeight() * heightRatio), withRatio);
int bitmapByteCountNewImage = byteSizeOf(newImage);
imagePath = saveBitmap(newImage);
Put in a breakpoint and see what heightRatio and withRatio are, and what photoFrameView.getWidth() * withRatio comes out to. I think you will find it is small compared to the original image. I'm also not sure why you are calculating the ratios from textureViewWidth/Height; you shouldn't have to do that. Whatever view you display the image in should be able to 'fill' without changing the size of the underlying bitmap, and thus losing resolution.
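If a breakpoint is awkward, you can log the same values; a quick sketch using the variable names from the question's code:
Log.d(TAG, "rotatedBitmap: " + rotatedBitmap.getWidth() + "x" + rotatedBitmap.getHeight());
Log.d(TAG, "textureView: " + textureViewWidth + "x" + textureViewHeight);
Log.d(TAG, "withRatio=" + withRatio + ", heightRatio=" + heightRatio);
Log.d(TAG, "crop target: " + (int) (photoFrameView.getWidth() * withRatio)
        + "x" + (int) (photoFrameView.getHeight() * heightRatio));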
You might check out this method:
rawBitmap = ((BitmapDrawable)imageToLoad.getDrawable()).getBitmap();
theBitmap = Bitmap.createScaledBitmap(rawBitmap, 285, 313, false);
Face detection finds my face, then after 3 seconds the circle disappears. It only happens on some phones, so I am unsure why it is happening. My code is pretty boilerplate:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_detect);
detector = new FaceDetector.Builder(getApplicationContext())
.setTrackingEnabled(false)
.setProminentFaceOnly(true)
.setMode(FaceDetector.FAST_MODE)
.setMinFaceSize((float) 0.60)
.setLandmarkType(FaceDetector.ALL_LANDMARKS)
.setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
.build();
initViews();
}
private void initViews() {
imgTakePicture = (ImageView) findViewById(R.id.imgTakePic);
btnTakePicture = (Button) findViewById(R.id.btnTakePicture);
txtSampleDesc = (TextView) findViewById(R.id.txtSampleDescription);
txtTakenPicDesc = (TextView) findViewById(R.id.textView);
btnTakePicture.setOnClickListener(this);
imgTakePicture.setOnClickListener(this);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
Log.d(TAG, "onActivityResult: this is resyult");
if (requestCode == CAMERA_REQUEST && resultCode == RESULT_OK) {
launchMediaScanIntent();
try {
processCameraPicture();
} catch (Exception e) {
Toast.makeText(getApplicationContext(), "Failed to load Image", Toast.LENGTH_SHORT).show();
}
}
}
private void launchMediaScanIntent() {
Log.d(TAG, "launchMediaScanIntent: ");
Intent mediaScanIntent = new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
mediaScanIntent.setData(imageUri);
this.sendBroadcast(mediaScanIntent);
}
private void startCamera() {
Log.d(TAG, "startCamera: ");
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
Log.d(TAG, "startCamera: 2");
File photo = new File(Environment.getExternalStorageDirectory(), "/videoDIARY/ReferencePic/photo.jpg");
imageUri = Uri.fromFile(photo);
intent.putExtra(MediaStore.EXTRA_OUTPUT, imageUri);
startActivityForResult(intent, CAMERA_REQUEST);
}
EDIT: OK, I have worked out that this is all about the device orientation. It works fine on all devices in landscape mode, but only on some devices in portrait mode. Still trying to work out why; I will update when I fix it!
OK, so it turned out to have nothing to do with facial detection, and everything to do with how Android saves camera intent images. Basically it gets the orientation confused, so you need to check the width vs. the height to make sure it did it right, and rotate the image if not. Here is how I checked:
private Bitmap decodeBitmapUri(Context ctx, Uri uri) throws FileNotFoundException {
Log.d(TAG, "decodeBitmapUri: ");
//Toast.makeText(this, "1o" , Toast.LENGTH_LONG).show();
Log.d(TAG, "initViews1: face detector is ============================ " + detector.isOperational());
int targetW = 300;
int targetH = 300;
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = true;
bmOptions.inPreferredConfig=Bitmap.Config.RGB_565;
BitmapFactory.decodeStream(ctx.getContentResolver().openInputStream(uri), null, bmOptions);
android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
android.hardware.Camera.getCameraInfo(Camera.CameraInfo.CAMERA_FACING_FRONT, info);
int rotation = this.getWindowManager().getDefaultDisplay().getRotation();
int orientation = this.getResources().getConfiguration().orientation;
Log.d(TAG, "decodeBitmapUri: OREINTATION is ==================== " + orientation);
Log.d(TAG, "decodeBitmapUri: CAMERA ROTATION ========================= " + rotation);
//Camera.Size size = android.hardware.Camera.get
int photoW = bmOptions.outWidth;
Log.d(TAG, "decodeBitmapUri: width: " + photoW );
int photoH = bmOptions.outHeight;
Log.d(TAG, "decodeBitmapUri: height: " + photoH);
Log.d(TAG, "decodeBitmapUri: 4");
//Toast.makeText(this, "11" , Toast.LENGTH_LONG).show();
int scaleFactor = Math.min(photoW / targetW, photoH / targetH);
bmOptions.inJustDecodeBounds = false;
bmOptions.inSampleSize = scaleFactor;
/* this is because some phones default a camera Intent to landscape no matter how the phone is held,
* so we check the camera orientation, then check to see if width is greater than height
* */
if(orientation == 1 && (photoW > photoH)){
return rotate(BitmapFactory.decodeStream(ctx.getContentResolver()
.openInputStream(uri), null, bmOptions));
}
return BitmapFactory.decodeStream(ctx.getContentResolver()
.openInputStream(uri), null, bmOptions);
}
public static Bitmap rotate(Bitmap bitmap){
int w = bitmap.getWidth();
int h = bitmap.getHeight();
Matrix mtx = new Matrix();
mtx.postRotate(270);
return Bitmap.createBitmap(bitmap, 0, 0, w, h, mtx, true);
}
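An alternative to inferring rotation from width vs. height (not part of the answer above, just a sketch) is to read the JPEG's EXIF orientation tag with android.media.ExifInterface and rotate accordingly; imagePath is the file path the photo was saved to:
public static Bitmap rotateByExif(String imagePath, Bitmap bitmap) throws IOException {
    ExifInterface exif = new ExifInterface(imagePath);
    int exifOrientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
            ExifInterface.ORIENTATION_NORMAL);
    int degrees = 0;
    switch (exifOrientation) {
        case ExifInterface.ORIENTATION_ROTATE_90: degrees = 90; break;
        case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
        case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
    }
    if (degrees == 0) {
        return bitmap; // no rotation needed
    }
    Matrix mtx = new Matrix();
    mtx.postRotate(degrees);
    return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), mtx, true);
}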
private static final int REQUET_LOADIMAGE = 111;
private Button btnDetect;
private ImageView image;
private Bitmap bitmap;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.photogallery);
image = (ImageView) findViewById(R.id.image);
btnDetect=(Button)findViewById(R.id.btnDetect);
btnDetect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
detectFacesInImage();
}
});
Intent intent = new Intent();
intent.setType("image/*"); // filter only image type files
intent.setAction(Intent.ACTION_GET_CONTENT);
intent.addCategory(Intent.CATEGORY_OPENABLE);
startActivityForResult(intent, REQUET_LOADIMAGE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUET_LOADIMAGE && resultCode == RESULT_OK) {
if (bitmap != null) {
bitmap.recycle();
}
try {
InputStream inputStream = getContentResolver().openInputStream(data.getData());
bitmap = BitmapFactory.decodeStream(inputStream);
inputStream.close();
image.setImageBitmap(bitmap);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public void detectFacesInImage() {
//Create a Paint object for drawing with
Paint myRectPaint = new Paint();
myRectPaint.setStrokeWidth(5);
myRectPaint.setColor(Color.RED);
myRectPaint.setStyle(Paint.Style.STROKE);
//Create a Canvas object for drawing on
Bitmap tempBitmap = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.RGB_565);
Canvas tempCanvas = new Canvas(tempBitmap);
tempCanvas.drawBitmap(bitmap, 0, 0, null);
//Detect the Faces
FaceDetector faceDetector = new FaceDetector.Builder(getApplicationContext())
.setTrackingEnabled(false)
.build();
Frame frame = new Frame.Builder().setBitmap(bitmap).build();
SparseArray<Face> faces = faceDetector.detect(frame);
if (faces.size() == 0) {
Toast.makeText(this, "None face detected!", Toast.LENGTH_SHORT).show();
} else {
//Draw Rectangles on the Faces
for (int i = 0; i < faces.size(); i++) {
Face thisFace = faces.valueAt(i);
float x1 = thisFace.getPosition().x;
float y1 = thisFace.getPosition().y;
float x2 = x1 + thisFace.getWidth();
float y2 = y1 + thisFace.getHeight();
tempCanvas.drawRoundRect(new RectF(x1, y1, x2, y2), 2, 2, myRectPaint);
}
// release the detector once, after all faces have been processed
faceDetector.release();
image.setImageDrawable(new BitmapDrawable(getResources(), tempBitmap));
}
}
}
I have successfully implemented face detection using Google play-services-vision:9.4.0 and can also draw the detected face with a simple program using a canvas.
I want to know the gender of the person in a given photo from just an onClickListener.
Is there any way to do this?
Try using this API: Gender Classification
https://algorithmia.com/algorithms/deeplearning/GenderClassification
import com.algorithmia.*;
import com.algorithmia.algo.*;
String input = "{\n"
+ " \"image\": \"data://deeplearning/example_data/m_ali.jpg\"\n"
+ "}";
AlgorithmiaClient client = Algorithmia.client("YOUR_API_KEY");
Algorithm algo = client.algo("algo://deeplearning/GenderClassification/1.0.1");
AlgoResponse result = algo.pipeJson(input);
System.out.println(result.asJsonString());
JSON result:
{
"results": [
[0.9948568344116211, "Male"],
[0.0051431627944111815, "Female"]
]
}
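To pull a single label out of that response you could parse it with org.json, which ships with Android (a sketch; it assumes asJsonString() returns exactly the structure shown above and that JSONException is handled by the caller):
JSONObject json = new JSONObject(result.asJsonString());
JSONArray results = json.getJSONArray("results");
JSONArray top = results.getJSONArray(0); // entries appear to be ordered by confidence
double confidence = top.getDouble(0);
String gender = top.getString(1);
Log.d("GenderClassification", gender + " (" + confidence + ")");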
I have multiple images in a canvas. How do I get the anchors of each image in the canvas so I can drag, resize and move them on the canvas, just like in other Android image editor applications? Please help me. Thank you.
Here is the code:
public class MainActivity extends Activity implements OnClickListener {
static final int PICKED_ONE = 0;
static final int PICKED_TWO = 1;
boolean onePicked = false;
boolean twoPicked = false;
Button choosePicture1, choosePicture2;
ImageView compositeImageView;
Bitmap bmp1, bmp2;
Bitmap returnBmp;
Bitmap drawingBitmap;
Canvas canvas;
Paint paint;
protected static final String TAG = MainActivity.class.getName();
Bitmap mBackImage, mTopImage, mBackground, mInnerImage, mNewSaving;
Canvas mComboImage;
FileOutputStream mFileOutputStream;
BitmapDrawable mBitmapDrawable;
private String mCurrent = null;
private static String mTempDir;
Button save;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
save = (Button)findViewById(R.id.save);
mTempDir = Environment.getExternalStorageDirectory() + "/" + "Pixiedoo" + "/";
mCurrent = "Aura.png";
prepareDirectory();
compositeImageView = (ImageView) this
.findViewById(R.id.CompositeImageView);
choosePicture1 = (Button) this.findViewById(R.id.ChoosePictureButton1);
choosePicture2 = (Button) this.findViewById(R.id.ChoosePictureButton2);
choosePicture1.setOnClickListener(this);
choosePicture2.setOnClickListener(this);
save.setOnClickListener(new View.OnClickListener() {
@SuppressWarnings("deprecation")
public void onClick(View v) {
Log.v(TAG, "Save Tab Clicked");
try {
mBitmapDrawable = new BitmapDrawable(drawingBitmap);
mNewSaving = ((BitmapDrawable) mBitmapDrawable).getBitmap();
String FtoSave = mTempDir + mCurrent;
File mFile = new File(FtoSave);
mFileOutputStream = new FileOutputStream(mFile);
mNewSaving.compress(CompressFormat.PNG, 95, mFileOutputStream);
mFileOutputStream.flush();
mFileOutputStream.close();
} catch (FileNotFoundException e) {
Log.v(TAG, "FileNotFoundExceptionError " + e.toString());
} catch (IOException e) {
Log.v(TAG, "IOExceptionError " + e.toString());
}
}
});
}
private boolean prepareDirectory() {
try {
if (makeDirectory()) {
return true;
} else {
return false;
}
} catch (Exception e) {
e.printStackTrace();
//Toast.makeText(this, getString(R.string.sdcard_error), 1000).show();
return false;
}
}
private boolean makeDirectory() {
File mTempFile = new File(mTempDir);
if (!mTempFile.exists()) {
mTempFile.mkdirs();
}
if (mTempFile.isDirectory()) {
File[] mFiles = mTempFile.listFiles();
for (File mEveryFile : mFiles) {
if (!mEveryFile.delete()) {
//System.out.println(getString(R.string.failed_to_delete) + mEveryFile);
}
}
}
return (mTempFile.isDirectory());
}
public void onClick(View v) {
int which = -1;
if (v == choosePicture1) {
which = PICKED_ONE;
} else if (v == choosePicture2) {
which = PICKED_TWO;
}
Intent choosePictureIntent = new Intent(Intent.ACTION_PICK,
android.provider.MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(choosePictureIntent, which);
}
protected void onActivityResult(int requestCode, int resultCode,
Intent intent) {
super.onActivityResult(requestCode, resultCode, intent);
if (resultCode == RESULT_OK) {
Uri imageFileUri = intent.getData();
if (requestCode == PICKED_ONE) {
bmp1 = loadBitmap(imageFileUri);
onePicked = true;
} else if (requestCode == PICKED_TWO) {
bmp2 = loadBitmap(imageFileUri);
twoPicked = true;
}
if (onePicked && twoPicked) {
drawingBitmap = Bitmap.createBitmap(bmp1.getWidth(),
bmp1.getHeight(), bmp1.getConfig());
canvas = new Canvas(drawingBitmap);
paint = new Paint();
canvas.drawBitmap(bmp1, 90, 0, paint);
// paint.setXfermode(new PorterDuffXfermode(
// android.graphics.PorterDuff.Mode.MULTIPLY));
canvas.drawBitmap(bmp2, 30, 40, paint);
compositeImageView.setImageBitmap(drawingBitmap);
}
}
}
private Bitmap loadBitmap(Uri imageFileUri) {
Display currentDisplay = getWindowManager().getDefaultDisplay();
float dw = currentDisplay.getWidth();
float dh = currentDisplay.getHeight();
returnBmp = Bitmap.createBitmap((int) dw, (int) dh,
Bitmap.Config.ARGB_8888);
try {
// Load up the image's dimensions not the image itself
BitmapFactory.Options bmpFactoryOptions = new BitmapFactory.Options();
bmpFactoryOptions.inJustDecodeBounds = true;
returnBmp = BitmapFactory.decodeStream(getContentResolver()
.openInputStream(imageFileUri), null, bmpFactoryOptions);
int heightRatio = (int) Math.ceil(bmpFactoryOptions.outHeight / dh);
int widthRatio = (int) Math.ceil(bmpFactoryOptions.outWidth / dw);
Log.v("HEIGHTRATIO", "" + heightRatio);
Log.v("WIDTHRATIO", "" + widthRatio);
// If both of the ratios are greater than 1, one of the sides of the
// image is greater than the screen
if (heightRatio > 1 && widthRatio > 1) {
if (heightRatio > widthRatio) {
// Height ratio is larger, scale according to it
bmpFactoryOptions.inSampleSize = heightRatio;
} else {
// Width ratio is larger, scale according to it
bmpFactoryOptions.inSampleSize = widthRatio;
}
}
// Decode it for real
bmpFactoryOptions.inJustDecodeBounds = false;
returnBmp = BitmapFactory.decodeStream(getContentResolver()
.openInputStream(imageFileUri), null, bmpFactoryOptions);
} catch (FileNotFoundException e) {
Log.v("ERROR", e.toString());
}
return returnBmp;
}
}
I have code for combining two images and showing them together in a canvas, as follows. Can you tell me how to store the result as a single image?
public class ChoosePictureComposite extends Activity implements OnClickListener {
static final int PICKED_ONE = 0;
static final int PICKED_TWO = 1;
boolean onePicked = false;
boolean twoPicked = false;
Button choosePicture1, choosePicture2;
ImageView compositeImageView;
Bitmap bmp1, bmp2;
Canvas canvas;
Paint paint;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
compositeImageView = (ImageView) this
.findViewById(R.id.CompositeImageView);
choosePicture1 = (Button) this.findViewById(R.id.ChoosePictureButton1);
choosePicture2 = (Button) this.findViewById(R.id.ChoosePictureButton2);
choosePicture1.setOnClickListener(this);
choosePicture2.setOnClickListener(this);
}
public void onClick(View v) {
int which = -1;
if (v == choosePicture1) {
which = PICKED_ONE;
} else if (v == choosePicture2) {
which = PICKED_TWO;
}
Intent choosePictureIntent = new Intent(Intent.ACTION_PICK,
android.provider.MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(choosePictureIntent, which);
}
protected void onActivityResult(int requestCode, int resultCode,
Intent intent) {
super.onActivityResult(requestCode, resultCode, intent);
if (resultCode == RESULT_OK) {
Uri imageFileUri = intent.getData();
if (requestCode == PICKED_ONE) {
bmp1 = loadBitmap(imageFileUri);
onePicked = true;
} else if (requestCode == PICKED_TWO) {
bmp2 = loadBitmap(imageFileUri);
twoPicked = true;
}
if (onePicked && twoPicked) {
Bitmap drawingBitmap = Bitmap.createBitmap(bmp1.getWidth(),
bmp1.getHeight(), bmp1.getConfig());
canvas = new Canvas(drawingBitmap);
paint = new Paint();
canvas.drawBitmap(bmp1, 90, 0, paint);
// paint.setXfermode(new PorterDuffXfermode(
// android.graphics.PorterDuff.Mode.MULTIPLY));
canvas.drawBitmap(bmp2, 30, 40, paint);
compositeImageView.setImageBitmap(drawingBitmap);
}
}
}
private Bitmap loadBitmap(Uri imageFileUri) {
Display currentDisplay = getWindowManager().getDefaultDisplay();
float dw = currentDisplay.getWidth();
float dh = currentDisplay.getHeight();
Bitmap returnBmp = Bitmap.createBitmap((int) dw, (int) dh,
Bitmap.Config.ARGB_4444);
try {
// Load up the image's dimensions not the image itself
BitmapFactory.Options bmpFactoryOptions = new BitmapFactory.Options();
bmpFactoryOptions.inJustDecodeBounds = true;
returnBmp = BitmapFactory.decodeStream(getContentResolver()
.openInputStream(imageFileUri), null, bmpFactoryOptions);
int heightRatio = (int) Math.ceil(bmpFactoryOptions.outHeight / dh);
int widthRatio = (int) Math.ceil(bmpFactoryOptions.outWidth / dw);
Log.v("HEIGHTRATIO", "" + heightRatio);
Log.v("WIDTHRATIO", "" + widthRatio);
// If both of the ratios are greater than 1, one of the sides of the
// image is greater than the screen
if (heightRatio > 1 && widthRatio > 1) {
if (heightRatio > widthRatio) {
// Height ratio is larger, scale according to it
bmpFactoryOptions.inSampleSize = heightRatio;
} else {
// Width ratio is larger, scale according to it
bmpFactoryOptions.inSampleSize = widthRatio;
}
}
// Decode it for real
bmpFactoryOptions.inJustDecodeBounds = false;
returnBmp = BitmapFactory.decodeStream(getContentResolver()
.openInputStream(imageFileUri), null, bmpFactoryOptions);
} catch (FileNotFoundException e) {
Log.v("ERROR", e.toString());
}
return returnBmp;
}
}
Try this:
public class Aura extends Activity {
protected static final String TAG = Aura.class.getName();
private static String mTempDir;
Bitmap mBackImage, mTopImage, mBackground, mInnerImage, mNewSaving;
Canvas mComboImage;
FileOutputStream mFileOutputStream;
BitmapDrawable mBitmapDrawable;
private String mCurrent = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.aura);
mTempDir = Environment.getExternalStorageDirectory() + "/" + "Aura" + "/";
mCurrent = "Aura.png";
prepareDirectory();
mBackground = Bitmap.createBitmap(604, 1024, Bitmap.Config.ARGB_8888);
mBackImage = BitmapFactory.decodeResource(getResources(), R.drawable.aura);
mTopImage = BitmapFactory.decodeResource(getResources(), R.drawable.test);
mInnerImage = BitmapFactory.decodeResource(getResources(), R.drawable.anothertest);
mComboImage = new Canvas(mBackground);
mComboImage.drawBitmap(mBackImage, 0f, 0f, null);
mComboImage.drawBitmap(mTopImage, 0f, 0f, null);
mComboImage.drawBitmap(mInnerImage, 0f, 0f, null);
mFileOutputStream = null;
Button mSave = (Button) findViewById(R.id.save); // assumes a Save button with this id in your layout
mSave.setOnClickListener(new View.OnClickListener() {
public void onClick(View v) {
Log.v(TAG, "Save Tab Clicked");
try {
mBitmapDrawable = new BitmapDrawable(mBackground);
mNewSaving = ((BitmapDrawable) mBitmapDrawable).getBitmap();
String FtoSave = mTempDir + mCurrent;
File mFile = new File(FtoSave);
mFileOutputStream = new FileOutputStream(mFile);
mNewSaving.compress(CompressFormat.PNG, 95, mFileOutputStream);
mFileOutputStream.flush();
mFileOutputStream.close();
} catch (FileNotFoundException e) {
Log.v(TAG, "FileNotFoundExceptionError " + e.toString());
} catch (IOException e) {
Log.v(TAG, "IOExceptionError " + e.toString());
}
}
});
}//onCreate
private boolean prepareDirectory() {
try {
if (makeDirectory()) {
return true;
} else {
return false;
}
} catch (Exception e) {
e.printStackTrace();
Toast.makeText(this, getString(R.string.sdcard_error), 1000).show();
return false;
}
}
private boolean makeDirectory() {
File mTempFile = new File(mTempDir);
if (!mTempFile.exists()) {
mTempFile.mkdirs();
}
if (mTempFile.isDirectory()) {
File[] mFiles = mTempFile.listFiles();
for (File mEveryFile : mFiles) {
if (!mEveryFile.delete()) {
System.out.println(getString(R.string.failed_to_delete) + mEveryFile);
}
}
}
return (mTempFile.isDirectory());
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if ((!(android.os.Build.VERSION.SDK_INT > android.os.Build.VERSION_CODES.DONUT)
&& keyCode == KeyEvent.KEYCODE_BACK && event.getRepeatCount() == 0)) {
onBackPressed();
}
return super.onKeyDown(keyCode, event);
}
public void onBackPressed() {
finish();
}
}
Create a new Bitmap whose height is the sum of the individual heights of your images.
int width = Math.max(image1.getWidth(), image2.getWidth());
int totalHeight = image1.getHeight() + image2.getHeight();
Bitmap newBitmap = Bitmap.createBitmap(width, totalHeight, Config.ARGB_8888);
Canvas canvas = new Canvas(newBitmap);
canvas.drawBitmap(image1, 0, 0, null);
canvas.drawBitmap(image2, 0, image1.getHeight(), null);
This should draw the two images into newBitmap, one below the other. If you want them side by side, swap the roles of width and height; a small sketch follows.
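A compact helper along the same lines (a sketch; the method and variable names are just illustrative):
public static Bitmap combineSideBySide(Bitmap left, Bitmap right) {
    int width = left.getWidth() + right.getWidth();
    int height = Math.max(left.getHeight(), right.getHeight());
    Bitmap combined = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(combined);
    canvas.drawBitmap(left, 0, 0, null);
    canvas.drawBitmap(right, left.getWidth(), 0, null);
    return combined;
}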
Is this what you were looking for?