Add image over image at specific XY points in ImageView - android

I have an image of a woman. I find her eye points using FaceDetector. Now I want to add a hair image over her face using those eye points.
I am loading that image from the gallery using the code below:
btnLoad.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
Intent intent = new Intent();
intent.setType("image/*");
intent.setAction(Intent.ACTION_GET_CONTENT);
intent.addCategory(Intent.CATEGORY_OPENABLE);
startActivityForResult(intent, RQS_LOADIMAGE);
}
});
In onActivityResult, I am checking the face coordinates:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
try {
InputStream inputStream = getContentResolver().openInputStream(data.getData());
myBitmap = BitmapFactory.decodeStream(inputStream);
inputStream.close();
imgView.setImageBitmap(myBitmap);
if (myBitmap == null) {
Toast.makeText(MainActivity.this, "myBitmap == null", Toast.LENGTH_LONG).show();
} else {
detectFace();
}
} catch (IOException e) {
e.printStackTrace();
}
}
Face Detection method
private void detectFace() {
Paint myRectPaint = new Paint();
myRectPaint.setStrokeWidth(5);
myRectPaint.setColor(Color.RED);
myRectPaint.setStyle(Paint.Style.STROKE);
tempBitmap = Bitmap.createBitmap(myBitmap.getWidth(), myBitmap.getHeight(), Bitmap.Config.RGB_565);
Canvas tempCanvas = new Canvas(tempBitmap);
tempCanvas.drawBitmap(myBitmap, 0, 0, null);
FaceDetector faceDetector = new FaceDetector.Builder(this)
.setTrackingEnabled(true)
.setLandmarkType(FaceDetector.ALL_LANDMARKS)
.setMode(FaceDetector.ACCURATE_MODE)
.setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
.build();
Frame frame = new Frame.Builder().setBitmap(myBitmap).build();
SparseArray<Face> faces = faceDetector.detect(frame);
imgView.setImageDrawable(new BitmapDrawable(getResources(), drawOnFace(faces)));
}
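One thing not shown above: the play-services vision FaceDetector downloads its native models on first use and holds native resources, so it is worth guarding the detect() call and releasing the detector when you are done. A small sketch, reusing the same faceDetector and frame variables:
// Skip detection until the detector's dependencies have been downloaded,
// and release its native resources once detection is finished.
if (!faceDetector.isOperational()) {
    Toast.makeText(this, "Face detector is not yet operational", Toast.LENGTH_LONG).show();
    return;
}
SparseArray<Face> faces = faceDetector.detect(frame);
imgView.setImageDrawable(new BitmapDrawable(getResources(), drawOnFace(faces)));
faceDetector.release();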
Getting the eye coordinates using the code below:
private Bitmap drawOnFace(SparseArray<Face> faceArray) {
tempBitmap = Bitmap.createBitmap(myBitmap.getWidth(), myBitmap.getHeight(), Bitmap.Config.RGB_565);
Canvas canvas = new Canvas(tempBitmap);
canvas.drawBitmap(myBitmap, 0, 0, null);
for (int i = 0; i < faceArray.size(); i++) {
Face face = faceArray.valueAt(i); // valueAt(), not get(): the SparseArray keys are face IDs, not 0..n-1
for (Landmark landmark : face.getLandmarks()) {
switch (landmark.getType()) {
case Landmark.LEFT_EYE:
drawPoint(canvas, landmark.getPosition());
break;
case Landmark.RIGHT_EYE:
drawPoint(canvas, landmark.getPosition());
break;
}
}
}
return tempBitmap;
}
Drawing a circle over the eyes using the code below:
private void drawPoint(Canvas canvas, PointF point) {
Paint paint = new Paint();
paint.setColor(Color.RED);
paint.setStrokeWidth(8);
paint.setStyle(Paint.Style.STROKE);
float x = point.x;
float y = point.y;
canvas.drawCircle(x, y, 10, paint);
}
Now inside the drawPoint method I have the eye coordinates. I want to use those points to put a hair image over the face.
I really don't know what to do next. Appreciate the help, guys.
Thank you in advance.

To place an image over the camera preview, use this code:
float left=0,top=0;
Bitmap bitmap= BitmapFactory.decodeResource(getResources(),R.drawable.mustache);
//if you are in non activity then use context.getResources()
canvas.drawBitmap(bitmap,left,top,paint);
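That draws the overlay at a fixed position. To make it follow the detected eye points, one option is to scale the overlay by the distance between the eyes and center it on their midpoint. A minimal sketch, assuming leftEye and rightEye are the two PointF landmark positions you already collect in drawOnFace, and R.drawable.hair is a placeholder for your overlay resource:
// Scale the overlay relative to the distance between the eyes,
// then draw it centered horizontally on the eye midpoint.
float eyeDistance = (float) Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
float centerX = (leftEye.x + rightEye.x) / 2f;
float centerY = (leftEye.y + rightEye.y) / 2f;
Bitmap hair = BitmapFactory.decodeResource(getResources(), R.drawable.hair);
// Width of the overlay as a multiple of the eye distance; tune this factor for your artwork.
float targetWidth = eyeDistance * 3.0f;
float scale = targetWidth / hair.getWidth();
Bitmap scaledHair = Bitmap.createScaledBitmap(hair, (int) targetWidth, (int) (hair.getHeight() * scale), true);
// Place the overlay above the eyes; the exact vertical offset depends on the artwork.
float left = centerX - scaledHair.getWidth() / 2f;
float top = centerY - scaledHair.getHeight();
canvas.drawBitmap(scaledHair, left, top, null);
Drawing this onto the same canvas used in drawOnFace means the overlay ends up in the tempBitmap that is already shown in the ImageView.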

Related

Flood fill algorithm on vector drawable image view

I want to create an application like this:
Flood fill algorithm
I applied that code and it works fine with a JPG or PNG file.
But I want to use that algorithm with a vector drawable ImageView.
Vector drawable ImageView before flood fill
After flood fill of the vector ImageView
Expected result should be like this (flood fill works perfectly when I set a JPG or PNG file)
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_flood_fill);
iv_FloodFillActivity_image = findViewById(R.id.iv_FloodFillActivity_image);
iv_FloodFillActivity_image.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
if (event.getAction() == MotionEvent.ACTION_DOWN) {
floodColor(event.getX(), event.getY());
}
return true;
}
});
}
private void floodColor(float x, float y) {
final Point p1 = new Point();
p1.x = (int) x; // x and y are the coordinates where the user clicks on the screen
p1.y = (int) y;
Bitmap bitmap = getBitmapFromVectorDrawable(iv_FloodFillActivity_image.getDrawable());
//bitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true);
int pixel = bitmap.getPixel((int) x, (int) y);
int[] sourceColorRGB = new int[]{
Color.red(pixel),
Color.green(pixel),
Color.blue(pixel)
};
final int targetColor = Color.parseColor("#FF2200");
QueueLinearFloodFiller queueLinearFloodFiller = new QueueLinearFloodFiller(bitmap, pixel, targetColor); // 'sourceColor' was undefined; the touched pixel color seems to be the intent
int toleranceHex = Color.parseColor("#545454");
int[] toleranceRGB = new int[]{
Color.red(toleranceHex),
Color.green(toleranceHex),
Color.blue(toleranceHex)
};
queueLinearFloodFiller.setTolerance(toleranceRGB);
queueLinearFloodFiller.setFillColor(targetColor);
queueLinearFloodFiller.setTargetColor(sourceColorRGB);
queueLinearFloodFiller.floodFill(p1.x, p1.y);
bitmap = queueLinearFloodFiller.getImage();
iv_FloodFillActivity_image.setImageBitmap(bitmap);
}
private Bitmap getBitmapFromVectorDrawable(Drawable drawable) {
try {
Bitmap bitmap = Bitmap.createBitmap(drawable.getIntrinsicWidth(), drawable.getIntrinsicHeight(), Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
drawable.setBounds(0, 0, canvas.getWidth(), canvas.getHeight());
drawable.draw(canvas);
return bitmap;
} catch (OutOfMemoryError e) {
return null;
}
}
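Two things are worth checking with this setup, sketched below under assumptions (not tested against the linked flood filler): the bitmap rendered from the vector drawable should be a mutable ARGB_8888 copy before it is filled, and the touch coordinates come from the ImageView, so they usually need to be mapped into bitmap coordinates when the view scales the drawable.
// Make sure the bitmap can be written to by the flood filler.
Bitmap mutable = bitmap.copy(Bitmap.Config.ARGB_8888, true);
// Map view (touch) coordinates to bitmap coordinates.
// This assumes the ImageView stretches the drawable to its own size (e.g. scaleType fitXY).
float scaleX = (float) mutable.getWidth() / iv_FloodFillActivity_image.getWidth();
float scaleY = (float) mutable.getHeight() / iv_FloodFillActivity_image.getHeight();
int bitmapX = (int) (x * scaleX);
int bitmapY = (int) (y * scaleY);
int pixel = mutable.getPixel(bitmapX, bitmapY);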
Check the class: QueueLinearFloodFiller
How can I use a vector drawable?

face detector with gender Male or Female

private static final int REQUET_LOADIMAGE = 111;
private Button btnDetect;
private ImageView image;
private Bitmap bitmap;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.photogallery);
image = (ImageView) findViewById(R.id.image);
btnDetect=(Button)findViewById(R.id.btnDetect);
btnDetect.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
detectFacesInImage();
}
});
Intent intent = new Intent();
intent.setType("image/*"); // filter only image type files
intent.setAction(Intent.ACTION_GET_CONTENT);
intent.addCategory(Intent.CATEGORY_OPENABLE);
startActivityForResult(intent, REQUET_LOADIMAGE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUET_LOADIMAGE && resultCode == RESULT_OK) {
if (bitmap != null) {
bitmap.recycle();
}
try {
InputStream inputStream = getContentResolver().openInputStream(data.getData());
bitmap = BitmapFactory.decodeStream(inputStream);
inputStream.close();
image.setImageBitmap(bitmap);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public void detectFacesInImage() {
//Create a Paint object for drawing with
Paint myRectPaint = new Paint();
myRectPaint.setStrokeWidth(5);
myRectPaint.setColor(Color.RED);
myRectPaint.setStyle(Paint.Style.STROKE);
//Create a Canvas object for drawing on
Bitmap tempBitmap = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.RGB_565);
Canvas tempCanvas = new Canvas(tempBitmap);
tempCanvas.drawBitmap(bitmap, 0, 0, null);
//Detect the Faces
FaceDetector faceDetector = new FaceDetector.Builder(getApplicationContext())
.setTrackingEnabled(false)
.build();
Frame frame = new Frame.Builder().setBitmap(bitmap).build();
SparseArray<Face> faces = faceDetector.detect(frame);
if (faces.size() == 0) {
Toast.makeText(this, "None face detected!", Toast.LENGTH_SHORT).show();
} else {
//Draw Rectangles on the Faces
for (int i = 0; i < faces.size(); i++) {
Face thisFace = faces.valueAt(i);
float x1 = thisFace.getPosition().x;
float y1 = thisFace.getPosition().y;
float x2 = x1 + thisFace.getWidth();
float y2 = y1 + thisFace.getHeight();
tempCanvas.drawRoundRect(new RectF(x1, y1, x2, y2), 2, 2, myRectPaint);
}
faceDetector.release(); // release the detector once, after the loop
image.setImageDrawable(new BitmapDrawable(getResources(), tempBitmap));
}
}
}
I have successfully implemented face detection using play-services-vision:9.4.0+ and also get the detected face drawn by a simple program with the help of a canvas.
I want to know the gender of the person in the given photo from just an onClickListener.
Is there any way to do this?
Try using this API: Gender Classification
https://algorithmia.com/algorithms/deeplearning/GenderClassification
import com.algorithmia.*;
import com.algorithmia.algo.*;
String input = "{\n"
+ " \"image\": \"data://deeplearning/example_data/m_ali.jpg\"\n"
+ "}";
AlgorithmiaClient client = Algorithmia.client("YOUR_API_KEY");
Algorithm algo = client.algo("algo://deeplearning/GenderClassification/1.0.1");
AlgoResponse result = algo.pipeJson(input);
System.out.println(result.asJsonString());
Json Result:
{
"results": [
[0.9948568344116211, "Male"],
[0.0051431627944111815, "Female"]
]
}
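On the Android side you can pick the highest-scoring label out of that response with plain org.json parsing; a minimal sketch, assuming the response has exactly the shape shown above (and note that the pipeJson call is a network request, so it must run off the main thread, e.g. in an AsyncTask or background thread):
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
try {
    // Take the first entry; the results appear to be sorted by confidence.
    JSONObject json = new JSONObject(result.asJsonString());
    JSONArray best = json.getJSONArray("results").getJSONArray(0);
    double confidence = best.getDouble(0);
    String gender = best.getString(1);
    Log.i("Gender", gender + " (" + confidence + ")");
} catch (JSONException e) {
    e.printStackTrace();
}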

Picture manipulation after capture

I need your help, if you are kind enough, with an issue that I'm having with my Android app:
I've managed to capture a picture in my MainActivity and display it in a separate activity, PictureActivity. My code is as follows:
In my MainActivity I have
private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 100;
/**
* This is called on a tap on a graphic element in my MainActivity layout
*/
public void launchCamera(View v) {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE) {
Bitmap imageData = null;
if (resultCode == RESULT_OK) {
imageData = (Bitmap) data.getExtras().get("data");
Intent i = new Intent(this, PictureActivity.class);
i.putExtra("captured_picture", imageData);
startActivity(i);
} else if (resultCode == RESULT_CANCELED) {
// User cancelled the image capture
} else {
Toast.makeText(getApplicationContext(), R.string.picture_capture_error, Toast.LENGTH_SHORT).show();
}
}
}
My PictureActivity looks like this:
public class PictureActivity extends ActionBarActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_picture);
Bitmap bitmap = getIntent().getExtras().getParcelable("captured_picture");
ImageView view = (ImageView) findViewById(R.id.preview_photo);
view.setImageBitmap(bitmap);
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
// Inflate the menu; this adds items to the action bar if it is present.
getMenuInflater().inflate(R.menu.image_menu, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
// Handle action bar item clicks here. The action bar will
// automatically handle clicks on the Home/Up button, so long
// as you specify a parent activity in AndroidManifest.xml.
int id = item.getItemId();
return super.onOptionsItemSelected(item);
}
}
My PictureActivity layout looks like this
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical" android:layout_width="match_parent"
android:layout_height="match_parent">
<ImageView
android:id="#+id/preview_photo"
android:layout_width="match_parent"
android:layout_height="match_parent" />
</LinearLayout>
This is how the final result looks so far:
Long story short, these are the things I want to do next:
rotate the picture if the user taps on the middle button from the main action bar
crop the picture if the user taps on the first button from the left
Next, save the image somewhere (in shared preferences or a session maybe?) and, afterwards, upload it to a remote server. I say "save the image somewhere" because the user can choose to take a second picture (from a maximum of two) and perform the same action as above on it (by tapping on the first button from the right in the main action bar).
What I can't figure out so far is:
how do I know which image is the current one (the one I see on my screen)
how can I set its name and the location where it should be saved until I upload it to the remote server
how can I manipulate it when I tap on one of the two buttons (crop or rotate)
Sorry for the long post and thank you in advance for your help!
To answer your 3 questions:
The current image you see is in your device's RAM. It's the bitmap object which holds the data.
You have to save the bitmap to your device's storage.
You have to transform your bitmap before saving it to storage.
Here is some code which might help you:
public static void saveBitmapToFile(Bitmap bitmap, String filename){
FileOutputStream out = null;
try {
out = new FileOutputStream(filename);
bitmap.compress(Bitmap.CompressFormat.PNG, 100, out); // bitmap is your Bitmap instance
// PNG is a lossless format, the compression factor (100) is ignored
} catch (Exception e) {
e.printStackTrace();
} finally {
try {
if (out != null) {
out.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
public static Bitmap cropAndRotateBitmap(Bitmap bitmap, Rect cropRect, int rotateByDegrees) {
Log.v(TAG, "cropAndRotateBitmap()");
Log.v(TAG, "bitmap: " + bitmap);
//Log.v(TAG, "orientation: " + orientation);
Log.v(TAG, "degrees: " + rotateByDegrees);
Log.v(TAG, "cropRect: " + cropRect);
int cropRectWidth = cropRect.right - cropRect.left;
int cropRectHeight = cropRect.bottom - cropRect.top;
System.gc();
Bitmap result;
Matrix m = new Matrix();
Canvas canvas = new Canvas();
if (rotateByDegrees == 0) {
result = Bitmap.createBitmap(cropRectWidth, cropRectHeight, Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
m.postTranslate(-cropRect.left, -cropRect.top);
} else if (rotateByDegrees == 90) {
Log.v(TAG, "rotate 90, cropRect: " + cropRect);
result = Bitmap.createBitmap(cropRectHeight, cropRectWidth, Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
m.postTranslate(-cropRect.left, -cropRect.height() - cropRect.top);
m.postRotate(90);
} else if (rotateByDegrees == 180) {
Log.v(TAG, "rotate 180");
result = Bitmap.createBitmap(cropRectWidth, cropRectHeight, Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
m.postTranslate(-cropRect.left - cropRect.width(), -cropRect.height() - cropRect.top);
m.postRotate(180);
} else { // 270
Log.v(TAG, "rotate 270");
result = Bitmap.createBitmap(cropRectHeight, cropRectWidth, Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
m.postTranslate(-cropRect.width() - cropRect.left, -cropRect.top);
m.postRotate(270);
}
Paint whitePaint = new Paint(Paint.ANTI_ALIAS_FLAG);
whitePaint.setStyle(Paint.Style.FILL);
whitePaint.setColor(Color.WHITE);
canvas.drawBitmap(bitmap, m, whitePaint);
// canvas.restore();
return result;
}
public static Bitmap rotateBitmap(Bitmap bitmap, int rotateByDegrees) {
Log.v(TAG, "rotateBitmap()");
Log.v(TAG, "degrees: " + rotateByDegrees);
System.gc();
Bitmap result;
Canvas canvas = new Canvas();
if (rotateByDegrees == 0) {
result = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
canvas.save();
} else if (rotateByDegrees == 90) {
result = Bitmap.createBitmap(bitmap.getHeight(), bitmap.getWidth(), Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
canvas.save();
canvas.rotate(90);
canvas.translate(0, -1 * bitmap.getHeight());
} else if (rotateByDegrees == 180) {
result = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
canvas.save();
canvas.rotate(180);
canvas.translate(-1 * bitmap.getWidth(), -1 * bitmap.getHeight());
} else { // 270
result = Bitmap.createBitmap(bitmap.getHeight(), bitmap.getWidth(), Bitmap.Config.ARGB_8888);
canvas = new Canvas(result);
canvas.save();
canvas.rotate(270);
canvas.translate(-1 * bitmap.getWidth(), 0);
}
canvas.drawBitmap(bitmap, new Matrix(), null);
canvas.restore();
return result;
}
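To hook this into your PictureActivity, the menu handler can rotate the currently shown bitmap, redisplay it and persist it with saveBitmapToFile; a rough sketch, where R.id.action_rotate and currentBitmap are placeholders for your own menu item id and the bitmap you keep as a field:
@Override
public boolean onOptionsItemSelected(MenuItem item) {
    if (item.getItemId() == R.id.action_rotate) { // hypothetical menu item id
        currentBitmap = rotateBitmap(currentBitmap, 90);
        ((ImageView) findViewById(R.id.preview_photo)).setImageBitmap(currentBitmap);
        // Keep the edited copy in app-private storage until it is uploaded.
        saveBitmapToFile(currentBitmap, new File(getFilesDir(), "capture_1.png").getAbsolutePath());
        return true;
    }
    return super.onOptionsItemSelected(item);
}
The crop case works the same way with cropAndRotateBitmap, once you have a Rect describing the region the user selected.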

Setting image view background to bitmap

I've been looking around but couldn't find the solution to my problem. I am trying to pass bmp1 to the second activity, Profile. The code pasted does not work; anyone with possible suggestions would be great.
Here is my code for the first part
Bitmap bmp = BitmapFactory.decodeStream(getContentResolver().openInputStream(imageFileUri), null, bmpFactoryOptions);
int heightRatio = (int)Math.ceil(bmpFactoryOptions.outHeight/(float)dh);
int widthRatio = (int)Math.ceil(bmpFactoryOptions.outWidth/(float)dw);
if (heightRatio > 1 && widthRatio > 1)
{
if (heightRatio > widthRatio) {
bmpFactoryOptions.inSampleSize = heightRatio;
} else {
bmpFactoryOptions.inSampleSize = widthRatio;
}
}
bmpFactoryOptions.inJustDecodeBounds = false;
bmp = BitmapFactory.decodeStream(getContentResolver().openInputStream(imageFileUri), null, bmpFactoryOptions);
Bitmap bmp1 = BitmapFactory.decodeStream(getContentResolver().openInputStream(imageFileUri), null, bmpFactoryOptions);
Bitmap alteredBitmap = Bitmap.createBitmap(bmp1.getWidth(),bmp1.getHeight(), bmp1.getConfig());
Canvas canvas = new Canvas(alteredBitmap);
Paint paint = new Paint();
Matrix matrix = new Matrix();
matrix.setValues(new float[] {
.5f, 0, 0,
0, .5f, 0,
0, 0, 1
});
canvas.drawBitmap(bmp, matrix, paint);
ImageView alteredImageView = (ImageView) this.findViewById(R.id.AlteredImageView);
alteredImageView.setImageBitmap(alteredBitmap);
chosenImageView.setImageBitmap(bmp1);
} catch (FileNotFoundException e) { Log.v("ERROR",e.toString());
}
}
Nex.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
// TODO Auto-generated method stub
//Uri imageFileUri = intent.getData();
Intent intent = new Intent(Choose.this, Profile.class);
// your bitmap
ByteArrayOutputStream bs = new ByteArrayOutputStream();
bmp1.compress(Bitmap.CompressFormat.PNG, 50, bs);
intent.putExtra("byteArray", bs.toByteArray());
intent.putExtra("location", textView1.getText().toString());
startActivity(intent);
}
}
);
}
}
public class Profile extends Activity {
ImageView picture;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.profile);
picture = (ImageView) findViewById(R.id.Picture);
Bitmap bitmap = (Bitmap) intent.getParcelableExtra("bytearray");
use
intent.putExtra("BitmapImage", bmp1);
and
Bitmap bitmap = (Bitmap) intent.getParcelableExtra("BitmapImage");
No need to think this much.
Take one static Bitmap in the Choose activity and use it in the Profile activity.
Hope this will help you to understand:
Choose.java
static Bitmap bit = null;
// then assign the bitmap when it is available
bit = bmp1; // this is your bitmap
//now at Profile.java use Choose.bit
if(Choose.bit!=null)
{
profimageview.setImageBitmap(Choose.bit); // ImageView has setImageBitmap(), not setBitmap()
}
There is also a mistake in your code: you are using the String "byteArray" as the key when putting the byte array, but retrieving it in the Profile activity with a different key ("bytearray").
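If you want to keep the byte-array approach instead of a static field, the keys just have to match on both sides and the bytes have to be decoded back into a bitmap in Profile. A minimal sketch of the receiving side, assuming Choose keeps the putExtra("byteArray", bs.toByteArray()) call from the question:
// In Profile.onCreate(), after setContentView() and findViewById():
byte[] bytes = getIntent().getByteArrayExtra("byteArray");
if (bytes != null) {
    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    picture.setImageBitmap(bitmap);
}
Keep in mind that intent extras are limited to roughly 1 MB, so a full-size photo compressed into an extra can fail with a TransactionTooLargeException; for large images the static field or a temporary file is the safer route.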

Unable to detect face using FaceDetector in android

I need to detect the user's face and also compare faces to authenticate in my application; for that I used the FaceDetector API to detect the user's face.
When I run my code it works without any defects, but it gives the detected face count as zero.
public class AndroidFaceDetectorActivity extends Activity {
private static final int TAKE_PICTURE_CODE = 100;
private static final int MAX_FACES = 5;
private Bitmap cameraBitmap = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
((Button)findViewById(R.id.take_picture)).setOnClickListener(btnClick);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if(TAKE_PICTURE_CODE == requestCode){
processCameraImage(data);
}
}
private void openCamera(){
Intent intent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, TAKE_PICTURE_CODE);
}
private void processCameraImage(Intent intent){
setContentView(R.layout.detectlayout);
((Button)findViewById(R.id.detect_face)).setOnClickListener(btnClick);
ImageView imageView = (ImageView)findViewById(R.id.image_view);
cameraBitmap = (Bitmap)intent.getExtras().get("data");
imageView.setImageBitmap(cameraBitmap);
}
private void detectFaces(){
if(null != cameraBitmap){
Log.d("FACE_RECOGNITION","CHECK");
int width = cameraBitmap.getWidth();
int height = cameraBitmap.getHeight();
FaceDetector detector = new FaceDetector(width, height,AndroidFaceDetectorActivity.MAX_FACES);
Face[] faces = new Face[AndroidFaceDetectorActivity.MAX_FACES];
Bitmap bitmap565 = Bitmap.createBitmap(width, height, Config.RGB_565);
Paint ditherPaint = new Paint();
Paint drawPaint = new Paint();
ditherPaint.setDither(true);
drawPaint.setColor(Color.RED);
drawPaint.setStyle(Paint.Style.STROKE);
drawPaint.setStrokeWidth(2);
Canvas canvas = new Canvas();
canvas.setBitmap(bitmap565);
canvas.drawBitmap(cameraBitmap, 0, 0, ditherPaint);
int facesFound = detector.findFaces(bitmap565, faces);
PointF midPoint = new PointF();
float eyeDistance = 0.0f;
float confidence = 0.0f;
Log.i("FaceDetector", "Number of faces found: " + facesFound);
if(facesFound > 0)
{
for(int index=0; index<facesFound; ++index){
faces[index].getMidPoint(midPoint);
eyeDistance = faces[index].eyesDistance();
confidence = faces[index].confidence();
Log.i("FaceDetector",
"Confidence: " + confidence +
", Eye distance: " + eyeDistance +
", Mid Point: (" + midPoint.x + ", " + midPoint.y + ")");
canvas.drawRect((int)midPoint.x - eyeDistance ,
(int)midPoint.y - eyeDistance ,
(int)midPoint.x + eyeDistance,
(int)midPoint.y + eyeDistance, drawPaint);
}
}
String filepath = Environment.getExternalStorageDirectory() + "/facedetect" + System.currentTimeMillis() + ".jpg";
try {
FileOutputStream fos = new FileOutputStream(filepath);
bitmap565.compress(CompressFormat.JPEG, 90, fos);
fos.flush();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
ImageView imageView = (ImageView)findViewById(R.id.image_view);
imageView.setImageBitmap(bitmap565);
}
}
private View.OnClickListener btnClick = new View.OnClickListener() {
//@Override
public void onClick(View v) {
switch(v.getId()){
case R.id.take_picture: openCamera(); break;
case R.id.detect_face: detectFaces(); break;
}
}
};
}
What did I do wrong?
Or is there any other way to do that?
Thanks
getExtras().get("data") for MediaStore.ACTION_IMAGE_CAPTURE intent produces very low-resolution bitmap (I believe it's 160x120 px) which could work as a thumbnail, but is not enough for face detection to do its job.
Normally face detection is OK on medium-res images (e.g. 64x480 px) that you can receive form Camera.previewCallback(), but this way you need permissions and code that controls the camera in your app, you cannot use an intent for that.
Here is the official into to face detection on Android: http://developer.android.com/guide/topics/media/camera.html#face-detection.
If you really prefer it this way, you may use getData() to find the captured image at its full resolution, and convert it into a bitmap, like
cameraBitmap = BitmapFactory.decodeFile(data.getData().getPath());
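Note that on many devices the capture intent returns a content:// Uri (or getData() is null unless you pass MediaStore.EXTRA_OUTPUT), so decoding through the ContentResolver is more reliable than treating the Uri as a file path; a small sketch of that variant:
Uri uri = data.getData();
if (uri != null) {
    try {
        // Decode the full-resolution capture through the ContentResolver.
        InputStream in = getContentResolver().openInputStream(uri);
        cameraBitmap = BitmapFactory.decodeStream(in);
        in.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}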
