Memory/Allocation errors when setting ImageButton from Camera Photo - android

I am trying to take a photo with the phone's camera and place it in an ImageButton as part of a Profile activity that includes all the user's details, and then save the image as a shared preference.
If I use the following code, the ImageButton simply does not update:
protected void onActivityResult(int requestCode, int resultCode, final Intent data) {
    // method checks data returned from camera via startActivityForResult
    super.onActivityResult(requestCode, resultCode, data);
    if (resultCode == RESULT_OK) {
        Runnable runnable = new Runnable() {
            @Override
            public void run() {
                handler.post(new Runnable() {
                    @Override
                    public void run() {
                        Bundle extras = data.getExtras();
                        photo = (Bitmap) extras.get("data");
                        takeAndSetPhoto.setImageBitmap(photo);
                        Toast.makeText(getBaseContext(), "Image set to profile!",
                                Toast.LENGTH_SHORT).show();
                    } // end inner run
                }); // end new runnable
            }
        };
        new Thread(runnable).start();
    } // end if result OK
} // end onActivityResult
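For comparison, onActivityResult() already runs on the UI thread, so a minimal sketch (an assumption on my part, reusing the same takeAndSetPhoto field) that drops the extra thread and sets the bitmap directly would be:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (resultCode == RESULT_OK && data != null) {
        // "data" extra holds the small camera thumbnail
        Bitmap photo = (Bitmap) data.getExtras().get("data");
        takeAndSetPhoto.setImageBitmap(photo);
    }
}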
Alternatively, the image DOES load if I use this method, but I get errors:
Allocation fail for scaled Bitmap
Out Of Memory
01-03 10:13:06.645: E/AndroidRuntime(30163): android.view.InflateException: Binary XML file line #18: Error inflating class
The code uses the Uri of the photo taken:
public void run() {
    Bundle extras = data.getExtras();
    Uri photoShot = data.getData();
    takeAndSetPhoto.setImageURI(photoShot);
    Toast.makeText(getBaseContext(), "Image set to profile!",
            Toast.LENGTH_SHORT).show();
} // end inner run
All suggestions appreciated.
Cheers
Ciaran
Resizing Image:
When resizing the image:
Bitmap photoBitmap = Bitmap.createScaledBitmap(photo, 100, 100, false);
It works fine on the emulator with the SD card activated, but crashes on a real device (an 8-inch tablet).

The problem was that I was not resizing the image taken from the device camera. I passed the image to this method before setting it on the image view:
public Bitmap reSizeImage(Bitmap bitmapImage) {
    // resize the bitmap passed in and return a new one
    // (w and h are fields holding the target dimensions)
    Bitmap resizedImage = null;
    float factorH = h / (float) bitmapImage.getHeight();
    float factorW = w / (float) bitmapImage.getWidth();
    float factorToUse = (factorH > factorW) ? factorW : factorH;
    try {
        resizedImage = Bitmap.createScaledBitmap(bitmapImage,
                (int) (bitmapImage.getWidth() * factorToUse),
                (int) (bitmapImage.getHeight() * factorToUse), false);
    } catch (IllegalArgumentException e) {
        Log.d(TAG, "Problem resizing image #Line 510+");
        e.printStackTrace();
    }
    Log.d(TAG, "in reSizeImage, value of resized image: " + resizedImage.toString());
    return resizedImage;
} // end reSizeImage
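A minimal usage sketch (assuming w and h are the target ImageButton dimensions, as used in the method above): pass the camera bitmap through this method before setting it.
// Sketch: resize the camera thumbnail before putting it into the ImageButton.
Bitmap photo = (Bitmap) data.getExtras().get("data");
takeAndSetPhoto.setImageBitmap(reSizeImage(photo));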

Related


Low quality when capturing a picture and sending it to an ImageView
When the imgCamera button is pressed:
case R.id.imgCamera:
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, CAMERA_REQUEST);
break;
The activity result:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK) {
switch (requestCode) {
case CAMERA_REQUEST:
mPhotoEditor.clearAllViews();
Bitmap photo = (Bitmap) data.getExtras().get("data");
mPhotoEditorView.getSource().setImageBitmap(photo);
break;
case PICK_REQUEST:
try {
mPhotoEditor.clearAllViews();
Uri uri = data.getData();
Bitmap bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), uri);
mPhotoEditorView.getSource().setImageBitmap(bitmap);
} catch (IOException e) {
e.printStackTrace();
}
break;
}
}
}
Please help me; I want to get the best quality for the captured picture.
To save the full-size image you need to do quite a bit more work.
data.getData();
This returns a thumbnail, which is a low-quality version of the original image.
I can't find a more accurate guide than the official documentation; check this link from the Android Developer documentation. By following it, you can easily save the high-quality image instead of the low-quality one. You will also learn about FileProvider and storage (maybe you would like to save it to internal or external storage).
Be patient .. happy coding
My comment tells you what is wrong. Here's the solution:
First, modify your camera intent:
if (takePictureIntent.resolveActivity(getPackageManager()) != null) {
    // Create the File where the photo should go
    _photoUri = createImageUri();
    // Continue only if the File was successfully created
    if (_photoUri != null) {
        // setFlags is required to clear FLAG_ACTIVITY_NEW_TASK that is automatically set on
        // direct calls. If not cleared, you get instant returns from the app.
        takePictureIntent.setFlags(0);
        takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, _photoUri);
        startActivityForResult(takePictureIntent, REQUEST_TAKE_PHOTO);
    }
}

private Uri createImageUri() {
    return FileProvider.getUriForFile(this,
            this.getApplicationContext().getPackageName(),
            new File(Environment.getExternalStorageDirectory(), "orderphoto.jpg"));
}
This tells the camera where to put the photo.
Check for RESULT_OK in your activity result to make sure the user didn't cancel.
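A minimal sketch of that check (assuming the _photoUri and REQUEST_TAKE_PHOTO fields from the snippet above, and the DecodeFile() helper shown further down):
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_TAKE_PHOTO && resultCode == RESULT_OK) {
        // the camera wrote the full-size photo to _photoUri; decode a scaled version
        _imageView.setImageBitmap(DecodeFile(_photoUri));
    }
}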
You'll then need to be able to read the photo from the file system. Here's how our app does it (note: I can't be positive I'm not referencing a custom function in this; YMMV):
private Bitmap DecodeFile(Uri fileUri) {
    /* There isn't enough memory to open up more than a couple of camera photos */
    /* So pre-scale the target _bitmap into which the file is decoded */
    /* Get the size of the ImageView */
    int targetW = _imageView.getWidth();
    int targetH = _imageView.getHeight();
    /* Get the size of the image */
    BitmapFactory.Options bmOptions = new BitmapFactory.Options();
    bmOptions.inJustDecodeBounds = true;
    BitmapFactory.decodeFile(fileUri.getPath(), bmOptions);
    // TODO: 2/25/2019 Update to BitmapFactory.decodeStream()
    int photoW = bmOptions.outWidth;
    int photoH = bmOptions.outHeight;
    if (photoW == 0 || photoH == 0) {
        AppUtility.ShowToast("BitmapFactory.Decode: File Decoding Failed!");
    }
    /* Figure out which way needs to be reduced less */
    int scaleFactor = 1;
    if ((targetW > 0) && (targetH > 0)) {
        scaleFactor = Math.min(photoW / targetW, photoH / targetH);
    }
    /* Set _bitmap options to scale the image decode target */
    bmOptions.inJustDecodeBounds = false;
    bmOptions.inSampleSize = scaleFactor;
    bmOptions.inPurgeable = true;
    /* compress/shrink the bitmap */
    // TODO: 2/25/2019 Update to BitmapFactory.decodeStream()
    return BitmapFactory.decodeFile(fileUri.getPath(), bmOptions);
}
When you use this, it does all the hard work for you and sizes the image to the screen. Note that some Samsung devices lie about the orientation, so you may have problems with that.
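A sketch of one way to handle that (an addition of mine, using the framework android.media.ExifInterface, and assuming the Uri points at a readable file path):
// Read the EXIF orientation tag and rotate the decoded bitmap if needed.
private Bitmap rotateIfRequired(Bitmap bitmap, Uri fileUri) throws IOException {
    ExifInterface exif = new ExifInterface(fileUri.getPath());
    int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
            ExifInterface.ORIENTATION_NORMAL);
    int degrees;
    switch (orientation) {
        case ExifInterface.ORIENTATION_ROTATE_90:  degrees = 90;  break;
        case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
        case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
        default: return bitmap; // already upright
    }
    Matrix matrix = new Matrix();
    matrix.postRotate(degrees);
    return Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}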

How to convert raw byte[] into bitmap in camera App

This is the picture-taking function:
class Camera {
    ...
    void capturePicture() {
        Camera.Size size = mParams.getPictureSize();
        int bitsPerPixel = ImageFormat.getBitsPerPixel(mParams.getPictureFormat());
        int bufferSize = (int) Math.ceil(size.width * size.height * bitsPerPixel / 8d);
        Log.d(TAG, "Picture Size : " + size.width + "\t" + size.height);
        Log.d(TAG, "Picture format : " + mParams.getPictureFormat());
        Log.d(TAG, "Bits per Pixel = " + bitsPerPixel);
        Log.d(TAG, "Buffer Size = " + bufferSize);
        byte[] buffer = new byte[1382400];
        addBuffer(buffer);
        Camera.ShutterCallback shutterCallback = () -> mCameraCallbacks.onShutter();
        Camera.PictureCallback pictureCallback = (data, camera) -> {
            mCameraControllerCallbacks.onPicture(data);
        };
        mCamera.takePicture(shutterCallback, pictureCallback, null, null);
    }

    public interface CameraCallbacks {
        void onPicture(byte[] bytes);
    }
}
The picture size should be 3264 x 2448; however, bitsPerPixel comes back as -1, so I can't use it to calculate the buffer size. It turns out the minimum buffer size is 1382400, and I don't know why.
Here is the Activity that receives the callback:
public class CameraActivity extends AppCompatActivity implements Camera.CameraCallbacks {
    @Override
    public void onPicture(byte[] bytes) {
        final ByteBuffer buffer = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN);
        final int[] ints = new int[bytes.length / 4];
        buffer.asIntBuffer().put(ints);
        Log.d(TAG, "Creating Bitmap of Size : " + mCameraView.mPictureSize.width + " x " + mCameraView.mPictureSize.height);
        Bitmap bitmap = Bitmap.createBitmap(ints, mCameraView.mPictureSize.width, mCameraView.mPictureSize.height, Bitmap.Config.ARGB_8888);
        Intent intent = new Intent(CameraActivity.this, PicturePreviewActivity.class);
        intent.putExtra("bitmap", bitmap);
        startActivityForResult(intent, SAVE_PICTURE_OR_NOT);
    }
}
The code here is obviously wrong, and I am having trouble rearranging this byte[] into an int[] the way Bitmap accepts, because I don't know the data structure inside these bytes.
Also BitmapFactory.decodeByteArray won't work because it can't read raw data.
Can anybody help me on this one?
It is not possible to retrieve uncompressed bitmap data from the Android camera, so the picture callback needs to move from the raw callback to the JPEG callback and use decodeByteArray() to obtain the Bitmap. It is also unreliable to pass a Bitmap through an Intent, so the simplest way is to write it to the receiving Activity directly. The code becomes:
@Override
public void onPicture(byte[] bytes) {
    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, null);
    PicturePreviewActivity.mBitmap = new WeakReference<>(bitmap);
    Intent intent = new Intent(CameraActivity.this, PicturePreviewActivity.class);
    startActivityForResult(intent, SAVE_PICTURE_OR_NOT);
}
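For context, a sketch of how the takePicture() call in the question would change (my assumption, reusing the shutterCallback and the CameraCallbacks listener from the question): the callback is passed as the JPEG parameter instead of the raw one, so decodeByteArray() receives compressed JPEG data.
// Sketch: register the callback as the JPEG callback (third argument), not the raw callback.
Camera.PictureCallback jpegCallback = (data, camera) -> mCameraCallbacks.onPicture(data);
mCamera.takePicture(shutterCallback, null /* raw */, jpegCallback);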

How to properly take a screenshot, globally?

Background
Since Android API 21, it's possible for apps to take screenshots globally and record the screen.
The problem
I've made sample code out of everything I found on the Internet, but it has a few issues:
It's quite slow. Maybe it's possible to avoid that, at least for multiple screenshots, by not removing the notification until it's really no longer needed.
It has black margins on the left and right, meaning something might be wrong with the calculations:
What I've tried
MainActivity.java
public class MainActivity extends AppCompatActivity {
private static final int REQUEST_ID = 1;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
findViewById(R.id.checkIfPossibleToRecordButton).setOnClickListener(new OnClickListener() {
@Override
public void onClick(final View v) {
ScreenshotManager.INSTANCE.requestScreenshotPermission(MainActivity.this, REQUEST_ID);
}
});
findViewById(R.id.takeScreenshotButton).setOnClickListener(new OnClickListener() {
@Override
public void onClick(final View v) {
ScreenshotManager.INSTANCE.takeScreenshot(MainActivity.this);
}
});
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == REQUEST_ID)
ScreenshotManager.INSTANCE.onActivityResult(resultCode, data);
}
}
layout/activity_main.xml
<LinearLayout
android:id="#+id/rootView"
xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:gravity="center"
android:orientation="vertical">
<Button
android:id="#+id/checkIfPossibleToRecordButton"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="request if possible"/>
<Button
android:id="#+id/takeScreenshotButton"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="take screenshot"/>
</LinearLayout>
ScreenshotManager
public class ScreenshotManager {
private static final String SCREENCAP_NAME = "screencap";
private static final int VIRTUAL_DISPLAY_FLAGS = DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY | DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC;
public static final ScreenshotManager INSTANCE = new ScreenshotManager();
private Intent mIntent;
private ScreenshotManager() {
}
public void requestScreenshotPermission(@NonNull Activity activity, int requestId) {
MediaProjectionManager mediaProjectionManager = (MediaProjectionManager) activity.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
activity.startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), requestId);
}
public void onActivityResult(int resultCode, Intent data) {
if (resultCode == Activity.RESULT_OK && data != null)
mIntent = data;
else mIntent = null;
}
@UiThread
public boolean takeScreenshot(@NonNull Context context) {
if (mIntent == null)
return false;
final MediaProjectionManager mediaProjectionManager = (MediaProjectionManager) context.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
final MediaProjection mediaProjection = mediaProjectionManager.getMediaProjection(Activity.RESULT_OK, mIntent);
if (mediaProjection == null)
return false;
final int density = context.getResources().getDisplayMetrics().densityDpi;
final Display display = ((WindowManager) context.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
final Point size = new Point();
display.getSize(size);
final int width = size.x, height = size.y;
// start capture reader
final ImageReader imageReader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 1);
final VirtualDisplay virtualDisplay = mediaProjection.createVirtualDisplay(SCREENCAP_NAME, width, height, density, VIRTUAL_DISPLAY_FLAGS, imageReader.getSurface(), null, null);
imageReader.setOnImageAvailableListener(new OnImageAvailableListener() {
@Override
public void onImageAvailable(final ImageReader reader) {
Log.d("AppLog", "onImageAvailable");
mediaProjection.stop();
new AsyncTask<Void, Void, Bitmap>() {
@Override
protected Bitmap doInBackground(final Void... params) {
Image image = null;
Bitmap bitmap = null;
try {
image = reader.acquireLatestImage();
if (image != null) {
Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
int pixelStride = planes[0].getPixelStride(), rowStride = planes[0].getRowStride(), rowPadding = rowStride - pixelStride * width;
bitmap = Bitmap.createBitmap(width + rowPadding / pixelStride, height, Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(buffer);
return bitmap;
}
} catch (Exception e) {
if (bitmap != null)
bitmap.recycle();
e.printStackTrace();
}
if (image != null)
image.close();
reader.close();
return null;
}
@Override
protected void onPostExecute(final Bitmap bitmap) {
super.onPostExecute(bitmap);
Log.d("AppLog", "Got bitmap?" + (bitmap != null));
}
}.execute();
}
}, null);
mediaProjection.registerCallback(new Callback() {
@Override
public void onStop() {
super.onStop();
if (virtualDisplay != null)
virtualDisplay.release();
imageReader.setOnImageAvailableListener(null, null);
mediaProjection.unregisterCallback(this);
}
}, null);
return true;
}
}
The questions
Well it's about the problems:
Why is it so slow? Is there a way to improve it?
How can I avoid the notification being removed between screenshots? When can I remove the notification? Does the notification mean it constantly takes screenshots?
Why does the output bitmap (currently I don't do anything with it, because it's still POC) have black margins in it? What's wrong with the code in this matter?
NOTE: I don't want to take a screenshot only of the current app. I want to know how to use it globally, for all apps, which is possible officially only by using this API, as far as I know.
EDIT: I've noticed that the CommonsWare website (here) says the output bitmap will be larger for some reason, but as opposed to what I've noticed (black margins at the beginning AND the end), it says the excess is supposed to be at the end:
For inexplicable reasons, it will be a bit larger, with excess unused pixels on each row on the end.
I've tried what was offered there, but it crashes with the exception "java.lang.RuntimeException: Buffer not large enough for pixels".
Why does the output bitmap (currently I don't do anything with it, because it's still POC) have black margins in it? What's wrong with the code in this matter?
You have black margins around your screenshot because you are not using the real size of the display. To solve this:
Get the real size of the window:
final Point windowSize = new Point();
WindowManager windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
windowManager.getDefaultDisplay().getRealSize(windowSize);
Use that to create your image reader:
imageReader = ImageReader.newInstance(windowSize.x, windowSize.y, PixelFormat.RGBA_8888, MAX_IMAGES);
This third step may not be required, but I have seen otherwise in my app's production code (which runs on a variety of Android devices). When you acquire an image from the ImageReader and create a bitmap out of it, crop that bitmap to the window size using the code below.
// fix the extra width from Image
Bitmap croppedBitmap;
try {
croppedBitmap = Bitmap.createBitmap(bitmap, 0, 0, windowSize.x, windowSize.y);
} catch (OutOfMemoryError e) {
Timber.d(e, "Out of memory when cropping bitmap of screen size");
croppedBitmap = bitmap;
}
if (croppedBitmap != bitmap) {
bitmap.recycle();
}
I don't want to take a screenshot only of the current app. I want to know how to use it globally, for all apps, which is possible officially only by using this API, as far as I know.
To capture the screen/take a screenshot you need a MediaProjection object. To create such an object, you need a pair of resultCode (int) and Intent. You already know how these objects are acquired, and you cache them in your ScreenshotManager class.
Coming back to taking screenshots of any app: you need to follow the same procedure for getting the resultCode and Intent, but instead of caching them locally in your class variables, start a background service and pass these variables to it like any other parameters. Take a look at how Telecine does it here. When this background service is started, it can provide a trigger (a notification button) to the user which, when clicked, performs the same screen-capture/screenshot operations that you are doing in your ScreenshotManager class.
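A minimal sketch of that hand-off (ScreenshotService is a hypothetical service name, not part of the code above):
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_ID && resultCode == RESULT_OK && data != null) {
        // Forward the projection grant to a background service as normal extras
        // instead of caching it in a singleton.
        Intent serviceIntent = new Intent(this, ScreenshotService.class);
        serviceIntent.putExtra("resultCode", resultCode);
        serviceIntent.putExtra("projectionData", data);
        startService(serviceIntent);
    }
}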
Why is it so slow? Is there a way to improve it?
How slow is it compared to your expectations? My use case for the Media Projection API is to take a screenshot and present it to the user for editing, and for me the speed is decent enough. One thing worth mentioning is that the ImageReader class can accept a Handler to a thread in setOnImageAvailableListener. If you provide a handler there, the onImageAvailable callback will be triggered on the handler's thread instead of the one that created the ImageReader. This saves you from creating (and starting) an AsyncTask when an image is available; instead, the callback itself happens on a background thread. This is how I create my ImageReader:
private void createImageReader() {
startBackgroundThread();
imageReader = ImageReader.newInstance(windowSize.x, windowSize.y, PixelFormat.RGBA_8888, MAX_IMAGES);
ImageHandler imageHandler = new ImageHandler(context, domainModel, windowSize, this, notificationManager, analytics);
imageReader.setOnImageAvailableListener(imageHandler, backgroundHandler);
}
private void startBackgroundThread() {
backgroundThread = new HandlerThread(NAME_VIRTUAL_DISPLAY);
backgroundThread.start();
backgroundHandler = new Handler(backgroundThread.getLooper());
}
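One related sketch (my assumption, not from the original answer): quit the handler thread when the projection stops, for example from MediaProjection.Callback.onStop(), so it doesn't leak.
private void stopBackgroundThread() {
    if (backgroundThread != null) {
        backgroundThread.quitSafely();
        backgroundThread = null;
        backgroundHandler = null;
    }
}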

Android cropped image quality issue

I am trying to save an image cropped on Android and then show it in my app. I am using the code below, but when I view the image in my app, the quality is not as good as in the attached image. Am I doing anything wrong? Any help would be great.
My code is:
dipToPixel = TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, 1, getResources().getDisplayMetrics());
public void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == 1 && resultCode == getActivity().RESULT_OK && data != null) {
picUri = data.getData();
performCrop();
}
if (requestCode == 111 && resultCode == getActivity().RESULT_OK && data != null) {
Bundle extras = data.getExtras();
Bitmap bitmapImage = extras.getParcelable("data");
tweetImage.setImageBitmap(bitmapImage);
tweetImage.getViewTreeObserver().addOnPreDrawListener(new ViewTreeObserver.OnPreDrawListener() {
public boolean onPreDraw() {
tweetImage.getViewTreeObserver().removeOnPreDrawListener(this);
widthPixel = tweetImage.getMeasuredWidth();
heightPixel = tweetImage.getMeasuredHeight();
return true;
}
});
System.out.println("photo added");
addPhotoVar = 1;
addPhotoBtn.setText("remove");
}
callbackManager.onActivityResult(requestCode, resultCode, data);
}
private void performCrop() {
try {
//call the standard crop action intent (the user device may not support it)
Intent cropIntent = new Intent("com.android.camera.action.CROP");
//indicate image type and Uri
cropIntent.setDataAndType(picUri, "image/*");
//set crop properties
cropIntent.putExtra("crop", "true");
//indicate aspect of desired crop
cropIntent.putExtra("aspectX", 1);
cropIntent.putExtra("aspectY", 1);
//indicate output X and Y
cropIntent.putExtra("outputX", Math.round(screenWidth / dipToPixel)-10);
cropIntent.putExtra("outputY", Math.round(screenWidth / dipToPixel)-10);
//retrieve data on return
cropIntent.putExtra("return-data", true);
//start the activity - we handle returning in onActivityResult
startActivityForResult(cropIntent, 111);
}
// respond to users whose devices do not support the crop action
catch (ActivityNotFoundException anfe) {
// display an error message
String errorMessage = "your device doesn't support the crop action!";
Toast toast = Toast.makeText(getActivity(), errorMessage, Toast.LENGTH_SHORT);
toast.show();
}
}
Below is the code where I use the image and save it to the database:
tweetImage.buildDrawingCache();
bm = tweetImage.getDrawingCache();
if (widthPixel < heightPixel) {
basePixel = widthPixel;
}
else {
basePixel = heightPixel;
}
if (basePixel > 768) {
widthRatio = (float) 768/basePixel;
heightRatio = (float) 768/basePixel;
}
else {
widthRatio = 1;
heightRatio = 1;
}
Bitmap bmResized = Bitmap.createScaledBitmap(bm,(int)(widthPixel*widthRatio), (int)(heightPixel*heightRatio), true);
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bmResized.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byteArray1 = stream.toByteArray();
image1 = new ParseFile("profilePhoto.jpg", byteArray1, "image/jpg");
Use this library; it manages the cropped image quality well and keeps both images: Crop Library
Change:
bmResized.compress(Bitmap.CompressFormat.JPEG, 100, stream);
to:
bmResized.compress(Bitmap.CompressFormat.PNG, 100, stream);
Since JPEG format uses a lossy compression, you should use PNG to save the bitmap, if you don't want quality loss.
Also, you should avoid using com.android.camera.action.CROP intent as it doesn't exist on all the devices as explained here.
There are some alternatives listed on the above link, you may use one of them.
Please refer to this link:
https://commonsware.com/blog/2013/01/23/no-android-does-not-have-crop-intent.html
Here are some libraries to consider for image crop:
https://github.com/lvillani/android-cropimage
https://github.com/biokys/cropimage
https://github.com/MMP-forTour/cropimage (forked from the above one)
https://github.com/dtitov/pickncrop
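If a library is not an option, a minimal in-app sketch (my addition, plain framework API, assuming the bitmap is already decoded) is to crop a centered square with Bitmap.createBitmap():
// Sketch: crop a centered square region without any crop intent or library.
public static Bitmap centerCropSquare(Bitmap source) {
    int side = Math.min(source.getWidth(), source.getHeight());
    int x = (source.getWidth() - side) / 2;
    int y = (source.getHeight() - side) / 2;
    return Bitmap.createBitmap(source, x, y, side, side);
}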

Getting nullpointerexception for bitmap.getWidth() when reading image from sdcard

I read an image from the SD card with BitmapFactory:
String myJpgPath = "/sdcard/yourdollar/img001.jpg";
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 2;
Bitmap bm = BitmapFactory.decodeFile(myJpgPath, options);
If I write:
Log.i("width and height: ", bm.getWidth() + " " + bm.getHeight());
I get a NullPointerException. I tried to scale it with Bitmap.createScaledBitmap(), but I get the same error. Afterwards I process the bitmap, so I would like to end up with a bitmap that has a defined width and height, because it seems I didn't give the bitmap any parameters. But I cannot scale it, so how can I get this image as a bitmap with a width of 500 and a height of 500?
UPDATE:
buttonClick = (Button) findViewById(R.id.buttonClick);
buttonClick.setOnClickListener(new OnClickListener() {
public void onClick(View v) { // <5>
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
Intent intentstart = new Intent(CameraActivity.this, Intent2.class);
startActivity(intentstart);
}
});
Okay, here is the thing. This button takes a picture with the camera and then changes activity. If I do it this way, the app does not have time to create the file, so when I try to read it in my second activity, it throws a NullPointerException. So I got tricky (and wrong as well) and put:
preview.camera.takePicture(shutterCallback, rawCallback, jpegCallback);
try {
Thread.sleep(1200);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
Intent intentstart = new Intent(CameraActivity.this, Intent2.class);
startActivity(intentstart);
So it has 1.2 seconds to create the file, then it changes activity. Is there any way to check whether the file has been created or not? Of course, if it has, then it should change activity.
Any suggestion?
First, just check whether your image is in the required directory /sdcard/yourdollar/img001.jpg. Then try this:
String myJpgPath = "/sdcard/yourdollar/img001.jpg";
Bitmap bm = BitmapFactory.decodeFile(myJpgPath);
and get the image height & width in your Logcat:
Log.i("width and height: ", bm.getWidth() + " " + bm.getHeight());
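To address the update as well, a minimal sketch (my assumption: the JPEG callback writes to the same fixed path from the question) is to check that the file exists before decoding, and only scale once decoding succeeded:
// Sketch: decode only if the file has actually been written, then scale to 500 x 500.
File photoFile = new File("/sdcard/yourdollar/img001.jpg");
if (photoFile.exists()) {
    Bitmap bm = BitmapFactory.decodeFile(photoFile.getAbsolutePath());
    if (bm != null) {
        Bitmap scaled = Bitmap.createScaledBitmap(bm, 500, 500, true);
        Log.i("width and height: ", scaled.getWidth() + " " + scaled.getHeight());
    }
}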
