My aim is to:
1. Capture bytes from the camera in onPictureTaken
2. Convert the byte[] to a Bitmap with BitmapFactory.decodeByteArray
3. Save the Bitmap
My problem is with step 2. For some reason this step gives the picture jagged edges and a loss of quality. If I do step 1 and save the bytes to a file directly (skipping step 2), the image looks a lot better.
How do I convert the bytes to a bitmap (step 2) without losing quality?
@Override
public void onPictureTaken(byte[] byteData, Camera camera) {
    String filePath = "PATH_TO_IMAGE_FILE";

    // Save the bytes directly
    {
        File pictureFile = new File(filePath + ".fromBytes");
        try {
            FileOutputStream fos = new FileOutputStream(pictureFile);
            fos.write(byteData);
            fos.close();
        } catch (Exception error) {
            Log.d("onPictureTaken", "File not saved: " + error.getMessage());
        }
    }

    // Convert to a Bitmap -> compress back to bytes -> then save
    {
        // I think this is where it loses quality??
        Bitmap decodedBitmap = BitmapFactory.decodeByteArray(byteData, 0, byteData.length);

        ByteArrayOutputStream blob = new ByteArrayOutputStream();
        decodedBitmap.compress(Bitmap.CompressFormat.PNG, 100, blob);
        byte[] bitmapdata = blob.toByteArray();

        File pictureFile = new File(filePath + ".fromBitmap");
        try {
            FileOutputStream fos = new FileOutputStream(pictureFile);
            fos.write(bitmapdata);
            fos.close();
        } catch (Exception error) {
            Log.d("onPictureTaken", "File not saved: " + error.getMessage());
        }
    }
}
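For reference, this is the kind of explicit decode I could try instead — just a sketch, based on the unverified assumption that the decoder is silently picking a lower-precision config or scaling the image:

// Sketch only: force explicit decode options so nothing is scaled or
// down-converted behind the scenes. That this is the cause of the jagged
// edges is an assumption.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inPreferredConfig = Bitmap.Config.ARGB_8888; // full 32-bit colour
opts.inScaled = false;                            // no density-based resizing
opts.inSampleSize = 1;                            // no subsampling
Bitmap decodedBitmap = BitmapFactory.decodeByteArray(byteData, 0, byteData.length, opts);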
Thanks
Related
I upload an image to the server with the help of Volley and a Bitmap, and I successfully pass the data. But when I take the image using the camera, the image quality becomes very poor, and when I pass an image larger than about 500 KB the app crashes. Why does this happen?
Can anyone help me?
This is how my camera intent is performed:
private void onCaptureImageResult(Intent data) {
    // The camera intent returns only a small thumbnail in the "data" extra
    thumbnail = (Bitmap) data.getExtras().get("data");
    File destination = new File(Environment.getExternalStorageDirectory(),
            System.currentTimeMillis() + ".jpg");
    FileOutputStream fo;
    try {
        destination.createNewFile();
        fo = new FileOutputStream(destination);
        //fo.write(bytes.toByteArray());
        fo.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    if (thumbnail != null) {
        addImageNew.setImageBitmap(thumbnail);
    }
}
This is how my gallery intent is performed:
private void onSelectFromGalleryResult(Intent data) {
    thumbnail = null;
    if (data != null) {
        try {
            thumbnail = MediaStore.Images.Media.getBitmap(getApplicationContext().getContentResolver(), data.getData());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    if (thumbnail != null) {
        addImageNew.setImageBitmap(thumbnail);
    }
}
This is how I convert the Bitmap to a String:
public String getStringImage(Bitmap bmp) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    bmp.compress(Bitmap.CompressFormat.JPEG, 90, baos);
    byte[] imageBytes = baos.toByteArray();
    String encodedImage = Base64.encodeToString(imageBytes, Base64.DEFAULT);
    return encodedImage;
}
NOTE: my only problems are the image quality and passing large images.
You can try the solution in the link below; it may work for you.
200kb image to base64 cannot send to web service
I didn't find where getStringImage(Bitmap bmp) is called, but you can try something like this:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 1;
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length, options);
getStringImage(bitmap);
Or maybe you can change the compression quality to 100 for higher quality:
bmp.compress(Bitmap.CompressFormat.JPEG, 100, baos);
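For the crash with larger images, here is a rough sketch of downsampling before encoding. It assumes the crash is an OutOfMemoryError from decoding the full-size photo; the 1024-pixel target and the imagePath variable are only for illustration:

// Sketch: decode a large photo at reduced resolution before encoding it.
// Assumes the crash is an OutOfMemoryError; imagePath is hypothetical.
BitmapFactory.Options bounds = new BitmapFactory.Options();
bounds.inJustDecodeBounds = true;          // read dimensions only, no pixels
BitmapFactory.decodeFile(imagePath, bounds);

int inSampleSize = 1;
while (bounds.outWidth / (inSampleSize * 2) >= 1024) {   // aim for roughly 1024px width
    inSampleSize *= 2;
}

BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inSampleSize = inSampleSize;
Bitmap scaled = BitmapFactory.decodeFile(imagePath, opts);
String encoded = getStringImage(scaled);   // the question's existing helper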
I am building an Android app that takes photos, then uploads them to a Rails API.
The API expects the base64-encoded raw file bytes, which it stores as a temp file representing the image in JPG format.
However, the API is rejecting the uploaded file with this error message:
<Paperclip::Errors::NotIdentifiedByImageMagickError:
This seems to be due to a failure of encoding on the part of the Android app.
The base64 image bytes that I'm sending up look like this:
It appears invalid just by looking at it.
The image is created in Android by taking a picture with the Camera API and base64-encoding the resulting byte array:
String encoded = Base64.encodeToString(byteArray, Base64.DEFAULT);
Anyone know what I'm doing wrong here?
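One thing worth checking: Base64.DEFAULT inserts line breaks into the encoded output, and some server-side decoders reject those. A minimal sketch using Base64.NO_WRAP instead — whether Paperclip is actually tripping on the line breaks is an assumption:

// Sketch: encode without line terminators. Base64.DEFAULT wraps the output
// with '\n' characters, which some servers refuse to decode.
String encoded = Base64.encodeToString(byteArray, Base64.NO_WRAP);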
On the button click for capturing an image from the camera, use this:
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
File f = new File(android.os.Environment.getExternalStorageDirectory(), "temp.jpg");
intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(f));
((Activity) context).startActivityForResult(intent, Constants.REQUEST_IMAGE_CAPTURE);
and in onActivityResult of the activity, implement the following code:
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    final ImageView uploadArea = (ImageView) attachmentDialog.findViewById(R.id.uploadArea);
    Bitmap bitmap;
    if (resultCode == RESULT_OK) {
        if (requestCode == Constants.REQUEST_IMAGE_CAPTURE) {
            // Find the temp.jpg the camera intent wrote to external storage
            File f = new File(Environment.getExternalStorageDirectory().toString());
            for (File temp : f.listFiles()) {
                if (temp.getName().equals("temp.jpg")) {
                    f = temp;
                    break;
                }
            }
            try {
                BitmapFactory.Options bitmapOptions = new BitmapFactory.Options();
                bitmap = BitmapFactory.decodeFile(f.getAbsolutePath(), bitmapOptions);
                Matrix matrix = new Matrix();
                matrix.postRotate(-90);
                Bitmap rotatedBitmap = Bitmap.createBitmap(bitmap, 0, 0,
                        bitmap.getWidth(), bitmap.getHeight(), matrix, true);

                // Base64-encode the rotated image for upload and show it in the UI
                ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
                rotatedBitmap.compress(Bitmap.CompressFormat.PNG, 100, byteArrayOutputStream);
                byte[] attachmentBytes = byteArrayOutputStream.toByteArray();
                String attachmentData = Base64.encodeToString(attachmentBytes, Base64.DEFAULT);
                uploadArea.setImageBitmap(rotatedBitmap);

                // Keep a local JPEG copy and remove the temp file
                String path = android.os.Environment.getExternalStorageDirectory()
                        + File.separator + "CTSTemp" + File.separator + "default";
                f.delete();
                new File(path).mkdirs(); // make sure the target directory exists
                OutputStream outFile = null;
                File file = new File(path, String.valueOf(System.currentTimeMillis()) + ".jpg");
                try {
                    outFile = new FileOutputStream(file);
                    bitmap.compress(Bitmap.CompressFormat.JPEG, 85, outFile);
                    outFile.flush();
                    outFile.close();
                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                } catch (IOException e) {
                    e.printStackTrace();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
I hope this will help you, and for any further info, please ask.
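One caveat on the code above: the hard-coded -90 rotation will only be correct on some devices. Here is a sketch of deriving the rotation from the photo's EXIF data instead, assuming the captured file actually carries an orientation tag (it belongs inside the existing try block, before f.delete(), since ExifInterface can throw IOException):

// Sketch: read the EXIF orientation instead of assuming -90 degrees.
ExifInterface exif = new ExifInterface(f.getAbsolutePath());
int orientation = exif.getAttributeInt(
        ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);

int degrees = 0;
switch (orientation) {
    case ExifInterface.ORIENTATION_ROTATE_90:  degrees = 90;  break;
    case ExifInterface.ORIENTATION_ROTATE_180: degrees = 180; break;
    case ExifInterface.ORIENTATION_ROTATE_270: degrees = 270; break;
}

Matrix matrix = new Matrix();
matrix.postRotate(degrees);
Bitmap rotatedBitmap = Bitmap.createBitmap(bitmap, 0, 0,
        bitmap.getWidth(), bitmap.getHeight(), matrix, true);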
In my app's database I have stored a GIF image as a BLOB. How can I retrieve this GIF from the database and save it to the SD card as a GIF file? For PNG and JPEG I know how to do it. How can I achieve this correctly without any library?
For PNG and JPEG I do it like this:
ByteArrayInputStream imageStream;
imageStream = new ByteArrayInputStream(imageBlob);
Bitmap image = BitmapFactory.decodeStream(imageStream);
storeImage(image);

private void storeImage(Bitmap image) {
    File pictureFile = getOutputMediaFile();
    if (pictureFile == null) {
        return;
    }
    try {
        FileOutputStream fos = new FileOutputStream(pictureFile);
        image.compress(Bitmap.CompressFormat.PNG, 90, fos);
        fos.close();
    } catch (FileNotFoundException e) {
        Log.i("MyApp", "File not found: " + e.getMessage());
    } catch (IOException e) {
        Log.i("MyApp", "Error accessing file: " + e.getMessage());
    }
}
// To get the path for image storage
private File getOutputMediaFile() {
    File mediaStorageDir = new File(Environment.getExternalStorageDirectory()
            + "/Android/data/"
            + contexto.getPackageName()
            + "/Files");
    // Create the storage directory if it does not exist
    if (!mediaStorageDir.exists()) {
        if (!mediaStorageDir.mkdirs()) {
            return null;
        }
    }
    // Create a media file name
    File mediaFile;
    String mImageName = "myImage.png";
    mediaFile = new File(mediaStorageDir.getPath() + File.separator + mImageName);
    return mediaFile;
}
I tried the code below with no success. It doesn't create the file. I don't know what is wrong or if I'm missing something.
ByteArrayInputStream imageStream = new ByteArrayInputStream(imageBlob);
ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
int bufferSize = 1024;
byte[] buffer = new byte[bufferSize];
int len = 0;
while ((len = imageStream.read(buffer)) != -1) {
    byteBuffer.write(buffer, 0, len);
}
FileOutputStream stream = new FileOutputStream(myPath);
stream.write(byteBuffer.toByteArray());
Need help please. Thanks in advance.
ByteArrayInputStream imageStream = new ByteArrayInputStream(imageBlob);
Once you have that, just save the stream to a file. Don't make a Bitmap of it and don't use BitmapFactory. This approach works for PNG, JPG, and GIF.
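Put differently, here is a minimal sketch of writing the BLOB straight to disk. It reuses getOutputMediaFile from the question, on the assumption that mImageName there is changed to end in .gif:

// Sketch: write the raw BLOB bytes straight to a .gif file.
// No decode/re-encode step, so the GIF (including any animation) stays intact.
private void storeGif(byte[] imageBlob) {
    File gifFile = getOutputMediaFile();   // assumes mImageName is "myImage.gif" here
    if (gifFile == null) {
        return;
    }
    try {
        FileOutputStream fos = new FileOutputStream(gifFile);
        fos.write(imageBlob);
        fos.close();
    } catch (IOException e) {
        Log.i("MyApp", "Error writing gif: " + e.getMessage());
    }
}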
On a rooted Android device, I want to take a screenshot, convert the raw-format image to a PNG image, and then save it locally. So far, I have managed to access the framebuffer, take the screenshot and save the raw image. The problem is that when I convert it to PNG format, the image I get is all wrong: a bunch of white and grey lines.
Here's what I did:
public void putRawImageInArray(byte[] array, File f) throws IOException {
    @SuppressWarnings("resource")
    BufferedInputStream bufferedInputStream = new BufferedInputStream(new FileInputStream(f)); // the framebuffer raw image is in this file
    bufferedInputStream.read(array, 0, array.length); // read the file
}

public void convertToBitmap(byte[] rawarray) throws IOException {
    byte[] Bits = new byte[rawarray.length * 4];
    int i;
    for (i = 0; i < rawarray.length; i++) {
        Bits[i * 4] =
        Bits[i * 4 + 1] =
        Bits[i * 4 + 2] = (byte) ~rawarray[i];
        Bits[i * 4 + 3] = -1; // 0xff, that's the alpha
    }
    Bitmap bm = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    bm.copyPixelsFromBuffer(ByteBuffer.wrap(Bits));
    File f = new File(Environment.getExternalStorageDirectory(), "/pictures/picture.png");
    f.createNewFile();
    if (f.exists() == true) {
        f.delete();
    }
    try {
        OutputStream fos = new FileOutputStream(f);
        bm.compress(CompressFormat.PNG, 100, fos);
        fos.close();
    } catch (FileNotFoundException e) {
        Log.d(TAG, "File not found: " + e.getMessage());
    } catch (IOException e) {
        Log.d(TAG, "Error accessing file: " + e.getMessage());
    }
}
What am I doing wrong?
Try removing all of this:
byte [] Bits = new byte[rawarray.length*4];
int i;
for(i=0;i<rawarray.length;i++)
{
Bits[i*4] =
Bits[i*4+1] =
Bits[i*4+2] = (byte) ~rawarray[i];
Bits[i*4+3] = -1;//0xff, that's the alpha.
}
and use rawarray directly
Bitmap bm = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(rawarray));
and make sure that the color model you are using (Bitmap.Config.ARGB_8888) is the same as the color model of the image data.
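As a sketch of what matching the color model can look like in practice — the actual pixel format of the framebuffer is device-specific, so treat the choice below as an assumption to verify:

// Sketch: pick a Bitmap config that matches the framebuffer dump.
// A 16-bit framebuffer is RGB_565; a 32-bit one is usually RGBA (ARGB_8888
// in Bitmap terms). If yours is BGRA, red and blue will come out swapped
// and the channels would have to be reordered first.
int bytesPerPixel = rawarray.length / (w * h);
Bitmap.Config config = (bytesPerPixel == 2)
        ? Bitmap.Config.RGB_565
        : Bitmap.Config.ARGB_8888;
Bitmap bm = Bitmap.createBitmap(w, h, config);
bm.copyPixelsFromBuffer(ByteBuffer.wrap(rawarray));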
In my app I take a picture and save it in the way shown below. My phone is a Nexus 4; the camera is very good and every picture is about 2.31 MB. How can I save a picture at, for example, 60 KB?
Can I do it with code, and how?
File newdir = new File(dir);
if (!newdir.exists()) {
    newdir.mkdirs();
}
picturesCount++;
String file = dir + picturesCount + ".jpg";
imagePath = file;
File newfile = new File(file);
try {
    newfile.createNewFile();
} catch (IOException e) {}
Uri outputFileUri = Uri.fromFile(newfile);
Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, outputFileUri);
startActivityForResult(cameraIntent, TAKE_PHOTO_CODE);
When you are using the camera and the picture is taken, you can perform the task in the method below to reduce the quality of the image and choose the type, JPEG or PNG (PNG takes more space than JPEG). Just play with the quality value (currently 60 out of 100) and see the difference in size.
@Override
public void onPictureTaken(byte[] data, Camera camera) {
    InputStream in = new ByteArrayInputStream(data);
    BitmapFactory.Options options = new BitmapFactory.Options();
    Bitmap preview_bitmap = BitmapFactory.decodeStream(in, null, options);

    // Re-compress the picture as JPEG at quality 60
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    preview_bitmap.compress(Bitmap.CompressFormat.JPEG, 60, stream);
    byte[] byteArray = stream.toByteArray();

    FileOutputStream outStream = null;
    try {
        outStream = new FileOutputStream(new File(
                Environment.getExternalStorageDirectory(), "Image1.jpg"));
        outStream.write(byteArray);
        outStream.close();
    } catch (FileNotFoundException e) {
        Log.d("CAMERA", e.getMessage());
    } catch (IOException e) {
        Log.d("CAMERA", e.getMessage());
    }
}
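If a fixed quality of 60 still overshoots a size target such as 60 KB, here is a rough sketch that keeps lowering the JPEG quality until the output fits. The 60 KB target and the step of 10 are arbitrary choices, not something the platform prescribes:

// Sketch: re-compress at decreasing quality until the JPEG fits maxBytes.
private byte[] compressToTargetSize(Bitmap bitmap, int maxBytes) {
    int quality = 90;
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.JPEG, quality, out);
    while (out.size() > maxBytes && quality > 10) {
        quality -= 10;
        out.reset();
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, out);
    }
    return out.toByteArray();
}

// Usage inside onPictureTaken:
// byte[] byteArray = compressToTargetSize(preview_bitmap, 60 * 1024);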