Bitmap Image Saved to Internal Storage is Corrupt - android

I want to build an app that generates a QR code and lets the user save the generated image to internal storage only. I successfully generate the bitmap and save it as a .PNG image, but when I try to open it from the gallery it appears broken or corrupt.
Below is the code that generates the bitmap and displays it in an ImageView (qrCode):
bitmap = encodeAsBitmap(value);
qrCode.setImageBitmap(bitmap);
Bitmap encodeAsBitmap(String str) throws WriterException {
BitMatrix result;
try {
result = new MultiFormatWriter().encode(str,
BarcodeFormat.QR_CODE, WIDTH, WIDTH, null);
} catch (IllegalArgumentException iae) {
// Unsupported format
return null;
}
int w = result.getWidth();
int h = result.getHeight();
int[] pixels = new int[w * h];
for (int y = 0; y < h; y++) {
int offset = y * w;
for (int x = 0; x < w; x++) {
pixels[offset + x] = result.get(x, y) ? getResources().getColor(R.color.colorBlack) :
getResources().getColor(R.color.colorWhite);
}
}
Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, 500, 0, 0, w, h);
return bitmap;
}
It works perfectly up to this point. The user can then tap a button to save this image to the device's internal storage, using the method below:
public void onClickSaveCode(View view) {
String title = getResources().getString(R.string.saved_image_title_prepend) + stringDate;
String format = getResources().getString(R.string.saved_image_format);
String directory = getResources().getString(R.string.saved_image_directory);
// Method call to save image
saveImageToInternalStorage(bitmap, directory, title, format);
}
public boolean saveImageToInternalStorage(Bitmap bitmap, String directory, String title, String format) {
ContextWrapper contextWrapper = new ContextWrapper(getApplicationContext());
File imageDirectory = contextWrapper.getDir(directory, Context.MODE_WORLD_READABLE);
File path = new File(imageDirectory, title + format);
try {
FileOutputStream fos = new FileOutputStream(path);
// Use the compress method on the Bitmap object to write image to the OutputStream
bitmap.compress(Bitmap.CompressFormat.PNG, QUALITY, fos);
fos.close();
new SingleMediaScanner(this, path);
Toast.makeText(this, getString(R.string.save_success), Toast.LENGTH_LONG).show();
return true;
} catch (Exception e) {
e.printStackTrace();
Toast.makeText(this, getString(R.string.save_failure), Toast.LENGTH_LONG).show();
return false;
}
}
And finally, below is the MediaScannerConnectionClient implementation that scans the saved file so it shows up in the gallery:
public class SingleMediaScanner implements MediaScannerConnectionClient {
private MediaScannerConnection mSC;
private File file;
public SingleMediaScanner(Context context, File f) {
file = f;
mSC = new MediaScannerConnection(context, this);
mSC.connect();
}
@Override
public void onMediaScannerConnected() {
mSC.scanFile(file.getAbsolutePath(), null);
}
@Override
public void onScanCompleted(String path, Uri uri) {
mSC.disconnect();
}
}
The images are saved, yet they appear in the gallery as broken files.
Any help will be greatly appreciated.

For reference, this Xamarin.Android (C#) variant compresses the bitmap directly to a FileStream on external storage (newBitmap and imgCompress are assumed to be an existing Bitmap and ImageView):
string path = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath;
string filePath = System.IO.Path.Combine(path, "compressed.png");
Bitmap b = newBitmap;
FileStream ms = new FileStream(filePath, FileMode.Create);
await b.CompressAsync(Bitmap.CompressFormat.Png, 100, ms);
ms.Close();
Toast.MakeText(Application.Context, "Compressed", ToastLength.Short).Show();
imgCompress.SetImageBitmap(b);

Related

How to take picture with camera using ARCore

ARCore camera doesn't seem to support takePicture.
https://developers.google.com/ar/reference/java/com/google/ar/core/Camera
Anyone know how I can take pictures with ARCore?
I am assuming you mean a picture of what the camera is seeing and the AR objects. At a high level you need to get permission to write to external storage to save the picture, copy the frame from OpenGL and then save it as a png (for example). Here are the specifics:
Add the WRITE_EXTERNAL_STORAGE permission to the AndroidManifest.xml
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
Then change CameraPermissionHelper to iterate over both the CAMERA and WRITE_EXTERNAL_STORAGE permissions to make sure they are granted
private static final String REQUIRED_PERMISSIONS[] = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA
};
/**
* Check to see we have the necessary permissions for this app.
*/
public static boolean hasCameraPermission(Activity activity) {
for (String p : REQUIRED_PERMISSIONS) {
if (ContextCompat.checkSelfPermission(activity, p) !=
PackageManager.PERMISSION_GRANTED) {
return false;
}
}
return true;
}
/**
* Check to see we have the necessary permissions for this app,
* and ask for them if we don't.
*/
public static void requestCameraPermission(Activity activity) {
ActivityCompat.requestPermissions(activity, REQUIRED_PERMISSIONS,
CAMERA_PERMISSION_CODE);
}
/**
* Check to see if we need to show the rationale for this permission.
*/
public static boolean shouldShowRequestPermissionRationale(Activity activity) {
for (String p : REQUIRED_PERMISSIONS) {
if (ActivityCompat.shouldShowRequestPermissionRationale(activity, p)) {
return true;
}
}
return false;
}
Next, add a couple of fields to HelloARActivity to keep track of the dimensions of the frame and a boolean to indicate when to save the picture.
private int mWidth;
private int mHeight;
private boolean capturePicture = false;
Set the width and height in onSurfaceChanged()
public void onSurfaceChanged(GL10 gl, int width, int height) {
mDisplayRotationHelper.onSurfaceChanged(width, height);
GLES20.glViewport(0, 0, width, height);
mWidth = width;
mHeight = height;
}
At the bottom of onDrawFrame(), add a check for the capture flag. This should be done after all the other drawing happens.
if (capturePicture) {
    capturePicture = false;
    try {
        SavePicture();
    } catch (IOException e) {
        // SavePicture() is declared to throw IOException, so handle it here.
        Log.e(TAG, "Failed to save picture", e);
    }
}
Then add the onClick method for a button to take the picture, and the actual code to save the image:
public void onSavePicture(View view) {
// Here we just set a flag so we can copy
// the image from the onDrawFrame() method.
// This is required for OpenGL so we are on the rendering thread.
this.capturePicture = true;
}
/**
* Call from the GLThread to save a picture of the current frame.
*/
public void SavePicture() throws IOException {
int pixelData[] = new int[mWidth * mHeight];
// Read the pixels from the current GL frame.
IntBuffer buf = IntBuffer.wrap(pixelData);
buf.position(0);
GLES20.glReadPixels(0, 0, mWidth, mHeight,
GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
// Create a file in the Pictures/HelloAR album.
final File out = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES) + "/HelloAR", "Img" +
Long.toHexString(System.currentTimeMillis()) + ".png");
// Make sure the directory exists
if (!out.getParentFile().exists()) {
out.getParentFile().mkdirs();
}
// Convert the pixel data from RGBA to what Android wants, ARGB.
int bitmapData[] = new int[pixelData.length];
for (int i = 0; i < mHeight; i++) {
for (int j = 0; j < mWidth; j++) {
int p = pixelData[i * mWidth + j];
int b = (p & 0x00ff0000) >> 16;
int r = (p & 0x000000ff) << 16;
int ga = p & 0xff00ff00;
bitmapData[(mHeight - i - 1) * mWidth + j] = ga | r | b;
}
}
// Create a bitmap.
Bitmap bmp = Bitmap.createBitmap(bitmapData,
mWidth, mHeight, Bitmap.Config.ARGB_8888);
// Write it to disk.
FileOutputStream fos = new FileOutputStream(out);
bmp.compress(Bitmap.CompressFormat.PNG, 100, fos);
fos.flush();
fos.close();
runOnUiThread(new Runnable() {
@Override
public void run() {
showSnackbarMessage("Wrote " + out.getName(), false);
}
});
}
The last step is to add the button to the end of the activity_main.xml layout:
<Button
    android:id="@+id/fboRecord_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_alignStart="@+id/surfaceview"
    android:layout_alignTop="@+id/surfaceview"
    android:onClick="onSavePicture"
    android:text="Snap"
    tools:ignore="OnClick"/>
Acquiring the image buffer
In the latest ARCore SDK, we get access to the image buffer via the public class Frame. Below is sample code that gives us access to the image buffer:
private void onSceneUpdate(FrameTime frameTime) {
    try {
        Frame currentFrame = sceneView.getArFrame();
        Image currentImage = currentFrame.acquireCameraImage();
        int imageFormat = currentImage.getFormat();
        if (imageFormat == ImageFormat.YUV_420_888) {
            Log.d("ImageFormat", "Image format is YUV_420_888");
        }
        // Release the image when finished with it (see the note below).
        currentImage.close();
    } catch (NotYetAvailableException e) {
        // The camera image is not available yet; try again on a later frame.
    }
}
onSceneUpdate() will be called for every frame update if you register it with the setOnUpdateListener() callback. The image will be in YUV_420_888 format, but it will have the full field of view of the native high-resolution camera.
Also, do not forget to release the received image by calling currentImage.close(); otherwise you will receive a ResourceExhaustedException on a later run of onSceneUpdate().
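For reference, a minimal sketch of registering the listener with a Sceneform ArSceneView (the exact registration method can differ slightly between Sceneform versions):
// Call once after the scene view is created so onSceneUpdate() runs on every frame.
sceneView.getScene().addOnUpdateListener(this::onSceneUpdate);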
Writing the acquired image buffer to a file
The following implementation converts the YUV buffer to a compressed JPEG byte array:
private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
return out.toByteArray();
}
public static void WriteImageInformation(Image image, String path) throws IOException {
byte[] data = null;
data = NV21toJPEG(YUV_420_888toNV21(image),
image.getWidth(), image.getHeight());
BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(path));
bos.write(data);
bos.flush();
bos.close();
}
private static byte[] YUV_420_888toNV21(Image image) {
byte[] nv21;
ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();
int ySize = yBuffer.remaining();
int uSize = uBuffer.remaining();
int vSize = vBuffer.remaining();
nv21 = new byte[ySize + uSize + vSize];
//U and V are swapped
yBuffer.get(nv21, 0, ySize);
vBuffer.get(nv21, ySize, vSize);
uBuffer.get(nv21, ySize + vSize, uSize);
return nv21;
}
Sorry for answering late. You can use the following code to take a picture in ARCore:
private String generateFilename() {
String date =
new SimpleDateFormat("yyyyMMddHHmmss", java.util.Locale.getDefault()).format(new Date());
return Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES) + File.separator + "Sceneform/" + date + "_screenshot.jpg";
}
private void saveBitmapToDisk(Bitmap bitmap, String filename) throws IOException {
File out = new File(filename);
if (!out.getParentFile().exists()) {
out.getParentFile().mkdirs();
}
try (FileOutputStream outputStream = new FileOutputStream(filename);
ByteArrayOutputStream outputData = new ByteArrayOutputStream()) {
bitmap.compress(Bitmap.CompressFormat.PNG, 100, outputData);
outputData.writeTo(outputStream);
outputStream.flush();
outputStream.close();
} catch (IOException ex) {
throw new IOException("Failed to save bitmap to disk", ex);
}
}
private void takePhoto() {
final String filename = generateFilename();
/*ArSceneView view = fragment.getArSceneView();*/
mSurfaceView = findViewById(R.id.surfaceview);
// Create a bitmap the size of the scene view.
final Bitmap bitmap = Bitmap.createBitmap(mSurfaceView.getWidth(), mSurfaceView.getHeight(),
Bitmap.Config.ARGB_8888);
// Create a handler thread to offload the processing of the image.
final HandlerThread handlerThread = new HandlerThread("PixelCopier");
handlerThread.start();
// Make the request to copy.
PixelCopy.request(mSurfaceView, bitmap, (copyResult) -> {
if (copyResult == PixelCopy.SUCCESS) {
try {
saveBitmapToDisk(bitmap, filename);
} catch (IOException e) {
Toast toast = Toast.makeText(DrawAR.this, e.toString(),
Toast.LENGTH_LONG);
toast.show();
return;
}
Snackbar snackbar = Snackbar.make(findViewById(android.R.id.content),
"Photo saved", Snackbar.LENGTH_LONG);
snackbar.setAction("Open in Photos", v -> {
File photoFile = new File(filename);
Uri photoURI = FileProvider.getUriForFile(DrawAR.this,
DrawAR.this.getPackageName() + ".ar.codelab.name.provider",
photoFile);
Intent intent = new Intent(Intent.ACTION_VIEW, photoURI);
intent.setDataAndType(photoURI, "image/*");
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
startActivity(intent);
});
snackbar.show();
} else {
Log.d("DrawAR", "Failed to copyPixels: " + copyResult);
Toast toast = Toast.makeText(DrawAR.this,
"Failed to copyPixels: " + copyResult, Toast.LENGTH_LONG);
toast.show();
}
handlerThread.quitSafely();
}, new Handler(handlerThread.getLooper()));
}
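Note that FileProvider.getUriForFile() only resolves if a matching <provider> is declared in AndroidManifest.xml. A minimal sketch, with the authority matching the one built above (the paths resource name is hypothetical):
<provider
    android:name="android.support.v4.content.FileProvider"
    android:authorities="${applicationId}.ar.codelab.name.provider"
    android:exported="false"
    android:grantUriPermissions="true">
    <meta-data
        android:name="android.support.FILE_PROVIDER_PATHS"
        android:resource="@xml/paths" />
</provider>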

how to store a bitmap into internal storage in android

I know it's a very basic question, but I am stuck on this problem. I am working on an image-sketching mobile app and have done all the work; now I just want to store the resulting bitmap image in internal memory. I have created a method, storeImage(), for image storing; please write the code there. I will be very thankful to you.
private class ImageProcessingTask extends AsyncTask<Bitmap, Void, Bitmap> {
private ProgressDialog abhanDialog = null;
private Bitmap returnedBitmap = null;
@Override
protected void onPreExecute() {
returnedBitmap = null;
abhanDialog = new ProgressDialog(AbhanActivity.this);
abhanDialog.setMessage(getString(R.string.please_wait));
abhanDialog.setCancelable(false);
abhanDialog.show();
}
@Override
protected Bitmap doInBackground(Bitmap... params) {
final Bitmap sketched = AbhanSketch.createSketch(params[0]);
final Bitmap gaussianBitmap = AbhanEffects.applyGaussianBlur(sketched);
final Bitmap sepiaBitmap = AbhanEffects.sepiaTonnedBitmap(gaussianBitmap, 151, 0.71,
0.71, 0.76);
returnedBitmap = AbhanEffects.sharpenBitmap(sepiaBitmap, 0.81);
return returnedBitmap;
}
@Override
protected void onPostExecute(Bitmap result) {
if (abhanDialog != null && abhanDialog.isShowing()) {
abhanDialog.cancel();
}
if (result != null) {
mImageView.setImageBitmap(result);
mImageView.buildDrawingCache();
bmap = mImageView.getDrawingCache();
storeImage(bmap);
isImage = false;
enableButton();
final boolean isFileDeleted = Utils.deleteFile(mPath);
if (DEBUG) {
android.util.Log.i(TAG, "File Deleted: " + isFileDeleted);
}
}
}
}
private void storeImage(Bitmap image) {
// ...please enter code here for image storing
}
Here is your missing code, inside a function:
private void storeImage(Bitmap image) {
    File sdcard = Environment.getExternalStorageDirectory();
    File folder = new File(sdcard.getAbsoluteFile(), "YOUR_APP_DIRECTORY");
    if (!folder.exists())
        folder.mkdir();
    File file = new File(folder.getAbsoluteFile(), "IMG_" + System.currentTimeMillis() + ".jpg");
    if (file.exists())
        file.delete();
    try {
        FileOutputStream out = new FileOutputStream(file);
        // Scale to a width of 400px, keeping the aspect ratio.
        image = Bitmap.createScaledBitmap(image, 400, (int) (image.getHeight() * (400.0 / image.getWidth())), false);
        image.compress(Bitmap.CompressFormat.JPEG, 90, out);
        out.flush();
        out.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
You can omit or edit this line
image = Bitmap.createScaledBitmap(image, 400, (int) (image.getHeight() * (400.0 / image.getWidth())), false);
according to your need.
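Since this writes to shared external storage, the WRITE_EXTERNAL_STORAGE permission must also be declared in the manifest (and requested at runtime on Android 6.0+); on Android 10+ scoped storage further restricts this kind of direct path access:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />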
In your function, write the following code:
String path = Saveme(image, "image_name.jpg");
// path contains the full path to the directory where all your images are stored internally (privately)
For the gallery:
Saveme(image, "my image", "my image test for gallery save");
The definition of the Saveme() function follows:
private String Saveme(Bitmap bitmapImage, String img_name){
ContextWrapper cw = new ContextWrapper(getApplicationContext());
// path to /data/data/yourapp/app_data/imageDir
File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
// Create imageDir
File mypath=new File(directory,img_name);
FileOutputStream fos = null;
try {
fos = new FileOutputStream(mypath);
// Use the compress method on the BitMap object to write image to the OutputStream
bitmapImage.compress(Bitmap.CompressFormat.PNG, 100, fos);
} catch (Exception e) {
e.printStackTrace();
} finally {
try {
if (fos != null) fos.close(); // fos can be null if the FileOutputStream constructor threw
} catch (IOException e) {
e.printStackTrace();
}
}
return directory.getAbsolutePath();
}
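To read the image back later, a minimal sketch (the file name must match the img_name that was passed to Saveme()):
private Bitmap loadImageFromStorage(String path, String imgName) {
    // Decode the previously saved PNG from the app's private directory; returns null if it is missing.
    File f = new File(path, imgName);
    return BitmapFactory.decodeFile(f.getAbsolutePath());
}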
In the gallery, images are displayed from the MediaStore, so you need to store the image in the MediaStore as well. The following code can help with this:
public void Saveme(Bitmap b, String title, String dsc)
{
MediaStore.Images.Media.insertImage(getContentResolver(), b, title ,dsc);
}

When saving bitmap to disk, solid paths show artifacts

[Edit: I've made a minimal project to try to narrow down what's going on. The code at the bottom still generates the same artifacts when saved]
I have an app that draws simple 2D geometry using Paths. The shapes are all solid colors, sometimes with alpha < 255, and may be decorated with lines. In the View that draws the geometry, there has never been an issue with how things get drawn. However, when I use the same code to draw to a Bitmap, and then save it as either a JPEG (with 100 quality) or PNG, there is always the same artifacting in the solid-colored areas of the output files. It's a sort of mottling that is usually associated with JPEG compression.
Screenshot of View:
Saved image:
Zoom in on artifacts:
I have tried the following
Saving to either PNG or JPEG
Turning dithering and antialiasing on and off
Increasing the DPI of the Bitmap, and also allowing the Bitmap to use its default DPI
Applying the matrix I use as a camera to the geometric representation, instead of applying it to the Canvas for the bitmap
Turning HW Acceleration on and off app-wide
Using a 3rd party library to save the Bitmap to a .bmp file
All yield the same artifacts, neither making it worse nor better.
public class MainActivity extends AppCompatActivity {
Context context;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
this.context = getApplicationContext();
}
// button OnClick listener
public void saveImage(View view) {
new saveBitmapToDisk().execute(false);
}
public Bitmap getBitmap() {
final int bitmapHeight = 600, bitmapWidth = 600;
Bitmap bitmap = Bitmap.createBitmap(bitmapWidth, bitmapHeight, Bitmap.Config.ARGB_8888);
Canvas bitmapCanvas = new Canvas(bitmap);
float[] triangle = new float[6];
triangle[0] = bitmapWidth / 2;
triangle[1] = 0;
triangle[2] = 0;
triangle[3] = bitmapHeight / 2;
triangle[4] = bitmapWidth / 2;
triangle[5] = bitmapHeight / 2;
Path solidPath = new Path();
Paint solidPaint = new Paint();
solidPaint.setStyle(Paint.Style.FILL);
solidPath.moveTo(triangle[0], triangle[1]);
for(int i = 2; i < triangle.length; i += 2)
solidPath.lineTo(triangle[i], triangle[i+1]);
solidPath.close();
solidPaint.setColor(Color.GREEN);
bitmapCanvas.drawPath(solidPath, solidPaint);
return bitmap;
}
private class saveBitmapToDisk extends AsyncTask<Boolean, Integer, Uri> {
Boolean toShare;
@Override
protected Uri doInBackground(Boolean... shareFile) {
this.toShare = shareFile[0];
final String appName = context.getResources().getString(R.string.app_name);
final String IMAGE_SAVE_DIRECTORY = String.format("/%s/", appName);
final String fullPath = Environment.getExternalStorageDirectory().getAbsolutePath() + IMAGE_SAVE_DIRECTORY;
File dir, file;
try {
dir = new File(fullPath);
if (!dir.exists())
dir.mkdirs();
OutputStream fOut;
file = new File(fullPath, String.format("%s.png", appName));
for (int suffix = 0; file.exists(); suffix++)
file = new File(fullPath, String.format("%s%03d.png", appName, suffix));
file.createNewFile();
fOut = new FileOutputStream(file);
Bitmap saveBitmap = getBitmap();
saveBitmap.compress(Bitmap.CompressFormat.PNG, 100, fOut);
fOut.flush();
fOut.close();
MediaStore.Images.Media.insertImage(context.getContentResolver(), file.getAbsolutePath(), file.getName(), file.getName());
} catch (OutOfMemoryError e) {
Log.e("MainActivity", "Out of Memory saving bitmap; bitmap is too large");
return null;
} catch (Exception e) {
Log.e("MainActivity", e.getMessage());
return null;
}
return Uri.fromFile(file);
}
@Override
protected void onPostExecute(Uri uri) {
super.onPostExecute(uri);
Toast.makeText(context, "Image saved", Toast.LENGTH_SHORT).show();
}
}
}
I tested your program with PNG and the file has no artifacts.
These artifacts are a result of JPEG compression.
Edit:
The line
MediaStore.Images.Media.insertImage(context.getContentResolver(), file.getAbsolutePath(), file.getName(), file.getName());
was causing the conversion to JPEG.
The proper way to save the image is:
ContentValues values = new ContentValues();
values.put(MediaStore.Images.Media.DATE_TAKEN, System.currentTimeMillis());
values.put(MediaStore.Images.Media.MIME_TYPE, "image/png");
values.put(MediaStore.MediaColumns.DATA, file.getAbsolutePath());
context.getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
Here is my simplified test program that sends the generated file directly
public class Test2Activity extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
new saveBitmapToDisk().execute();
}
public Bitmap getBitmap() {
final int bitmapHeight = 600, bitmapWidth = 600;
Bitmap bitmap = Bitmap.createBitmap(bitmapWidth, bitmapHeight, Bitmap.Config.ARGB_8888);
Canvas bitmapCanvas = new Canvas(bitmap);
Paint solidPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
solidPaint.setStyle(Paint.Style.FILL);
solidPaint.setColor(Color.RED);
bitmapCanvas.drawCircle(300, 300, 200, solidPaint);
return bitmap;
}
private class saveBitmapToDisk extends AsyncTask<Void, Void, Uri> {
Boolean toShare;
@Override
protected Uri doInBackground(Void... shareFile) {
Context context = Test2Activity.this;
try {
File file = new File(context.getExternalFilesDir(null), "test.png");
FileOutputStream fOut = new FileOutputStream(file);
Bitmap saveBitmap = getBitmap();
saveBitmap.compress(Bitmap.CompressFormat.PNG, 100, fOut);
fOut.flush();
fOut.close();
return Uri.fromFile(file);
} catch (OutOfMemoryError e) {
Log.e("MainActivity", "Out of Memory saving bitmap; bitmap is too large");
return null;
} catch (Exception e) {
Log.e("MainActivity", e.getMessage());
return null;
}
}
@Override
protected void onPostExecute(Uri uri) {
Context context = Test2Activity.this;
Toast.makeText(context, "Image saved", Toast.LENGTH_SHORT).show();
final Intent intent = new Intent(android.content.Intent.ACTION_SEND);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intent.putExtra(Intent.EXTRA_STREAM, uri);
intent.setType("image/png");
Test2Activity.this.startActivity(intent);
}
}
}
Artifacts like this are a natural and unavoidable consequence of JPEG compression.
They should not crop up in PNG compression. If you are getting such artifacts when you create a PNG file, I'd wager that you are not creating a PNG stream at all, but rather a JPEG stream in a file with a PNG extension. No decent decoder relies on the file extension.
I noticed two things in your code:
1) The filename you save to is String.format("%s.jpg", appName) or String.format("%s%03d.png", appName, suffix) independent of the actual encoding.
2) The bitmap you save has its density determined by prefs.saveImageDensity().get() so it may not be the same as the actual density of the bitmap you see on the screen.
Maybe you confused yourself with 1), or perhaps 2) causes the compression artifacts you're seeing?
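One way to rule out point 1) is to derive the file extension from the CompressFormat that is actually passed to compress(), so a JPEG can never end up in a file named .png. A minimal sketch (the helper name is hypothetical):
private static File newImageFile(File dir, String baseName, Bitmap.CompressFormat format) {
    // Keep the extension in sync with the encoder.
    String extension = (format == Bitmap.CompressFormat.PNG) ? ".png" : ".jpg";
    return new File(dir, baseName + extension);
}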

How to get Alpha of a Bitmap?

I have two problems.
First, I am changing the alpha of a Bitmap and setting it on an ImageView, but whenever I get the Bitmap back from the ImageView it is different from how it looks in the ImageView; the RGB values are changed.
Second, I am wondering how to get the alpha of a bitmap.
imageview=(ImageView)findViewById(R.id.image);
public Bitmap ColorDodgeBlend(Bitmap source, Bitmap layer,int alpha) {
Bitmap base = source.copy(Config.ARGB_8888, true);
Bitmap blend = layer.copy(Config.ARGB_8888, false);
IntBuffer buffBase = IntBuffer.allocate(base.getWidth() * base.getHeight());
base.copyPixelsToBuffer(buffBase);
buffBase.rewind();
IntBuffer buffBlend = IntBuffer.allocate(blend.getWidth() * blend.getHeight());
blend.copyPixelsToBuffer(buffBlend);
buffBlend.rewind();
IntBuffer buffOut = IntBuffer.allocate(base.getWidth() * base.getHeight());
buffOut.rewind();
while (buffOut.position() < buffOut.limit()) {
int filterInt = buffBlend.get();
int srcInt = buffBase.get();
int redValueFilter = Color.red(filterInt);
int greenValueFilter = Color.green(filterInt);
int blueValueFilter = Color.blue(filterInt);
int redValueSrc = Color.red(srcInt);
int greenValueSrc = Color.green(srcInt);
int blueValueSrc = Color.blue(srcInt);
int redValueFinal = colordodge(redValueFilter, redValueSrc);
int greenValueFinal = colordodge(greenValueFilter, greenValueSrc);
int blueValueFinal = colordodge(blueValueFilter, blueValueSrc);
int pixel = Color.argb(alpha, redValueFinal, greenValueFinal, blueValueFinal);
buffOut.put(pixel);
}
buffOut.rewind();
base.copyPixelsFromBuffer(buffOut);
blend.recycle();
return base;
};
bmp = ColorDodgeBlend(source, layer, alpha);
imageview.setImageBitmap(bmp);
But when I try to save the bitmap from the ImageView, the RGB of the saved bitmap is different from how it appears in the ImageView; changing the alpha changes the RGB values.
public Bitmap loadBitmapFromView(View v) {
Bitmap b = Bitmap.createBitmap(v.getWidth(), v.getHeight(),
Bitmap.Config.ARGB_8888);
Canvas c = new Canvas(b);
v.draw(c);
return b;
}
Bitmap b = loadBitmapFromView(imageview);
saveBitmap(b);
private void saveBitmap(Bitmap bmp) {
try {
File f = new File(Environment.getExternalStorageDirectory()
+ "/Pictures/SketchPhoto/");
f.mkdirs();
Date d = new Date();
CharSequence s = DateFormat
.format("MM-dd-yy hh-mm-ss", d.getTime());
fileName = s.toString() + ".jpeg";
String fullf = f + "/" + fileName;
FileOutputStream fos = new FileOutputStream(fullf);
bmp.compress(Bitmap.CompressFormat.JPEG, 100, fos);
fos.close();
Toast.makeText(getApplicationContext(), "Sketch Saved", 100).show();
} catch (Exception ex) {
ex.printStackTrace();
}
}
I have done some research and found that it only happens when the value of alpha is smaller than 255.
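Regarding the second question, the per-pixel alpha of a Bitmap can be read back with Color.alpha(); a minimal sketch:
// Read the alpha of one pixel (0..255); x and y are pixel coordinates.
int pixel = bmp.getPixel(x, y);
int alpha = Color.alpha(pixel);
// Note: JPEG has no alpha channel, so alpha is discarded by CompressFormat.JPEG; save as PNG to preserve it.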

Android ZXing Get Barcode Image

I am using the ZXing library to generate a barcode in my Android application:
Intent intent = new Intent("com.google.zxing.client.android.ENCODE");
intent.putExtra("ENCODE_FORMAT", "UPC_A");
intent.putExtra("ENCODE_DATA", "55555555555");
startActivityForResult(intent,0);
Is there any way to save the generated image in my application, which is calling ZXing? I see that in my onActivityResult the Intent is null.
Thanks in advance for your help
Take the view's drawing cache and save it in a bitmap, something like this:
View myBarCodeView = view.getRootView();
// Enable the drawing cache first, else getDrawingCache() might return null
myBarCodeView.setDrawingCacheEnabled(true);
// Save it in a bitmap
Bitmap mBitmap = myBarCodeView.getDrawingCache();
OR
draw your own barcode or QR CODE
//Change the writers as per your need
private void generateQRCode(String data) {
com.google.zxing.Writer writer = new QRCodeWriter();
String finaldata =Uri.encode(data, "ISO-8859-1");
try {
BitMatrix bm = writer.encode(finaldata,BarcodeFormat.QR_CODE, 350, 350);
mBitmap = Bitmap.createBitmap(350, 350, Config.ARGB_8888);
for (int i = 0; i < 350; i++) {
for (int j = 0; j < 350; j++) {
mBitmap.setPixel(i, j, bm.get(i, j) ? Color.BLACK: Color.WHITE);
}
}
} catch (WriterException e) {
e.printStackTrace();
}
if (mBitmap != null) {
mImageView.setImageBitmap(mBitmap);
}
}
public void generateBarCode(String data){
com.google.zxing.Writer c9 = new Code128Writer();
try {
BitMatrix bm = c9.encode(data,BarcodeFormat.CODE_128,350, 350);
mBitmap = Bitmap.createBitmap(350, 350, Config.ARGB_8888);
for (int i = 0; i < 350; i++) {
for (int j = 0; j < 350; j++) {
mBitmap.setPixel(i, j, bm.get(i, j) ? Color.BLACK : Color.WHITE);
}
}
} catch (WriterException e) {
e.printStackTrace();
}
if (mBitmap != null) {
mImageView.setImageBitmap(mBitmap);
}
}
Once you get the bitmap image, just save it:
//create a file to write bitmap data
File f = new File(FilePath, FileName+".png");
f.createNewFile();
//Convert bitmap to byte array
ByteArrayOutputStream bos = new ByteArrayOutputStream();
ImageBitmap.compress(CompressFormat.PNG, 0, bos);
byte[] bytearray = bos.toByteArray();
//Write bytes in file
FileOutputStream fos = new FileOutputStream(f);
fos.write(bytearray);
fos.flush();
fos.close();
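Equivalently, you can skip the intermediate ByteArrayOutputStream and compress straight to the file; a minimal sketch using the same names as above:
// Compress the bitmap directly into the file (the quality argument is ignored for PNG).
FileOutputStream fos = new FileOutputStream(f);
ImageBitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
fos.flush();
fos.close();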
You can also check a small library on GitHub that I created for generating barcodes and QR codes:
GZxingEncoder Encoder = GZxingEncoder.getInstance();
Encoder.initalize(this);
//To generate bar code use this
Bitmap bitmap = Encoder.generateBarCode_general("some text");
It is not returned in the Intent right now. There's no way to get it. You could suggest a patch to make it be returned -- it is probably a couple days' work. Or try Girish's approach, which is just to embed the encoding directly.
To store the scanned image in ZXing, you have to override the drawResultPoints() method in the CaptureActivity class:
String root = Environment.getExternalStorageDirectory().toString();
File myDir = new File(root);
myDir.mkdirs();
Random generator = new Random();
int n = 10000;
n = generator.nextInt(n);
String fname = "Image-"+ n +".jpg";
File file = new File (myDir, fname);
if (file.exists ()) file.delete ();
try {
FileOutputStream out = new FileOutputStream(file);
barcode.compress(Bitmap.CompressFormat.JPEG, 90, out);
out.flush();
out.close();
} catch (Exception e) {
Toast.makeText(getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
}
This will save the scanned image in the root directory of the SD card; you can customize it to save to any particular folder you need. The image it stores is the scanned image that appears as a ghost image while you scan.
