I've created a camera app with Android Studio and want to save the captured image to the gallery.
I use the Camera2 API and don't really know how to save the picture.
Moreover, I don't know where my photo gets stored. The app says: Saved: /storage/emulated/1.jpg.
Here is some code:
mFile = new File(Environment.getExternalStorageDirectory() + "1.jpg");
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
= new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
}
};
private static class ImageSaver implements Runnable {
/**
* The JPEG image
*/
private final Image mImage;
/**
* The file we save the image into.
*/
private final File mFile;
public ImageSaver(Image image, File file) {
mImage = image;
mFile = file;
}
@Override
public void run() {
ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
FileOutputStream output = null;
try {
output = new FileOutputStream(mFile);
output.write(bytes);
} catch (IOException e) {
e.printStackTrace();
} finally {
mImage.close();
if (null != output) {
try {
output.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}
The next problem is that I don't know how to store more than one photo; in this case, 1.jpg is always overwritten.
To add your picture to the gallery:
private void galleryAddPic() {
Intent mediaScanIntent = new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
File f = new File(mCurrentPhotoPath);
Uri contentUri = Uri.fromFile(f);
mediaScanIntent.setData(contentUri);
this.sendBroadcast(mediaScanIntent);
}
See here
To save multiple pictures, generate a new file name for every picture (use the date and time, or a UUID).
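A minimal sketch of that, assuming a helper like the one below (the method name and target directory are illustrative, not from your code). Note that Environment.getExternalStorageDirectory() + "1.jpg" concatenates without a path separator, which the two-argument File constructor avoids:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

// Build a unique JPEG file name from a timestamp so captures are never overwritten.
// The public Pictures directory here is just an example target.
private File createImageFile() {
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(new Date());
    File dir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
    // File(dir, name) inserts the path separator for you
    return new File(dir, "IMG_" + timeStamp + ".jpg");
}

Then assign mFile = createImageFile(); before each capture instead of reusing the fixed 1.jpg path.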
I am trying to get JPEG images from both cameras in parallel on the Snapdragon 820 platform.
I am not getting the first camera's image callback; I only get the second camera's JPEG callback.
Here is my code:
protected void takePictureBack() {
Log.d(TAG, "takePictureBack() called");
if (null == cameraDeviceBack) {
Log.e(TAG, "cameraDeviceBack is null");
return;
}
try {
final File file_back = new File(Environment.getExternalStorageDirectory() + "/pic_back.jpg");
final CaptureRequest.Builder captureBuilderBack = cameraDeviceBack.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
List<Surface> outputSurfaces = new ArrayList<Surface>(3);
outputSurfaces.add(new Surface(mTextureViewBack.getSurfaceTexture()));
ImageReader reader = ImageReader.newInstance(640, 480, ImageFormat.JPEG, 1);
outputSurfaces.add(reader.getSurface());
captureBuilderBack.addTarget(reader.getSurface());
ImageReader.OnImageAvailableListener readerListenerBack = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.d(TAG, "onImageAvailable() called with: reader = [" + reader + "]");
if (reader.getImageFormat() == ImageFormat.JPEG) {
Log.d(TAG, "onImageAvailable() called with back: reader = JPEG");
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null) {
image.close();
}
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file_back);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListenerBack, mBackgroundHandlerBack);
captureBuilderBack.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
final CameraCaptureSession.CaptureCallback captureListenerBack = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
if (DEBUG) Log.d(TAG, "onCaptureCompleted: take picture back successfully");
//Toast.makeText(getActivity(), "Take picture successfully", Toast.LENGTH_SHORT).show();
createCameraPreviewBack();
mCaptureResultBack = result;
}
};
cameraDeviceBack.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilderBack.build(), captureListenerBack, mBackgroundHandlerBack);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandlerBack);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
The front camera capture code is the same.
Capturing a JPEG from each camera individually works fine.
Any idea why I am not getting both JPEG image callbacks?
I found the solution: the ImageReader needs to be kept as a global variable (a field) rather than a local.
With this change I get the JPEG callback from both cameras.
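A minimal sketch of that change, based on the code above (the field names mReaderBack and mReaderFront are illustrative): keeping each reader in a field prevents it from being garbage collected before its onImageAvailable callback is delivered.

// Keep strong references to both readers as fields instead of local variables.
private ImageReader mReaderBack;
private ImageReader mReaderFront;

protected void takePictureBack() {
    // ... same setup as above ...
    mReaderBack = ImageReader.newInstance(640, 480, ImageFormat.JPEG, 1);
    outputSurfaces.add(mReaderBack.getSurface());
    captureBuilderBack.addTarget(mReaderBack.getSurface());
    mReaderBack.setOnImageAvailableListener(readerListenerBack, mBackgroundHandlerBack);
    // ... same capture session code as above ...
}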
I have a chat app and I need to save the images that the user sends and receives. For the images the user sends, I am saving them to an image folder like so:
private void imageDownload(final String url){
Picasso.with(getContext())
.load(url)
.into(getTarget(url));
}
//target to save
private static Target getTarget(final String url){
Target target = new Target(){
@Override
public void onBitmapLoaded(final Bitmap bitmap, Picasso.LoadedFrom from) {
new Thread(new Runnable() {
@Override
public void run() {
File heyJudeFile = new File(Environment.getExternalStorageDirectory().getPath() + "/Hey Jude");
if (!heyJudeFile.exists()){
heyJudeFile.mkdirs();
}
localProfilePictureAdress = Environment.getExternalStorageDirectory().getPath() + "/Hey Jude" + "/" + url.substring(url.lastIndexOf("/")+1);
File file = new File(localProfilePictureAdress);
try {
file.createNewFile();
FileOutputStream ostream = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.JPEG, 80, ostream);
ostream.flush();
ostream.close();
} catch (IOException e) {
Log.e("IOException", e.getLocalizedMessage());
}
}
}).start();
}
@Override
public void onBitmapFailed(Drawable errorDrawable) {
}
@Override
public void onPrepareLoad(Drawable placeHolderDrawable) {
}
};
return target;
}
But for the images the user receives, I want to save them without making them viewable in the gallery. How does that work?
Thanks
Use the method below to hide media from the gallery:
/* To hide media files from the gallery */
public void createNoMedia(String myDir){
File noMediaFile = new File(myDir, ".nomedia");
if (!noMediaFile.exists()) {
try {
noMediaFile.createNewFile();
} catch (IOException e) {
e.printStackTrace();
}
}
}
/* To use this */
dir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES) + "/MyImages/";
File newDir = new File(dir);
newDir.mkdirs();
createNoMedia(dir);
You also need the WRITE_EXTERNAL_STORAGE permission in your manifest.xml:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
You can also do it by changing the name of your file to an unknown extension such as ".example"; then it does not appear in the gallery.
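A minimal sketch of that approach, reusing the directory and file name from the question's code (the ".example" extension is arbitrary):

// Save the received image with a non-media extension so the gallery's media scanner ignores it.
File imagesDir = new File(Environment.getExternalStorageDirectory().getPath() + "/Hey Jude");
imagesDir.mkdirs();
File hidden = new File(imagesDir, url.substring(url.lastIndexOf("/") + 1) + ".example");
try {
    FileOutputStream ostream = new FileOutputStream(hidden);
    bitmap.compress(Bitmap.CompressFormat.JPEG, 80, ostream);
    ostream.flush();
    ostream.close();
} catch (IOException e) {
    e.printStackTrace();
}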
I am using the GPUImage Android library to apply filters to the camera preview and to save the image with the filters applied after taking a picture. The problem is that when I take the picture, I can't get the image with the filters.
I am using the following code:
@Override
public void onPictureTaken(byte[] data, Camera camera) {
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
mGPUImage.setImage(bitmap);
bitmap = mGPUImage.getBitmapWithFilterApplied();
saveImage(bitmap);
}
The sample code in GPUImage's library page (https://github.com/CyberAgent/android-gpuimage/#sample-code) says:
With preview:
@Override
public void onCreate(final Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity);
Uri imageUri = ...;
mGPUImage = new GPUImage(this);
mGPUImage.setGLSurfaceView((GLSurfaceView) findViewById(R.id.surfaceView));
mGPUImage.setImage(imageUri); // this loads the image on the current thread, should it be run in a background thread?? (I can't understand this line)
mGPUImage.setFilter(new GPUImageSepiaFilter());
// Later, when the image should be saved:
mGPUImage.saveToPictures("GPUImage", "ImageWithFilter.jpg", null);
}
Even with their sample I can't save the image with the filter applied.
Could somebody please explain it to me?
Use this code:
// Call this method when you are ready to save the image.
// I am calling it on click of the save button.
private void takePhoto() {
releaseCamera();
new AsyncTask<Void, Void, Void>() {
Bitmap bitmap;
@Override
protected void onPreExecute() {
super.onPreExecute();
try {
bitmap = gpuImageView.capture();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
@Override
protected Void doInBackground(Void... params) {
File dir = Util.getCameraDirectory(); // directory where you want to save image
if (!dir.exists()) {
dir.mkdirs();
}
String filename = getString(R.string.app_name) + System.currentTimeMillis() + ".jpg";
File file = new File(dir, filename);
try {
FileOutputStream fileOutputStream = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.JPEG, 90, fileOutputStream);
fileOutputStream.flush();
fileOutputStream.close();
Intent intent =
new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE);
intent.setData(Uri.fromFile(file));
sendBroadcast(intent);
} catch (Exception exception) {
exception.printStackTrace();
}
return null;
}
@Override
protected void onPostExecute(Void aVoid) {
super.onPostExecute(aVoid);
prepareCamera();
Toast.makeText(MyActivity.this, R.string.msg_after_save, Toast.LENGTH_SHORT).show();
}
}.execute();
}
private void prepareCamera() {
camera = Camera.open(cameraId);
Camera.Parameters parameters = camera.getParameters();
Camera.Size size = getOptimalPreviewSize(camera.getParameters().getSupportedPreviewSizes(), getWindowManager().getDefaultDisplay().getWidth(), getWindowManager().getDefaultDisplay().getHeight());
parameters.setPreviewSize(size.width, size.height);
if (parameters.getSupportedFocusModes().contains(
Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
parameters
.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
}
parameters.setPreviewFormat(ImageFormat.NV21);
camera.setParameters(parameters);
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info);
int orientation = getCameraDisplayOrientation(info);
boolean flipHorizontal = info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT;
gpuImageView.getGPUImage().setUpCamera(camera, orientation,
flipHorizontal, false);
}
private void releaseCamera() {
if (camera != null) {
camera.stopPreview();
camera.setPreviewCallback(null);
camera.release();
camera = null;
}
}
Let me know if you face any further issues.
Also, check again that the write permission is declared in the manifest.
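Besides the manifest entry, on Android 6.0 and above the storage permission also has to be granted at runtime; a minimal check before saving, assuming it runs inside an Activity (REQUEST_WRITE_STORAGE is an arbitrary request code you define yourself):

// Verify WRITE_EXTERNAL_STORAGE at runtime (required on API 23+).
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE},
            REQUEST_WRITE_STORAGE); // arbitrary request code defined in your class
}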
I'm trying to share an image on Instagram from my app. I have the URL of the image and I am using the Picasso library to download it.
Target target = new Target() {
@Override
public void onBitmapLoaded(final Bitmap bitmap, Picasso.LoadedFrom from) {
Log.d(TAG, "Bitmap Loaded");
File outputDir = getApplicationContext().getCacheDir(); // context being the Activity pointer
try {
File outputFile = File.createTempFile("instagram", "png", outputDir);
outputFile.createNewFile();
FileOutputStream ostream = new FileOutputStream(outputFile);
bitmap.compress(Bitmap.CompressFormat.JPEG,100,ostream);
ostream.close();
Log.d(TAG, "Image downloaded");
shareOnInstagram(Uri.fromFile(outputFile));
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public void onBitmapFailed(Drawable errorDrawable) {
}
@Override
public void onPrepareLoad(Drawable placeHolderDrawable) {
}
};
Picasso.with(this).load(imageUrl).into(target);
But onBitmapLoaded is never called. Is there any other way to share an image on Instagram from a URL? The intent that shares to Instagram takes an Intent.EXTRA_STREAM parameter, which should be a media path on the device.
How do I convert an image from a URL into that type?
Picasso only keeps a weak reference to the Target, so in your case it will be garbage collected. As a result, onBitmapLoaded is never called.
You should keep a strong reference to the Target (make it a member of your class).
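A minimal sketch of that fix (the field and method names are illustrative): promote the Target to a field so Picasso's weak reference is not the only reference to it.

// Keep the Target as a field of the Activity/Fragment so it outlives the load request.
private Target mShareTarget;

private void downloadAndShare(String imageUrl) {
    mShareTarget = new Target() {
        @Override
        public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom from) {
            // same save-and-share logic as in the question
        }

        @Override
        public void onBitmapFailed(Drawable errorDrawable) { }

        @Override
        public void onPrepareLoad(Drawable placeHolderDrawable) { }
    };
    Picasso.with(this).load(imageUrl).into(mShareTarget);
}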
I use this approach:
public class ShareToOtherApp extends AsyncTask<Bitmap, Void, Uri> {
@Override
protected Uri doInBackground(Bitmap... bitmaps) {
return bitmaps.length > 0 ? BitmaptoUri(bitmaps[0]) : null;
}
@Override
protected void onPostExecute(Uri uri) {
Intent shareIntent = new Intent();
shareIntent.setAction(Intent.ACTION_SEND);
if (uri != null) {
shareIntent.putExtra(Intent.EXTRA_STREAM, uri);
}
shareIntent.setType("image/*");
Intent chooserIntent = Intent.createChooser(shareIntent, "Share Image");
chooserIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
MyApp.GetContext().startActivity(chooserIntent);
}
public File GetSDCardDir(){
boolean ISSDCard;
File[] Dirs = ContextCompat.getExternalFilesDirs(MyApp.GetContext(), null);
ISSDCard = false;
for (File Dir : Dirs) {
if (Dir != null) {
if (Dir.getPath().contains("sdcard")) {
ISSDCard = true;
break;
}
}
}
File SDCardDir;
if(ISSDCard && Dirs[Dirs.length -1] != null){
SDCardDir = Dirs[Dirs.length -1];
}else{
SDCardDir = Dirs[0];
}
return SDCardDir;
}
public Uri BitmaptoUri(Bitmap bitmap){
Uri uri = null;
try {
File file = new File(GetSDCardDir() , HConstants.IMG_FILE_NAME + ".jpg");
if(file.getParentFile() != null){
file.getParentFile().mkdirs();
}else{
GetSDCardDir().mkdirs();
}
file.createNewFile();
FileOutputStream out = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
out.close();
uri = Uri.fromFile(file);
} catch (IOException e) {
e.printStackTrace();
}
return uri;
}
}
And finally, to use it:
new ShareToOtherApp().execute(bitmap);
I want to capture an image from a custom camera and then display it in another activity where I can add other PNG emoticons. I have this code, but it doesn't show any photo after the activity launches.
Please see the changes I have made to your code.
1. Call the new Intent after taking the photo and then pass the file path, as I do below.
MainActivity:
ImageView mCaptureButton = (ImageView) findViewById(R.id.button_capture);
mCaptureButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
// get an image from the camera
mCamPreview.captureImage(MainActivity.this);
//starting(); // don't call Intent here
}
});
}
CameraPreview class:
private Activity _activity ;
public void captureImage(Activity activity){
this._activity = activity;
mCamera.takePicture(null, null, mJpegCallback);
}
PictureCallback mJpegCallback = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
final Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
// Store bitmap to local storage
FileOutputStream out = null;
try {
// Prepare file path to store bitmap
// This will create Pictures/MY_APP_NAME_DIR/
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_PICTURES), "AVORI");
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d(TAG, "failed to create directory");
return;
}
}
// Bitmap will be stored at /Pictures/MY_APP_NAME_DIR/YOUR_FILE_NAME.jpg
String filePath = mediaStorageDir.getPath() + File.separator + "Av.jpg";
out = new FileOutputStream(filePath);
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, out);
//call EditPicture.class here
Intent i = new Intent(_activity, EditPicture.class);
i.putExtra("FILE_PATH", filePath );
_activity.startActivity(i);
} catch (Exception e) {
e.printStackTrace();
} finally {
try {
if (out != null) {
out.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
};
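For completeness, a minimal sketch of how EditPicture could read the "FILE_PATH" extra and display the photo; the layout and ImageView id below are placeholders, not from the original answer:

public class EditPicture extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_edit_picture); // placeholder layout

        // Decode the file path passed from the camera callback and show it.
        String filePath = getIntent().getStringExtra("FILE_PATH");
        Bitmap bitmap = BitmapFactory.decodeFile(filePath);
        ImageView preview = (ImageView) findViewById(R.id.image_preview); // placeholder id
        preview.setImageBitmap(bitmap);
    }
}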