Photos captured from the Android camera are completely black

I built a custom camera and am trying to capture a picture. Since the raw data is YUV, I convert it to RGB using this function:
static public void decodeYUV420SP(byte[] rgbBuf, byte[] yuv420sp, int width, int height)
However, the saved photo is completely black; there is no content in it.
I also tried the following approach:
mBitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
but the app crashed.
Are there any other effective ways to save a photo? Thank you!

An old post, but it speaks of a similar problem that I have so I might as well answer the part I know :)
You're probably doing it wrong. I suggest you use the JPEG callback to store the image:
mCamera.takePicture(null, null, callbackJPEG);
This way you will get JPEG data into the routine which you can store into a file unmodified:
final Camera.PictureCallback callbackJPEG = new Camera.PictureCallback()
{
    @Override
    public void onPictureTaken(byte[] data, Camera camera)
    {
        // Needs <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
        File sdCard = Environment.getExternalStorageDirectory();
        File file = new File(sdCard, "pic.jpg");
        try {
            FileOutputStream fil = new FileOutputStream(file);
            fil.write(data);
            fil.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
};
As far as the black picture goes, I have found that placing a simple Thread.sleep(250) between camera.startPreview() and camera.takePicture() takes care of that particular problem on my Galaxy Nexus.
I have no idea why this delay is necessary. Even if I add camera.setOneShotPreviewCallback() and call camera.takePicture() from the callback, the image comes out black if I don't first delay...
Oh, and the delay is not just "some" delay; it has to be fairly long. For example, 250 ms sometimes works and sometimes doesn't on my phone.
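For reference, a minimal sketch of that workaround. Handler.postDelayed is used here instead of a raw Thread.sleep so the UI thread is not blocked; mCamera, callbackJPEG and the 250 ms value are taken from this answer and are not guaranteed for other devices:
mCamera.startPreview();
// Wait before capturing; ~250 ms was the empirical value mentioned above.
new Handler().postDelayed(new Runnable() {
    @Override
    public void run() {
        mCamera.takePicture(null, null, callbackJPEG);
    }
}, 250);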

The completely black photo is the result of calling mCamera.takePicture() immediately after mCamera.startPreview(). Android needs enough time to finish its autofocus activity before taking the actual picture; the blackness is the result of erratic exposure caused by interrupting the autofocus.
I recommend calling mCamera.autoFocus() right after mCamera.startPreview(). mCamera.takePicture() should then be called from the autofocus callback.
This flow ensures that the picture is taken after autofocus has completed, and it removes the blackness and exposure issues from the captured image.
The delay mentioned in Velis' answer works on some devices only because it happens to give them time to complete autofocus. Using the proper callback flow removes that arbitrary delay and should work on every device.
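A minimal sketch of this flow (assuming mCamera is an open android.hardware.Camera and jpegCallback is the PictureCallback that saves the JPEG):
mCamera.startPreview();
mCamera.autoFocus(new Camera.AutoFocusCallback() {
    @Override
    public void onAutoFocus(boolean success, Camera camera) {
        // Focus has settled (successfully or not), so exposure is stable
        // and it is now safe to capture without getting a black frame.
        camera.takePicture(null, null, jpegCallback);
    }
});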

I solved this issue by using the following argument:
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
When I used TEMPLATE_STILL_CAPTURE instead of TEMPLATE_PREVIEW, the image was captured as a fully black image. Switching the template worked in my case.
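For context, a sketch of where that template choice sits in the camera2 flow (assuming an open cameraDevice, a configured captureSession, an imageReaderSurface and a backgroundHandler; these names are placeholders, not from the original post):
// Using TEMPLATE_PREVIEW here instead of TEMPLATE_STILL_CAPTURE avoided the black image.
final CaptureRequest.Builder captureBuilder =
        cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.addTarget(imageReaderSurface);
captureSession.capture(captureBuilder.build(), null, backgroundHandler);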

Related

Take a picture on android camera without click on button or screen

I want to receive a code through SMS from another device in a BroadcastReceiver. When that code is received, I want to open the camera activity immediately, take a picture of the surroundings, and then close the camera automatically. If anyone understands me, please help.
No. That would be a security breach!
Android's camera API doesn't allow photos to be taken automatically without the knowledge of the end user.
However, you could write your own custom camera app and try to build this feature on top of it.
Try
camera.takePicture(null, null, pictureCallback);
private PictureCallback pictureCallback = new PictureCallback() {
    @Override
    public void onPictureTaken(final byte[] data, final Camera camera) {
        // Save or process the JPEG data here.
    }
};
You can define a particular code word for the message; when a message containing that code arrives, have the receiver call the method that takes the photo.
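A rough sketch of that idea, assuming a custom camera activity named CameraActivity and a code word "TAKE_PIC" (both placeholders); the receiver must be registered in the manifest for android.provider.Telephony.SMS_RECEIVED and the app needs the RECEIVE_SMS permission:
public class SmsCodeReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        // Telephony.Sms.Intents.getMessagesFromIntent requires API 19+.
        for (SmsMessage sms : Telephony.Sms.Intents.getMessagesFromIntent(intent)) {
            if ("TAKE_PIC".equals(sms.getMessageBody().trim())) {
                // Launch the custom camera activity, which takes the
                // picture and then finishes itself automatically.
                Intent camera = new Intent(context, CameraActivity.class);
                camera.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(camera);
            }
        }
    }
}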

Change "Exif" data between onPictureTaken and write()?

I am developing an app for a custom Android device. It is still early in development, and it is possible that the camera may be physically rotated 90 degrees relative to the rest of the device. This means there is scope for great confusion between portrait and landscape for any images it takes. For this reason I would like absolute control over the Exif data in any images the camera takes; the portrait vs. landscape information in the camera parameters may be incorrect. So I would like to be able to force a change in the Exif data inside onPictureTaken, before the image is saved. Is this possible, and if so, how?
I am struggling because examples of playing with exif data seem to either work by changing camera parameters, or by working on an already saved file - so that's either too early or too late!
public void onPictureTaken(byte[] jpg_data, Camera camera)
{
    // can I change exif data here?
    try
    {
        FileOutputStream buf = new FileOutputStream(filename);
        buf.write(jpg_data);
        //... etc.
EDIT: Maybe I am misunderstanding something here... is there Exif data already contained within the jpg_data that gets passed to onPictureTaken? Or is it optionally added?
The standard way of writing Exif data in Android is by using ExifInterface, which sadly only works on files that have already been written to disk.
If you wish to write the Exif data without using a library, you would have to do it after your FileOutputStream has finished writing the file.
If you don't mind using a library, Sanselan (now Apache Commons Imaging, http://commons.apache.org/proper/commons-imaging/) might be able to do it on a byte[] array, but the documentation is pretty limited.
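For the non-library route, a minimal sketch: write the JPEG bytes first, then re-open the file with ExifInterface and overwrite the orientation tag (the filename variable comes from the question's snippet; the ORIENTATION_ROTATE_90 value is an assumption for illustration):
public void onPictureTaken(byte[] jpg_data, Camera camera) {
    try {
        // 1. Write the JPEG to disk unmodified.
        FileOutputStream out = new FileOutputStream(filename);
        out.write(jpg_data);
        out.close();

        // 2. Re-open the file and overwrite the orientation tag.
        ExifInterface exif = new ExifInterface(filename);
        exif.setAttribute(ExifInterface.TAG_ORIENTATION,
                String.valueOf(ExifInterface.ORIENTATION_ROTATE_90));
        exif.saveAttributes();
    } catch (IOException e) {
        e.printStackTrace();
    }
}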

What is the use of the callbacks in takePicture()?

I am new to Android and the camera API, so I am not able to understand this clearly.
Why are the JPEG callback and the raw picture callback required in Camera.takePicture()? And how are these callbacks handled?
Please help. Thanks!
These are async callbacks that notify you that image data is ready after the picture has been taken.
So, for example, if you want to store the image data as a JPEG, you can write the bytes handed to the JPEG callback to a file (or decode them with BitmapFactory and re-save them in the desired format).
Similarly, if you want to take another picture, you should not do so until the app has been notified through the callback.
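A minimal sketch of the three callback slots of Camera.takePicture (mCamera is assumed to be an open Camera; the raw callback frequently receives null because many devices do not expose raw data):
Camera.ShutterCallback shutterCallback = new Camera.ShutterCallback() {
    @Override
    public void onShutter() {
        // Fired as close as possible to the moment of exposure.
    }
};

Camera.PictureCallback rawCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // Raw sensor data; many devices pass null here.
    }
};

Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // Compressed JPEG bytes, ready to write to a file.
        // The preview stops after capture; restart it before the next shot.
        camera.startPreview();
    }
};

mCamera.takePicture(shutterCallback, rawCallback, jpegCallback);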

Display an image taken by own Camera in the same Activity

In my code I have my own camera and it works and saves the pictures properly.
First I call:
myCamera.takePicture(null, null, jpegCallBack);
Then I want to save the picture and display it in the same activity I'm in now.
So in the jpegCallBack I have (fullPic is my ImageView):
try {
    FileOutputStream fos = new FileOutputStream(pictureFile);
    fos.write(data);
    fos.close();
    Media.getBitmap(getContentResolver(), Uri.fromFile(pictureFile));
    String image_string = Uri.fromFile(pictureFile).toString();
    bm = BitmapFactory.decodeFile(image_string);
    fullPic.setImageBitmap(bm);
} catch ...
But it doesn't work: the first time I press the Shot button, it saves the file but keeps displaying the camera. Sometimes, when I take the second picture, the app stops.
I've also tried with this:
fullPic.setImageURI(Uri.fromFile(pictureFile));
and it says: "Bitmap too large to be uploaded into a texture (4608x2592, max=4096x4096)"
I can reduce the size of the displayed image if necessary (but not of the saved one), but I don't know how to do it.
In the AndroidManifest I have permission for both WRITE and READ.
My XML is like this:
<LinearLayout>
    <RelativeLayout>
        <FrameLayout />  <!-- for the camera preview -->
        <Button />       <!-- to shoot the picture -->
        <ImageView />    <!-- here I want to show the picture taken -->
    </RelativeLayout>
</LinearLayout>
Why doesn't it work? Any ideas on how to display it? I have been looking at other StackOverflow links but I can't find the solution. If you need more data/code, please let me know; I tried to summarize to make this as clear as possible.
Thank you!
Edit:
This is the declaration of jpegCallBack:
private PictureCallback jpegCallBack = new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera)
Maybe I can show the image directly from that data in some way. Can I?
I tried this way but it breaks:
bm = BitmapFactory.decodeByteArray(data, 0, data.length);
fullPic.setImageBitmap(bm);
LogCat (the blue line marks where I pressed the Shot button) shows a NullPointerException. Is it possible that it occurs because I'm using a Samsung? In this thread they say that Samsung has problems with this: Capture Image from Camera and Display in Activity
You're not getting a NullPointerException but an out-of-memory error.
There is a series of very helpful articles on how to load bitmaps efficiently. The only thing you need to know is the target display size, so you know how far to scale down.
If the app is crashing on a 5 MB allocation, it means you are already at the peak of your memory usage. Please check this Android presentation on memory management.
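As a sketch of that subsampled loading, decoding the JPEG bytes straight from onPictureTaken at roughly the ImageView's size (the 600-pixel target is an arbitrary assumption; data and fullPic come from the question):
// First pass: read only the dimensions, allocating no pixel memory.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;
BitmapFactory.decodeByteArray(data, 0, data.length, options);

// Pick a power-of-two sample size so the decoded bitmap lands near the target size.
int targetSize = 600;
int inSampleSize = 1;
while (options.outWidth / (inSampleSize * 2) >= targetSize
        && options.outHeight / (inSampleSize * 2) >= targetSize) {
    inSampleSize *= 2;
}

// Second pass: decode for real at the reduced resolution.
options.inJustDecodeBounds = false;
options.inSampleSize = inSampleSize;
Bitmap bm = BitmapFactory.decodeByteArray(data, 0, data.length, options);
fullPic.setImageBitmap(bm);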

Summary: Take a picture utilizing Camera Intent and display the photo with correct orientation (works on hopefully all devices)

It seems to be the simplest thing in the world: taking a picture within your Android app using the default camera activity. However, there are many pitfalls, covered in several posts across StackOverflow and the web, such as null Intents being passed back, the orientation of the picture not being correct, or OutOfMemoryErrors.
I'm looking for a solution that allows me to
start the camera activity via the camera intent,
retrieve the Uri of the photo, and
retrieve the correct orientation of the photo.
Moreover, I would like to avoid a device configuration (manufacturer, model, os version) specific implementation as far as possible. So I'm wondering: what is the best way to achieve this?
UPDATE: January 2nd, 2014:
I tried really hard to avoid implementing different strategies based on the device manufacturer. Unfortunately, I did not get around it. Going through hundreds of posts and talking to several developers, nobody found a solution that works on all devices without implementing device manufacturer specific code.
After I posted my solution here on StackOverflow, some developers asked me to publish my code on github. So here it is now: AndroidCameraUtil on github
The code was successfully tested on a wide variety of devices with Android API-Level >= 8. For a complete list, please see the Readme file on github.
The CameraIntentHelperActivity provides the main functionality, which is described in more detail below.
Calling the default camera activity:
for Samsung and Sony devices: I call the camera activity with the method call to startActivityForResult. I only set the constant CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE. I do NOT set any other intent extras.
for all other devices: I call the camera activity with the method call to startActivityForResult as before. This time, however, I additionally set the intent extra MediaStore.EXTRA_OUTPUT and provide a URI where I want the image to be stored.
In both cases I remember the time the camera activity was started.
On camera activity result:
MediaStore: First, I try to read the photo being captured from the MediaStore. Using a managedQuery on the MediaStore content, I retrieve the latest image taken, as well as its orientation property and its timestamp (see the sketch after these steps). If I find an image and it was not taken before the camera intent was called, it is the image I was looking for. Otherwise, I dismiss the result and try one of the following approaches.
Intent extra: Second, I try to get an image Uri from intent.getData() of the returning intent. If this is not successful either, I continue with step 3.
Default photo Uri: If none of the above steps worked, I use the image Uri I originally passed to the camera activity.
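A sketch of that MediaStore lookup using a plain ContentResolver query (managedQuery is deprecated; cameraStartTime is assumed to hold the timestamp recorded when the camera intent was started, and photoUri/rotateXDegrees are the outputs):
String[] projection = {
        MediaStore.Images.ImageColumns._ID,
        MediaStore.Images.ImageColumns.DATA,
        MediaStore.Images.ImageColumns.DATE_TAKEN,
        MediaStore.Images.ImageColumns.ORIENTATION };

// Newest image first.
Cursor cursor = getContentResolver().query(
        MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
        projection, null, null,
        MediaStore.Images.ImageColumns.DATE_TAKEN + " DESC");

if (cursor != null && cursor.moveToFirst()) {
    long dateTaken = cursor.getLong(cursor.getColumnIndex(
            MediaStore.Images.ImageColumns.DATE_TAKEN));
    if (dateTaken > cameraStartTime) {
        // This is the photo the camera activity just produced.
        String path = cursor.getString(cursor.getColumnIndex(
                MediaStore.Images.ImageColumns.DATA));
        int orientation = cursor.getInt(cursor.getColumnIndex(
                MediaStore.Images.ImageColumns.ORIENTATION));
        photoUri = Uri.fromFile(new File(path));
        rotateXDegrees = orientation;
    }
}
if (cursor != null) cursor.close();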
At this point, I have retrieved the photo Uri and its orientation, which I pass to my UploadPhotoActivity.
Image processing
Please take a close look at my BitmapHelper class. It is based on the code described in detail in that tutorial.
Moreover, the shrinkBitmap method also rotates the image if required based on the orientation information extracted earlier.
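For illustration, the rotation step typically reduces to something like the following (a sketch, not the actual BitmapHelper code; bitmap and rotateXDegrees are assumed inputs):
if (rotateXDegrees != 0) {
    Matrix matrix = new Matrix();
    matrix.postRotate(rotateXDegrees);
    // Creates a rotated copy; the original can then be recycled.
    bitmap = Bitmap.createBitmap(bitmap, 0, 0,
            bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}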
I hope this is helpful to some of you.
I have tested this code with a Sony Xperia Go, Samsung Galaxy SII, Samsung Galaxy SIII mini and a Samsung Galaxy Y, and it worked on all devices!
But on the LG E400 (2.3.6) it didn't work, and you get double pictures in the gallery. So I added manufacturer.contains("lge") to the check in startCameraIntent(), which fixed the problem:
if (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony"))
        && !(manufacturer.contains("lge"))) {
    String filename = System.currentTimeMillis() + ".jpg";
    ContentValues values = new ContentValues();
    values.put(MediaStore.Images.Media.TITLE, filename);
    cameraPicUri = getContentResolver().insert(
            MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
}
On a Galaxy S3 with CM 10.1 I get a NullPointerException in BitmapHelper:
bm = BitmapFactory.decodeFileDescriptor(fileDescriptor.getFileDescriptor(), null, options);
subsequently my UploadPhotoActivity fails at:
try {
    photo = BitmapHelper.readBitmap(this, cameraPicUri);
    if (photo != null) {
        photo = BitmapHelper.shrinkBitmap(photo, 600, rotateXDegrees);
        thumbnail = BitmapHelper.shrinkBitmap(photo, 100);
        ImageView imageView = (ImageView) findViewById(R.id.sustainable_action_photo);
        imageView.setImageBitmap(photo);
    } else {
        Log.e(TAG, "IMAGE ERROR 1");
    }
} catch (Exception e) {
    Log.e(TAG, "IMAGE ERROR 2");
    e.printStackTrace();
}
at the second log (IMAGE ERROR 2).
After a couple of tries my camera broke and I got a "Could not connect to camera" error.
Tested it on a Nexus 7 and it works perfectly.
Edit: Narrowed it down to this:
fileDescriptor = context.getContentResolver().openAssetFileDescriptor(selectedImage, "r");
Although selectedImage contains this:
file:///storage/emulated/0/DCIM/Camera/IMG_20131023_183343.jpg
The openAssetFileDescriptor call throws a FileNotFoundException. I checked the file system and the image is not saved at this location; the cameraPicUri in TakePhotoActivity points to a nonexistent image. I am currently checking where it all goes wrong.
Edit2: I figured out the error: since the device is a Samsung and tells the app that it is a Samsung device, your Samsung-specific fixes are applied. CyanogenMod does not need those fixes, though, and in the end the code breaks. Once you remove
(manufacturer.contains("samsung")) &&
it works. Since this is a custom ROM, you could not have planned for that, of course. I am trying to figure out a way to detect whether the device is running CyanogenMod and then include this in your code.
Thanks for a nice camera fix!
Edit3: I fixed it to run on CyanogenMod on the Galaxy S3 by changing your check to this:
if (getPackageManager().hasSystemFeature("com.cyanogenmod.android") || (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))))
Well, now it sometimes works and sometimes it does not. Strange.
I experienced some problems when using this with a Sony Xperia Z5.
I added this and it got a lot better:
if (buildType.contains("sony") && buildDevice.contains("e5823")) {
    setPreDefinedCameraUri = true;
}
But 4 times out of 22 it restarted the camera, and once it restarted twice. I restarted the app before every test.
Is there some way to get around this, or do I just have to accept this result?
The thing is, if the camera restarts I can press the back button twice and, boom, the image is there in my ImageView and saved.
