I want to receive a code through SMS from another device in a BroadcastReceiver, and when that code arrives, open the camera activity immediately, take a picture of the surroundings, and then close the camera automatically. If anyone understands me, please help me.
NO
That would be a security breach!
Android's camera API doesn't allow photos to be taken automatically without the knowledge of the end user.
However, you could write your own custom camera App and try to build this feature on top of it.
Try

camera.takePicture(null, null, pictureCallback);

private PictureCallback pictureCallback = new PictureCallback() {
    @Override
    public void onPictureTaken(final byte[] data, final Camera camera) {
        // data holds the compressed JPEG bytes; save or process them here
    }
};
You can define a particular code for the message; when a message containing that code is received, call the method that takes the photo.
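As a rough sketch of that idea (assuming the pre-KitKat "android.provider.Telephony.SMS_RECEIVED" broadcast; CustomCameraActivity and the trigger code are hypothetical placeholders for your own camera code):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.telephony.SmsMessage;

public class SmsCodeReceiver extends BroadcastReceiver {
    // Arbitrary trigger code; pick your own
    private static final String TRIGGER_CODE = "TAKE_PHOTO_1234";

    @Override
    public void onReceive(Context context, Intent intent) {
        Bundle bundle = intent.getExtras();
        if (bundle == null) return;
        Object[] pdus = (Object[]) bundle.get("pdus");
        if (pdus == null) return;
        for (Object pdu : pdus) {
            SmsMessage sms = SmsMessage.createFromPdu((byte[]) pdu);
            if (TRIGGER_CODE.equals(sms.getMessageBody().trim())) {
                // CustomCameraActivity is your own activity that opens the
                // camera, calls takePicture() and finishes itself.
                Intent camera = new Intent(context, CustomCameraActivity.class);
                camera.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(camera);
            }
        }
    }
}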
I am developing an app for a custom Android device. It is still early in development and it is possible that the camera may be physically rotated 90 degrees relative to the rest of the device. This means there is scope for great confusion between portrait and landscape in any images it takes. For this reason I would like absolute control over the Exif data in any images the camera takes. The portrait-vs-landscape information in the camera parameters may be incorrect, so I would like to be able to force a change in the Exif data inside onPictureTaken, before the image is saved. Is this possible, and if so, how?
I am struggling because examples of playing with exif data seem to either work by changing camera parameters, or by working on an already saved file - so that's either too early or too late!
public void onPictureTaken(byte[] jpg_data, Camera camera)
{
    // can I change exif data here?
    try
    {
        FileOutputStream buf = new FileOutputStream(filename);
        buf.write(jpg_data);
        //... etc.
EDIT: Maybe I am misunderstanding something here... is there Exif data already contained within the jpg_data that gets passed to onPictureTaken? Or is it optionally added?
The standard way of writing Exif data in Android is by using ExifInterface, which sadly only works on files that have already been written to disk.
If you wish to write the Exif data without using a library, you would have to do it after your FileOutputStream has finished writing the file.
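For example, a minimal sketch of that approach, assuming filename is the JPEG you just wrote above and the orientation you want to force is 90 degrees:

try {
    ExifInterface exif = new ExifInterface(filename);
    // Force the orientation tag regardless of what the camera reported
    exif.setAttribute(ExifInterface.TAG_ORIENTATION,
            String.valueOf(ExifInterface.ORIENTATION_ROTATE_90));
    exif.saveAttributes(); // rewrites the Exif block inside the JPEG
} catch (IOException e) {
    e.printStackTrace();
}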
If you don't mind using a library, Sanselan (now Apache Commons Imaging, http://commons.apache.org/proper/commons-imaging/) might have the ability to do it on a byte[] array, but the documentation is pretty limited.
I am new to Android and the camera API, so I am not able to understand this clearly.
Why are the JPEG callback and the raw picture callback required in Camera.takePicture()? And how are these callbacks handled?
Please help. Thanks.
These are async callbacks that notify you that the image data is ready after the picture is taken.
So, for example, if you want to store the image data as a JPEG, use BitmapFactory etc. to store it in the desired format.
Similarly, if you want to take another picture, you should wait until the app is notified by the callback.
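A minimal sketch of how the callbacks fit together (the raw callback can simply be passed as null; many devices never deliver raw data):

Camera.ShutterCallback shutterCallback = new Camera.ShutterCallback() {
    @Override
    public void onShutter() {
        // Fires at the moment of capture, e.g. to play a shutter sound
    }
};

Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        // data contains the compressed JPEG bytes; only now is it safe
        // to save them or to take the next picture
        Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
        camera.startPreview(); // the preview stops after each capture
    }
};

camera.takePicture(shutterCallback, null /* raw */, jpegCallback);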
In my code I have my own camera and it works and saves the pictures properly.
First I call:
myCamera.takePicture(null, null, jpegCallBack);
Then I want to save the picture and display it in the same activity I'm in now.
So in the jpegCallBack I have (fullPic is my ImageView):
try {
    FileOutputStream fos = new FileOutputStream(pictureFile);
    fos.write(data);
    fos.close();
    Media.getBitmap(getContentResolver(), Uri.fromFile(pictureFile));
    String image_string = Uri.fromFile(pictureFile).toString();
    bm = BitmapFactory.decodeFile(image_string);
    fullPic.setImageBitmap(bm);
} catch ...
But it doesn't work: the first time I press the shot button, it saves the file but keeps displaying the camera preview. Sometimes when I take the second picture the app stops.
I've also tried with this:
fullPic.setImageURI(Uri.fromFile(pictureFile));
and it says: "Bitmap too large to be uploaded into a texture (4608x2592, max=4096x4096)"
I can reduce the size of the displayed image if it is necessary (but not the saved one), but I don't know how to do it.
In the AndroidManifest I have permission for both WRITE and READ.
My XML is like this:
<LinearLayout>
    <RelativeLayout>
        <FrameLayout />  <!-- for the camera preview -->
        <Button />       <!-- to shoot the picture -->
        <ImageView />    <!-- here I want to show the picture taken -->
    </RelativeLayout>
</LinearLayout>
Why doesn't it work? Any ideas on how to display it? I have been looking at other StackOverflow threads but I can't find the solution. If you need more data/code, please let me know; I tried to summarize to make this as clear as possible.
Thank you!
Edit:
This is the declaration of jpegCallBack:

private PictureCallback jpegCallBack = new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera)

Maybe I can show the image directly from that data in some way. Can I?
I tried this way but it breaks:
bm = BitmapFactory.decodeByteArray(data, 0, data.length);
fullPic.setImageBitmap(bm);
LogCat: at the blue line I pressed the shot button.
NullPointerException: is it possible that it occurs because I'm using a Samsung? In this thread they say that Samsung has problems with this: Capture Image from Camera and Display in Activity
You're not getting a NullPointerException but an out-of-memory error.
There is a series of very helpful articles on how to load bitmaps efficiently. The only thing you need to know is the target display size, so you know what size to scale down to.
If the app is crashing on a 5MB allocation, it means you are already at the peak of your memory usage. Please check this Android presentation on memory management.
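As a sketch of the standard two-pass decode those articles describe (the method name is illustrative): read only the bounds first, pick a power-of-two inSampleSize, then decode scaled down to roughly the ImageView size:

public static Bitmap decodeScaled(String path, int reqWidth, int reqHeight) {
    BitmapFactory.Options options = new BitmapFactory.Options();
    options.inJustDecodeBounds = true;   // first pass: dimensions only
    BitmapFactory.decodeFile(path, options);

    int inSampleSize = 1;
    while (options.outWidth / (inSampleSize * 2) >= reqWidth
            && options.outHeight / (inSampleSize * 2) >= reqHeight) {
        inSampleSize *= 2;               // halve until close to the target
    }

    options.inJustDecodeBounds = false;  // second pass: the real decode
    options.inSampleSize = inSampleSize;
    return BitmapFactory.decodeFile(path, options);
}

For the 4608x2592 image above and an 800x600 target this yields inSampleSize = 4, i.e. a 1152x648 bitmap, safely under the 4096x4096 texture limit.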
It seems to be the simplest thing in the world: taking a picture within your Android app using the default camera activity. However, there are many pitfalls which are covered in several posts across StackOverflow and the web as, for instance, Null Intents being passed back, the orientation of the picture not being correct or OutOfMemoryErrors.
I'm looking for a solution that allows me to
start the camera activity via the camera intent,
retrieve the Uri of the photo, and
retrieve the correct orientation of the photo.
Moreover, I would like to avoid a device configuration (manufacturer, model, os version) specific implementation as far as possible. So I'm wondering: what is the best way to achieve this?
UPDATE: January 2nd, 2014:
I tried really hard to avoid implementing different strategies based on the device manufacturer. Unfortunately, I could not get around it. Going through hundreds of posts and talking to several developers, nobody had found a solution that works on all devices without implementing device-manufacturer-specific code.
After I posted my solution here on StackOverflow, some developers asked me to publish my code on github. So here it is now: AndroidCameraUtil on github
The code was successfully tested on a wide variety of devices with Android API-Level >= 8. For a complete list, please see the Readme file on github.
The CameraIntentHelperActivity provides the main functionality, which is described in more detail below.
Calling the default camera activity:
for Samsung and Sony devices: I call the camera activity with a call to startActivityForResult. I only set the constant CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE. I do NOT set any other intent extras.
for all other devices: I call the camera activity with startActivityForResult as before. This time, however, I additionally set the intent extra MediaStore.EXTRA_OUTPUT and provide a Uri where I want the image to be stored.
In both cases I remember the time the camera activity was started.
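A condensed sketch of that launch logic (cameraPicUri and the request code are the post's names, cameraStartTime is illustrative; this mirrors the snippet quoted further down):

Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
String manufacturer = Build.MANUFACTURER.toLowerCase(Locale.ENGLISH);
if (!manufacturer.contains("samsung") && !manufacturer.contains("sony")) {
    // Pre-register the image in the MediaStore and tell the camera app
    // where to write the full-size picture.
    ContentValues values = new ContentValues();
    values.put(MediaStore.Images.Media.TITLE, System.currentTimeMillis() + ".jpg");
    cameraPicUri = getContentResolver()
            .insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
}
cameraStartTime = System.currentTimeMillis(); // remember the start time
startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);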
On camera activity result:
MediaStore: First, I try to read the photo being captured from the MediaStore. Using a managedQuery on the MediaStore content, I retrieve the latest image taken, as well as its orientation property and its timestamp. If I find an image and it was not taken before the camera intent was called, it is the image I was looking for (see the sketch after this list). Otherwise, I dismiss the result and try one of the following approaches.
Intent extra: Second, I try to get an image Uri from intent.getData() of the returning intent. If this is not successful either, I continue with step 3.
Default photo Uri: If all of the above mentioned steps did not work, I use the image Uri I passed to the camera activity.
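A hedged sketch of the MediaStore step, assuming cameraStartTime was recorded when the intent was launched (column indices follow the projection order; rotateXDegrees is the post's variable, photoUri is illustrative):

String[] projection = {
        MediaStore.Images.ImageColumns.DATA,
        MediaStore.Images.ImageColumns.ORIENTATION,
        MediaStore.Images.ImageColumns.DATE_TAKEN };
Cursor cursor = getContentResolver().query(
        MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
        projection, null, null,
        MediaStore.Images.ImageColumns.DATE_TAKEN + " DESC");
if (cursor != null) {
    if (cursor.moveToFirst() && cursor.getLong(2) >= cameraStartTime) {
        // The newest image was taken after the intent fired: this is ours
        photoUri = Uri.fromFile(new File(cursor.getString(0)));
        rotateXDegrees = cursor.getInt(1); // orientation in degrees
    }
    cursor.close();
}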
At this point, I have retrieved the photo Uri and its orientation, which I pass to my UploadPhotoActivity.
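The hand-off could look like this (the extra keys are illustrative, not the post's actual names):

Intent upload = new Intent(this, UploadPhotoActivity.class);
upload.putExtra("photoUri", photoUri.toString());
upload.putExtra("rotateXDegrees", rotateXDegrees);
startActivity(upload);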
Image processing
Please take a close look at my BitmapHelper class. It is based on the code described in detail in that tutorial.
Moreover, the shrinkBitmap method also rotates the image if required based on the orientation information extracted earlier.
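The rotation part could look roughly like this (the actual shrinkBitmap is on github and not shown here, so treat this as an assumption about its shape):

public static Bitmap rotateIfRequired(Bitmap source, int degrees) {
    if (degrees == 0) return source;
    Matrix matrix = new Matrix();
    matrix.postRotate(degrees);
    return Bitmap.createBitmap(source, 0, 0,
            source.getWidth(), source.getHeight(), matrix, true);
}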
I hope this is helpful to some of you.
I have tested this code with a Sony Xperia Go, Samsung Galaxy SII, Samsung Galaxy SIII mini and a Samsung Galaxy Y, and it worked on all devices!
But on the LG E400 (2.3.6) it didn't work: you get double pictures in the gallery. So I added manufacturer.contains("lge") to startCameraIntent() and it fixed the problem.
if (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))) {
    String filename = System.currentTimeMillis() + ".jpg";
    ContentValues values = new ContentValues();
    values.put(MediaStore.Images.Media.TITLE, filename);
    cameraPicUri = getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    intent.putExtra(MediaStore.EXTRA_OUTPUT, cameraPicUri);
}
On a Galaxy S3 with CM 10.1 I get a NullPointerException in BitmapHelper:
bm = BitmapFactory.decodeFileDescriptor(fileDescriptor.getFileDescriptor(), null, options);
subsequently my UploadPhotoActivity fails at:
try {
    photo = BitmapHelper.readBitmap(this, cameraPicUri);
    if (photo != null) {
        photo = BitmapHelper.shrinkBitmap(photo, 600, rotateXDegrees);
        thumbnail = BitmapHelper.shrinkBitmap(photo, 100);
        ImageView imageView = (ImageView) findViewById(R.id.sustainable_action_photo);
        imageView.setImageBitmap(photo);
    } else {
        Log.e(TAG, "IMAGE ERROR 1");
    }
} catch (Exception e) {
    Log.e(TAG, "IMAGE ERROR 2");
    e.printStackTrace();
}
at the second log (IMAGE ERROR 2).
After a couple of tries my camera broke and I got a "Could not connect to camera" error.
Tested it on a Nexus 7 and it works perfectly.
Edit: Narrowed it down to this:
fileDescriptor = context.getContentResolver().openAssetFileDescriptor(selectedImage, "r");
Although selectedImage contains this:
file:///storage/emulated/0/DCIM/Camera/IMG_20131023_183343.jpg
The openAssetFileDescriptor call throws a FileNotFoundException. I checked the file system and the image is not saved at this location. The cameraPicUri in TakePhotoActivity points to a non-existent image. I am currently checking where it all goes wrong.
Edit2: I figured out the error: since the device is a Samsung and tells the app that it is a Samsung device, your Samsung-specific fixes are applied. CyanogenMod does not need those fixes though, and in the end the code breaks. Once you remove

(manufacturer.contains("samsung")) &&

it works. Since this is a custom ROM, you could not have planned for that, of course. I am trying to figure out a way to detect whether the device is running CyanogenMod so this can be included in your code.
Thanks for a nice camera fix!
Edit3: I fixed it to run on CyanogenMod on the Galaxy S3 by changing your code to this:

if (getPackageManager().hasSystemFeature("com.cyanogenmod.android") || (!(manufacturer.contains("samsung")) && !(manufacturer.contains("sony")) && !(manufacturer.contains("lge"))))

Well, now it sometimes works, sometimes it does not. Strange.
I experience some problems when using this with a Sony Xperia Z5.
I added this and it got a lot better:

if (buildType.contains("sony") && buildDevice.contains("e5823")) {
    setPreDefinedCameraUri = true;
}

But 4 times out of 22 it restarted the camera, and once it restarted twice. I restarted the app for every test.
Is there some way to get around this, or do I accept this result?
The thing is that if the camera restarts, I can press the back button twice and boom, the image is there in my ImageView and saved.