Writing a camera-using app on Android [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist
Closed 9 years ago.
I'm new to Android programming but have tried several ways to use the camera.
I want a photo to be taken every 5 minutes and sent to my server, but each method I've tried ends up giving me the built-in camera app, which expects me to press the shutter. I need this automated.
The 'wrapper' in Cordova does this. The example on the Android developer pages does this, and as I work through the Android programming book I suspect its camera app example will do the same.

// Somewhere in your code, call this (you may need to set up a timer):
mCamera.takePicture(null, null, mCall);

// This should be done before using the camera:
Camera mCamera;

private boolean safeCameraOpen(int id) {
    boolean qOpened = false;
    try {
        releaseCamera();
        mCamera = Camera.open(id);
        qOpened = (mCamera != null);
    } catch (Exception e) {
        Log.e(getString(R.string.app_name), "failed to open Camera");
        e.printStackTrace();
    }
    return qOpened;
}

private void releaseCamera() {
    if (mCamera != null) {
        mCamera.release();
        mCamera = null;
    }
}

Camera.PictureCallback mCall = new Camera.PictureCallback() {
    // Adjust this method to your needs
    public void onPictureTaken(byte[] data, Camera camera) {
        // Write the JPEG data delivered by the camera straight to disk
        FileOutputStream outStream = null;
        try {
            outStream = new FileOutputStream("/sdcard/Image.jpg");
            outStream.write(data);
            outStream.close();
        } catch (FileNotFoundException e) {
            Log.d("CAMERA", e.getMessage());
        } catch (IOException e) {
            Log.d("CAMERA", e.getMessage());
        }
    }
};
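To get the every-five-minutes behaviour the question asks for, one option is a Handler-based repeating task. A minimal sketch, assuming safeCameraOpen() has already succeeded; the 5-minute interval comes from the question, the rest is illustrative:

private final Handler handler = new Handler();
private static final long CAPTURE_INTERVAL_MS = 5 * 60 * 1000; // 5 minutes

private final Runnable captureTask = new Runnable() {
    @Override
    public void run() {
        if (mCamera != null) {
            mCamera.startPreview();               // preview must be running before takePicture()
            mCamera.takePicture(null, null, mCall);
        }
        handler.postDelayed(this, CAPTURE_INTERVAL_MS); // schedule the next shot
    }
};

// Start the cycle once the camera is open, e.g.:
// handler.post(captureTask);

For capture while the device may sleep, AlarmManager with a wake lock would be more robust than a Handler; the upload to the server would then happen in onPictureTaken after the file is written.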

Related

Capture image in service without preview android

Problem statement:
When someone tries to unlock the device with the wrong pattern/PIN, my application should trigger an alarm and send an alert SMS to the registered mobile number. It should also capture an image of the person trying to unlock the device and send that image to the registered email ID.
What I have achieved:
I am notified of the wrong pattern/PIN in my DeviceAdmin class.
I start a service for the background tasks. This service plays the alarm successfully.
I send an alert SMS to the registered mobile number.
I send an alert email to the registered email ID successfully (but without the image).
I am confused about how to capture an image in a background IntentService when the device is locked, and without a preview.
I obviously cannot use the camera intent, because startActivityForResult can't be called from a Service. Plus, I don't want the user to have to capture the image after opening the camera app.
My research led me to these posts already.
Can I use Android Camera in service without preview?
How to Capture Image When Device is Locked
The issue is:
The Camera API is deprecated, and the Camera2 API requires a minimum SDK version of 21, but my client's requirement is minSdkVersion 15, which I can't change. I am unable to figure out what I should do now. Any reference or help, please?
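(As an aside: one common way to live with this constraint is to branch on the runtime API level, keeping the deprecated Camera path for old devices and Camera2 for new ones; deprecation is only a compile-time warning, so the old API still works on API 21+. A minimal sketch, where openLegacyCamera() and openCamera2() are hypothetical helpers:)

// Branch on the runtime API level; minSdkVersion can stay at 15
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
    openCamera2();        // android.hardware.camera2, available from API 21
} else {
    openLegacyCamera();   // deprecated android.hardware.Camera, still functional
}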
I solved my issue with the help of this blog. I capture the image from within the background service using this code:
@Override
public void onStart(Intent intent, int startId) {
    mCamera = getAvailableFrontCamera(); // globally declared instance of Camera
    if (mCamera == null) {
        mCamera = Camera.open(); // take the rear-facing camera only if no front camera is available
    }

    // A dummy preview target: the Camera API requires one, but an
    // off-screen SurfaceTexture keeps it invisible to the user
    SurfaceView sv = new SurfaceView(getApplicationContext());
    SurfaceTexture surfaceTexture = new SurfaceTexture(10);
    try {
        mCamera.setPreviewTexture(surfaceTexture);
        //mCamera.setPreviewDisplay(sv.getHolder());
        parameters = mCamera.getParameters();
        // set camera parameters here if needed
        mCamera.setParameters(parameters);

        // This boolean guards against simultaneous calls to takePicture(),
        // which crash the app while images are being written to file
        if (safeToCapture) {
            mCamera.startPreview();
            mCamera.takePicture(null, null, mCall);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }

    // Get a surface
    sHolder = sv.getHolder();
    // Tells Android that this surface will have its data constantly replaced
    sHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
Camera.PictureCallback mCall = new Camera.PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        safeToCapture = false;
        // Write the JPEG data obtained from the camera to file
        FileOutputStream outStream = null;
        try {
            // Create a File object for the parent directory
            File myDirectory = new File(Environment.getExternalStorageDirectory() + "/Test");
            // Have the object build the directory structure, if needed
            myDirectory.mkdirs();
            // SimpleDateFormat gives the current time for a unique image name
            SimpleDateFormat curTimeFormat = new SimpleDateFormat("ddMMyyyyhhmmss");
            String curTime = curTimeFormat.format(new java.util.Date());
            // Create a File object for the output file
            outStream = new FileOutputStream(myDirectory + "/user" + curTime + ".jpg");
            outStream.write(data);
            outStream.close();
            mCamera.release();
            mCamera = null;
            String strImagePath = Environment.getExternalStorageDirectory() + "/" + myDirectory.getName() + "/user" + curTime + ".jpg";
            sendEmailWithImage(strImagePath);
            Log.d("CAMERA", "picture clicked - " + strImagePath);
        } catch (FileNotFoundException e) {
            Log.d("CAMERA", e.getMessage());
        } catch (IOException e) {
            Log.d("CAMERA", e.getMessage());
        }
        safeToCapture = true; // safe to capture again once the file is saved
    }
};
private Camera getAvailableFrontCamera() {
    Camera cam = null;
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    int cameraCount = Camera.getNumberOfCameras();
    for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
        Camera.getCameraInfo(camIdx, cameraInfo);
        if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            try {
                cam = Camera.open(camIdx);
            } catch (RuntimeException e) {
                Log.e("CAMERA", "Camera failed to open: " + e.getLocalizedMessage());
            }
        }
    }
    return cam;
}
// Send email using the JavaMail API, as the user will not be around
// to choose an application from a chooser dialog for an intent.
public void sendEmailWithImage(String imageFile) {
    ...
}
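(The body of sendEmailWithImage is omitted above. For orientation only, a minimal JavaMail sketch might look like the following; the SMTP host, port, addresses, and credentials are placeholders, and it assumes the javax.mail / javax.mail.internet classes from the JavaMail for Android port are on the classpath. Sending from an IntentService already keeps the network work off the main thread.)

public void sendEmailWithImage(String imageFile) {
    // Placeholder SMTP settings; substitute your provider's values
    Properties props = new Properties();
    props.put("mail.smtp.host", "smtp.example.com");
    props.put("mail.smtp.port", "587");
    props.put("mail.smtp.auth", "true");
    props.put("mail.smtp.starttls.enable", "true");

    Session session = Session.getInstance(props, new Authenticator() {
        @Override
        protected PasswordAuthentication getPasswordAuthentication() {
            return new PasswordAuthentication("alerts@example.com", "password");
        }
    });

    try {
        MimeMessage message = new MimeMessage(session);
        message.setFrom(new InternetAddress("alerts@example.com"));
        message.addRecipient(Message.RecipientType.TO, new InternetAddress("owner@example.com"));
        message.setSubject("Security alert: wrong unlock attempt");

        MimeBodyPart body = new MimeBodyPart();
        body.setText("Photo of the unlock attempt is attached.");
        MimeBodyPart attachment = new MimeBodyPart();
        attachment.attachFile(imageFile); // attach the captured JPEG

        Multipart multipart = new MimeMultipart();
        multipart.addBodyPart(body);
        multipart.addBodyPart(attachment);
        message.setContent(multipart);

        Transport.send(message);
    } catch (MessagingException | IOException e) {
        e.printStackTrace();
    }
}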
The following uses-feature and uses-permission entries are needed in the manifest file:
<uses-feature
    android:name="android.hardware.camera"
    android:required="false" />
<uses-feature
    android:name="android.hardware.camera.front"
    android:required="false" />

<uses-permission android:name="android.permission.SEND_MAIL" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
I have set the required property to false so that users can install my app even on devices without any camera. Maybe this can help someone in the future.
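(A related caveat: because the features are declared with required="false", the app can end up on devices with no camera at all, so a runtime check is needed before calling Camera.open(). A minimal sketch; handleNoCamera() is a hypothetical fallback:)

PackageManager pm = getPackageManager();
boolean hasRearCamera = pm.hasSystemFeature(PackageManager.FEATURE_CAMERA);
boolean hasFrontCamera = pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_FRONT);
if (!hasRearCamera && !hasFrontCamera) {
    handleNoCamera(); // hypothetical: alarm + SMS + email without a photo
    return;
}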

My camera does not work on my new device (NullPointerException) [duplicate]

This question already has answers here:
What is a NullPointerException, and how do I fix it?
(12 answers)
Closed 5 years ago.
After I changed my device, my camera suddenly stopped working, throwing a
java.lang.NullPointerException: Attempt to invoke virtual method 'android.hardware.Camera$Parameters android.hardware.Camera.getParameters()' on a null object reference
I had tried changing my catch (RuntimeException ex) to finally,
but that instead gave me a
"Fail to connect to camera service" exception
public void surfaceCreated(SurfaceHolder surfaceHolder) {
    String PreviewFPS = setingPreferences.getString("previewfps", "");
    String Previewsize = setingPreferences.getString("screensize", "");
    String Displayor = setingPreferences.getString("orientation", "");
    String[] size = Previewsize.split(",");
    try {
        camera = Camera.open();
    } catch (RuntimeException ex) {
    }
    Camera.Parameters parameters = camera.getParameters();
    // modify the parameters
    parameters.setPreviewFrameRate(Integer.parseInt(PreviewFPS));
    parameters.setPreviewSize(Integer.parseInt(size[0]), Integer.parseInt(size[1]));
    camera.setParameters(parameters);
    parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    camera.setDisplayOrientation(Integer.parseInt(Displayor));
    try {
        camera.setPreviewDisplay(surfaceHolder);
        camera.startPreview();
    } catch (Exception ex) {
    }
}
Manifest:
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
The manufacturer of the device investigated the code and told me the reason was that it couldn't run the scanner and the camera at once:
Hi Kewin,
Would you please share the source code with me? We haven't tested whether it can run both the camera and the scanner.
We checked; the scanner and camera cannot be opened at the same time.
Best Regards,
So my fix was to run the camera and the scanner in two separate activities with separate layouts. Hopefully this helps someone else.
You need all the camera code inside the try, and you should read the stack traces, which means actually printing them:
try {
    camera = Camera.open();
    Camera.Parameters parameters = camera.getParameters();
    ...
} catch (RuntimeException ex) {
    ex.printStackTrace();
}
// camera remains null if an exception is caught...
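Equivalently, since Camera.open() returns null when the device has no back-facing camera (it does not always throw), a null check right after opening also avoids the NPE. A small sketch:

camera = Camera.open();   // may return null on devices without a back-facing camera
if (camera == null) {
    Log.e("CAMERA", "no back-facing camera available");
    return;               // bail out instead of hitting the NPE in getParameters()
}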

How to add Media Recorder to Android Google Play Service Vision Barcode scanner

This question was asked but never answered here -- but it is somewhat different from my need, anyway.
I want to record video, while running the Google Vision library in the background, so whenever my user holds up a barcode (or approaches one closely enough) the camera will automatically detect and scan the barcode -- and all the while it is recording the video. I know the Google Vision demo is pretty CPU intensive, but when I try a simpler version of it (i.e. without grabbing every frame all the time and handing it to the detector) I'm not getting reliable barcode reads.
(I am running a Samsung Galaxy S4 Mini on KitKat 4.4.3. Unfortunately, for reasons known only to Samsung, it no longer reports the OnCameraFocused event, so it is impossible to know whether the camera grabbed focus and to trigger the barcode read at that point. That makes grabbing and checking every frame seem like the only viable solution.)
So to at least prove the concept, I wanted to simply modify the Google Vision Demo. (Found Here)
It seems the easiest thing to do is simply to jump into the code and add a media recorder. I did this in the CameraSourcePreview method during surface creation.
Like this:
private class SurfaceCallback implements SurfaceHolder.Callback {
    @Override
    public void surfaceCreated(SurfaceHolder surface) {
        mSurfaceAvailable = true;
        try {
            startIfReady();
            if (mSurfaceAvailable) {
                Camera camera = mCameraSource.getCameraSourceCamera();

                /** ADD MediaRecorder to Google Example **/
                if (camera != null && recordThis) {
                    if (mMediaRecorder == null) {
                        mMediaRecorder = new MediaRecorder();
                        camera.unlock();

                        SurfaceHolder sh = mSurfaceView.getHolder();
                        mMediaRecorder.setPreviewDisplay(sh.getSurface());
                        mMediaRecorder.setCamera(camera);
                        mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
                        mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
                        mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));

                        String OutputFile = Environment.getExternalStorageDirectory() + "/" +
                                DateFormat.format("yyyy-MM-dd_kk-mm-ss", new Date().getTime()) + ".mp4";
                        File newOutPut = getVideoFile();
                        String newOutPutFileName = newOutPut.getPath();
                        mMediaRecorder.setOutputFile(newOutPutFileName);
                        Log.d("START MR", OutputFile);

                        try { mMediaRecorder.prepare(); } catch (Exception e) {}
                        mCameraSource.mediaRecorder = mMediaRecorder;
                        mMediaRecorder.start();
                    }
                }
            }
        } catch (SecurityException se) {
            Log.e(TAG, "Do not have permission to start the camera", se);
        } catch (IOException e) {
            Log.e(TAG, "Could not start camera source.", e);
        }
    }
That DOES record, while still handing each frame off to the Vision code. But, strangely, when I do that the camera does not seem to autofocus correctly, and the barcodes are not scanned -- since they are never really in focus, they are never recognized.
My next thought was to simply capture the frames as the barcode detector handles them, and save them to disk one by one (I can mux them together later).
I did this in CameraSource.java.
This does not seem to capture all of the frames, even though I am writing them out in a separate AsyncTask running in the background, which I thought would catch up eventually, even if it took a while. The saving is not optimized, but it looks as though frames are being dropped throughout, not just at the end.
To add this code, I tried putting it in
private class FrameProcessingRunnable
in the run() method. Right after the FrameBuilder code, I added this:
if (saveImagesIsEnabled) {
    if (data == null) {
        Log.d(TAG, "data == NULL");
    } else {
        SaveImageAsync saveImage = new SaveImageAsync(mCamera.getParameters().getPreviewSize());
        saveImage.execute(data.array());
    }
}
Which calls this class:
Camera.Size lastKnownPreviewSize = null;

public class SaveImageAsync extends AsyncTask<byte[], Void, Void> {
    Camera.Size previewSize;

    public SaveImageAsync(Camera.Size _previewSize) {
        previewSize = _previewSize;
        lastKnownPreviewSize = _previewSize;
    }

    @Override
    protected Void doInBackground(byte[]... dataArray) {
        try {
            if (previewSize == null) {
                if (lastKnownPreviewSize != null)
                    previewSize = lastKnownPreviewSize;
                else
                    return null;
            }
            byte[] bitmapData = dataArray[0];
            if (bitmapData == null) {
                Log.d("doInBackground", "NULL: ");
                return null;
            }

            // Where to put the output file (note: /sdcard requires WRITE_EXTERNAL_STORAGE permission)
            File storageDir = Environment.getExternalStorageDirectory();
            String imageFileName = baseFileName + "_" + Long.toString(sequentialCount++) + ".jpg";
            String filePath = storageDir + "/" + "tmp" + "/" + imageFileName;

            FileOutputStream out = null;
            YuvImage yuvimage = new YuvImage(bitmapData, ImageFormat.NV21, previewSize.width,
                    previewSize.height, null);
            try {
                out = new FileOutputStream(filePath);
                yuvimage.compressToJpeg(new Rect(0, 0, previewSize.width,
                        previewSize.height), 100, out);
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                try {
                    if (out != null) {
                        out.close();
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        } catch (Exception ex) {
            ex.printStackTrace();
            Log.d("doInBackground", ex.getMessage());
        }
        return null;
    }
}
I'm OK with either the MediaRecorder idea or the brute-force frame capture idea, but neither seems to be working correctly.
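(One detail that may explain the dropped frames: since API 11, AsyncTask.execute() queues all tasks onto a single serial background thread, so slow JPEG compression backs the queue up. A hedged one-line change is to run the saves on the shared thread pool instead:)

// Instead of saveImage.execute(data.array()), which serializes every save
// on one background thread, hand the task to the pool executor:
saveImage.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR, data.array());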

Stop recording audio when recording video in Google Glass

Has anyone disabled sound in either the Intent or MediaRecorder when recording video on Google Glass?
I have removed the permissions from AndroidManifest and I am not setting the audio source in MediaRecorder, but I still record audio.
I am using XE22.
private boolean prepareVideoRecorder() {
    if (mCamera != null) {
        mCamera.release(); // release the camera for other applications
    }
    mCamera = getCameraInstance();
    if (mCamera == null) return false;

    mrec = new MediaRecorder();
    mCamera.unlock();
    mrec.setCamera(mCamera);
    //mrec.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    CamcorderProfile profile = getValidCamcorderProfile();
    mrec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mrec.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    mrec.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);
    mrec.setVideoEncodingBitRate(profile.videoBitRate);
    mrec.setVideoEncoder(profile.videoCodec);
    mrec.setPreviewDisplay(mPreviewHolder.getSurface());

    mOutputFile = getOutputMediaFile(MEDIA_TYPE_VIDEO);
    mrec.setOutputFile(mOutputFile.toString());
    try {
        mrec.prepare();
    } catch (Exception e) {
        Log.e(TAG, e.getMessage());
        return false;
    }
    return true;
}

private CamcorderProfile getValidCamcorderProfile() {
    CamcorderProfile profile;
    if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_TIME_LAPSE_720P)) {
        profile = CamcorderProfile.get(CamcorderProfile.QUALITY_TIME_LAPSE_720P);
        return profile;
    }
    if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
        profile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
    } else {
        profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
    }
    return profile;
}
The code is taken from the book Beginning Google Glass Development.
Any ideas?
I had a solution that worked for me a few years ago, but my code now crashes due to several API updates. After hours of trial-and-error experiments, I got it working again.
Steps:
1) Remove anything about audio, such as setAudioSource and setAudioEncoder, from the code you posted above. It looks like you already did that, but verify it against the rest of your code.
2) Add a try/catch to setProfile and you are good to go.
For example:
Instead of doing mrec.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH)),
do:

try {
    mrec.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
} catch (Exception e) {
}
Furthermore, don't use CamcorderProfile profile = getValidCamcorderProfile();. Use mrec.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH)) instead, and that should work.
Also remove the following from your code:
mrec.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);, mrec.setVideoEncodingBitRate(profile.videoBitRate);, and mrec.setVideoEncoder(profile.videoCodec);.
Inside my prepareVideoRecorder() function, this is what my code looks like:
private boolean prepareVideoRecorder() {
    if (mCamera != null) {
        mCamera.release(); // release the camera for other applications
    }
    mCamera = getCameraInstance();
    if (mCamera == null) return false;

    mrec = new MediaRecorder();
    mCamera.unlock();
    mrec.setCamera(mCamera);
    //mrec.setAudioSource(MediaRecorder.AudioSource.CAMCORDER); // removed: no audio source
    mrec.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // Add try/catch to setProfile
    try {
        mrec.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    } catch (Exception e) {
    }
    mrec.setPreviewDisplay(mPreviewHolder.getSurface());
    mOutputFile = getOutputMediaFile(MEDIA_TYPE_VIDEO);
    mrec.setOutputFile(mOutputFile.toString());
    try {
        mrec.prepare();
    } catch (Exception e) {
        Log.e(TAG, e.getMessage());
        return false;
    }
    return true;
}
To see the recorded video on Glass from your computer, you have to restart Glass.
EDIT:
Get my whole project from here. It's from the book and was modified to record without audio.
https://github.com/InnoBlast/RecordingVideoWithoutSound

How to Detect and extract QR code from a preview frame using OpenCV in Android?

I am using the Android ZXing library to decode a QR code that has to be extracted in real time from a camera preview frame. The problem is that I have to use OpenCV to do the QR code detection, without asking the user to capture the image. Can anybody please tell me:
How to use frames from a camera?
How to use OpenCV to do QR detection on these frames, without capturing the image?
What algorithm to use for QR code detection?
Also, I would appreciate it if somebody could tell me which functions and libraries to use, as well as some sample code that may help me.
Update: This is what I am doing now:
Using a preview frame, decoding it to a byte array, and then passing it to RGBLuminanceSource:
public void surfaceCreated(SurfaceHolder holder) {
    // The Surface has been created; acquire the camera and tell it where to draw.
    camera = Camera.open();
    try {
        camera.setPreviewDisplay(holder);
        camera.setPreviewCallback(new PreviewCallback() {
            public void onPreviewFrame(byte[] data, Camera arg1) {
                boolean shouldCall = (System.currentTimeMillis() - lastTime) > 1000;
                if (shouldCall) {
                    lastTime = System.currentTimeMillis();
                    // slow work
                    Camera.Parameters parameters = camera.getParameters();
                    Size size = parameters.getPreviewSize();
                    Bitmap bMap1 = BitmapFactory.decodeByteArray(data, 0, data.length);
                    TextView textv = (TextView) findViewById(R.id.mytext);
                    LuminanceSource source = new RGBLuminanceSource(bMap1);
                    BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
                    Reader reader = new MultiFormatReader();
                    try {
                        Result result = reader.decode(bitmap);
                        text = result.getText();
                        byte[] rawBytes = result.getRawBytes();
                        if (rawBytes != null)
                            camera.stopPreview();
                        BarcodeFormat format = result.getBarcodeFormat();
                        ResultPoint[] points = result.getResultPoints();
                        ParsedResult result2 = parseResult(result);
                        textv.setText(text);
                    } catch (NotFoundException e) {
                        camera.startPreview();
                        e.printStackTrace();
                    } catch (ChecksumException e) {
                        text = "Checksum Error";
                        camera.stopPreview();
                        e.printStackTrace();
                    } catch (FormatException e) {
                        text = "Format Error";
                        camera.stopPreview();
                        e.printStackTrace();
                    }
                    lastTime = System.currentTimeMillis();
                }
            }
        });
    } catch (IOException e) {
        camera.startPreview();
    }
}
But this isn't working. Can anybody tell me what I am doing wrong? Also, I am using the same decoding instance in another code snippet, where I simply take a picture and decode it. There, every time the picture doesn't contain a QR code, the app crashes with a force close. What do I do about that? Somebody please help.
Not sure -- if you are using ZXing already, then you already have code that decodes QR codes from camera frames on Android, full stop. What else do you need -- why bother with OpenCV?
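(For what it's worth, the crash in the snippet above is consistent with BitmapFactory.decodeByteArray returning null: preview frames arrive as raw NV21 YUV, and decodeByteArray only understands compressed formats such as JPEG. ZXing's own camera path wraps the raw buffer in a PlanarYUVLuminanceSource instead of going through a Bitmap; a minimal sketch reusing the identifiers from the question's onPreviewFrame:)

// Inside onPreviewFrame(byte[] data, Camera camera):
Camera.Parameters parameters = camera.getParameters();
Camera.Size size = parameters.getPreviewSize();
LuminanceSource source = new PlanarYUVLuminanceSource(
        data, size.width, size.height,  // raw NV21 buffer and its dimensions
        0, 0, size.width, size.height,  // crop rectangle: use the full frame
        false);                         // no horizontal mirroring
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
try {
    Result result = new MultiFormatReader().decode(bitmap);
    // result.getText() holds the decoded payload
} catch (NotFoundException e) {
    // no barcode in this frame; keep previewing
}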
