Problem background
I am developing a VR project in Unreal Engine 4, and the project requires the use of Android's native camera. Since UE4 has no built-in functions to interact with Android's native methods, I customized this plugin to fit my needs.
The original plugin uses the JNI interface to interact with C++ code. It calls camera.open() and camera.startPreview() on UE4's EventBeginPlay, and calls camera.stopPreview() and camera.release() on UE4's EventEndPlay. Since it is a known issue that EventEndPlay never fires on the Android platform, I decided to manage the camera in the onResume() and onPause() methods instead. Here is the code:
<gameActivityClassAdditions>
<insert>
/* Irrelevant code goes here */
...
...
/* End of irrelevant code */
public void AndroidThunkJava_startCamera()
{
surfaceTexture = new SurfaceTexture(10);
surfaceTexture.setDefaultBufferSize(preferredWidth, preferredHeight);
if (camera == null){
try {
camera = Camera.open();
} catch (RuntimeException exc) {
return;
}
}
try {
camera.setPreviewTexture(surfaceTexture);
} catch (IOException t) {
return;
}
Parameters cameraParam = camera.getParameters();
cameraParam.setPreviewFormat(ImageFormat.NV21);
cameraParam.setPreviewSize(preferredWidth, preferredHeight);
cameraParam.setPreviewFpsRange(preferredFPS, preferredFPS);
cameraParam.setFocusMode(Camera.Parameters.FOCUS_MODE_MACRO);
if (cameraParam.isVideoStabilizationSupported()) {
cameraParam.setVideoStabilization(false);
}
if (cameraParam.isAutoWhiteBalanceLockSupported()) {
cameraParam.setAutoWhiteBalanceLock(false);
}
camera.setParameters(cameraParam);
camera.setPreviewCallback(new PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
int Height = camera.getParameters().getPreviewSize().height;
int Width = camera.getParameters().getPreviewSize().width;
// calling C++ function via JNI interface
processFrameData(Width, Height, data);
}
});
camera.startPreview();
}
public void AndroidThunkJava_stopCamera()
{
if (camera != null)
{
camera.stopPreview();
camera.release();
camera = null;
}
}
</insert>
</gameActivityClassAdditions>
<gameActivityOnPauseAdditions>
<insert>
AndroidThunkJava_stopCamera();
</insert>
</gameActivityOnPauseAdditions>
<gameActivityOnResumeAdditions>
<insert>
AndroidThunkJava_startCamera();
</insert>
</gameActivityOnResumeAdditions>
The problem
The camera works fine every second time. That means:
I open the app and the camera works. I push the home button (which triggers the onPause() method), then switch back to the app (which triggers onResume()) - the camera does not work. I push the home button again and switch back - the camera works. And so on: the camera only works every second time.
Does anybody have any idea about this issue? Is it connected to the fact that android.hardware.Camera is deprecated? I'm using API level 19, so it is not possible to use the newer android.hardware.camera2.
Here are my onStop and onResume methods. I'm not using onPause, and it works perfectly:
@Override
protected void onResume() {
super.onResume();
if (mCamera == null) {
restartPreview();
}
}
@Override
public void onStop() {
// stop the preview
if (mCamera != null) {
stopCameraPreview();
mCamera.release();
mCamera = null;
}
super.onStop();
}
private void restartPreview() {
if (mCamera != null) {
stopCameraPreview();
mCamera.release();
mCamera = null;
}
getCamera(mCameraID);
startCameraPreview();
}
private void startCameraPreview() {
try {
mCamera.setPreviewDisplay(mSurfaceHolder);
mCamera.startPreview();
setSafeToTakePhoto(true);
setCameraFocusReady(true);
} catch (IOException e) {
Log.d("st", "Can't start camera preview due to IOException " + e);
e.printStackTrace();
}
}
private void stopCameraPreview() {
setSafeToTakePhoto(false);
setCameraFocusReady(false);
// Nulls out callbacks, stops face detection
mCamera.stopPreview();
mPreviewView.setCamera(null);
}
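The getCamera() helper and the setSafeToTakePhoto()/setCameraFocusReady() flags are specific to my activity and not shown here. Roughly, getCamera(int) just opens the camera by its ID and handles the failure case, something like this (a simplified sketch, field names as above):
private void getCamera(int cameraID) {
    try {
        // Camera.open() can throw if the camera is in use by another process
        mCamera = Camera.open(cameraID);
    } catch (RuntimeException e) {
        Log.e("st", "Failed to open camera " + cameraID, e);
        mCamera = null;
    }
}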
Some of my implementation may not match yours, but I think it will help you.
Related
First, please excuse my bad English.
I have a problem with taking a photo programmatically. I wrote an app that takes a collection of photos based on a countdown timer; after that, the photos are processed using C++ code.
I'm using a dummy SurfaceView, because I don't need a preview in the UI. The code below works on my phone, an Xperia mini (API 15), so the permissions and code should be correct, but I borrowed a school Nexus 5 (API 21) and there is a problem with the preview:
takePicture: camera 0: Cannot take picture without preview enabled
I found a solution that uses setPreviewTexture (commented out below) instead of setPreviewDisplay. It works for the first photo, which is saved normally, but I get the same error after the second call to takePicture().
Thanks for any advice, LS
Camera camera;
@Override
protected void onResume() {
super.onResume();
// is camera on device?
if(!checkCameraHardware()) return;
releaseCamera();
try {
camera.stopPreview();
} catch (Exception e){
Log.d(TAG, "No preview before.");
}
SurfaceView dummy = new SurfaceView(this);
camera = Camera.open();
Camera.Parameters params = camera.getParameters();
params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
camera.setParameters(params);
try {
//camera.setPreviewTexture(new SurfaceTexture(10));
camera.setPreviewDisplay(dummy.getHolder());
} catch (IOException e) {
e.printStackTrace();
}
camera.startPreview();
}
SOLUTION:
I needed to refresh the preview. The code below works on both the Xperia and the Nexus.
The question remains why I have to use setPreviewTexture, because setPreviewDisplay always returns an error on the Nexus.
camera.takePicture(null, null, new PictureCallback() {
@Override
public void onPictureTaken(final byte[] data, Camera camera) {
// save picture
refreshPreview();
}
});
public void refreshPreview() {
try {
camera.stopPreview();
} catch (Exception e) {}
try {
camera.startPreview();
} catch (Exception e) {}
}
and in onResume():
try {
camera.setPreviewTexture(new SurfaceTexture(10));
} catch (IOException e) {}
Just add a callback after starting the preview on your camera instance. The thing is that after the preview starts, the camera needs some time before it is able to take a picture. Try this:
camera.startPreview();
camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
camera.takePicture(null, null, new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
// do something you want with your picture and stop preview
camera.stopPreview();
}
});
}
});
Once the picture is taken, refresh your SurfaceView, stop the preview, release the camera, and restart the process again.
try {
camera.takePicture(null, null, new PictureCallback() {
public void onPictureTaken(final byte[] data, Camera camera) {
// once your logic is done
refreshCamera();
}
});
} catch (Exception e2) {
// Toast.makeText(getApplicationContext(), "Picture not taken", Toast.LENGTH_SHORT).show();
e2.printStackTrace();
}
public void refreshCamera() {
if (dummy.getHolder().getSurface() == null) {
return;
}
try {
camera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
try {
camera.setPreviewDisplay(dummy.getHolder());
camera.startPreview();
} catch (Exception e) {
}
}
Hope this solution helps you.
I'm stuck on a problem. I'm trying to build a torch app. It works fine, but when I switch fragments or go to the home screen and come back, the flashlight won't work anymore. The error is "Failed to connect to camera service".
I think the problem is that I then create a new Camera instance, and the new one can't connect to the camera anymore. But how should I solve it?
public class FlashCameraManager {
private boolean isFlashOn;
private Camera camera;
public Camera.Parameters params;
// getting camera parameters
public void getCamera() {
if (camera == null) {
try {
camera = Camera.open();
params = camera.getParameters();
} catch (RuntimeException e) {
camera = null;
Log.e("Camera Error. Failed to Open. Error: ", e.getMessage());
}
} else {
camera.release();
camera = null;
}
}
public void FlashOnOff()
{
// Enable or disable the flash
if (isFlashOn)
{
//Turn Flash off
if (camera == null || params == null) {
return;
}
params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
camera.setParameters(params);
camera.stopPreview();
isFlashOn = false;
Log.d("FlashCameraManager", "Turning Flash off");
}
else
{
// Turn Flash on
if (camera == null || params == null) {
return;
}
params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);
camera.startPreview();
isFlashOn = true;
Log.d("FlashCameraManager", "Turning Flash on");
}
}
public boolean isFlashActive()
{
// Check whether the flash is on or off
return isFlashOn;
}}
This is from the MainActivity
final ImageButton flash = (ImageButton) rootView.findViewById(R.id.none_flash);
if(camera == null) {
camera = new FlashCameraManager();
}
camera.getCamera();
flash.setOnClickListener(new View.OnClickListener() {
public void onClick(View v) {
//Content
if (camera.isFlashActive())
{
//Turn Flash off
camera.FlashOnOff();
Log.d("NoneFragment", "Turning Flash off");
flash.setActivated(false);
}
else
{
//Turn Flash on
camera.FlashOnOff();
Log.d("NoneFragment", "Turning Flash on");
flash.setActivated(true);
}
}
});
After you are done with the camera (i.e. before exiting the application or launching another activity), make sure that you release the camera resources by calling the method release(), which, per the API Guide, "Disconnects and releases the Camera object resources". The API Guide also provides some valuable insight into properly using the class and performing simple operations, such as taking a picture. The API Guide may be found here:
http://developer.android.com/reference/android/hardware/Camera.html
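A minimal sketch of that pattern tied to the activity lifecycle (the mCamera field and the bare-bones error handling are my own simplification, not taken from your code):
@Override
protected void onResume() {
    super.onResume();
    if (mCamera == null) {
        try {
            mCamera = Camera.open();   // reacquire the camera when coming back to the foreground
        } catch (RuntimeException e) {
            mCamera = null;            // camera busy or unavailable
        }
    }
}
@Override
protected void onPause() {
    if (mCamera != null) {
        mCamera.stopPreview();         // stop any running preview first
        mCamera.release();             // "Disconnects and releases the Camera object resources"
        mCamera = null;
    }
    super.onPause();
}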
You might also want to consider taking a glance at the new camera API (android.hardware.camera2), as the current API that you are using is deprecated as of API level 21. The guide for the new API is found here:
http://developer.android.com/reference/android/hardware/camera2/package-summary.html
I am practicing building Android apps and figured a flashlight would be a great beginner project. After having my code blow up several times, I have the app stable to the point where it no longer force closes.
However, the LED camera flash doesn't turn on like I was hoping it should.
Any insight as to what I'm doing wrong would be most helpful.
public class PMATorch extends Activity {
private Camera camera;
private Button button;
private Camera.Parameters param;
private boolean torchStat = false;
public Camera getCameraInstance() {
Camera c = null;
try {
c = camera.open();
} catch (Exception e) {
}
return c;
}
private void torchOn(){
if (camera != null){
Parameters param = camera.getParameters();
param.setFlashMode(Parameters.FLASH_MODE_TORCH);
camera.setParameters(param);
camera.startPreview();
torchStat = true;
}
}
private void torchOff(){
if (camera != null){
Parameters param = camera.getParameters();
param.setFlashMode(Parameters.FLASH_MODE_OFF);
camera.setParameters(param);
camera.stopPreview();
torchStat = false;
}
}
@Override
protected void onDestroy() {
if (camera != null) {
camera.release();
camera = null;
}
super.onDestroy();
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_pmatorch);
camera = getCameraInstance();
button = (Button) findViewById(R.id.torchOnOff);
button.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
if (torchStat = false) {
torchOn();
} else {
torchOff();
}
}
});
}
}
Edit: I have the permissions and features set in AndroidManifest.xml.
Edit 2: Updated the code to what I just tried running.
private Camera camera;
is never assigned anything, so if (camera != null) in torchOn() won't do anything. You probably wanted to do something like:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
camera = getCameraInstance(); // <<
public Camera getCameraInstance() {
Camera c = null;
try {
c = camera.open();
} catch (Exception e) {
}
return c;
}
furthermore has two minor problems:
* catch (Exception e) {} hides anything that might go wrong here. I would add at least a logging statement like Log.e("PMATorch", "getCameraInstance", e).
* Cosmetic issue: camera.open() screams NullPointerException at first glance. Change it to Camera.open() - the method is static and belongs to the class.
E.g. (IMO it's nicer to read if you get rid of the local variable, so I removed that as well):
public Camera getCameraInstance() {
try {
return Camera.open();
} catch (Exception e) {
Log.e("PMATorch", "getCameraInstance", e);
return null;
}
}
To further help with debugging such a problem, add a Log statement at the place that actually performs the action that doesn't seem to work:
private void torchOn(){
if (camera != null){
Log.d("PMATorch", "now actually turning on");
...
You will find that in many cases, code you think is not working is actually never executed. When that happens, trace back the path that leads there, either with more logging or by using the debugger and stepping through the code.
Instead of using Parameters.FLASH_MODE_ON, try using Parameters.FLASH_MODE_TORCH in your torchOn() method.
According to the documentation on Camera Parameters
Parameters.FLASH_MODE_ON: Flash will always be fired during snapshot.
Parameters.FLASH_MODE_TORCH: Constant emission of light during preview, auto-focus and snapshot.
In my understanding, Parameters.FLASH_MODE_ON only fires the flash while a picture is being taken, whereas Parameters.FLASH_MODE_TORCH emits light constantly, so the latter fits your requirement of turning the light on when a button is pressed.
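A minimal toggle along those lines, reusing your camera and torchStat fields (this assumes the camera was opened successfully elsewhere):
private void setTorch(boolean on) {
    if (camera == null) {
        return;
    }
    Camera.Parameters p = camera.getParameters();
    // TORCH keeps the LED lit continuously; FLASH_MODE_ON only fires during a snapshot
    p.setFlashMode(on ? Camera.Parameters.FLASH_MODE_TORCH
                      : Camera.Parameters.FLASH_MODE_OFF);
    camera.setParameters(p);
    if (on) {
        camera.startPreview();   // many devices need an active preview for the torch to light up
    } else {
        camera.stopPreview();
    }
    torchStat = on;
}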
A nice tutorial on creating a flashlight application can be found here.
I have a simple app with an activity that is launched periodically by an alarm manager to display a preview frame and take a picture once the preview frame is built. After taking the picture, it is saved using an AsyncTask and the activity is destroyed using finish().
The code works perfectly fine when I have the screen turned on. However, it fails to take a picture with the screen off. I want to monitor a house and take pictures periodically using the app, so keeping the screen always on or turning it on manually is not a viable option.
Also, the code for the camera activity has been copied from the CommonsWare library and works perfectly well. I am only having a problem with taking a picture with the screen off. I can also see from the logs that the camera is opened by the activity. However, the Runnable that is supposed to take the picture when the preview frame is built doesn't run; the camera goes to the paused state instead and stays there.
I have the necessary permissions set up correctly, as I am able to get the images with the screen turned on. Maybe I am having trouble understanding the activity lifecycle when the screen is off and someone can shed light there.
I tried using wake locks to turn the screen on, but that didn't do any good.
Below is the code for the Activity.
Also, I'm sorry, but I removed the license comment to keep it short here.
package com.thopedia.snapper;
/* Copyright (c) 2008-2012 CommonsWare, LLC */
// imports omitted for brevity
public class CameraActivity1 extends Activity {
private PreviewFrameLayout frame=null;
private SurfaceView preview=null;
private SurfaceHolder previewHolder=null;
private Camera camera=null;
private boolean inPreview=false;
private boolean cameraConfigured=false;
private PowerManager.WakeLock wakeLock;
private PowerManager powerManager;
@SuppressWarnings("deprecation")
@Override
public void onCreate(Bundle savedInstanceState) {
/* powerManager = (PowerManager) getSystemService(POWER_SERVICE);
wakeLock = powerManager.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, getClass()
.getName());*/
Log.v(GlobalVariables.TAG,"CameraActivity On create called");
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
frame=(PreviewFrameLayout)findViewById(R.id.frame);
preview=(SurfaceView)findViewById(R.id.preview);
previewHolder=preview.getHolder();
previewHolder.addCallback(surfaceCallback);
previewHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
}
@TargetApi(Build.VERSION_CODES.GINGERBREAD)
@Override
public void onResume() {
// wakeLock.acquire();
Log.v(GlobalVariables.TAG,"camera activity onResume called");
super.onResume();
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.GINGERBREAD) {
Camera.CameraInfo info=new Camera.CameraInfo();
for (int i=0; i < Camera.getNumberOfCameras(); i++) {
Camera.getCameraInfo(i, info);
if (info.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
try{
camera=Camera.open(i);
}catch (Exception e){
Log.v(GlobalVariables.TAG,"Camera Opening Exception");
if(!isFinishing()) {
finish();
}}}}}
if (camera == null) {
try{
camera=Camera.open();
}catch (Exception e){
if(!isFinishing()) {
finish();
}
Log.v(GlobalVariables.TAG,"Camera opening exception");
}
}
startPreview();
preview.post(new Runnable() {
@Override
public void run() {
if (inPreview) {
camera.takePicture(null, null, photoCallback);
inPreview=false;
}
}
});
}
@Override
public void onPause() {
super.onPause();
Log.v(GlobalVariables.TAG,"Camera activity onPause called");
if (inPreview) {
if(camera!=null) {
camera.stopPreview();
}
}
if(camera!=null) {
camera.release();
camera = null;
}
inPreview=false;
}
@Override
protected void onDestroy() {
Log.v(GlobalVariables.TAG,"Camera activity onDestroy called!");
super.onDestroy();
if(camera!=null){
camera.release();
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
new MenuInflater(this).inflate(R.menu.options, menu);
return(super.onCreateOptionsMenu(menu));
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
if (item.getItemId() == R.id.camera) {
if (inPreview) {
camera.takePicture(null, null, photoCallback);
inPreview=false;
}
}
return(super.onOptionsItemSelected(item));
}
private Camera.Size getBestPreviewSize(int width, int height,
Camera.Parameters parameters) {
Camera.Size result=null;
for (Camera.Size size : parameters.getSupportedPreviewSizes()) {
if (size.width <= width && size.height <= height) {
if (result == null) {
result=size;
}
else {
int resultArea=result.width * result.height;
int newArea=size.width * size.height;
if (newArea > resultArea) {
result=size;
}
}
}
}
return(result);
}
private Camera.Size getSmallestPictureSize(Camera.Parameters parameters) {
Camera.Size result=null;
for (Camera.Size size : parameters.getSupportedPictureSizes()) {
if (result == null) {
result=size;
}
else {
int resultArea=result.width * result.height;
int newArea=size.width * size.height;
if (newArea < resultArea) {
result=size;
}
}
}
return(result);
}
private void initPreview(int width, int height) {
if (camera != null && previewHolder.getSurface() != null) {
try {
camera.setPreviewDisplay(previewHolder);
}
catch (Throwable t) {
Log.e("PreviewDemo-surfaceCallback",
"Exception in setPreviewDisplay()", t);
Toast.makeText(CameraActivity1.this, t.getMessage(),
Toast.LENGTH_LONG).show();
}
if (!cameraConfigured) {
Camera.Parameters parameters=camera.getParameters();
Camera.Size size=getBestPreviewSize(width, height, parameters);
Camera.Size pictureSize=getSmallestPictureSize(parameters);
if (size != null && pictureSize != null) {
parameters.setPreviewSize(size.width, size.height);
parameters.setPictureSize(pictureSize.width,
pictureSize.height);
parameters.setPictureFormat(ImageFormat.JPEG);
frame.setAspectRatio((double)size.width / size.height);
camera.setParameters(parameters);
cameraConfigured=true;
}
}
}
}
private void startPreview() {
if (cameraConfigured && camera != null) {
camera.startPreview();
inPreview=true;
}
}
SurfaceHolder.Callback surfaceCallback=new SurfaceHolder.Callback() {
public void surfaceCreated(SurfaceHolder holder) {
// no-op -- wait until surfaceChanged()
}
public void surfaceChanged(SurfaceHolder holder, int format,
int width, int height) {
initPreview(width, height);
startPreview();
}
public void surfaceDestroyed(SurfaceHolder holder) {
// no-op
}
};
Camera.PictureCallback photoCallback=new Camera.PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
new SavePhotoTask().execute(data);
camera.startPreview();
inPreview=true;
if(!isFinishing()) {
finish();
}
}
};
}
I use the following piece of code to take a picture after the preview surface is properly created in onResume().
preview.post(new Runnable() {
@Override
public void run() {
if (inPreview) {
camera.takePicture(null, null, photoCallback);
inPreview=false;
}
}
});
Any help is appreciated. Thanks!
I think you can use a WakeLock to make sure that the screen does not stay off. Below is sample code showing how you can turn the screen back on whenever it goes off. Hope this helps!
Register a broadcast receiver on Intent.ACTION_SCREEN_OFF.
Whenever you receive the screen-off broadcast intent, wake the device up using the code below.
PowerManager pm = (PowerManager) context
.getSystemService(Context.POWER_SERVICE);
WakeLock wakeLock = pm.newWakeLock(PowerManager.FULL_WAKE_LOCK
| PowerManager.ACQUIRE_CAUSES_WAKEUP
| PowerManager.ON_AFTER_RELEASE, "MyWakeLock");
wakeLock.acquire();
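And a rough sketch of registering that receiver (ACTION_SCREEN_OFF can only be received by a receiver registered in code, not one declared in the manifest):
BroadcastReceiver screenOffReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_SCREEN_OFF.equals(intent.getAction())) {
            // wake the device back up using the WakeLock code above
        }
    }
};
registerReceiver(screenOffReceiver, new IntentFilter(Intent.ACTION_SCREEN_OFF));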
android:keepScreenOn="true"
You can add the above line to the parent layout of the XML that your activity is using. It will keep your screen on at all times, so you will not run into this issue. Hope it matches your requirement.
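The programmatic equivalent, if you would rather set it from the activity than in the layout XML, is:
// keep the screen on while this activity is in the foreground
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);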
I figured out what the problem was after making extensive use of LogCat :).
It seems that when the screen is kept on, onPause() is not called instantly, which is the case with SCREEN_OFF. When the screen is ON, the Runnable is executed before the onPause() method executes, and as such the pictures are taken perfectly fine. However, when the screen is OFF, the Runnable is executed after the Activity has completed the onPause() method. By this time we have already released the camera in onPause(), and so we don't get a picture.
It started working after I figured out the flow and moved the camera release to onDestroy(), which might not be ideal for all situations but works just fine for mine, because the only purpose of my Activity is to take a picture and then destroy itself.
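In code the change boils down to roughly this (just a sketch of the relevant parts):
@Override
public void onPause() {
    super.onPause();
    if (inPreview && camera != null) {
        camera.stopPreview();    // stop the preview, but do NOT release the camera here
    }
    inPreview = false;
}
@Override
protected void onDestroy() {
    if (camera != null) {
        camera.release();        // release only when the activity is really going away
        camera = null;
    }
    super.onDestroy();
}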
Also, the wake locks didn't change the behaviour of the code. I would expect the Activity not to execute without the WAKE_LOCK, but it's working perfectly fine.
Hope this helps someone stuck in a similar situation.
I want to detect the number of faces in the front camera frame. I can detect faces once I have the image, using this: http://www.developer.com/ws/android/programming/face-detection-with-android-apis.html.
But I don't know how to capture an image using the front camera every 30 seconds without any user interaction. Can someone please help me?
The following code will capture a photo from your camera every 5 seconds.
if (TIMER_STARTED) {
multishotTimer.cancel();
multishotTimer.purge();
TIMER_STARTED = false;
} else {
multishotTimer = new Timer();
multishotTimer.schedule(new TimerTask() {
@Override
public void run() {
TIMER_STARTED = true;
Camera camera = surfaceView.getCamera();
camera.takePicture(null, null,
new HandlePictureStorage());
}
}, 1000, 5000L);
}
Here, TIMER_STARTED is a boolean that indicates whether the timer is running or not.
The following is the code for HandlePictureStorage:
private class HandlePictureStorage implements PictureCallback {
@Override
public void onPictureTaken(byte[] picture, final Camera camera) {
//do something when picture is captured...
}
}
You can manually create a SurfaceView and preview the camera on it as follows:
//First get a reference to the SurfaceView displaying the camera preview
cameraSurface = (SurfaceView) findViewById(R.id.cameraSurface);
cameraSurfaceHolder = cameraSurface.getHolder();
cameraSurfaceHolder.addCallback(cameraSurfaceHolderCallbacks);
.
.
.
private SurfaceHolder.Callback cameraSurfaceHolderCallbacks = new SurfaceHolder.Callback() {
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if(mCamera == null)return;
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
mCamera = Camera.open();
mCamera.setPreviewDisplay(holder);
} catch (Exception exception) {
if(mCamera == null)return;
mCamera.release();
mCamera = null;
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Parameters cameraParameters = mCamera.getParameters();
cameraParameters.setPreviewSize(320, 240);
cameraParameters.setPictureSize(320, 240);
int avrgExposure = (cameraParameters.getMinExposureCompensation() + cameraParameters.getMaxExposureCompensation())/2;
cameraParameters.setExposureCompensation(avrgExposure);
mCamera.setParameters(cameraParameters);
mCamera.startPreview();
mCamera.takePicture(null, null, mPictureCallback);
}
};
Do not forget to add the proper camera permission in your manifest:
<uses-permission android:name="android.permission.CAMERA"/>
And finally, if you are using an Android 4.0 device or above, you can use the method:
mCamera.startFaceDetection();
.
.
.
private FaceDetectionListener faceDetectionListener = new FaceDetectionListener() {
@Override
public void onFaceDetection(Face[] faces, Camera camera) {
//Faces have been detected...
}
};
.
.
.
mCamera.setFaceDetectionListener(faceDetectionListener);
You can go to this post, which explains everything related to that specific functionality and even provides functional Android source code that you can download to do it yourself.
Regards!
You should schedule an RTC_WAKEUP alarm using the AlarmManager every 30 seconds, set a PendingIntent on the alarm to launch a Service, and inside the Service access the Camera to capture the image.
You should probably look at this post: Open/Run camera from a background Service.
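A rough sketch of the scheduling part (CaptureService is a hypothetical Service of yours that would open the camera and take the picture):
AlarmManager alarmManager = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
Intent intent = new Intent(context, CaptureService.class);                     // hypothetical Service
PendingIntent pendingIntent = PendingIntent.getService(context, 0, intent, 0);
// RTC_WAKEUP wakes the device even if the screen is off; 30 * 1000 ms = 30 seconds
alarmManager.setRepeating(AlarmManager.RTC_WAKEUP,
        System.currentTimeMillis(), 30 * 1000, pendingIntent);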