I have written the Android service shown below for recording with the front camera in the background. It works very well, but now I would also like to take a picture every 5 seconds while recording. Is this somehow possible? When I try to open a second camera (in another service), I get an error.
public class RecorderService extends Service implements SurfaceHolder.Callback {
private WindowManager windowManager;
private SurfaceView surfaceView;
private Camera camera = null;
private MediaRecorder mediaRecorder = null;
@Override
public void onCreate() {
// Create new SurfaceView, set its size to 1x1, move it to the top left corner and set this service as a callback
windowManager = (WindowManager) this.getSystemService(Context.WINDOW_SERVICE);
surfaceView = new SurfaceView(this);
WindowManager.LayoutParams layoutParams = new WindowManager.LayoutParams(
1, 1,
WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
PixelFormat.TRANSLUCENT
);
layoutParams.gravity = Gravity.LEFT | Gravity.TOP;
windowManager.addView(surfaceView, layoutParams);
surfaceView.getHolder().addCallback(this);
}
@Override
public int onStartCommand(Intent intent, int flags, int startId) {
Intent notificationIntent = new Intent(this, MainActivity.class);
PendingIntent pendingIntent = PendingIntent.getActivity(this, 0,
notificationIntent, 0);
Notification notification = new NotificationCompat.Builder(this)
//.setSmallIcon(R.mipmap.app_icon)
.setContentTitle("Background Video Recorder")
.setContentText("")
.setContentIntent(pendingIntent).build();
startForeground(MainActivity.NOTIFICATION_ID_RECORDER_SERVICE, notification);
return Service.START_NOT_STICKY;
}
// Method called right after Surface created (initializing and starting MediaRecorder)
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
camera = Camera.open(1);
mediaRecorder = new MediaRecorder();
camera.unlock();
mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
mediaRecorder.setCamera(camera);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
FileUtil.createDir("/storage/emulated/0/Study/Camera");
mediaRecorder.setOutputFile("/storage/emulated/0/Study/Camera/" + Long.toString(System.currentTimeMillis()) + ".mp4");
try { mediaRecorder.prepare(); } catch (Exception e) { e.printStackTrace(); }
mediaRecorder.start();
try {
camera.setPreviewDisplay(surfaceHolder);
} catch (IOException e) {
e.printStackTrace();
}
Runnable runnable = new PictureThread(camera);
Thread thread = new Thread(runnable);
thread.start();
}
// Stop recording and remove SurfaceView
@Override
public void onDestroy() {
mediaRecorder.stop();
mediaRecorder.reset();
mediaRecorder.release();
camera.lock();
camera.release();
windowManager.removeView(surfaceView);
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {}
@Override
public IBinder onBind(Intent intent) { return null; }
}
Edit: I have now written a PictureThread class. This thread is started from RecorderService and tries to take a picture while the video is being recorded.
public class PictureThread implements Runnable {
private final static String TAG = PictureThread.class.getSimpleName();
private Camera camera;
PictureThread(Camera camera) {
this.camera = camera;
}
@Override
public void run() {
camera.startPreview();
camera.takePicture(shutterCallback, rawCallback, jpegCallback);
}
Camera.ShutterCallback shutterCallback = new Camera.ShutterCallback() {
public void onShutter() {
}
};
Camera.PictureCallback rawCallback = new Camera.PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
}
};
Camera.PictureCallback jpegCallback = new Camera.PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
Log.i(TAG, "onPictureTaken - jpeg");
}
};
}
Unfortunately, jpegCallback never gets called (i.e. the log message is never printed). When I open my tablet's camera app I can take pictures while recording video, so this should be possible.
I have also tried the Camera2 API example suggested by Alex Cohn (https://github.com/mobapptuts/android_camera2_api_video_app). Recording a video works and taking a picture also works, but when I try to take a picture while recording, no picture is created (and no error either). That said, I have not found this example app to work very reliably (perhaps there is a better example app).
Edit 2: The shutterCallback and rawCallback of takePicture do get called, but the data of the rawCallback is null, and the jpegCallback never gets called. Any idea why, and how this can be solved? I have also tried waiting in the thread for a while to give the callback time to fire, and I have tried making the callbacks static in my main activity (so they don't get garbage collected). Nothing worked.
Edit:
With the clarification:
The old camera API supports calling takePicture() while video is being recorded, if Camera.Parameters.isVideoSnapshotSupported reports true on the device in question.
Just hold on to the same camera instance you're passing into the MediaRecorder, and call Camera.takePicture() on it.
Camera2 also supports this with more flexibility, by creating a session with preview, recording, and JPEG outputs at the same time.
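For example, a minimal sketch of that approach, assuming camera is a final reference to the same instance passed to the MediaRecorder and recording has already started; the handler, callback, and 5-second interval below are illustrative, not part of the original code:
// Sketch: take a video snapshot every 5 seconds on the camera that is recording.
Camera.Parameters params = camera.getParameters();
if (params.isVideoSnapshotSupported()) {
    final Handler snapshotHandler = new Handler(Looper.getMainLooper());
    snapshotHandler.postDelayed(new Runnable() {
        @Override
        public void run() {
            camera.takePicture(null, null, new Camera.PictureCallback() {
                @Override
                public void onPictureTaken(byte[] data, Camera cam) {
                    // Write the JPEG bytes to a file; the preview (and the recording)
                    // keeps running during a video snapshot.
                }
            });
            snapshotHandler.postDelayed(this, 5000); // schedule the next snapshot
        }
    }, 5000);
}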
Original answer:
If you mean taking pictures with the back camera, while recording with the front camera - that's device-dependent. Some devices have enough hardware resources to run multiple cameras at once, but most won't (they share processing hardware between the two cameras).
The only way to tell if multiple cameras can be used at once is to try opening a second camera when one is already open. If it works, you should be good to go; if not, that device doesn't support multiple cameras at once.
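A rough probe for that check might look like this (the exact failure mode is device-dependent, so treat the error handling as an assumption):
// Try to open a second camera while the first one is already in use.
Camera second = null;
try {
    second = Camera.open(0); // e.g. the back camera while the front one records
    // If we get here, this device can drive both cameras at once.
} catch (RuntimeException e) {
    // Typically "Fail to connect to camera service": concurrent use unsupported.
} finally {
    if (second != null) {
        second.release();
    }
}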
No, you cannot open separate camera instances for video recording and still capture. The deprecated Camera API is not reliable for such tasks (see e.g. the question "Android camera parameter IsVideoSnapshotSupported incorrectly set to false" about the Samsung S4).
You can use the camera2 API (on devices that support this mode) to capture different formats and resolutions from the same camera instance. Here is a video tutorial: https://www.nigeapptuts.com/android-video-app-still-capture-recording/
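For illustration, here is a rough camera2 sketch (API 21+) of such a session with preview, recording and JPEG outputs at the same time. device, previewSurface, recorder and backgroundHandler are assumed to already exist, and the JPEG size is arbitrary; whether three concurrent outputs are accepted depends on the device's supported stream configurations.
// Sketch: one capture session driving preview + MediaRecorder + JPEG stills.
final ImageReader jpegReader = ImageReader.newInstance(1280, 720, ImageFormat.JPEG, 2);
jpegReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        // Save image.getPlanes()[0].getBuffer() to a file here.
        image.close();
    }
}, backgroundHandler);

List<Surface> outputs = Arrays.asList(
        previewSurface, recorder.getSurface(), jpegReader.getSurface());
device.createCaptureSession(outputs, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        try {
            // The repeating request drives the preview and the recording.
            CaptureRequest.Builder record =
                    device.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
            record.addTarget(previewSurface);
            record.addTarget(recorder.getSurface());
            session.setRepeatingRequest(record.build(), null, backgroundHandler);

            // Later, whenever a still is wanted while recording:
            CaptureRequest.Builder still =
                    device.createCaptureRequest(CameraDevice.TEMPLATE_VIDEO_SNAPSHOT);
            still.addTarget(jpegReader.getSurface());
            session.capture(still.build(), null, backgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) { }
}, backgroundHandler);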
My app contains three fragments. I need to keep the user experience fluid, so I'm using the setOffscreenPageLimit() method to keep them all alive in memory.
Problem: when I start a new activity (even an empty one), it loads slowly. The same happens when I finish it.
I know this is coming from my Camera Preview fragment because when I comment out the init of the camera, everything runs very smoothly.
Here is how I initialize my camera preview in the onResume() method:
mCamera = GetCameraInstance(currentCameraId);
//-- Set the SurfaceView
preview = (SurfaceView) view.findViewById(R.id.camera_preview);
mSurfaceHolder = preview.getHolder();
mSurfaceHolder.addCallback(this);
if (mCamera != null) {
mCamera.setPreviewDisplay(mSurfaceHolder);
mCamera.startPreview();
}
When the new activity is started, the surfaceDestroyed method is called, which destroys the camera preview. When the new activity is finished, the app recreates the camera preview again.
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
if (mCamera != null) {
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
}
The time spent in the stopPreview() and release() methods can be seen in Traceview; it takes around 700 ms to destroy the camera preview.
Screenshot of Traceview
Based on @CommonsWare's suggestion, here is how I did it.
First, I placed my mCamera variable in the Application class
public Camera mCamera;
Then, I placed the AsyncTask in my fragment
private class ControlCameraTask extends AsyncTask<Integer, Void, Void> {
protected Void doInBackground(Integer... urls) {
//--
if (app.mCamera != null) {
app.mCamera.stopPreview();
app.mCamera.release();
app.mCamera = null;
} else {
app.mCamera = GetCameraInstance(currentCameraId);
app.mCamera.setPreviewDisplay(app.mSurfaceHolder);
app.mCamera.startPreview();
}
return null;
}
}
Finally, I just call my AsyncTask in the onResume and onPause methods:
@Override
public void onResume() {
super.onResume();
new ControlCameraTask().execute(1);
}
@Override
public void onPause() {
super.onPause();
new ControlCameraTask().execute(1);
}
PS: I removed the try/catch blocks to keep the code easy to read.
I'm writing an app that consumes media (audio/video) and that allows users to reply and/or post new media.
My question relates to the SurfaceView used to display the videos. This SurfaceView object is shared between the MediaRecorder (recording a video) and the MediaPlayer (consuming/playing the video).
The MediaPlayer is located on its own Service, which runs on its own thread, as per the NPR example: PlaybackService.java
Since the NPR example doesn't involve video, I was not sure about how to make the MediaPlayer on the Service aware of the SurfaceView in the UI. I ended up using a static variable to solve this issue:
// MyFragmentClass.java
// CameraPreview is the sample class shown in the media section
// of the Android Developer site:
// http://developer.android.com/guide/topics/media/camera.html#camera-preview
private static CameraPreview sCameraPreview;
@Override
public void onStart() {
super.onStart();
CameraPreview cameraPreview = new CameraPreview(getActivity());
FrameLayout preview = (FrameLayout) getView().findViewById(R.id.video_preview);
preview.addView(cameraPreview);
setCameraPreview(cameraPreview); // a static setter.
// Other classes are initialized afterwards, not relevant to this question...
}
public static CameraPreview getCameraPreview() {
return sCameraPreview;
}
public static void setCameraPreview(CameraPreview cameraPreview) {
sCameraPreview = cameraPreview;
}
Here's the method on the PlaybackService that takes care of preparing a video for playback:
// Params url and isVideo were extras on an Intent that triggers
// this method
synchronized private void prepareMediaPlayer(String url, boolean isVideo) {
if (mMediaPlayer == null) {
mMediaPlayer = new MediaPlayer();
mMediaPlayer.setOnCompletionListener(this);
mMediaPlayer.setOnErrorListener(this);
mMediaPlayer.setOnInfoListener(this);
mMediaPlayer.setOnPreparedListener(this);
} else {
mMediaPlayer.reset();
}
mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mMediaPlayer.setDataSource(url);
if (isVideo) {
// this is the important line
mMediaPlayer.setDisplay(MyFragmentClass.getCameraPreview().getHolder());
mMediaPlayer.setScreenOnWhilePlaying(true);
}
mMediaPlayer.prepareAsync();
sendIntent(STATUS_PLAYBACK_PREPARED);
}
It gets the job done, but I'm wondering if there's a better way to do it, especially because I'm debugging a random bug where the shared surface is not released, and it got me thinking:
Is there a better way to make my Service class aware of the SurfaceView?
Is it a good approach to have a single SurfaceView for both Recording and Playback?
Thanks!
I'm trying to implement optical flow on an Android device.
My problem is getting two consecutive frames from the camera.
This is the code to get ONE frame:
mCamera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera camera) {
synchronized (SampleViewBase.this) {
mFrame2 = data;
SampleViewBase.this.notify();
}
}
});
Can't you then do something like:
private byte[] currFrame;
private byte[] prevFrame;
private void copyFrame(byte[] a){
if(a != null) prevFrame = a;
}
mCamera.setPreviewCallback(new PreviewCallback() {
public void onPreviewFrame(byte[] data, Camera camera) {
synchronized (SampleViewBase.this) {
copyFrame(currFrame);
currFrame = data;
SampleViewBase.this.notify();
}
}
});
I'm not sure that's proper Java syntax, but the idea is just to copy currFrame before assigning the new data to it. Anyway, I think you can also use the VideoCapture class to obtain the frames already in Mat format. I'm not sure whether this class is still available in the latest release, but from my experience with OpenCV 2.3 it was much faster to use it to grab camera frames than to use the Android camera.
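If you go the VideoCapture route, a sketch based on the old OpenCV 2.3/2.4 Android bindings could look roughly like this (the class lived in org.opencv.highgui and was removed from later OpenCV releases, so treat the exact names as assumptions for that version):
// Grab two consecutive grayscale frames as Mats for optical flow.
VideoCapture capture = new VideoCapture(Highgui.CV_CAP_ANDROID + 0); // 0 = back camera
Mat prev = new Mat();
Mat curr = new Mat();
if (capture.isOpened()) {
    capture.grab();
    capture.retrieve(prev, Highgui.CV_CAP_ANDROID_GREY_FRAME);
    capture.grab();
    capture.retrieve(curr, Highgui.CV_CAP_ANDROID_GREY_FRAME);
    // prev and curr now hold two consecutive frames.
}
capture.release();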
I am writing an Android 1.5 application which starts just after boot-up. It is a Service and should take a picture without a preview. This app will log the light density in some areas. I was able to take a picture, but the picture was black.
After researching for a long time, I came across a bug thread about it: if you don't generate a preview, the image will be black, since the Android camera needs a preview to set up exposure and focus. I've created a SurfaceView and the listener, but the onSurfaceCreated() event never gets fired.
I guess the reason is that the surface is not being created visually. I've also seen some examples of calling the camera statically with MediaStore.CAPTURE_OR_SOMETHING, which takes a picture and saves it in the desired folder with two lines of code, but that doesn't take a picture either.
Do I need to use IPC and bindService() to call this function? Or is there an alternative method to achieve this?
It is really weird that the camera on the Android platform can't stream video until it is given a valid preview surface. It seems the architects of the platform were not thinking about third-party video streaming applications at all. Even for the augmented reality case, the picture could be presented as some kind of visual substitute, not a real-time camera stream.
Anyway, you can simply resize the preview surface to 1x1 pixels and put it somewhere in a corner of the widget (visual element). Please pay attention: resize the preview surface, not the camera frame size.
Of course, such a trick does not eliminate the unwanted data streaming (for the preview), which consumes some system resources and battery.
I found the answer to this in the Android Camera Docs.
Note: It is possible to use MediaRecorder without creating a camera preview first and skip the first few steps of this process. However, since users typically prefer to see a preview before starting a recording, that process is not discussed here.
You can find the step-by-step instructions at the link above; after those instructions, the documentation includes the quote I have provided above.
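For reference, a minimal sketch of that preview-less MediaRecorder path, using the front camera and the 720p profile like the code in the first question; outputPath is a placeholder, and some devices may still refuse to record without a live preview surface:
// MediaRecorder set up without calling setPreviewDisplay() first.
Camera camera = Camera.open(1); // front camera
camera.unlock();

MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(camera);
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
recorder.setOutputFile(outputPath); // placeholder path
try {
    recorder.prepare();
} catch (IOException e) {
    e.printStackTrace();
}
recorder.start();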
Actually it is possible, but you have to fake the preview with a dummy SurfaceView
SurfaceView view = new SurfaceView(this);
c.setPreviewDisplay(view.getHolder());
c.startPreview();
c.takePicture(shutterCallback, rawPictureCallback, jpegPictureCallback);
Update 9/21/11: Apparently this does not work for every Android device.
Taking the Photo
Get this working first before trying to hide the preview.
Correctly set up the preview
Use a SurfaceView (pre-Android-4.0 compatibility) or SurfaceTexture (Android 4+, can be made transparent)
Set and initialise it before taking the photo
Wait for the SurfaceView's SurfaceHolder (via getHolder()) to report surfaceCreated() or the TextureView to report onSurfaceTextureAvailable to its SurfaceTextureListener before setting and initialising the preview.
Ensure the preview is visible:
Add it to the WindowManager
Ensure its layout size is at least 1x1 pixels (you might want to start by making it MATCH_PARENT x MATCH_PARENT for testing)
Ensure its visibility is View.VISIBLE (which seems to be the default if you don't specify it)
Ensure you use the FLAG_HARDWARE_ACCELERATED in the LayoutParams if it's a TextureView.
Use takePicture's JPEG callback since the documentation says the other callbacks aren't supported on all devices
Troubleshooting
If surfaceCreated/onSurfaceTextureAvailable doesn't get called, the SurfaceView/TextureView probably isn't being displayed.
If takePicture fails, first ensure the preview is working correctly. You can remove your takePicture call and let the preview run to see if it displays on the screen.
If the picture is darker than it should be, you might need to delay for about a second before calling takePicture so that the camera has time to adjust its exposure once the preview has started (see the sketch below).
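A hypothetical sketch of that delay, assuming camera and jpegCallback already exist:
// Give auto-exposure roughly a second to settle before capturing.
camera.startPreview();
new Handler(Looper.getMainLooper()).postDelayed(new Runnable() {
    @Override
    public void run() {
        camera.takePicture(null, null, jpegCallback);
    }
}, 1000);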
Hiding the Preview
Make the preview View 1x1 size to minimise its visibility (or try 8x16 for possibly more reliability)
new WindowManager.LayoutParams(1, 1, /*...*/)
Move the preview out of the centre to reduce its noticeability:
new WindowManager.LayoutParams(width, height,
Integer.MIN_VALUE, Integer.MIN_VALUE, /*...*/)
Make the preview transparent (only works for TextureView)
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
width, height, /*...*/
PixelFormat.TRANSPARENT);
params.alpha = 0;
Working Example (tested on Sony Xperia M, Android 4.3)
/** Takes a single photo on service start. */
public class PhotoTakingService extends Service {
@Override
public void onCreate() {
super.onCreate();
takePhoto(this);
}
@SuppressWarnings("deprecation")
private static void takePhoto(final Context context) {
final SurfaceView preview = new SurfaceView(context);
SurfaceHolder holder = preview.getHolder();
// deprecated setting, but required on Android versions prior to 3.0
holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
holder.addCallback(new Callback() {
@Override
//The preview must happen at or after this point or takePicture fails
public void surfaceCreated(SurfaceHolder holder) {
showMessage("Surface created");
Camera camera = null;
try {
camera = Camera.open();
showMessage("Opened camera");
try {
camera.setPreviewDisplay(holder);
} catch (IOException e) {
throw new RuntimeException(e);
}
camera.startPreview();
showMessage("Started preview");
camera.takePicture(null, null, new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
showMessage("Took picture");
camera.release();
}
});
} catch (Exception e) {
if (camera != null)
camera.release();
throw new RuntimeException(e);
}
}
@Override public void surfaceDestroyed(SurfaceHolder holder) {}
@Override public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}
});
WindowManager wm = (WindowManager)context
.getSystemService(Context.WINDOW_SERVICE);
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
1, 1, //Must be at least 1x1
WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
0,
//Don't know if this is a safe default
PixelFormat.UNKNOWN);
//Don't set the preview visibility to GONE or INVISIBLE
wm.addView(preview, params);
}
private static void showMessage(String message) {
Log.i("Camera", message);
}
@Override public IBinder onBind(Intent intent) { return null; }
}
On Android 4.0 and above (API level >= 14), you can use TextureView to preview the camera stream and make it invisible so as to not show it to the user. Here's how:
First create a class to implement a SurfaceTextureListener that will get the create/update callbacks for the preview surface. This class also takes a camera object as input, so that it can call the camera's startPreview function as soon as the surface is created:
public class CamPreview extends TextureView implements SurfaceTextureListener {
private Camera mCamera;
public CamPreview(Context context, Camera camera) {
super(context);
mCamera = camera;
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
setLayoutParams(new FrameLayout.LayoutParams(
previewSize.width, previewSize.height, Gravity.CENTER));
try{
mCamera.setPreviewTexture(surface);
} catch (IOException t) {}
mCamera.startPreview();
this.setVisibility(INVISIBLE); // Make the surface invisible as soon as it is created
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
// Put code here to handle texture size change if you want to
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
// Update your view here!
}
}
You'll also need to implement a callback class to process the preview data:
public class CamCallback implements Camera.PreviewCallback{
public void onPreviewFrame(byte[] data, Camera camera){
// Process the camera data here
}
}
Use the above CamPreview and CamCallback classes to set up the camera in your activity's onCreate() or a similar startup function:
// Setup the camera and the preview object
Camera mCamera = Camera.open(0);
CamPreview camPreview = new CamPreview(this, mCamera); // 'this' being your Activity/Context
camPreview.setSurfaceTextureListener(camPreview);
// Connect the preview object to a FrameLayout in your UI
// You'll have to create a FrameLayout object in your UI to place this preview in
FrameLayout preview = (FrameLayout) findViewById(R.id.cameraView);
preview.addView(camPreview);
// Attach a callback for preview
CamCallback camCallback = new CamCallback();
mCamera.setPreviewCallback(camCallback);
There is a way of doing this but it's somewhat tricky.
What you should do is attach a SurfaceView to the WindowManager from the service:
WindowManager wm = (WindowManager) mCtx.getSystemService(Context.WINDOW_SERVICE);
params = new WindowManager.LayoutParams(WindowManager.LayoutParams.WRAP_CONTENT,
WindowManager.LayoutParams.WRAP_CONTENT,
WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
PixelFormat.TRANSLUCENT);
wm.addView(surfaceview, params);
and then set
surfaceview.setZOrderOnTop(true);
mHolder.setFormat(PixelFormat.TRANSPARENT);
where mHolder is the holder you get from the SurfaceView.
This way you can play with the SurfaceView's alpha and make it completely transparent, but the camera will still get frames.
That's how I do it. Hope it helps :)
We solved this problem by using a dummy SurfaceView (not added to the actual GUI) on versions below 3.0 (or let's say 4.0, as a camera service on a tablet does not really make sense).
On versions >= 4.0 this worked in the emulator only ;(
Using a SurfaceTexture (with setPreviewTexture()) instead of a SurfaceView (with setPreviewDisplay()) worked here. At least it works on the Nexus S.
I think this really is a shortcoming of the Android framework.
In the "Working Example by Sam" (Thank you Sam... )
if at istruction "wm.addView(preview, params);"
obtain exception "Unable to add window android.view.ViewRoot -- permission denied for this window type"
resolve by using this permission in AndroidManifest:
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
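On Android 6.0 (API 23) and above, SYSTEM_ALERT_WINDOW additionally has to be granted by the user at runtime; a sketch of that check from an Activity (an assumption, depending on your target SDK):
// Ask the user to allow drawing over other apps before starting the service.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && !Settings.canDrawOverlays(this)) {
    Intent intent = new Intent(Settings.ACTION_MANAGE_OVERLAY_PERMISSION,
            Uri.parse("package:" + getPackageName()));
    startActivity(intent);
}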
You can try this working code. This service takes a picture with the front camera; if you want to capture a picture with the back camera instead, uncomment the back camera line in the code and comment out the front camera one.
Note: grant the Camera and Storage permissions to the app, and start the service from an Activity (or anywhere else).
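For example (illustrative only, called from an Activity; MyService is the service defined below):
// Request the runtime permissions (Android 6.0+) and then start the service.
String[] perms = { Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE };
if (ContextCompat.checkSelfPermission(this, perms[0]) == PackageManager.PERMISSION_GRANTED
        && ContextCompat.checkSelfPermission(this, perms[1]) == PackageManager.PERMISSION_GRANTED) {
    startService(new Intent(this, MyService.class));
} else {
    ActivityCompat.requestPermissions(this, perms, 1); // handle the result, then start
}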
public class MyService extends Service {
@Nullable
@Override
public IBinder onBind(Intent intent) {
return null;
}
@Override
public void onCreate() {
super.onCreate();
CapturePhoto();
}
private void CapturePhoto() {
Log.d("kkkk","Preparing to take photo");
Camera camera = null;
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
int frontCamera = 1;
//int backCamera=0;
Camera.getCameraInfo(frontCamera, cameraInfo);
try {
camera = Camera.open(frontCamera);
} catch (RuntimeException e) {
Log.d("kkkk","Camera not available: " + 1);
camera = null;
//e.printStackTrace();
}
try {
if (null == camera) {
Log.d("kkkk","Could not get camera instance");
} else {
Log.d("kkkk","Got the camera, creating the dummy surface texture");
try {
camera.setPreviewTexture(new SurfaceTexture(0));
camera.startPreview();
} catch (Exception e) {
Log.d("kkkk","Could not set the surface preview texture");
e.printStackTrace();
}
camera.takePicture(null, null, new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
File pictureFileDir = new File("/sdcard/CaptureByService");
if (!pictureFileDir.exists()) {
pictureFileDir.mkdirs();
}
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMddHHmmss");
String date = dateFormat.format(new Date());
String photoFile = "ServiceClickedPic_" + date + ".jpg";
String filename = pictureFileDir.getPath() + File.separator + photoFile;
File mainPicture = new File(filename);
try {
FileOutputStream fos = new FileOutputStream(mainPicture);
fos.write(data);
fos.close();
Log.d("kkkk","image saved");
} catch (Exception error) {
Log.d("kkkk","Image could not be saved");
}
camera.release();
}
});
}
} catch (Exception e) {
camera.release();
}
}
}