As you may notice, the camera in Android phones stops working when the app is minimized (for example, when we start another application). My question is: is there any way to create an app using the Android camera that keeps recording even when minimized, so that it could record video while we are doing something else on the phone? Or is that only possible if we build such a camera without using MediaStore? If you can share links or code that might help, I'll be grateful. Thanks in advance.
I believe the answer to this is that one must use
public final void setPreviewTexture (SurfaceTexture surfaceTexture)
Added in API level 11
Sets the SurfaceTexture to be used for live preview.
Either a surface or surface texture is necessary for preview,
and preview is necessary to take pictures.
from https://developer.android.com/reference/android/hardware/Camera.html . And
from https://developer.android.com/reference/android/graphics/SurfaceTexture.html :
The image stream may come from either camera preview or video decode.
A SurfaceTexture may be used in place of a SurfaceHolder when specifying the
output destination of a Camera or MediaPlayer object. Doing so will cause all the
frames from the image stream to be sent to the SurfaceTexture object rather than
to the device's display.
and I would really like to try this and send you some code, but I have no phone more recent than Gingerbread, and setPreviewTexture was introduced with Honeycomb.
Using a surface associated with an Activity, surfaceDestroyed is called sometime between onPause and onStop when the Activity is being minimised, although, oddly, not when the phone is being put to sleep: How SurfaceHolder callbacks are related to Activity lifecycle? But I hope that a SurfaceTexture is not destroyed in this way.
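To make the idea concrete, here is a minimal sketch of a recording Service built on this approach: the preview target is an off-screen SurfaceTexture owned by the Service rather than any Activity view, so minimizing the UI does not tear it down. The class name, output path, and profile choice are illustrative, and error handling is reduced to the bare minimum; this uses the (since deprecated) android.hardware.Camera API the answer refers to.

```java
import android.app.Service;
import android.content.Intent;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.IBinder;

public class BackgroundRecorderService extends Service {
    private Camera camera;
    private MediaRecorder recorder;
    private SurfaceTexture previewTexture;

    @Override
    public void onCreate() {
        super.onCreate();
        try {
            camera = Camera.open();
            // An off-screen texture: the GL texture name (0) is never
            // rendered, it only satisfies the "preview is necessary" rule.
            previewTexture = new SurfaceTexture(0);
            camera.setPreviewTexture(previewTexture);
            camera.startPreview();
            camera.unlock(); // hand the camera over to MediaRecorder

            recorder = new MediaRecorder();
            recorder.setCamera(camera);
            recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
            recorder.setOutputFile(getFilesDir() + "/capture.mp4"); // illustrative path
            recorder.prepare();
            recorder.start();
        } catch (Exception e) {
            stopSelf(); // camera busy, no permission, etc.
        }
    }

    @Override
    public void onDestroy() {
        if (recorder != null) { recorder.stop(); recorder.release(); }
        if (camera != null) { camera.lock(); camera.release(); }
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
```

Because nothing here is tied to an Activity's view hierarchy, the recording continues while the user is in another app (subject to the CAMERA and RECORD_AUDIO permissions).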
Related
I have a Service running in the background which is started by an Activity. The Service itself is a background camera recorder: it starts Camera2 and writes to a Surface provided by the MediaRecorder.
When the Activity is running, I also want a live stream alongside the background recording. So far, I have been creating a SurfaceView in the Activity and passing it as a target to Camera2 when the surface is created. But I have to reinitialize the Camera2 API each time the Surface gets destroyed (i.e. the Activity goes to the background). Is this the right approach to solve this problem? Is it possible for the Service to own the SurfaceView and pass a reference to the Surface back to the Activity, so that it can display the live feed without re-initializing the camera device?
Not directly; the SurfaceView has to live in the app process because it's closely tied to the app's UI state (and as you've noticed, it gets torn down every time the app goes in the background). So it can't live in the service.
However, you could have the Service receive camera frames and resend them to the app when it's in the foreground. For example, you could use an ImageReader to read YUV frames from the camera, and then write those (with an ImageWriter) to a Surface the app provides from its own ImageReader (or a SurfaceView, if you can do the YV12 bit described below). Then the camera doesn't need to be reconfigured when the app appears or disappears; the service can simply drop the frames while no one is listening.
That does require the app to draw its own YUV frames, which is annoying, since there's unfortunately no requirement that a TextureView or SurfaceView will accept YUV_420_888 buffers. If your camera device supports YV12 output, then that can be written directly to a SurfaceView or TextureView; otherwise, you'll probably need to implement some custom translation. Most efficiently that can be done in JNI code, where you can use the NDK ANativeWindow APIs to lock the Surface from a SurfaceView, set its format to YV12, and write the YUV_420_888 image data into it (translating pixel stride/row stride as needed).
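The forwarding part of that idea can be sketched roughly as follows. This is a hedged outline, not a complete implementation: the class name is hypothetical, the Binder plumbing between app and Service is omitted, and it assumes API 23+ (for ImageWriter) plus compatible formats between the two sides.

```java
import android.media.Image;
import android.media.ImageReader;
import android.media.ImageWriter;
import android.view.Surface;

public class FrameForwarder implements ImageReader.OnImageAvailableListener {
    private ImageWriter writer; // null while the app is in the background

    // Called (e.g. over a Binder interface, not shown) when the app comes
    // to the foreground and provides a destination Surface from its own
    // ImageReader.
    public synchronized void attach(Surface appSurface) {
        writer = ImageWriter.newInstance(appSurface, /* maxImages= */ 2);
    }

    // Called when the app goes to the background.
    public synchronized void detach() {
        if (writer != null) {
            writer.close();
            writer = null;
        }
    }

    // Listener on the Service-side ImageReader that the camera streams into.
    @Override
    public synchronized void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage();
        if (image == null) return;
        if (writer != null) {
            // Queue the camera frame into the app's Surface; the camera
            // session itself is never reconfigured. queueInputImage also
            // closes the image for us.
            writer.queueInputImage(image);
        } else {
            image.close(); // app is in the background: drop the frame
        }
    }
}
```

The key property is that the camera only ever sees the Service-owned ImageReader Surface; attaching and detaching the app's Surface touches nothing in the capture session.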
I am new to Android and I want to take pictures in the background without a surface view/preview. I have searched online but the methods don't seem to work for me. I want to use the latest Camera2 API.
Regards!
Muhammad Awais
Just create an ImageReader, and a camera capture session with that ImageReader's Surface. No need to have a SurfaceView or TextureView as well.
You'll need to stream some number of captures before starting to save any, though, to ensure that the auto-exposure/focus/etc routines of the camera have time to converge.
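A minimal sketch of that setup might look like the following. Class and variable names are illustrative, permission checks and error handling are omitted, and the resolution is an assumption; for simplicity it issues a single still capture rather than the recommended warm-up stream.

```java
import android.content.Context;
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.media.ImageReader;
import android.os.Handler;
import java.util.Arrays;

public class HeadlessCapture {
    public void start(Context ctx, final Handler handler) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) ctx.getSystemService(Context.CAMERA_SERVICE);
        String cameraId = manager.getCameraIdList()[0]; // first camera, for brevity
        final ImageReader reader =
                ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, 2);
        reader.setOnImageAvailableListener(r -> {
            // Save r.acquireLatestImage() to disk here, then close() it.
        }, handler);

        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override public void onOpened(final CameraDevice device) {
                try {
                    // The ImageReader's Surface is the ONLY output target:
                    // no SurfaceView or TextureView anywhere.
                    device.createCaptureSession(Arrays.asList(reader.getSurface()),
                        new CameraCaptureSession.StateCallback() {
                            @Override public void onConfigured(CameraCaptureSession s) {
                                try {
                                    CaptureRequest.Builder b = device.createCaptureRequest(
                                            CameraDevice.TEMPLATE_STILL_CAPTURE);
                                    b.addTarget(reader.getSurface());
                                    // In real code, run a repeating request first
                                    // so auto-exposure/focus can converge.
                                    s.capture(b.build(), null, handler);
                                } catch (CameraAccessException ignored) {}
                            }
                            @Override public void onConfigureFailed(CameraCaptureSession s) {}
                        }, handler);
                } catch (CameraAccessException ignored) {}
            }
            @Override public void onDisconnected(CameraDevice device) { device.close(); }
            @Override public void onError(CameraDevice device, int error) { device.close(); }
        }, handler);
    }
}
```

You still need the CAMERA permission, and on recent Android versions background camera access is restricted by the OS regardless of how the session is configured.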
I'm using libvlc in an Android app to play a network stream; the native code draws the image on a SurfaceView.
Let's assume the video stops and I have no access to the native code. How can I detect whether the SurfaceView is still updating (i.e. the video is not frozen)?
I tried getViewTreeObserver().addOnDrawListener() and getViewTreeObserver().addOnPreDrawListener(), but they do not have the effect I'm looking for.
Thanks
You can't get that information from the SurfaceView, because the SurfaceView itself does not know.
The SurfaceView's job is to set up the Surface, and create a hole in the View layout that you can see through. Once it has done these things, it is no longer actively involved in the process of displaying video. The content flows from the decoder to the Surface, which is managed by SurfaceFlinger (the system graphics compositor).
This is, for example, how DRM video works. The app "playing" the video has no access to DRM-protected video frames. (Neither does SurfaceFlinger, actually, but that's a longer story.)
The best way to know if content is still arriving is to ask the video source if it is still sending you content. Another approach would be to change your SurfaceView to a TextureView, and provide an onSurfaceTextureUpdated() callback method.
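The TextureView approach might be sketched like this: treat onSurfaceTextureUpdated() as a "new frame arrived" heartbeat and arm a watchdog timer behind it. The detector class and the 2-second timeout are both illustrative choices, not part of any API.

```java
import android.graphics.SurfaceTexture;
import android.os.Handler;
import android.os.Looper;
import android.view.TextureView;

public class FrozenVideoDetector implements TextureView.SurfaceTextureListener {
    private static final long FROZEN_AFTER_MS = 2000; // illustrative threshold
    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable onFrozen;
    private final Runnable timeout = new Runnable() {
        @Override public void run() { onFrozen.run(); }
    };

    public FrozenVideoDetector(Runnable onFrozen) {
        this.onFrozen = onFrozen;
    }

    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) {
        // Called each time a new frame is posted to the texture:
        // reset the watchdog. If no frame arrives for FROZEN_AFTER_MS,
        // the onFrozen callback fires.
        handler.removeCallbacks(timeout);
        handler.postDelayed(timeout, FROZEN_AFTER_MS);
    }

    @Override public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {}
    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        handler.removeCallbacks(timeout);
        return true;
    }
}
```

Attach it with textureView.setSurfaceTextureListener(new FrozenVideoDetector(...)). This works precisely because a TextureView, unlike a SurfaceView, sees every frame pass through it.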
I am not sure exactly what you are trying to achieve here, but you can see whether the surface view is being created or torn down by implementing the SurfaceHolder.Callback interface, which gives you access to the following methods:
surfaceCreated — called immediately after the surface is first created.
surfaceChanged — called immediately after any structural changes (format or size) have been made to the surface.
surfaceDestroyed — called immediately before the surface is destroyed.
Take a look at the documentation for SurfaceView, and for SurfaceHolder take a look at this link. Basically, in order to know whether the surface is still alive, you can track these callbacks.
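Hooking those callbacks up looks roughly like this (class name illustrative). Note that, as the other answer explains, these callbacks report surface lifecycle events, not individual frames, so they cannot tell you whether the video content is frozen.

```java
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class ObservedSurfaceView implements SurfaceHolder.Callback {
    public ObservedSurfaceView(SurfaceView view) {
        view.getHolder().addCallback(this);
    }

    @Override public void surfaceCreated(SurfaceHolder holder) {
        // The surface exists: safe to start drawing or attach a producer.
    }

    @Override public void surfaceChanged(SurfaceHolder holder,
                                         int format, int width, int height) {
        // Size or format changed: reconfigure the producer if needed.
    }

    @Override public void surfaceDestroyed(SurfaceHolder holder) {
        // The surface is going away: stop the producer before returning.
    }
}
```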
When I make a call to mCamera.takePicture(null, null, null, null); (for simplicity I have omitted the callbacks), the preview freezes and shows the scene that was just captured. Why is that the case? Can I somehow control this behaviour? And what actually happens: is there a new view that gets attached, or does my camera preview simply stop?
What does actually happen? Is there a new view that gets attached or does my camera preview simply stop?
No new view gets attached by default. The preview just stops. The documentation for Camera states this clearly:
Preview will be stopped after the image is taken; callers must call startPreview() again if they want to re-start preview or take more pictures.
It also goes on to say:
After calling this method, you must not call startPreview() or take another picture until the JPEG callback has returned.
So, the best place to call startPreview() again would be the JPEG callback. Any time before that, the camera hardware is still processing the previous image, and wouldn't be able to give you a preview. That's the main reason that it "freezes"; the camera hardware is just busy.
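That pattern, restarting the preview from inside the JPEG callback, can be sketched like this with the android.hardware.Camera API (the helper class is illustrative):

```java
import android.hardware.Camera;

public class CaptureHelper {
    public static void takePictureAndResume(Camera camera) {
        // Shutter and raw callbacks omitted (null), as in the question.
        camera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] jpegData, Camera cam) {
                // Persist jpegData here...

                // The hardware is done with the previous image once this
                // callback runs, so this is the earliest safe point to
                // restart the preview.
                cam.startPreview();
            }
        });
    }
}
```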
It's also a visual cue to the user that:
a picture was taken
the picture looks like "this"
That's icing on the cake, but even if you didn't care about that, it would still do it.
Can I somehow control this behaviour?
Through the publicly exposed API? Definitely not. You can restart the preview once the camera is done processing (as above), but you can't prevent it from freeze-framing when you call takePicture().
Whether it's possible by going further into the camera firmware, I can't really say. However, since there are roughly a bazillion different cameras used in Android devices, this would likely be an exercise in futility if you weren't working on one specific device.
Even with one specific device, I can't see how you'd overcome it altogether. At a bare minimum, the camera will be busy processing the image for some amount of time. Even high-end DSLR cameras that I've seen freeze the preview at least for the duration of the exposure.
After calling takePicture() you can hide the preview surface under another view (e.g. an ImageView). If you use OpenGL to render the preview texture instead of a SurfaceView, you have even more tricks up your sleeve.
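A rough sketch of that masking trick (names illustrative): cover the frozen preview with an overlay view the moment the capture starts, and reveal the live preview again once it has been restarted in the JPEG callback.

```java
import android.hardware.Camera;
import android.view.View;
import android.widget.ImageView;

public class FreezeMask {
    // `cover` is an ImageView layered on top of the preview surface in
    // the layout, e.g. showing the last preview frame or a solid color.
    public static void takePictureMasked(Camera camera, final ImageView cover) {
        cover.setVisibility(View.VISIBLE); // hide the frozen preview
        camera.takePicture(null, null, new Camera.PictureCallback() {
            @Override
            public void onPictureTaken(byte[] jpeg, Camera cam) {
                // Save jpeg here...
                cam.startPreview();          // preview is live again
                cover.setVisibility(View.GONE);
            }
        });
    }
}
```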
I was wondering whether it is possible to have two instances of the camera preview in Android, i.e. running two instances of the camera at the same time. If it is, how would one go about this? Would each instance need to run on a different thread? I have not used the camera API before, so I would appreciate a heads-up on the issue so I don't waste time on it.
Thank you.
It is not possible to have two open connections to the camera - you have to lock the camera in order to get a preview and it can only be locked once. Indeed, if you have the camera locked, and your app crashes before you've unlocked it, then nobody can use the camera!
See http://developer.android.com/reference/android/hardware/Camera.html#open%28int%29
You must call release() when you are done using the camera, otherwise it will remain locked and be unavailable to other applications.
...
RuntimeException: if connection to the camera service fails (for example, if the camera is in use by another process).
That said, you can certainly register a preview callback and take the preview data from your single camera instance to use in multiple views. But be aware of the issues with the YUV format of the raw byte[] data provided by the preview callback: Getting frames from Video Image in Android (note that the preview data is raw from the camera driver and may vary from device to device)
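The fan-out idea, one camera instance feeding several consumers from its preview callback, might be sketched like this (class and interface names are illustrative):

```java
import android.hardware.Camera;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class PreviewFanOut implements Camera.PreviewCallback {
    public interface FrameConsumer {
        void onFrame(byte[] nv21, int width, int height);
    }

    private final List<FrameConsumer> consumers = new CopyOnWriteArrayList<>();
    private final int width;
    private final int height;

    public PreviewFanOut(Camera camera, int width, int height) {
        this.width = width;
        this.height = height;
        camera.setPreviewCallback(this);
    }

    public void addConsumer(FrameConsumer c) {
        consumers.add(c);
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // The buffer is typically NV21; confirm with
        // camera.getParameters().getPreviewFormat() rather than assuming.
        for (FrameConsumer c : consumers) {
            c.onFrame(data, width, height);
        }
    }
}
```

Each consumer can then convert the YUV buffer and draw it into its own view; the single camera lock is never contended.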
Ignoring the big Why question, your best bet would be to make a service that interacts with the camera, and go from there.