I was wondering whether it is possible to have two instances of the camera preview in Android, that is, running two instances of the camera at the same time. If it is, how would one go about it? Would there be a need to implement each instance on a different thread? I have not used the camera API before, so I would appreciate a heads up on the issue so I don't waste time on it.
Thank you.
It is not possible to have two open connections to the camera - you have to lock the camera in order to get a preview and it can only be locked once. Indeed, if you have the camera locked, and your app crashes before you've unlocked it, then nobody can use the camera!
See http://developer.android.com/reference/android/hardware/Camera.html#open%28int%29
You must call release() when you are done using the camera, otherwise it will remain locked and be unavailable to other applications.
...
RuntimeException: if connection to the camera service fails (for example, if the camera is in use by another process).
That said, you can certainly register a preview callback and take the preview data from your single camera instance to use in multiple views. But be aware of the issues with the YUV format of the raw byte[] data provided by the preview callback: Getting frames from Video Image in Android (note that the preview data is raw from the camera driver and may vary from device to device)
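For illustration, here is a minimal sketch of that approach with the legacy android.hardware.Camera API. It assumes surfaceHolder comes from a SurfaceView you already have; the surrounding method names are just illustrative scaffolding, not a complete implementation.

private Camera camera;

private void startPreviewSharing(SurfaceHolder surfaceHolder) {
    try {
        camera = Camera.open();  // locks the camera; throws RuntimeException if it is already in use
        camera.setPreviewDisplay(surfaceHolder);
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                // 'data' is the raw YUV frame (typically NV21); convert it once
                // here and hand it to as many views/consumers as you like.
            }
        });
        camera.startPreview();
    } catch (IOException | RuntimeException e) {
        releaseCamera();  // never leave the camera locked after a failure
    }
}

private void releaseCamera() {
    if (camera != null) {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();  // unlock, or no other app (including your own) can open the camera
        camera = null;
    }
}

Call releaseCamera() from onPause() at the latest, for exactly the reason the quoted docs give.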
Ignoring the big Why question, your best bet would be to make a service that interacts with the camera, and go from there.
I want to configure both the front and back cameras with the Android camera2 API, to take pictures and videos from both cameras simultaneously. I have created two TextureViews. Whenever I open one camera (front or back) my code works fine, but whenever I try to open both cameras simultaneously, the code breaks on creating the session and I get a CameraAccessException: "configure stream: method not implemented".
I want to save the images captured by the front and back cameras as one image, and both videos as one video.
It would be very helpful if you could post some sample code or a sample link.
I am using a OnePlus 6. I recently downloaded an app, "Dual camera fron back Camera", and with it I am able to capture images from the front and back cameras at the same time. So for anyone who wants to suggest there is no hardware support: that may be valid for other phones, but in my case I think I am missing something in the code. So far, my Google searches suggest there is some problem with the session creation for the second camera. I debugged my code, and it fails during creation of the second camera's session, so if you have any idea about that, please share.
Thanks
Rakesh
The camera API is fine with it, but most Android devices do not have sufficient hardware resources to run both cameras at once, so you will generally get an error trying to open the second camera.
Both image sensors are typically connected to the same image signal processor (ISP), and that ISP can only operate one camera at a time. Some high-end devices have ISPs with multiple processing pipelines which can in theory run more than one camera at a time, but they often require using multiple pipelines to handle advanced functionality or very high resolutions for the main (back) camera.
So on those devices, multiple cameras may be usable at once, but not at maximum resolution, or only with other similar restrictions.
Some manufacturers include multi-camera features in their own camera app, since they know exactly what the limitations are and can write application code to work within them. They may not make multi-camera available to normal apps, due to concerns about performance, thermal limits, or just lack of time to verify more than the exact use case they implement in their own app.
The Android camera API does not currently have a way to query if multiple cameras can be used at once, or if they can be, what the restrictions are. So the only thing you can do is try, and handle the errors in case that isn't feasible.
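So, as a sketch only: open the second camera with camera2 and treat onError as the "not feasible on this device" signal. The names secondCameraId and handler are illustrative, and permission checks are omitted.

private void tryOpenSecondCamera(CameraManager manager, String secondCameraId, Handler handler)
        throws CameraAccessException {
    manager.openCamera(secondCameraId, new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            // Second camera opened. Note that session creation can still fail
            // afterwards (as in the question), so guard that with a try/catch too.
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            camera.close();
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            // ERROR_MAX_CAMERAS_IN_USE or ERROR_CAMERA_IN_USE here typically
            // means the device cannot run both cameras at once; fall back to
            // single-camera operation.
            camera.close();
        }
    }, handler);
}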
I'd like to be able to use the video stream from the camera (or even a few frames per second) while another app is also using the camera. Is this possible?
No. Access to a camera is exclusive.
See the Camera documentation, which says:
If your application does not properly release the camera, all subsequent attempts to access the camera, including those by your own application, will fail and may cause your or other applications to be shut down.
According to the Android dev guide, the answer is no. Usually when you start another camera app, you have to release the camera in onPause() of the previous camera app; otherwise an exception will be thrown. But why not try to use a different camera? If one app is using the back-facing camera, another one can use the front-facing camera. On Samsung's Galaxy S4 there is a camera mode called dual-shot, so running different cameras simultaneously seems to be no problem.
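For reference, the usual release/reacquire pattern with the old Camera API looks roughly like this (a sketch, assuming a Camera field in your Activity):

@Override
protected void onResume() {
    super.onResume();
    try {
        camera = Camera.open();  // reacquire the lock when the app comes back
    } catch (RuntimeException e) {
        // Another app still holds the camera.
    }
}

@Override
protected void onPause() {
    if (camera != null) {
        camera.release();  // let other apps (or the next camera app) use it
        camera = null;
    }
    super.onPause();
}

To use the front-facing camera instead, iterate over Camera.getNumberOfCameras(), check Camera.CameraInfo.facing, and pass the matching id to Camera.open(int).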
I wrote a small game that has a front-facing camera preview implemented. I want to be able to record the entire screen to an MP4. How would I do that? Does anyone know a good tutorial for recording the entire screen to MP4 in code (so not just screenshots; I want to let the user make a recording while playing the game)?
Indeed, as @Michael stated, directly accessing the screenshot and framebuffer mechanisms is not possible. That doesn't mean you're completely out of luck, though, if you want to record gaming sessions. It's just more work.
If you generate a perfect record of the various actions in your game, save that record. You can then build a service (web service, local, whatever) that accepts the record as an input. In effect, it would replay the game. You'd then reconstruct the game in such a manner that you could generate a video.
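As a rough illustration (entirely hypothetical names, and your real record would carry whatever your game's state needs), the "perfect record" could be as simple as a timestamped action log:

class GameRecorder {
    static final class Event {
        final long timeMs;    // milliseconds since the session started
        final String action;  // e.g. "TAP 120,340" -- whatever drives your game
        Event(long timeMs, String action) { this.timeMs = timeMs; this.action = action; }
    }

    private final long startMs = System.currentTimeMillis();
    private final java.util.List<Event> events = new java.util.ArrayList<>();

    void record(String action) {
        events.add(new Event(System.currentTimeMillis() - startMs, action));
    }

    java.util.List<Event> sessionRecord() {
        return events;  // serialize this and feed it to the replay/rendering service
    }
}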
You could also replay in the Android emulator and create a video from that. Not as easily deployable, but it depends on what you want to do with it.
Apps aren't allowed to grab the screen contents (see this discussion).
[L]etting applications grab the screen's content liberally is a serious security risk, which is why the platform prevents it (taking a snapshot requires either direct physical manipulation from the user, or using a debugging tool).
And as @CommonsWare mentioned in his comment, you can't use the getDrawingCache() method either to grab a Bitmap of the View if you're using a SurfaceView or one of its children (e.g. VideoView). More details about that here.
As you may have noticed, the camera in Android phones stops working when we minimize the camera app (for example, when we start a new application). My question is: is there any way to create an app using the Android camera which keeps recording even if we minimize it, so it could be recording video while we are doing something different on our phone? Or maybe it is only possible if we create such a camera app without using MediaStore? If you share some links or code which might help me, I'll be grateful. Thanks in advance.
I believe the answer to this is that one must use
public final void setPreviewTexture (SurfaceTexture surfaceTexture)
Added in API level 11
Sets the SurfaceTexture to be used for live preview. Either a surface or surface texture is necessary for preview, and preview is necessary to take pictures.
from https://developer.android.com/reference/android/hardware/Camera.html . And
from https://developer.android.com/reference/android/graphics/SurfaceTexture.html :
The image stream may come from either camera preview or video decode. A SurfaceTexture may be used in place of a SurfaceHolder when specifying the output destination of a Camera or MediaPlayer object. Doing so will cause all the frames from the image stream to be sent to the SurfaceTexture object rather than to the device's display.
and I would really like to try this and send you some code, but I have no phone more recent than Gingerbread, and this was introduced with Honeycomb.
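In that spirit, here is what I'd expect the code to look like based purely on the documentation quoted above; untested, and textureId is assumed to be a GL texture name you have already generated:

SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
Camera camera = Camera.open();
try {
    camera.setPreviewTexture(surfaceTexture);  // frames go to the texture, not the display
    camera.startPreview();
    // surfaceTexture.setOnFrameAvailableListener(...) signals each new frame.
} catch (IOException e) {
    camera.release();
}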
Using a surface associated with an Activity, surfaceDestroyed() is called sometime between onPause() and onStop() when the Activity is being minimised, although, oddly, not when the phone is being put to sleep: How SurfaceHolder callbacks are related to Activity lifecycle? But I hope that a SurfaceTexture is not destroyed in this way.
I'm trying to figure out if it is possible to use a camera video stream from within a background service rather than from a normal intent.
What I had in mind is this:
start the service from my app
this service accesses the video stream and extracts features continuously; depending on the features, it sends network packets (to localhost)
user switches to another app - the service must still be running and extracting features!
Before trying to implement all that, I'd like to know if it was possible.
Thanks in advance
Nicola
Yes, it's possible. Check my answer and sample code on a similar thread:
Open/Run Camera from background service in android
You have to show a fake preview over the current screen. For that, you have to create a SurfaceView with 1×1 px dimensions and display the preview in it, drawing the SurfaceView over other apps, as in the sketch below.
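A sketch of that trick (it needs the SYSTEM_ALERT_WINDOW permission; on newer Android versions the window type would be TYPE_APPLICATION_OVERLAY rather than TYPE_SYSTEM_ALERT):

WindowManager wm = (WindowManager) getSystemService(Context.WINDOW_SERVICE);
SurfaceView preview = new SurfaceView(this);
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        1, 1,                                          // 1×1 px, effectively invisible
        WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,  // draw over other apps
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
        PixelFormat.TRANSLUCENT);
wm.addView(preview, params);
// Then hand preview.getHolder() to camera.setPreviewDisplay(...) from the service.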
Check out this library, which provides the facility to capture images in the background, even from a service.
I don't think this is possible, as the camera needs a preview screen. See the previous question here.