Android, Open Front and Back Cameras Simultaneously [duplicate]

This question already has answers here:
Using both front and back cameras simultaneously android
(3 answers)
Closed 5 years ago.
I am attempting to use two SurfaceHolder objects tied to two separate SurfaceViews.
For one set I am calling Camera.open(0) for the back camera, and for the other Camera.open(1) for the front.
I get a perfect preview from whichever camera I open first, but I am unable to open both cameras at the same time, even though I am using a separate SurfaceView and SurfaceHolder for each camera.
Is it just impossible to do this under Android? I have seen a couple of posts suggesting that it is either not possible or that it is phone-hardware dependent, but no concrete explanation as to why.
Could somebody explain why Android does not appear to support this?
If it is supported, could someone suggest the correct way of opening both cameras at the same time?
I have also seen suggestions that it should be possible using OpenCV. If so, could someone please provide a link to an example or similar?
Thanks and Regards,
Steed.

It is possible because I've done it on my Nexus 6, even recording video from both cameras simultaneously when using Camera1 APIs. However, it is very much limited to a few devices.
Any unsupported device should give an error during the 2nd Camera.open() call. It seems each hardware manufacturer provides a different implementation of the Camera APIs. You could pretty easily try/catch the exception if a camera does not allow it.
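For example, a rough sketch along these lines (Camera1 API; the class name is just illustrative) falls back gracefully when a device refuses the second open:

    import android.hardware.Camera;

    // Sketch (Camera1 API): try to open both cameras and fall back if the device
    // refuses the second open. The exact exception behaviour varies by manufacturer.
    public final class DualCameraOpener {
        public static Camera[] openBothOrBackOnly() {
            Camera back = Camera.open(0);      // back camera
            Camera front = null;
            try {
                front = Camera.open(1);        // front camera; throws on most devices
            } catch (RuntimeException e) {
                // Second open refused: this device can't run both cameras at once.
            }
            return new Camera[] { back, front };   // front stays null if unsupported
        }
    }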

This is possible on certain phones, and on pretty much all newer phones. I have found that devices using the Snapdragon 801 and higher chipsets support this (OnePlus One, HTC One M8, etc.). This was somewhere around 2014.
It all depends on the hardware/manufacturer, so you should test this on real devices.
Also be aware that the original Camera API outputs preview frames in YUV, so you will have to convert them to another format if you want to use the images or video. You can display the preview fine on a SurfaceView in real time, but for saving pictures or video I suggest saving the raw YUV and converting it later on a separate thread (single images can also be saved and converted in real time on a separate thread).
Sorry for the late answer but I hope you or someone else can use this info!
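To illustrate the YUV point above, the Camera1 preview callback hands you NV21 frames, and a rough sketch of converting one of them to JPEG on a background thread (the class name and quality value are just examples) could look like this:

    import android.graphics.ImageFormat;
    import android.graphics.Rect;
    import android.graphics.YuvImage;
    import java.io.ByteArrayOutputStream;

    // Sketch: convert one NV21 preview frame (the Camera1 default format) to JPEG.
    // Width and height must match the configured preview size.
    public final class YuvToJpeg {
        public static byte[] convert(byte[] nv21, int width, int height) {
            YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            yuv.compressToJpeg(new Rect(0, 0, width, height), 90 /* quality */, out);
            return out.toByteArray();
        }
    }

In onPreviewFrame you would just copy the byte[] and hand it to a worker thread that calls convert(), which keeps the preview smooth.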

Related

Using two cameras at a same time in Android Things

I am in a situation where I need to use two cameras at the same time.
I have been searching the internet for Camera2 API examples. Although I have not yet managed to develop my own camera app the way I want for an Android phone, I found some examples which open the camera.
I would like to know if I can access two cameras simultaneously on an Android Things Odroid N2+ board. I am working on an app that needs to open two cameras and display both at the same time. For processing the images, I am planning to use the OpenCV library.
Is this possible on Android/Odroid?
I recommend using the UVC library to access the two cameras; Android Things is not a good fit for this problem. I think this example will help you: https://github.com/saki4510t/UVCCamera

Android - Utilise both front and back cameras in app

I have searched quite a bit about whether it is possible to use both front and back cameras simultaneously in an app. I found threads from several years ago saying it is possible on certain devices, and on all Samsung phones after something like the S4; however, that feature is locked to Samsung-developed applications only. I then looked into whether it is possible to switch rapidly between the two cameras to achieve the same goal, but apparently that would be extremely taxing on the hardware. I was wondering if anyone has more recent information about this in 2017, and whether developing an application that uses both front and back cameras simultaneously is viable?
I know this is way late, but here are two posts I made on this to help anyone who runs into it:
https://stackoverflow.com/a/28811277/1138878
https://stackoverflow.com/a/43445052/1138878
Short answer: it's possible but depends on hardware/chipset (Snapdragon 801 and higher level hardware).
What it boils down to is that you need a Camera object for each camera which feeds a SurfaceView for each camera. Also make sure to check, in code, the capabilities (resolution and image format) and use one of the supported formats/sizes.
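A rough sketch of that setup (Camera1 API; the class name and the choice of the first supported size are placeholders) with the capability check folded in:

    import android.hardware.Camera;
    import android.view.SurfaceHolder;
    import java.io.IOException;
    import java.util.List;

    // Sketch: one Camera per lens, each feeding its own SurfaceHolder, with the
    // preview size taken from that camera's own supported list.
    public final class DualPreview {
        public static Camera startPreview(int cameraId, SurfaceHolder holder) throws IOException {
            Camera camera = Camera.open(cameraId);       // may throw on the second open
            Camera.Parameters params = camera.getParameters();
            List<Camera.Size> sizes = params.getSupportedPreviewSizes();
            Camera.Size size = sizes.get(0);             // pick one of the supported sizes
            params.setPreviewSize(size.width, size.height);
            camera.setParameters(params);
            camera.setPreviewDisplay(holder);
            camera.startPreview();
            return camera;
        }
    }

You would call this once with camera ID 0 and the back SurfaceHolder, then with ID 1 and the front one, wrapping the second call in try/catch as described in the linked answers.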

Access both back and front cameras simultaneously

What I'm trying to achieve: access both front and back cameras at the same time.
What I've researched: I know the Android Camera API doesn't support using multiple instances of Camera, and you have to release one camera before using the other. I've read dozens of questions about this, and I know it's possible on some devices (such as the Samsung S4 or other newer devices from them).
I've also found out that it's possible to have access to both of them in Android KitKat on SOME devices.
I also know that on API >= 21, using the camera2 API, it's supposed to be possible to access both of them at the same time.
What I've got so far: an implementation for accessing the cameras one at a time in order to provide a picture-in-picture effect.
I know it's not possible to implement simultaneous dual cameras on every device; I just want a way to make it available on the devices that support it.
How can I test to see if the device is capable of accessing both of them?
I've also searched for a library that would allow such a thing, but I didn't find anything. Is there such a library?
I would like to make this feature available on as many devices as possible; for the others, I'll leave the feature in its current one-at-a-time state.
Can anyone please help me, at least with some pieces of advice?
Thanks!
The Android camera APIs generally allow multiple cameras to be used at the same time, but most devices do not have enough hardware resources to support that in practice - for example, there's often only one camera image processor shared by both cameras.
There's no query that's included in the Android APIs that'll tell you up front if you can use multiple cameras at the same time.
The only way to tell is to try to open a second camera when you already have one open. If you can open the second camera, then you can do picture-in-picture, etc. If you get an exception trying to open the second camera, then that particular device doesn't support having both cameras open.
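A runtime probe in that spirit might look like this (Camera1 API; the helper name is illustrative). It opens the other camera briefly and releases it again, though as a later answer in this thread notes, some devices only fail at capture time:

    import android.hardware.Camera;

    // Sketch: with one camera already open, briefly try to open the other one to
    // see whether this device tolerates two open cameras at once.
    public final class DualCameraProbe {
        public static boolean secondCameraAvailable(int otherCameraId) {
            Camera probe = null;
            try {
                probe = Camera.open(otherCameraId);
                return true;                    // the device allowed a second open
            } catch (RuntimeException e) {
                return false;                   // refused: treat as single-camera device
            } finally {
                if (probe != null) probe.release();
            }
        }
    }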
It is possible using the Android Camera2 API, but as indicated above most devices don't have hardware support. If you have a Nexus 5X, Nexus 6, or Nexus 6P it will work and you can test with this BothCameras app. I've implemented blitting to allow video recording as well (in addition to still pictures) using the hardware h264 encoder.
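For the Camera2 route, opening the two devices is just openCamera called twice with different IDs; a trimmed sketch (error handling, output surfaces and the CAMERA permission check are omitted, and the class name is a placeholder):

    import android.content.Context;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraDevice;
    import android.hardware.camera2.CameraManager;
    import android.os.Handler;

    // Sketch: open every camera2 device by ID (typically "0" and "1"). On unsupported
    // hardware the second open, or the first capture, fails, so watch onError.
    public final class DualCamera2Opener {
        public static void openBoth(Context context, Handler handler) throws CameraAccessException {
            CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
            for (String id : manager.getCameraIdList()) {
                manager.openCamera(id, new CameraDevice.StateCallback() {
                    @Override public void onOpened(CameraDevice camera) {
                        // create a capture session targeting this camera's Surface here
                    }
                    @Override public void onDisconnected(CameraDevice camera) { camera.close(); }
                    @Override public void onError(CameraDevice camera, int error) { camera.close(); }
                }, handler);
            }
        }
    }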
You cannot access both cameras on all Android phones due to hardware limitations. The best alternative is to use the two cameras one at a time: with a single Camera object you can switch the camera facing to take the second photo.
I have done this in one of my applications:
https://play.google.com/store/apps/details?id=com.ushaapps.bothie
It's worth mentioning that in some cases just opening two cameras with the Camera2 API is not enough to know whether simultaneous use is supported.
Some devices do not throw an error during opening: the second camera opens correctly, but the first one starts invoking the onCaptureFailed callback.
So the most reliable check is to start both cameras, wait for frames from each of them, and verify that there are no capture failure errors.
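In code, that check lives in the CaptureCallback attached to each camera's repeating request; something along these lines (Camera2 API; the class name and flags are illustrative):

    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CaptureFailure;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.TotalCaptureResult;
    import java.util.concurrent.atomic.AtomicBoolean;

    // Sketch: attach one of these to each camera's repeating request. Only if both
    // cameras keep completing captures is simultaneous use really supported.
    public final class CaptureHealthCallback extends CameraCaptureSession.CaptureCallback {
        public final AtomicBoolean sawFrame = new AtomicBoolean(false);
        public final AtomicBoolean sawFailure = new AtomicBoolean(false);

        @Override
        public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
                                       TotalCaptureResult result) {
            sawFrame.set(true);       // this camera is actually delivering frames
        }

        @Override
        public void onCaptureFailed(CameraCaptureSession session, CaptureRequest request,
                                    CaptureFailure failure) {
            sawFailure.set(true);     // e.g. the first camera starves once the second opens
        }
    }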

How to take photo with front and rear camera simultaneously? [duplicate]

This question already has answers here:
Using both front and back cameras simultaneously android
(3 answers)
Closed 9 years ago.
I'm trying to make an app that takes a picture with the front and the rear camera and combines them into one image.
This is kinda how it would look.
I've read http://developer.android.com/reference/android/hardware/Camera.html and searched, but really couldn't find anything on the subject, although I know I've seen other apps like this on the app store.
Mostly it depends on the hardware of the device, and most devices currently do not allow both cameras to be used simultaneously. Even if some devices can use both cameras, if you want to make an application that runs flawlessly on as many devices as possible (and that is what you want), you should not rely on this. Google says:
In order for your application to be compatible with more devices, you
should not make assumptions about the device camera specifications
What you can do, depending on the application you are writing, is to take the pictures in quick succession: once with the front camera and then a second time with the back camera.
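A rough sketch of that alternating capture (Camera1 API; the class name is illustrative). Each shot opens its camera, takes the picture and releases the camera before the other one is opened:

    import android.hardware.Camera;
    import android.view.SurfaceHolder;
    import java.io.IOException;

    // Sketch: capture a JPEG with one camera, then release it so the other camera
    // can be opened for the second shot.
    public final class AlternatingCapture {
        public static void capture(int cameraId, SurfaceHolder holder,
                                   final Camera.PictureCallback onJpeg) throws IOException {
            final Camera camera = Camera.open(cameraId);
            camera.setPreviewDisplay(holder);
            camera.startPreview();                      // preview must run before takePicture
            camera.takePicture(null, null, new Camera.PictureCallback() {
                @Override public void onPictureTaken(byte[] data, Camera cam) {
                    cam.stopPreview();
                    cam.release();                      // free it before opening the other camera
                    onJpeg.onPictureTaken(data, cam);
                }
            });
        }
    }

The caller would invoke capture(0, holder, ...) for the back shot and, inside that callback, capture(1, holder, ...) for the front shot before merging the two JPEGs.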

Does Android support a better method than glReadPixels?

I'm making an Android game (using AndEngine) and I need to record the gameplay screen.
This is not for making a promotional video; it is for players to review their own gameplay.
The app has to record the video by itself, so I can't solve this problem with an existing recording app from the market.
I already checked the code below:
http://code.google.com/p/andengineexamples/source/browse/src/org/anddev/andengine/examples/ScreenCaptureExample.java?spec=svn66d3057f672175a8a21f9d06e7a045942331f65c&r=66d3057f672175a8a21f9d06e7a045942331f65c
It works very well, but it captures a single screenshot and I want to record gameplay video.
I need at least 24 fps for smooth replay, but with glReadPixels I only get about 5 fps on my Xoom device.
I searched various websites for a way around this performance problem. Most people say glReadPixels is too slow to record video:
http://www.gamedev.net/topic/473794-glreadpixel-takes-tooooo-much-time/
They recommend glCopyTexImage2D instead of glReadPixels because it is much faster, but I can't find out how to use glCopyTexImage2D in AndEngine, and some even say that OpenGL ES on Android does not support it.
Another possible method for recording smooth video is to read the framebuffer of the Android device. Most recording apps in the market use this method, but those apps need root permission to grab the framebuffer.
I've read some news that Android will support screen capture from SurfaceFlinger after Gingerbread, but I can't find out how to use the framebuffer without root permission.
These are the solutions I'm guessing at:
1. Use another OpenGL API that is faster than glReadPixels.
2. Find an Android API that can read the framebuffer without root permission (maybe I can access SurfaceFlinger?).
3. Draw to another offscreen texture and record the video from that.
But I don't know how to implement these methods.
Which approach is correct? Do you have example code for recording video on Android?
Please help me solve this problem. If you know any other method, that would be helpful as well. Any help will be appreciated.
Does the GPU of your device support OpenGL ES 3.0? If it does, you can try using a PBO (pixel buffer object).
Here is a topic you can refer to: Low readback performance with PBO, help!!!!!
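To illustrate the PBO idea, here is a rough sketch (OpenGL ES 3.0 via android.opengl.GLES30; the class and method names are just placeholders) of asynchronous readback. glReadPixels into a bound PBO returns quickly, and you map the buffer a frame or two later:

    import android.opengl.GLES30;
    import java.nio.ByteBuffer;

    // Sketch: asynchronous pixel readback with a pixel buffer object on OpenGL ES 3.0.
    // All calls must be made on the GL thread after the frame has been rendered.
    public final class PboReader {
        private final int[] pbo = new int[1];
        private final int width, height;

        public PboReader(int width, int height) {
            this.width = width;
            this.height = height;
            GLES30.glGenBuffers(1, pbo, 0);
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0]);
            // Allocate room for one RGBA frame; GL_STREAM_READ hints frequent readback.
            GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, width * height * 4, null,
                    GLES30.GL_STREAM_READ);
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
        }

        // Start the readback; with a PBO bound, glReadPixels does not block the GL thread.
        public void requestFrame() {
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0]);
            GLES30.glReadPixels(0, 0, width, height,
                    GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0 /* offset into the PBO */);
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
        }

        // Map the PBO (ideally a frame or two later) to copy the pixels out.
        public ByteBuffer mapFrame() {
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbo[0]);
            return (ByteBuffer) GLES30.glMapBufferRange(GLES30.GL_PIXEL_PACK_BUFFER, 0,
                    width * height * 4, GLES30.GL_MAP_READ_BIT);
        }

        public void unmapFrame() {
            GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
            GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);
        }
    }

Double-buffering two PBOs (reading into one while mapping the other) hides the transfer latency and is usually what gets the frame rate up.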
