Using two cameras at the same time in Android Things - android

I am in a situation where I need to use two cameras at the same time.
I have been searching the internet for Camera2 API examples. Although I have not yet succeeded in developing my own camera app the way I want for an Android phone, I found some examples that open the camera.
Now I have a situation: I would like to know whether I can access two cameras simultaneously on an Android Things Odroid N2+ board. I am working on an app that needs to open two cameras and display both feeds at the same time. For processing the images, I plan to use the OpenCV library.
Is this possible on Android/Odroid?

I also recommend using the UVC library to access the two cameras; Android Things is not well suited to this problem. I think this example will help you: https://github.com/saki4510t/UVCCamera

Related

Android - Utilise both front and back cameras in app

I have searched quite a bit about whether or not it is possible to utilise both front and back cameras simultaneously in an app. I found threads from several years ago saying it is possible on certain devices, and on all Samsung phones after something like the S4; however, that feature is locked to Samsung-developed applications only. I then looked into whether or not it is possible to switch rapidly between the two cameras to achieve the same goal, but apparently that would be extremely taxing on the hardware. I was wondering if anyone has more information about this in 2017, and whether developing an application that is able to use both front and back cameras simultaneously is viable.
I know this is way late, but here are two posts I made on this to help anyone who runs into it:
https://stackoverflow.com/a/28811277/1138878
https://stackoverflow.com/a/43445052/1138878
Short answer: it's possible but depends on hardware/chipset (Snapdragon 801 and higher level hardware).
What it boils down to is that you need a separate Camera object for each camera, each feeding its own SurfaceView. Also make sure to check the capabilities (resolution and image format) in code and use one of the supported formats/sizes.
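For illustration, here is a minimal sketch along those lines using the deprecated android.hardware.Camera API. The class and method names are mine, it assumes each SurfaceView's holder has already been created, and it only works on devices whose camera HAL actually allows both cameras to be open at once:

```java
import android.hardware.Camera;
import android.view.SurfaceHolder;
import java.io.IOException;

// Minimal sketch: one Camera per SurfaceHolder (each holder belongs to its own
// SurfaceView). Works only if the device supports both cameras being open at once.
public class DualPreview {
    private Camera backCamera;
    private Camera frontCamera;

    public void startPreviews(SurfaceHolder backHolder, SurfaceHolder frontHolder)
            throws IOException {
        backCamera = openOn(0, backHolder);   // 0 = back camera
        frontCamera = openOn(1, frontHolder); // 1 = front camera; may throw on most devices
    }

    private Camera openOn(int cameraId, SurfaceHolder holder) throws IOException {
        Camera camera = Camera.open(cameraId);
        Camera.Parameters params = camera.getParameters();
        // Check the capabilities and pick a supported preview size/format.
        Camera.Size size = params.getSupportedPreviewSizes().get(0);
        params.setPreviewSize(size.width, size.height);
        params.setPreviewFormat(params.getSupportedPreviewFormats().get(0));
        camera.setParameters(params);
        camera.setPreviewDisplay(holder);
        camera.startPreview();
        return camera;
    }

    public void release() {
        if (backCamera != null) { backCamera.stopPreview(); backCamera.release(); }
        if (frontCamera != null) { frontCamera.stopPreview(); frontCamera.release(); }
    }
}
```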

Access both back and front cameras simultaneously

What I'm trying to achieve: access both front and back cameras at the same time.
What I've researched: I know the Android Camera API doesn't support multiple Camera instances, and you have to release one camera before using the other. I've read dozens of questions about this, and I know it's possible on some devices (like the Samsung S4, or other newer devices from them).
I've also found out that it's possible to access both of them in Android KitKat on SOME devices.
I also know that on API >= 21, using the Camera2 API, it's possible to access both of them at the same time because it's thread safe.
What I've got so far: an implementation for accessing the cameras one at a time in order to provide picture-in-picture.
I know it's not possible to implement simultaneous dual cameras on every device; I just want a way to make this available to the devices that support it.
How can I test to see if the device is capable of accessing both of them?
I've also searched for a library that would allow me to do such a thing, but I didn't find anything. Is there such a library?
I would like to make this feature available for as many devices as possible, and for the others I'll keep the current (one-by-one) behaviour of the feature.
Can anyone please help me, at least with some advice?
Thanks
The Android camera APIs generally allow multiple cameras to be used at the same time, but most devices do not have enough hardware resources to support that in practice - for example, there's often only one camera image processor shared by both cameras.
There's no query included in the Android APIs that will tell you up front whether you can use multiple cameras at the same time.
The only way to tell is to try to open a second camera when you already have one open. If you can open the second camera, then you can do picture-in-picture, etc. If you get an exception trying to open the second camera, then that particular device doesn't support having both cameras open.
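For example, a minimal probe along those lines might look like this (old Camera API; it assumes the usual id convention of 0 = back and 1 = front, and that no other app currently holds the cameras):

```java
import android.hardware.Camera;

// Returns true if a second camera can be opened while the first is held open.
// Minimal sketch using the deprecated Camera API; release order matters.
public static boolean supportsDualCameras() {
    Camera first = null;
    Camera second = null;
    try {
        first = Camera.open(0);   // back camera
        second = Camera.open(1);  // front camera; throws on unsupported devices
        return true;
    } catch (RuntimeException e) {
        return false;             // second open failed: no simultaneous support
    } finally {
        if (second != null) second.release();
        if (first != null) first.release();
    }
}
```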
It is possible using the Android Camera2 API, but as indicated above, most devices don't have hardware support. If you have a Nexus 5X, Nexus 6, or Nexus 6P it will work, and you can test with this BothCameras app. I've implemented blitting to allow video recording as well (in addition to still pictures) using the hardware H.264 encoder.
You cannot access both cameras on all Android phones due to hardware limitations. The best alternative is to use the two cameras one at a time. For that you can use a single Camera object and switch the camera facing to take the second photo.
I have done this in one of my application.
https://play.google.com/store/apps/details?id=com.ushaapps.bothie
I'd like to mention that in some cases just opening two cameras with the Camera2 API is not enough to determine whether they are truly supported together.
Some devices do not throw an error during opening: the second camera opens correctly, but the first one then invokes the onCaptureFailed callback.
So the most accurate check is to start both cameras, wait for frames from each of them, and verify that there are no capture failure errors.
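As a hedged sketch of that check: a CaptureCallback like the one below could be attached to each camera's repeating preview request; dual-camera support would only be considered confirmed once both callbacks report completed frames with no failures. The class name and boolean flags are illustrative, not from the original answer.

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;

// Attach one instance to each camera's repeating preview request. Simultaneous
// support is only confirmed once BOTH cameras have delivered a completed frame
// and neither has reported a capture failure.
class DualCameraProbeCallback extends CameraCaptureSession.CaptureCallback {
    private final String cameraId;
    volatile boolean gotFrame = false;
    volatile boolean failed = false;

    DualCameraProbeCallback(String cameraId) { this.cameraId = cameraId; }

    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        gotFrame = true;  // this camera is actually streaming
    }

    @Override
    public void onCaptureFailed(CameraCaptureSession session,
                                CaptureRequest request,
                                CaptureFailure failure) {
        failed = true;    // e.g. the first camera starts failing once the second opens
    }
}
```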

Android, Open Front and Back Cameras Simultaneously [duplicate]

This question already has answers here: Using both front and back cameras simultaneously android (3 answers). Closed 5 years ago.
I am attempting to use two SurfaceHolder objects tied to two separate SurfaceViews.
For one set I am doing a Camera.open(0) for the back camera, and Camera.open(1) for the front.
I am able to get a perfect preview for whichever camera I open first, but I am unable to open both cameras at the same time, even though I am using separate SurfaceViews and SurfaceHolders for each Camera.
Is it just impossible to do this under Android? I have seen a couple of posts suggesting that it is either not possible or that it is phone-hardware dependent, but no concrete explanation as to why.
Could somebody explain why Android does not appear to support this?
If it is supported, could someone suggest the correct way of opening both cameras at the same time?
I have also seen some suggestions that it should be possible using OpenCV. If so, could someone please provide a link to an example or similar?
Thanks and Regards,
Steed.
It is possible because I've done it on my Nexus 6, even recording video from both cameras simultaneously when using Camera1 APIs. However, it is very much limited to a few devices.
Any unsupported device should give an error during the 2nd Camera.open() call. It seems each hardware manufacturer provides a different implementation of the Camera APIs. You could pretty easily try/catch the exception if a camera does not allow it.
This is possible on certain phones and pretty much all newer phones. I have found that devices using the Snapdragon 801 and higher chipsets support this (OnePlus One, HTC M8, etc.). This was somewhere in 2014.
It all depends on the hardware/manufacturer, and you should test this on real devices.
Also be aware that the old Camera API outputs frames in YUV, so you will have to handle conversion to another format if you want to use the images/video. You can display the preview fine on a SurfaceView in real time, but for saving pictures/video I suggest you store the raw YUV and convert it later on a separate thread, although you could save and convert single images in real time on a separate thread.
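As a rough sketch of that conversion (assuming the default NV21 preview format and a byte array delivered by onPreviewFrame()), something like this could run on a background thread:

```java
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.FileOutputStream;
import java.io.IOException;

// Convert one NV21 preview frame (the default Camera preview format) to a JPEG file.
// Call this from a background thread with the bytes from onPreviewFrame().
public static void saveNv21AsJpeg(byte[] nv21, int width, int height, String path)
        throws IOException {
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
    try (FileOutputStream out = new FileOutputStream(path)) {
        // Compress the full frame at quality 90.
        yuv.compressToJpeg(new Rect(0, 0, width, height), 90, out);
    }
}
```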
Sorry for the late answer but I hope you or someone else can use this info!

How to develop a Smart Magnifier using the mobile camera?

I would like to use the mobile camera and develop a smart magnifier that can zoom and freeze-frame what we are viewing, so we don't have to keep holding the device steady while we read. It should also be able to change colors, as shown in the image at the link below.
https://lh3.ggpht.com/XhSCrMXS7RCJH7AYlpn3xL5Z-6R7bqFL4hG5R3Q5xCLNAO0flY3Fka_xRKb68a2etmhL=h900-rw
Since I'm new to Android I have no idea how to start. Do you have any ideas?
Thanks in advance for your help :)
I've done something similar and published it here. I have to warn you though, this is not a task to start Android development with. Not because of the development skills required; the showstopper here is the need for a massive number of devices to test on.
Basically, two reasons:
The Camera API is quite complicated, and different hardware devices behave differently. Forget about using the emulator; you would need a bunch of real hardware devices.
There is a new API, Camera2, for platform 21 and higher, and the old Camera API is deprecated (in a kind of 'in limbo' state).
I have posted some custom Camera code on GitHub here, to show some of the hurdles involved.
So the easiest way out in your situation would be to use the camera intent approach, and when you get your picture back (it is a JPEG file), just decode it and zoom in on the center of the resulting bitmap.
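A rough sketch of that approach, placed inside an Activity (the request code, output path, and the 2x zoom factor are my own illustrative choices; on API 24+ you would need a FileProvider Uri instead of Uri.fromFile):

```java
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.provider.MediaStore;
import java.io.File;

// Illustrative only: REQUEST_CAPTURE and the output file name are assumptions.
private static final int REQUEST_CAPTURE = 1;
private File photoFile;

private void launchCameraIntent() {
    photoFile = new File(getExternalFilesDir(null), "capture.jpg");
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // Note: on API 24+ wrap the file in a FileProvider Uri instead.
    intent.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));
    startActivityForResult(intent, REQUEST_CAPTURE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_CAPTURE && resultCode == RESULT_OK) {
        Bitmap full = BitmapFactory.decodeFile(photoFile.getAbsolutePath());
        // "Magnify" by cropping the central quarter and scaling it back up (2x zoom).
        Bitmap center = Bitmap.createBitmap(full,
                full.getWidth() / 4, full.getHeight() / 4,
                full.getWidth() / 2, full.getHeight() / 2);
        Bitmap zoomed = Bitmap.createScaledBitmap(
                center, full.getWidth(), full.getHeight(), true);
        // Show "zoomed" in an ImageView, apply color filters, etc.
    }
}
```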
Good Luck

How can I set the anti-shake (image stabilizer) camera function on Android?

I've made a camera app.
I want to add anti-shake (image stabilizer) functionality, but I could not find the setting for it.
Please help me!
Usually image stabilization is a built-in camera feature, while OIS (Optical Image Stabilization) is a built-in hardware feature; as of now very few devices support them.
If the device doesn't have a built-in feature, I don't think there is anything you can do.
Android doesn't provide a direct API to manage image stabilization, but you may try:
if android.hardware.Camera.getParameters().getSupportedSceneModes() contains the steadyphoto keyword (see here), your device supports a kind of stabilization (usually it shoots when the accelerometer data indicates a "stable" situation)
check android.hardware.Camera.getParameters().flatten() for an "OIS" or "image-stabilizer" keyword/value, or something similar to use in Parameters.set(key, value). For the Samsung Galaxy Camera you should use parameters.set("image-stabilizer", "ois"); // can be "ois" or "off" (see the sketch after this list)
if you are really determined, you may try reading the accelerometer data yourself and deciding to shoot when the device looks steady.
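Putting the first two checks together, a hedged sketch might look like this (deprecated Camera API; the "image-stabilizer"/"ois" key is a vendor extension noted above for the Samsung Galaxy Camera, and other manufacturers may use different keys or none at all):

```java
import android.hardware.Camera;
import java.util.List;

// Sketch of the checks above: standard "steadyphoto" scene mode first,
// then the vendor-specific "image-stabilizer" parameter if it is advertised.
public static void enableStabilizationIfAvailable(Camera camera) {
    Camera.Parameters params = camera.getParameters();

    // 1. Standard scene mode: "steadyphoto" (waits for a stable accelerometer reading).
    List<String> sceneModes = params.getSupportedSceneModes();
    if (sceneModes != null
            && sceneModes.contains(Camera.Parameters.SCENE_MODE_STEADYPHOTO)) {
        params.setSceneMode(Camera.Parameters.SCENE_MODE_STEADYPHOTO);
    }

    // 2. Vendor key: look for an "image-stabilizer" entry in the flattened parameters.
    if (params.flatten().contains("image-stabilizer")) {
        params.set("image-stabilizer", "ois"); // can be "ois" or "off"
    }

    camera.setParameters(params);
}
```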
Good luck.
If you want to develop a software image stabilizer, OpenCV is a helpful library. The following is one way to stabilize the image using features.
First, extract features from the image using a feature extractor such as the SIFT or SURF algorithm. In my case, the FAST+ORB combination worked best. If you want more information, see this paper.
After you have the features in the images, you should find matching features between the images. There are several matchers, but the brute-force matcher is not bad. If brute force is too slow on your system, you should use an algorithm like a KD-tree.
Last, you should compute the geometric transformation matrix that minimizes the error of the transformed points. You can use the RANSAC algorithm in this process.
You can implement this whole process with OpenCV, and I have already done so on mobile devices. See this repository.
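As a rough sketch of that pipeline using OpenCV's Java bindings (ORB features, brute-force Hamming matching, then a RANSAC homography; grayscale Mat inputs and a sufficient number of matches are assumed):

```java
import org.opencv.calib3d.Calib3d;
import org.opencv.core.DMatch;
import org.opencv.core.KeyPoint;
import org.opencv.core.Mat;
import org.opencv.core.MatOfDMatch;
import org.opencv.core.MatOfKeyPoint;
import org.opencv.core.MatOfPoint2f;
import org.opencv.core.Point;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.ORB;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.List;

// Sketch of the ORB -> brute-force matching -> RANSAC pipeline described above.
// Aligns "frame" to "reference" so consecutive frames appear steady.
public class FeatureStabilizer {
    private final ORB orb = ORB.create();
    private final DescriptorMatcher matcher =
            DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);

    public Mat stabilize(Mat reference, Mat frame) {
        // 1. Detect keypoints and compute ORB descriptors in both images.
        MatOfKeyPoint kpRef = new MatOfKeyPoint(), kpCur = new MatOfKeyPoint();
        Mat descRef = new Mat(), descCur = new Mat();
        orb.detectAndCompute(reference, new Mat(), kpRef, descRef);
        orb.detectAndCompute(frame, new Mat(), kpCur, descCur);

        // 2. Brute-force match descriptors between the current frame and the reference.
        MatOfDMatch matches = new MatOfDMatch();
        matcher.match(descCur, descRef, matches);

        // Collect the matched point pairs.
        List<Point> srcPts = new ArrayList<>(), dstPts = new ArrayList<>();
        KeyPoint[] refPts = kpRef.toArray(), curPts = kpCur.toArray();
        for (DMatch m : matches.toArray()) {
            srcPts.add(curPts[m.queryIdx].pt);
            dstPts.add(refPts[m.trainIdx].pt);
        }

        // 3. Estimate the transform with RANSAC and warp the frame onto the reference.
        Mat h = Calib3d.findHomography(
                new MatOfPoint2f(srcPts.toArray(new Point[0])),
                new MatOfPoint2f(dstPts.toArray(new Point[0])),
                Calib3d.RANSAC, 3.0);
        Mat stabilized = new Mat();
        Imgproc.warpPerspective(frame, stabilized, h, reference.size());
        return stabilized;
    }
}
```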
