How do you create an Android emulator with multiple back cameras?

I am trying to work on stereo video capture using the two back cameras of an Android device. Since I don't have a device with Android Pie and two back cameras on hand, I've been trying to create an emulator with those specifications, but I can't find any resources on how to create a hardware profile with two back cameras. Is it possible to do this in Android Studio 3.3?

I think you are trying to use the new multi-camera API introduced in Android P.
The API queries the physical camera sensors/components, so I don't know whether you can emulate all of its features without a physical device. The best you can do is emulate a Google Pixel 3 or a Huawei Mate 20 Pro (or another Mate 20 series phone) with the camera enabled.
Summary: I suggest you emulate a Pixel 3 from Android Studio and use the API. You should be able to implement some simple functions, such as zooming or differentiating between the physical and logical cameras. But if you want to implement serious features, you need a physical phone with multi-camera support.
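To see what the multi-camera API reports on whatever device or image you run, a minimal sketch like the following can help. It only uses documented Camera2 calls (getPhysicalCameraIds() is API 28+); the class name and logging are my own, and on an emulator you should expect the physical-ID set to come back empty, since the virtual camera is not a logical multi-camera.

```java
import android.content.Context;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

// Sketch: list the physical sensors behind each logical camera (API 28+).
// On the emulator this typically prints an empty set, because the virtual
// back camera does not declare the LOGICAL_MULTI_CAMERA capability.
public class MultiCameraProbe {
    public static void logPhysicalCameras(Context context) throws Exception {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            // getPhysicalCameraIds() is empty unless this camera is a
            // logical multi-camera backed by several physical sensors.
            System.out.println("Camera " + id + " -> physical IDs: "
                    + chars.getPhysicalCameraIds());
        }
    }
}
```

Running this on a Pixel 3 image versus a real multi-camera phone makes the emulator's limitation visible immediately.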

Related

Is it possible to read data from TOF (time-of-flight) sensor on Android?

Recent models of Android phones (Honor View 20, Huawei P30 Pro, Samsung Galaxy S10 5G) have a ToF (time-of-flight) sensor. Is it possible to read it through some API or a manufacturer SDK?
Possible for Huawei using AR Engine SDK
https://developer.huawei.com/consumer/en/ar
Excerpt of SDK Document AREnginesdk-sample-2.0.0.6\HUAWEI AR Engine Function Manual.doc
4.13 Scene mesh
Huawei AR Engine provides a real-time scene mesh output capability. The output includes the pose of the mobile phone in space. The 3D mesh of the current camera view is only supported on the specified Huawei models (Honor V20, P30 Pro) that can obtain depth information, and the supported scanning scene is static.
Excerpt of SDK Document AREnginesdk-sample-2.0.0.6\java\HUAWEI AR Engine SDK Interface Manual.docx
2.2.1.18. ARSceneMesh
• Description: The class used to return the tracking result when the environment Mesh is tracked. The result includes the Mesh vertex coordinates, the triangle subscript, and so on.
• Methods:
public ShortBuffer getSceneDepth()
// Get the depth image of current frame(optimized).
public int getSceneDepthHeight()
// Get the height of the depth image.
public int getSceneDepthWidth()
// Get the width of the depth image.
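Assembled only from the method names quoted above, a hypothetical usage sketch might look like this. ARSceneMesh comes from Huawei's AR Engine SDK, and how you actually obtain an instance per tracked frame is defined by that SDK (check the interface manual); everything else here is illustrative.

```java
import java.nio.ShortBuffer;

// Hypothetical sketch using only the ARSceneMesh methods quoted above.
// Obtaining the ARSceneMesh instance itself is SDK-specific and not shown.
class SceneDepthDump {
    static void dumpDepth(ARSceneMesh mesh) {
        int width = mesh.getSceneDepthWidth();    // width of the depth image
        int height = mesh.getSceneDepthHeight();  // height of the depth image
        ShortBuffer depth = mesh.getSceneDepth(); // 16-bit depth samples
        System.out.println("Depth image " + width + "x" + height
                + " (" + depth.remaining() + " samples)");
    }
}
```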
Possible on Huawei using the Camera2 API
https://github.com/google-ar/arcore-android-sdk/issues/120#issuecomment-535413944
This issue in ARCore contains a lot of information!
Also seems to be possible on S10 5G, but currently not on Note 10+
Excerpt of Night Vision / ToF Viewer app description:
This app is currently working only on Huawei P30 Pro, Honor View 20
and Samsung S10 5G. More devices will start working by future device
software updates.
New features
compatibility for Samsung S10 5G added (and maybe for other devices)
resolution dialog removed
support for front facing removed
Note: A lot of research went into Samsung Note 10+ support;
however, this device does not seem to expose ToF to 3rd-party apps.
Another interesting app:
3D scanner app using Huawei ToF sensors
https://play.google.com/store/apps/details?id=com.lvonasek.arcore3dscannerpro
Android does include OS-level APIs to interact with non-traditional visual cameras. On the Samsung S10 5G, for example, you can access the camera with the Camera2 API and get a DEPTH16 frame directly. Here is an example: https://medium.com/swlh/working-with-the-3d-camera-on-the-samsung-s10-5g-4782336783c This works on all the Samsung devices, though the example only uses the front-facing ToF camera on the S10 5G. Both the S10 5G and the Note 10+ 5G have back-facing ToF cameras as well.
I don't know if Huawei or OnePlus conform to the same API (they theoretically should and other answers indicate that they do to some extent).
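Since the ToF sensor surfaces through standard Camera2 capabilities on devices that expose it, a sketch like the following shows one way to locate it. The calls and constants (DEPTH_OUTPUT capability, DEPTH16 format) are documented Camera2 API; the helper class and the first-match policy are my own simplifications.

```java
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;

// Sketch: find a camera that advertises the DEPTH_OUTPUT capability and a
// DEPTH16 stream, which is how a ToF sensor shows up through Camera2 on
// devices (e.g. the S10 5G discussed above) whose vendors expose it.
public class TofCameraFinder {
    public static String findDepthCamera(CameraManager manager) throws Exception {
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            int[] caps = chars.get(
                    CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
            if (caps == null) continue;
            boolean hasDepth = false;
            for (int c : caps) {
                if (c == CameraCharacteristics
                        .REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT) {
                    hasDepth = true;
                }
            }
            if (!hasDepth) continue;
            StreamConfigurationMap map = chars.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map != null && map.getOutputSizes(ImageFormat.DEPTH16) != null) {
                return id; // first camera offering a DEPTH16 stream
            }
        }
        return null; // no ToF camera exposed to third-party apps
    }
}
```

Whether a given vendor (Huawei, OnePlus) actually registers its ToF sensor this way varies, which is exactly the point made above.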
Not possible as of now.
Android has not included any such APIs in its latest official SDKs.
Manufacturers like Huawei, Samsung, and OnePlus use their own customized OS sources, and I don't think they have opened them up.
I was searching for the same thing and came across this thread on the Samsung community:
https://developer.samsung.com/forum/board/thread/view.do?boardName=SDK&messageId=371359&listLines=40&startId=zzzzz~&searchType=ALL&searchText=tof

android API27 not detecting external usb camera ( LENS_FACING_EXTERNAL)

I am trying to list the number of cameras (front, back, external) using API 27 (8.1.0), which is claimed to support external USB cameras, but I couldn't.
It looks like the Android Camera2 API doesn't support this. Any idea?
The Android camera API supports external cameras, but it still needs device manufacturers to implement this feature at the kernel/HAL level. My understanding is that not many devices today actually implement it.
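A quick way to see what your device's HAL actually registers is to enumerate the camera IDs and their lens facing. The constants are standard Camera2 API (LENS_FACING_EXTERNAL exists since API 23); the class and labels below are my own, and on most devices you will only ever see "front" and "back" printed, for the reason given above.

```java
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

// Sketch: list camera IDs and which way each lens faces. An external USB
// camera appears here only if the vendor's HAL registers it with the
// camera service, which most devices do not do.
public class CameraLister {
    public static void listCameras(CameraManager manager) throws Exception {
        for (String id : manager.getCameraIdList()) {
            Integer facing = manager.getCameraCharacteristics(id)
                    .get(CameraCharacteristics.LENS_FACING);
            String label;
            if (facing == null) {
                label = "unknown";
            } else if (facing == CameraCharacteristics.LENS_FACING_FRONT) {
                label = "front";
            } else if (facing == CameraCharacteristics.LENS_FACING_BACK) {
                label = "back";
            } else {
                label = "external"; // LENS_FACING_EXTERNAL, API 23+
            }
            System.out.println("Camera " + id + ": " + label);
        }
    }
}
```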

Custom Camera Compatibility

I'm developing an app that uses the Android camera, but not the default camera app: I am using a custom one. Everything seems to be fine on my testing device (a locally manufactured smartphone), but I'm having problems when it comes to high-end devices like Samsung and Sony Xperia. Some models from these brands work fine, but with other models I have no luck (especially large devices). I guess the errors are caused by different camera hardware, since each Android device has its own camera hardware that differs from other devices (correct me if I'm wrong).
But I'm wondering how Instagram made their camera compatible with almost every Android phone. Did they use a library or something? Please help me if you know how they did it. Is there any standard for which device I should start developing on, so that if I test my app on other high-end devices there would be no problems? Also, please give me links to a standard custom camera implementation that works on every kind of Android device.

Communicating Between Digital Camera and Android Tablet

I was wondering if there is a known camera that is compatible with Android OSes (such as on the Nexus 7).
I am essentially trying to control a high-resolution digital camera from an Android tablet, so that the tablet can decide when to take a picture and then retrieve it.
This would require a camera with a public API.
I have experience in Android programming but not much in communicating between two different devices, so I was wondering what I should look into in order to achieve this.
Here is a camera that runs Android: http://news.yahoo.com/samsung-takes-aim-japanese-rivals-android-camera-034717081--finance.html
And since it is Android I guess that the API is public.
And here is the same camera with more info: http://www.samsung.com/uk/consumer/mobile-devices/galaxy-camera/galaxy-camera/EK-GC100ZWABTU It does run what look like standard Android apps.
If you want to control that from another Android device, I think that would make a very interesting project.
The other possibility is the Nikon external control SDK, but I have no idea what language it is in. It was used to build the excellent Sofortbild app for Macs, which controls most Nikon DSLRs. https://sdk.nikonimaging.com/apply/
There are Android applications which can control a set of cameras with added features. The one I'm using gives me the ability to take very specific timelapse shots which would be too complicated or even impossible to get through the camera's own controls. You can find many other control apps on the play store.
Unfortunately this one is only for Canon EOS cameras : DSLR Controller

How to emulate a gyroscope in an Android Emulator

I am trying to work on stuff related to a gyroscope. My phone does not have a built-in gyroscope. Is there a way to include gyroscope functionality in the emulator, or at least set up the emulator in such a way that it behaves as if it had a real gyroscope?
p.s. I do not need to read any values from the gyroscope, I just want the emulator to think that it has one.
I have searched thoroughly and all I've found is this: http://code.google.com/p/openintents/wiki/SensorSimulator
But this does not make the emulator believe it has a built-in gyroscope; instead, it runs an app in the emulator and fetches readings from sensors that are simulated in SensorSimulator.
Any info would be helpful
The Android Emulator, launched with Android Studio 3.0, can simulate a range of rotation sensors that might address your use case. A gyroscope was specifically added in Android Emulator v26.1.0.
The gyroscope is newly supported in the emulator of Android Studio 3.0, released to the Canary Channel on 5/17/17. Note that (as of today) Android Studio 2.3.3 is the latest official (i.e. "stable") version. Here is how you set up the Preview version of Android Studio, which can exist concurrently with the official version.
Note that running the emulator out of the box won't work, as it's not a recent enough version.
You need to follow the 'change your update channel' steps in the latter link: select File > Other Settings > Default Settings and update from the Canary Channel.
Note that while running a virtual device with Android 8.0 (API 26) did show gyroscope output in the emulator's Virtual Sensors (within Extended Controls), it does not (at least yet) send that output to the virtual device; to actually see the environment in the sample app move as I moved the phone, I had to use Android 7.1.1 (API 25).
(Thanks to @jamal-eason for the protip!)
PREVIOUS (6/12/17):
As of the date of writing the release version of the Android Emulator (in Android Studio 2.3.3) does not offer Gyroscope support.
While the documentation referenced by @Nesski suggests this, I offer the following as proof:
The Android SDK's Virtual Reality getting started demo is the game called Treasure Hunt. Here is what it looks like when played on a phone. Notice that the camera moves as the player looks around.
Of the handful of devices compatible with the Google Daydream - because they contain an internal Gyroscope - Android Studio's AVD Manager offers only two of them: the Pixel and Pixel XL. I downloaded two virtual devices for each of those phones so that I could run the latest two Android versions (7.1.1 and 8.0) on each device:
I ran each device in the Emulator, and got similar results: press CTRL + SHIFT + C (on Windows) to bring up the Extended Controls, and you'll be able to test the phone's Virtual Sensors:
Using its Rotate controls, you'll notice that while there is Accelerometer, Magnetometer, and Rotation output, there is no Gyroscope output. You can rotate the phone as if you were looking around, but the game's camera view does not change as the phone is moved.
This is unfortunate, but I hope and expect Android to add gyroscope support to the emulator as more developers jump on the Google Daydream virtual reality bandwagon.
I don't think there is any Gyroscope support in the Emulator.
The Sensors documentation on source.android.com states:
The gyroscope cannot be emulated based on magnetometers and
accelerometers, as this would cause it to have reduced local
consistency and responsiveness. It must be based on a usual gyroscope
chip.
I am working on something similar so I'm kind of reading up on what data to collect and what not to.
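In practice, this is why gyroscope-dependent apps check for the sensor at runtime rather than assuming it exists. A minimal sketch using the standard SensorManager API (the class name is my own) shows the check that returns null on emulator images without gyroscope support:

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;

// Sketch: the startup check a gyroscope-dependent app typically runs.
// On an emulator image without gyroscope support, getDefaultSensor()
// returns null, so the app cannot be tricked into seeing one.
public class GyroCheck {
    public static boolean hasGyroscope(Context context) {
        SensorManager sm =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        return sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE) != null;
    }
}
```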
