The front-light option in the ZXing 1.6 barcode scanner does not work on my Nexus One. I need to be able to use the flashlight in my app, but you can't have two instances of the camera running. Is there a way to use the flashlight without accessing the camera? Or can I somehow access a camera that is already in use?
I am using the Google IntentIntegrator.java patch to be able to scan barcodes.
The short answer is "no": the front LED is controlled as a flash mode, which is a property of the camera (the mode is "torch"). And no, two apps can't open the camera at the same time.
(A longer answer is that there used to be a hidden API for this, which is what Barcode Scanner tries to access, but it doesn't work on almost any device anymore. You can dig into the source code to see FlashlightManager.)
Since Android 2.x there has been a proper API for turning on the light, and the beta of the next version of Barcode Scanner uses it. You can try it here.
Related
I am adding the Zebra scanning SDK to my app. I see that when the camera is open, the hardware scanner does not work. I have implemented Scanner.StatusListener, but it is not invoked while the camera is open. I am looking for a way to know when the user presses the hardware scan button while the camera is open, so I can show them a toast. How can I get that callback?
Unfortunately it is not possible to use both the camera and the scanner within the same app, because of a low-level hardware dependency (the dependency exists even if you are scanning with the 2D imager rather than the camera). There is no easy way to programmatically determine that the user has pressed the trigger in this scenario in order to display the toast. The only way I can think of would be for your app to remap the trigger to some other action using the KeyMapping Manager, then revert the trigger to its original behaviour when the camera is dismissed. Rather than trying to manage enabling and disabling the EMDK when the camera is used, I would recommend using DataWedge for scanning in your app. You still can't scan while the camera is displayed, but it should make your application logic simpler.
I have quite a bit of experience with the camera API, but I could not find any documentation to answer this question...
Most phones already have a front and a back camera. Would it be possible to simulate a 3rd camera via software (probably using a service), and register that with the api?
The idea would be that we define a custom camera, register it with the API, and then any camera app would be able to get it by looping through the available cameras.
I imagine several cases where we might want this...
There are some external cameras (such as the FLIR thermal camera) that could provide this.
We might want to concatenate the front and back camera images into a single image and preview that. I know not all phones support opening both cameras concurrently, but some do, and I could imagine this functionality would be cool for third-party video chat apps like Skype. Specifically, since Skype doesn't natively support this, by registering directly with the Android camera API we could get around the limitations of the Skype API, since our custom camera would just look like one of the default Android cameras.
So would this be possible? Or what are the technical limitations that prevent us from doing it? Perhaps the Android API simply doesn't let us define a custom source (I know the Sensor API doesn't, so I would not be surprised if this were the case for the camera API as well).
What I'm trying to achieve: access both front and back cameras at the same time.
What I've researched: I know the Android camera API doesn't support multiple instances of the Camera, and you have to release one camera before using the other. I've read tens of questions about this, and I know it's possible on some devices (such as the Samsung S4 and other newer Samsung devices).
I've also found out that it's possible to have access to both of them in Android KitKat on SOME devices.
I also know that on API >= 21, using the camera2 API, it's possible to access both of them at the same time because it's thread-safe.
What I've got so far: an implementation that accesses the cameras one at a time in order to provide picture-in-picture.
I know it's not possible to run both cameras simultaneously on every device; I just want a way to make it available on the devices that support it.
How can I test to see if the device is capable of accessing both of them?
I've also searched for a library that can allow me such thing, but I didn't find anything. Is there such a library?
I would like to make this feature available for as many devices as possible, and for the others, I'll leave the current state (one by one) of the feature.
Can anyone please help, at least with some advice? Thanks!
The Android camera APIs generally allow multiple cameras to be used at the same time, but most devices do not have enough hardware resources to support that in practice - for example, there's often only one camera image processor shared by both cameras.
There's no query in the Android APIs that will tell you up front whether you can use multiple cameras at the same time.
The only way to tell is to try to open a second camera when you already have one open. If you can open the second camera, then you can do picture-in-picture, etc. If you get an exception trying to open the second camera, then that particular device doesn't support having both cameras open.
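A minimal probe along those lines, using the legacy android.hardware.Camera API (the class name DualCameraProbe is just for illustration, not part of any SDK):

```java
import android.hardware.Camera;

public final class DualCameraProbe {
    /** Returns true if both camera ids can be held open at the same time. */
    public static boolean canOpenBoth(int firstId, int secondId) {
        Camera first = null;
        Camera second = null;
        try {
            first = Camera.open(firstId);
            // On unsupported devices this second open throws a RuntimeException
            // (typically "Fail to connect to camera service").
            second = Camera.open(secondId);
            return true;
        } catch (RuntimeException e) {
            return false;
        } finally {
            // Always release whatever we managed to open.
            if (second != null) second.release();
            if (first != null) first.release();
        }
    }
}
```

Run the probe once (e.g. on first launch) and cache the result, since opening cameras is slow and briefly blocks other camera clients.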
It is possible using the Android Camera2 API, but as indicated above, most devices don't have hardware support. If you have a Nexus 5X, Nexus 6, or Nexus 6P it will work, and you can test with this BothCameras app. I've implemented blitting to allow video recording as well (in addition to still pictures) using the hardware H.264 encoder.
You cannot access both cameras simultaneously on most Android phones due to hardware limitations. The best alternative is to use the cameras one at a time: keep a single Camera object and switch the camera face to take the second photo.
I have done this in one of my application.
https://play.google.com/store/apps/details?id=com.ushaapps.bothie
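A rough sketch of the release-then-reopen approach described above, using the legacy Camera API (the helper class name is illustrative):

```java
import android.hardware.Camera;

public final class CameraFaceToggler {
    private Camera camera;
    private int facing = Camera.CameraInfo.CAMERA_FACING_BACK;

    /** Releases the current camera and opens the one facing the other way. */
    public void switchFace() {
        if (camera != null) {
            camera.release();   // must release before opening the other camera
            camera = null;
        }
        facing = (facing == Camera.CameraInfo.CAMERA_FACING_BACK)
                ? Camera.CameraInfo.CAMERA_FACING_FRONT
                : Camera.CameraInfo.CAMERA_FACING_BACK;
        // Find the id of a camera with the desired facing and open it.
        Camera.CameraInfo info = new Camera.CameraInfo();
        for (int id = 0; id < Camera.getNumberOfCameras(); id++) {
            Camera.getCameraInfo(id, info);
            if (info.facing == facing) {
                camera = Camera.open(id);
                break;
            }
        }
    }
}
```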
It's worth mentioning that in some cases just opening two cameras with the Camera2 API is not enough to confirm support.
Some devices don't throw an error on opening: the second camera opens correctly, but the first one starts invoking the onCaptureFailed callback.
So the most accurate check is to start both cameras, wait for frames from each of them, and verify that there are no capture-failure errors.
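With Camera2, that check comes down to attaching a CaptureCallback to each camera's capture session and watching which callbacks actually fire during a short probe window. A skeletal sketch (the bookkeeping flags are illustrative, not part of the SDK):

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CaptureFailure;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;

// Attach one instance of this to the repeating request of EACH camera,
// then after a probe window (e.g. one second) require gotFrame && !failed
// on both before declaring dual-camera support.
final class ProbeCallback extends CameraCaptureSession.CaptureCallback {
    volatile boolean gotFrame = false;
    volatile boolean failed = false;

    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        gotFrame = true;   // this camera is actually delivering frames
    }

    @Override
    public void onCaptureFailed(CameraCaptureSession session,
                                CaptureRequest request,
                                CaptureFailure failure) {
        // On some devices the first camera fails here once the second opens,
        // even though openCamera() itself reported success for both.
        failed = true;
    }
}
```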
I have been searching but have found no information on this. I am working on an app that captures video from the device camera, and I would like to limit the recording length to one minute or less. I have seen the Camera API, which allows monitoring the status, but even that does not seem to have options for reading the current state (recording or not). Either way, the API docs say to use the CameraUI API for mobile, and that has no options other than locking into video rather than stills.
I also found an ANE for iOS that exposes record start/stop functions, so I could use that to make my own record button and implement a timer before triggering stop, but I cannot find an Android equivalent.
I am using Flash Professional CS6, not Flex or Flash Builder.
Has anyone done this or have any ideas how to make this happen?
Is it possible to enable only LED of the device without using Camera.open() in Android?
Since the LED is a separate piece of hardware in the phone, there should be a way to access it alone, without calling Camera.open() and setting the torch parameter on it.
The reason I am asking is that I have a video app built in AIR that needs the flash LED enabled while the camera is also in use. The camera is opened by AIR, and the flash would be enabled through an Android native extension. But it's not working, as we cannot have multiple camera instances open at the same time.
No, you need to open the Camera with Camera.open() and then setFlashMode() to FLASH_MODE_TORCH to enable the LED light continuously. The LED is supposed to go off when the Camera is closed. And you need to close the Camera when your process goes into the background. So you really can't do this in a second app.
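The sequence described above, as a minimal sketch with the legacy Camera API (note that some devices also require an active preview before the torch actually lights):

```java
import android.hardware.Camera;

// Acquire the (back) camera and turn the LED on continuously.
Camera camera = Camera.open();
Camera.Parameters params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);
camera.startPreview();   // some devices only honour torch with a live preview

// ... later, e.g. in onPause(): turn the LED off and free the camera
params.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
camera.setParameters(params);
camera.stopPreview();
camera.release();
```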
How about using an ANE to call setFlashMode()? That would really be the right way to do it. I have never tried it directly, so I don't know if there's a catch that stops it from working.