Can anyone tell me how to check if an Android phone has a front camera too? I tried to use some help from https://docs.google.com/View?id=dhtsnvs6_57d2hpqtgr but Camera camera = FrontFacingCamera.getFrontFacingCamera(); sometimes works and sometimes doesn't.
Any help please?
Can anyone tell me how to check if an Android phone has a front camera too?
There is no API for this, at least through Android 2.2. Sorry!
I tried to use some help from https://docs.google.com/View?id=dhtsnvs6_57d2hpqtgr but Camera camera = FrontFacingCamera.getFrontFacingCamera(); sometimes works and sometimes doesn't.
That is for two specific models of phones, not for Android devices in general. With luck, the upcoming Gingerbread release will add built-in support for front-facing cameras.
In the meantime, you need to get the instructions (like the one you linked to) from each and every device manufacturer and attempt to follow them.
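Gingerbread did eventually add this: from API 9 you can enumerate the cameras with Camera.getNumberOfCameras() and inspect Camera.CameraInfo.facing. A minimal sketch for API 9+ (it will not help on 2.2 and below):

// Sketch: detect a front-facing camera with the (now deprecated) Camera API, API 9+.
private boolean hasFrontCamera() {
    Camera.CameraInfo info = new Camera.CameraInfo();
    for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
        Camera.getCameraInfo(i, info);
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            return true;
        }
    }
    return false;
}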
private boolean hasFlash() {
    // CameraManager and CameraCharacteristics are available from API 21 (Lollipop).
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        try {
            if (camManager == null) {
                camManager = (CameraManager) getSystemService(CAMERA_SERVICE);
            }
            // Index 1 is assumed to be the second (often front-facing) camera; on a
            // single-camera device this throws and falls through to the catch below.
            String cameraId = camManager.getCameraIdList()[1];
            CameraCharacteristics characteristics = camManager.getCameraCharacteristics(cameraId);
            // FLASH_INFO_AVAILABLE can be null, so avoid unboxing it directly.
            return Boolean.TRUE.equals(characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    return false;
}
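If the goal is simply to know whether the device has a flash unit at all, a simpler check that needs no camera access is PackageManager.hasSystemFeature(); a minimal sketch, assuming any Context is at hand:

// Sketch: device-level flash check via the package manager.
private boolean deviceHasFlash(Context context) {
    return context.getPackageManager()
            .hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH);
}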
Related
I need to prevent my app from running on emulators, so I combined the answers to this question. It works on most emulators, but it can't detect BlueStacks, and I cannot find a robust way to detect it because BlueStacks has most of the properties of a real device.
I found the safeToRun library, which uses Build.BOOTLOADER == OsCheckConstants.UNKNOWN to detect BlueStacks, but I'm not sure that this condition alone is enough to be certain the running device is a BlueStacks emulator, and I'm afraid that some real devices return "unknown" as the bootloader value.
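For illustration, the kind of combined check being described might look like the sketch below (the helper name is mine, and none of these properties is conclusive on its own):

// Sketch: the safeToRun-style bootloader check combined with a couple of the usual
// Build heuristics. Some real devices also report "unknown" for Build.BOOTLOADER.
private static boolean looksLikeEmulator() {
    return "unknown".equalsIgnoreCase(Build.BOOTLOADER)
            || Build.FINGERPRINT.startsWith("generic")
            || Build.MODEL.contains("Emulator")
            || Build.MODEL.contains("Android SDK built for x86");
}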
It is also worth mentioning that the following block of code can't detect BlueStacks (glGetString returns null):
try {
    // glGetString only returns a value on a thread with a current GL context;
    // otherwise it returns null and this check silently does nothing.
    String opengl = android.opengl.GLES20.glGetString(android.opengl.GLES20.GL_RENDERER);
    if (opengl != null) {
        if (opengl.contains("Bluestacks") || opengl.contains("Translator")) {
            newRating += 10;
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
Any ideas or solutions?
Description
I'm developing an application, and one of its features is turning the LED on/off in time with a recorded sound rhythm (so it happens at a fairly high frequency). I use Flutter, but for this functionality I wrote my own Java code that switches the LED on:
CameraManager cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    cameraManager.setTorchMode(cameraId, true);
} catch (@SuppressLint("NewApi") CameraAccessException e) {
    // handle exception
}
And here is the similar code turning the LED off:
CameraManager cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    cameraManager.setTorchMode(cameraId, false);
} catch (@SuppressLint("NewApi") CameraAccessException e) {
    // handle exception
}
When I start using the LED, I get the camera ID with:
private static String getCameraIdWithFlash(CameraManager cameraManager) throws CameraAccessException {
    String cameraId = null;
    for (String id : cameraManager.getCameraIdList()) {
        Boolean hasFlashAvailable = cameraManager.getCameraCharacteristics(id)
                .get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
        if (hasFlashAvailable != null && hasFlashAvailable) {
            cameraId = id;
            break;
        }
    }
    return cameraId;
}
Problem
My code runs perfectly on most devices, but I'm aware of an ANR on at least one device (Redmi Note 10 Pro with Android 11). After some time (roughly 10-40 minutes) of switching the LED on and off continuously, the screen freezes and the user sees the system's ANR dialog, so the app effectively stops working. What's interesting is that the system's quick-settings toggles for things like Wi-Fi and the flashlight also stop working until the phone is restarted.
Unfortunately I have no access to the logs, so I can't provide more details for the time being. But I wonder whether this is an issue with my code or something specific to Redmi phones. I don't think it's related to the Android version: I tested the app on a Samsung Galaxy A71 with Android 11 and couldn't reproduce the issue, while it happens very often on the Redmi Note 10 Pro.
I'd be grateful for any suggestions. Thank you!
I tried the Google barcode-reader sample from https://github.com/googlesamples/android-vision
This example doesn't work. When I tap the screen it always reports
"no barcode detected"
The reason, found while debugging:
private boolean onTap(float rawX, float rawY) {
    // TODO: use the tap position to select the barcode.
    BarcodeGraphic graphic = mGraphicOverlay.getFirstGraphic();
    Barcode barcode = null;
    if (graphic != null) {
        barcode = graphic.getBarcode();
        if (barcode != null) {
            Intent data = new Intent();
            data.putExtra(BarcodeObject, barcode);
            setResult(CommonStatusCodes.SUCCESS, data);
            finish();
        } else {
            Log.d(TAG, "barcode data is null");
        }
    } else {
        Log.d(TAG, "no barcode detected");
    }
    return barcode != null;
}
The graphic variable is always null.
Has anyone faced this problem? Can you let me know how to resolve it?
Thank you so much!
So I guess you are new to Android Mobile Vision. In the new version of Google Play services (v9.0) they temporarily disabled the feature due to a serious bug in it. You can check the release notes here:
https://developers.google.com/android/guides/releases#may_2016_-_v90
As @Vietnt134 has already answered, Android Mobile Vision is temporarily disabled.
You can follow this issue to know when something new comes up:
https://github.com/googlesamples/android-vision/issues/98
People are pretty mad at Google about this. I hope they solve it quickly.
getFirstGraphic returns null whenever no graphics have been added to the overlay; in the barcode example, this is when no barcodes have been detected in the frame.
Check whether barcodeDetector.isOperational() is returning false in BarcodeCaptureActivity.java. If it's returning false, has been for several minutes, and you aren't in a low-storage condition, there's a very good chance this is because of a current service outage.
More details can be found here: https://github.com/googlesamples/android-vision/issues/98 We'll update that issue as soon as we have a resolution.
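For reference, the check looks roughly like the sketch below, adapted from the sample's BarcodeCaptureActivity (the surrounding method name and log messages are mine):

// Sketch: verify the detector is operational and distinguish the low-storage case.
// Assumes it runs inside an Activity with a BarcodeDetector field named barcodeDetector
// and a TAG constant for logging.
private void checkDetectorOperational() {
    if (!barcodeDetector.isOperational()) {
        Log.w(TAG, "Barcode detector dependencies are not yet available.");

        // Low device storage prevents the native detector files from being downloaded.
        IntentFilter lowStorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
        boolean hasLowStorage = registerReceiver(null, lowStorageFilter) != null;
        if (hasLowStorage) {
            Log.w(TAG, "Low storage: detector dependencies cannot be downloaded.");
        }
    }
}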
Is there any way to check if the camera is open or not? I don't want to open the camera, I just want to check its status.
If your device's API level is 21 or higher, CameraManager.AvailabilityCallback might be a good choice.
You need to first obtain the camera manager of the system with the following code:
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
Then, you need to register the AvailabilityCallback:
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
    manager.registerAvailabilityCallback(new CameraManager.AvailabilityCallback() {
        @Override
        public void onCameraAvailable(String cameraId) {
            super.onCameraAvailable(cameraId);
            // Do your work
        }

        @Override
        public void onCameraUnavailable(String cameraId) {
            super.onCameraUnavailable(cameraId);
            // Do your work
        }
    }, yourHandler);
}
This only works if the API level is 21 or higher. You can refer to CameraManager, CameraManager.AvailabilityCallback, and the whole android.hardware.camera2 package.
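Building on that callback, one way to answer "is the camera open right now?" without opening it yourself is to keep track of which IDs have been reported unavailable. A minimal sketch (the set and the helper method are mine, not part of the API):

// Sketch: track camera IDs that are currently in use, driven by the availability
// callback. Register it with manager.registerAvailabilityCallback(trackingCallback,
// handler) and unregister it when updates are no longer needed. Note that it also
// fires when your own app opens a camera.
private final Set<String> unavailableCameras = new HashSet<>();

private final CameraManager.AvailabilityCallback trackingCallback =
        new CameraManager.AvailabilityCallback() {
            @Override
            public void onCameraAvailable(String cameraId) {
                unavailableCameras.remove(cameraId);
            }

            @Override
            public void onCameraUnavailable(String cameraId) {
                unavailableCameras.add(cameraId);
            }
        };

private boolean isAnyCameraInUse() {
    return !unavailableCameras.isEmpty();
}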
Trying to open the camera and checking whether an exception is thrown works well if the API level is lower than 23. From API level 23 on, the camera service behaves differently; from the official docs:
Access to camera subsystem resources, including opening and configuring a camera device, is awarded based on the “priority” of the client application process. Application processes with user-visible or foreground activities are generally given a higher-priority, making camera resource acquisition and use more dependable.
Active camera clients for lower priority apps may be “evicted” when a higher priority application attempts to use the camera. In the deprecated Camera API, this results in onError() being called for the evicted client. In the Camera2 API, it results in onDisconnected() being called for the evicted client.
We can see that on API 23 or higher, trying to open a camera that is in use by another app/process will seize the camera from the app/process that was using it, instead of producing a RuntimeException.
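To illustrate what eviction looks like from the Camera2 side, here is a minimal sketch; the callback bodies are assumptions about how one might react, not behaviour prescribed by the API:

// Sketch: with Camera2 (API 21+), eviction by a higher-priority client arrives via
// onDisconnected() rather than an exception. Requires the CAMERA permission;
// "manager" and "cameraId" come from CameraManager.getCameraIdList() as in the
// earlier snippets.
try {
    manager.openCamera(cameraId, new CameraDevice.StateCallback() {
        @Override
        public void onOpened(CameraDevice camera) {
            // Camera acquired; keep the reference and start a capture session here.
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            // A higher-priority client took the camera (or it was removed); release it.
            camera.close();
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            camera.close();
        }
    }, null);
} catch (CameraAccessException e) {
    e.printStackTrace();
}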
You can check it using method Camera.open(cameraId).
Creates a new Camera object to access a particular hardware camera. If the same camera is opened by other applications, this will throw a RuntimeException.
Throws
RuntimeException
If opening the camera fails (for example, if the camera is in use by another process or the device policy manager has disabled the camera).
Update:
Example:
public boolean isCameraUsebyApp() {
    Camera camera = null;
    try {
        camera = Camera.open();
    } catch (RuntimeException e) {
        // open() throws if the camera is already in use (pre-API 23 behaviour).
        return true;
    } finally {
        if (camera != null) camera.release();
    }
    return false;
}
You can use this method as Paul suggested, but keep in mind that it first acquires the camera.
If it acquires the camera successfully, that means no other application is using it; don't forget to release it again, otherwise you won't be able to acquire it later.
If it throws a RuntimeException, it means the camera is in use by another process or the device policy manager has disabled the camera.
Looking into the source code of Camera, its JNI counterpart, and finally the native code for connecting a camera with the service, it appears that the only way of determining if the camera is in use is directly through the result of Camera::connect(jint).
The trouble is that this native code is only accessible through the JNI function android_hardware_Camera_native_setup(JNIEnv*, jobject, jobject, jint), which sets up the camera for use when creating the Camera instance from Java in new Camera(int).
In short, it doesn't seem possible. You'll have to attempt to open the camera and, if it fails, assume it is in use by another application. E.g.:
public boolean isCameraInUse() {
    Camera c = null;
    try {
        c = Camera.open();
    } catch (RuntimeException e) {
        return true;
    } finally {
        if (c != null) c.release();
    }
    return false;
}
To better understand the underlying flow of camera's native code, see this thread.
Recently I went through the code for accessing the camera using Flash ActionScript 3, and I have tested the code on an iMac, an iPhone, and an Android device. Based on this, I am now developing an application for Android that includes access to the front camera. My problem is that I don't know how to access the front camera. Do we need different code, or should we specify which camera to access? First of all, can we access the front camera through Flash?
I made a simple Android app. Here is the code for the camera-selection window:
public class SelectCameraAlertAndroid extends StartAlertAndroid_design {
    public function SelectCameraAlertAndroid() {
        frontCameraButton.addEventListener(MouseEvent.CLICK, onFrontCamera);
        backCameraButton.addEventListener(MouseEvent.CLICK, onBackCamera);
    }

    private function onFrontCamera(event:MouseEvent):void {
        Model.model.camera = Camera.getCamera("1");
        Model.model.cameraSelectedSignal.dispatch();
        dispatchEvent(new Event("closeMe"));
    }

    private function onBackCamera(event:MouseEvent):void {
        Model.model.camera = Camera.getCamera("0");
        Model.model.cameraSelectedSignal.dispatch();
        dispatchEvent(new Event("closeMe"));
    }
}
Not true. You can access the front camera on Android.
The only problem is that you don't get to use CameraUI (pretty sure).
var camera:Camera = Camera.getCamera("1");
camera.setMode(stage.stageWidth, stage.stageHeight, 30, true);

var video:Video = new Video(stage.stageWidth, stage.stageHeight);
video.attachCamera(camera);
addChild(video);
Note: This answer is outdated. Please refer to the other answers for updated information.
Currently, AIR only supports access to the primary camera on an Android device.
http://forums.adobe.com/thread/849983
Official documentation: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/media/Camera.html#getCamera()
"On Android devices, you can only access the rear-facing camera."