How to solve CameraX flickering screen bug - android

Before CameraX 1.0.0-beta03 I used version 1.0.0-alpha06, and image analysis along with barcode scanning worked fine for me. On the newer version it doesn't always work: the camera preview sometimes flickers, and the images end up too unclear to detect barcodes.
Look at the images I attached. The screen is black on purpose so you can see the difference.
I think the problem is the camera using too much memory. I noticed that if I clear the apps running in the background and restart my app, the camera quality gets better and there is no screen flickering. When I did so once it said "freed 200mb memory" from my app alone. I can't reproduce the flickering bug every time, but if I kill the app and restart it, it eventually starts flickering. Here is the onDestroy of my BarcodeScanFragment:
override fun onDestroy() {
    super.onDestroy()
    // I call cameraUtil.unbindCamera(), which executes the code below:
    cameraProvider?.unbindAll()
    cameraExecutor?.shutdown()
}
Here is my CameraUtil's setupCameraX function (I use CameraUtil to initialize the camera for different fragments):
fun setupCameraX(previewView: PreviewView, fragment: Fragment, analyzer: ImageAnalysis.Analyzer?) {
    previewView.preferredImplementationMode = PreviewView.ImplementationMode.TEXTURE_VIEW
    val cameraProviderFuture = ProcessCameraProvider.getInstance(fragment.context!!)
    val rotation = previewView.display.rotation
    cameraExecutor = Executors.newSingleThreadExecutor()
    cameraProviderFuture.addListener(Runnable {
        cameraProvider = cameraProviderFuture.get()
        val cameraSelector = CameraSelector.Builder().requireLensFacing(CameraSelector.LENS_FACING_BACK).build()
        val preview = Preview.Builder().setTargetRotation(rotation).build()
        preview.setSurfaceProvider(previewView.createSurfaceProvider(null))
        var useCase: UseCase? = null
        if (analyzer == null) {
            imageCapture = ImageCapture.Builder()
                .setCaptureMode(ImageCapture.CAPTURE_MODE_MAXIMIZE_QUALITY)
                .setTargetRotation(previewView.display.rotation)
                .build()
            useCase = imageCapture!!
        } else {
            val imageAnalysis = ImageAnalysis.Builder()
                .setTargetRotation(rotation)
                .build()
                .also {
                    it.setAnalyzer(cameraExecutor!!, analyzer)
                }
            useCase = imageAnalysis
        }
        cameraProvider?.unbindAll()
        try {
            camera = cameraProvider?.bindToLifecycle(fragment, cameraSelector, preview, useCase)
            preview.setSurfaceProvider(previewView.createSurfaceProvider(camera?.cameraInfo))
        } catch (ex: Exception) {
            Log.e("error", ex.message, ex)
        }
    }, ContextCompat.getMainExecutor(fragment.context))
}
I make sure to call
cameraProvider?.unbindAll()
so that the previous CameraX use cases are unbound.
EDIT: The device is a Huawei P smart 2019.

In my case I was experiencing flickering only when popping a fragment from the back stack.
I'm using CameraX version 1.0.0-beta05.
I was finally able to fix the flickering simply by removing this line:
previewView.preferredImplementationMode = PreviewView.ImplementationMode.TEXTURE_VIEW

Related

disable autofocus in android camerax (camera 2)

I'm working on a barcode scanning project, so I want to disable auto-focus to improve performance. I tried many ways but none of them work at all. Could anyone give me some help? Thank you.
If you really want to turn off AF, you can do this in CameraX with the Camera2CameraControl class. First bind the use cases you need to the lifecycle, which results in a Camera object. You can then use that Camera object to get the CameraControl object, and use it to instantiate a Camera2CameraControl, which lets you set the focus mode to CameraMetadata.CONTROL_AF_MODE_OFF.
val camera: Camera = cameraProvider.bindToLifecycle(
    this,
    cameraSelector,
    imagePreview,
    imageCapture,
    imageAnalysis
)
val cameraControl: CameraControl = camera.cameraControl
val camera2CameraControl: Camera2CameraControl = Camera2CameraControl.from(cameraControl)

// Then you can set the focus mode you need like this
val captureRequestOptions = CaptureRequestOptions.Builder()
    .setCaptureRequestOption(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF)
    .build()
camera2CameraControl.captureRequestOptions = captureRequestOptions
This was tested on the latest CameraX's "1.0.0-rc03" build.
I use disableAutoCancel() with CameraX 1.0.0. The camera focuses once and then stays locked; autofocus is not restarted every few seconds. Something like:
val autoFocusAction = FocusMeteringAction.Builder(
    autoFocusPoint,
    FocusMeteringAction.FLAG_AF or
        FocusMeteringAction.FLAG_AE or
        FocusMeteringAction.FLAG_AWB
).apply {
    disableAutoCancel()
}.build()
myCameraControl!!.startFocusAndMetering(autoFocusAction)
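The snippet assumes an autoFocusPoint has already been created. As a minimal sketch, a center metering point could be built with a SurfaceOrientedMeteringPointFactory (the 1f x 1f factory size and the 0.5f coordinates are just illustrative values):
// Factory that maps normalized (x, y) coordinates onto the camera's field of view
val factory = SurfaceOrientedMeteringPointFactory(1f, 1f)
// A point in the center of the preview
val autoFocusPoint = factory.createPoint(0.5f, 0.5f)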

How do you take a picture with camerax?

I'm still practicing with Kotlin and Android development. As far as I understood, the Camera class has been deprecated, and Android invites you to use CameraX instead, because this high-level class is device-independent and makes the process of implementing cameras in apps simpler.
I've tried to read the documentation (https://developer.android.com/training/camerax) but it's written so badly that I barely understood what they are trying to explain.
So I went to read the entire sample code given in the documentation itself (https://github.com/android/camera-samples/tree/main/CameraXBasic).
The CameraFragment code is about 500 lines long (ignoring imports and various comments).
Do I really need to write 500 lines of code to simply take a picture?
How is this supposed to be considered "simpler than before"?
I mean, Android programming is at the point where I need to write only 4 lines of code to ask the user to select an image from their storage, retrieve it, and show it in an ImageView.
Is there a TRULY simple way to take a picture, or do I really need to stop and lose a whole day of work writing all those lines of code?
EDIT:
Take this page of the documentation:
https://developer.android.com/training/camerax/architecture#kotlin
It starts with this piece of code.
val preview = Preview.Builder().build()
val viewFinder: PreviewView = findViewById(R.id.previewView)
// The use case is bound to an Android Lifecycle with the following code
val camera = cameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, preview)
cameraProvider comes out of nowhere. What is this supposed to be? I've found out it's a ProcessCameraProvider, but how am I supposed to initialize it?
Should it be a lateinit var or has it already been initialized somewhere else?
Because if I try to write val cameraProvider = ProcessCameraProvider() I get an error, so what am I supposed to do?
What is the cameraSelector parameter? It's not defined before. I've found out it's the selector for the front or back camera, but how am I supposed to know that from reading that page of the documentation?
How could this documentation have been released with these kinds of omissions?
How is someone supposed to learn with ease?
Before you can interact with the device's cameras using CameraX, you need to initialize the library. The initialization process is asynchronous, and involves things like loading information about the device's cameras.
You interact with the device's cameras using a ProcessCameraProvider. It's a singleton, so the first time you get an instance of it, CameraX performs its initialization.
val cameraProviderFuture: ListenableFuture<ProcessCameraProvider> = ProcessCameraProvider.getInstance(context)
Getting the ProcessCameraProvider singleton returns a future because the library might need to be initialized asynchronously. The first time you get it, it might take some time (usually well under a second); subsequent calls will return immediately, as the initialization will already have been performed.
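To actually get the provider once the future completes, you typically attach a listener that runs on the main thread. A minimal sketch (assuming a valid context, e.g. an Activity):
cameraProviderFuture.addListener(Runnable {
    // The future has completed here, so get() returns immediately
    val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
    // Use cameraProvider to bind use cases, as shown below
}, ContextCompat.getMainExecutor(context))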
With a ProcessCameraProvider in hand, you can start interacting with the device's cameras. You choose which camera to interact with using a CameraSelector, which wraps a set of filters for the camera you want to use. Typically, if you're just trying to use the main back or front camera, you'd use CameraSelector.DEFAULT_BACK_CAMERA or CameraSelector.DEFAULT_FRONT_CAMERA.
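For example, selecting the default back camera is a one-liner:
// Selector for the device's default back-facing camera
val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA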
Now that you've defined which camera you'll use, you build the use cases you'll need. For example, you want to take a picture, so you'll use the ImageCapture use case. It takes a single capture frame (typically a high quality one) using the camera and provides it either as a raw buffer or by storing it in a file. To use it, you can configure it if you wish, or you can just let CameraX use a default configuration.
val imageCapture = ImageCapture.Builder().build()
In CameraX, a camera's lifecycle is controlled by a LifecycleOwner: when the LifecycleOwner's lifecycle starts, the camera opens, and when it stops, the camera closes. So you'll need to choose a lifecycle that will control the camera. If you're using an Activity, you'd typically want the camera to start as the Activity starts and stop when it stops, so you'd use the Activity instance itself as the LifecycleOwner; if you're using a Fragment, you might want to use its view lifecycle (Fragment.getViewLifecycleOwner()).
Lastly, you need to put the pieces of the puzzle together.
processCameraProvider.bindToLifecycle(
    lifecycleOwner,
    cameraSelector,
    imageCapture
)
An app typically includes a viewfinder that displays the camera's preview, so you can use a Preview use case, and bind it with the ImageCapture use case. The Preview use case allows streaming camera frames to a Surface. Since setting up the Surface and correctly drawing the preview on it can be complex, CameraX provides PreviewView, a View that can be used with the Preview use case to display the camera preview. You can check out how to use them here.
// Just like ImageCapture, you can configure the Preview use case if you'd wish.
val preview = Preview.Builder().build()

// Provide PreviewView's Surface to CameraX. The preview will be drawn on it.
val previewView: PreviewView = findViewById(...)
preview.setSurfaceProvider(previewView.surfaceProvider)

// Bind both the Preview and ImageCapture use cases
processCameraProvider.bindToLifecycle(
    lifecycleOwner,
    cameraSelector,
    imageCapture,
    preview
)
Now, to actually take a picture, you use one of ImageCapture's takePicture methods. One provides a JPEG raw buffer of the captured image, the other saves it to a file that you provide (make sure you have the necessary storage permissions if you need any).
imageCapture.takePicture(
    ContextCompat.getMainExecutor(context), // Defines where the callbacks are run
    object : ImageCapture.OnImageCapturedCallback() {
        override fun onCaptureSuccess(imageProxy: ImageProxy) {
            val image: Image? = imageProxy.image // Do what you want with the image
            imageProxy.close() // Make sure to close the image
        }

        override fun onError(exception: ImageCaptureException) {
            // Handle exception
        }
    }
)
val imageFile = File("somePath/someName.jpg") // You can store the image in the cache, for example using `cacheDir.absolutePath` as a path.
val outputFileOptions = ImageCapture.OutputFileOptions
    .Builder(imageFile)
    .build()
imageCapture.takePicture(
    outputFileOptions,
    ContextCompat.getMainExecutor(context), // Defines where the callbacks are run
    object : ImageCapture.OnImageSavedCallback {
        override fun onImageSaved(outputFileResults: ImageCapture.OutputFileResults) {
            // The image was saved to imageFile
        }

        override fun onError(exception: ImageCaptureException) {
            // Handle exception
        }
    }
)
Do I really need to write 500 lines of code to simply take a picture?
How is this supposed to be considered "simpler than before"?
CameraXBasic is not as "basic" as its name might suggest x) It's more of a complete example of CameraX's 3 use cases. Even though the CameraFragment is long, it explains things nicely so that it's more accessible to everyone.
CameraX is "simpler than before", with "before" referring mainly to Camera2, which was a bit more challenging to get started with. CameraX provides a more developer-friendly API with its use-case approach. It also handles compatibility, which was a big issue before: ensuring your camera app works reliably on most of the Android devices out there is very challenging.

Take a Video with CameraX - setLensFacing() is unresolved

I tried to take a video with CameraX. For that I have read the SO posts here and here.
But when I copy paste the code and adjust it a little bit, there is an unresolved reference with the setLensFacing() method:
videoCapture = VideoCaptureConfig.Builder().apply {
    setTargetRotation(binding.viewFinder.display.rotation)
    setLensFacing(lensFacing)
}.build()
I adjusted the code a little bit since you no longer need to pass a config object to a VideoCapture; you can build it directly.
At this point, Android Studio is telling me that setLensFacing(lensFacing) is unresolved.
I am a little bit confused because on this page there is nice documentation, and VideoCaptureConfig.Builder() contains setLensFacing().
I hope someone can help.
Camera selection is no longer done through the use cases. The code you wrote was possible until, I think, version 1.0.0-alpha08.
The way to select the lens now is by using a CameraSelector when binding a use case (or multiple use cases) to a lifecycle. That way all the use cases use the same lensFacing.
So you can write:
val cameraSelector = CameraSelector.Builder().requireLensFacing(lensFacing).build()
// Or alternatively if you want a specific lens, like the back facing lens
val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA
val videoCapture = VideoCaptureConfig.Builder().build()
processCameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, videoCapture)
Note that currently the VideoCapture use case is hidden in the CameraX API and is still in an early state of development.
In CameraX 1.0.0-beta11, the video capture configuration has moved from VideoCaptureConfig to VideoCapture and the lens is set in the CameraSelector Builder:
val videoCapture = VideoCapture.Builder().apply {
    setVideoFrameRate(30)
    setAudioBitRate(128999)
    setTargetRotation(viewFinder.display.rotation)
    setTargetAspectRatio(AspectRatio.RATIO_16_9)
}.build()

val cameraSelector = CameraSelector.Builder()
    .requireLensFacing(CameraSelector.LENS_FACING_BACK)
    .build()
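The snippet above stops before binding; as with the other use cases, you would then bind it to a lifecycle with the selector. A minimal sketch (lifecycleOwner, preview, and cameraProvider are assumed to already exist):
// Bind the VideoCapture use case (optionally alongside a Preview) to a lifecycle
val camera = cameraProvider.bindToLifecycle(
    lifecycleOwner,
    cameraSelector,
    preview,
    videoCapture
)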

Bad quality of Android CameraX preview (viewfinder)

I have a problem with the preview when using the CameraX sample app. The quality of the preinstalled camera app's preview is better than the CameraX sample's preview, but the photo quality is ok. Touch to focus is implemented in the sample project too, so I don't think that is the problem.
I'm using this code to set up the preview use case. What might be wrong?
private fun buildPreviewUseCase(): Preview {
    val display = viewFinder.display
    val metrics = DisplayMetrics().also { display.getMetrics(it) }
    val preview = Preview.Builder()
        .setTargetRotation(display.rotation)
        .setTargetResolution(Size(metrics.widthPixels, metrics.heightPixels))
        .build()
        .apply {
            previewSurfaceProvider = viewFinder.previewSurfaceProvider
        }
    return preview
}
CameraX sample's preview
preinstalled camera app's preview
I have the same issue, but after compiling and running the CameraX example app
https://github.com/android/camera-samples/tree/main/CameraXBasic
the quality there seems to be ok.
I checked the difference, and...
somehow, if the ImageCapture use case is not set, the quality is bad.
So, try adding
val imageCapture = ImageCapture.Builder()
    .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
    .setTargetAspectRatio(screenAspectRatio)
    .setTargetRotation(rotation)
    .build()
and bind it to the camera:
camera = cameraProvider.bindToLifecycle(
    this, cameraSelector, preview, imageCapture, imageAnalyzer
)

How to check if the device has a flash for front and back facing camera separately?

I am currently working on a camera activity. I managed to access my device's back camera flash and to automatically hide the flash toggle button when I switch to the front camera. However, I was wondering if there is a way to check for secondary flash units, since many smartphone models come with a front camera flash, and that would also help when using this application on a tablet without a back camera flash. My idea is to check the front and back facing camera flashes separately with two independent booleans and, if the flash is not available, set the toggle button invisible. I really dislike the idea of showing or hiding the flash button without making sure the device has a flash on any of its cameras. This is what I have so far. Any ideas?
private boolean hasFlash(Context context) {
    return context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_FLASH);
}
_
if (!hasFlash(Context)) {
    ImageButton FlashButton = (ImageButton) findViewById(R.id.frnxcameraflashbutton);
    FlashButton.setVisibility(View.INVISIBLE);
    FlashButton.setImageResource(R.mipmap.cameraflashoffbutton);
}
If you are able to use the newer Camera2 APIs (mostly API 21+, I think), the CameraCharacteristics key-value map available for each camera indicates whether that camera has a corresponding flash. For example, you can check the FLASH_INFO_AVAILABLE characteristic per camera to achieve your goal.
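A minimal Kotlin sketch of that per-camera Camera2 check (assuming a context is available; the log line is only illustrative):
val cameraManager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
for (cameraId in cameraManager.cameraIdList) {
    val characteristics = cameraManager.getCameraCharacteristics(cameraId)
    // LENS_FACING tells you whether this is the front, back, or an external camera
    val facing = characteristics.get(CameraCharacteristics.LENS_FACING)
    // FLASH_INFO_AVAILABLE is true only if this camera has its own flash unit
    val hasFlash = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE) == true
    Log.d("FlashCheck", "Camera $cameraId facing=$facing hasFlash=$hasFlash")
}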
If we are using CameraX, we can do it as shown below.
var imageCapture: ImageCapture = ImageCapture.Builder()
    .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
    .setFlashMode(flashMode)
    .setTargetRotation(Surface.ROTATION_0)
    .build()
The code below can be used to check whether a flash unit is available on the device.
val cameraProviderFuture: ListenableFuture<ProcessCameraProvider> = ProcessCameraProvider.getInstance(requireContext())
//Get camera provider
val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
//Get camera object after binding to lifecycle
var camera: Camera = cameraProvider.bindToLifecycle(this as LifecycleOwner, cameraSelector, preview, imageCapture)
Once we have the Camera object, we can use it to get CameraInfo and check whether a flash unit is available.
camera?.cameraInfo?.hasFlashUnit()
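To tie this back to the question, here is a small sketch of toggling the button visibility based on that check (flashButton is a hypothetical reference to the toggle view; the same check can be repeated after rebinding with CameraSelector.DEFAULT_FRONT_CAMERA to cover the front lens separately):
// Hide the flash toggle when the currently bound camera has no flash unit
val hasFlashUnit = camera.cameraInfo.hasFlashUnit()
flashButton.visibility = if (hasFlashUnit) View.VISIBLE else View.INVISIBLE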
