Take a video with CameraX - setLensFacing() is unresolved - Android

I'm trying to record a video with CameraX. For that I read the SO posts here and here.
But when I copy-paste the code and adjust it a little, there is an unresolved reference to the setLensFacing() method:
videoCapture = VideoCaptureConfig.Builder().apply {
    setTargetRotation(binding.viewFinder.display.rotation)
    setLensFacing(lensFacing)
}.build()
I adjusted the code a little since you no longer need to pass a config object to a VideoCapture; you can build it directly.
At this point, Android Studio is telling me that setLensFacing(lensFacing) is unresolved.
I am a little confused because on this page there is nice documentation, and VideoCaptureConfig.Builder() contains setLensFacing().
I hope someone can help.

Camera selection is no longer done through the use cases. The code you wrote was possible until (I think) version 1.0.0-alpha08.
The way to select the lens now is by using a CameraSelector when binding a use case (or multiple use cases) to a lifecycle. That way all the use cases use the same lensFacing.
So you can write:
val cameraSelector = CameraSelector.Builder().requireLensFacing(lensFacing).build()
// Or alternatively, if you want a specific lens, like the back-facing lens:
val cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA
val videoCapture = VideoCaptureConfig.Builder().build()
processCameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, videoCapture)
Note that currently the VideoCapture use case is hidden in the CameraX API and is still in an early state of development.

In CameraX 1.0.0-beta11, the video capture configuration has moved from VideoCaptureConfig to VideoCapture, and the lens is selected in the CameraSelector.Builder:
val videoCapture = VideoCapture.Builder().apply {
    setVideoFrameRate(30)
    setAudioBitRate(128999)
    setTargetRotation(viewFinder.display.rotation)
    setTargetAspectRatio(AspectRatio.RATIO_16_9)
}.build()

val cameraSelector = CameraSelector.Builder()
    .requireLensFacing(CameraSelector.LENS_FACING_BACK)
    .build()
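To put the two pieces together, here is a minimal binding sketch (my addition; it assumes a ProcessCameraProvider and a LifecycleOwner, here cameraProvider and this, as in the snippets above):
// The lens choice lives in the CameraSelector, not on the use case itself.
val camera = cameraProvider.bindToLifecycle(this, cameraSelector, videoCapture)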

Related

Disable autofocus in Android CameraX (Camera2)

I'm working on a barcode-scanning project, so I want to disable autofocus to improve performance. I've tried many approaches, but none of them work. Could anyone give me some help? Thank you.
If you really want to turn off AF, you can do this in CameraX with the Camera2CameraControl class. First bind the use cases you need to the lifecycle, which gives you a Camera object. You can then use that Camera to get the CameraControl object, wrap it in a Camera2CameraControl, and use that to set the focus mode to CameraMetadata.CONTROL_AF_MODE_OFF.
val camera: Camera = cameraProvider.bindToLifecycle(
    this,
    cameraSelector,
    imagePreview,
    imageCapture,
    imageAnalysis
)
val cameraControl: CameraControl = camera.cameraControl
val camera2CameraControl: Camera2CameraControl = Camera2CameraControl.from(cameraControl)

// Then you can set the focus mode you need like this:
val captureRequestOptions = CaptureRequestOptions.Builder()
    .setCaptureRequestOption(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF)
    .build()
camera2CameraControl.captureRequestOptions = captureRequestOptions
This was tested with the latest CameraX build, 1.0.0-rc03.
I use disableAutoCancel() with CameraX 1.0.0. The camera focuses once and then stays locked; autofocus is not restarted every few seconds. Something like:
val autoFocusAction = FocusMeteringAction.Builder(
    autoFocusPoint,
    FocusMeteringAction.FLAG_AF or
        FocusMeteringAction.FLAG_AE or
        FocusMeteringAction.FLAG_AWB
).apply {
    disableAutoCancel()
}.build()

myCameraControl!!.startFocusAndMetering(autoFocusAction)
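Note that autoFocusPoint isn't defined in the snippet above. As a rough sketch (my addition, not from the original answer), it could be created with CameraX's SurfaceOrientedMeteringPointFactory, for example targeting the center of the frame:
// Illustrative only: a metering point at the center of a normalized 1x1 coordinate space.
val factory = SurfaceOrientedMeteringPointFactory(1f, 1f)
val autoFocusPoint = factory.createPoint(0.5f, 0.5f)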

Is bindToLifecycle() necessary every time to switch flash ON/OFF/AUTO after creating the preview in the CameraX library?

As I understand from many implementations, such as:
https://github.com/android/camera-samples/tree/master/CameraXBasic
https://proandroiddev.com/android-camerax-preview-analyze-capture-1b3f403a9395
after every use-case change in a CameraX implementation, the cameraProvider.bindToLifecycle() method needs to be called again.
For example, if I need to switch the camera's FLASH_MODE ON from the default OFF mode, then bindToLifecycle() needs to be called again.
The disadvantage of this approach is that for a second or two the preview is removed and re-attached, which doesn't feel like a smooth transition in an app.
Is there a better practice available, or is this a limitation?
I have attached sample code below:
private void bindCameraUseCase() {
    int screenAspectRatio = getAspectRatio(previewView.getWidth(), previewView.getHeight());
    int rotation = previewView.getDisplay().getRotation();

    preview = new Preview.Builder()
            .setTargetAspectRatio(screenAspectRatio)
            .setTargetRotation(rotation)
            .build();

    cameraSelector = new CameraSelector.Builder()
            .requireLensFacing(lensFacing)
            .build();

    imageCapture = new ImageCapture.Builder()
            .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
            .setTargetAspectRatio(screenAspectRatio)
            .setTargetRotation(rotation)
            .setFlashMode(flashMode)
            .build();

    // Must unbind the use cases before rebinding them
    cameraProvider.unbindAll();
    preview.setSurfaceProvider(previewView.createSurfaceProvider());
    camera = cameraProvider.bindToLifecycle(this, cameraSelector, preview, imageCapture);
}
And to toggle the flash:
private void toggleFlash() {
    Log.d(TAG, "toggleFlash: " + flashMode);
    switch (flashMode) {
        case ImageCapture.FLASH_MODE_OFF:
            flashMode = ImageCapture.FLASH_MODE_ON;
            flashButton.setBackgroundResource(R.drawable.ic_flash_on_24dp);
            break;
        case ImageCapture.FLASH_MODE_ON:
            flashMode = ImageCapture.FLASH_MODE_AUTO;
            break;
        case ImageCapture.FLASH_MODE_AUTO:
            flashMode = ImageCapture.FLASH_MODE_OFF;
            break;
    }
    bindCameraUseCase();
}
I'm using CameraX version - 1.0.0-beta04
To enable or disable the flash for an image capture after you've created an ImageCapture instance and bound it to a lifecycle, you can use ImageCapture.setFlashMode(int).
Regarding your question about the difference between setting the flash mode before vs. after binding the ImageCapture use case, AFAIK there isn't much of a difference. When you take a picture by calling ImageCapture.takePicture(), a capture request is built using different configuration parameters, one of which is the flash mode. So as long as the flash mode is set before that call (ImageCapture.takePicture()), the output of the capture request should be the same.
CameraX currently uses Camera2 under the hood, to better understand how the flash mode is set when taking a picture, you can take a look at CaptureRequest.FLASH_MODE.
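As a minimal sketch (my addition, in Kotlin; it assumes the already-bound imageCapture instance from the question), the flash mode can be changed without rebinding:
// Update the flash mode on the already-bound use case; no unbind/rebind is needed.
imageCapture.flashMode = ImageCapture.FLASH_MODE_ON
// The new mode is applied to the next takePicture() call.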
I see that ImageCapture.flashMode only takes effect when we build the initial configuration with ImageCapture.Builder(), etc.
If you want to enable/disable the flash dynamically, you have to use the following:
camera?.cameraControl?.enableTorch(enableFlash)
If you are wondering what camera is, this is captured from the documentation:
// A variable number of use-cases can be passed here -
// camera provides access to CameraControl & CameraInfo
camera = cameraProvider.bindToLifecycle(
    this, cameraSelector, preview, imageCapture
)
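For instance, a small illustrative sketch of my own (the torchSwitch button is a hypothetical view, not from the answer) of toggling the torch at runtime using the Camera object returned by bindToLifecycle():
torchSwitch.setOnClickListener {
    // Read the current torch state and flip it.
    val torchOn = camera?.cameraInfo?.torchState?.value == TorchState.ON
    camera?.cameraControl?.enableTorch(!torchOn)
}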

How to solve CameraX flickering screen bug

Before CameraX version 1.0.0-beta03, I used version 1.0.0-alpha06, and image analysis along with barcode scanning worked fine for me. On the newer version it doesn't always work, because the camera preview sometimes flickers and the images end up too unclear to detect barcodes.
Look at the images I attached. The screen is black on purpose to make the difference visible.
I think the problem is that the camera is using too much memory. I noticed that if I clear the apps running in the background and restart the app, the camera quality gets better and there is no flickering. When I did so once, it said "freed 200 MB memory" from my app alone. I can't reproduce the flickering bug every time, but what I do is kill the app, restart it, and eventually it starts flickering. In my BarcodeScanFragment, here is my onDestroy:
override fun onDestroy() {
    super.onDestroy()
    // I call cameraUtil.unbindCamera(), which executes the code below:
    cameraProvider?.unbindAll()
    cameraExecutor?.shutdown()
}
Here is the setupCameraX function of my CameraUtil (which I use to initialize the camera for different fragments):
fun setupCameraX(previewView: PreviewView, fragment: Fragment, analyzer: ImageAnalysis.Analyzer?) {
    previewView.preferredImplementationMode = PreviewView.ImplementationMode.TEXTURE_VIEW
    val cameraProviderFuture = ProcessCameraProvider.getInstance(fragment.context!!)
    val rotation = previewView.display.rotation
    cameraExecutor = Executors.newSingleThreadExecutor()

    cameraProviderFuture.addListener(Runnable {
        cameraProvider = cameraProviderFuture.get()
        val cameraSelector = CameraSelector.Builder().requireLensFacing(CameraSelector.LENS_FACING_BACK).build()
        val preview = Preview.Builder().setTargetRotation(rotation).build()
        preview.setSurfaceProvider(previewView.createSurfaceProvider(null))

        var useCase: UseCase? = null
        if (analyzer == null) {
            imageCapture = ImageCapture.Builder()
                .setCaptureMode(ImageCapture.CAPTURE_MODE_MAXIMIZE_QUALITY)
                .setTargetRotation(previewView.display.rotation)
                .build()
            useCase = imageCapture!!
        } else {
            val imageAnalysis = ImageAnalysis.Builder()
                .setTargetRotation(rotation)
                .build()
                .also {
                    it.setAnalyzer(cameraExecutor!!, analyzer)
                }
            useCase = imageAnalysis
        }

        cameraProvider?.unbindAll()
        try {
            camera = cameraProvider?.bindToLifecycle(fragment, cameraSelector, preview, useCase)
            preview.setSurfaceProvider(previewView.createSurfaceProvider(camera?.cameraInfo))
        } catch (ex: Exception) {
            Log.e("error", ex.message, ex)
        }
    }, ContextCompat.getMainExecutor(fragment.context))
}
I make sure to call cameraProvider?.unbindAll() so that the previous CameraX use cases are unbound.
EDIT: the device is a Huawei P smart 2019.
In my case I was experiencing flickering only when popping a fragment from the back stack.
I'm using CameraX version 1.0.0-beta05.
Finally, I was able to fix the flickering simply by removing this line:
previewView.preferredImplementationMode = PreviewView.ImplementationMode.TEXTURE_VIEW

Bad quality of Android CameraX preview (viewfinder)

I have a problem with the preview using the CameraX sample app. The quality of the preinstalled camera app's preview is better than the CameraX sample's preview, but the photo quality is OK. Touch-to-focus is implemented in the sample project too, so I don't think that is the problem.
I'm using this code to set up the preview use case. What might be wrong?
private fun buildPreviewUseCase(): Preview {
    val display = viewFinder.display
    val metrics = DisplayMetrics().also { display.getMetrics(it) }
    val preview = Preview.Builder()
        .setTargetRotation(display.rotation)
        .setTargetResolution(Size(metrics.widthPixels, metrics.heightPixels))
        .build()
        .apply {
            previewSurfaceProvider = viewFinder.previewSurfaceProvider
        }
    return preview
}
(Screenshots attached: the CameraX sample's preview and the preinstalled camera app's preview.)
I have the same issue, but after compiling and running the CameraX example app
https://github.com/android/camera-samples/tree/main/CameraXBasic
the quality there seems OK.
I checked the differences, and somehow, if the imageCapture use case is not set, the quality is bad.
So try to add
val imageCapture = ImageCapture.Builder()
    .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
    .setTargetAspectRatio(screenAspectRatio)
    .setTargetRotation(rotation)
    .build()
and bind it to the camera:
camera = cameraProvider.bindToLifecycle(
    this, cameraSelector, preview, imageCapture, imageAnalyzer
)

Android CameraX - face detection while recording video

I'm using the new CameraX library with Firebase ML Kit on Android, detecting faces in every frame the device can process.
So I set up CameraX like this:
CameraX.bindToLifecycle(this, preview, imageCapture, faceDetectAnalyzer)
Everything works flawlessly. Now, while I'm doing that, I want to record a video.
So basically I want to detect faces while recording a video.
I tried:
CameraX.bindToLifecycle(this, preview, imageCapture, faceDetectAnalyzer, videoCapture)
But I'm getting an error saying there are too many parameters, so I guess that's not the right way.
I know this library is still in alpha, but I guess there is a way to do that.
Even if there isn't yet, what's another way to implement face detection while recording a video with Firebase ML?
I haven't used CameraX a lot, but I usually work with the Camera2 API and Firebase ML Kit.
To use both APIs together, you should get the Image callbacks from your preview-size ImageReader. In that callback you can use the Images to create FirebaseVisionFace objects through the API and do whatever you want with them.
Using Kotlin and coroutines, it should look like this:
private val options: FirebaseVisionFaceDetectorOptions = FirebaseVisionFaceDetectorOptions.Builder()
    .setContourMode(FirebaseVisionFaceDetectorOptions.ALL_CONTOURS)
    .build()

val detector = FirebaseVision.getInstance().getVisionFaceDetector(options)

suspend fun processImage(image: Image, rotation: Int): List<FirebaseVisionFace> {
    // Metadata like this is only needed when building the image from a byte array;
    // 480x360 is typically sufficient for image recognition.
    val metadata = FirebaseVisionImageMetadata.Builder()
        .setWidth(image.width)
        .setHeight(image.height)
        .setFormat(FirebaseVisionImageMetadata.IMAGE_FORMAT_NV21)
        .build()

    // fromMediaImage() takes the Image plus its rotation (a FirebaseVisionImageMetadata.ROTATION_* constant).
    val visionImage = FirebaseVisionImage.fromMediaImage(image, rotation)
    return detector.detectInImage(visionImage).await()
}
If you want to use the await method for coroutine support, you can have a look at https://github.com/FrangSierra/Firebase-Coroutines-Android
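To connect this to CameraX specifically, here is a rough sketch of my own (not part of the answer above) of an ImageAnalysis.Analyzer that feeds frames into the processImage() function; it assumes a recent CameraX version where analyze() takes a single ImageProxy, and a CoroutineScope named scope:
@androidx.camera.core.ExperimentalGetImage
class FaceAnalyzer(private val scope: CoroutineScope) : ImageAnalysis.Analyzer {

    // Map the frame rotation in degrees to the FirebaseVisionImageMetadata constant.
    private fun degreesToFirebaseRotation(degrees: Int): Int = when (degrees) {
        90 -> FirebaseVisionImageMetadata.ROTATION_90
        180 -> FirebaseVisionImageMetadata.ROTATION_180
        270 -> FirebaseVisionImageMetadata.ROTATION_270
        else -> FirebaseVisionImageMetadata.ROTATION_0
    }

    override fun analyze(imageProxy: ImageProxy) {
        val mediaImage = imageProxy.image
        if (mediaImage == null) {
            imageProxy.close()
            return
        }
        val rotation = degreesToFirebaseRotation(imageProxy.imageInfo.rotationDegrees)
        scope.launch {
            try {
                val faces = processImage(mediaImage, rotation) // suspend call from the snippet above
                // ... draw overlays, count faces, etc. ...
            } finally {
                imageProxy.close() // always release the frame so the next one can be analyzed
            }
        }
    }
}
An ImageAnalysis use case with this analyzer can then be bound alongside the Preview and VideoCapture use cases, which is how detection and recording could run at the same time.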
