Image Stabilization Issue in LGV30 - android

Video suddenly zooming into the frame when recording starts from preview mode is known behavior, but the issue I am currently facing is the extreme jump in the preview when using the LGV30.
On the Pixel 1 XL and Pixel 2 XL the jump is negligible.
On the Samsung S8+ the jump is visible, but it can be accepted as the price for smoother final videos.
But on the LGV30 the jump is terrible: the video moves to the extreme top-right corner of the preview, and the final output quality of the video is worse than VGA.
I know that image stabilization depends on the combination of sensor orientation, gyroscope, and accelerometer, and that the jump in the video is determined by the quality of these sensors, but is there any way I can control the jump on the LGV30, or is it a hardware flaw?
Here is my configuration code for reference:
mPreviewBuilder.set(CaptureRequest.COLOR_CORRECTION_MODE, CaptureRequest.COLOR_CORRECTION_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_VIDEO_RECORD);
mPreviewBuilder.set(CaptureRequest.HOT_PIXEL_MODE, CaptureRequest.HOT_PIXEL_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.TONEMAP_MODE, CaptureRequest.TONEMAP_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.SHADING_MODE, CaptureRequest.SHADING_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE, CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE_ON);
I assume that this jump cannot be controlled, but is there any way to reduce it?

The solution turned out to be that I was using the wrong camera.
It sounds trivial, but in this code
CameraManager manager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
manager.getCameraIdList()
the manager returns all the accessible cameras for the given device.
Since the LGV30 has two rear cameras, one regular and one wide-angle, I was accessing the wide-angle camera.
The wide-angle camera was not able to handle image stabilization, which was causing this issue. So the solution was to pick the first camera from the list for the given camera facing. This is still a workaround, because I couldn't figure out which API to use to check whether a given camera is regular or wide-angle.
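A minimal sketch of that workaround (assuming you want the back-facing camera; that the first matching id is the regular lens is only an observation on the LGV30, not something the API guarantees):
CameraManager manager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
String selectedId = null;
try {
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics chars = manager.getCameraCharacteristics(id);
        Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == CameraCharacteristics.LENS_FACING_BACK) {
            selectedId = id; // first back-facing camera in the list
            break;
        }
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}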
Details for the above question are in How to check whether given Camera is regular camera or wide angle camera?

Rotating the preview with CameraX

I have a proprietary device with a rotated camera that doesn't report its rotation to the OS.
We use CameraX with PreviewView, and I get the image rotated 90 degrees and mirrored.
Is there an efficient way to rotate the preview?
We tried:
Setting setTargetRotation on the camera, on the analyzer, and on the Preview. None of them works.
Rotating the PreviewView of course doesn't work; it just modifies the ViewPort, because it won't rotate the surface.
The ViewFinder library outputs bitmaps but doesn't connect to a Preview (or does anybody have a solution for that?). Without setSurfaceProvider, nothing empties the frame pipeline and nothing is displayed.
Ugly "Solution":
We currently obtain a bitmap (via the Analyzer) and then display that bitmap. That's ugly and CPU expensive.
Is Camera2 the only way to rotate previews nicely? Camera2 involves bitmaps anyway, but it may do the YUV conversion in hardware.
NOTE
Solutions like the one here don't work. Setting the target rotation of the Preview only changes the scale to compensate for the aspect ratio; it doesn't actually rotate. That approach is also obsolete and no longer valid.
The Preview#setTargetRotation API should work: https://developer.android.com/reference/androidx/camera/core/Preview#setTargetRotation(int)
For this API to work, your PreviewView has to be in the COMPATIBLE mode:
PreviewView#setImplementationMode(ImplementationMode.COMPATIBLE);
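For reference, a minimal sketch of that combination in Java (previewView is assumed to come from your layout, and the use case is assumed to be bound to a lifecycle elsewhere in your code):
// PreviewView must use the COMPATIBLE (TextureView-based) implementation
previewView.setImplementationMode(PreviewView.ImplementationMode.COMPATIBLE);
Preview preview = new Preview.Builder().build();
preview.setSurfaceProvider(previewView.getSurfaceProvider());
// Apply the extra rotation the device does not report (90 degrees assumed here)
preview.setTargetRotation(Surface.ROTATION_90);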
You can find a complete sample on GitHub: https://github.com/androidx/androidx/blob/androidx-main/camera/integration-tests/viewtestapp/src/main/java/androidx/camera/integration/view/PreviewViewFragment.java
I uploaded a screen recording of the preview rotating. Please take a look and let me know if this is not your desired result: https://github.com/xizhang/public-files/blob/main/stackoverflow74798791/preview_rotation.mp4
You can also download the APK and test it yourself: https://github.com/xizhang/public-files/blob/main/stackoverflow74798791/camera-testapp-view-debug.apk

How to set android camera2 parameters to have the best result at text recognizing

I'm creating an Android app that allows the user to scan the code on a small card (like a Yu-Gi-Oh card).
The problem is that the number I want to read is really small, and it's hard to get a good enough focus to make it clear. So I wanted to set the parameters to get the best result at the closest distance.
At first, I followed this tutorial to create a simple camera preview: https://inducesmile.com/android/android-camera2-api-example-tutorial/
Next I tried to change the camera preview settings to disable auto-focus, which works well, but then I tried to manually set the focus distance, and nothing changed.
This is an extract from the code in the camera preview creation methods:
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
// Disable auto-focus
captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_OFF);
// Try to focus at the shortest possible distance (does not work)
captureRequestBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, characteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE));
I've tried different values for the focus distance, but nothing changes.
Maybe I'm just making a mistake and this isn't the right way to improve it.
Manual focus control is not a guaranteed feature. Many lower-end devices do not support it and only support autofocus. You can check whether the device has the MANUAL_SENSOR capability. Some cameras are entirely fixed-focus (mostly selfie cameras), so with those you can't even autofocus.
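A rough sketch of that check, reusing the characteristics object from the snippet in the question:
int[] caps = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
boolean manualFocusSupported = false;
if (caps != null) {
    for (int cap : caps) {
        if (cap == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_MANUAL_SENSOR) {
            manualFocusSupported = true;
            break;
        }
    }
}
// A minimum focus distance of 0 means the lens is fixed-focus
Float minFocus = characteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
boolean fixedFocus = (minFocus == null || minFocus == 0f);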
For your use case, autofocus should work well enough anyway, as long as the small card fills up most of the visible scene.
Note that many devices have a minimum focus distance of 8-10 cm, so you can't hold the card really close and expect to get sharp images of it.

Who rotates the frame from a camera when rendering in a SurfaceTexture?

UPDATE: this question is about Camera2.
I'm trying to figure out who applies the rotation transform when a camera preview is drawn on a SurfaceTexture.
When requesting the preview sizes from the camera, you always get pairs where the width is larger than the height (because landscape is the usual orientation when taking a picture).
When using the device in portrait mode and setting the preview size (for example 1600x1200), the frames from the camera are correctly rotated, but I can't find out where that is done. Is it something CameraDevice does automatically based on the Surface type, or is it the SurfaceTexture that rotates the image?
The answer depends a bit on which camera API you're using; since you mention CameraDevice, I assume you mean camera2.
In camera2, one of the goals was to simplify rotation handling where possible; so if you draw the preview into a SurfaceView, the camera service and the hardware compositor / GPU cooperate and handle all the rotation for you. You always request landscape resolutions from the sensor (the long edge of the sensor is aligned with the long edge of the device), and you just need to make sure your SurfaceView's aspect ratio matches that of the requested resolution (so in portrait you need a 9:16 View to display 1080p correctly; in landscape you need a 16:9 View).
For TextureView, and completely custom GL drawing with SurfaceTexture, the API can't do it all for you. The SurfaceTexture's transform is set by the camera API to rotate the camera output to the device's native orientation (generally portrait for phones, landscape for some tablets), but it can't handle the additional rotation if your app's UI is not in the native orientation.
In that case, you'll need to get the transform matrix from the SurfaceTexture, apply the extra rotation to match your app's UI, and set that as the TextureView matrix.
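For the TextureView case, a typical transform looks like the configureTransform() helper in Google's Camera2Basic sample; roughly (a sketch meant to run inside an Activity; textureView, previewWidth and previewHeight are placeholders for your own fields):
private void configureTransform(int viewWidth, int viewHeight) {
    int rotation = getWindowManager().getDefaultDisplay().getRotation();
    Matrix matrix = new Matrix();
    RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
    RectF bufferRect = new RectF(0, 0, previewHeight, previewWidth);
    float centerX = viewRect.centerX();
    float centerY = viewRect.centerY();
    if (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) {
        bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
        matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
        float scale = Math.max(
                (float) viewHeight / previewHeight,
                (float) viewWidth / previewWidth);
        matrix.postScale(scale, scale, centerX, centerY);
        // 90 * (rotation - 2) gives -90 for ROTATION_90 and +90 for ROTATION_270
        matrix.postRotate(90 * (rotation - 2), centerX, centerY);
    } else if (rotation == Surface.ROTATION_180) {
        matrix.postRotate(180, centerX, centerY);
    }
    textureView.setTransform(matrix);
}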
It's been about a year since I worked on camera code, so I apologize for the lack of specifics. But I do remember that the key when displaying the camera preview was calculating the correct transform matrix to apply to the SurfaceView (or TextureView) for whatever scaling or rotation is needed. So regarding "who is doing the rotation", I guess you can say it is the view, as instructed by the transform that you supply. The matrix is based on the relationship between the preview resolution you request (from the list of what your device supports) and the actual laid-out dimensions of your view, plus the physical orientation of the device sensor compared to the current orientation of the device screen.
In the old camera API, Camera.CameraInfo has an orientation field that supplies the physical orientation with which the sensor is mounted. It's typically 90 for a landscape mounted sensor on most phones, but you'll also find some 270s out there (effectively "upside down"). 0 and 180 may exist but I've never seen them.

How to disable mirror effect in front camera and record a video without inverting its images? [duplicate]

I am recording video using MediaRecorder. When using the back camera it works fine, but when using the front camera the captured video is flipped/inverted, meaning that an item on the right appears on the left. The camera preview is working fine; only the final captured video is flipped.
Here is what the camera preview looks like:
But the final video appears like this (all the items on the left-hand side appear on the right-hand side):
What I tried so far:
I tried to apply a matrix when preparing the recorder, but it doesn't seem to change anything.
private boolean prepareRecorder(int cameraId) {
    // Create a new instance of MediaRecorder
    mRecorder = new MediaRecorder();
    setCameraDisplayOrientation(this, cameraId, mCamera);
    int angle = getVideoOrientationAngle(this, cameraId);
    mRecorder.setOrientationHint(angle);
    if (cameraId == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        // This Matrix is created but never applied to anything,
        // so it has no effect on the recorded video
        Matrix matrix = new Matrix();
        matrix.preScale(1.0f, -1.0f);
    }
    // all other code to prepare the recorder here
}
I have already read all of the questions below, but none of them solved my problem. For information, I am using a SurfaceView for the camera preview, so this question here doesn't help.
1) Android flip front camera mirror flipped video
2) How to keep android from inverting the image from the front facing camera?
3) Prevent flipping of the front facing camera
So my questions are:
1) How can I capture a video with the front camera that is not inverted (exactly the same as the camera preview)?
2) How can I achieve this when the camera preview uses a SurfaceView rather than a TextureView? (All the questions I mentioned above talk about using TextureView.)
Any possible solution is most welcome. Thanks!
EDIT
I made two short video clips to clarify the problem; please download and take a look:
1) The video of the camera preview during recording
2) The video of the final recorded output
So, if the system camera app produces video similar to your app's, you didn't do anything wrong. Now it's time to understand what happens when recording video with the front-facing camera.
The front-facing camera is no different from the rear-facing camera in the way it captures still pictures or video. The difference is in how the phone displays the camera preview on the screen. To make it look more natural to the user, Android (and all other systems) mirrors the preview, so that you see yourself as if in a mirror.
It is important to understand that this only applies to the way the preview is presented to you. If you pick up any video conferencing app, connect two devices that you hold in two hands, and look at yourself, you will see to your surprise that the two instances of yourself are flipped.
This is not a bug, this is the natural way to present the video to the other party.
See the sketch:
This is how you see the scene:
This is how your peer sees the same scene
Normally, recording of a video is done from the point of view of your peer, as in the second picture. This is the natural setup for, e.g., video conferencing.
But Snapchat and some other social apps choose to store the front-facing video clip as if you record it from the mirror (as if the recorder is in your hand on the first picture). Some people like this feature, others hate it (see https://forums.androidcentral.com/general-help-how/664539-front-camera-pics-mirrored-reversed-only-snapchat.html and https://www.reddit.com/r/nexus6/comments/3846ay/has_anyone_found_a_fix_for_snapchat_flipping)
You cannot use MediaRecorder for that. You can use the lower-level MediaCodec API to record processed frames. You need to flip each frame 'manually', and this may be a significant performance hit, because normally MediaRecorder 'connects' the camera to the hardware encoder in a very efficient way, without even needing to copy the pixels to user memory. This answer shows how you can manipulate the way the camera is rendered to a texture.
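As a rough illustration of the 'flip each frame manually' idea: if you already render the camera's SurfaceTexture yourself with GLES (for example into a MediaCodec input surface), the mirroring itself is just a horizontal flip of the texture transform before drawing each frame (surfaceTexture and the shader wiring are assumed to exist in your code):
float[] texMatrix = new float[16];
surfaceTexture.getTransformMatrix(texMatrix);
// Flip horizontally around the texture's center: u -> 1 - u
android.opengl.Matrix.translateM(texMatrix, 0, 1f, 0f, 0f);
android.opengl.Matrix.scaleM(texMatrix, 0, -1f, 1f, 1f);
// Pass texMatrix to your vertex shader as the texture-coordinate transform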
You can achieve this by recording the video manually from a SurfaceView.
In that case the preview and the recording will match exactly.
I've been using this library for this purpose:
https://github.com/spaceLenny/recordablesurfaceview
Here is a guide on how to use it (not with the camera, but with OpenGL drawing): https://withintent.uncorkedstudios.com/recording-screen-video-on-android-with-recordablesurfaceview-451c9daa213e

Android: Zoom not affecting preview frame data passed to onPreviewFrame()

I'm building a camera app which accesses the preview frames by implementing
android.hardware.Camera.PreviewCallback#onPreviewFrame(byte[] data, Camera camera).
When I change the camera zoom by calling
android.hardware.Camera.Parameters#setZoom(int zoom)
it doesn't seem to have any effect on the data I get in onPreviewFrame. The preview display itself is affected as expected.
This is happening on LG Nexus 4.
How can I get the actual zoom which is applied to the preview data? Is there any way of knowing whether my device actually applies the zoom to the data or not?
Thanks,
Not exactly a solution, but... you can set the scene mode to HDR (Camera.Parameters.setSceneMode("hdr")), and that makes the data behave as expected (but, of course, then HDR is on).
What I did was work around it and actually crop the data, but only on the Nexus 4. I used the getZoom() method to detect the zoom level and then cropped accordingly.
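A sketch of that workaround, for reference (cropNv21 is a hypothetical helper you would have to write; the old Camera API expresses zoom as an index into getZoomRatios()):
Camera.Parameters params = camera.getParameters();
List<Integer> ratios = params.getZoomRatios();    // ratios in 1/100, e.g. 100 = 1.0x
float zoom = ratios.get(params.getZoom()) / 100f;
Camera.Size size = params.getPreviewSize();
int cropW = Math.round(size.width / zoom);
int cropH = Math.round(size.height / zoom);
int left = (size.width - cropW) / 2;
int top = (size.height - cropH) / 2;
// data is the byte[] NV21 buffer from onPreviewFrame()
byte[] cropped = cropNv21(data, size.width, size.height, left, top, cropW, cropH);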
