I have a proprietary device with a rotated camera that doesn't report its rotation to the OS.
We use CameraX with PreviewView, and the image I get is rotated 90 degrees and mirrored.
Is there an efficient way to rotate the preview?
We tried:
Setting setTargetRotation on the camera, on the analyzer, and on the Preview. None of them works.
Rotating the PreviewView itself of course doesn't work; it just modifies the viewport, because it won't rotate the surface.
The ViewFinder library outputs bitmaps but doesn't connect to Preview (or does anybody have a solution for that?). Without setSurfaceProvider, nothing empties the frame pipeline, and it displays nothing.
Ugly "Solution":
We currently convert each frame to a bitmap via an ImageAnalysis analyzer and then display the bitmap. That's ugly and CPU-expensive.
Is Camera2 the only way to rotate the preview nicely? Camera2 would involve bitmaps anyway, but it may do the YUV conversion in hardware.
NOTE
Solutions like the one linked here don't work. Setting the target rotation of the Preview only modifies the scale to compensate for the aspect ratio; it doesn't actually rotate. It's also obsolete and no longer valid.
The Preview#setTargetRotation API should work: https://developer.android.com/reference/androidx/camera/core/Preview#setTargetRotation(int)
For this API to work, your PreviewView has to be in the COMPATIBLE mode:
PreviewView#setImplementationMode(ImplementationMode.COMPATIBLE);
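For reference, here is a minimal sketch of the binding code (Java, CameraX). previewView, context, and lifecycleOwner are placeholders for your own objects, ROTATION_90 is just an example value, and imports from androidx.camera.core, androidx.camera.view, androidx.camera.lifecycle, and androidx.core.content are omitted for brevity:

previewView.setImplementationMode(PreviewView.ImplementationMode.COMPATIBLE);

Preview preview = new Preview.Builder().build();
preview.setSurfaceProvider(previewView.getSurfaceProvider());

ListenableFuture<ProcessCameraProvider> providerFuture =
        ProcessCameraProvider.getInstance(context);
providerFuture.addListener(() -> {
    try {
        ProcessCameraProvider cameraProvider = providerFuture.get();
        cameraProvider.bindToLifecycle(
                lifecycleOwner, CameraSelector.DEFAULT_BACK_CAMERA, preview);
        // Override the rotation after binding; COMPATIBLE mode is required
        // for this to visibly rotate the preview stream.
        preview.setTargetRotation(Surface.ROTATION_90);
    } catch (ExecutionException | InterruptedException e) {
        // Handle failure to obtain the camera provider.
    }
}, ContextCompat.getMainExecutor(context));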
You can find a complete sample on GitHub: https://github.com/androidx/androidx/blob/androidx-main/camera/integration-tests/viewtestapp/src/main/java/androidx/camera/integration/view/PreviewViewFragment.java
I uploaded a screen recording of the preview rotating. Please take a look and let me know if this is not your desired result: https://github.com/xizhang/public-files/blob/main/stackoverflow74798791/preview_rotation.mp4
You can also download the APK and test it yourself: https://github.com/xizhang/public-files/blob/main/stackoverflow74798791/camera-testapp-view-debug.apk
Video suddenly zooming into the frame when recording starts in preview mode is known behavior, but the issue I am currently facing is an extreme jump in the preview on the LG V30.
On the Pixel 1 XL and Pixel 2 XL the jump is negligible.
On the Samsung S8+ the jump is visible but acceptable given the smoother final videos.
But on the LG V30 the jump is terrible: the video shifts to the extreme top-right corner of the preview, and the final output quality is worse than VGA.
I know that image stabilization depends on the combination of sensor orientation, gyroscope, and accelerometer, and that the jump is determined by the quality of these sensors, but is there any way I can control the jump on the LG V30, or is it a hardware flaw?
I am including my configuration code for reference:
mPreviewBuilder.set(CaptureRequest.COLOR_CORRECTION_MODE, CaptureRequest.COLOR_CORRECTION_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CaptureRequest.CONTROL_CAPTURE_INTENT_VIDEO_RECORD);
mPreviewBuilder.set(CaptureRequest.HOT_PIXEL_MODE, CaptureRequest.HOT_PIXEL_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.EDGE_MODE, CaptureRequest.EDGE_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.TONEMAP_MODE, CaptureRequest.TONEMAP_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.SHADING_MODE, CaptureRequest.SHADING_MODE_FAST);
mPreviewBuilder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE, CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE_ON);
I assume that this jump cannot be controlled, but is there any way to reduce it?
The solution turned out to be that I was using the wrong camera.
It sounds trivial, but in this code:
CameraManager manager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
manager.getCameraIdList()
The manager returns all the accessible cameras on the device.
Since the LG V30 has two rear cameras, a regular one and a wide-angle one, I was accessing the wide-angle camera.
The wide-angle camera was not able to handle the image stabilization, and that was causing this issue. So the solution was to pick the first camera in the list for the given camera facing. This is still a workaround, because I couldn't figure out which API to use to check whether a given camera is regular or wide angle.
Details for the above are in the question How to check whether given Camera is regular camera or wide angle camera?
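A rough sketch of that workaround (Camera2, Java): take the first camera ID whose LENS_FACING matches the facing you want. Here context is a placeholder and error handling is minimal.

CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
String selectedId = null;
try {
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics chars = manager.getCameraCharacteristics(id);
        Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
        if (facing != null && facing == CameraCharacteristics.LENS_FACING_BACK) {
            selectedId = id;  // first back-facing entry: the regular rear camera on the LG V30
            break;
        }
    }
} catch (CameraAccessException e) {
    // Handle the error.
}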
UPDATE: the question is about Camera2.
I'm trying to figure out who applies the rotation transform when a camera preview is drawn on a SurfaceTexture.
When requesting the preview sizes from the camera you always get pairs where the width is larger than the height (because landscape is the usual orientation when taking a picture).
When using the device in portrait mode and setting the preview size (e.g. 1600x1200), the frames from the camera are correctly rotated, but I can't find out where that happens. Is it something CameraDevice does automatically based on the Surface type, or is it the SurfaceTexture that rotates the image?
The answer depends a bit on which camera API you're using; since you mention CameraDevice, I assume you mean camera2.
In camera2, one of the goals was to simplify the rotation handling where possible; so if you draw preview into a SurfaceView, the camera service and the hardware compositor / GPU cooperate and handle all the rotation for you. You always request landscape resolutions from the sensor (which is aligned with the long edge of the sensor matching the long edge of the device), and you just need to make sure your SurfaceView's aspect ratio matches that of the requested resolution (so in portrait, you need 9:16 View to display 1080p correctly; in landscape you need a 16:9 View).
For TextureView, and completely custom GL drawing with SurfaceTexture, the API can't do it all for you. The SurfaceTexture's transform is set by the camera API to rotate the camera output to the device's native orientation (generally portrait for phones, landscape for some tablets), but it can't handle the additional rotation if your app's UI is not in the native orientation.
In that case, you'll need to get the transform matrix from the SurfaceTexture, apply the extra rotation to match your app UI, and set that as the TextureView matrix.
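As a sketch, that computation usually looks like the configureTransform() method in the official Camera2Basic sample. It assumes it runs inside an Activity after layout, and that previewWidth/previewHeight is the preview resolution you requested (imports: android.graphics.Matrix, android.graphics.RectF, android.view.Surface, android.view.TextureView):

private void configureTransform(TextureView textureView, int previewWidth, int previewHeight) {
    int rotation = getWindowManager().getDefaultDisplay().getRotation();
    Matrix matrix = new Matrix();
    RectF viewRect = new RectF(0, 0, textureView.getWidth(), textureView.getHeight());
    RectF bufferRect = new RectF(0, 0, previewHeight, previewWidth);
    float centerX = viewRect.centerX();
    float centerY = viewRect.centerY();
    if (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) {
        bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
        matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
        float scale = Math.max(
                (float) textureView.getHeight() / previewHeight,
                (float) textureView.getWidth() / previewWidth);
        matrix.postScale(scale, scale, centerX, centerY);
        // Undo the built-in 90-degree transform and add the rotation the UI needs.
        matrix.postRotate(90 * (rotation - 2), centerX, centerY);
    } else if (rotation == Surface.ROTATION_180) {
        matrix.postRotate(180, centerX, centerY);
    }
    textureView.setTransform(matrix);
}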
It's been about a year since I worked on camera stuff, so I apologize for the lack of specifics. But I do remember that the key to displaying the camera preview was calculating the correct transform matrix to apply to the SurfaceView (or TextureView), covering whatever scaling or rotation is needed. So regarding "who is doing the rotation", I guess you can say it is the view, as instructed by the transform that you supply. The matrix is based on the relationship between the preview resolution you request (from the list of what your device supports) and the actual laid-out dimensions of your view, plus the physical orientation of the camera sensor compared to the current orientation of the device screen.
In the old camera API, Camera.CameraInfo has an orientation field that supplies the physical orientation with which the sensor is mounted. It's typically 90 for a landscape mounted sensor on most phones, but you'll also find some 270s out there (effectively "upside down"). 0 and 180 may exist but I've never seen them.
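For completeness, the usual recipe from the Camera.setDisplayOrientation() documentation combines that mounting orientation with the current display rotation (activity, cameraId, and camera are placeholders here):

Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(cameraId, info);
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
int degrees = 0;
switch (rotation) {
    case Surface.ROTATION_0:   degrees = 0;   break;
    case Surface.ROTATION_90:  degrees = 90;  break;
    case Surface.ROTATION_180: degrees = 180; break;
    case Surface.ROTATION_270: degrees = 270; break;
}
int result;
if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
    result = (info.orientation + degrees) % 360;
    result = (360 - result) % 360;  // compensate for the front camera mirror
} else {
    result = (info.orientation - degrees + 360) % 360;
}
camera.setDisplayOrientation(result);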
I managed to write a video recording demo; my implementation is the same as Grafika's ContinuousCaptureActivity.
In ContinuousCaptureActivity.java, the author creates the EGL objects in surfaceCreated(), which runs on the UI thread, and calls drawFrame() on the UI thread as well. drawFrame() does two things: it draws the frame to the screen and pushes the data to the encoder.
See the code here: ContinuousCaptureActivity
Because I set the encoding video size to 1280x720, which is large, the camera preview is not smooth and the fps of the output video is low.
I plan to create a new thread to do the encoding work, but I don't know how to handle OpenGL ES across multiple threads. Can anyone give some advice?
Added: I found that the drawFrame path in Texture2dProgram uses GLES20.glDrawArrays. Will GLES20.glDrawElements give better performance?
First, 1280x720 shouldn't be an issue for mainstream devices. Some super-cheap low-end devices might struggle, but there isn't really anything you can do if the hardware simply can't handle 1280x720x30fps.
The most common reasons I've seen for low FPS at 720p are failure to configure the Camera with a reasonable fps value (using setPreviewFpsRange() with a value from getSupportedPreviewFpsRange()), and failing to call setRecordingHint(true) (or the Camera2 equivalent). The latter can take you from 15fps to 30fps, but may affect the aspect ratio of the preview.
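A sketch of those two settings with the old android.hardware.Camera API (as used in Grafika); picking the last entry assumes the supported ranges are sorted ascending, which is how the framework documents them:

Camera.Parameters params = camera.getParameters();
List<int[]> fpsRanges = params.getSupportedPreviewFpsRange();
int[] best = fpsRanges.get(fpsRanges.size() - 1);   // highest maximum fps
params.setPreviewFpsRange(best[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
        best[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
params.setRecordingHint(true);   // hint that frames will be fed to an encoder
camera.setParameters(params);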
The video encoding is performed in a separate process, called mediaserver, which manages all interaction with the video encoder hardware. There are already multiple threads in play, so adding another won't help.
The GLES code is drawing two textured triangles. Using a different draw call won't make a meaningful difference.
If you think there is a performance bottleneck, you need to use tools like systrace to narrow it down.
I finally found a way to make drawing the frame to the screen faster, which ultimately reduces the time spent processing each frame.
The details are as follows:
mPreviewWidth = mCamera.getParameters().getPreviewSize().width;
mPreviewHeight = mCamera.getParameters().getPreviewSize().height;
holder.setFixedSize(mPreviewWidth, mPreviewHeight);
Add this code at https://github.com/google/grafika/blob/master/src/com/android/grafika/ContinuousCaptureActivity.java#L352,
then use GLES20.glViewport(0, 0, mPreviewWidth, mPreviewHeight); to replace https://github.com/google/grafika/blob/master/src/com/android/grafika/ContinuousCaptureActivity.java#L436.
This modification greatly reduces the amount of data drawn per frame.
But it makes the preview image less smooth when using a TextureView; calling setScaleX(1.00001f); setScaleY(1.00001f); works around that.
I'm building a camera app which accesses the preview frames by implementing
android.hardware.Camera.PreviewCallback#onPreviewFrame(byte[] data, Camera camera).
When I change my camera zoom by calling
android.hardware.Camera.Parameters#setZoom(int zoom)
it doesn't seem to have any effect on the data I get in onPreviewFrame. The preview display itself is affected as expected.
This is happening on an LG Nexus 4.
How can I get the actual zoom that is applied to the preview data? Is there any way of knowing whether my device actually applies zoom to the data or not?
Thanks,
Not exactly a solution, but you can set the scene mode to HDR (Camera.Parameters.setSceneMode("hdr")), and that makes the data behave as expected (but, of course, HDR is then on).
What I ended up doing was working around it and cropping the data myself, only on the Nexus 4. I used the getZoom() method to detect the zoom level and then cropped accordingly.
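A sketch of the crop computation (old camera API; the zoom ratios list is in 1/100 steps per the documentation, and the actual NV21 cropping is left out):

Camera.Parameters params = camera.getParameters();
float ratio = params.getZoomRatios().get(params.getZoom()) / 100f;  // e.g. 2.0f for 2x zoom
Camera.Size size = params.getPreviewSize();
int cropWidth = Math.round(size.width / ratio);
int cropHeight = Math.round(size.height / ratio);
int left = (size.width - cropWidth) / 2;
int top = (size.height - cropHeight) / 2;
Rect visible = new Rect(left, top, left + cropWidth, top + cropHeight);
// Crop the NV21 buffer from onPreviewFrame() to 'visible' before processing it.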
My video camera app records in landscape mode. With the front-facing camera the preview shows the regular image, but the actual recording is mirrored (flipped horizontally).
Everything works fine with the normal rear camera.
Can anybody suggest a way to avoid this? Any suggestions or source code would help a lot. Thank you.
The bad news: this mirroring is hardcoded into the camera service, and can not be disabled.
The good news: if you are on a recent API (API level >= 14), you can easily use a TextureView to mirror the preview image back to the original. Take the TextureView Example over at the Android Documentation, then use setTransform to set a mirroring transform. This will revert the preview image back to the non-mirrored original.
Note that a mirror transform is the same as a scaling transform with a -1 scale on the X axis.
If you are on an older API version, you might be able to do the same with a SurfaceView (using setScaleX, API level >= 11).
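A minimal sketch of that mirroring transform (run it after layout, so the view has a non-zero size; textureView is assumed to already display the preview):

Matrix mirror = new Matrix();
// Scale by -1 on X around the view's center to flip the image horizontally.
mirror.setScale(-1f, 1f, textureView.getWidth() / 2f, textureView.getHeight() / 2f);
textureView.setTransform(mirror);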
Try applying a transformation matrix to a TextureView, as per Prevent flipping of the front facing camera.
This works for API level >= 14.
You can make the captured picture mirror the preview exactly by using this code:
ImageCapture.Metadata metadata = new ImageCapture.Metadata();
metadata.setReversedHorizontal(true); // This method will fix mirror issue
options = new ImageCapture.OutputFileOptions.Builder(imageOutFile).setMetadata(metadata).build();
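Then pass those options when capturing. A sketch, assuming imageCapture is your bound ImageCapture use case:

imageCapture.takePicture(options, ContextCompat.getMainExecutor(context),
        new ImageCapture.OnImageSavedCallback() {
            @Override
            public void onImageSaved(@NonNull ImageCapture.OutputFileResults results) {
                // Saved image is horizontally reversed, matching the mirrored preview.
            }

            @Override
            public void onError(@NonNull ImageCaptureException exception) {
                // Handle the failure.
            }
        });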
In OpenCV's JavaCameraView:

public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    imageMat = inputFrame.rgba().t();   // transpose the frame
    if (reverseHorizontally) {          // placeholder flag: horizontal reverse
        Core.flip(imageMat, imageMat, 1);
    } else {                            // vertical reverse
        Core.flip(imageMat, imageMat, -1);
    }
    return imageMat;
}