Android Camera2 preview occasionally rotated by 90 degrees

I'm working on some app using Android's Camera2 API. So far I've been able to get a preview displayed within a TextureView. The app is by default in landscape mode. When using the emulator the preview will appear upside-down. On my physical Nexus 5 the preview is usually displayed correctly (landscape, not upside-down), but occasionally it is rotated by 90 degrees, yet stretched to the dimensions of the screen.
I thought this would be easy and that the following code would return the necessary information on the current orientation:
// display rotation
getActivity().getWindowManager().getDefaultDisplay().getRotation();
// sensor orientation
mManager.getCameraCharacteristics(mCameraId).get(CameraCharacteristics.SENSOR_ORIENTATION);
... I was pretty surprised to see that the above code always returned 1 (i.e. Surface.ROTATION_90) for the display rotation and 90 for the sensor orientation, regardless of whether the preview was rotated by 90 degrees or not. (Within the emulator the sensor orientation is always 270, which kinda makes sense if I assume 90 to be the correct orientation.)
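For reference, getRotation() returns a Surface.ROTATION_* constant rather than degrees, so the 1 above means Surface.ROTATION_90. A minimal conversion helper looks like this:
// Surface.ROTATION_* constants are indices, not degrees; 1 == Surface.ROTATION_90.
static int rotationToDegrees(int rotation) {
    switch (rotation) {
        case Surface.ROTATION_90:  return 90;
        case Surface.ROTATION_180: return 180;
        case Surface.ROTATION_270: return 270;
        case Surface.ROTATION_0:
        default:                   return 0;
    }
}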
I also checked the width and height within onMeasure of the AutoMeasureTextureView (adapted from Android's Camera2 example) that I'm using to create my TextureView. But no luck either: the width and height reported from within onMeasure are always the same, regardless of the preview rotation.
So I'm clueless about how to tackle this issue. Does anyone have an idea what could be the reason for the occasional hiccups in my preview orientation?
[Edit]
A detail I just found out: whenever the preview appears rotated, onSurfaceTextureSizeChanged in the TextureView.SurfaceTextureListener does not seem to get called. The documentation for onSurfaceTextureSizeChanged says this method is called whenever the SurfaceTexture's buffer size changes. I have a method createCameraPreviewSession (copied from Android's Camera2 example) in which I set the default buffer size of my texture like
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
From my logging output I can tell that onSurfaceTextureSizeChanged is called exactly after that - however, not always... (or does setting the default buffer size sometimes silently fail?)

I think I can answer my own question: I modelled my Camera2 fragment on Android's Camera2 example. However, I didn't consider the method configureTransform to be important because, unlike the example code, my application is forced to landscape mode anyway. That assumption turned out to be wrong. Since reintegrating configureTransform into my code I haven't experienced any more hiccups.
Update: The original example within the Android documentation pages doesn't seem to exist anymore. I've updated the link, which now points to the code on GitHub.
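For anyone landing here, the method in question looks roughly like this in the Camera2Basic sample (mTextureView and mPreviewSize are fields of the sample fragment; the sample calls it from both onSurfaceTextureAvailable and onSurfaceTextureSizeChanged):
private void configureTransform(int viewWidth, int viewHeight) {
    Activity activity = getActivity();
    if (null == mTextureView || null == mPreviewSize || null == activity) {
        return;
    }
    int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
    Matrix matrix = new Matrix();
    RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
    RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
    float centerX = viewRect.centerX();
    float centerY = viewRect.centerY();
    if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
        bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
        matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
        // scale back up so the rotated buffer still fills the view
        float scale = Math.max(
                (float) viewHeight / mPreviewSize.getHeight(),
                (float) viewWidth / mPreviewSize.getWidth());
        matrix.postScale(scale, scale, centerX, centerY);
        matrix.postRotate(90 * (rotation - 2), centerX, centerY);
    } else if (Surface.ROTATION_180 == rotation) {
        matrix.postRotate(180, centerX, centerY);
    }
    mTextureView.setTransform(matrix);
}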

I followed the whole textureView.setTransform(matrix) approach listed above, and it worked. However, I was also able to set the rotation manually using the much simpler textureView.setRotation(270), without the need to create a Matrix.

I had also faced a similar issue on a Nexus device. The code below works for me.
Call this function before opening the camera and also in onResume().
private void transformImage(int width, int height) {
    if (textureView == null) {
        return;
    }
    try {
        Matrix matrix = new Matrix();
        int rotation = getWindowManager().getDefaultDisplay().getRotation();
        RectF textureRectF = new RectF(0, 0, width, height);
        RectF previewRectF = new RectF(0, 0, textureView.getHeight(), textureView.getWidth());
        float centerX = textureRectF.centerX();
        float centerY = textureRectF.centerY();
        if (rotation == Surface.ROTATION_90 || rotation == Surface.ROTATION_270) {
            previewRectF.offset(centerX - previewRectF.centerX(),
                    centerY - previewRectF.centerY());
            matrix.setRectToRect(textureRectF, previewRectF, Matrix.ScaleToFit.FILL);
            // scale back up so the rotated preview still fills the view;
            // if you track a separate preview Size, use its dimensions here instead
            float scale = Math.max((float) width / textureView.getWidth(),
                    (float) height / textureView.getHeight());
            matrix.postScale(scale, scale, centerX, centerY);
            matrix.postRotate(90 * (rotation - 2), centerX, centerY);
        }
        textureView.setTransform(matrix);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
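Hypothetical wiring for the above (the listener and openCamera() are placeholders for your own setup): call transformImage whenever the view's size is known, before starting the preview:
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        transformImage(width, height);
        openCamera();  // placeholder for your own camera-opening code
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        transformImage(width, height);
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
    }
});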

Related

Android Camera2 for face detection and distance measurement

I'm building a camera app that needs to detect the user's face/eyes and measure the distance to the face through the eyes.
I found that this project, https://github.com/IvanLudvig/Screen-to-face-distance, works great, but it doesn't use a preview of the front camera (really, I tested it on at least 10 people and all measurements were REALLY close or perfect).
My app already had a selfie camera part made by me, but using the old camera API, and I couldn't find a way to make the camera preview and the face-distance measurement work together there; I would always get an error that the camera was already in use.
I decided to move to camera2 to use more than one camera stream, and I'm still learning the process of having two streams at the same time for different things. By the way, documentation on this seems to be scarce; I'm really lost.
Now, am I on the right path with this?
Also, in his project, Ivan uses this:
Camera camera = frontCam();
Camera.Parameters campar = camera.getParameters();
F = campar.getFocalLength();               // focal length in mm
angleX = campar.getHorizontalViewAngle();  // horizontal field of view in degrees
angleY = campar.getVerticalViewAngle();    // vertical field of view in degrees
// physical sensor size, derived from the angle-of-view equation
sensorX = (float) (Math.tan(Math.toRadians(angleX / 2)) * 2 * F);
sensorY = (float) (Math.tan(Math.toRadians(angleY / 2)) * 2 * F);
This is the old camera API; how can I do the same with the new one?
Judging from this answer: Android camera2 API get focus distance in AF mode
Do I need to get the min and max focal lengths?
For the horizontal and vertical angles I found this one: What is the Android Camera2 API equivalent of Camera.Parameters.getHorizontalViewAngle() and Camera.Parameters.getVerticalViewAngle()?
The rest, I believe, is done by Google's Cloud Vision API.
EDIT:
I got it to work on camera2 using GMS's own example, with CameraSourcePreview and GraphicOverlay to draw whatever I want on top of the preview and to detect faces.
Now, to get the camera characteristics:
CameraManager manager = (CameraManager) this.getSystemService(Context.CAMERA_SERVICE);
try {
    // camera "1" is the front camera on this device
    character = manager.getCameraCharacteristics(String.valueOf(1));
} catch (CameraAccessException e) {
    Log.e(TAG, "CamAcc1Error.", e);
}
// SENSOR_INFO_PHYSICAL_SIZE reports the sensor dimensions in millimetres
angleX = character.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE).getWidth();
angleY = character.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE).getHeight();
sensorX = (float) (Math.tan(Math.toRadians(angleX / 2)) * 2 * F);
sensorY = (float) (Math.tan(Math.toRadians(angleY / 2)) * 2 * F);
This pretty much gives me mm accuracy for the face distance, which is exactly what I needed.
Now what's left is getting a picture from this preview with GMS's CameraSourcePreview, so that I can use it later.
Final edit here:
I solved the picture issue but forgot to edit it in here. The thing is, all the examples using camera2 to take a picture are really complicated (rightly so, it's a better API than camera and has a lot of options), but it can be really simplified to what I did here:
mCameraSource.takePicture(null, bytes -> {
    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    if (bitmap != null) {
        // rotating 180 degrees plus a vertical flip is equivalent to a
        // horizontal flip, undoing the front camera's mirroring
        Matrix matrix = new Matrix();
        matrix.postRotate(180);
        matrix.postScale(1, -1);
        rotateBmp = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(),
                bitmap.getHeight(), matrix, false);
        saveBmp2SD(STORAGE_PATH, rotateBmp);
        rotateBmp.recycle();
        bitmap.recycle();
    }
});
That's all I needed to take a picture and save it to a location I specified. Don't mind the recycling here; it's not right, and I'm working on it.
It looks like that bit of math is calculating the physical dimensions of the image sensor via the angle-of-view equation: sensorSize = 2 * F * tan(angle / 2).
The camera2 API has the sensor dimensions as part of the camera characteristics directly: SENSOR_INFO_PHYSICAL_SIZE.
In fact, if you want to get the field of view in camera2, you have to use the same equation in the other direction, since FOVs are not part of the camera characteristics.
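A minimal sketch of that reverse direction (assuming context and cameraId are in scope and the device reports both characteristics; CameraAccessException handling is elided):
CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
SizeF sensorSize = chars.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);    // mm
float f = chars.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)[0];  // mm
// FOV = 2 * atan(sensorSize / (2 * focalLength)), per axis
double fovX = Math.toDegrees(2 * Math.atan(sensorSize.getWidth() / (2 * f)));
double fovY = Math.toDegrees(2 * Math.atan(sensorSize.getHeight() / (2 * f)));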
Beyond that, it looks like the example you linked just uses the old camera API to fetch that FOV information, then closes the camera and uses the Vision API to actually drive the camera. So you'd have to look at the Vision API docs to see how you can give it camera input instead of having it drive everything. Or you could use the camera API's built-in face detector, which on many devices gives you eye locations as well (see the sketch below).
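If you try the built-in face detector route, a minimal sketch with the old API might look like this (assuming camera is open and the preview has been started; eye landmarks are optional, so leftEye/rightEye may be null on some devices):
if (camera.getParameters().getMaxNumDetectedFaces() > 0) {
    camera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
        @Override
        public void onFaceDetection(Camera.Face[] faces, Camera camera) {
            for (Camera.Face face : faces) {
                // face.rect uses the (-1000,-1000)..(1000,1000) preview coordinate space
                if (face.leftEye != null && face.rightEye != null) {
                    Log.d("FaceDistance", "eyes: " + face.leftEye + " " + face.rightEye);
                }
            }
        }
    });
    camera.startFaceDetection();  // must be called after startPreview()
}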

Android Pie AOSP default Camera orientation issue: camera sensor rotated 90 degrees right and display in reverse landscape mode

I am using the (Android Pie) AOSP Camera2 application (packages/apps/Camera2).
We are using the camera module OV5640. I am able to preview and capture images and video in reverse landscape mode (but mirrored).
However, to fit the mechanical enclosure, the camera sensor module is physically rotated 90 degrees to the right.
In the camera application the preview image comes out rotated 90 degrees to the left and mirrored, and captured images and video show the same behaviour.
Can you please suggest what changes I need to make in the camera application (packages/apps/Camera2) to handle reverse landscape mode with the camera sensor module rotated 90 degrees to the right?
I tried the changes below. With them, the still-capture preview comes out in the same direction as the camera, but the preview size is smaller; after capturing, the image is also saved in the proper direction. The mirror issue is still there.
The video preview, however, still comes out rotated 90 degrees to the left, although the saved video comes out properly after capturing. The mirror issue is still there.
packages/apps/Camera2/src/com/android/camera/app/OrientationManager.java
public static DeviceOrientation from(int degrees) {
    switch (degrees) {
        case 0:
            return CLOCKWISE_90;
        case 90:
            return CLOCKWISE_0;
        case 180:
            return CLOCKWISE_270;
        case 270:
            return CLOCKWISE_180;
        default:
            return CLOCKWISE_90;
    }
}
packages/apps/Camera2/src/com/android/camera/processing/imagebackend/TaskCompressImageToJpeg.java
final DeviceOrientation exifDerivedRotation;
if (exifOrientation == null) {
    // No existing rotation value is assumed to be 0 rotation.
    exifDerivedRotation = DeviceOrientation.CLOCKWISE_90;
} else {
    //exifDerivedRotation = DeviceOrientation.from(exifOrientation);
    exifDerivedRotation = DeviceOrientation.CLOCKWISE_90;
}
// Resulting image will be rotated so that viewers won't
// have to rotate. That's why the resulting image will have 0
// rotation.
resultImage = new TaskImage(
        DeviceOrientation.CLOCKWISE_90, resultSize.getWidth(),
        resultSize.getHeight(),
        ImageFormat.JPEG, null);
// Image rotation is already encoded into the bytes.
src/com/android/camera/TextureViewHelper.java
// This rotation code assumes that the aspect ratio of the content
// (not necessarily of the surface) equals the aspect ratio of the view
// that is receiving the preview. So, a 4:3 surface that contains 16:9
// data will look correct as long as the view is also 16:9.
switch (deviceOrientation) {
    case CLOCKWISE_90:
        transform.setRectToRect(rotatedRect, desiredBounds, Matrix.ScaleToFit.FILL);
        transform.preRotate(270, mWidth / 2, mHeight / 2);
        break;
    case CLOCKWISE_180:
        transform.setRectToRect(normalRect, desiredBounds, Matrix.ScaleToFit.FILL);
        transform.preRotate(180, mWidth / 2, mHeight / 2);
        break;
    case CLOCKWISE_270:
        transform.setRectToRect(rotatedRect, desiredBounds, Matrix.ScaleToFit.FILL);
        transform.preRotate(90, mWidth / 2, mHeight / 2);
        break;
    case CLOCKWISE_0:
    default:
        transform.setRectToRect(normalRect, desiredBounds, Matrix.ScaleToFit.FILL);
        break;
}
Are you returning the correct sensor orientation values from the HAL's camera metadata? Also, there is a general requirement in the Android CDD that the long side of the sensor must line up with the long side of the screen, which most applications expect to be true.
The Android OS also flips images from front-facing camera for preview, but should not do so for still captures or recorded video. If this is a back-facing camera, and you are producing mirrored images from the sensor module, you need to unmirror at the HAL or hardware level.
In general, applications cannot rotate/flip images drawn into a SurfaceView, so there's not much you can do to fix this in the app. And even if you fix it in your app, all third-party apps will still be broken. This may not matter if the device is not intended to run other apps, or to be certified as Android-compatible, of course.
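As a quick sanity check from the application side, something like this sketch can log what the HAL advertises (context is assumed to be in scope):
CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    for (String id : manager.getCameraIdList()) {
        CameraCharacteristics c = manager.getCameraCharacteristics(id);
        Log.d("HalCheck", "camera " + id
                + " SENSOR_ORIENTATION=" + c.get(CameraCharacteristics.SENSOR_ORIENTATION)
                + " LENS_FACING=" + c.get(CameraCharacteristics.LENS_FACING));
    }
} catch (CameraAccessException e) {
    Log.e("HalCheck", "unable to query cameras", e);
}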

Set display resolution limit in Unity

So I released a game a few months ago.
I ran a lot of tests on the devices I had at home (Galaxy Note 2, Galaxy Tab Pro, Wiko), and the game runs smoothly on them.
But the other day I ran my game on an LG G3, and there were a lot of FPS drops.
I think it's because the game runs at the native display resolution of the screen (2560 x 1440).
Is it possible to create a script that, when it detects a display resolution higher than Full HD (as on the LG G3), displays the game at a lower resolution?
I think that would stop the FPS drops.
Adjust to the same camera resolution on every device: if your game is in portrait mode, use a 720x1280 resolution; if it's in landscape mode, use 960x640. Your game will then look consistent on every device.
Attach this script to your camera and change the targetaspect values:
using UnityEngine;
using System.Collections;

public class CameraResolution : MonoBehaviour {

    void Start () {
        // set the desired aspect ratio (the values in this example are
        // hard-coded for a 9:16 portrait target, but you could make them into
        // public variables instead so you can set them at design time)
        float targetaspect = 720.0f / 1280.0f;

        // determine the game window's current aspect ratio
        float windowaspect = (float)Screen.width / (float)Screen.height;

        // current viewport height should be scaled by this amount
        float scaleheight = windowaspect / targetaspect;

        // obtain camera component so we can modify its viewport
        Camera camera = GetComponent<Camera> ();

        // if scaled height is less than current height, add letterbox
        if (scaleheight < 1.0f) {
            Rect rect = camera.rect;
            rect.width = 1.0f;
            rect.height = scaleheight;
            rect.x = 0;
            rect.y = (1.0f - scaleheight) / 2.0f;
            camera.rect = rect;
        } else { // add pillarbox
            float scalewidth = 1.0f / scaleheight;
            Rect rect = camera.rect;
            rect.width = scalewidth;
            rect.height = 1.0f;
            rect.x = (1.0f - scalewidth) / 2.0f;
            rect.y = 0;
            camera.rect = rect;
        }
    }
}
It's not that easy (with a good-quality result).
Basically, you can use the asset bundle system and keep two copies of your graphics, in SD and HD formats. Unity supports this; it calls them variants. You can find more information about asset bundles here:
https://unity3d.com/learn/tutorials/topics/scripting/assetbundles-and-assetbundle-manager
Detecting the screen resolution is easy: you can use Screen.width and Screen.height.
The Screen class also has a SetResolution method, which might do the trick for you without using the asset bundle system. I have never used it myself, though.
Here is more about the Screen class:
https://docs.unity3d.com/ScriptReference/Screen.html
and the SetResolution method specifically:
https://docs.unity3d.com/ScriptReference/Screen.SetResolution.html
You can use Camera.aspect to get the aspect ratio of the screen as well:
https://docs.unity3d.com/ScriptReference/Camera-aspect.html

Rotate resulting Camera Images when the Activity is fixed to portrait

I am making an application with a built-in camera. The Activity is fixed to portrait orientation, but I want the saved images to be properly right-side up, like so:
Camera camera = getCameraInstance(); //method found on http://developer.android.com/guide/topics/media/camera.html
Camera.Parameters params = camera.getParameters();
params.setRotation(someInteger); //I want to get the proper value for this method
camera.setParameters(params);
Has anyone been able to achieve this?
If you're just trying to rotate the JPEG images you receive from calling takePicture, then setRotation is the right method to use.
Is the question about what value to pass into setRotation? Assuming you want real-world 'up' to be 'up' in the saved JPEG image, the rotation needs to be set based on the current orientation of the camera sensor relative to the world.
You can find out what the relative orientation of the whole device to the world is, and you can find out what the orientation of the camera sensor is relative to the device's 'natural' orientation, and combine the two rotations into the final answer. The math is easy to get wrong, which is why we have it explicitly spelled out in the API documentation for setRotation, reproduced here:
public void onOrientationChanged(int orientation) {
    if (orientation == ORIENTATION_UNKNOWN) return;
    android.hardware.Camera.CameraInfo info =
            new android.hardware.Camera.CameraInfo();
    android.hardware.Camera.getCameraInfo(cameraId, info);
    orientation = (orientation + 45) / 90 * 90;  // round to the nearest multiple of 90
    int rotation = 0;
    if (info.facing == CameraInfo.CAMERA_FACING_FRONT) {
        rotation = (info.orientation - orientation + 360) % 360;
    } else {  // back-facing camera
        rotation = (info.orientation + orientation) % 360;
    }
    mParameters.setRotation(rotation);
}
You'll need to inherit from OrientationEventListener and implement the above as its callback method. Of course, you should check that your camera is open and that mParameters, etc., are valid before updating the parameters.
Please note that this only rotates the JPEGs that are sent out by the camera. If you see that your preview is not correctly oriented in your UI, you need to call setDisplayOrientation for that. The camera sensor is normally lined up with the landscape orientation of the device, so landscape camera apps can often get away without calling this function, even though they should in case they're on an unusual Android device. However, if you're writing a portrait app, it's likely mandatory that you adjust the display orientation to align with your UI. As with setRotation, you need to take a few factors into account, and sample code for doing the math right is included in the documentation (reproduced below).
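For completeness, that documentation sample for setDisplayOrientation is roughly the following:
public static void setCameraDisplayOrientation(Activity activity,
        int cameraId, android.hardware.Camera camera) {
    android.hardware.Camera.CameraInfo info =
            new android.hardware.Camera.CameraInfo();
    android.hardware.Camera.getCameraInfo(cameraId, info);
    int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
    int degrees = 0;
    switch (rotation) {
        case Surface.ROTATION_0: degrees = 0; break;
        case Surface.ROTATION_90: degrees = 90; break;
        case Surface.ROTATION_180: degrees = 180; break;
        case Surface.ROTATION_270: degrees = 270; break;
    }

    int result;
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        result = (info.orientation + degrees) % 360;
        result = (360 - result) % 360;  // compensate for the mirror
    } else {  // back-facing
        result = (info.orientation - degrees + 360) % 360;
    }
    camera.setDisplayOrientation(result);
}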

Photo rotated in custom camera application

My app uses the camera to take photos. The problem is, the photo is rotated by 90 degrees. The app is designed to run in portrait orientation, and I have set
android:configChanges="orientation|screenSize"
to avoid orientation changes. I thought I managed to fix it with
parameters.setRotation(90);
but it turns out that this varies across devices (tested on a Lenovo ThinkPad tablet and a couple of smartphones). I tried reading the photo's EXIF data, but the orientation is not included there. I know there are many similar posts, but most of them concern the default camera app. Could someone explain what causes this problem and how I can fix it? Thanks in advance.
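(For reference, reading the orientation tag, when a device does write it, looks roughly like this; photoPath is a placeholder and IOException handling is elided:)
// many camera HALs leave this tag unset, matching what's described above
ExifInterface exif = new ExifInterface(photoPath);
int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION,
        ExifInterface.ORIENTATION_UNDEFINED);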
Try this to get the image the way you want:
public static Bitmap createRotatedBitmap(Bitmap bm, float degree) {
    if (degree == 0) {
        return bm;  // nothing to rotate
    }
    Matrix matrix = new Matrix();
    matrix.preRotate(degree);
    return Bitmap.createBitmap(bm, 0, 0, bm.getWidth(),
            bm.getHeight(), matrix, true);
}
bitmap = createRotatedBitmap(bitmap, 90);
Yes, the orientation will not be exactly the same on all devices. It is completely hardware dependent and can vary from device to device. You can't fix it universally; one option is to let the user set the rotation: when your application launches for the first time, get the base rotation angle, save it in the settings, and apply it in your functionality afterwards.
