Inconsistent video rotation when using MediaCodec - android

I have two devices, a Nexus 7 (Android 5) and a Galaxy S3 (4.3). On both devices I recorded a video in portrait mode and saved it with a rotation hint of 90 degrees. This is the correct orientation hint, because when the video is played with the default media player the orientation is fine on both devices. I can even copy the video from the Nexus to the Galaxy and the orientation is still fine when I play the file.
However, when I decode the video using the MediaCodec API I run into problems with the video rotation. When I display the video data I get from MediaCodec, the video on the Nexus is correct, but on the Galaxy S3 it is rotated by 90 degrees. My problem is that I have no clue which parameters differ between the two devices, so I cannot decide when I have to rotate the video (on the Galaxy) and when not (on the Nexus)!
On both devices the display rotation is 0 and the orientation is portrait:
int rotation = windowManager.getDefaultDisplay().getRotation();
int orientation = context.getResources().getConfiguration().orientation;
Is there another parameter I have to take into account?
Were there any API changes between 4.3 and 5 that affect the rotation of the decoded video? For example, is the video rotated automatically now?
I use OpenGL to display the video, in case it matters...
Update: Solution
The problem was in the OpenGL transformation matrix that I get back from MediaCodec. MediaCodec renders onto a SurfaceTexture; on the Nexus the transformation matrix of the SurfaceTexture is updated correctly, but the Galaxy does not do that. When rendering the final output texture I used this matrix. To solve this, I ignore the transformation matrix and rotate the video manually according to the recording hint.
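For anyone hitting the same device quirk, here is a minimal sketch of that fallback in plain Java (no Android dependencies; the column-major matrix layout matches what android.opengl and SurfaceTexture.getTransformMatrix use, but the helper names are my own):

```java
// Fallback for devices whose SurfaceTexture transform matrix is never
// updated: build a column-major 4x4 texture transform that rotates UV
// coordinates by the recording hint (0/90/180/270) around the texture
// center (0.5, 0.5), so the result stays inside [0,1]^2. The layout is
// the same as android.opengl.Matrix, so it can be passed to
// glUniformMatrix4fv in place of the getTransformMatrix() output.
final class TexRotation {
    static float[] uvRotationMatrix(int degrees) {
        double r = Math.toRadians(degrees);
        float c = (float) Math.cos(r);
        float s = (float) Math.sin(r);
        // T(0.5,0.5) * R(degrees) * T(-0.5,-0.5), column-major.
        return new float[] {
             c,  s,  0f, 0f,               // column 0
            -s,  c,  0f, 0f,               // column 1
             0f, 0f, 1f, 0f,               // column 2
             0.5f - 0.5f * c + 0.5f * s,   // column 3: translation that
             0.5f - 0.5f * s - 0.5f * c,   // keeps the rotation centered
             0f, 1f,
        };
    }

    // Apply the matrix to a UV coordinate (useful for CPU-side checks).
    static float[] apply(float[] m, float u, float v) {
        return new float[] {
            m[0] * u + m[4] * v + m[12],
            m[1] * u + m[5] * v + m[13],
        };
    }
}
```

With a 90-degree hint this maps (0,0) to (1,0) and (1,1) to (0,1), i.e. a quarter turn of the texture around its center.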

Android Camera2 capture image skewed

Update: This looks like it's related to this: Image data from Android camera2 API flipped & squished on Galaxy S5 - I consider this a bug, since the Nexus 5/6 works correctly, and it makes no sense to have to obtain the full sensor size and then crop manually to reach the desired aspect ratio; one might as well not use the "supported" output sizes at all!
Problem:
Get the characteristics of a camera using the Camera2 API, and extract the output sizes suitable for MediaCodec.class.
Create a MediaCodec input surface with one of the suitable camera output sizes. Feed the output to a MediaMuxer (or whatever) to inspect the result.
Start camera capture requests using the codec's created surface as the target.
Codec output has the correct size. But the result differs by device:
Nexus 5/6: everything ok on Android 5/6.
Samsung tablet with Android 5.1: for some resolutions, the image is obviously stretched, indicating that the camera output resolution does not match the surface size. Becomes very obvious when starting to rotate the camera - image becomes more and more skewed since it's not aligned with the X/Y axes. For some other resolutions the output is OK. There is no pattern here related to either the size or the aspect ratio.
No problem, one would say. Maybe the surface is not created exactly at the specified width and height, or whatever (even if the output sizes were extracted specifically for a MediaCodec.class target).
So, I created an OpenGL context, generated a texture, a SurfaceTexture for it, set its default buffer size to the camera output size, and created a Surface using the texture. I won't go into the gory details of drawing that to a TextureView or back to the MediaCodec's EGL surface. The result is the same - the Camera2 capture requests outputs a distorted image only for some resolutions.
Digging deeper: calling getTransformMatrix on the SurfaceTexture immediately after updateTexImage - the matrix is always the identity matrix, as expected.
So, the real problem here is that the camera is NOT capturing at the size of the provided target surface. The solution would therefore be to get the actual size the camera is capturing; the rest is pure GL matrix transforms to draw correctly. But - HOW DO I GET THAT?
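I'm not aware of a public API that reports the real capture size, but assuming you have discovered it somehow (for example by probing buffers at the suspect resolution), the GL-side fix reduces to one non-uniform scale. A hypothetical sketch, with class and method names of my own invention:

```java
// Hypothetical correction, assuming the true capture size is known: the
// camera rendered an actualW x actualH image into a requestedW x
// requestedH surface, so it was scaled non-uniformly. Scaling the drawn
// quad by the inverse ratio restores the content's aspect ratio (the
// overflow gets cropped by the viewport).
final class AspectFix {
    static float[] correctionScale(int requestedW, int requestedH,
                                   int actualW, int actualH) {
        float requested = (float) requestedW / requestedH;
        float actual = (float) actualW / actualH;
        if (actual > requested) {
            // content was squeezed horizontally; stretch X back out
            return new float[] { actual / requested, 1f };
        }
        // content was squeezed vertically; stretch Y back out
        return new float[] { 1f, requested / actual };
    }
}
```

For instance, a 4:3 capture squeezed into a 16:9 surface needs a 4/3 stretch on one axis to look right again.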
Note: using the old Camera API, with exactly the same "preview size" and the same surface as the target (either MediaCodec's or the custom one) - ALL IS FINE! But I can't use the old Camera API, since it's deprecated and also seems to have a maximum capture size of 1080p, while the Camera2 API goes beyond that, and I need to support 4K recording.
I encountered a similar issue on the model SM-A7009 with API level 21, a LEGACY camera2 device.
The preview is stretched, and surfaceTexture.setDefaultBufferSize has no effect; the framework overrides these values when the preview starts.
The preview sizes reported by StreamConfigurationMap.getOutputSizes(SurfaceTexture.class) are not all supported.
Only three of them are.
$ adb shell dumpsys media.camera |grep preview-size
preferred-preview-size-for-video: 1920x1080
preview-size: 1440x1080
preview-size-values: 1920x1080,1440x1080,1280x720,1056x864,960x720,880x720,800x480,720x480,640x480,528x432,352x288,320x240,176x144
The system dump lists many preview sizes, but after checking all of them I found that only 1440x1080, 640x480, and 320x240 are supported.
The supported preview sizes all have a 1.33333 aspect ratio, the same ratio as reported by CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE.
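That observation suggests a practical pre-filter: keep only the sizes whose aspect ratio matches the sensor's active array. A sketch in plain Java (the sensor size 4128x3096 used in the check is just an assumed 4:3 example; note the filter is necessary but not sufficient here, since 960x720 also passes yet was unsupported on my device):

```java
// Filter candidate preview sizes down to those matching the aspect ratio
// of CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE. Sizes are
// {width, height} pairs, as Android's Size would carry them.
final class SizeFilter {
    static int[][] matchingAspect(int[][] sizes, int sensorW, int sensorH) {
        float sensorAspect = (float) sensorW / sensorH;
        int[][] tmp = new int[sizes.length][];
        int n = 0;
        for (int[] s : sizes) {
            float aspect = (float) s[0] / s[1];
            if (Math.abs(aspect - sensorAspect) < 0.01f) {
                tmp[n++] = s;
            }
        }
        return java.util.Arrays.copyOf(tmp, n);
    }
}
```

Run against the preview-size-values list from the dump above with a 4:3 sensor, this keeps 1440x1080, 960x720, 640x480, and 320x240; each survivor still needs a runtime check before you rely on it.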
So I think it's a bug in some Samsung devices with the legacy camera2 implementation on API 21.
My workaround is to make these devices use the deprecated Camera API.
Hope this helps anyone who ends up here.
So yes, this is a bug on those Samsung devices.
Generally this happens when you ask for multiple different aspect ratios on output, and the device-specific camera code trips over itself on cropping and scaling all of them correctly. You may be able to avoid it by ensuring all requested sizes have the same aspect ratio.
The resolution is probably actually what you asked for - but it's been incorrectly scaled (you could test this with an ImageReader at the problematic size, where you get an explicit buffer you can poke at.)
We are adding additional testing to the Android compliance tests to try to ensure these kinds of stretched outputs don't continue to happen.

Why is the video rotated by 90 degrees on a PC?

Shooting video with the MediaRecorder class works, and the result looks fine on the phone, but when the video is uploaded to a computer it is rotated by 90 degrees. How can I solve this?
Usually, vendors like Samsung or Sony make their devices record video in landscape mode. If you record video in portrait orientation, the video reviewed afterwards on a laptop will appear rotated by 90 degrees. I faced this issue in my project, and the way I solved it is: detect the orientation using the phone's sensor, then use the setOrientationHint(int degrees) method of the MediaRecorder class to set a suitable orientation.
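A minimal sketch of that calculation, assuming a back-facing camera: the raw angle would come from an OrientationEventListener and the sensor orientation from Camera.CameraInfo; the helper itself is plain Java with names of my own.

```java
// Quantize the raw angle reported by an OrientationEventListener to the
// nearest 90 degrees, then combine it with the camera sensor orientation
// (Camera.CameraInfo.orientation) to get the value to pass to
// MediaRecorder.setOrientationHint() for a back-facing camera.
// (OrientationEventListener.ORIENTATION_UNKNOWN, i.e. -1, is not handled
// here and should be filtered out by the caller.)
final class HintCalc {
    static int quantize(int rawDegrees) {
        return ((rawDegrees + 45) / 90 * 90) % 360;
    }

    static int backCameraHint(int sensorOrientation, int rawDegrees) {
        return (sensorOrientation + quantize(rawDegrees)) % 360;
    }
}
```

So a typical back camera mounted at 90 degrees, held in portrait (raw angle near 0), gets a hint of 90, which is what makes the uploaded file play upright on a computer.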

Use FFmpeg to rotate the video based on its <rotate> metadata? Why does Android put the wrong metadata?

I have a website where users upload videos.
I also have an Android application for the website.
The website creates a thumbnail of each uploaded video (from the browser or from Android).
The problem is that for normal videos everything is OK, but the videos from Android are rotated by 90°.
I think Android has a bug, because when I inspect the video's metadata with FFmpeg, a normally recorded video has a rotate=90 value, while a video recorded rotated by 90° has nothing in its metadata (and its thumbnail is correct).
Why?
This is an image of a normal recorded video (with the phone in portrait mode).
Anyway, since the metadata is in the video, can I have FFmpeg rotate the video based on the metadata's rotate value and create the thumbnail from that, without extracting the metadata myself first?
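Recent FFmpeg builds apply the rotate metadata automatically when re-encoding (this can be disabled with -noautorotate); for older builds you can map the tag to a transpose filter yourself. A small sketch of that mapping (the class name is mine):

```java
// Map the container's rotate metadata value to an ffmpeg -vf filter
// string. transpose=1 rotates 90 degrees clockwise, transpose=2 rotates
// 90 degrees counterclockwise; 180 degrees is two quarter turns.
final class RotateFilter {
    static String forRotateTag(int rotate) {
        switch (((rotate % 360) + 360) % 360) {
            case 90:  return "transpose=1";
            case 180: return "transpose=1,transpose=1";
            case 270: return "transpose=2";
            default:  return null; // no rotation needed
        }
    }
}
```

The resulting string would be passed as e.g. `-vf transpose=1`, and you'd also want to clear the rotate tag on the output (something like `-metadata:s:v:0 rotate=0`) so players don't rotate it a second time.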
Thank you in advance.
On Android, holding your phone in portrait is considered to be 90 degrees. This is the convention that Android uses:
landscape: 0
portrait: 90
reverse landscape: 180
reverse portrait: 270
I know this doesn't make sense for phones, but it does for tablets, and there is really no difference between tablets and phones on Android.
So the metadata in the file is correct. The actual problem is that your browser ignores it. See my question here.

Android Galaxy S4 Preview with Pink and Green lines

Hi all.
I am making a video recorder which works fine on every device except the Samsung GS4.
As you know, GS4 has full HD resolution which is 1920 * 1280.
When I take a video with back camera, I set recorder.setVideoSize(1920, 1280) and it records correctly.
However, if I take a video with the front cam, it shows the view below.
I mean, while recording I can see the recording screen.
However, after recording is done and I play the video file, it is corrupted like this.
I think it has something to do with the setVideoSize() call.
I tried every possible pair of width and height for the resolution, but to no avail.
GS4 spec says
Front camera: 2 megapixels, 1080p HD video recording @ 30fps, back-illuminated sensor
but I don't know what video size I have to set it to.
Can anyone give me a clue?
Thanks in advance!
The front facing camera records at a default of 1920x1080 (1080p). Though because of the orientation, it may be 1080x1920.

setOrientationHint rotates video counterclockwise on some phones' front facing cameras (HTC)

The Problem: Certain Android devices (listed at the bottom of the question) exhibit unexpected behavior when utilizing the setOrientationHint(int degrees) function for videos taken with the front facing camera. The expected behavior is for the video to be rotated clockwise, but these devices rotate the video counterclockwise.
My Goal: To identify a variable within either the camera settings or hardware orientations that allows me to predictably know when this will occur. Specifically, I would like to avoid special casing these phones in my code!
Further Explanation: I am recording video using the standard MediaRecorder object, and in preparing for recording, I set the orientation of the video using setOrientationHint(). In the documentation for setOrientationHint(), the following is specified for the degrees parameter:
degrees –– the angle to be rotated clockwise in degrees. The supported angles are
0, 90, 180, and 270.
The function is intended to add a composition matrix containing the rotation angle so that a video player can display the video as intended. So what I do is get the camera hardware's orientation using the CameraInfo class and use that as the degrees parameter of the setOrientationHint function. (I have tried variations on this code using the AOSP as a guide, but I had the exact same result.)
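For reference, a plain-Java version of that computation, following the snippet in the Camera.Parameters.setRotation documentation (which is roughly what "using the AOSP as a guide" amounts to); on the affected HTC phones this still yields 270 for the front camera, and 270 is exactly the value they mishandle:

```java
// Compute the setOrientationHint() value from the sensor orientation
// (Camera.CameraInfo.orientation) and the quantized device orientation
// (0/90/180/270 from an OrientationEventListener). The front-facing
// formula runs the other way because the sensor image is mirrored.
final class OrientationHint {
    static int compute(boolean frontFacing, int sensorOrientation,
                       int deviceDegrees) {
        if (frontFacing) {
            return (sensorOrientation - deviceDegrees + 360) % 360;
        }
        return (sensorOrientation + deviceDegrees) % 360;
    }
}
```

With the Galaxy S3's front camera (sensor at 270) held in portrait this returns 270, matching the value the question says plays correctly on most devices but upside down on the HTC models.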
A Real Example: The Samsung Galaxy S3 front-facing camera (and most others, in fact) will have a hardware orientation of 270, so I use this when recording, and the resulting video is displayed correctly. An HTC Vivid will similarly return a hardware orientation of 270 for the same camera, but will only be displayed correctly if I use 90 as the degrees parameter in the setOrientationHint fn. If I use the 270 of the hardware orientation like any other phone, the video will be upside down.
NB: The setOrientationHint() function includes a warning:
Note that some video players may choose to ignore the composition
matrix in a video during playback.
However, this is not what is happening, because I can easily fix this issue on these phones if I fake it and put in 90 instead of 270.
Phones that I have seen specifically exhibit this issue: HTC Vivid (PH39100) running Android 4.0.3, HTC EVO 4G (PG86100) running Android 4.0.3, HTC Thunderbolt (ADR6400L) running Android 2.3.4. Notice that they are all HTC phones. Perhaps someone at HTC mistook clockwise for counterclockwise.
Yes, the HTC phones rotate in the wrong direction for the front facing cameras. Instead of trying to guess, I ended up adding a settings screen that would take two pictures with the second one rotating 90 degrees. Then the user could keep hitting next as I cycled through the different rotation direction and angle combinations until both pictures appeared oriented the same way.
