MediaCodec.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING constantly jumping - android

I have this problem which honestly I don't know how to solve. I'm using a SurfaceView to render content. The video is a letterbox-type video where the margins need to be chopped off in order to make it fill the screen and keep the correct aspect ratio.
I have tried 3 different things so far:
1. Put the SurfaceView in the center of an over-sized FrameView (this worked, but on some devices it caused video artefacts).
2. Use a TextureView and a matrix to scale. This works, but on some devices I get a green screen or very poor performance; besides, Google suggests that using a SurfaceView is the better approach.
3. My preferred (and probably most correct) approach: use a SurfaceView and set the media codec's scaling mode to VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING, something like this:
@Override
public void onOutputFormatChanged(@NonNull MediaCodec mediaCodec, @NonNull MediaFormat mediaFormat) {
    android.util.Log.d(TAG, "Output format changed!");
    mediaCodec.setVideoScalingMode(MediaCodec.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING);
}
This does work as expected; however, on some devices (which I can't reproduce the issue on with any of my test devices), this happens:
https://www.youtube.com/watch?v=FFOafPVgADQ
https://www.youtube.com/watch?v=38wW_AeNnTg
The MediaCodec is created like:
MediaCodec.createDecoderByType("video/avc");
I have tried using the following, which seems to fix the problem but has very poor performance (as decoding is done by the CPU rather than the GPU):
MediaCodec.createByCodecName("OMX.google.h264.decoder");
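For reference, rather than hard-coding the software decoder's name, the available AVC decoders can be listed through MediaCodecList. This is only a minimal sketch (it assumes API 21+ and reuses the TAG constant from the snippet above) to help see which decoder is actually being picked:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

static void logAvcDecoders() {
    MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : codecList.getCodecInfos()) {
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                // Hardware decoders are usually vendor-prefixed (e.g. OMX.qcom...),
                // while the software fallback is the OMX.google one.
                android.util.Log.d(TAG, "AVC decoder: " + info.getName());
            }
        }
    }
}

A specific decoder from that list can then be created with MediaCodec.createByCodecName(name).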
Any idea why that jumping is happening? Am I right to think that it's a broken video driver and that I probably won't be able to fix it?

Related

Unity VideoPlayer not rendering Video correctly to Texture

I'm trying to use a VideoPlayer component, with a URL source and a RenderTexture as the target, to show a video in my Unity mobile game. The video is loaded and starts playing, however the resulting texture is only 1 color. The color does change every frame to something matching what the video would look like that frame, but it's just the 1 color. Audio is working fine. On the VideoPlayer component, the Aspect Ratio is set to "Fit Inside", but I have tried all options here with the same result. As for the RenderTexture, it's set to the same resolution as the input video, and the Color Format is set to RGB565 (which both Android and iOS should support according to SystemInfo.SupportsRenderTextureFormat()). I'm all out of ideas, any help would be appreciated.
EDIT: A workaround could be using "material override" instead of rendering to a texture. This doesn't work though if you want to use the texture specifically instead of only showing the video on a material, plus the fact that Material Override doesn't support objects with multiple renderers/materials. Not really a fix, but a workaround for those who find this question before a solution has been found.
I just had this myself and fixed it.
On the Raw Image, find the UV Rect and set its W and H to 1. I had changed those values, which made it sample only 1 pixel.

MediaCodec getWidth/HeightAlignment Invalid

I'm using a MediaCodec to encode some video frames to H264. Everything seems OK on most devices, but on some devices my output gets a lot of "diagonal" noise (it looks like an offset/alignment issue).
The output resolution of the MediaCodec is calculated using the functions getWidthAlignment and getHeightAlignment. The documentation states that the values returned by those functions are powers of two that the codec output width and height need to be a multiple of. Both functions return 2 on the devices that do not work properly.
So, the device I am testing on (that does not work properly) has a native resolution of 1280x800. If I request that the output of the MediaCodec be 1152x720 (keeping the same aspect ratio), everything works. If I request 1224x720 (keeping the aspect ratio of the screen with the nav bar showing, so not the same aspect ratio as the native resolution), I get the artifacts. Both 1224 and 720 are multiples of the width/height alignments that the MediaCodec's VideoCapabilities returns, yet using 1224x720 causes the "diagonal lines."
So, my question is, what is the best way to get a supported output resolution for a MediaCodec? Using getWidth/HeightAlignment doesn't seem to behave properly on a lot of lower end devices (perhaps the devices are not implementing those functions correctly?).
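For reference, this is roughly how the alignment values discussed above are queried; a minimal sketch only (the encoder setup and the way the requested size is rounded are just illustrative):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import java.io.IOException;

static boolean checkSize(int requestedWidth, int requestedHeight) throws IOException {
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    MediaCodecInfo.VideoCapabilities caps = encoder.getCodecInfo()
            .getCapabilitiesForType("video/avc")
            .getVideoCapabilities();

    int wAlign = caps.getWidthAlignment();   // returns 2 on the problem devices
    int hAlign = caps.getHeightAlignment();

    // Round down to the reported alignment, then double-check the size is
    // actually supported instead of trusting the alignment values alone.
    int width = (requestedWidth / wAlign) * wAlign;
    int height = (requestedHeight / hAlign) * hAlign;
    boolean supported = caps.isSizeSupported(width, height);

    encoder.release();
    return supported;
}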

Android RenderScript to blur video in VideoView

I know that I can use RenderScript on Android to blur images, but does anybody know if I can apply the same to video views so that my complete video is gaussian blurred?
VideoView, which extends SurfaceView, does not utilize the drawing cache due to being hardware accelerated. This means you won't be able to get stills. I was forced to scrap the design I had using the paused video still.
Check out: VideoView getDrawingCache is returning black
Edit: As I look into this more, there might be a way through https://github.com/google/grafika, but I haven't seen anyone verify it as a performant workaround.
You could show a thumbnail of the video in front of it. You can then blur the image using this lib: https://github.com/jrvansuita/GaussianBlur
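If pulling in a library is not an option, the platform's ScriptIntrinsicBlur can blur that thumbnail as well; a minimal sketch, assuming you already have the frame or thumbnail as an ARGB_8888 Bitmap:

import android.content.Context;
import android.graphics.Bitmap;
import android.renderscript.Allocation;
import android.renderscript.Element;
import android.renderscript.RenderScript;
import android.renderscript.ScriptIntrinsicBlur;

static Bitmap blur(Context context, Bitmap src, float radius) {
    Bitmap out = Bitmap.createBitmap(src.getWidth(), src.getHeight(), Bitmap.Config.ARGB_8888);
    RenderScript rs = RenderScript.create(context);
    Allocation input = Allocation.createFromBitmap(rs, src);
    Allocation output = Allocation.createFromBitmap(rs, out);
    ScriptIntrinsicBlur script = ScriptIntrinsicBlur.create(rs, Element.U8_4(rs));
    script.setRadius(radius);   // valid range is (0, 25]
    script.setInput(input);
    script.forEach(output);
    output.copyTo(out);
    rs.destroy();
    return out;
}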

Record video with a different preview size than the resulting video file

I am attempting to allow users to record video that is a different size than the actual on-screen preview that they can see while recording. This seems to be possible according to the documentation for the getSupportedVideoSizes function, which states:
If the returned list is not null, the returned list will contain at least one Size and one of the sizes in the returned list must be passed to MediaRecorder.setVideoSize() for camcorder application if camera is used as the video source. In this case, the size of the preview can be different from the resolution of the recorded video during video recording.
This suggests that some phones will return null from this function (in my experience the Galaxy SIII does), but for those that do not, it is possible to provide a preview with a different resolution than the actual video. Is this understanding correct? Do some phones allow this behavior and others not?
Attempting a Solution:
In the official description of the setPreviewDisplay function, which is used in the lengthy process of setting up for video recording, it is mentioned that:
If this method is called with null surface or not called at all, media recorder will not change the preview surface of the camera.
This seems to be what I want, but unfortunately if I do this, the whole video recording process is completely messed up. I am assuming that this function cannot be passed null or left uncalled in the process of recording video. Perhaps in other contexts that is okay; unfortunately, it does not seem to help me here.
My only next steps are to look into TextureViews and use a preview texture as opposed to a typical SurfaceView implementation, in order to use OpenGL to stretch the texture to my desired size that differs from the actual resolution (and crop any excess off the screen), and then to construct a Surface for the setPreviewDisplay function with the Surface(SurfaceTexture surfaceTexture) constructor. I would like to avoid using a TextureView due to incompatibility below ICS, and also because it adds significant complexity.
This seems like a delicate process, but I am hoping someone can offer some advice in this area.
Thank you.
a. Assume the user sets x,y as the video size.
b. Now, using the getSupportedVideoSizes function, get the entire list and check whether x,y falls within it; if it does, pass it to MediaRecorder.setVideoSize(). If x,y does not appear in the getSupportedVideoSizes list, then fall back to the default profile for the video recording (a rough sketch follows below).
That covers the video size.
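A rough sketch of that check, assuming the legacy android.hardware.Camera API used in this question and that the recorder's audio/video sources have already been set:

import android.hardware.Camera;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import java.util.List;

static void configureVideoSize(Camera camera, MediaRecorder recorder, int x, int y) {
    List<Camera.Size> sizes = camera.getParameters().getSupportedVideoSizes();
    boolean supported = false;
    if (sizes != null) {                    // may be null on some devices (e.g. Galaxy SIII)
        for (Camera.Size s : sizes) {
            if (s.width == x && s.height == y) {
                supported = true;
                break;
            }
        }
    }
    if (supported) {
        recorder.setVideoSize(x, y);
    } else {
        // Fall back to a predefined profile with a known-good default size.
        // Note that setProfile also configures the output format and encoders.
        recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    }
}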
Now coming to the preview size, there are not many workaround options.
Take a RelativeLayout which holds the SurfaceView.
<android.view.SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/preview"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent" />
preview is the name of the SurfaceView.
Here I have given a sample of resizing it to half of the width and height.
resetCamera(); // reset the camera
ViewGroup.LayoutParams params = preview.getLayoutParams();
RelativeLayout myRelLayout = (RelativeLayout) findViewById(R.id.myRelLayout);
params.width = (int) (myRelLayout.getWidth() / 2);
params.height = (int) (myRelLayout.getHeight() / 2);
preview.setLayoutParams(params);
initCamera(); // initiate the camera (open camera, set parameters, setPreviewDisplay, startPreview)
Please look at the resolution of the preview and then scale down the height or width accordingly, based on the video size.
Hope it helps.
As you mention, this is only possible when getSupportedVideoSizes() returns a non-null list.
But if you do see a non-null list, then this simple approach should work:
1. Set the desired preview resolution with setPreviewSize; the size you select has to be one of the sizes given by getSupportedPreviewSizes.
2. Set the preview display to your SurfaceView or SurfaceTexture with setPreviewDisplay or setPreviewTexture, respectively.
3. Start preview.
4. Create the media recorder, and set its video size either directly with setVideoSize using one of the sizes from getSupportedVideoSizes, or use one of the predefined Camcorder profiles to configure all the media recorder settings for a given quality/size.
5. Pass the camera object to MediaRecorder's setCamera call, configure the rest of the media recorder, and start recording.
On devices with a non-null getSupportedVideoSizes list, this should result in preview staying at the resolution set by your setPreviewSize call, with recording operating at the set video size/camcorder profile resolution. On devices with no supported video sizes, the preview size will be reset by the MediaRecorder to match the recording size. You should be able to test this by setting a very low preview resolution and a high recording resolution (say, 160x120 for preview, 720p for recording). It should be obvious if the MediaRecorder switches the preview resolution to 720p when recording starts, as the preview quality will jump substantially.
Note that the preview size is not directly linked to the dimensions of the display SurfaceView; the output of the camera preview will be scaled to fit into the SurfaceView, so if your SurfaceView's dimensions are, say, 100x100 pixels due to your layout and device, whatever preview resolution you use will be scaled to 100x100 for display. You still need to make sure the SurfaceView's aspect ratio is correct so that the preview is not distorted.
And for power efficiency, you should not use a preview resolution much higher than the actual number of pixels in your SurfaceView, since the additional resolution will be lost when fitting the preview into the SurfaceView. This is of course only possible for recording when getSupportedVideoSizes() returns a non-null value.
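Putting those steps together, a condensed sketch using the legacy android.hardware.Camera API (the sizes, the 720p profile, and the surface holder/output path parameters are placeholders, and most error handling is omitted):

import android.hardware.Camera;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.view.SurfaceHolder;
import java.io.IOException;

static MediaRecorder startRecording(SurfaceHolder holder, String outputPath) throws IOException {
    Camera camera = Camera.open();
    Camera.Parameters params = camera.getParameters();
    params.setPreviewSize(640, 480);       // pick one of getSupportedPreviewSizes()
    camera.setParameters(params);
    camera.setPreviewDisplay(holder);      // the SurfaceView's holder
    camera.startPreview();

    camera.unlock();                       // hand the camera over to MediaRecorder
    MediaRecorder recorder = new MediaRecorder();
    recorder.setCamera(camera);
    recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
    recorder.setOutputFile(outputPath);
    recorder.prepare();
    recorder.start();                      // on capable devices the preview stays at 640x480
    return recorder;
}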
First, I will try to answer your specific questions.
it is possible to provide a preview with a different resolution than the actual video. Is this understanding correct?
Yes, the preview size is usually different from the recording size, and it is typically tied to your display size. So if a phone has a CIF display (352 x 288) but is capable of recording D1 (720 x 480), then the preview size and the recording size will be different. I feel that the other experts have answered this point sufficiently.
Do some phones allow the behavior and others not?
Most of the latest phones support this feature except maybe a few low-end ones.
Along with setPreviewDisplay, we have to consider this point also:
The one exception is that if the preview surface is not set (or set to null) before startPreview() is called, then this method may be called once with a non-null parameter to set the preview surface. (This allows camera setup and surface creation to happen in parallel, saving time.) The preview surface may not otherwise change while preview is running.
Could you please share the issue you face when setPreviewDisplay is invoked with a null surface?

How is the camera preview connected with the final image output?

I've always been under the impression that the preview and the final output are not connected in any way, meaning that I can set the preview to some arbitrary dimensions and the final JPG will be whatever resolution I set in the params. But I just ran into a very odd situation where the image data coming back in the byte[] from the JPG callback is different depending on what dimensions I set my preview to.
Can someone enlighten me on what actual relationship the preview has on the final JPG? (or point me to documentation on said relationship).
TIA
[Edit]
As per ravi's answer, this was my assumption as well; however, based on the evidence I see no alternative but to surmise that they are, in fact, directly connected. I'll post code if necessary (though there's a lot of it), but here's what I'm doing.
I have a preview screen where the user takes a photo of themselves. I then display the picture captured (from the jpg callback bitmap data) in a subsequent draw view and allow them to trace a shape over their photo. I then pass the points of their polygon into a class that cuts that shape out of the original image, and gives back the cut image.
All of this works, BUT depending on how I present the PREVIEW, the polygon-cutting class crashes with an array index out of bounds as it tries to access pixels on the final image that simply don't exist. This effect is produced EXCLUSIVELY by altering the dimensions of the preview View. I'm not altering ANYTHING else in the code, and yet, just by mis-shaping my preview view, I can reproduce this error 100% of the time.
I can't see any explanation other than that the preview and the final image are directly connected somehow, since I never operate on the preview's data; I only display it in a SurfaceView and then deal exclusively with the data from the JPG callback after the user has taken their photo.
There is no relation between the preview resolution and the final image that is captured.
They are completely independent (at least for still image capture). The preview resolution and aspect ratio are not related to the final image resolution and aspect ratio in any way.
In the camera application that I have written, the preview is always VGA, but the image I capture varies from 5M to VGA (depending on the device capability).
Perhaps if you can explain the situation it would be more helpful.
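For illustration, with the legacy Camera API the two sizes are configured through separate parameters, so in principle they can be chosen independently (each from its own supported list); a minimal sketch:

import android.hardware.Camera;

static void configureSizes(Camera camera) {
    Camera.Parameters params = camera.getParameters();
    // Preview and still-capture resolutions are separate settings, each
    // constrained only by its own supported list on the device.
    Camera.Size preview = params.getSupportedPreviewSizes().get(0);
    Camera.Size picture = params.getSupportedPictureSizes().get(0);
    params.setPreviewSize(preview.width, preview.height);
    params.setPictureSize(picture.width, picture.height);
    camera.setParameters(params);
}

That said, as the next answer points out, some devices still couple the two in practice.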
We are currently developing a camera application and face very similar problems. In our case, we want to display a 16:9 preview while capturing a 4:3 picture. On most devices this works without any problems, but on some (e.g. Galaxy Nexus, LG Optimus 3D) the output picture depends on the preview you've chosen. In our case the resulting pictures on those devices are distorted when the preview ratio is different from the picture ratio.
We tried to fix this by changing the preview resolution to a better one just before capturing the image, but this does not work on some devices and causes an error when starting the preview again after capturing is finished.
We also tried to fix this by enlarging the SurfaceView to fullscreen width and beyond-fullscreen height to make a 16:9 preview out of a 4:3 preview, but this does not work because SurfaceViews cannot be taller than the screen height.
So there IS a connection on SOME devices, and we really want to know how to fix or work around this.
