Camera2 video recording with custom resolutions? - android

Currently, I am following Google's sample code in Kotlin for the Camera2 API. Everything seems to work fine for video recording. However, I have different requirements for my project, as listed below.
I need to record video in one of three shapes: 640 x 640 (square), Y x 640 (portrait), or 640 x Y (landscape), all while the screen is in portrait orientation, where Y is a number less than 640.
640 x 640 (square):
I have a Samsung S9+, which supports only one resolution with a 1:1 aspect ratio, 384x384. Yet when I post to Instagram, they produce a 720 x 720 video with good quality. So the question is: how does Instagram enlarge a low-resolution video without losing quality?
W? x 640 (portrait):
I need to find an equal or higher resolution with the closest matching aspect ratio, and later I can run an FFmpeg command to crop or scale to the required size, right?
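The "closest matching aspect ratio" selection can be sketched as below. This is a minimal illustration, not the Camera2 API itself: the `Size` class stands in for `android.util.Size`, and the candidate list stands in for what `StreamConfigurationMap.getOutputSizes()` would return on a real device.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class ClosestSize {
    public static final class Size {
        public final int width, height;
        public Size(int w, int h) { width = w; height = h; }
        double ratio() { return (double) width / height; }
        @Override public String toString() { return width + "x" + height; }
    }

    // Pick the supported size whose aspect ratio is closest to the target.
    public static Size bestMatch(List<Size> supported, int targetW, int targetH) {
        double targetRatio = (double) targetW / targetH;
        return supported.stream()
                .min(Comparator.comparingDouble((Size s) -> {
                    double score = Math.abs(s.ratio() - targetRatio);
                    // Penalise sizes smaller than the target: scaling down
                    // later (e.g. with FFmpeg) loses less quality than scaling up.
                    if (s.width < targetW || s.height < targetH) score += 10.0;
                    return score;
                }))
                .orElseThrow(IllegalArgumentException::new);
    }

    public static void main(String[] args) {
        List<Size> supported = Arrays.asList(
                new Size(1920, 1080), new Size(1280, 720),
                new Size(720, 480), new Size(640, 480));
        System.out.println(bestMatch(supported, 640, 480)); // 640x480
    }
}
```

Note that camera sensor sizes are reported in landscape, so for a portrait target you would match against the rotated (landscape) dimensions.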
640 x H? (landscape):
I can follow the same approach as in the portrait case. However, the real question is: how do you record landscape video when the screen is in portrait orientation?
I have already researched each use case extensively and am now open to any possible solution: FFmpeg, OpenGL, MediaMuxer, MediaCodec, or anything else.
Any hint, link, or suggestion would be highly appreciated. Thanks in advance.

640 x 640 (square): Instagram is likely capturing video at 720p (1280x720) and then cropping it to 720x720 in their own code.
Generally, a camera has only a few resolutions available, and they all tend to be landscape. If you need portrait resolutions (or landscape resolutions in portrait orientation), you will probably need to do your own cropping.
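The center-crop math that the answer describes (capture at a supported landscape size, then cut out the largest centered region with the desired aspect ratio) can be sketched like this; the `{left, top, right, bottom}` array is a stand-in for `android.graphics.Rect`:

```java
public class CenterCrop {
    // Largest centered w x h inside srcW x srcH with w/h == aspectW/aspectH.
    public static int[] cropRect(int srcW, int srcH, int aspectW, int aspectH) {
        int w = srcW, h = srcW * aspectH / aspectW;
        if (h > srcH) {            // too tall: constrain by height instead
            h = srcH;
            w = srcH * aspectW / aspectH;
        }
        int left = (srcW - w) / 2, top = (srcH - h) / 2;
        return new int[] { left, top, left + w, top + h };
    }

    public static void main(String[] args) {
        // 1280x720 capture cropped to 1:1 -> the middle 720x720
        int[] r = cropRect(1280, 720, 1, 1); // [280, 0, 1000, 720]
        System.out.println(java.util.Arrays.toString(r));
    }
}
```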

Related

What is the most common or average aspect ratio for mobile devices?

I know aspect ratios can be defined by the market in question, e.g. the average aspect ratio in India may differ from that of America. But what would you say is the most common and/or average aspect ratio of mobile devices, universally speaking?
Does anyone have any stats on mobile devices in different continents?
I'm currently attempting to create the most dynamic media that will present well on as many devices as possible.
EDIT: Media format must be in .jpeg
Thanks in advance,
I found that every Android device had one of the following aspect ratios (from most square to most rectangular):
4:3
3:2
8:5
5:3
16:9
And if you consider portrait devices separate from landscape devices you'll also find the inverse of those ratios (3:4, 2:3, 5:8, 3:5, and 9:16)
According to AppBrain, the Samsung Galaxy S7 is the most popular device in the United States, followed by the S5, then the S6.
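The ratios listed above can be derived from any resolution by reducing width and height with their greatest common divisor; a small sketch:

```java
public class AspectRatio {
    static int gcd(int a, int b) { return b == 0 ? a : gcd(b, a % b); }

    // Reduce a resolution to its simplest aspect ratio, e.g. 1920x1080 -> "16:9".
    public static String ratioOf(int w, int h) {
        int g = gcd(w, h);
        return (w / g) + ":" + (h / g);
    }

    public static void main(String[] args) {
        System.out.println(ratioOf(1920, 1080)); // 16:9
        System.out.println(ratioOf(1280, 800));  // 8:5
        System.out.println(ratioOf(960, 640));   // 3:2
    }
}
```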

How to Specify a Viewport on Android Camera API

1) The Camera previews at 1920 x 1080
2) I record at 960 x 540
3) I want to be able to specify what portion of the 1920 x 1080 preview should be saved into the video and change this on-the-fly.
In effect this would give me the ability to do digital zooming as well as digital panning of the Camera. What APIs, code-samples could help me out here?
I've looked at the Camera2 API and samples. Looks like you can only set one viewport for the device, not per output.
You'll have to implement this zooming yourself; the camera API produces the same field of view on all of its outputs, regardless of the resolution of each output (though it does crop different aspect ratios differently, to avoid stretching). The camera2 SCALER_CROP_REGION (used for digital zoom) will zoom/pan all outputs equally.
The simplest way to do this is probably to send the 1080p output to the GPU, and from the GPU, render to the screen with the full FOV, and render to a media recorder with just the region of the image you want to record.
It's not terribly straightforward, since you'll need to write quite a bit of OpenGL code to accomplish this.
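The zoom/pan viewport math behind that GPU approach can be sketched independently of the OpenGL code. This is only an illustration of the region selection (the rectangle you would sample when rendering to the recorder); the frame size, zoom factor, and pan center are example values:

```java
public class DigitalZoom {
    // Region {left, top, right, bottom} of the full frame to record, for a
    // given zoom factor (>= 1.0) and a pan center in normalized [0,1] coords.
    public static int[] viewport(int frameW, int frameH, double zoom,
                                 double centerX, double centerY) {
        int w = (int) Math.round(frameW / zoom);
        int h = (int) Math.round(frameH / zoom);
        int left = (int) Math.round(centerX * frameW - w / 2.0);
        int top  = (int) Math.round(centerY * frameH - h / 2.0);
        // Clamp so the viewport never leaves the frame.
        left = Math.max(0, Math.min(left, frameW - w));
        top  = Math.max(0, Math.min(top,  frameH - h));
        return new int[] { left, top, left + w, top + h };
    }

    public static void main(String[] args) {
        // 2x zoom, centered -> the middle 960x540 of a 1080p frame
        System.out.println(java.util.Arrays.toString(
                viewport(1920, 1080, 2.0, 0.5, 0.5))); // [480, 270, 1440, 810]
    }
}
```

Changing `centerX`/`centerY` per frame gives the on-the-fly panning the question asks about; the same rectangle (scaled to sensor coordinates) is also the shape you would pass to `SCALER_CROP_REGION`, with the caveat from the answer that `SCALER_CROP_REGION` affects all outputs equally.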

How to compute the max supported frame rate for the camera for given resolution

According to this doc: https://developer.android.com/reference/android/media/MediaRecorder.html#setCaptureRate%28double%29:
"For resolutions that can be captured by the video camera, the fastest
fps can be computed using getPreviewFpsRange(int[])"
but I cannot find any example about it.
I've tried to:
Camera.Parameters p = cam.getParameters();
List<Camera.Size> sizes = p.getSupportedPreviewSizes();
p.setPreviewSize(sizes.get(0).width, sizes.get(0).height);
cam.setParameters(p);
i.e. setting the preview size to all kinds of resolutions, but calling getPreviewFpsRange(int[]) after that always returns the same result, e.g. 5000 - 30000 on a Nexus 4, or 5000 - 60000 on an Acer tablet. Does that mean these devices support the maximum fps at all resolutions, or am I missing something?
Does someone know how to compute the max fps per resolution?
getSupportedPreviewSizes is now deprecated along with Camera. However, in AOSP it only ever returned a fixed list of frame rate ranges that were independent of the resolution. Some manufacturers might have overridden this behaviour, but I have never seen it, and it would be undefined in how it related to a particular preview resolution.
On at least one device I get 30 FPS for preview resolution 1280 x 720 and 15 FPS for everything else, including low resolutions such as 320 x 240. I suspect that the graphics drivers are optimised for particular preview resolutions, but the old Camera SDK has no way to share this information even if the driver makes it available.
The Camera2 interface appears to offer a solution in the form of getHighSpeedVideoFpsRangesFor, but I haven't tried it myself.
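For completeness, the ranges the old Camera API returns are in units of fps x 1000 (hence 5000 - 30000 meaning 5-30 fps); extracting the maximum can be sketched as below. The sample ranges are made up for illustration:

```java
public class MaxFps {
    // Camera1 reports fps ranges scaled by 1000, e.g. {5000, 30000} = 5-30 fps.
    // Returns the highest upper bound across all ranges, in whole fps.
    public static int maxFps(int[][] ranges) {
        int max = 0;
        for (int[] r : ranges) max = Math.max(max, r[1]);
        return max / 1000;
    }

    public static void main(String[] args) {
        int[][] ranges = { {5000, 30000}, {15000, 15000}, {30000, 30000} };
        System.out.println(maxFps(ranges)); // 30
    }
}
```

As the answer notes, though, this number is reported independently of the preview resolution, so it cannot tell you the real per-resolution maximum.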

Converting iOS game dimensions for Unity2d android

I recently started working learning Unity2d from an iOS background. I am using the FWVGA Landscape (854x480) dimensions. I resized a 1136x640 background image (which I use for iOS) to 854x480 and made a sprite with the image but it only took a small portion of the screen. I had to resize the image in Unity. What are the general rules for converting dimensions from iOS to Android on Unity to get the dimensions to fit?
1136x640 is the ratio 16:9, as is 1920x1080 (1080p). A quick way to check: 16/9 ≈ 1.78, and 1920/1080, 1136/640, and 854/480 all give (approximately) the same value. All the same 16:9 ratio.
So, for Android phones that are also 16:9 you don't "need" to resize your assets, unless you want to raise the resolution for quality's sake on more powerful Android devices.
If you want to do it quick and easy, then you want to draw your game in a camera at 1136x640 and then scale the camera view to match the resolution of the device you are running on (be it 1920x1080, 854x480, etc., all the same ratio).
You get problems with devices that are not 16:9 like tablets or phones that are say 16:10, 5:3, 3:2, or the iPads and Nexus 9 tablets at 4:3. This is a long subject though and there are lots of guides around to help. Try searching for "Unity 2d multiple screen ratios".
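The scale-to-fit behaviour described above (uniformly scaling the design resolution to the device, letterboxing when the ratios differ) can be sketched as plain math, independent of Unity's API; the design and screen sizes are the examples from the question:

```java
public class FitDesign {
    // Uniformly scale a design resolution to fit inside a device screen,
    // preserving aspect ratio; returns {scaledW, scaledH}. Any leftover
    // screen area is the letterbox.
    public static int[] fit(int designW, int designH, int screenW, int screenH) {
        double scale = Math.min((double) screenW / designW,
                                (double) screenH / designH);
        return new int[] { (int) Math.round(designW * scale),
                           (int) Math.round(designH * scale) };
    }

    public static void main(String[] args) {
        // 854x480 is only approximately 16:9, so the fit is near-exact
        System.out.println(java.util.Arrays.toString(fit(1136, 640, 854, 480)));  // [852, 480]
        // A 4:3 tablet screen: letterboxed top and bottom
        System.out.println(java.util.Arrays.toString(fit(1136, 640, 1024, 768))); // [1024, 577]
    }
}
```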
I think there are just too many variables to make this work "simply".
854x480 doesn't sound like a 'standard' width x height to me, and Unity is a 3D game engine, which means its camera sits on the Z axis when you are "looking down upon" a 2D game.
So there are a lot of variables: the camera's Z position (which Unity apparently tries to 'adjust'), the phone's physical size, the size of the image you are using, and so on. I would recommend https://stackoverflow.com/a/21375938 to see if that helps.

HTC Evo 3D stereoscopic preview - reduced horizontal resolution

I'm writing an Android application which makes some use of stereoscopic image data from camera on HTC Evo 3D. I try to access the data using standard Android API with some 3D-specific functions provided by HTC OpenSense API. So far, I can access the camera in stereoscopic mode and I can grab the image data using onPreviewFrame() callback method.
However, the "raw" image data (the data[] byte array) available in onPreviewFrame() is not complete. The image I get is a correct side-by-side stereoscopic picture, but its horizontal size is reduced by a factor of two. For example, when I set the camera preview size to 1280x720 px, I expect a 2560x720 px image (two images at the desired 1280x720 px resolution). But what I get is a picture of 1280x720 resolution, half of which comes from the right camera and the other half from the left one. I don't know why the horizontal resolution is reduced.
There is a similar thread on this forum, but the answer doesn't really solve the problem. Although the DisplaySetting.setStereoscopic3DFormat() returns true in my program, it doesn't seem to have any effect on display or image data.
Has anyone any experience with this issue?
The resolution halving is by design: the parallax barrier display causes 3D content to have half the horizontal resolution.
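Given that halved side-by-side frame, splitting it into per-eye views is straightforward; a minimal sketch, using `{left, top, right, bottom}` arrays as a stand-in for `android.graphics.Rect`:

```java
public class SideBySide {
    // Split a side-by-side stereo frame into left/right eye regions,
    // each {left, top, right, bottom}. For a 1280x720 frame, each eye
    // ends up 640x720 - half the requested horizontal resolution.
    public static int[][] eyes(int frameW, int frameH) {
        int half = frameW / 2;
        return new int[][] {
            { 0, 0, half, frameH },       // left eye
            { half, 0, frameW, frameH }   // right eye
        };
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.deepToString(eyes(1280, 720)));
    }
}
```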
