How does WebRTC modify my video's resolution? - Android

I am working with WebRTC on Android. My problem is that I cannot send video at a resolution higher than 1280x1280. Even if I set the video resolution to 1920x1080, WebRTC sends at most 1280x1080. I can see this in the StatsReport output.
It gives me these values when I set the video to 1920x1080:
name: googFrameWidthInput, value: 1920
name: googFrameWidthSent, value: 1280
name: googFrameHeightSent, value: 1080
name: googFrameHeightInput, value: 1080
I have three questions here:
1) Does WebRTC support full HD video (1920x1080)?
2) How does it modify my video resolution? Does it just scale it down arbitrarily? As you can see above, it does not preserve my video's aspect ratio; isn't that wrong?
3) As far as I know, WebRTC lowers the video resolution when CPU usage increases or network quality drops. When one of these cases occurs, what will my new video resolution and aspect ratio be? Does it decrease according to a rule?

1) Does WebRTC support full HD video (1920x1080)?
Yes, but both the local camera and the remote peer need to support it; otherwise a lower resolution is chosen.
2) How does it modify my video resolution? Does it just scale it down arbitrarily? As you can see above, it does not preserve my video's aspect ratio; isn't that wrong?
Again, this is decided by the combination of your local camera's capabilities and the resolutions the remote peer advertises as supported.
3) As far as I know, WebRTC lowers the video resolution when CPU usage increases or network quality drops. When one of these cases occurs, what will my new video resolution and aspect ratio be? Does it decrease according to a rule?
Varying the bitrate does not change the resolution. The codec is set to an adjustable bitrate and reacts to the amount of motion in the scene: more motion means a higher bitrate.
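For completeness, the googFrameWidthSent / googFrameHeightSent values in the question come from the legacy stats API. A minimal sketch of reading them on Android, assuming an older google-webrtc build that still exposes StatsReport (the function name is illustrative):

import android.util.Log
import org.webrtc.PeerConnection
import org.webrtc.StatsObserver

// Sketch: log the sent frame size reported by the legacy stats.
fun logSentResolution(peerConnection: PeerConnection) {
    peerConnection.getStats(StatsObserver { reports ->
        for (report in reports) {
            for (value in report.values) {
                if (value.name == "googFrameWidthSent" || value.name == "googFrameHeightSent") {
                    Log.d("Stats", "${value.name} = ${value.value}")
                }
            }
        }
    }, null)
}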

1) A few days ago I explored the native WebRTC code and found some resolution parameters, where the maximum was only HD. But maybe WebRTC can transform the data into a suitable stream.
2) As you can see from the sample, all parameters are passed in through PeerConnectionParameters. I assume you have already tried this; if not, check the sample:
PeerConnectionParameters peerConnectionParameters =
    new PeerConnectionParameters(/* many different parameters, including resolution */);
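Whichever parameters you end up with, the capture resolution is ultimately requested from the capturer. A minimal sketch, assuming the standard org.webrtc Android library and a capturer that has already been created (the function name is illustrative):

import org.webrtc.VideoCapturer

// Sketch: ask the capturer for 1920x1080 at 30 fps. The stack may still
// downscale if the camera or the negotiated encoder cannot handle it.
fun startCaptureFullHd(capturer: VideoCapturer) {
    capturer.startCapture(1920, 1080, 30)
}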

Related

How to get the exact resolutions list for video recording using the camera2 API?

I'm making a custom camera application for Android and recording videos with the camera2 API.
I want to get the list of video resolutions supported by the device, so that I can record according to the supported resolutions and pass the exact size to:
mediaRecorder.setVideoSize(width, height)
I have tried two options to get the list of supported video resolutions.
1. Using StreamConfigurationMap:
val characteristics = cameraManager.getCameraCharacteristics(cameraId)
val map: StreamConfigurationMap? = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
val sizes = map?.getOutputSizes(ImageFormat.YUV_420_888)
But on some devices it returns 4K, which the device does not actually support for video recording.
I have also used map?.getOutputSizes(MediaRecorder::class.java), but it also returns 4K, which the device does not support for video recording.
2. Using CamcorderProfile:
val has4KResolution = CamcorderProfile.hasProfile(deviceId, CamcorderProfile.QUALITY_2160P)
On some devices that do support 4K, this returns false.
Is there any other solution by which I can get the supported video recording resolutions of the camera?
Any other ideas will also be appreciated.
CamcorderProfile lists the sizes the device manufacturer intends to support for video recording, so it's the simplest thing to look at.
The camera may support higher resolutions than that for other use cases, so looking at the sizes reported by the camera itself isn't terribly reliable.
It's likely that other (maybe larger) sizes are available if you query MediaCodecList's video encoders or similar, but it's a bit of work to iterate that list and figure out which encoder types and sizes are supported.
So sticking to what CamcorderProfile says is supported is the safest option.
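A minimal sketch of that CamcorderProfile approach, assuming the back camera has ID 0 (the quality list and the function name are illustrative):

import android.media.CamcorderProfile
import android.util.Size

// Sketch: build the list of recording sizes CamcorderProfile reports for a camera.
fun supportedRecordingSizes(cameraId: Int = 0): List<Size> {
    val qualities = listOf(
        CamcorderProfile.QUALITY_480P,
        CamcorderProfile.QUALITY_720P,
        CamcorderProfile.QUALITY_1080P,
        CamcorderProfile.QUALITY_2160P
    )
    return qualities
        .filter { CamcorderProfile.hasProfile(cameraId, it) }
        .map { quality ->
            val profile = CamcorderProfile.get(cameraId, quality)
            Size(profile.videoFrameWidth, profile.videoFrameHeight)
        }
}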

How can I reduce the video recording size in Flutter?

I'm using the Flutter camera plugin to record video, but the recorded file is too big, around 20 MB for 1 minute. How can I reduce the size (for example, by reducing the resolution)? I have also changed my video encoding bitrate to 3000000, like this: mediaRecorder.setVideoEncodingBitRate(3000000);.
To reduce the size, you can use either or both of these two methods:
Resolution
As you can see in the example, controller = CameraController(cameras[0], ResolutionPreset.medium); change this to ResolutionPreset.low or some other custom value (it does not have to be a preset).
Encoding
You can use a different encoding algorithm, for example FFmpeg via this plugin: https://pub.dartlang.org/packages/flutter_ffmpeg. See also this question and its answers: how to reduce size of video before upload to server programmatically in android
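For reference, the plugin's Android recording path typically ends up configuring MediaRecorder, which exposes the same two levers. A minimal sketch (illustrative values, not the plugin's actual code):

import android.media.MediaRecorder

// Sketch: the two levers discussed above, applied directly to MediaRecorder.
fun configureForSmallerFiles(recorder: MediaRecorder) {
    recorder.setVideoSize(640, 360)             // lower resolution -> smaller file
    recorder.setVideoEncodingBitRate(1_000_000) // lower bitrate -> smaller file
}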

Video compression estimated size - Android

I am developing an Android video compression application.
I have a 1280x720 video; its duration is 4 seconds and its size is 7 MB. How can I get the estimated video size for different resolutions?
I want to find the estimated video size for 854x504, 640x360 and other resolutions. Please let me know if there is any formula for calculating the estimated video size.
Thanks :)
You cannot estimate file size from resolution alone. A file with little motion will compress much smaller than a file with a lot of action. If you need to know the size before compression, you should decide what size you want, then divide that number by the video duration, and use the result as your bitrate when encoding.
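As a concrete illustration of that calculation (a sketch only: audio and container overhead are ignored, and real encoders only approximate the target):

// Sketch: derive a target video bitrate from a desired file size and duration.
fun targetBitrateBps(desiredSizeBytes: Long, durationSeconds: Double): Int =
    ((desiredSizeBytes * 8) / durationSeconds).toInt()

// Example: a 2 MB target for a 4-second clip works out to about 4 Mbps.
val bitrate = targetBitrateBps(desiredSizeBytes = 2_000_000L, durationSeconds = 4.0)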

What video encoder gives best performance on an Android device for given quality?

I'm trying to determine the best encoder or encoding parameters to play as high a resolution (quality) video on an Android phone as possible. I do not care much about file size; it can be triple the size of a "properly compressed" video as long as it plays smoothly. By default, all encoders are optimized for the best quality in as small a file as possible, at the expense of the computing power needed to decode the video. I'd like to optimize for computing power at the expense of file size.
So essentially I'd like to know how to effectively unburden the decoder at the expense of increasing the file size so the video plays without any artifacts or freezes.
Can anyone recommend a technique to achieve this?
To clarify: I have a locally available file in very high quality (1440p) which I'd like to transcode to as much playable resolution/quality as possible while not caring about file size (1080p+).
Thank you.
For encoding video, the general recommendation is to use H.264 with Baseline Profile for broad compatibility. There are a variety of parameters for tuning to the video content (animation vs. static lecture vs. action/sports), but it generally comes down to bitrate.
Any device that has Google Play must conform to the Android Compatibility Definition Document, which spells out the expected frame rates and bitrates for various video sizes:
http://source.android.com/compatibility/7.0/android-7.0-cdd.html#5_3_4_h_264
Android device implementations with H.264 decoders:
MUST support Main Profile Level 3.1 and Baseline Profile.
Support for ASO (Arbitrary Slice Ordering), FMO (Flexible Macroblock Ordering)
and RS (Redundant Slices) is OPTIONAL.
MUST be capable of decoding videos with the SD (Standard Definition)
profiles listed in the following table and encoded with the Baseline Profile and
Main Profile Level 3.1 (including 720p30).
SHOULD be capable of decoding videos with the HD (High Definition) profiles
as indicated in the following table.
In addition, Android Television devices—
MUST support High Profile Level 4.2 and the HD 1080p60 decoding profile.
MUST be capable of decoding videos with both HD profiles as indicated
in the following table and encoded with either the Baseline Profile, Main
Profile, or the High Profile Level 4.2
                   SD (Low quality)   SD (High quality)   HD 720p          HD 1080p
Video resolution   320 x 240 px       720 x 480 px        1280 x 720 px    1920 x 1080 px
Video frame rate   30 fps             30 fps              30 fps           30 fps
Video bitrate      800 Kbps           2 Mbps              8 Mbps           20 Mbps
While the CDD makes SD decoding a MUST requirement, HD decoding is only a SHOULD, though it is implemented on most high-end devices.
Regarding power usage: hardware decoding is relatively common on high-end devices, and the screen is still the most power-hungry part of playing a video, so any thinking about 'compression' should really be about which settings give the most visually acceptable content at the smallest size. Given variations in content, the 'right' settings usually require a bit of experimentation.
In addition, if you are delivering to a device, you should let the client pick the resolution/quality that makes sense; there is no reason to deliver a 1080p file to a 640x480 device.
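If you want to check what a particular device's decoders actually report, beyond the CDD minimums quoted above, MediaCodecList can be queried at runtime. A minimal sketch, assuming API 21+ (the function name is illustrative):

import android.media.MediaCodecList
import android.media.MediaFormat

// Sketch: does any H.264 decoder on this device report support for 1080p30?
fun canDecode1080p30Avc(): Boolean {
    val codecInfos = MediaCodecList(MediaCodecList.REGULAR_CODECS).codecInfos
    return codecInfos
        .filter { !it.isEncoder && it.supportedTypes.contains(MediaFormat.MIMETYPE_VIDEO_AVC) }
        .any { info ->
            val caps = info.getCapabilitiesForType(MediaFormat.MIMETYPE_VIDEO_AVC)
            caps.videoCapabilities.areSizeAndRateSupported(1920, 1080, 30.0)
        }
}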

Load and play ultra-high-resolution videos on Android

Does anyone know how to programmatically load a high resolution video on Android, such as 3000 x 3000, and display only a portion of it, such as 1000 x 1000?
I tried using the official Android SDK MediaPlayer with a TextureView, but this approach seems to have media size limitations: the video plays, but the TextureView stays black.
I appreciate the help.
Media size limitations are there for a reason. A 3000 x 3000 video is huge for a small device like a phone. Keep in mind that you cannot decode only a small portion of a video frame; that is not how video works. You would have to decode the whole large frame (which is reconstructed from an I-frame and the following P-frames), grab it, crop the part you are interested in, and present it on your TextureView, all in real time. Think about device memory and CPU. In my opinion it's not feasible with such limited resources.
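For what it's worth, if the device's decoder could actually handle the source (which the answer above doubts), showing only a region of the decoded frame is a presentation problem that TextureView.setTransform can address. A sketch with illustrative names, assuming the view shows the full frame by default:

import android.graphics.Matrix
import android.view.TextureView

// Sketch: map a cropW x cropH region at (cropX, cropY) of the decoded frame
// onto the whole TextureView, e.g. a 1000x1000 crop of a 3000x3000 video.
fun showCrop(view: TextureView, videoW: Int, videoH: Int,
             cropX: Int, cropY: Int, cropW: Int = 1000, cropH: Int = 1000) {
    val matrix = Matrix().apply {
        setScale(videoW.toFloat() / cropW, videoH.toFloat() / cropH)
        postTranslate(-cropX * view.width.toFloat() / cropW,
                      -cropY * view.height.toFloat() / cropH)
    }
    view.setTransform(matrix)
}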
