Why does a high-resolution video play and buffer poorly in Android ExoPlayer?

I have an Android TV application playing two different videos over UDP streaming using ExoPlayer; the images below (image 1 and image 2) show the specifications of each video. The first one, with the lower resolution, plays well without any buffering problems; the second one buffers and streams badly.
image 1: the first video, with the lower resolution (1280x720), which plays well without any freezing or problems
image 2: the second video, with the higher resolution (1920x1080), which plays with freezing and intermittent buffering
Below is my ExoPlayer initialisation:
ExoPlayer.Builder playerBuilder = new ExoPlayer.Builder(WelcomeActivity.this);
LoadControl loadControl = new DefaultLoadControl.Builder()
        .setBufferDurationsMs(15000, 50000, 2000, 5000)
        .setTargetBufferBytes(DefaultLoadControl.DEFAULT_TARGET_BUFFER_BYTES)
        .setAllocator(new DefaultAllocator(true, C.DEFAULT_BUFFER_SEGMENT_SIZE))
        .setPrioritizeTimeOverSizeThresholds(DefaultLoadControl.DEFAULT_PRIORITIZE_TIME_OVER_SIZE_THRESHOLDS)
        .build();
player = playerBuilder
        .setLoadControl(loadControl)
        .setRenderersFactory(buildRenderersFactory(WelcomeActivity.this))
        .build();
player.setTrackSelectionParameters(
        player.getTrackSelectionParameters()
                .buildUpon()
                .setMaxVideoSize(1280, 720)
                .setPreferredTextLanguage("ar")
                .build());
As you can see, I am setting the maximum video size to the lower resolution with .setMaxVideoSize(1280, 720), but this changes nothing about the bad buffering of the second, high-resolution video.

The size and bit rate of an encoded video are determined by how it is encoded - e.g. which codec is used and what parameters and configuration the encoding uses.
Nearly all encoders try to reduce the size of the encoded video, to make it easier to store and transmit, and there is a balance between the size reduction and the visual quality of the video when it is subsequently decoded and played.
Generally, a smaller size/bit rate means lower quality, but it does depend on the content itself - for example, some cartoons are very tolerant of lower bit rates. More advanced commercial encoders can do 'context aware' encoding and adjust the quality and bit rate depending on the scene.
The most common codec at this time is H.264, and ffmpeg is probably the most common open source tool for encoding.
ffmpeg provides a guide for encoding, re-encoding and transcoding videos with H.264, including notes and examples on the quality vs. bit rate trade-off:
https://trac.ffmpeg.org/wiki/Encode/H.264
The two key sections in the above are for Constant Rate Factor encoding and two pass encoding.
CRF
allows the encoder to attempt to achieve a certain output quality for the whole file when output file size is of less importance. This provides maximum compression efficiency with a single pass
Two Pass
Use this rate control mode if you are targeting a specific output file size, and if output quality from frame to frame is of less importance
There is also a note on constrained encoding (VBV / maximum bit rate) which would be worth looking at too.
I think you would do best to start with CRF encoding and experiment with the CRF value until you find a quality/bit rate combination you are happy with.
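As a starting point, both modes can be tried with ffmpeg along these lines (the filenames are placeholders, and the CRF and bitrate values are just examples to experiment with):

```shell
# CRF: single pass, constant quality; a lower CRF means higher quality and bitrate.
ffmpeg -i input.mp4 -c:v libx264 -crf 23 -preset medium -c:a copy output_crf.mp4

# Two-pass: target a specific average bitrate (3 Mbps here).
ffmpeg -y -i input.mp4 -c:v libx264 -b:v 3M -pass 1 -an -f null /dev/null
ffmpeg -i input.mp4 -c:v libx264 -b:v 3M -pass 2 -c:a aac output_2pass.mp4
```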
Finally, it would be worth checking out ABR streaming too, especially if you plan to host many videos and want your users to have a consistent experience. For ABR you create multiple copies of the video, each encoded with a different resolution/bit rate.
The client device or player downloads the video in chunks, e.g. 10-second chunks, and selects the next chunk from the bit rate most appropriate to the current network conditions. See some more info in this answer also: https://stackoverflow.com/a/42365034/334402
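As a rough sketch of what producing such a ladder might look like with ffmpeg's HLS muxer (placeholder filenames and example bitrates; a real setup would also generate a master playlist referencing both renditions):

```shell
# Two renditions of the same source, each segmented into 10-second HLS chunks.
ffmpeg -i input.mp4 -vf scale=1280:720  -c:v libx264 -b:v 2800k -c:a aac \
       -f hls -hls_time 10 -hls_playlist_type vod 720p.m3u8
ffmpeg -i input.mp4 -vf scale=1920:1080 -c:v libx264 -b:v 5000k -c:a aac \
       -f hls -hls_time 10 -hls_playlist_type vod 1080p.m3u8
```

The player then picks chunks from whichever rendition suits the current network conditions.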


WebM with VP9 vs. MP4 with H.264 AVC: which one is best overall?

I have used VideoView to load an MP4 file one minute long.
The problem is that playback starts after a delay.
I want it to start immediately, so which codec and bit rate should I choose?
Share your experience, if you have any, with these video codecs.
I want to see a comparison between the two formats:
loading speed, length, file size and quality ratio.
MP4 usually has all its index tables at the end of the file, so the player may need to scan the whole file on disk before playback can start.
You can convert to an MP4 file optimised for streaming, so that the tables are at the beginning.
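With ffmpeg, that remux is a single copy operation (filenames are placeholders; nothing is re-encoded):

```shell
# Move the moov atom (the index tables) to the front so playback can start quickly.
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```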
MPEG-TS (transport stream) also loads quickly.
WebM will probably load faster than a "standard" MP4, but I am not very familiar with the WebM format.
All PCs and smartphones have a hardware AVC (H.264) video decoder, while VP9 is mostly decoded in software, so AVC will presumably be easier for your device to decode.
The quality or size of VP9 can beat AVC only at HD resolutions; on smaller videos the quality should be more or less equal.
There are many useful tools for encoding AVC and not so many for VP9. Using ffmpeg and proper settings, such as 2-pass encoding, you can compress AVC harder than VP9.
So I recommend using AVC and a streaming-optimised MP4.

How can I compress audio in an Android call recorder?

I used this code to capture audio in Android Studio,
but the audio files are big (1 min ≈ 1 MB).
How can I compress the audio without quality loss?
audioRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
audioRecorder.setOutputFormat(MediaRecorder.OutputFormat.AAC_ADTS);
audioRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
audioRecorder.setAudioChannels(1);
audioRecorder.setAudioEncodingBitRate(44100);
audioRecorder.setAudioSamplingRate(128000);
You're setting the encoder to AMR_NB, and about 1 MB per minute sounds about right for that configuration. If you want to compress it further you can choose another encoder, such as Vorbis.
Please note that the different encoders have different purposes. None of them will compress without quality loss - we're converting analog sound to digital values, and there is always quality loss in that. Different encoders are optimized for different uses. AMR is optimized for human voice: it filters out frequencies outside the vocal range. That's a quality loss, but it may be one you want (as in a call). You can't have everything - you're going to have to sacrifice size or quality. I suggest you study the different encoders listed at https://developer.android.com/reference/android/media/MediaRecorder.AudioEncoder.html and figure out what's best for you.
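As a sketch of what an alternative configuration might look like with AAC (example values, untested on a device; note also that setAudioEncodingBitRate expects bits per second and setAudioSamplingRate expects Hz, which the snippet in the question appears to have swapped):

```java
import android.media.MediaRecorder;

// Untested sketch: mono AAC in an MPEG-4 container at a modest bit rate.
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setAudioChannels(1);
recorder.setAudioSamplingRate(44100);    // sample rate in Hz
recorder.setAudioEncodingBitRate(64000); // 64 Kbps, i.e. roughly 480 KB per minute
```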

Android - recorded audio file size differs when changing device

I am trying to record an audio file on Android. I set the output file's bit rate and sampling rate and everything works, but whenever I record on a different device the file size differs a lot.
I have made some tests with a Z2 and a Moto G, changing bit rate and sampling rate, and obtained very different file sizes for the same recording time. I have noticed that the file size depends mostly on the bit rate rather than the sampling rate.
The problem is that I expect, and actually need, the files to be as small as possible, but even though on the Moto G I get files from 38 - 254 KB, files with the same configuration on the Z2 are 437 - 653 KB.
I don't know what to do to get files on the Z2 (and any other device) of almost the same size as on the Moto G; any help would be greatly appreciated.
Sorry for my English, it is not my native language.
P.S.:
Using MediaInfo, I see that the only difference between the files is "Overall bit rate". With the bit rate set to 16000, the Moto G file shows 19.4 Kbps in MediaInfo and the Z2 file shows 226 Kbps, but both show "Bit rate" = 16.8 Kbps.
The different sizes on different devices can be explained by two factors: audio resolution (microphone quality) and bit rate mode.
If you have a really good microphone that captures high audio frequencies, you have a more complex signal to compress; a low-quality microphone that only records mid-range frequencies produces audio that lacks detail and is simpler to compress.
The other factor is the bit rate mode: VBR vs. CBR. With variable bit rate, if the audio being encoded doesn't have much detail, the encoder lowers the bit rate; but if there is suddenly the sound of cymbals, for example, the detail is higher and a higher bit rate is needed to encode it. With constant bit rate, on the other hand, the output is always the same size: say you record 1 minute of audio at 128 Kbps (you almost certainly don't need more than that) - that is 60 s * 128 Kbps = 7680 Kb, and 7680 Kb / 8 = 960 KB per minute.
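The arithmetic in that last step can be written out as a tiny helper (plain Java; KB here means 1000 bytes):

```java
public class CbrSize {
    // Size in bytes of a CBR recording: bits/second * seconds / 8 bits-per-byte.
    static long cbrBytes(long bitrateBitsPerSec, long seconds) {
        return bitrateBitsPerSec * seconds / 8;
    }

    public static void main(String[] args) {
        // 1 minute at 128 Kbps -> 960,000 bytes, i.e. 960 KB per minute.
        System.out.println(cbrBytes(128_000, 60));
    }
}
```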

H.264, 720p Video Bitrate

Disclaimer: I know very little about Video codecs & encoding.
I'm developing an iOS and Android app that allows users to record videos. I want to be able to upload the videos to YouTube & have them play at 720p quality.
The videos I'm recording will always be less than 180 seconds, always be ~30fps and will always have audio.
As far as I can tell, this means I need to record at a resolution of 1280x720, then I should be good. Is this correct?
I'm trying to determine how large, on average, an H.264 video file will be per second of video. From my understanding, I need to know the bitrate of the videos. What will the bitrate of recorded H.264 video be on Android 2.2+, and iOS 5+? This Android developer page mentions a bitrate of "2Mbps" for "HD" video - is that 2 Megabytes per second or 2 Megabits per second? Will that rate be the same for any recorded H.264 video?
Part of the reason I'm so confused about this is that I did a test with 4 different Android-encoded videos of different lengths, and the resulting bitrates (charted below) were all over the place.
Wtf!?
Bonus points if you can link me to some iOS developer docs detailing this information - I've searched and can't find anything.
EDITS:
Possibly Related: H.264 file size for 1 hr of HD video
This Wikipedia article mentions that the max bitrate for level 3.1 H.264 video (1280x720 @ 30fps) is 14,000 - 17,500 Kbps.
Yeah, 720p stands for 1280x720, so I think that is correct.
To know in advance how large your video file will be, you would have to record at a constant bit rate (CBR), but I doubt the camera will use CBR, since VBR (variable bit rate) is more efficient.
Mbps stands for megabits per second.
I doubt that the rate will be the same for every video since, as stated above, VBR could be used.
Edit:
Judging from the graph, it is definitely a VBR.
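For a rough upper-bound answer to the original question: treating the nominal 2 Mbps as if it were constant (which, per the above, a VBR recording will not actually hold), a small hypothetical helper gives the expected size:

```java
public class VideoSizeEstimate {
    // Rough size in megabytes of video at a nominal constant bitrate.
    static double megabytes(double megabitsPerSec, double seconds) {
        return megabitsPerSec * seconds / 8.0; // 8 bits per byte
    }

    public static void main(String[] args) {
        // A 180-second recording at 2 Mbps -> 45.0 MB, an upper ballpark only.
        System.out.println(megabytes(2.0, 180.0));
    }
}
```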

Power efficient video streaming from an Android device

I'm doing some experiments with video streaming from the front camera of an Android device to a local server. Currently I plan to use WiFi; I may move to Bluetooth 4.0 in the future.
I'm looking for insights, experience, dos and don'ts, and other ideas that I should consider regarding protocol options (TCP, UDP, ...?) and video codec. The image quality should be good enough to run computer vision algorithms such as face and object detection, recognition and tracking on the server side. The biggest concern is power: I want the streaming to be as power efficient as possible, and I understand more power efficiency means a lower frame rate.
Also, I need a way to send the video frames without displaying them directly on the screen.
Thanks.
You didn't mention whether you will be doing encoding or decoding on the device.
Some tips:
UDP will be less power-hungry in general, especially under deteriorating network conditions; see:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.134.5517&rep=rep1&type=pdf
You can find more papers on this via Google.
In terms of codecs, the general ordering is H.264 > MPEG-4 > H.263 for the power needed for both encoding and decoding.
The higher the bitrate, the more power is needed for decoding, but the codec difference matters more than the bitrate difference. I say this because, to get the same quality as an H.264 stream, H.263 needs a higher bitrate - yet H.263 at that higher bitrate should still consume less power than H.264 at the lower one. So do not compare bitrates across codecs; within the codec you choose, just use the lowest bitrate/framerate you can.
When encoding, though, very low bitrates can make the encoder work harder and so increase power consumption. Encoding bitrates should be low, but not so low that the encoder is stretched: choose a reasonable bitrate that gives decent output rather than a continuously blocky stream.
Within each codec, if you can control the encoding then you can also control the decoding power. The following applies to both:
deblocking and B-pictures add to power requirements. Keeping to the lower profiles (Baseline for H.264, Simple Profile for MPEG-4, Baseline for H.263) results in lower power requirements for both encoding and decoding. In MPEG-4, switch off 4MV support if you can; it makes streams even simpler to decode. Remember that each of these also has a quality impact, so you have to find what quality is acceptable.
Also, unless you can really measure the power consumption, I am not sure you need very fine tweaking of the toolsets; just sticking to the lower profiles should suffice.
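If you do control the encoder, restricting the stream to Baseline profile might look like this (ffmpeg/x264 is used purely as an example here, with placeholder filenames and an example bitrate; the answer above does not assume any particular tool):

```shell
# Baseline profile disables B-frames and CABAC, yielding a cheaper-to-decode stream.
ffmpeg -i input.mp4 -c:v libx264 -profile:v baseline -level 3.0 -b:v 500k out.mp4
```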
The worse the video quality during capture, the more power is needed during encoding: brightly lit videos need less effort to encode, while low-light videos need more power.
There is no need to send the video to a screen. You receive video over a socket and do whatever you want with that data; that is up to you. You do not have to decode and display it.
EDIT: Adding a few more things I could think off
In general, the choice of codec and its profile will be the biggest factor in a video encoding/decoding system's power consumption.
The biggest difference, though, may come from the device configuration: if the device has a hardware accelerator for a particular codec, it may be cheaper to use that than a software codec for another one. So although H.264 may require more power than MPEG-4 when both run in software, if the device has H.264 in hardware it may be cheaper than MPEG-4 in software. So check your device's hardware capabilities.
Video resolution also matters: smaller videos are cheaper to encode, and you can clock the device lower when running smaller resolutions.
