Disclaimer: I know very little about Video codecs & encoding.
I'm developing an iOS and Android app that allows users to record videos. I want to be able to upload the videos to YouTube & have them play at 720p quality.
The videos I'm recording will always be less than 180 seconds, always be ~30fps and will always have audio.
As far as I can tell, this means I need to record at a resolution of 1280x720, then I should be good. Is this correct?
I'm trying to determine how large, on average, an H.264 video file will be per second of video. From my understanding, I need to know the bitrate of the videos. What will the bitrate of recorded H.264 video be on Android 2.2+, and iOS 5+? This Android developer page mentions a bitrate of "2Mbps" for "HD" video - is that 2 Megabytes per second or 2 Megabits per second? Will that rate be the same for any recorded H.264 video?
Part of the reason I'm so confused about this is that I did a test with 4 different Android-encoded videos of different lengths, and got the following output:
Wtf!?
Bonus points if you can link me to some iOS developer docs detailing this information - I've searched and can't find anything.
EDITS:
Possibly Related: H.264 file size for 1 hr of HD video
This Wikipedia article mentions that the max bitrate for level 3.1 H.264 video (1280x720 @ 30fps) ranges from 14000 to 17500kbps depending on profile.
Yes, 720p means 1280x720, so that part is correct.
To know in advance how large your video file will be, you would have to record at a constant bitrate (CBR), but I doubt the camera will use CBR, since VBR (variable bitrate) is more efficient.
Mbps stands for megabits per second.
I also doubt the rate will be the same for every recorded H.264 video; as I said, VBR could be used.
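To put numbers on it: at the 2 Mbps figure from the Android docs, the file-size arithmetic looks like the sketch below. This assumes a constant bitrate for simplicity; a real VBR recording will vary around these numbers.

```java
public class VideoSizeEstimate {
    public static void main(String[] args) {
        int bitrateBitsPerSec = 2_000_000;   // 2 Mbps = 2 million BITS per second, not bytes
        int durationSeconds = 180;           // max clip length from the question

        long bytesPerSecond = bitrateBitsPerSec / 8;        // 250,000 bytes ≈ 250 KB/s
        long totalBytes = bytesPerSecond * durationSeconds; // 45,000,000 bytes ≈ 45 MB

        System.out.println("Bytes per second: " + bytesPerSecond);
        System.out.println("Bytes for a 180 s clip: " + totalBytes);
    }
}
```

So a worst-case 180-second clip at 2 Mbps is on the order of 45 MB; with VBR the real file will usually come in under that.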
Edit:
Judging from the graph, it is definitely VBR.
Related
I have an Android TV application that plays two different UDP video streams using ExoPlayer. The images below (image 1 and image 2) show the specifications of each video. The first one, at a lower resolution, plays well without any buffering problems; the second one buffers and streams badly.
image 1: the first video, with the lower resolution (1280x720), which plays well without any freezing or problems
image 2: the second video, with the higher resolution (1920x1080), which plays with freezing and intermittent buffering
Below is my ExoPlayer initialisation:
ExoPlayer.Builder playerbuilder = new ExoPlayer.Builder(WelcomeActivity.this);

// Arguments: minBufferMs, maxBufferMs, bufferForPlaybackMs, bufferForPlaybackAfterRebufferMs
LoadControl loadControl = new DefaultLoadControl.Builder()
        .setBufferDurationsMs(15000, 50000, 2000, 5000)
        .setTargetBufferBytes(DefaultLoadControl.DEFAULT_TARGET_BUFFER_BYTES)
        .setAllocator(new DefaultAllocator(true, C.DEFAULT_BUFFER_SEGMENT_SIZE))
        .setPrioritizeTimeOverSizeThresholds(DefaultLoadControl.DEFAULT_PRIORITIZE_TIME_OVER_SIZE_THRESHOLDS)
        .build();

player = playerbuilder.setLoadControl(loadControl)
        .setRenderersFactory(buildRenderersFactory(WelcomeActivity.this))
        .build();

// Cap track selection at 720p and prefer Arabic subtitles
player.setTrackSelectionParameters(
        player.getTrackSelectionParameters()
                .buildUpon()
                .setMaxVideoSize(1280, 720)
                .setPreferredTextLanguage("ar")
                .build());
As you can see, I am capping the video size at the lower resolution with .setMaxVideoSize(1280, 720), but this does not change anything about the bad buffering of the second, higher-resolution video.
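For scale: a max buffer of 50 s holds far more data at 1080p than at 720p, so it is worth checking whether the byte-based target in the load control is the real limit. The sketch below uses assumed bitrates (4 Mbps for 720p, 8 Mbps for 1080p; substitute the real values from your streams) to estimate how many bytes 50 seconds of buffer actually needs.

```java
public class BufferSizeEstimate {
    public static void main(String[] args) {
        // Assumed bitrates - replace with the actual bitrates of your two streams.
        long bitrate720p = 4_000_000L;   // 4 Mbps
        long bitrate1080p = 8_000_000L;  // 8 Mbps
        int maxBufferSeconds = 50;       // matches maxBufferMs = 50000 in the load control

        long bytes720p = bitrate720p / 8 * maxBufferSeconds;    // 25,000,000 bytes (~25 MB)
        long bytes1080p = bitrate1080p / 8 * maxBufferSeconds;  // 50,000,000 bytes (~50 MB)

        System.out.println("720p, 50 s of buffer:  " + bytes720p + " bytes");
        System.out.println("1080p, 50 s of buffer: " + bytes1080p + " bytes");
    }
}
```

If the computed figure for the 1080p stream is larger than the value passed to setTargetBufferBytes, the player can never fill the time-based buffer you asked for.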
The size and bit rate of an encoded video is determined by how it is encoded - e.g. what codec is used and what parameters and configuration the encoding uses.
Nearly all encoders try to reduce the size of the encoded video, to make it easier to store and to transmit, and there is a balance between the size reduction and the visual quality of the video when it is subsequently decoded and played.
Generally, smaller size/bit rate means less quality, but it does depend on the content itself - for example some cartoons are very tolerant of lower bit rates. More advanced commercial encoders can do 'context aware' encoding and adjust the quality and bit rate depending on the scenes.
The most common codec at this time is H.264, and ffmpeg is probably the most common open source tool for encoding.
ffmpeg provides a guide for encoding, re-encoding, and transcoding videos with H.264, which includes notes and examples on the quality vs bitrate trade-off:
https://trac.ffmpeg.org/wiki/Encode/H.264
The two key sections in the above are for Constant Rate Factor encoding and two pass encoding.
CRF
allows the encoder to attempt to achieve a certain output quality for the whole file when output file size is of less importance. This provides maximum compression efficiency with a single pass.
Two Pass
Use this rate control mode if you are targeting a specific output file size, and if output quality from frame to frame is of less importance
There is also a note on Constrained encoding (VBV / maximum bit rate) which it would be good to look at too.
I think you might be best to start with CRF encoding and experiment with the CRF value to see if you can find a quality/bit rate combination you are happy with.
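As a starting point, a CRF encode can be driven from Java with ProcessBuilder. This is only a sketch: the file names are placeholders, ffmpeg must be on the PATH, and 23 is simply libx264's default CRF (lower means higher quality and a larger file; 18-28 is the usual range to experiment in).

```java
import java.util.Arrays;
import java.util.List;

public class CrfEncode {
    public static void main(String[] args) throws Exception {
        int crf = 23; // libx264 default; lower = better quality, bigger file
        List<String> cmd = Arrays.asList(
                "ffmpeg",
                "-i", "input.mp4",      // placeholder input file
                "-c:v", "libx264",
                "-crf", String.valueOf(crf),
                "-preset", "medium",    // slower presets compress better at the same CRF
                "-c:a", "aac",
                "output.mp4");          // placeholder output file

        System.out.println(String.join(" ", cmd));
        // Uncomment to actually run the encode (requires ffmpeg on the PATH):
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

Encode the same clip at a few CRF values, compare the file sizes and how they look, and pick the highest CRF you are still happy with.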
Finally, it would be worth you checking out ABR streaming also, especially if you plan to host many videos and want your users to have consistent experience. For ABR you create multiple copies of the video, each encoded with different resolution/bit rate.
The client device or player downloads the video in chunks, e.g. 10-second chunks, and selects the next chunk from the bit rate most appropriate to the current network conditions. See this answer for more info: https://stackoverflow.com/a/42365034/334402
I am trying to record an audio file on Android. I set the output file's bit rate and sampling rate, and everything works, but whenever I record on a different device the file size differs a lot.
I have run some tests with a Z2 and a Moto G, changing the bit rate and sampling rate, and obtained very different file sizes for the same recording time. I have noticed that the file size depends mostly on the bit rate rather than the sampling rate.
The problem is that I need the files to be as small as possible, but even though the Moto G gives me files from 38 - 254KB, files with the same configuration on the Z2 are 437 - 653KB.
I don't know what to do to get files on the Z2 (and any other device) that are almost the same size as on the Moto G; any help would be greatly appreciated.
Sorry for my English, it is not my native language.
P.S.:
Using MediaInfo, the only difference I can find between the files is "Overall bit rate". When setting the bit rate to 16000, the Moto G file shows 19.4Kbps in MediaInfo and the Z2 file shows 226Kbps, but both show "Bit rate" = 16.8Kbps.
The different sizes on different devices can be explained by two factors: audio resolution (microphone quality) and bitrate mode.
If you have a really good microphone that captures high audio frequencies, you get a more complex signal to compress; a low-quality microphone that only records mid-range frequencies produces audio that lacks detail and is simpler to compress.
The other factor is the bitrate mode, VBR vs CBR. With Variable Bit Rate, if the audio being encoded doesn't have a lot of detail, the encoder lowers the bitrate; but if you suddenly have the sound of cymbals, for example, the audio has more detail and needs a higher bitrate to encode. On the other hand, if you encode with Constant Bit Rate, the output size is always the same: say you record 1 min of audio at 128Kbps (you almost certainly don't need more than that), you get 60s * 128Kbps = 7680Kb, and 7680Kb / 8 = 960KB per minute.
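The CBR arithmetic at the end of that explanation can be sketched directly, using the same figures (128 Kbps, one minute):

```java
public class AudioSizeCbr {
    public static void main(String[] args) {
        int bitrateKbps = 128;  // constant bitrate from the example
        int seconds = 60;       // one minute of audio

        int kilobits = bitrateKbps * seconds; // 7680 kilobits
        int kilobytes = kilobits / 8;         // 960 KB per minute

        System.out.println(kilobits + " Kb -> " + kilobytes + " KB per minute");
    }
}
```

With CBR the size is fully predictable from bitrate and duration alone, which is why devices that pick VBR (or different encoder defaults) produce such different file sizes for the same settings.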
I need to convert videos to put on my website, but I need the right format so that they can be viewed on all mobile devices. The problem I am having is that I cannot get them to work on Android. Does Android require a different encoder? Is there a format I can use that will work on all mobiles?
Blackberry
Android
iOS (iPhone/iPad)
According to the Android Media Formats docs, you need an MPEG-4 container and the H.264 Baseline Profile codec at 480 x 360 px resolution, 30 fps, a video bitrate of 500 Kbps, and AAC-LC audio at a bitrate of 128 Kbps.
An mp4 container, 15 fps, 640x480 resolution is pretty generic.
The quality is not very good at such a low frame rate, but it will probably be supported by most devices.
.m4v is listed under Squeeze's encode list for iPhone and Android, yet it also worked on desktop in Chrome, Safari, and Firefox.
According to developer.android.com, Android supports video playback using the H.263, H.264 AVC, MPEG4 SP and VP8 codecs. However, I want to play a video in the MXF format (Material eXchange Format, http://en.wikipedia.org/wiki/MXF) in my app. How do I go about it?
MXF generally contains MPEG-2 or AVC video with profiles/levels not supported by the decoders in Android devices. Specifically, IMX is 422 profile at main level, and HD MPEG-2 MXF is generally 422 profile at high level.
The rule of thumb says you should use the most common and least resource-intensive codec. Bad video playback is a deal breaker. You may know the huge difference the format makes, but the user won't.
Does anybody have any luck streaming a high quality video (over 1000kbps) to Android through RTSP?
We currently have low quality video streams (around 200kbps) that work wonderfully over 3G. Now we are trying to serve a high-quality stream for when the user has a faster connection. The high quality videos play smoothly in VLC, but the Android playback seems to drop frames and get blocky, even on a 4 megabit connection.
It seems like the YouTube app uses a plain HTTP download for their high quality videos. This works well and plays smoothly, but will not work for streaming live videos. Has anybody had luck streaming high quality videos to Android through RTSP?
The videos are encoded using H.264, 1500kbps, 24fps, and a 720x480 resolution. In the app, we are using a VideoView to play the videos. We are using Darwin Streaming Server, but we are open to other options if necessary.
Update 6/23/2011
Looking through Darwin some more today. So far, I am just logging the request and session information in a Darwin module.
The original Droid tries to use these settings: 3GPP-Adaptation:...size=131072;target-time=4000. Although that means it wants 4 seconds of buffer, 131072 bytes only holds about a second of playback at 1200kbps. I understand that 1200kbps is large, but it is necessary for a high-quality video (minimal compression at 720x480).
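That "about a second" falls straight out of the numbers in the request (131072 bytes of buffer, 1200 kbps stream):

```java
public class RtspBufferMath {
    public static void main(String[] args) {
        int bufferBytes = 131072;          // size= value from the Droid's 3GPP-Adaptation header
        int bitrateBitsPerSec = 1_200_000; // 1200 kbps video stream

        double seconds = bufferBytes * 8.0 / bitrateBitsPerSec; // ~0.87 s of playback
        System.out.printf("Buffer holds about %.2f s of video%n", seconds);
    }
}
```

So a 4-second target-time with a 128 KB buffer is inconsistent at this bitrate; the client would need roughly 600 KB to actually hold 4 seconds.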
I am trying to force the client to buffer more, but I haven't figured out how to do that yet. I'm just looking through the Darwin Streaming Server source and trying to figure out how they do things. Any Darwin experts out there?
Update 6/24/2011
As it turns out, using plain old HTTP for viewing videos on demand works well with no loss of quality. When we get to live streaming, we will have to look more into RTSP.
Well, even if the network can transmit at that rate, you still need to decode it. What are you using for decoding? You will probably need a NEON-accelerated video decoder to get a proper framerate, and a decent-sized buffer; the graphics processor is only as good as the bus it is on... Also, what are your encoding settings and resolution?
Edit: You are encoding those at much too high a bitrate; half of that will do fine. You also need to make sure where the issue lies. Is the MediaPlayer getting the data but failing to play at a decent framerate? In that case you have to replace the MediaPlayer code with your own player. If it's a network issue, then the only solution is to lower the bitrate; 600Kbps would be just fine (or 500Kbps video, 128Kbps audio). That's 3x your 200k stream, and on a screen this small the difference is not noticeable.