Compressing a video file with multiple resolution format options in Android - android

I am using the MediaCodec native API to compress video. My requirement is that I also need to show the user a list of available resolution formats for video compression (assuming that if the user selects any of the resolution formats, the output file should be less than 5 MB). So, I need to be able to calculate the compressed video size for each resolution option before the actual compression. Is this possible in Android? I have searched extensively but have been unable to find any answer. Any leads would be very helpful. Thanks!

You can calculate the output size by multiplying the output bitrate (bits per second) by the duration (seconds) of the video; divide by 8 to get bytes.
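As a sketch of that calculation (the 5 MB budget comes from the question; the class and method names are mine, not from any Android API):

```java
// Estimate compressed output size from bitrate and duration, and derive
// the bitrate needed to hit a target file size. Pure arithmetic - the
// audio track and container overhead are not included.
public class SizeEstimator {

    /** Estimated output size in bytes: bitrate (bits/s) * duration (s) / 8. */
    static long estimatedSizeBytes(int bitrateBps, int durationSeconds) {
        return (long) bitrateBps * durationSeconds / 8;
    }

    /** Bitrate (bits/s) needed so the output lands near targetBytes. */
    static int bitrateForTargetSize(long targetBytes, int durationSeconds) {
        return (int) (targetBytes * 8 / durationSeconds);
    }

    public static void main(String[] args) {
        long fiveMb = 5L * 1024 * 1024;
        // A 60-second clip must be encoded at roughly this bitrate
        // to stay under 5 MB:
        System.out.println(bitrateForTargetSize(fiveMb, 60)); // 699050 bits/s
    }
}
```

The resulting bitrate is what you would pass to the encoder via `MediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, …)` when configuring the `MediaCodec`; the resolution option then determines how good the video looks at that bitrate, not how big the file is.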

Related

How can I reduce the video recording size in flutter?

I'm using the flutter camera plugin to record video, but the recorded video size is too big - around 20 MB for 1 minute. How can I reduce the size (one option being to reduce the resolution)? I have also changed my VideoEncodingBitRate to 3000000, like this: mediaRecorder.setVideoEncodingBitRate(3000000);.
To reduce the size, you can employ either or both of these two methods:
Resolution
You can see it in the example:
controller = CameraController(cameras[0], ResolutionPreset.medium);. Change this to ResolutionPreset.low or some other custom value (it does not have to be a preset).
Encoding
You can use different encoding settings, for example via FFmpeg using this plugin: https://pub.dartlang.org/packages/flutter_ffmpeg. See also this question and its answers: how to reduce size of video before upload to server programmatically in android
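A quick back-of-the-envelope check of the numbers in the question (plain arithmetic, independent of Flutter; the values are taken from the question itself):

```java
public class RecordingSize {
    public static void main(String[] args) {
        int bitrateBps = 3_000_000;   // the value passed to setVideoEncodingBitRate
        int durationSeconds = 60;     // 1 minute of recording
        double sizeMb = (double) bitrateBps * durationSeconds / 8 / (1024 * 1024);
        // ~21.5 MB of video data alone, which matches the ~20 MB observed.
        // To land near, say, 5 MB per minute, the bitrate would have to
        // drop to roughly 700 kbit/s.
        System.out.printf("%.1f MB%n", sizeMb);
    }
}
```

In other words, at 3 Mbit/s the ~20 MB/min file size is expected; lowering the resolution alone will not shrink the file unless the bitrate is lowered with it.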

Video compression estimated size - android

I am developing an Android video compression application.
I have a 1280*720 video; its duration is 4 seconds and its size is 7 MB. How can I get the estimated video size for different resolutions?
I want to find the estimated video size for 854*504, 640*360 and other resolutions. Please let me know if there is any formula for calculating the estimated video size.
Thanks :)
You can not estimate file size from resolution alone. A file with little motion will compress much smaller than a file with a lot of action. If you need to know the size before compression, you should decide what size you want, divide that number (in bits) by the video duration, and use the result as your bitrate when encoding.
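The advice above can be sketched as a one-liner (the 2 MB target is an arbitrary example I picked; the 4-second duration is from the question):

```java
public class TargetBitrate {
    /** Bits per second needed so an encode of durationSeconds lands near targetBytes. */
    static int targetBitrate(long targetBytes, double durationSeconds) {
        return (int) (targetBytes * 8 / durationSeconds);
    }

    public static void main(String[] args) {
        // The question's 4-second clip, aiming for a 2 MB output file:
        System.out.println(targetBitrate(2L * 1024 * 1024, 4.0)); // 4194304 bits/s
    }
}
```

Note the direction of the calculation: you choose the size and derive the bitrate, not the other way around - the resolution then only affects perceived quality at that bitrate.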

How to get the preview stream in RAW format (Camera2 API, ImageFormat.RAW_SENSOR)?

Hello, I have a really difficult question. I have a Nexus 6 and I want to get the preview stream in RAW format (ImageFormat.RAW_SENSOR) with the Camera2 API. Is this even possible?
I use the android-Camera2Raw (https://github.com/googlesamples/android-Camera2Raw)
Currently, this is not possible. Please see this table for the list of formats that can be used as a stream: https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics.html#SCALER_STREAM_CONFIGURATION_MAP
Your most "raw" choice would be turning off noise reduction (if supported by the hardware) in your CaptureRequest.Builder, like so:
builder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
If frame rate isn't an issue, you could send repeated CaptureRequests for RAW images, and on your ImageReader.OnImageAvailableListener() process the RAW, convert it to a Bitmap, then place that into an ImageView, but that approach is exactly as impractical as it sounds :)

Android: where to store large arrays of frames before encoding?

I am facing a programming problem.
I am trying to encode video from camera frames that I have merged with frames retrieved from another layer (like a bitmap/GLSurface).
At 320x240 I can do the merge in real time with a fine FPS (~10), but when I try to increase the frame size I get less than 6 FPS.
That makes sense, as the cost of my merging function depends on the frame size.
So my question is: how can I store these arrays of frames for later processing (encoding)?
I don't know how to store such large arrays.
Just a quick calculation:
If I need to store 10 frames per second,
and each frame is 960x720 pixels,
then for a 40-second video I need to store: 40 x 10 x 960 x 720 x 1.5 (the Android YUV factor) ≈ 415 MB.
That is far too much for the heap.
Any ideas?
You can simply record the camera input as video - 40 seconds will not be a large file even at 720p resolution - and then, offline, you can decode, merge, and encode again. The big trick is that MediaRecorder uses the hardware encoder, so encoding will be really fast. Being compressed, the video can be written to the SD card or local file system in real time, and reading it back for decoding is not an issue either.
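To see why recording first is feasible, compare the raw buffer size from the question with the size of the same clip once encoded (the 5 Mbit/s recording bitrate is an assumed figure, not from the question):

```java
public class BufferMath {
    public static void main(String[] args) {
        // Raw YUV frames (NV21-style, 1.5 bytes per pixel) held in memory,
        // as in the question: 40 seconds at 10 fps, 960x720 each.
        long frames = 40L * 10;
        long rawBytes = frames * 960 * 720 * 3 / 2;   // 414,720,000 bytes, ~415 MB
        // The same 40 seconds written by MediaRecorder at an assumed 5 Mbit/s:
        long encodedBytes = 5_000_000L * 40 / 8;      // 25,000,000 bytes, ~25 MB
        System.out.println(rawBytes + " vs " + encodedBytes);
    }
}
```

A ~25 MB file on disk is trivial to stream through a decoder afterwards, while ~415 MB of raw frames cannot live on the heap.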

Format of sound/image files to be used in Android Apps

While developing an Android app, what format of sound/image should I be using so that I can control the overall size of the app after completion?
Here is a link to all the media types supported by Android.
For sound I would probably use a low-bitrate .mp3 or a .midi and for images either a compressed .jpg or .gif
For supported media formats see this.
For images you'll probably end up with JPG or PNG (if you need transparency). You should also strip the images of all unnecessary metadata etc. For Linux, a nice tool for this is Trimage.
Take a look at Supported Media Formats.
My choice would be:
Images: go with JPG for compression or PNG for quality and transparency support.
Audio: go with MP3-VBR (variable bit rate) for compression and quality.
The size of your file will be greatly affected by compression level. At some point, if you compress too much you will see/hear artifacts. The acceptable level of compression is subjective and really depends on the input data (image or audio). You should be testing different levels of compression to see what works.
