Front Camera Video Capturing Distortion - Android

I am working on a video capturing app. It works fine with the back camera, but when I switch to the front camera the recorded video is very blurry (just some lines across the video).
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_720P));
mediaRecorder.setOutputFile("/sdcard/myvideo.mp4");
mediaRecorder.setMaxDuration(600000); // Set max duration to 10 minutes (600000 ms)
mediaRecorder.setMaxFileSize(50000000); // Set max file size to 50 MB

I searched a lot and eventually found the solution below.
The setVideoEncodingBitRate, setVideoFrameRate, and setVideoSize parameters should be chosen according to the capabilities of the end user's device. The constant values below work fine for my device, so make them generic as needed. Also remember that front cameras typically support lower resolutions than back cameras.
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_LOW));
mediaRecorder.setVideoEncodingBitRate(512 * 1000);
mediaRecorder.setVideoFrameRate(15);
mediaRecorder.setVideoSize(640, 480); // overrides the profile's resolution
mediaRecorder.setOutputFile("/sdcard/myvideo.mp4");
mediaRecorder.setMaxDuration(600000); // Set max duration to 10 minutes (600000 ms)
mediaRecorder.setMaxFileSize(50000000); // Set max file size to 50 MB
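If you want to avoid hard-coded values entirely, one option (a sketch using the old Camera API, not part of the original answer; findFrontCameraId and pickProfile are hypothetical helpers) is to ask CamcorderProfile which qualities the front camera actually supports:
// Find the id of the front-facing camera, or -1 if there is none.
static int findFrontCameraId() {
    Camera.CameraInfo info = new Camera.CameraInfo();
    for (int id = 0; id < Camera.getNumberOfCameras(); id++) {
        Camera.getCameraInfo(id, info);
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            return id;
        }
    }
    return -1;
}

// Walk from higher to lower quality and return the first supported profile.
static CamcorderProfile pickProfile(int cameraId) {
    int[] qualities = {
        CamcorderProfile.QUALITY_720P,
        CamcorderProfile.QUALITY_480P,
        CamcorderProfile.QUALITY_LOW,
    };
    for (int q : qualities) {
        if (CamcorderProfile.hasProfile(cameraId, q)) {
            return CamcorderProfile.get(cameraId, q);
        }
    }
    // QUALITY_LOW is defined for every camera, so this is a safe fallback.
    return CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_LOW);
}
With these helpers, mediaRecorder.setProfile(pickProfile(findFrontCameraId())) would replace the hard-coded QUALITY_LOW call.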
Attaching links which helped me come to this solution:
Blurr/Distorted video Error Insight

Related

Android MediaRecorder CaptureRate and VideoFrameRate have no effect

I'm using the MediaRecorder to record the device's screen. I'd like to be able to control the frame rate of the video captured.
I've tried setting both the Capture Rate and the VideoFrameRate properties of the MediaRecorder, but they seem to have no effect:
this.mRecorder.setCaptureRate(this.fps);
this.mRecorder.setVideoFrameRate(this.fps);
As per the documentation (setCaptureRate, setVideoFrameRate), I am calling setCaptureRate and setVideoFrameRate after setting the format and video source, and before calling prepare:
this.mRecorder = new MediaRecorder();
this.mRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
this.mRecorder.setVideoSource(2); // MediaRecorder.VideoSource.SURFACE
this.mRecorder.setOutputFormat(2); // MediaRecorder.OutputFormat.MPEG_4
this.mRecorder.setOutputFile(this.mFilePath);
this.mRecorder.setVideoSize(screenWidth, screenHeight);
this.mRecorder.setVideoEncoder(2); // MediaRecorder.VideoEncoder.H264
this.mRecorder.setAudioEncoder(2); // MediaRecorder.AudioEncoder.AMR_WB
this.mRecorder.setVideoEncodingBitRate(this.mBitRate);
this.mRecorder.setCaptureRate(this.fps);
this.mRecorder.setVideoFrameRate(this.fps);
this.mRecorder.prepare();
I've checked, and this.fps is set to 5, so it is not some other incorrect value... and the documentation says:
The fps can go as low as desired
Does anyone know how to set the FPS?

Front cam recording using MediaRecorder not working smoothly

I have a Huawei P9 Plus smartphone running Android 7.0. I'm using MediaRecorder to record the front cam, which is an 8 MP camera. I'm using the following settings (I think this is the most important part; I'm not posting the whole class because it is too many lines of code):
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setOutputFile(videoFile.getAbsolutePath());
mMediaRecorder.setVideoEncodingBitRate(8000000);
mMediaRecorder.setVideoFrameRate(30);
mMediaRecorder.setVideoSize(1024, 1920);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaSurface = MediaCodec.createPersistentInputSurface();
mMediaRecorder.setInputSurface(mMediaSurface);
mMediaRecorder.prepare();
With these settings it works, but sometimes the video is a bit jerky. It is also strange that it works with video size 1024 x 1920, but when I set 1080 x 1920 it does not work anymore (there is no error, but the video is completely corrupted). Why is that? In the supported resolutions I got from the front cam characteristics, 1080 x 1920 is listed, but 1024 x 1920 is not.
Are my other settings OK? Is setVideoEncodingBitRate OK for an 8 MP camera?
I have also tried to use a given profile as follows:
mMediaRecorder = new MediaRecorder();
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
mMediaRecorder.setOutputFile(videoFile.getAbsolutePath());
mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_1080P));
mMediaRecorder.setVideoFrameRate(30);
mMediaSurface = MediaCodec.createPersistentInputSurface();
mMediaRecorder.setInputSurface(mMediaSurface);
mMediaRecorder.prepare();
When I run it this way I'm getting an error when I try to stop MediaRecorder (stop failed: -1007), probably because starting video recording did not succeed. Why? Did I make a mistake?
You're probably right. Try adding a property (e.g. a boolean) to track whether the recorder has actually started; that way you won't call stop() when recording never began.
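A minimal sketch of that flag pattern (field and method names are hypothetical):
private boolean mIsRecording = false; // tracks whether start() actually succeeded

private void startRecording() throws IOException {
    mMediaRecorder.prepare();
    mMediaRecorder.start(); // throws on failure
    mIsRecording = true;    // only reached if start() did not throw
}

private void stopRecording() {
    if (!mIsRecording) {
        return; // never started, so stop() would fail (e.g. with -1007)
    }
    try {
        mMediaRecorder.stop();
    } catch (RuntimeException e) {
        // stop() can still throw if no valid data was recorded
    } finally {
        mIsRecording = false;
        mMediaRecorder.reset();
    }
}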

MediaRecorder.setVideoFrameRate() not having any effect

I'm trying to get my Android app to record video with a lower frame rate (to reduce file size).
Here's my MediaRecorder configuration code:
m_mediaRecorder = new MediaRecorder();
m_mediaRecorder.setCamera(mCamera);
m_mediaRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
m_mediaRecorder.setOutputFormat(profile.fileFormat);
m_mediaRecorder.setVideoEncodingBitRate(profile.videoBitRate);
m_mediaRecorder.setVideoEncoder(profile.videoCodec);
m_mediaRecorder.setVideoSize(640, 480);
m_mediaRecorder.setVideoFrameRate(10);
m_mediaRecorder.setOrientationHint((int) orientationListener.getPreviewRotation(null));
m_mediaRecorder.setOutputFile(videoDirectory + "/" + uuid + ".mp4");
try {
    m_mediaRecorder.prepare();
    m_mediaRecorder.start();
} catch (Exception e) {
    e.printStackTrace();
}
The video does record successfully, but no matter what I try, the frame rate seems to be fixed at 30 fps. m_mediaRecorder.setVideoFrameRate(10); has no effect.
(If I set the videoBitRate to a lower value, this reduces the file size but also reduces the quality of each individual frame - something we do not want to do.)
(For the record - Android 6.0.1; SDK Version 21.)
What am I missing?
Thanks,
Reuven
mediaRecorder.setCaptureRate(20);
mediaRecorder.setVideoFrameRate(20);
I was successful with this, please try it. Thanks!
Maybe it makes sense to limit the frame rate on the Camera as well?
Anyway, do not expect a significant gain from lowering the frame rate: the longer the interval between frames, the more bits are required to encode each picture.
Much greater bitrate savings can be achieved by reducing the picture resolution.
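For the first suggestion, a minimal sketch using the (deprecated) Camera API that the question already uses; it assumes mCamera is the open camera instance:
// Ask the camera for its supported preview FPS ranges and pick the lowest.
// Values are in units of frames-per-second * 1000 (e.g. 15000 == 15 fps).
Camera.Parameters params = mCamera.getParameters();
List<int[]> ranges = params.getSupportedPreviewFpsRange();
int[] lowest = ranges.get(0); // ranges are sorted in ascending order
params.setPreviewFpsRange(lowest[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
                          lowest[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
mCamera.setParameters(params);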

Android camera2 jpeg framerate

I am trying to save image sequences with a fixed frame rate (preferably up to 30) on an Android device with FULL capability for camera2 (Galaxy S7), but I am unable to (a) get a steady frame rate, or (b) reach even 20 fps (with JPEG encoding). I already included the suggestions from Android camera2 capture burst is too slow.
The minimum frame duration for JPEG is 33.33 milliseconds (for resolutions below 1920x1080) according to
characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputMinFrameDuration(ImageFormat.JPEG, size);
and the stall duration is 0 ms for every size (similar for YUV_420_888).
My capture builder looks as follows:
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CONTROL_AE_MODE_OFF);
captureBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, _exp_time);
captureBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, _iso_value);
captureBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, _foc_dist);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CONTROL_AF_MODE_OFF);
captureBuilder.set(CaptureRequest.CONTROL_AWB_MODE, _wb_value);
// https://stackoverflow.com/questions/29265126/android-camera2-capture-burst-is-too-slow
captureBuilder.set(CaptureRequest.EDGE_MODE,CaptureRequest.EDGE_MODE_OFF);
captureBuilder.set(CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE, CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
captureBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE, CaptureRequest.NOISE_REDUCTION_MODE_OFF);
captureBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION,ORIENTATIONS.get(rotation));
Focus distance is set to 0.0 (infinity), ISO is set to 100, exposure time to 5 ms. White balance can be set to OFF/AUTO/any value; it does not impact the times below.
I start the capture session with the following command:
session.setRepeatingRequest(_capReq.build(), captureListener, mBackgroundHandler);
Note: It does not make a difference whether I use setRepeatingRequest or setRepeatingBurst.
In the preview (only texture surface attached), everything is at 30fps.
However, this changes as soon as I attach an image reader (listener running on a HandlerThread), which I instantiate as follows (without saving, only measuring the time between frames):
reader = ImageReader.newInstance(_img_width, _img_height, ImageFormat.JPEG, 2);
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
With time-measuring code:
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader myreader) {
        Image image = myreader.acquireNextImage();
        if (image == null) {
            return;
        }
        long curr = image.getTimestamp();
        Log.d("curr - last_ts", "" + ((curr - last_ts) / 1000000) + " ms");
        last_ts = curr;
        image.close();
    }
};
I get periodically repeating time differences like this:
99 ms - 66 ms - 66 ms - 99 ms - 66 ms - 66 ms ...
I do not understand why these take double or triple the time that the stream configuration map advertised for JPEG. The exposure time is well below the frame duration of 33 ms. Is there some other internal processing happening that I am not aware of?
I tried the same for the YUV_420_888 format, which resulted in constant time differences of 33 ms. The problem I have here is that the cellphone lacks the bandwidth to store the images fast enough (I tried the method described in How to save a YUV_420_888 image?). If you know of any method to compress or encode these images fast enough, please let me know.
Edit: From the documentation of getOutputStallDuration: "In other words, using a repeating YUV request would result in a steady frame rate (let's say it's 30 FPS). If a single JPEG request is submitted periodically, the frame rate will stay at 30 FPS (as long as we wait for the previous JPEG to return each time). If we try to submit a repeating YUV + JPEG request, then the frame rate will drop from 30 FPS." Does this imply that I need to periodically request a single capture()?
Edit2: From https://developer.android.com/reference/android/hardware/camera2/CaptureRequest.html: "The necessary information for the application, given the model above, is provided via the android.scaler.streamConfigurationMap field using getOutputMinFrameDuration(int, Size). These are used to determine the maximum frame rate / minimum frame duration that is possible for a given stream configuration.
Specifically, the application can use the following rules to determine the minimum frame duration it can request from the camera device:
1. Let the set of currently configured input/output streams be called S.
2. Find the minimum frame durations for each stream in S by looking it up in android.scaler.streamConfigurationMap using getOutputMinFrameDuration(int, Size) (with its respective size/format). Let this set of frame durations be called F.
3. For any given request R, the minimum frame duration allowed for R is the maximum out of all values in F. Let the streams used in R be called S_r.
4. If none of the streams in S_r have a stall time (listed in getOutputStallDuration(int, Size) using its respective size/format), then the frame duration in F determines the steady state frame rate that the application will get if it uses R as a repeating request."
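As a sketch of how these rules apply to the setup above (one YUV_420_888 stream plus one JPEG stream; characteristics and size are assumed to be the camera characteristics and the configured output size):
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);

// F: the minimum frame durations (in ns) of all configured streams
long yuvMinNs  = map.getOutputMinFrameDuration(ImageFormat.YUV_420_888, size);
long jpegMinNs = map.getOutputMinFrameDuration(ImageFormat.JPEG, size);

// Rule 3: the minimum frame duration for the request is the maximum over F
long requestMinNs = Math.max(yuvMinNs, jpegMinNs);

// Rule 4: this is only a steady-state guarantee if no stream in the request stalls
double maxSteadyFps = 1e9 / requestMinNs;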
The JPEG output is by far not the fastest way to fetch frames. You can accomplish this a lot faster by drawing the frames directly onto a quad using OpenGL.
For burst capture, a faster solution would be capturing the images to RAM without encoding them, then encoding and saving them asynchronously.
On this website you can find a lot of excellent code related to Android multimedia in general.
This specific program uses OpenGL to fetch the pixel data from an MPEG video. It's not difficult to use the camera as input instead of a video. You can basically use the texture used in the CodecOutputSurface class from the mentioned program as output texture for your capture request.
A possible solution I found consists of dumping the YUV data without encoding it as JPEG, in combination with a micro SD card that can write up to 95 MB per second. (I had the misconception that YUV images would be larger; in fact, with a cellphone that has full support for the camera2 pipeline, the write speed should be the limiting factor.)
With this setup, I was able to achieve the following stable rates:
1920x1080 at 15 fps (approx. 4 MB * 15 == 60 MB/sec)
960x720 at 30 fps (approx. 1.5 MB * 30 == 45 MB/sec)
I then encode the images offline from YUV to PNG using a Python script.
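A rough sketch of such a dump (my reconstruction, not the poster's code): it writes the raw planes of each YUV_420_888 Image straight to disk. Note that a production version must account for rowStride/pixelStride when the row stride exceeds the image width.
// Dump the raw YUV planes of an Image to an open FileOutputStream.
// Assumes rowStride == width for simplicity; otherwise copy row by row.
void dumpYuv(Image image, FileOutputStream out) throws IOException {
    for (Image.Plane plane : image.getPlanes()) {
        ByteBuffer buf = plane.getBuffer();
        byte[] bytes = new byte[buf.remaining()];
        buf.get(bytes);
        out.write(bytes);
    }
}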

MediaRecorder setParameter failed (setAudioSamplingRate)

I am trying to build an app with a record function. I want the user to be able to choose the recording quality; one of the options is AAC recording. I am using the code below, but I keep getting errors when setting the audio sampling rate. Any ideas?
mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioSamplingRate(96000);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
and the error:
04-29 10:32:17.477: E/MediaRecorder(18750): setParameters(audio-param-sampling-rate=96000) failed: -22
04-29 10:32:17.477: E/com.test.com.AudioRecorder(18750): setParameter failed.
04-29 10:32:17.477: E/com.test.com.AudioRecorder(18750): prepare() method called on illegal state
Not every audio sampling rate is supported on every device. If a rate is not supported, it will either produce an error or be clipped internally to a suitable value. A sample rate of 44100 Hz is supported by all Android devices according to the AudioRecord documentation. For quality, you can set the audio encoding bit rate together with the sample rate, choosing the bit rate according to your needs. This formula may help:
Bit rate = (sampling rate) × (bit depth) × (number of channels)
Bit depth is generally 16 on Android devices (16-bit PCM is the format guaranteed to be supported).
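Following that advice, a minimal sketch of a configuration that should be broadly supported; the 96 kbps bit rate and the outputPath variable are illustrative assumptions, not from the answer:
mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setAudioSamplingRate(44100);    // supported on all Android devices
mediaRecorder.setAudioEncodingBitRate(96000); // e.g. 96 kbps AAC; adjust as needed
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.setOutputFile(outputPath);      // outputPath: hypothetical file path
mediaRecorder.prepare();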
