This is coded in NativeScript, so I'll try my best to adapt the scenario to Java. I have created an in-app video view with support for recording video.
This is done as follows:
First I create a SurfaceView that will hold the preview of the camera:
this.mSurfaceView = new android.view.SurfaceView(this._context);
this.mHolder = this.mSurfaceView.getHolder();
this.mHolder.setType(android.view.SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
Then I create an instance of the Camera and set the preview surface:
var mCamera = android.hardware.Camera;
var camera = mCamera.open(1);
this.camera = camera;
this.camera.setDisplayOrientation(90);
var parameters = camera.getParameters();
parameters.setRecordingHint(true);
if( parameters.isVideoStabilizationSupported() ){
parameters.setVideoStabilization(true);
}
camera.setParameters(parameters);
this.camera.setPreviewDisplay(this.mHolder);
this.camera.startPreview();
this.camera.startFaceDetection();
Now, all is good. I have the camera preview in the view that I want it to be. The color is good and I think the image aspect ratio is good too.
However, when I initiate the recording, as I do with the following code:
this.mediarecorder = new android.media.MediaRecorder();
// Step 1: Unlock and set camera to MediaRecorder
this.camera.unlock();
this.mediarecorder.setCamera(this.camera);
// Step 2: Set sources
this.mediarecorder.setAudioSource(android.media.MediaRecorder.AudioSource.CAMCORDER);
this.mediarecorder.setVideoSource(android.media.MediaRecorder.VideoSource.CAMERA);
//this.mediarecorder.setOutputFormat(android.media.MediaRecorder.OutputFormat.MPEG_4);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
this.mediarecorder.setProfile(android.media.CamcorderProfile.get(android.media.CamcorderProfile.QUALITY_HIGH));
// platform.screen.mainScreen.widthDIPs
// platform.screen.mainScreen.heightDIPs
// Step 4: Set output file
var fileName = "videoCapture_" + new Date() + ".mp4";
var path = android.os.Environment.getExternalStoragePublicDirectory(android.os.Environment.DIRECTORY_DCIM).getAbsolutePath() + "/Camera/" + fileName;
this.file = new java.io.File(path);
this.mediarecorder.setOutputFile(this.file.toString());
this.mediarecorder.setOrientationHint(270);
try {
this.mediarecorder.prepare();
this.mediarecorder.start();
} catch( ex ) {
console.log(ex);
}
Then the image suddenly becomes darker, and my face (it's what's in focus when I'm trying it out) gets wider. So the aspect ratio changes, and so does the lighting somehow.
I have tried setting setPictureSize on the camera parameters and setVideoSize on the MediaRecorder, with no luck. As for the lighting change, I simply have no clue what's going on. I've been googling myself halfway to heaven and still found nothing, so I hope someone here has a tip on what to pursue next?
Video recording generally tries to run at a steady frame rate, such as 30 fps. Camera preview often slows itself down to 10-15 fps to maintain brightness, so if you're in a darker location the recorded video will come out darker: the recorder can't expose for longer than 1/30 s, while the preview can expose for up to 1/10 s.
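If you want the preview to behave the same way as the recording will, one option (a sketch against the legacy android.hardware.Camera API used in the question, not something the question's code already does) is to lock the preview FPS range to the recording frame rate before starting the preview:
android.hardware.Camera.Parameters params = camera.getParameters();
int targetFps = 30000; // 30 fps, in the 1/1000-fps units the API uses
int[] chosen = null;
for (int[] range : params.getSupportedPreviewFpsRange()) {
    // Look for a fixed 30 fps range; many devices expose one such as [30000, 30000].
    if (range[android.hardware.Camera.Parameters.PREVIEW_FPS_MIN_INDEX] == targetFps
            && range[android.hardware.Camera.Parameters.PREVIEW_FPS_MAX_INDEX] == targetFps) {
        chosen = range;
        break;
    }
}
if (chosen != null) {
    params.setPreviewFpsRange(
            chosen[android.hardware.Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
            chosen[android.hardware.Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
    camera.setParameters(params);
}
With a fixed [30000, 30000] range the preview can no longer drop to 10-15 fps for extra exposure, so its brightness matches what the recorder will capture (at the cost of a darker preview in low light).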
Did you call setVideoSize before or after calling setProfile? The setProfile call changes many parameters, including preview size; most video recording sizes are 16:9, and the default camera preview resolution is likely a 4:3 size. So when you start the recording, the aspect ratio switches.
Most video recording apps use 16:9 preview sizes even before starting recording so that they're consistent. You can also record 4:3 video, but that's generally not what people want to see.
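A minimal sketch of that suggestion, reusing the android.hardware.Camera and CamcorderProfile calls already present in the question (run it before startPreview(); the 0.01 tolerance is just an arbitrary choice for comparing aspect ratios):
android.media.CamcorderProfile profile =
        android.media.CamcorderProfile.get(android.media.CamcorderProfile.QUALITY_HIGH);
double targetRatio = (double) profile.videoFrameWidth / profile.videoFrameHeight;
android.hardware.Camera.Parameters params = camera.getParameters();
android.hardware.Camera.Size best = null;
for (android.hardware.Camera.Size size : params.getSupportedPreviewSizes()) {
    // Keep the first preview size whose aspect ratio matches the profile's video size.
    if (Math.abs((double) size.width / size.height - targetRatio) < 0.01) {
        best = size;
        break;
    }
}
if (best != null) {
    params.setPreviewSize(best.width, best.height);
    camera.setParameters(params);
}
That way the preview already shows the 16:9 (or whatever the profile uses) framing, and nothing jumps when the MediaRecorder takes over.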
Related
I am working on a video recording app in which I want to record videos in portrait. Everything seems fine except that the video is saved in landscape mode. I tried an implementation using this project as an example: https://github.com/HofmaDresu/AndroidCamera2Sample, but the video is still being saved in landscape mode.
void PrepareMediaRecorder()
{
if (mediaRecorder == null)
{
mediaRecorder = new MediaRecorder();
}
else
{
mediaRecorder.Reset();
}
var map = (StreamConfigurationMap)characteristics.Get(CameraCharacteristics.ScalerStreamConfigurationMap);
if (map == null)
{
return;
}
videoFileName = GetVideoFilePath();
mediaRecorder.SetAudioSource(AudioSource.Mic);
mediaRecorder.SetVideoSource(VideoSource.Surface);
mediaRecorder.SetOutputFormat(OutputFormat.Mpeg4);
mediaRecorder.SetOutputFile(videoFileName);
mediaRecorder.SetVideoEncodingBitRate(10000000);
mediaRecorder.SetVideoFrameRate(30);
var videoSize = ChooseVideoSize(map.GetOutputSizes(Java.Lang.Class.FromType(typeof(MediaRecorder))));
mediaRecorder.SetVideoEncoder(VideoEncoder.H264);
mediaRecorder.SetAudioEncoder(AudioEncoder.Aac);
mediaRecorder.SetVideoSize(videoSize.Width, videoSize.Height);
int rotation = (int)Activity.WindowManager.DefaultDisplay.Rotation;
mediaRecorder.SetOrientationHint(GetOrientation(rotation));
mediaRecorder.Prepare();
}
Assuming a high-quality video player shows you the video in portrait (if not, your GetOrientation method probably has an error in it), but other players you still care about are stuck in landscape:
You'll have to rotate the frames yourself. Unfortunately this is messy, since there's no automatic control for it in the media encoder APIs that I know of.
One option is to receive frames from the camera via an ImageReader, do the rotation in Java or in native code via JNI, and then send each frame to the encoder, either through an ImageWriter into a MediaRecorder or MediaCodec Surface, or through MediaCodec's ByteBuffer interface.
Or you could send the frames to the GPU via a SurfaceTexture, rotate them in a fragment shader, and then write them out to a Surface tied to a MediaRecorder/MediaCodec again.
Both of these require a lot of boilerplate code and understanding of lower-level details, unfortunately.
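As a first check on the metadata path (the "probably has an error in it" case above), here is a Java sketch of the orientation calculation given in the camera2 documentation for JPEG_ORIENTATION; the same value is what is normally passed to setOrientationHint(). The question's code is C#/Xamarin, so treat this as a reference rather than a drop-in; deviceOrientation is assumed to come from an OrientationEventListener.
int getOrientationHint(CameraCharacteristics c, int deviceOrientation) {
    if (deviceOrientation == android.view.OrientationEventListener.ORIENTATION_UNKNOWN) {
        return 0;
    }
    int sensorOrientation = c.get(CameraCharacteristics.SENSOR_ORIENTATION);
    // Round the device orientation to the nearest multiple of 90 degrees.
    deviceOrientation = (deviceOrientation + 45) / 90 * 90;
    // Front-facing cameras are mirrored, so the rotation direction is reversed.
    boolean facingFront =
            c.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT;
    if (facingFront) {
        deviceOrientation = -deviceOrientation;
    }
    // Clockwise rotation needed to make the recorded frames upright.
    return (sensorOrientation + deviceOrientation + 360) % 360;
}
If this gives the same values as your GetOrientation and some players still show landscape, those players are simply ignoring the rotation metadata, and you're back to rotating the frames yourself as described above.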
I am having a headache over the Camera API 1 for Android. After reading everything I could find on the internet, I made a sample app that works OK. It creates a service, which is then used to operate the camera in the background, so there is no preview or activity involved. To achieve this I use a dummy SurfaceHolder, like this:
protected class MySurfaceHolder implements SurfaceHolder {
private final Surface surface;
private final SurfaceTexture surfaceTexture;
public MySurfaceHolder () {
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
if (textures.length > 0) {
this.surfaceTexture = new SurfaceTexture(textures[0]);
this.surface = new Surface(this.surfaceTexture);
} else {
this.surface = null;
this.surfaceTexture = null;
}
}
[...]
}
and then I use it like this
// simplified version of my code
try {
initializeCamera(); // open camera and set Camera.Parameters
camera.setPreviewDisplay(new MySurfaceHolder());
camera.startPreview();
camera.unlock();
initializeMediaRecorder(); // create MediaRecorder, set video/audio parameters
mediaRecorder.prepare();
mediaRecorder.start();
// wait until recording finishes and exit
} finally {
stopRecording();
}
The Camera and MediaRecorder initialization methods are just like the documentation says they should be (and they work).
Everything works and operates as it should. Almost everything: sometimes, under unknown circumstances, the MediaRecorder creates nearly empty files of about 32 kB, containing only headers and info about the video and no frames. The longer I record like this, the bigger the file gets (a few kB every few seconds); after 1 minute the file weighs about 80 kB. The funny thing is, I know the camera is working and capturing frames (I debugged it a little by showing preview frames), but the frames are not written into the output file.
Also, when this happens I am not able to record in FHD (1920x1080); I get the "start failed" message, and at that point the camera is not capturing frames. The same thing can happen when I use a wrong (unsupported) video size. I suppose in that case the error is thrown at the mediaRecorder.start(); line and stopRecording(); is invoked, but I am not sure.
After some time, or after some unknown action, the problem suddenly goes away (I don't know when or how). It happens for sure on Android 5.1, but may happen on other versions as well.
Could this bug be related to my custom surface code?
What could cause the MediaRecorder to not write frames into a file?
Why am I not able to record in FHD, while at the same time I am able to record in HD (1280x720)?
Is there any alternative for MediaRecorder, so I can avoid these bugs?
Could it happen when another app tries to get the Camera object, thus disrupting the current recording? If so, how do I regain access to the Camera object (apparently I am not able to do this now on some devices)?
EDIT:
I think I might have a clue. I am calling
camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // ... get current frame
    }
});
camera.startPreview();
to get a preview frame of the current recording. It appears that the bug occurs when I use this method to grab a preview frame (at random times). It seems flawed, because not all devices react to it properly (sometimes there is no preview frame at all). Is there any other, better method of grabbing the current preview frame without a real surface?
I was using a tutorial for the camera2 API for Android, and one of the steps was to resize the TextureView's surface to an acceptable size by doing the following:
SurfaceTexture surfaceTexture = mTextureView.getSurfaceTexture();
surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Surface previewSurface = new Surface(surfaceTexture);
previewBuilder = CD.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewBuilder.addTarget(previewSurface);
The mPreviewSize variable is of type Size and was determined beforehand: the code cycles through the acceptable sizes and selects the most optimal one according to your screen size. The problem is that I'm using a SurfaceView, and I'm trying to resize the Surface object inside that SurfaceView. I tried this, but it didn't work:
SurfaceHolder SH= gameSurface.getHolder();
SH.setFixedSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Surface Sur = SH.getSurface();
previewBuilder = CD.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewBuilder.addTarget(Sur);
In debug mode I can see that mPreviewSize is correct (as in, it is set to an acceptable size), but I get an error saying that I'm trying to use an unacceptable size; the size it reports is not the same as mPreviewSize, which means the resizing isn't working. Any ideas?
You probably need to wait to receive the surfaceChanged callback from the SurfaceView, before trying to use the Surface to create a camera capture session.
setFixedSize doesn't necessarily take effect immediately.
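A minimal sketch of that idea, reusing gameSurface and mPreviewSize from the question; createPreviewSession() is a hypothetical stand-in for your own createCaptureRequest()/createCaptureSession() code:
SurfaceHolder holder = gameSurface.getHolder();
holder.setFixedSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
holder.addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Too early: the buffer may still have its old size here.
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        if (width == mPreviewSize.getWidth() && height == mPreviewSize.getHeight()) {
            // The Surface now really has the requested size, so it is safe to use
            // as a camera2 target.
            createPreviewSession(holder.getSurface()); // hypothetical helper
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Stop the capture session before the Surface goes away.
    }
});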
I am using MediaRecorder to record video in an Android app.
mMediaRecorder.setCamera(mServiceCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.DEFAULT);
//mMediaRecorder.setVideoSize(mPreviewSize.width, mPreviewSize.height);
mMediaRecorder.setVideoFrameRate(30);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
String file_name = Environment.getExternalStorageDirectory().getPath() +"/myVideo.mp4";
mMediaRecorder.setOutputFile(file_name);
mMediaRecorder.setPreviewDisplay(mSurfaceHolder.getSurface());
mMediaRecorder.prepare();
mMediaRecorder.start();
The problem is with this line:
mMediaRecorder.setVideoSize(mPreviewSize.width, mPreviewSize.height);
On HTC and Xperia devices, setVideoSize works fine (recording only works if I don't comment out this line).
But on Nexus and Note devices, setVideoSize won't work (recording only works if I comment out this line).
What should I do so that the app runs correctly on all these devices?
You need to understand that the preview and the actual captured video are two different things; likewise, preview sizes and video sizes are two different parameters. What you see in the viewfinder is essentially the preview, but it is not what actually gets recorded.
When starting the camera, you set a preview size on it. But you must query the supported preview sizes and set one of them:
Camera camera = Camera.open();
List<Camera.Size> psizes = camera.getParameters()
        .getSupportedPreviewSizes();
Once you have set up the preview, you can start recording with a MediaRecorder. The video size is set on the MediaRecorder, and that is the actual size of the video that will be captured.
Again, you should set one of the supported video sizes:
List<Camera.Size> sizes = camera.getParameters()
        .getSupportedVideoSizes();
and then you can set one of these on the MediaRecorder:
mediaRecorder.setVideoSize(videoWidth, videoHeight);
So remember to always check the supported sizes, or you are bound to get a crash.
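Putting both queries together, a minimal sketch against the legacy android.hardware.Camera API (one caveat worth hedging: getSupportedVideoSizes() can return null on devices where the video sizes are simply the preview sizes, so fall back to the preview list in that case):
android.hardware.Camera camera = android.hardware.Camera.open();
android.hardware.Camera.Parameters params = camera.getParameters();

java.util.List<android.hardware.Camera.Size> previewSizes = params.getSupportedPreviewSizes();
java.util.List<android.hardware.Camera.Size> videoSizes = params.getSupportedVideoSizes();
if (videoSizes == null) {
    // No separate video sizes on this device: the preview sizes apply.
    videoSizes = previewSizes;
}

// For brevity take the first entries; a real app would pick by resolution
// and matching aspect ratio.
android.hardware.Camera.Size previewSize = previewSizes.get(0);
android.hardware.Camera.Size videoSize = videoSizes.get(0);

params.setPreviewSize(previewSize.width, previewSize.height);
camera.setParameters(params);

android.media.MediaRecorder recorder = new android.media.MediaRecorder();
// ... unlock the camera, set it on the recorder, set sources and output format first ...
recorder.setVideoSize(videoSize.width, videoSize.height);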
On many devices the supported video sizes are the same as the supported preview sizes. You have to check first whether the video size you are setting is actually available, because supported video sizes differ between devices. So first query the available preview sizes using getSupportedPreviewSizes(), which returns a list, pick one of them, and then set the video size.
I'm trying to record video from the Camera using the MediaRecorder. Here's a code snippet
snip..
mr.setAudioSource( MediaRecorder.AudioSource.MIC );
mr.setVideoSource( MediaRecorder.VideoSource.CAMERA);
mr.setOutputFormat( MediaRecorder.OutputFormat.THREE_GPP );
mr.setAudioEncoder( MediaRecorder.AudioEncoder.AMR_NB );
mr.setVideoEncoder( MediaRecorder.VideoEncoder.MPEG_4_SP );
mr.setVideoSize( 200, 200 );
mr.setVideoFrameRate( 15 );
..snap
The code executes on a Milestone/Droid and a non-empty output file is created. But when I try to view the video, the picture comes out garbled.
My first thought was some sort of encoding error, so I tried every possible OutputFormat/VideoEncoder combination, with no effect on the result.
LogCat shows the following error
CameraInput: Unsupported parameter(x-pvmf/media-input-node/cap-config-interface;valtype=key_specific_value)
But I can't figure out what I may have set wrong. I used camera.getParameters(), set the preview size on the returned params, and then pushed them back using camera.setParameters()...
I worked through every piece of sample code I could find, but still found no solution.
Does anyone have any ideas ?
You must pass a correct size to setVideoSize(x, y).
Call the method that gives you the supported size options and choose one from that list.
When the camera is opened:
Camera.Parameters p = mCamera.getParameters();
p.setPreviewSize(mSur.getWidth(), mSur.getHeight());
mCamera.setParameters(p);
and when you prepare to record:
mCamera.unlock();
if (mRecorder == null) {
mRecorder = new MediaRecorder();
} else {
mRecorder.reset();
}
mNextRecordFileName = getOneFileName();
mRecorder.setCamera(mCamera);
mRecorder.setVideoSource(mVideoSource);
mRecorder.setAudioSource(mAudioSource);
mRecorder.setPreviewDisplay(mSur.getHolder().getSurface());
mRecorder.setOutputFormat(mVideoFormat);
// Only takes effect if set before the setEncoder calls; if not set, HTC devices will crash
mRecorder.setVideoSize(this.mSur.getWidth(), mSur.getHeight());
Log.i(TAG, this.mSur.getHolder().getSurfaceFrame().height() + " height");
Log.i(TAG, this.mSur.getHolder().getSurfaceFrame().width() + " width");
Log.i(TAG, "width " + this.mSur.getWidth() + " height " + mSur.getHeight());
mRecorder.setVideoEncoder(mVideoEncoder);
mRecorder.setAudioEncoder(mAudioEncoder);
mRecorder.setOutputFile(mNextRecordFileName);
Hope it helps.
To avoid distortion in the recorded video (I have seen this on the Galaxy S3), make sure you set the camera's preview size and the MediaRecorder's video size to the same width and height.
To get supported camera preview size:
Camera.Parameters parameters = camera.getParameters();
List<Camera.Size> list = parameters.getSupportedPreviewSizes();
// X is the index of whichever supported size you pick from the list
parameters.setPreviewSize(list.get(X).width, list.get(X).height);
// If you set a width and height not supported by the device you will get an exception
// MediaRecorder object mMediaRecorder
mMediaRecorder.setVideoSize(list.get(X).width, list.get(X).height);
Hey,
I know you posted this a while ago and I doubt you're still looking for an answer, but I thought this might help someone else out.
I think your problem is mr.setVideoSize( 200, 200 );
I doubt the phone's camera supports a 200x200 (1:1 aspect ratio) capture resolution. It is better to use something like mr.setVideoSize(CamcorderProfile.get(CamcorderProfile.QUALITY_LOW).videoFrameWidth, CamcorderProfile.get(CamcorderProfile.QUALITY_LOW).videoFrameHeight);
That will ensure that the resolution is supported by the camera. Also make sure your preview resolution matches your capture resolution, or that can cause the same problem. I know it happens to me if I have my preview set to QUALITY_HIGH and my recording set to QUALITY_LOW.
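A minimal sketch of that suggestion, assuming camera is the opened Camera and mr the MediaRecorder from the snippet above, and assuming the device also accepts the profile's frame size as a preview size:
CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_LOW);

// Use the profile's frame size for the preview as well, so the two agree.
Camera.Parameters params = camera.getParameters();
params.setPreviewSize(profile.videoFrameWidth, profile.videoFrameHeight);
camera.setParameters(params);

// ... unlock the camera, set sources and output format on mr as before ...
mr.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);
mr.setVideoFrameRate(profile.videoFrameRate);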