Samsung Galaxy SIII mediaRecorder() issues. (Corrupt Video) - android

I have a problem with the Samsung Galaxy SIII. In the app we are creating, we use a MediaRecorder to record a video of the user with the front camera. I have looked thoroughly through the documentation and forums and have seen a few similar posts about the SII or about crashes in general, but those fixes unfortunately did not work for us.

The recording process is as follows: a function (code provided below) reads each device's supported camera resolutions, and we check whether any of them meets our specification (currently 480p or less). If one does, we use it for videoSize() (the Android call that sets the recording size of a video). This seems pretty straightforward and looks like it should work with almost any device, and it does work on the other devices we've tested (e.g. Galaxy S4 and Galaxy Stellar). But for some reason the SIII is being very difficult. When you record on the SIII at any resolution lower than 720p, the video becomes corrupt and plays back as a multicolored screen (screenshots in the link below). Why not just record at 720p or higher, then? Unfortunately we need smaller video sizes so the upload is not such a heavy data load over a mobile network.
So my question is: why does recording at any resolution lower than 720p corrupt the video, when the resolution being used is pulled from a list of device-supported resolutions?
This is the function to pull supported resolutions from the device.
public Camera.Size getSupportedRecordingSizes() {
    Camera.Size result = null;
    Camera.Parameters params = camera.getParameters();
    List<Size> sizes = params.getSupportedPictureSizes();
    for (Size s : sizes) {
        if (s.height < 481 && s.width < 721) {
            if (result == null) {
                result = s;
            } else {
                int resultVideoSize = result.width * result.height;
                int newVideoSize = s.width * s.height;
                if (newVideoSize > resultVideoSize) {
                    result = s;
                }
            }
        }
    }
    if (!sizes.isEmpty() && result == null) {
        Context context = getApplicationContext();
        CharSequence text = "Used default first value";
        int duration = Toast.LENGTH_SHORT;
        Toast toast = Toast.makeText(context, text, duration);
        toast.show();
        for (Size size : sizes) {
            if (result == null) {
                result = size;
            }
            int previousSize = result.width * result.height;
            int newSize = size.width * size.height;
            if (newSize < previousSize) {
                result = size;
            }
        }
    }
    return result;
}
This is the code for our MediaRecorder setup (shortened for simplicity):
mediaRecorder = new MediaRecorder();
setCameraDisplayOrientaion();
// sets the device-supported video size less than or equal to 480p (720x480 resolution)
Camera.Size vidSize = getSupportedRecordingSizes();
camera.unlock();
mediaRecorder.setCamera(camera);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setOutputFormat(videoFormat);
mediaRecorder.setAudioEncoder(audioEncoder);
mediaRecorder.setVideoEncoder(videoEncoder);
mediaRecorder.setOutputFile(fullQuestionPath);
if (vidSize == null) {
    mediaRecorder.setVideoSize(480, 360);
} else {
    mediaRecorder.setVideoSize(vidSize.width, vidSize.height);
}
mediaRecorder.setVideoFrameRate(videoFrameRate);
mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
// set the bitrate manually if possible
if (android.os.Build.VERSION.SDK_INT > 7) {
    mediaRecorder.setVideoEncodingBitRate(videoBitrate);
}
try {
    mediaRecorder.prepare();
    mediaRecorder.start();
} catch (Exception e) {
    Log.e(ResponseActivity.class.toString(), e.getMessage());
    releaseMediaRecorder();
}
The images that illustrate this problem are here: http://imgur.com/a/8F7Tb (sorry for not posting them earlier).
The order of the images is as follows: 1) before recording, with the preview running; 2) recording, with the preview showing what is being recorded; 3) recording stopped, and the preview stopped as well; 4) playback, which is where the issue is: it shows this multicolored corrupt image, and the file looks the same if you pull it from the device directly.
EDIT: Note, I have also tried using CamcorderProfile and setting the quality to low or high. Setting QUALITY_HIGH forces 720p, at which recording works, but QUALITY_LOW, although everyone else seems to have exactly the opposite problem, does not work for me.
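For context, this is roughly the CamcorderProfile approach I tried (a minimal sketch, not our exact code; it assumes API 11+ for hasProfile() and QUALITY_480P, and on this device QUALITY_LOW still produced the corrupt output):

// Sketch: find the front camera id, then prefer a 480p profile if the device
// reports one, falling back to QUALITY_LOW otherwise.
int cameraId = 0;
Camera.CameraInfo info = new Camera.CameraInfo();
for (int i = 0; i < Camera.getNumberOfCameras(); i++) {
    Camera.getCameraInfo(i, info);
    if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        cameraId = i;
        break;
    }
}
int quality = CamcorderProfile.hasProfile(cameraId, CamcorderProfile.QUALITY_480P)
        ? CamcorderProfile.QUALITY_480P
        : CamcorderProfile.QUALITY_LOW;
CamcorderProfile profile = CamcorderProfile.get(cameraId, quality);
// setProfile() replaces the individual setOutputFormat/encoder/setVideoSize calls
// and must be called after setAudioSource()/setVideoSource().
mediaRecorder.setProfile(profile);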
EDIT: Does anyone have an idea to point me in the right direction?

Related

Camera preview is dark using Camera2

I am trying to use Camera2 to let an app take a simple picture. I managed to get a working sample using the android-Camera2Basic sample code; the problem is that the camera preview is very dark (same problem as this other question). Following some answers I did get a proper FPS range, [15, 15]; setting this in the lockFocus() method allows the app to take a clear picture with correct brightness and fixes the preview from the camera:
private void lockFocus() {
    try {
        // This is how to tell the camera to lock focus.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range.create(15, 15));
        // Tell #mCaptureCallback to wait for the lock.
        mState = STATE_WAITING_LOCK;
        mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
But the preview before taking the picture is still very dark. I tried setting the same line of code in other parts of the sample, but it is not working. How can I fix it in order to get the same result in the preview? I am working with a Samsung SM-P355M tablet.
Using an FPS range with equal lower and upper bounds, such as [15,15] or [30,30], puts a constraint on the AE algorithm regarding how much it can adjust to light changes, which may therefore produce dark results. Ranges like that are meant for video recording, to maintain a constant FPS. For photos you need a range with a wide spread between the lower and upper bound, such as [7,30] or [15,25].
The following method can help you find an optimal FPS range. Note that it is meant for photos, not video recording, as it discards FPS ranges with equal lower and upper bounds.
(Adjust MIN_FPS_RANGE and MAX_FPS_RANGE to your requirements)
@Nullable
public static Range<Integer> getOptimalFpsRange(@NonNull final CameraCharacteristics characteristics) {
    final int MIN_FPS_RANGE = 0;
    final int MAX_FPS_RANGE = 30;
    final Range<Integer>[] rangeList = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
    if ((rangeList == null) || (rangeList.length == 0)) {
        Log.e(TAG, "Failed to get FPS ranges.");
        return null;
    }
    Range<Integer> result = null;
    for (final Range<Integer> entry : rangeList) {
        int candidateLower = entry.getLower();
        int candidateUpper = entry.getUpper();
        if (candidateUpper > 1000) {
            Log.w(TAG, "Device reports FPS ranges in a 1000 scale. Normalizing.");
            candidateLower /= 1000;
            candidateUpper /= 1000;
        }
        // Discard candidates with equal or out-of-range bounds
        final boolean discard = (candidateLower == candidateUpper)
                || (candidateLower < MIN_FPS_RANGE)
                || (candidateUpper > MAX_FPS_RANGE);
        if (!discard) {
            // Update if none resolved yet, or if the candidate's upper bound and
            // spread are both >= those of the current result
            final boolean update = (result == null)
                    || ((candidateUpper >= result.getUpper()) && ((candidateUpper - candidateLower) >= (result.getUpper() - result.getLower())));
            if (update) {
                result = Range.create(candidateLower, candidateUpper);
            }
        }
    }
    return result;
}
After lots of research it seems there is no easy fix for this (at least not with our hardware), so we implemented a new version of the camera activities, this time using the deprecated Camera API, and everything works as expected. Not really a clean solution, but so far it works for me.

Android Camera2 increase brightness

I am using android camera2 in my application to take continuous images. When I use camera2, the image preview is much darker compared to the stock camera. I have seen this, but there is no similar requirement in that answer.
I tried to set brightness in camera2 as suggested here:
Note that this control will only be effective if android.control.aeMode != OFF. This control will take effect even when android.control.aeLock == true.
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 6);
But it is still showing the preview as a dark image, as shown below.
See the difference here:
Original Camera:
Using Camera2:
And what is the value I need to pass as second parameter in:
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 6);
I kept 6 because, as suggested in the docs:
For example, if the exposure value (EV) step is 0.333, '6' will mean an exposure compensation of +2 EV; -3 will mean an exposure compensation of -1 EV.
But there is still no effect on brightness.
Here it is:
Add the code below in onConfigured() and unlockFocus():
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE,getRange());
With the above code you will get a better preview, but the captured picture will remain as it is. To get a better picture as well, use the same code below in captureStillPicture():
captureBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, getRange());
getRange is:
private Range<Integer> getRange() {
    CameraCharacteristics chars = null;
    try {
        chars = mCameraManager.getCameraCharacteristics(mCameraId);
        Range<Integer>[] ranges = chars.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
        Range<Integer> result = null;
        for (Range<Integer> range : ranges) {
            int upper = range.getUpper();
            // 10 - min range upper for my needs
            if (upper >= 10) {
                if (result == null || upper < result.getUpper().intValue()) {
                    result = range;
                }
            }
        }
        if (result == null) {
            result = ranges[0];
        }
        return result;
    } catch (CameraAccessException e) {
        e.printStackTrace();
        return null;
    }
}
CONTROL_AE_LOCK should be off. You have misinterpreted the doc; possibly the document itself is a bit confusing.
Note that this control will only be effective if android.control.aeMode != OFF. This control will take effect even when android.control.aeLock == true.
What it means is that when AE lock is ON, the exposure compensation will be applied to the locked exposure, not to the instantaneous exposure at the time of taking the picture.
Even in your repeating request the exposure is locked, so it doesn't help.
Remove AE lock and it should work.
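A minimal sketch of what that means in practice, reusing the captureRequestBuilder from the question and assuming captureSession is your open CameraCaptureSession: keep AE on, drop the lock, and apply only the compensation.

// Sketch: AE stays ON, the AE lock is removed, only exposure compensation is applied.
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, false);
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 6);
try {
    // re-issue the repeating request so the preview picks up the new settings
    captureSession.setRepeatingRequest(captureRequestBuilder.build(), null, null);
} catch (CameraAccessException e) {
    e.printStackTrace();
}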
When setting CONTROL_AE_EXPOSURE_COMPENSATION, the second parameter, as defined by the docs, is relative to CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP:
The adjustment is measured as a count of steps, with the step size defined by android.control.aeCompensationStep and the allowed range by android.control.aeCompensationRange.
The value of 6 in the example corresponds to +2 EV only when the step is 0.333, which is just an example.
The following code will give you the exposure compensation value to use for +2 EV:
CameraManager manager = (CameraManager)this.getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
double exposureCompensationSteps = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP).doubleValue();
int exposureCompensation = (int)( 2.0 / exposureCompensationSteps );
I would also suggest you check whether the value is within the range specified by CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE.
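A quick sketch of that check, continuing from the characteristics and exposureCompensation variables above:

// Sketch: clamp the computed compensation to the device's supported range before applying it.
Range<Integer> compRange = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
if (compRange != null) {
    exposureCompensation = Math.max(compRange.getLower(),
            Math.min(exposureCompensation, compRange.getUpper()));
}
captureRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, exposureCompensation);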
You can try this
public void setBrightness(int value) {
    int brightness = (int) (minCompensationRange + (maxCompensationRange - minCompensationRange) * (value / 100f));
    previewRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, brightness);
    applySettings();
}

private void applySettings() {
    try {
        captureSession.setRepeatingRequest(previewRequestBuilder.build(), null, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
I messed around with CaptureRequest.SENSOR_SENSITIVITY and it worked great on my Samsung S3, S7 and S8 phones.
You can get the CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE
sensitivity_range = chars.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
On my s7, the range is from mid 50s to more than 3000. I then set it to 1500 as follows.
mCaptureRequest.set(CaptureRequest.SENSOR_SENSITIVITY, 1500);
It brightened the preview a few factors.
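Roughly what that looks like when put together (a sketch using the chars and mCaptureRequest builder names from above; note that per the camera2 documentation, manual SENSOR_SENSITIVITY is only guaranteed to take effect when auto-exposure is off, although the devices above apparently applied it anyway):

// Sketch: clamp the desired sensitivity to the device-reported range before setting it.
Range<Integer> sensRange = chars.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
if (sensRange != null) {
    int iso = Math.max(sensRange.getLower(), Math.min(1500, sensRange.getUpper()));
    mCaptureRequest.set(CaptureRequest.SENSOR_SENSITIVITY, iso);
}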
First, don't lock autoexposure - that's not needed when adjusting exposure compensation.
Second, did you call CameraCaptureSession.setRepeatingRequest with your new capture request?

Android Camera.Parameters setPictureSize not working

I am trying to set the best possible output picture size on my camera object, so that I can get a perfectly downscaled sample image and display it.
While debugging I observed that I am setting the output picture size exactly to my screen dimensions, but when I decode the bounds of the image returned by the camera, I get some larger numbers! So I am not getting my display dimensions as the output picture size. The code used to calculate and set the output picture size is given below.
I am using this code for devices with API level < 21, so using the Camera API shouldn't be a problem.
I have no idea why I am getting this behavior. Thanks in advance for the help!
Defining Camera parameter
Camera.Parameters parameters = mCamera.getParameters();
setOutputPictureSize(parameters.getSupportedPictureSizes(), parameters); // updates parameters in this function
// set the modified parameters back to mCamera
mCamera.setParameters(parameters);
Optimal picture size calculation
private void setOutputPictureSize(List<Camera.Size> availablePicSize, Camera.Parameters parameters)
{
    if (availablePicSize != null) {
        int bestScore = (1 << 30); // set an impossible value
        Camera.Size bestPictureSize = null;
        for (Camera.Size pictureSize : availablePicSize) {
            int curScore = calcOutputScore(pictureSize); // calculate the score of the current picture size
            if (curScore < bestScore) { // update the best picture size
                bestScore = curScore;
                bestPictureSize = pictureSize;
            }
        }
        if (bestPictureSize != null) {
            parameters.setPictureSize(bestPictureSize.width, bestPictureSize.height);
        }
    }
}
// calculates the score of a target picture size compared to the screen dimensions
// scores are non-negative, where 0 is the best score
private int calcOutputScore(Camera.Size pictureSize)
{
    Point displaySize = AppData.getDiaplaySize();
    int score = (1 << 30); // set an impossible value
    if (pictureSize.height < displaySize.x || pictureSize.width < displaySize.y) {
        return score; // return the worst possible score
    }
    for (int i = 1; ; ++i) {
        if (displaySize.x * i > pictureSize.height || displaySize.y * i > pictureSize.width) {
            break;
        }
        score = Math.min(score, Math.max(pictureSize.height - displaySize.x * i, pictureSize.width - displaySize.y * i));
    }
    return score;
}
Finally, I resolved the issue after many attempts! Below are my findings:
Step 1. If we are already previewing, call mCamera.stopPreview()
Step 2. Set modified parameters by calling mCamera.setParameters(...)
Step 3. Start previewing again, call mCamera.startPreview()
If I call mCamera.setParameters without stopping the preview (assuming the camera is previewing), the camera seems to ignore the updated parameters.
I came up with this solution after several rounds of trial and error. If anyone knows a better way to update parameters during preview, please share.
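In code, the whole sequence is simply (a minimal sketch using the same mCamera and parameters as above):

// Sketch: stop the preview before applying parameters, then restart it.
mCamera.stopPreview();              // step 1: stop the active preview
mCamera.setParameters(parameters);  // step 2: apply the modified parameters
mCamera.startPreview();             // step 3: resume the preview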

Android Camera Parameter setPictureSize causes streaked picture

I am trying to take a picture using the Android camera. I have a requirement to capture a 1600 (w) x 1200 (h) image (a third-party vendor requirement). My code seems to work fine for many phone cameras, but setPictureSize causes a crash on some phones (Samsung Galaxy S4, Samsung Galaxy Note) and a streaked picture on others (Nexus 7 tablet). On at least the Nexus, the size I want shows up in the getSupportedPictureSizes list.
I have tried specifying the orientation but it didn't help. Taking the picture with the default picture size works fine.
Here is an example of the streaking:
For my image capture I have a requirement of 1600x1200, jpg, 30% compression, so I am capturing a JPG file.
I think I have three choices:
1) Figure out how to capture the 1600x1200 size without a crash or streaking, or
2) Figure out how to change the size of the default picture size to a JPG that is 1600x1200.
3) Something else that is currently unknown to me.
I have found some other postings that have similar issues but not quite the same. I am in my 2nd day of trying things but am not finding a solution. Here is one posting that got close:
Camera picture to Bitmap results in messed up image (none of the suggestions helped me)
Here is the section of my code that worked fine until I ran into the S4/Note/Nexus 7. I have added a bit of debugging code for now:
Camera.Parameters parameters = mCamera.getParameters();
Camera.Size size = getBestPreviewSize(width, height, parameters);
if (size != null) {
    int pictureWidth = 1600;
    int pictureHeight = 1200;
    // testing
    Camera.Size test = parameters.getPictureSize();
    List<Camera.Size> testSizes = parameters.getSupportedPictureSizes();
    for (int i = 0; i < testSizes.size(); i++) {
        test = testSizes.get(i);
    }
    test = testSizes.get(3);
    // get(3) is 1600 x 1200
    pictureWidth = test.width;
    pictureHeight = test.height;
    parameters.setPictureFormat(ImageFormat.JPEG);
    parameters.setPictureSize(pictureWidth, pictureHeight);
    parameters.setJpegQuality(30);
    parameters.setPreviewSize(size.width, size.height);
    // catch any exception
    try {
        // make sure the preview is stopped
        mCamera.stopPreview();
        mCamera.setParameters(parameters);
        didConfig = true;
    } catch (Exception e) {
        // some error presentation was removed for brevity
        // since didConfig is not set to true it will fail gracefully
    }
}
Here is the section of my code that saves the JPG file:
PictureCallback jpegCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        if (data.length > 0) {
            String fileName = "image.jpg";
            File file = new File(getFilesDir(), fileName);
            String filePath = file.getAbsolutePath();
            boolean goodWrite = false;
            try {
                OutputStream os = new FileOutputStream(file);
                os.write(data);
                os.close();
                goodWrite = true;
            } catch (IOException e) {
                goodWrite = false;
            }
            if (goodWrite) {
                // go on to the Preview
            } else {
                // TODO return an error to the calling activity
            }
        }
        Log.d(TAG, "onPictureTaken - jpeg");
    }
};
Any suggestions on how to correctly set up the camera parameters for taking photos, or how to crop or resize the resulting photo, would be great, especially if it will work with older cameras (API level 8 or later)! Since I need the full width of the picture, I can only crop off the top.
Thanks!
EDIT: Here is what I ended up doing:
I started by going through the Camera.Parameters getSupportedPictureSizes list and using the first size whose height and width were both greater than my desired size AND which had the same width:height ratio. I set the camera parameters to that picture size.
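Roughly, that selection step looks like this (a sketch; targetWidth/targetHeight are hypothetical names for my desired 1600x1200):

// Sketch: pick the first supported size that is at least as large as the target
// on both axes and has the same width:height ratio.
int targetWidth = 1600, targetHeight = 1200;
Camera.Size chosen = null;
for (Camera.Size s : parameters.getSupportedPictureSizes()) {
    boolean bigEnough = s.width >= targetWidth && s.height >= targetHeight;
    boolean sameRatio = s.width * targetHeight == s.height * targetWidth; // 4:3 here
    if (bigEnough && sameRatio) {
        chosen = s;
        break;
    }
}
if (chosen != null) {
    parameters.setPictureSize(chosen.width, chosen.height);
    mCamera.setParameters(parameters);
}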
Then once the picture was taken:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPurgeable = true;
// convert the byte array to a bitmap, taking care to allow for garbage collection
Bitmap original = BitmapFactory.decodeByteArray(input, 0, input.length, options);
// resize the bitmap to my desired scale
Bitmap resized = Bitmap.createScaledBitmap(original, 1600, 1200, true);
// create a new byte array and output the bitmap to a compressed JPG
ByteArrayOutputStream blob = new ByteArrayOutputStream();
resized.compress(Bitmap.CompressFormat.JPEG, 30, blob);
// recycle the memory since bitmaps seem to have slightly different garbage collection
original.recycle();
resized.recycle();
byte[] desired = blob.toByteArray();
Then I write out the desired jpg to a file for upload.
test = testSizes.get(3);
// get(3) is 1600 x 1200
There is no requirement that the array have 4+ elements, let alone that the fourth element be 1600x1200.
1) Figure out how to capture the 1600x1200 size without a crash or streaking
There is no guarantee that every device is capable of taking a picture with that exact resolution. You cannot specify arbitrary values for the resolution -- it must be one of the supported picture sizes. Some devices support arbitrary values, while other devices will give you corrupted output (as is the case here) or will flat-out crash.
2) Figure out how to change the size of the default picture size to a JPG that is 1600x1200
I am not aware that there is a "default picture size", and, beyond that, such a size will be immutable, since it is the default. Changing the picture size is your option #1 above.
3) Something else that is currently unknown to me.
For devices that support a resolution that is bigger on both axes, take a picture in that resolution, then crop to 1600x1200.
For all other devices, where one or both axes are smaller than desired, take a picture in whatever resolution suits you (largest, closest match to 4:3 aspect ratio, etc.), and then stretch/crop to get to 1600x1200.
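For the crop route, something along these lines can work (a sketch, assuming data is the JPEG byte[] delivered to onPictureTaken and that the captured image is at least 1600x1200; it crops from the top-left corner):

// Sketch: decode the captured JPEG and crop a 1600x1200 region out of it.
Bitmap full = BitmapFactory.decodeByteArray(data, 0, data.length);
int cropW = Math.min(1600, full.getWidth());
int cropH = Math.min(1200, full.getHeight());
Bitmap cropped = Bitmap.createBitmap(full, 0, 0, cropW, cropH);
if (cropped != full) {
    full.recycle(); // free the uncropped bitmap once the crop is a separate copy
}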

Camera.getParameters() return null on Galaxy Tab

Here is my surface-changed event handling code:
public void surfaceChanged(SurfaceHolder holder,
                           int format, int width,
                           int height) {
    Camera.Parameters parameters = camera.getParameters();
    Camera.Size size = getBestPreviewSize(width, height, parameters);
    //...
}

private Camera.Size getBestPreviewSize(int width, int height,
                                       Camera.Parameters parameters) {
    Camera.Size result = null;
    // it fails with a NullPointerException here,
    // when accessing the getSupportedPreviewSizes method:
    // that means "parameters" is null
    for (Camera.Size size : parameters.getSupportedPreviewSizes()) {
        //...
    }
}
I initialize the camera like this:
@Override
public void onResume() {
    super.onResume();
    camera = Camera.open();
}
This problem doesn't occur on my Galaxy S Plus, nor does it happen on an LG Optimus Black phone. Does anyone have thoughts on what's wrong here?
I've solved this.
parameters.getSupportedPreviewSizes()
returns null on the Galaxy Tab. So I just check whether it is null and don't set a new preview size in that case. I came to this conclusion after looking at the standard Camera application sources.
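The guard is as simple as it sounds (a sketch, reusing getBestPreviewSize from the question):

// Sketch: skip setting a preview size when the device reports no supported sizes.
List<Camera.Size> previewSizes = parameters.getSupportedPreviewSizes();
if (previewSizes != null) {
    Camera.Size best = getBestPreviewSize(width, height, parameters);
    if (best != null) {
        parameters.setPreviewSize(best.width, best.height);
    }
}
// otherwise leave the camera's default preview size untouched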
It looks like the camera variable was never initialized, so you are calling getParameters() on null. Try calling camera = Camera.open(); first.
Camera initialization depends a lot on the specific device. For instance, a specific Samsung device (GT5500) reports (width = 0, height = 0) as a valid preview resolution, but crashes the whole phone ("hard" reboot) if you try to use it. We experienced this with the mixare augmented reality engine (http://www.mixare.org) and it was a PITA to debug (since we didn't have the phone and could not reproduce the bug on any other hardware).
However, for getting the "right" preview size you can take a look at our code (it's a free and open source app) on GitHub, in the file https://github.com/mixare/mixare/blob/master/src/org/mixare/MixView.java (row 871 and onwards):
List<Camera.Size> supportedSizes = null;
// On older devices (<1.6) the following will fail,
// the camera will work nevertheless
supportedSizes = Compatibility.getSupportedPreviewSizes(parameters);

// preview form factor
float ff = (float) w / h;
Log.d("Mixare", "Screen res: w:" + w + " h:" + h + " aspect ratio:" + ff);

// holder for the best form factor and size
float bff = 0;
int bestw = 0;
int besth = 0;
Iterator<Camera.Size> itr = supportedSizes.iterator();

// we look for the best preview size, it has to be the closest to the
// screen form factor, and be less wide than the screen itself
// the latter requirement is because the HTC Hero with update 2.1 will
// report camera preview sizes larger than the screen, and it will fail
// to initialize the camera
// other devices could work with previews larger than the screen though
while (itr.hasNext()) {
    Camera.Size element = itr.next();
    // current form factor
    float cff = (float) element.width / element.height;
    // check if the current element is a candidate to replace the best match so far
    // current form factor should be closer to the bff
    // preview width should be less than screen width
    // preview width should be more than current bestw
    // this combination will ensure that the highest resolution will win
    Log.d("Mixare", "Candidate camera element: w:" + element.width + " h:" + element.height + " aspect ratio:" + cff);
    if ((ff - cff <= ff - bff) && (element.width <= w) && (element.width >= bestw)) {
        bff = cff;
        bestw = element.width;
        besth = element.height;
    }
}
Log.d("Mixare", "Chosen camera element: w:" + bestw + " h:" + besth + " aspect ratio:" + bff);

// Some Samsung phones will end up with bestw and besth = 0 because their minimum preview size is bigger than the screen size.
// In this case, we use the default values: 480x320
if ((bestw == 0) || (besth == 0)) {
    Log.d("Mixare", "Using default camera parameters!");
    bestw = 480;
    besth = 320;
}
parameters.setPreviewSize(bestw, besth);
As you see, we are not calling getSupportedPreviewSizes of the Camera class directly, but instead added a compatibility layer (the code is here: https://github.com/mixare/mixare/blob/master/src/org/mixare/Compatibility.java) because we needed compatibility with older phones. If you don't need to support older Android releases, you can use the method of the Camera class directly.
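The idea behind such a compatibility layer boils down to a reflection lookup, roughly like this (a sketch of the approach, not the exact mixare source):

import java.lang.reflect.Method;
import java.util.List;
import android.hardware.Camera;

public class Compatibility {
    private static Method sGetSupportedPreviewSizes;

    static {
        try {
            // getSupportedPreviewSizes() only exists on newer firmwares,
            // so resolve it at runtime instead of referencing it directly
            sGetSupportedPreviewSizes =
                    Camera.Parameters.class.getMethod("getSupportedPreviewSizes");
        } catch (NoSuchMethodException e) {
            sGetSupportedPreviewSizes = null; // very old firmware
        }
    }

    @SuppressWarnings("unchecked")
    public static List<Camera.Size> getSupportedPreviewSizes(Camera.Parameters params) {
        if (sGetSupportedPreviewSizes == null) {
            return null; // caller falls back to default preview sizes
        }
        try {
            return (List<Camera.Size>) sGetSupportedPreviewSizes.invoke(params);
        } catch (Exception e) {
            return null;
        }
    }
}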
HTH
Daniele
