I am trying to draw on the same surface that the camera will later draw to itself, by using lock and unlock. The strange thing is that the lock succeeds, but after the call to unlockCanvasAndPost(), mCamera.setPreviewDisplay() throws the error "setPreviewDisplay failed".
Without calling lockCanvas() and unlockCanvasAndPost(), mCamera.setPreviewDisplay() works and I get my preview (running on Android 4.1.1, compiled with API 17).
Remark (running on Android 4.0.4, compiled with API 17): it sometimes works, but only if I lock and unlock from Java; it does not work when locking and unlocking from JNI C/C++ (ANativeWindow_lock() and ANativeWindow_unlockAndPost()).
But let's discuss only the Java case, and why calling lockCanvas() and unlockCanvasAndPost() prevents the camera from working:
mCamera = Camera.open();
try {
    Canvas can = imageView.getHolder().getSurface().lockCanvas(null);
    Log.d("", "Canvas successfully acquired");
    imageView.getHolder().getSurface().unlockCanvasAndPost(can);
    mCamera.setPreviewDisplay(this.imageView.getHolder());
} catch (IOException e) {
    Log.d("", e.getMessage());
} catch (OutOfResourcesException e) {
    e.printStackTrace();
}
mCamera.setPreviewCallback(previewCallback);
mCamera.startPreview();
Related
I'm having a problem with a camera application on a real device. When I run my application on a Genymotion emulator, version 4.4.2 (API 19), it works fine, but when I run it on my real device, a Huawei P9 Lite running Android 6 (API 23), it crashes. Here are the errors shown in Logcat:
05-01 23:31:10.963 9308-9308/edu.sfsu.cs.orange.ocr W/CameraBase: An error
occurred while connecting to camera: 0
05-01 23:31:10.963 9308-9308/edu.sfsu.cs.orange.ocr E/Camera-JNI:
android_hardware_Camera_native_setup Error: -1
05-01 23:31:10.963 9308-9308/edu.sfsu.cs.orange.ocr E/Camera: Camera new
cameraInitNormal:-1
In the class where I handle the camera object I see that "android.hardware.Camera is deprecated".
I think the problem may come from this, and that I should use "android.hardware.camera2" instead. But as I use the Camera object and its methods a lot, that would mean a lot of changes in my code. Can this be avoided? Or is the problem caused by something else?
The camera permission is already included in my manifest, and I'm using this configuration in my build file:
compileSdkVersion 24
buildToolsVersion '25.0.0'
minSdkVersion 9
targetSdkVersion 24
UPDATE
Here is why I think the problem comes from deprecated methods:
try {
    // Open and initialize the camera
    cameraManager.openDriver(surfaceHolder);
    // Creating the handler starts the preview, which can also throw a RuntimeException.
    handler = new CaptureActivityHandler(this, cameraManager, isContinuousModeActive);
} catch (IOException ioe) {
    showErrorMessage("Error", "Couldn't initialize camera. Please try restarting device.");
} catch (RuntimeException e) {
    // Barcode Scanner has seen crashes in the wild of this variety:
    // java.lang.RuntimeException: Fail to connect to camera service
    showErrorMessage("Error", "Could not initialize camera. Please try restarting device.");
}
When the application starts on my phone, it throws this RuntimeException.
The openDriver method:
public synchronized void openDriver(SurfaceHolder holder) throws IOException {
    Camera theCamera = camera;
    if (theCamera == null) {
        theCamera = Camera.open();
        if (theCamera == null) {
            throw new IOException();
        }
        camera = theCamera;
    }
    camera.setPreviewDisplay(holder);
    if (!initialized) {
        initialized = true;
        configManager.initFromCameraParameters(theCamera);
        if (requestedFramingRectWidth > 0 && requestedFramingRectHeight > 0) {
            adjustFramingRect(requestedFramingRectWidth, requestedFramingRectHeight);
            requestedFramingRectWidth = 0;
            requestedFramingRectHeight = 0;
        }
    }
    configManager.setDesiredCameraParameters(theCamera);
    SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
    reverseImage = prefs.getBoolean(PreferencesActivity.KEY_REVERSE_IMAGE, false);
}
Any thoughts on how I can solve this problem? Thanks in advance.
A quick debugging tip would be to compile the app with targetSdkVersion 19. It looks like the Camera interface deprecated some things between those API versions. Also, if you have the ability to see kernel logs via a serial terminal or via adb, it would be helpful to find out what "Error: -1" means to the hardware, i.e. whether the call is really reaching the camera driver in the kernel:
adb root
adb shell
#cat /proc/kmsg
Mostly I want to know if there is a fundamental conflict preventing me from sharing the same resource with the library; if so, I will need to take a different approach.
My goal is to record low-quality video while saving the detector's metadata at the same time, so that I can do some post-processing and slicing without much delay.
Based on the CameraDetectorDemo - camera detector
I have been initializing a MediaRecorder, but it saves a black screen if I start it before the detector, and it crashes on start (with error code -19) if I start it after the detector. The detector attaches the preview, so it may be related to that.
I added some buttons to control these functions:
protected void cameraInit() {
    String state = Environment.getExternalStorageState();
    if (!Environment.MEDIA_MOUNTED.equals(state)) {
        Log.d(LOG_TAG, "Drive not mounted - cannot write video");
        return;
    }
    File file = new File(getExternalFilesDir(Environment.DIRECTORY_MOVIES), "demo.gp3");
    Log.d(LOG_TAG, String.format("Camera Initializing. Setting output to: %s", file.getAbsolutePath()));
    // Set sources
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // Set profile
    recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_LOW));
    // Set output file
    recorder.setOutputFile(file.getAbsolutePath());
    // Set preview output
    recorder.setPreviewDisplay(cameraPreview.getHolder().getSurface());
    try {
        this.recorder.prepare();
    } catch (IOException e) {
        Log.e(LOG_TAG, "IO exception on camera initialization");
        e.printStackTrace();
    } catch (IllegalStateException e) {
        // Thrown if the preceding calls are not made in the proper order
        Log.e(LOG_TAG, "Failed to initialize things properly :( ");
        e.printStackTrace();
    }
}

protected void cameraStart() {
    Log.d(LOG_TAG, "Camera Start");
    this.recorder.start();
}

protected void cameraStop() {
    Log.d(LOG_TAG, "Camera Stop");
    this.recorder.stop();
}
The Affdex SDK's CameraDetector needs access to the camera to get its preview frames and process them, so that's not going to work if the MediaRecorder has control of the camera.
Probably your best bet is to take preview frames from the camera, feed them to an Affdex FrameDetector for processing, and also save them to a video file via a MediaCodec and MediaMuxer, although I haven't tried that.
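I haven't tried the full pipeline either, but the fan-out part of that idea can be sketched in plain Java. The interface below is hypothetical: one consumer would wrap the Affdex FrameDetector and another would feed a MediaCodec input buffer, neither of which is shown here.

```java
import java.util.ArrayList;
import java.util.List;

public class FrameDispatcher {
    // A frame consumer: one implementation could wrap a FrameDetector,
    // another could hand the bytes to a MediaCodec input buffer.
    public interface FrameConsumer {
        void onFrame(byte[] nv21, int width, int height);
    }

    private final List<FrameConsumer> consumers = new ArrayList<>();

    public void addConsumer(FrameConsumer c) {
        consumers.add(c);
    }

    // On Android this would be called from Camera.PreviewCallback#onPreviewFrame.
    public void dispatch(byte[] nv21, int width, int height) {
        for (FrameConsumer c : consumers) {
            c.onFrame(nv21, width, height);
        }
    }

    public static void main(String[] args) {
        FrameDispatcher d = new FrameDispatcher();
        d.addConsumer((frame, w, h) -> System.out.println("detector got " + w + "x" + h));
        d.addConsumer((frame, w, h) -> System.out.println("encoder got " + frame.length + " bytes"));
        d.dispatch(new byte[8], 4, 2); // simulate one preview frame
    }
}
```

The point of the indirection is that the camera is opened exactly once, and every consumer sees the same frame without contending for the device.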
I'm using ffmpeg in my Android application and I sometimes get an out-of-memory error. I'm calling ffmpeg inside a HandlerThread. Is it OK to catch OutOfMemoryError and exit the thread while the main thread keeps running?
I have read a lot about this being bad practice, but I really need it, because I have to update the DB when there is any kind of error:
fc = new FfmpegController(context, fileTmp);
try {
    fc.processVideo(clip_in, clip_out, false,
            new ShellUtils.ShellCallback() {
                @Override
                public void shellOut(String shellLine) {
                }

                @Override
                public void processComplete(int exitValue) {
                    // Update the DB
                }
            });
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
} catch (InterruptedException e) {
} catch (Exception e) {
} catch (OutOfMemoryError e) {
    // Update the DB
}
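For reference, OutOfMemoryError extends Error rather than Exception, so it can be caught and cleanup code such as a DB update will still run. A minimal standalone sketch follows; the oversized allocation is only there to force the error and stands in for the ffmpeg call, and the recovery path should stay small and allocation-free because the heap may be exhausted:

```java
public class OomCleanup {
    public static boolean runWithCleanup() {
        boolean dbUpdated = false;
        try {
            // Deliberately over-allocate to trigger OutOfMemoryError.
            long[] huge = new long[Integer.MAX_VALUE];
            System.out.println(huge.length); // never reached
        } catch (OutOfMemoryError e) {
            // Best-effort recovery: record the failure (e.g. update the DB).
            dbUpdated = true;
        }
        return dbUpdated;
    }

    public static void main(String[] args) {
        System.out.println(runWithCleanup()); // true
    }
}
```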
No, something is going wrong if you are getting OutOfMemory errors. I would look into buffering your audio; most likely you are running the whole clip through ffmpeg at once, which uses up a lot of memory.
Also, keep in mind that many of us doing audio on Android end up using the NDK, primarily because of issues like the one you are experiencing. Audio has to be really high-performance, and the NDK lets you write lower-level, more memory-efficient audio handling.
Android's AudioTrack has a write method that lets you push an audio buffer to it. A warning: this is not entry-level work. It requires some knowledge of audio buffers, and requires you to read buffers in, send them to ffmpeg, and then pass them to AudioTrack. Not easy to do; unfortunately, more advanced audio on Android is not easy.
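As a rough illustration of the buffering idea in plain Java (not AudioTrack itself; the chunk size and the in-memory stream here are placeholders), processing the input in fixed-size chunks keeps peak memory bounded by the chunk size instead of the clip length:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedReader {
    // Read a stream in fixed-size chunks instead of loading it whole.
    // Each filled chunk would be handed to the decoder/encoder, then reused.
    public static long processInChunks(InputStream in, int chunkSize) throws IOException {
        byte[] buffer = new byte[chunkSize];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            // Hand buffer[0..read) to the consumer (e.g. ffmpeg, then AudioTrack.write).
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] clip = new byte[10_000]; // stand-in for audio data
        long n = processInChunks(new ByteArrayInputStream(clip), 4096);
        System.out.println(n); // 10000
    }
}
```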
Hi, I'm trying to get camera functionality to work in my app. The problem occurs on one phone in particular: the Samsung Galaxy Mini.
After I take a picture using the camera and preview it, the phone freezes when I call camera.release(). I have to remove the battery to reset it.
This is how I release the camera:
try {
    mCamera.stopPreview();
    mCamera.setPreviewDisplay(null);
    mCamera.release();
    mCamera = null;
} catch (Exception e) {
    // ignore: tried to stop a non-existent preview
}
I am also getting this weird native exception in logcat after the call:
03-10 09:45:56.080: E/mm-camera(95): camera_issue_ctrl_cmd: error (Bad address): type 43, length 0, status 40856
Any help would be greatly appreciated!
Use the open-source camera code below; it will help you:
Open Camera
and use this in surfaceDestroyed():
if (flag) {
    camera.release();
    camera = null;
    previewing = false;
} else {
    camera.stopPreview();
}
I am working on my FIRST APP and using Android Studio for the FIRST TIME (bear that in mind before you vote my question down out of sheer arrogance).
The app uses the camera, and I've borrowed bits of code from Google to achieve this. However, I am getting an error: "Cannot resolve symbol 'TAG'". Why is this, and how do I fix it?
The code snippets include...
public void surfaceCreated(SurfaceHolder holder) {
    // The Surface has been created, now tell the camera where to draw the preview.
    try {
        mCamera.setPreviewDisplay(holder);
        mCamera.startPreview();
    } catch (IOException e) {
        Log.d(TAG, "Error setting camera preview: " + e.getMessage());
    }
}
...and...
} catch (Exception e) {
    Log.d(TAG, "Error starting camera preview: " + e.getMessage());
}
It's just a string used as a tag when printing log messages to the console (ddms).
Add the following line to the top of your class:
public static final String TAG = "YOUR-TAG-NAME";
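As a variation (a common convention, not a requirement), you can derive the tag from the class name so it stays in sync if the class is renamed. One thing to keep in mind: on Android versions before API 26, Log.isLoggable() rejects tags longer than 23 characters, so short tags are safer.

```java
public class MainActivity {
    // Tag derived from the class name; stays correct after a rename/refactor.
    public static final String TAG = MainActivity.class.getSimpleName();

    public static void main(String[] args) {
        System.out.println(TAG); // prints "MainActivity"
    }
}
```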