This question may be slightly philosophical in nature, but would it be crazy to just capture a photo from the live preview instead of going through takePhoto?
I've found a few examples of how to do so: How to capture preview image frames from Camera Application in Android Programming? and Capture an image from the camera preview.
Right now I'm juggling inconsistent EXIF rotation behavior across various phones (it looks like you have a FullExifFixup for all Samsung devices, but I'm seeing different behavior between my S2 and S4), and I'm wondering if it wouldn't just be easier to grab the preview image.
Is this a stupid idea?
but would it be crazy to just capture a photo from the live preview instead of going through takePhoto?
It would be crazy with the library as it stands, simply because I don't expose the preview frames. :-) That's on the issue list.
it looks like you have a FullExifFixup for all Samsung devices, but I'm having different behavior between my S2 and S4
I don't have an S2 at the moment. If you can provide me with a reproducible test case (including details of the specific model of S2), post an issue, and I can see what I can do.
I'm wondering if it wouldn't just be easier to grab the preview image
It won't be at the full resolution of the camera -- you'll be capped at the preview frame size. That being said, plenty of apps work with just the preview frames. Vine, for example, captures its video by capturing the preview frames, due to various problems they ran into when using MediaRecorder (there's a conference talk from a Vine employee that goes into more detail).
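If you do go the preview-frame route with the legacy camera API, a one-shot preview callback plus YuvImage is a common way to turn a single frame into a JPEG. A rough sketch, assuming an opened android.hardware.Camera named camera with preview running (frames arrive as NV21 by default; the output path is just an example, and the classes come from android.graphics and java.io):

    camera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            Camera.Size size = cam.getParameters().getPreviewSize();
            // Preview data is NV21 by default; wrap it so it can be compressed to JPEG
            YuvImage yuv = new YuvImage(data, ImageFormat.NV21,
                    size.width, size.height, null);
            try (FileOutputStream out = new FileOutputStream("/sdcard/frame.jpg")) {
                yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 90, out);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    });

Note the saved image will be at the preview resolution, as described above, and you still have to handle rotation yourself.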
In my app I'm trying to use ArCore as sort of a "camera assistant" in a custom camera view.
To be clear: I want to display the AR models to the user in the camera view, but have him capture images that don't contain the AR models.
From what I understand, in order to capture an image with ArCore I'll have to use the Camera2 API which is enabled by configuring the session to use the "shared Camera".
However, I can't seem to configure the camera to use any high-end resolutions (I'm using a Pixel 3, so I should be able to go as high as 12 MP).
In the "shared camera example", they toggle between Camera2 and ArCore (a shame there's no API for CameraX) and it has several problems:
In the ArCore mode the image is blurry (I assume that's because the depth sensor is disabled as stated in their documentation)
In the Camera2 mode I can't enhance the resolution at all.
I can't use the Camera2 API to capture an image while displaying models from ArCore.
Is this requirement at all possible at the moment?
I have not yet worked with the shared camera in ARCore, but I can say a few things regarding the main point of your question.
In ARCore you can configure both CPU image size and GPU image size. You can do that by checking all available camera configurations (available through Session.getSupportedCameraConfigs(CameraConfigFilter cameraConfigFilter)) and selecting your preferred one by passing it back to the ARCore Session. On each CameraConfig you can check which CPU image size and GPU texture size you will get.
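For example, here is a minimal sketch of selecting the config with the largest CPU image, assuming an existing ARCore Session named session (the config can only be changed while the session is paused):

    CameraConfigFilter filter = new CameraConfigFilter(session);
    List<CameraConfig> configs = session.getSupportedCameraConfigs(filter);
    CameraConfig best = configs.get(0);
    for (CameraConfig config : configs) {
        Size cpu = config.getImageSize();       // CPU image size (what acquireCameraImage returns)
        Size bestCpu = best.getImageSize();
        if (cpu.getWidth() * cpu.getHeight() > bestCpu.getWidth() * bestCpu.getHeight()) {
            best = config;
        }
    }
    session.setCameraConfig(best);  // apply while the session is paused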
Probably you are currently using (maybe by default?) a CameraConfig with the lowest CPU image size, 640x480 pixels if I remember correctly, so yes, it definitely looks blurry when rendered (but that has nothing to do with the depth sensor).
Sounds like you could just select a higher CPU image and you're good to go... but unfortunately that's not the case because that configuration applies to every frame. Getting higher resolution CPU images will result in much lower performance. When I tested this I got about 3-4 frames per second on my test device, definitely not ideal.
So now what? I think you have 2 options:
Pause the ARCore session, switch to a higher CPU image size for one frame, grab the image, and switch back to the "normal" configuration.
Probably you are already getting a nice GPU image, maybe not the best due to camera Preview, but hopefully good enough? Not sure how you are rendering it, but with some OpenGL skills you can copy that texture. Not directly, of course, because of the whole GL_TEXTURE_EXTERNAL_OES thing... but rendering it onto another framebuffer and then reading the texture attached to it could work. Of course you might need to deal with texture coordinates yourself (full image vs visible area) but that's another topic.
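As a rough illustration of the second option, assuming a current EGL context, the camera texture id in oesTextureId, known width/height, and a drawExternalTexture() helper of your own (hypothetical, not shown) that renders the OES texture as a fullscreen quad:

    int[] fbo = new int[1];
    int[] tex = new int[1];
    GLES20.glGenFramebuffers(1, fbo, 0);
    GLES20.glGenTextures(1, tex, 0);

    // Allocate a plain RGBA texture that can actually be read back
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);

    // Attach it to an offscreen framebuffer and render the camera frame into it
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, tex[0], 0);
    GLES20.glViewport(0, 0, width, height);
    drawExternalTexture(oesTextureId);

    // Read the pixels back (RGBA, bottom-up as usual with glReadPixels)
    ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4);
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);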
Regarding CameraX, note that it wraps the Camera2 API in order to provide camera use cases so that app developers don't have to worry about the camera lifecycle. As I understand it, CameraX would not be suitable for ARCore, as I imagine they need full control of the camera.
I hope that helps a bit!
I'm trying to connect a 4K camera to an Android TV device.
My problem is that I can't seem to get 4K output from the camera onto my 4K TV screen. The highest video resolution that I am able to get is 1920x1080.
I'm using opencv for android and have tried to set the resolution in initializeCamera() inside the JavaCameraView class.
params.setPictureSize(highestSupportedSize.width, highestSupportedSize.height);  // still-capture size
params.setPreviewSize(highestSupportedSize.width, highestSupportedSize.height);  // preview stream size
I search for the highest common resolution between the picture sizes and preview sizes. The problem lies in the fact that the highest preview size I can get is 1920x1080, which is very strange since both my screen and camera support 4K.
If I manually set the picture size to 3840x2160, I get a very large green area on the screen and the camera preview part stays at 1920x1080.
If I try to manually set the preview size to 3840x2160, the app crashes.
I would really appreciate it if someone who knows how to get 4K video output on a JavaCameraView could help.
Edit: I now have the feeling that the issue is the Android TV I'm using (Nvidia Shield TV), which might not support 4K camera input. Still trying to fix this.
Edit 2: For anyone facing the same issue: according to Logitech customer service, the Android TV boxes don't support anything higher than 720p at the moment. There are also no plans from Logitech or Nvidia to add support for this.
You can check my question:
android tv box 4K activity
You can select 4K for your activity, if available.
I think a lot of TV boxes' GPUs only support 4K for video, but only Full HD for OpenGL surfaces/activities. Maybe JavaCameraView renders its video over OpenGL.
Can you get a real 4K resolution with other, non-video apps, like a launcher? I downloaded a lot of apps that display the screen resolution, and I always get a logical Full HD resolution on a 4K@60Hz display.
I am using this project, android-camera2-secret-picture-taker, to capture images without opening a camera view, but the captured images are very bad, like this
Any help to make this better?
Thanks
[Edit]
I tried other phones and it works fine; I get these bad images only on the Huawei Y6II, and I don't know why. The phone's camera is 13 MP and works fine with the native camera app.
Did you issue only a single capture request to the camera device? (No free-running preview or such).
Generally, the auto-exposure, focus, and white-balance routines take a second or so of streaming before they stabilize to good values.
Even if you don't want a preview on screen, you need to request 10-30 frames of data from the camera to start before you save a final image. Or to be more robust, set a repeating request targeting some low-resolution SurfaceTexture, and wait until the CaptureResult CONTROL_AE_STATE / AWB_STATE fields reach CONVERGED, and the AF_STATE field is what you want as well (depends on what AF mode you're using). Then capture your image.
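A minimal sketch of that convergence check with Camera2, assuming a configured CameraCaptureSession with a repeating low-resolution request already running, and a captureStillPicture() helper of your own (hypothetical):

    CameraCaptureSession.CaptureCallback waitFor3A = new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session,
                CaptureRequest request, TotalCaptureResult result) {
            Integer ae = result.get(CaptureResult.CONTROL_AE_STATE);
            Integer awb = result.get(CaptureResult.CONTROL_AWB_STATE);
            // A null state means the device doesn't report it; treat that as ready
            boolean aeReady = ae == null || ae == CaptureResult.CONTROL_AE_STATE_CONVERGED;
            boolean awbReady = awb == null || awb == CaptureResult.CONTROL_AWB_STATE_CONVERGED;
            if (aeReady && awbReady) {
                captureStillPicture();  // now issue the full-resolution capture
            }
        }
    };

(Check CONTROL_AF_STATE the same way if you are running autofocus.)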
This is a wildly blind guess, but hey, worth a try.
If you used some code snippet from the web which suggests getting a list of supported image sizes and just picking the first one -- well, this has backfired for me on Huawei devices (more than one model), because Huawei seems to provide the list in ascending order of resolution (i.e. smallest first), whereas most other devices I've seen do so in descending order (i.e. largest first).
So if this is a resolution issue, it might be worth a check.
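If you want to be safe regardless of the order a vendor returns, sort the list yourself. A small sketch with Camera2, assuming you already have the device's CameraCharacteristics in characteristics:

    StreamConfigurationMap map = characteristics.get(
            CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size[] sizes = map.getOutputSizes(ImageFormat.JPEG);
    // Sort largest-first instead of trusting the vendor's ordering
    Arrays.sort(sizes, (a, b) -> Long.signum(
            (long) b.getWidth() * b.getHeight() - (long) a.getWidth() * a.getHeight()));
    Size largest = sizes[0];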
I was trying to implement a burst mode camera in my app, which can take multiple pictures at the rate of 5-10 (or more) snaps per second.
FYI, I already saw the previous questions here, here and here; I tried them and failed on speed. Also, the questions are old, and there are no comprehensive answers addressing all the concerns, like how to manage the heap, etc.
I would really appreciate if someone can help with useful pointers, best practice or maybe an SSCCE.
Update:
Tried successfully pulling preview frames @ 15+ snaps/sec, but the problem is that the preview size is limited. On the Nexus 5 I can get only 1920x1080, which is ~2 MP, whereas the full-resolution picture possible on the N5 is 8 MP :-(
I think a big part of the problem is the question: How does burst mode work in current phones? A couple of blogs point out that Google has confirmed that they will be adding a burst mode API.
I suspect current implementations work by setting the exposure time to minimum and calling takePicture in a loop, or by using Camera.PreviewCallback.
I played around with the latter for some computer vision projects and happened to look into writing a burst mode camera using this API. You could store the buffers you receive from Camera.PreviewCallback in memory and process them on a background thread.
If I remember correctly, the resolution was lower than the actual camera resolution, so this may not be ideal.
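For reference, a rough sketch of the buffered variant (setPreviewCallbackWithBuffer), assuming an opened legacy android.hardware.Camera named camera with preview parameters already set; processFrameAsync() is a hypothetical helper that copies/handles the frame on a background thread:

    Camera.Parameters params = camera.getParameters();
    Camera.Size size = params.getPreviewSize();
    int bufferSize = size.width * size.height
            * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8;

    // Register a few reusable buffers so the callback keeps firing without per-frame allocations
    for (int i = 0; i < 3; i++) {
        camera.addCallbackBuffer(new byte[bufferSize]);
    }

    camera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            processFrameAsync(data);       // copy or encode off the camera thread
            cam.addCallbackBuffer(data);   // return the buffer for reuse
        }
    });
    camera.startPreview();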
Short of device-specific APIs offered by their manufacturers, the only way you can get a "burst mode" that has a shot of working across devices will be to use the preview frames as the images. takePicture() has no guarantees of when you will be able to call takePicture() again.
I was looking for a method to use the camera on Android devices without a SurfaceView or a preview. I found out that it is impossible to take a picture without that preview. However, I have found a tutorial that actually works, taking pictures without a preview. Here is the link: http://www.vogella.com/articles/AndroidCamera/article.html
After switching the camera in the code from the front to the back-facing one, the app didn't crash, but it gave me an error 100. So it is only working with the front cam at the moment.
I am using a Samsung Galaxy S3 (4.1.2), and I will test it on a Galaxy S2 and a Galaxy S3 Mini.
Does anyone have a good explanation for this?
You cannot take a picture without starting preview.
While some Android devices are more flexible, and allow takePicture to be called without preview running, this is technically against the API specifications.
It won't work on a large number of devices, so please don't rely on it. That tutorial is wrong, and presumably tested only on one of the devices that allows this behavior.
If you don't want a visible preview, see this question for ways to do that in Android versions >= 3.0.
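One common trick from that question, as a sketch: point the preview at an off-screen SurfaceTexture (API 11+) so nothing is drawn, assuming an opened legacy camera and a jpegCallback of your own:

    try {
        SurfaceTexture dummy = new SurfaceTexture(10);  // arbitrary texture name
        camera.setPreviewTexture(dummy);                // preview goes nowhere visible
        camera.startPreview();
        // give auto-exposure/white-balance a moment to settle, then:
        camera.takePicture(null, null, jpegCallback);
    } catch (IOException e) {
        e.printStackTrace();
    }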
The time gap between the question and this answer is large, but this may help others.
You can try this library to take picture even from service:
https://github.com/kevalpatel2106/android-hidden-camera
It uses the "draw over other apps" permission to create a fake preview surface. Hope it helps.