Unable to get 4K video output on Android + OpenCV

I'm trying to connect a 4K camera to an Android TV device.
My problem is that I can't seem to get 4K output from the camera onto my 4K TV screen. The highest video resolution that I can get is 1920x1080.
I'm using OpenCV for Android and have tried to set the resolution in initializeCamera() inside the JavaCameraView class.
// Inside initializeCamera(), after picking the largest size common to both lists:
params.setPictureSize(highestSupportedSize.width, highestSupportedSize.height);
params.setPreviewSize(highestSupportedSize.width, highestSupportedSize.height);
I search for the highest resolution common to the supported pictureSizes and previewSizes. The problem lies in the fact that the highest preview size I can get is 1920x1080, which is very strange since both my screen and camera support 4K.
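For reference, the search looks roughly like this (a simplified sketch of what my modified initializeCamera() does; the logging is just to see what the driver reports):
List<Camera.Size> previewSizes = params.getSupportedPreviewSizes();
List<Camera.Size> pictureSizes = params.getSupportedPictureSizes();
Camera.Size highestSupportedSize = null;
for (Camera.Size p : previewSizes) {
    Log.d("CameraSizes", "preview: " + p.width + "x" + p.height);
    for (Camera.Size q : pictureSizes) {
        // Keep the largest size that appears in both lists.
        if (p.width == q.width && p.height == q.height
                && (highestSupportedSize == null
                    || p.width * p.height > highestSupportedSize.width * highestSupportedSize.height)) {
            highestSupportedSize = p;
        }
    }
}
// On the Shield TV, getSupportedPreviewSizes() never reports anything above 1920x1080.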
If I manually set the picture size to 3840x2160, I get a very large green area on the screen and the camera preview part stays at 1920x1080.
If I try to manually set the preview size to 3840x2160, the app crashes.
I would really appreciate help from someone who knows how to get 4K video output on a JavaCameraView.
Edit: I now have the feeling that the issue is the Android TV device I'm using (an Nvidia Shield TV), which might not support 4K camera input. Still trying to fix this.
Edit 2: For anyone facing the same issue: according to Logitech customer service, Android TV boxes don't support anything higher than 720p at the moment. There are also no plans from Logitech or Nvidia to add support for this.

You can check my question:
android tv box 4K activity
You can select 4K for your activity, if available.
I think many TV boxes' GPUs only support 4K for video, but only full HD for OpenGL surfaces/activities.
Maybe JavaCameraView renders its video over OpenGL.
Can you get a real 4K resolution with other, non-video apps, like the launcher?
I downloaded a lot of apps that display the screen resolution, and I always get a logical full HD resolution on a 4K@60Hz display.
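If you want to verify it yourself, something like this (a minimal sketch, API 23+) logs the logical size your activity gets next to the physical display modes:
Display display = getWindowManager().getDefaultDisplay();
Point logical = new Point();
display.getSize(logical); // the size your activity/surface actually renders at
Log.d("Resolution", "logical: " + logical.x + "x" + logical.y);
for (Display.Mode mode : display.getSupportedModes()) {
    Log.d("Resolution", "physical: " + mode.getPhysicalWidth() + "x"
            + mode.getPhysicalHeight() + " @" + mode.getRefreshRate() + "Hz");
}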

Related

Does Android support Full-View video recording with wide-angle lens at lower resolutions?

Assume I have a new Android phone with a wide-angle back lens that can record with its native camera app in Full-View mode (so we utilize as much of the sensor as possible and don't miss any visual information). Assume the phone can even do that at 60 FPS (I guess there are such phones, but if not, please correct me).
The question is: do you think one could achieve the same footage (maybe using the NDK or CameraX) but at lower resolutions? I mean, keep the Full-View (because I do not want to crop the scene), but proportionally lower the XY resolution. Is this something that is supported? Or are the settings offered by the native apps the only options a 3rd-party developer has?
If this can be achieved, how can one identify and use the camera capabilities to set this up (i.e. Full-View, the max FPS available for that Full-View, and a resolution lower than 0.3MP shaped to that Full-View)? A code example is more than welcome.
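To make it concrete, this is the kind of capability query I mean (a sketch with Camera2, inside a method declaring throws CameraAccessException; I don't know whether any of the listed sizes actually preserves the full field of view):
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
for (String id : manager.getCameraIdList()) {
    CameraCharacteristics chars = manager.getCameraCharacteristics(id);
    // The active sensor area defines the "Full-View" aspect ratio.
    Rect sensor = chars.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
    StreamConfigurationMap map =
            chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    for (Size size : map.getOutputSizes(ImageFormat.YUV_420_888)) {
        // Sizes matching the sensor aspect ratio are candidates for Full-View
        // at lower resolution; the min frame duration bounds the achievable FPS.
        long minFrameNs = map.getOutputMinFrameDuration(ImageFormat.YUV_420_888, size);
        Log.d("Caps", "camera " + id + ": " + size + " sensor=" + sensor
                + " maxFps=" + (minFrameNs > 0 ? 1e9 / minFrameNs : 0));
    }
}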

Is it possible to capture a High-Res image while using ArCore?

In my app I'm trying to use ARCore as a sort of "camera assistant" in a custom camera view.
To be clear: I want to show the user his camera feed and let him capture images that don't contain the AR models.
From what I understand, in order to capture an image with ARCore I'll have to use the Camera2 API, which is enabled by configuring the session to use the "shared camera".
However, I can't seem to configure the camera to use any high-end resolutions (I'm using a Pixel 3, so I should be able to go as high as 12MP).
In the "shared camera example", they toggle between Camera2 and ArCore (a shame there's no API for CameraX) and it has several problems:
In the ArCore mode the image is blurry (I assume that's because the depth sensor is disabled as stated in their documentation)
In the Camera2 mode I can't enhance the resolution at all.
I can't use the Camera2 API to capture an image while displaying models from ArCore.
Is this requirement at all possible at the moment?
I have not yet worked with the shared camera in ARCore, but I can say a few things regarding the main point of your question.
In ARCore you can configure both the CPU image size and the GPU texture size. You can do that by checking all available camera configurations (available through Session.getSupportedCameraConfigs(CameraConfigFilter cameraConfigFilter)) and selecting your preferred one by passing it back to the ARCore Session. On each CameraConfig you can check which CPU image size and GPU texture size you will get.
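In code, that selection looks roughly like this (a sketch; pause the session before changing the config):
CameraConfigFilter filter = new CameraConfigFilter(session);
List<CameraConfig> configs = session.getSupportedCameraConfigs(filter);
CameraConfig best = configs.get(0);
for (CameraConfig config : configs) {
    Size cpu = config.getImageSize();   // CPU image size (what Frame.acquireCameraImage() returns)
    Size gpu = config.getTextureSize(); // GPU texture size (what gets rendered)
    Log.d("CameraConfig", "cpu=" + cpu + " gpu=" + gpu);
    if (cpu.getWidth() * cpu.getHeight()
            > best.getImageSize().getWidth() * best.getImageSize().getHeight()) {
        best = config; // keep the config with the largest CPU image
    }
}
session.setCameraConfig(best);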
You are probably currently using (maybe by default?) a CameraConfig with the lowest CPU image size, 640x480 pixels if I remember correctly, so yes, it definitely looks blurry when rendered (but that has nothing to do with the depth sensor).
It sounds like you could just select a higher CPU image size and be good to go... but unfortunately that's not the case, because that configuration applies to every frame. Getting higher-resolution CPU images results in much lower performance: when I tested this, I got about 3-4 frames per second on my test device, definitely not ideal.
So now what? I think you have 2 options:
Pause the ARCore session, switch to a higher CPU image size for one frame, grab the image, and switch back to the "normal" configuration (see the sketch after this list).
You are probably already getting a nice GPU image; maybe not the best, due to the camera preview, but hopefully good enough? Not sure how you are rendering it, but with some OpenGL skills you can copy that texture. Not directly, of course, because of the whole GL_TEXTURE_EXTERNAL_OES thing... but rendering it onto another framebuffer and then reading back the texture attached to it could work. Of course, you might need to deal with the texture coordinates yourself (full image vs. visible area), but that's another topic.
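A sketch of option 1 (highResConfig and normalConfig are the two CameraConfigs you selected earlier; error handling, e.g. NotYetAvailableException, is omitted):
// Switch to the high-res config just long enough to grab one frame.
session.pause();
session.setCameraConfig(highResConfig);
session.resume();
Frame frame = session.update(); // may take a few updates before a frame is ready
try (Image image = frame.acquireCameraImage()) {
    handleHighResImage(image);  // hypothetical: your own save/processing code
}
// Restore the fast configuration for normal tracking.
session.pause();
session.setCameraConfig(normalConfig);
session.resume();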
Regarding CameraX, note that it wraps the Camera2 API in order to provide camera use cases so that app developers don't have to worry about the camera lifecycle. As I understand it, CameraX would not be suitable for ARCore, as I imagine they need full control of the camera.
I hope that helps a bit!

Using onCameraFrame in OpenCV for Android to capture a high-resolution frame?

I am trying to capture a high-resolution frame (1280x720) from the camera of a pair of Google Glass using OpenCV 2.4.10 for Android. I have implemented CameraBridgeViewBase.CvCameraViewListener2 in my Activity and try to grab the frame in the onCameraFrame method. So far everything works well, and I get a 512x288 Mat object.
My problem is that the 512x288 resolution is not high enough for what I need. So I tried to set up my project the same way as in Sample 3 that ships with OpenCV: http://goo.gl/iDyqQj. The problem is that it only works for resolutions below 512x288; as soon as I increase the resolution above this level, it defaults back to 512x288 (without any notice).
I found some suggestions, http://goo.gl/X2wtM4, that OpenCV restricts the frame size to a maximum of the screen resolution. But the Google Glass screen should have a 640x360 resolution? I tried to do as described in the answer, but when I override calculateCameraFrameSize and return a Size object larger than 512x288, I get a distorted frame (though with the larger dimensions).
Does anyone have a suggestion on how to capture a higher resolution on the Google Glass using OpenCV?
So I found a solution. There turned out to be two separate problems. As I suspected in my question, you need to override calculateCameraFrameSize in JavaCameraView to be able to fetch resolutions higher than the device's screen in onCameraFrame. This is apparently a design choice by OpenCV and has been since version 2.4.5, and it is why I could not get a frame with a higher resolution.
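The override looks roughly like this (a sketch; the 1280x720 target is hard-coded for Glass, and the constructor mirrors JavaCameraView's):
import java.util.List;
import android.content.Context;
import android.util.AttributeSet;
import org.opencv.android.JavaCameraView;
import org.opencv.core.Size;

public class HighResJavaCameraView extends JavaCameraView {
    public HighResJavaCameraView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    protected Size calculateCameraFrameSize(List<?> supportedSizes,
            ListItemAccessor accessor, int surfaceWidth, int surfaceHeight) {
        // Ignore the surface (screen) size the base class uses as a cap and
        // pick the largest supported size that fits our 1280x720 target.
        int bestWidth = 0, bestHeight = 0;
        for (Object size : supportedSizes) {
            int width = accessor.getWidth(size);
            int height = accessor.getHeight(size);
            if (width <= 1280 && height <= 720 && width >= bestWidth && height >= bestHeight) {
                bestWidth = width;
                bestHeight = height;
            }
        }
        return new Size(bestWidth, bestHeight);
    }
}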
Even though I can now get a frame with a higher resolution, it is still distorted for most preview sizes. This is a bug in the GDK that seems to have been known for quite some time (since XE10, if I understood correctly) but still is not fixed. Fortunately, there is a workaround: the issue is avoided by manually setting the FPS of the preview using setPreviewFpsRange after you acquire the Camera.
// Locking the preview FPS range works around the GDK distortion bug.
Camera.Parameters params = camera.getParameters();
params.setPreviewFpsRange(30000, 30000); // fixed 30 FPS (values are in milli-FPS)
camera.setParameters(params);

Capturing preview image with cwac-camera?

This question may be slightly philosophical in nature, but would it be crazy to just capture a photo from the live preview instead of going through takePhoto?
I've found a few examples of how to do so: How to capture preview image frames from Camera Application in Android Programming? and Capture an image from the camera preview.
Right now I'm juggling inconsistent EXIF rotation behavior on various phones (it looks like you have a FullExifFixup for all Samsung devices, but I'm seeing different behavior between my S2 and S4), and I'm wondering if it wouldn't just be easier to grab the preview image.
Is this a stupid idea?
but would it be crazy to just capture a photo from the live preview instead of going through takePhoto?
It would be crazy with the library as it stands, simply because I don't expose the preview frames. :-) That's on the issue list.
it looks like you have a FullExifFixup for all Samsung devices, but I'm having different behavior between my S2 and S4
I don't have an S2 at the moment. If you can provide me with a reproducible test case (including details of the specific model of S2), post an issue, and I can see what I can do.
I'm wondering if it wouldn't just be easier to grab the preview image
It won't be at the full resolution of the camera -- you'll be capped at the preview frame size. That being said, plenty of apps work with just the preview frames. Vine, for example, captures its video by grabbing the preview frames, due to various problems they ran into when using MediaRecorder (there's a conference talk from a Vine employee that goes into more detail).
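For reference, with the plain android.hardware.Camera API (cwac-camera doesn't expose this), grabbing preview frames looks something like this (a sketch; frames arrive as NV21 byte arrays):
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        // Convert the NV21 frame to a JPEG in memory.
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 90, out);
        byte[] jpeg = out.toByteArray(); // the captured preview frame
    }
});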

Capture screen shot on Android phone when playing video and camera preview

I have two HTC phones: an HTC Desire and an HTC Aria. I used DDMS to capture screenshots on the HTC Desire before, and it worked fine. However, I recently bought the HTC Aria, and I found that when I take a screenshot during camera preview or video playback (e.g. YouTube clips), the result is black.
I think that might be some overlay issue, but I just can't figure out how to capture screenshots of the camera preview.
Sorry, this is probably just not possible. Prior to Android 3.0, the DDMS screenshot facility worked by taking a copy of the framebuffer. Surfaces in an overlay (often the case with video playback) don't exist in the framebuffer, so they can't be included in the screenshot.
