Which platforms does the Android native camera API support? - android

I am currently working on a camera-based OpenCV app with critical performance requirements.
We already have Java-based camera implementations: both the deprecated Camera (HAL1) API and the Camera2 API.
We use the Camera1 implementation on platforms < 21 and the Camera2 implementation on platforms >= 21.
These two implementations are already heavily optimized for performance. However, we believe we could still improve by moving to the new native NDK camera API; the main gain would be removing the overhead of transferring image data over JNI to the native OpenCV processor.
The Android 7.0 (API 24) release introduced NDK native camera support. However, the only NDK documentation available is this flat list of C headers.
Unfortunately, I am confused, because there is no clear information about native camera platform support.
When I looked at the native API, I noticed it is very similar to the Java Camera2 API.
This makes me (wishfully) think that the native API should be backward compatible with earlier platforms that support the Camera2 Java API.
I have started an experimental project in an attempt to bust that myth, but due to the generally sparse NDK documentation, progress is slow.
I am especially interested in whether anyone else has already attempted to leverage the native camera API and has a relevant conclusion on this matter to share.
On another track, I'm also curious whether the native camera API implementation is a reverse JNI binding to the Camera2 Java API or indeed a lower-level integration, and, conversely, whether the Camera2 Java API is a JNI binding to the native camera API.

There's more NDK documentation than just the C headers; if you click on one of the functions, for example, you can get reference docs.
However, I think you're correct in that the compatibility story isn't well-documented.
The short version is that if you call ACameraManager_getCameraIdList and it returns camera IDs, then you can open those with the NDK API.
If it doesn't return any IDs, then there are no supported cameras on that device.
The longer story is that the NDK API only supports camera devices that have a hardware level of LIMITED or higher. LEGACY devices are not supported.
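Putting those two facts together, a minimal compatibility probe might look like this (a sketch assuming API level 24+ and linking against libcamera2ndk; error handling kept minimal):

    #include <camera/NdkCameraManager.h>
    #include <camera/NdkCameraMetadata.h>

    // Probe whether any camera on this device is usable via the NDK API.
    bool hasNdkUsableCamera() {
        ACameraManager* mgr = ACameraManager_create();
        ACameraIdList* ids = nullptr;
        bool usable = false;
        if (ACameraManager_getCameraIdList(mgr, &ids) == ACAMERA_OK && ids != nullptr) {
            for (int i = 0; i < ids->numCameras; ++i) {
                ACameraMetadata* chars = nullptr;
                if (ACameraManager_getCameraCharacteristics(mgr, ids->cameraIds[i],
                                                            &chars) != ACAMERA_OK)
                    continue;
                // LEGACY devices are not supported by the NDK API; the ID list
                // should already exclude them, but checking the hardware level
                // explicitly documents the intent.
                ACameraMetadata_const_entry level;
                if (ACameraMetadata_getConstEntry(chars,
                        ACAMERA_INFO_SUPPORTED_HARDWARE_LEVEL, &level) == ACAMERA_OK &&
                    level.data.u8[0] != ACAMERA_INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
                    usable = true;
                }
                ACameraMetadata_free(chars);
            }
            ACameraManager_deleteCameraIdList(ids);
        }
        ACameraManager_delete(mgr);
        return usable;
    }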
As an optimization note: how are you passing data through JNI? While JNI isn't ridiculously fast, it's not that slow either, and as long as you use passing mechanisms that don't copy data (such as direct access to a direct ByteBuffer via GetDirectBufferAddress), the overhead should be small.
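For illustration, a minimal zero-copy sketch of that approach; the class and method names are hypothetical, and it assumes the Java side hands over a direct ByteBuffer holding one grayscale frame:

    #include <jni.h>
    #include <opencv2/core.hpp>

    // Hypothetical native method: the Java side passes a direct ByteBuffer
    // holding one grayscale camera frame plus its dimensions.
    extern "C" JNIEXPORT void JNICALL
    Java_com_example_app_Processor_processFrame(JNIEnv* env, jobject /*thiz*/,
                                                jobject frameBuffer,
                                                jint width, jint height) {
        // Zero-copy: obtain the address backing the direct ByteBuffer.
        void* pixels = env->GetDirectBufferAddress(frameBuffer);
        if (pixels == nullptr) return;  // not a direct buffer

        // Wrap the memory in a cv::Mat header; no pixel data is copied.
        cv::Mat frame(height, width, CV_8UC1, pixels);

        // ... run the OpenCV processing on `frame` in place ...
    }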

What replaces GraphicBuffer on Android 7.0?

For fast transfer of texels to/from an EGL surface, we have successfully used a GraphicBuffer as described in this thread:
How to use GraphicBuffer in android ndk
However, on Android 7.0 that is not an option, as GraphicBuffer relies on the private library libui.so. So what replaces it? What is the Google-approved method of doing fast transfers to/from an EGL surface?
In Android 8 (API level 26), the upcoming Oreo release, a hardware buffer wrapper has been introduced. I've compared the HardwareBuffer and GraphicBuffer classes: both provide an interface for creating and accessing a shared buffer object, and the new HardwareBuffer is a generalised version of GraphicBuffer. Therefore, from API 26 onward, you will no longer need to link against the non-public libraries.
The only alternative I have seen for Android 7 is to manually bundle all the required libraries with the APK.
We will have to wait until Android 8 is released following its beta-testing phase. The roadmap for the release can be found here; the anticipated release is some time before the end of 2017. If you plan on updating your project with the new API features before the release date and want to test them out, you can use the Android O preview on a Google device.
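For illustration, a minimal allocation sketch with the new API, assuming API level 26+ (the resolution and format are arbitrary examples, error handling trimmed):

    #include <android/hardware_buffer.h>

    // Allocate a shared buffer that the CPU can write and the GPU can sample.
    void allocateSharedBuffer() {
        AHardwareBuffer_Desc desc = {};
        desc.width  = 1920;   // arbitrary example size
        desc.height = 1080;
        desc.layers = 1;
        desc.format = AHARDWAREBUFFER_FORMAT_R8G8B8A8_UNORM;
        desc.usage  = AHARDWAREBUFFER_USAGE_CPU_WRITE_OFTEN |
                      AHARDWAREBUFFER_USAGE_GPU_SAMPLED_IMAGE;

        AHardwareBuffer* buffer = nullptr;
        if (AHardwareBuffer_allocate(&desc, &buffer) != 0) return;

        // Map the buffer for CPU writes; -1 means there is no fence to wait on.
        void* addr = nullptr;
        if (AHardwareBuffer_lock(buffer, AHARDWAREBUFFER_USAGE_CPU_WRITE_OFTEN,
                                 -1, nullptr, &addr) == 0) {
            // ... fill pixel data at `addr` ...
            AHardwareBuffer_unlock(buffer, nullptr);
        }
        AHardwareBuffer_release(buffer);
    }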

Android 7 GraphicBuffer alternative for direct access to OpenGL texture memory

The only way to take advantage of the fact that mobile devices have shared memory for the CPU and GPU was to use GraphicBuffer. But since Android 7 restricts access to private native libraries (including gralloc), it is impossible to use it any more. The question: is there any alternative way to get direct memory access to a texture's pixel data?
I know that something similar can be done using a PBO (pixel buffer object), but that still performs an additional memory copy, which is undesirable, especially when we know there was a way to do it with zero copies.
There are many apps that used this feature, because it can heavily increase performance. I think many developers are stuck with this problem now.
Since Android 8 / API 26 (sorry, not for Android 7...), the Hardware Buffer APIs are the alternative to GraphicBuffer.
The native hardware buffer API lets you directly allocate buffers to create your own pipelines for cross-process buffer management. You can allocate an AHardwareBuffer and use it to obtain an EGLClientBuffer resource type via the eglGetNativeClientBufferANDROID extension.
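A sketch of the pipeline that quote describes, assuming API level 26+ and the EGL_ANDROID_get_native_client_buffer and GL_OES_EGL_image extensions (error checking omitted):

    #define EGL_EGLEXT_PROTOTYPES
    #define GL_GLEXT_PROTOTYPES
    #include <android/hardware_buffer.h>
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    // Bind an already-allocated AHardwareBuffer as a GL texture, so the
    // texture samples directly from the shared buffer (zero copy).
    GLuint textureFromHardwareBuffer(EGLDisplay display, AHardwareBuffer* buffer) {
        EGLClientBuffer clientBuffer = eglGetNativeClientBufferANDROID(buffer);

        const EGLint attrs[] = { EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE };
        EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
                                              EGL_NATIVE_BUFFER_ANDROID,
                                              clientBuffer, attrs);

        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)image);
        return tex;
    }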
NDK revision history
The minimum required NDK revision is r15c (July 2017):
Android NDK, Revision 15c (July 2017)
Added native APIs for Android 8.0.
* Hardware Buffer API
android/hardware_buffer_jni.h is in the directory (NDK)/sysroot/usr/include/
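That header provides the bridge between the Java and native sides; a minimal sketch (the JNI class and method names are hypothetical):

    #include <jni.h>
    #include <android/hardware_buffer.h>
    #include <android/hardware_buffer_jni.h>

    // Hypothetical JNI entry point: unwrap a Java android.hardware.HardwareBuffer
    // into its native AHardwareBuffer counterpart (API 26+).
    extern "C" JNIEXPORT void JNICALL
    Java_com_example_app_Bridge_useBuffer(JNIEnv* env, jobject /*thiz*/,
                                          jobject javaHardwareBuffer) {
        AHardwareBuffer* native =
            AHardwareBuffer_fromHardwareBuffer(env, javaHardwareBuffer);
        if (native == nullptr) return;

        AHardwareBuffer_acquire(native);   // keep it alive beyond the Java object
        // ... use the buffer on the native side ...
        AHardwareBuffer_release(native);
    }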
Refs:
NDK - Native Hardware Buffer (android/hardware_buffer_jni.h)
Android/Java - HardwareBuffer
GraphicBuffer-related article: Using OpenGL ES to Accelerate Apps with Legacy 2D GUIs
NB: for Android 7 / API 24
The Native API guide also says, in the Graphics/EGL section:
API level 24 added support for the EGL_KHR_mutable_render_buffer,
ANDROID_create_native_client_buffer, and
ANDROID_front_buffer_auto_refresh extensions.
EGL_ANDROID_create_native_client_buffer is an EGL extension that provides eglCreateNativeClientBufferANDROID(), which returns an EGLClientBuffer (see EGL/eglext.h).
I think you can use SurfaceTexture: a SurfaceTexture can be created by MediaCore and encoded directly by MediaCore. This approach can process 1080p video in 2-5 ms per video frame.

ZSL feature on Android Lollipop with camera 2 API

I am trying to understand ZSL feature/capability support on Android 5.0, from the camera application, camera framework, and libcameraservice implementation, as well as the camera HAL v3.2 specification.
As far as I understand, ZSL implementation in Android is possible in two ways:
Framework-implemented ZSL
In KitKat, only framework-implemented ZSL was supported, and it was pretty straightforward (using bidirectional streams for ZSL).
In Lollipop, framework-implemented ZSL is documented very clearly:
http://androidxref.com/5.0.0_r2/xref/hardware/libhardware/include/hardware/camera3.h#1076
Application-implemented ZSL
In Lollipop, the concept of application-implemented ZSL was introduced. ZSL has been exposed as a capability to the application, per the available documentation:
http://androidxref.com/5.0.0_r2/xref/system/media/camera/docs/docs.html
Under android.request.availableCapabilities, it says that:
For ZSL, "RAW_OPAQUE is supported as an output/input format"
In Lollipop, framework-implemented ZSL works the same way as in KitKat, with a Camera1 API application.
However, I could not find anywhere in the Camera2 API application code how to enable application- or framework-implemented ZSL.
http://androidxref.com/5.0.0_r2/xref/packages/apps/Camera2/
Hence, the questions:
1. Is it possible to enable framework-implemented ZSL in Android L with a Camera2 API application?
2. Is it possible to enable application-implemented ZSL in Android L, without RAW_OPAQUE support, with a Camera2 API application?
3. If either 1 or 2 is possible, what is required from the camera HAL to enable ZSL in Android L?
Any help appreciated.
1. No, the framework-layer ZSL only works with the old camera API.
2. No, unless it's sufficient to use the output buffer as-is, without sending it back to the camera device for final processing.
The longer answer is that the ZSL reprocessing APIs had to be cut out of the initial camera2 implementation, so currently there's no way for an application to send buffers back to the camera device, in any format (RAW_OPAQUE or otherwise).
Some of the documentation in camera3.h is misleading relative to the actual framework implementation, as well - only IMPLEMENTATION_DEFINED BIDIRECTIONAL ZSL is supported by the framework, and RAW_OPAQUE is not used anywhere.
Edit: As of Android 6.0 Marshmallow, reprocessing is available in the camera2 API, on devices that support it (such as Nexus 6P/5X).

Android API face detection vs. OpenCV/JavaCV face detection

I've used the built-in Android face detection on an Android device, but it seems quite slow and I'm not so sure about its reliability. I've also used OpenCV's face detection, but only on a PC rather than on an Android device. For Android, I'm guessing I'll have to use JavaCV (or OpenCV4Android?).
Do you know what the speed difference is between the Android API's face detection and OpenCV's face detection? I suspect OpenCV/JavaCV is both more efficient/faster and more accurate, but I cannot confirm it.
Thanks!
Suggestion: if you are looking for face detection, I suggest you use a platform-specific API like FaceDetector rather than the OpenCV Java wrapper, since those APIs are hardware accelerated (GPU), unlike OpenCV face detection, which up to version 3.0 relied on the CPU only.
The speed difference you perceive between the desktop and a mobile device comes from the difference in device hardware (such as the CPU), not from the library wrappers like JavaCV/OpenCV4Android. OpenCV is written in C/C++; all processing-intensive code is still C/C++, and the Java libraries are just wrappers over JNI.
OpenCV4Android - OpenCV.org maintained Android Java wrapper. Recommended.
OpenCV Java - OpenCV.org maintained auto generated desktop Java wrapper.
JavaCV - Popular Java wrapper maintained by independent developers. Not Android-specific. This library may get out of sync with newer OpenCV versions.

Low delay audio on Android via NDK

It seems that this question has been asked before; I would just like to know whether there has been an update in Android.
I plan to write an audio application involving low-delay audio I/O (approx. < 10 ms). It does not seem to be possible with the methods offered by the SDK, so is there, in the meantime, a way to achieve this goal using the NDK?
There are currently no libraries in the NDK for accessing the Android sound system, at least none that are considered safe to use (i.e., stable).
Have you done any tests with the AudioTrack class? It's the lowest-latency option available at the moment.
Currently, two main APIs are exposed in the NDK for audio:
OpenSL ES (since Android 2.3, API level 9)
OpenMAX AL (since Android 4.0, API level 14)
A good starting point for learning about the OpenSL API on Android is the sample code in the NDK: look at the "native-audio" sample; a minimal bring-up sketch follows below.
Measurements of its performance were made on this blog:
http://audioprograming.wordpress.com/
In summary, the best latencies obtained were around 100-200 ms, far from your target.
But, according to the Android NDK documentation, the OpenSL interface is the one that will benefit most from future hardware acceleration aimed at low latency.
