OpenCV Documentation for Android

I am a newbie to computer vision, with no prior experience. I am trying to develop a face recognition app for Android devices using OpenCV. I have installed OpenCV, but I have no idea how to use it. There seems to be no Android-specific documentation for OpenCV. How can I learn how to use OpenCV in my Android app?

The best resource I have found is OpenCV's Android Tutorial. It walks through the OpenCV Android sample code in the samples directory of the source you extracted to install OpenCV (i.e. OPENCV_PATH/samples/android). There are other samples there too, including one for face detection.
There are also Android docs on the OpenCV site.
There is also an OpenCV face recognition tutorial that may help you get started, even though it is not Android-specific.
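The face-detection sample mentioned above boils down to loading a Haar/LBP cascade and running it on each grayscale frame. A minimal sketch of that flow, assuming a cascade file (e.g. lbpcascade_frontalface.xml) has already been copied to app storage as the sample does, and that the OpenCV 3.x Java API is used:

```java
// Hedged sketch of OpenCV face detection on Android, modeled on the
// face-detection sample in OPENCV_PATH/samples/android. The cascadePath
// argument is an assumption: the sample extracts the cascade XML from
// res/raw into app storage before constructing the classifier.
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

public class FaceDetectSketch {
    private final CascadeClassifier detector;

    public FaceDetectSketch(String cascadePath) {
        detector = new CascadeClassifier(cascadePath);
    }

    /** Draws a green rectangle around each detected face in the RGBA frame. */
    public Mat detectAndDraw(Mat rgbaFrame) {
        // Detection runs on a grayscale copy of the camera frame.
        Mat gray = new Mat();
        Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);

        MatOfRect faces = new MatOfRect();
        detector.detectMultiScale(gray, faces);

        for (Rect r : faces.toArray()) {
            Imgproc.rectangle(rgbaFrame, r.tl(), r.br(),
                    new Scalar(0, 255, 0, 255), 3);
        }
        return rgbaFrame;
    }
}
```

In the sample, detectAndDraw would be called from the camera listener's onCameraFrame callback; this sketch cannot run outside an Android app with the OpenCV native libraries loaded.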

Related

Is ARCore compatible with Ionic?

I want to create an AR app and read on the internet that ARCore is one of the things to work with. I previously tried basic app development with Ionic and was pretty satisfied with it.
Therefore I was wondering whether it is possible to integrate ARCore into an Ionic app (e.g. in an AR tab)?
I found this answer on Stack Overflow, but it only covers Xamarin, not Ionic...
I also found that it should be possible to integrate ARCore into Android (Java/Kotlin, both of which can be used with Capacitor). But I am not really sure how (I am new to this topic) - can the ARCore elements simply be copied and pasted into the desired tab of my Ionic project?
You can use the Wikitude SDK for AR apps; it supports Ionic. Follow the link below:
https://www.wikitude.com/download-wikitude-sdk-for-cordova/

ABBYY Cloud OCR SDK with android

I'm trying to develop an Android application using Android Studio that will recognize Arabic text in an image. I tried Tesseract OCR, but unfortunately the results were quite inaccurate, so I would like to try the ABBYY Cloud OCR SDK. However, I'm not able to find any useful tutorials or examples of how to use it with Android. Can someone recommend some tutorials/examples or guide me on how to start using it?
Details of ABBYY OCR SDK integration are available on
GitHub

Want to create a plugin in Cocos2d for my SDK

I have created an SDK, currently for both Android and iOS.
Now I also have to support the Cocos2d platform.
Does anyone know how to achieve this?
I already have this SDK working natively.
I have been going through some blogs, but I can't find any easy tutorial, documentation, or blog post on how to do it.
One thing I do know is that it requires knowledge of the NDK and JNI, of which I have very little.
So can anyone please guide me, or give me a link or tutorial, on how to create a Cocos2d extension for my SDK?
Thanks.
First of all, I very much agree that finding anything w.r.t. Cocos2d is pretty tough! :)
Having said that, here are some pointers/information on how you can take this forward.
Android Native Development Kit (NDK)
Android apps are typically written in Java, with its elegant object-oriented design. However, at times you need to overcome the limitations of Java, such as memory management and performance, by programming directly against Android's native interface. Besides the Android Software Development Kit (Android SDK), which supports Java, Android provides the Native Development Kit (NDK) to support native development in C/C++.
An amazing and yet simple article with code examples can be found on Android NDK
The best places to start with the NDK and JNI (assuming you know roughly what they are):
Sample: hello-jni is the best place to start with code, on GitHub
Advanced Android: Getting Started with the NDK
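The hello-jni sample linked above reduces to two pieces: a Java class that loads a native library and declares a native method, and a C function whose symbol name encodes the Java package and class. A minimal sketch, following the sample's actual package name (com.example.hellojni):

```java
// Java side of the hello-jni pattern, mirroring the NDK sample.
package com.example.hellojni;

public class HelloJni {
    static {
        // Loads libhello-jni.so, which the NDK builds from jni/hello-jni.c
        System.loadLibrary("hello-jni");
    }

    // Implemented in C; the JNI convention requires the C symbol to be
    // named Java_com_example_hellojni_HelloJni_stringFromJNI.
    public native String stringFromJNI();

    /* Matching C side (jni/hello-jni.c):
     *
     *   #include <jni.h>
     *
     *   jstring Java_com_example_hellojni_HelloJni_stringFromJNI(
     *           JNIEnv* env, jobject thiz) {
     *       return (*env)->NewStringUTF(env, "Hello from JNI!");
     *   }
     */
}
```

This naming convention is exactly what a Cocos2d-x plugin's Java-to-C++ bridge relies on, which is why the answer recommends hello-jni as the starting point.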
It's best to start with some basic learning of Cocos2d-x:
Cocos2D-X Tutorial for iOS and Android: Getting Started
The Completest Cocos2d-x Tutorial & Guide List - Stackoverflow link
Some perfect references for plugin development-
How to use plugin-x in android
Plugin-x Architecture
How to write your own plugin for android
PluginX IOS IAP Integration
Third Party SDK Integration
Earlier there was a way to integrate third-party SDKs into Plugin-X, but recently they have moved away from this approach and are using SDKBOX, which is supposed to simplify the same task.
Here is the best, and probably the only good, reference we can get for now: the official Cocos2d-x programmers' guide.
"SDKBOX is more like a upgraded version of plugin-x, so in short it's just a plugin it's not something runs on a cloud. the reason it starts is because we want to provide a better plugin integration solution for cocos2d-x, but the fact is plugin-x always gets the low priority compare to other shinning new 3D features, so we decide to change that." - Cocos2d-x developer said.
But SDKBOX is where they develop plugins for you. I think it's something like npm for Node.js.
Anyway, here are some reference web links that will help you even more:
How to setup Cocos2d-x (Windows and Android)
cocos2d-x (iphone-android)/IDE installation and setup under mac os
How to set up the Android Cocos2d-x development environment on Windows 7
External Tutorials - Contains a bunch of helpful articles & tutorials.
**Helpful examples** to learn SDK development or support from:
Integration with Flurry Analytics SDK
We use Google Analytics with cocos2d-x extension
Countly SDK for Cocos2d-x apps
Cocos2d-x Extensions - Github repo
cocos2d-x-extensions - Another Github repo
List of Open Source Cocos2d Projects, Extensions and Code Snippets - Old but helpful
**Articles on SDKBOX**, which may help if you are looking to host your SDK officially:
Cocos2d-x Solves SDK Fatigue with New SDKBOX Initiative
The Best Way to Integrate SDKs into your Mobile Game
Hope it helps! :)
Happy Coding!

Unity3D for Android and iOS: QR-Code Decoding (Reader)

Is there a good QR Decoding Script or Pluging for Unity3D (Android and iOS)?
Or has someone already successfully integrated ZXing in Unity3D for Android and iOS? Here is a good solution for Webcam, but WebCamTexture does not always work on Android :(
I am grateful for any help.
There is a non-free ($50) plugin available: Antares QR Code
If you're not interested in paying for a plugin then you'll have to create your own. Since ZXing is available for both iOS and Android you can create C# wrappers for it and then use a native plugin on iOS and the C#-to-Java extensions on Android to get what you need.
There is also another plugin available for barcodes and QR codes on both Android and iOS: Easy Code Scanner
You just have to call a single method (common C# API for Android and iPhone) and it automatically launches a camera view/preview that decodes the barcode/QR code and gives you back the literal string of it in a callback. It is based on ZBar and you have nothing to integrate, everything is already self packaged.
The plugin can give you back the picture taken during the preview (as a Texture2D/Image) and also decode directly in the scripts a Texture2D/image without camera preview/shot.
The blog that the OP linked to now has a free option for Android devices here. You can also check out this related video:
ARCamera prefab in Unity Tutorial
You may need to fix some minor compile errors to get everything working due to newer versions of Vuforia having different implementations.
You can also use free metaio SDK which has built in support for QR codes reading.

Computer Vision and AR libraries availabe for Android?

I am starting out as an Android Developer, and I would like to know if there are any Computer vision libraries or Augmented Reality libraries for the Android SDK, as I am planning to use these libraries for a mobile app.
I have read that if I download the NDK, I might be able to import/use the C OpenCV and ARToolKit libraries, but I am wondering whether this is possible, or whether there is a better and easier way of using these tools.
Android apps are programmed in Java, yet OpenCV and ARToolKit use C/C++. Is there any way to use these libraries?
There are a number of wrappers for OpenCV available. For Java you might check JavaCV out.
To my knowledge, there is GSoC activity on AR with OpenCV on Android, but they seem to use C++.
Qualcomm is working on an Augmented Reality library for Android. As was mentioned, OpenCV is also an option.
"I would like to know if there are any Computer vision libraries or Augmented Reality libraries for the Android SDK"
In the SDK? No. There are existing AR applications for Android (Layar, WIKITUDE) that you may wish to use as your foundation.
"Is there any way to use these libraries?"
A quick search via Google turns up this and this.
Layar has made Layar Vision available to developers:
Layar Vision uses detection, tracking and computer vision techniques
to augment objects in the physical world. We can tell which objects in
the real world are augmented because the fingerprints of the object
are preloaded into the application based upon the user’s layer
selection. When a user aims their device at an object that matches the
fingerprint, we can quickly return the associated AR experience.
[...]
Layar Vision will be applied to the following Layar products:
6.0 version of Layar Reality browser on Android and iPhone iOS platforms.
iPhone Layar Player SDK v2.0.
The first release of an Android Layar Player SDK.
Layar Connect v2.0.
The simplest solution is to create a Vision layer, then use launcher creator for Android to create a layer launching app.
You can code in Java using OpenCV4Android, the official Android port of OpenCV. If you want to use native C++ OpenCV code, check out the Android NDK instead.
There's a new option for CV on Android, the Google Mobile Vision API. The API is exposed through com.google.android.gms.vision and lets you detect various types of objects (faces, barcodes, and facial features) given an arbitrary image bitmap.
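As a rough illustration of that API shape, here is a hedged sketch of detecting faces in a bitmap with Mobile Vision; the context and bitmap are assumed to be supplied by the host Activity, and this only runs on a device with Google Play services:

```java
// Sketch of the Mobile Vision still-image face detection flow
// (com.google.android.gms.vision). Requires Google Play services,
// so it cannot run outside an Android app.
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;
import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

public class MobileVisionSketch {
    public static int countFaces(Context context, Bitmap bitmap) {
        FaceDetector detector = new FaceDetector.Builder(context)
                .setTrackingEnabled(false)   // single still image, no tracking
                .build();
        try {
            // Wrap the arbitrary bitmap in a Frame and run detection.
            Frame frame = new Frame.Builder().setBitmap(bitmap).build();
            SparseArray<Face> faces = detector.detect(frame);
            return faces.size();
        } finally {
            detector.release();              // frees native resources
        }
    }
}
```

The same Frame-based pattern applies to the API's barcode and text detectors, which is what makes Mobile Vision convenient compared to wiring up OpenCV yourself.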
In addition, Google provides the Cardboard VR library and a Unity plugin to make it easier for you to develop VR applications - such applications could include AR based on Mobile Vision if you integrated the phone's camera.
Now Google offers us two powerful SDKs: ARCore and ML Kit.
The ARCore API has important features such as Augmented Images, Augmented Faces, and Cloud Anchors. It supports the Kotlin/Java languages, debugging apps on the Emulator (AVD), and physically based rendering (PBR) with the help of Sceneform.
The ML Kit API brings Google's machine learning expertise to mobile developers in a powerful and easy-to-use package. Although ML Kit is still in beta, it lets you work with important features such as Image Labelling, Text Recognition, Face Detection, Barcode Scanning, and Landmark Detection.
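For a sense of how the beta-era ML Kit calls look, here is a hedged sketch of on-device face detection via the Firebase ML Vision entry points; the bitmap is assumed to come from the app, and the code only runs inside a Firebase-configured Android project:

```java
// Sketch of ML Kit (beta-era Firebase ML Vision) face detection.
// Requires a Firebase-enabled Android app; detection is asynchronous
// and reports back through Task listeners.
import android.graphics.Bitmap;
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import com.google.firebase.ml.vision.face.FirebaseVisionFace;
import com.google.firebase.ml.vision.face.FirebaseVisionFaceDetector;

public class MlKitFaceSketch {
    public static void detectFaces(Bitmap bitmap) {
        FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(bitmap);
        FirebaseVisionFaceDetector detector =
                FirebaseVision.getInstance().getVisionFaceDetector();

        detector.detectInImage(image)
                .addOnSuccessListener(faces -> {
                    for (FirebaseVisionFace face : faces) {
                        // e.g. inspect face.getBoundingBox() here
                    }
                })
                .addOnFailureListener(e -> {
                    // handle the detection error
                });
    }
}
```

The other ML Kit features listed above (text recognition, barcode scanning, image labelling) follow the same pattern: build a FirebaseVisionImage, fetch the matching detector, and attach success/failure listeners.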
