How to design a viewfinder for a QR scanner screen? - android

I have integrated a QR scanner into my Cordova Android app. Currently, when the scanner opens it just shows the video preview, and scanning the QR code works fine. But I need to show a viewfinder (a box where the QR code must be placed to scan). Please give your suggestions.

Draw a rectangle on the canvas in your preview class's onDraw method. You can then use the rectangle's coordinates to restrict where the QR code is detected.
Example: canvas.drawRect
Hope my answer helps.
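The geometry behind that suggestion can be sketched in plain Java. In a real app this calculation would live in a custom overlay View, and onDraw would pass the four values to canvas.drawRect(left, top, right, bottom, paint); the class name and the 0.6f scale factor below are illustrative choices, not anything from the Android API.

```java
// Computes the coordinates of a centered square viewfinder for a
// preview of the given size. In an Android overlay View you would
// draw it from onDraw() via canvas.drawRect(left, top, right,
// bottom, paint). Plain Java so the geometry stands on its own.
public class ViewfinderGeometry {

    /** Returns {left, top, right, bottom} of a centered square. */
    public static int[] centeredSquare(int viewWidth, int viewHeight, float scale) {
        int side = (int) (Math.min(viewWidth, viewHeight) * scale);
        int left = (viewWidth - side) / 2;
        int top = (viewHeight - side) / 2;
        return new int[] { left, top, left + side, top + side };
    }

    public static void main(String[] args) {
        // A 1080x1920 portrait preview with a box at 60% of the short side.
        int[] r = centeredSquare(1080, 1920, 0.6f);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]);
    }
}
```

The same rectangle can later be reused to ignore QR results detected outside the box.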

Related

Limit detection area of MLKIT of barcode scanner

I am implementing a barcode scanner using ML Kit in my application (Kotlin). I need the barcode to be processed only when it is visible in a transparent rectangle in the center of the screen, as shown in the picture below.
PIC 1
Right now my app detects every barcode visible in the camera view, as shown below. Is it possible to limit the detection area so that detection only reads from a rectangle in the middle of the screen?
Besides that, I would like to create a custom PreviewView for a barcode with CameraX, as shown in PIC 1. Thank you.
PIC 2
ML Kit currently doesn't support specifying a region of interest, so you have to either crop the preview image or filter out barcode results that are not in the specified area. Similar question: How do I make the camera focus only inside the rectangle and read the text inside the rectangle only in a flutter?
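The filtering approach can be sketched as a pure containment check. ML Kit's Barcode does expose a bounding box (getBoundingBox(), an android.graphics.Rect); the sketch below uses plain int arrays instead of Rect so it stands alone, and the ROI values are made-up examples.

```java
// Keeps only detections whose bounding box lies fully inside a
// region of interest (the transparent center rectangle). With ML Kit
// you would build `box` from barcode.getBoundingBox(); plain ints
// are used here so the logic is framework-free.
public class RoiFilter {

    /** True if box {l, t, r, b} lies fully inside roi {l, t, r, b}. */
    public static boolean insideRoi(int[] box, int[] roi) {
        return box[0] >= roi[0] && box[1] >= roi[1]
            && box[2] <= roi[2] && box[3] <= roi[3];
    }

    public static void main(String[] args) {
        int[] roi = { 200, 600, 880, 1280 };                            // centered scan window
        System.out.println(insideRoi(new int[] { 300, 700, 800, 1200 }, roi)); // inside
        System.out.println(insideRoi(new int[] { 100, 700, 800, 1200 }, roi)); // sticks out left
    }
}
```

Note that ML Kit analyzes the full image either way; the filter only decides which results your app acts on, so cropping the input is the option to choose if performance matters.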

Using OCR mobile vision to anchor image to detected text

I am using Text Recognition (Mobile Vision / ML Kit) by Google to detect text in the camera feed. Once I detect text and ensure it is equal to "HERE WE GO", I draw a heart shape beside the detected text using the returned boundaries.
The problem I am facing is that the shape jumps around and lags behind. I want it to be anchored to the detected text. Is there something I can do to improve that?
I have heard about the ARCore library, but it seems to rely on known images to determine the anchor, whereas in my case it can be any text that matches "HERE WE GO".
Any suggestions?
I believe you are trying to overlay text on the camera preview in real time. There will be a small delay between the camera input and the detection. Since the API is async, by the time the output returns you would already be showing another frame.
To alleviate that, you can either make the processing part synchronous using a lock/mutex, or overlay another image that only refreshes after the processing is done.
We have some examples here: https://github.com/firebase/quickstart-android/tree/master/mlkit
and also I fixed a similar problem on iOS by using DispatchGroup https://github.com/googlecodelabs/mlkit-ios/blob/master/translate/TranslateDemo/CameraViewController.swift#L245
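One common way to apply the lock idea on Android is to process at most one frame at a time and drop frames while the async detector is still busy, so the overlay you draw always corresponds to the frame that produced it. Below is a minimal sketch of that gate; the class and method names are hypothetical, and the detector callback is simulated by calling finish() directly.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Skips incoming frames while an async detector is still working.
// In a real app, tryProcess() would be called from the camera frame
// callback and finish() from the detector's success/failure listener.
public class FrameGate {
    private final AtomicBoolean busy = new AtomicBoolean(false);
    private int processed = 0;
    private int dropped = 0;

    /** Returns true if this frame was accepted for processing. */
    public boolean tryProcess() {
        if (!busy.compareAndSet(false, true)) {
            dropped++;          // detector still busy: skip this frame
            return false;
        }
        processed++;
        return true;
    }

    /** Call from the detector's completion listener. */
    public void finish() { busy.set(false); }

    public int processed() { return processed; }
    public int dropped()  { return dropped; }

    public static void main(String[] args) {
        FrameGate gate = new FrameGate();
        gate.tryProcess();      // accepted
        gate.tryProcess();      // dropped: previous frame still in flight
        gate.finish();
        gate.tryProcess();      // accepted again
        System.out.println(gate.processed() + " processed, " + gate.dropped() + " dropped");
    }
}
```

This trades overlay frame rate for consistency: the heart shape updates less often but never lags behind a stale frame.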
Option 1: Refer to the TensorFlow Android sample here:
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android
especially these classes:
1. ObjectTracker: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/android/src/org/tensorflow/demo/tracking/ObjectTracker.java
2. OverlayView: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/android/src/org/tensorflow/demo/OverlayView.java
3. CameraActivity and the camera fragment: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/android/src/org/tensorflow/demo/CameraActivity.java
Option 2: Sample code can be found in the codelab below. They are doing something similar for barcodes.
https://codelabs.developers.google.com/codelabs/barcodes/index.html?index=..%2F..index#0

How to use JavaCameraView to capture a frame and send it to train the face recognizer?

First, I just started with Android development last week, so please be thorough in your explanations as I'm still a noob.
I've managed to create an app that uses JavaCameraView to show the user what the back camera sees. I created a new button in the action bar to take a picture. When the user clicks this button, I want to capture that frame and send it to the picture library I am using for the face recognizer. So far I haven't been able to get this working.
So for the questions...
How can I capture a frame from the JavaCameraView when the take picture button is pressed?
From there, do I just output the image to my image library using an OutputStream?
Thanks everyone
In your class you have to add implements CvCameraViewListener2. Your class then implements the method public Mat onCameraFrame(CvCameraViewFrame cameraViewFrame), which receives every preview frame.
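A common way to turn that callback into a "take picture" feature is a flag-and-capture handshake: the button's click listener raises a flag, and onCameraFrame copies the current frame when it sees the flag. The sketch below shows only that handshake in plain Java; byte[] stands in for OpenCV's Mat (with a real Mat you must copy it, e.g. with Mat.clone(), because OpenCV reuses the frame buffer), and all class and method names here are illustrative.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Flag-and-capture handshake between the UI thread and the camera
// callback. requestCapture() would be called from the button's
// OnClickListener; onCameraFrame() stands in for OpenCV's
// onCameraFrame(CvCameraViewFrame). byte[] substitutes for Mat.
public class FrameCapture {
    private final AtomicBoolean takePicture = new AtomicBoolean(false);
    private byte[] captured;                 // last captured frame, if any

    /** Called from the button's OnClickListener. */
    public void requestCapture() { takePicture.set(true); }

    /** Called once per camera frame; returns the frame to display. */
    public byte[] onCameraFrame(byte[] frame) {
        if (takePicture.compareAndSet(true, false)) {
            captured = frame.clone();        // copy; never keep the live buffer
        }
        return frame;
    }

    public byte[] getCaptured() { return captured; }

    public static void main(String[] args) {
        FrameCapture fc = new FrameCapture();
        fc.onCameraFrame(new byte[] { 1 }); // no capture requested yet
        fc.requestCapture();
        fc.onCameraFrame(new byte[] { 2 }); // this frame gets captured
        fc.onCameraFrame(new byte[] { 3 });
        System.out.println(fc.getCaptured()[0]);
    }
}
```

Once you hold the copied frame, writing it out through an OutputStream (or feeding it to the recognizer's training set) is a separate step that can run off the camera thread.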

How to get a custom camera view without using SurfaceView

I want to develop a customized camera application like Snapchat, without using SurfaceView.
First I used SurfaceView to develop the app, but I was unable to get a quality image, and I also couldn't get all the features the default camera app provides, like zoom, focus, face recognition, etc. Please suggest any solution to achieve this.
Sorry for my English.
github/xplodwild/android_packages_apps_Focal
github/almalence/OpenCamera
github/troop/FreeDCam
github/rexstjohn/UltimateAndroidCameraGuide
Maybe one of those might help.

Drawing a square frame in camera view

I have an application that scans QR codes. I want to place a square in the middle of the camera view so users have a guide for where to place the QR code. Can you tell me how to do it? Thank you.
