I have a barcode scanner peripheral attached to my phone (see attached image) and cannot seem to find a way to access the scanner's readings. The phone is reading the barcodes: there is an emulator app on the phone which displays the numeric value of each barcode once scanned.
My question is: how can the barcode peripheral be accessed, similar to how the camera is accessed via the npm package RNCamera (React Native)?
Source - https://www.amazon.com/MUNBYN-Ergonomic-Warehouse-Inventory-Management/dp/B0885ZY3DV
For React Native you can use this library for barcode scanning.
Library link:
https://github.com/react-native-camera/react-native-camera
Here is an explanation of barcode scanning in React Native:
https://medium.com/@dinukadilshanfernando/implementing-a-barcode-scanner-by-using-react-native-camera-b170de4b7f51
MUNBYN Barcode Scanner Device Integration for React Native
I have experience with the same brand of barcode scanner, and the attached scanner device actually acts like a keyboard: when it scans, it types the decoded text and presses Enter automatically. Once I realized that, I created a component for myself.
The idea is that you have to focus an input. After scanning, the text will be placed somewhere, so the quickest way is to focus an input and catch the scanned information inside it.
You also need to handle the "editing ended" case after scanning, because the scanner presses Enter automatically. That's all you really have to know.
import React, { useEffect, useState, useRef } from "react";
import { Input, Button, Stack } from "native-base";

const QuickBarcodeReader = ({ getSingleCode, getArrayOfCodes, endEditingCb, inputStyle }) => {
  const [code, setCode] = useState("");
  const [codeList, setCodeList] = useState([]);
  const inputRef = useRef(null);

  useEffect(() => {
    if (codeList.length > 0) getArrayOfCodes?.(codeList);
  }, [codeList]);

  // The scanner "presses Enter" after each scan, which fires onEndEditing.
  const endEditing = () => {
    setCodeList(list => [...list, code]);
    inputRef?.current?.focus(); // refocus so the next scan is captured
    endEditingCb?.();
    setCode("");
  };

  const textChange = text => {
    setCode(text);
    getSingleCode?.(text);
  };

  return (
    <>
      <Stack w="full">
        {/* Transparent overlay blocks manual taps on the input */}
        <Stack
          style={{ position: "absolute", width: "100%", height: "100%", backgroundColor: "transparent", zIndex: 2 }}
        />
        <Input
          ref={inputRef}
          w="full"
          {...inputStyle}
          placeholder={codeList[codeList.length - 1]}
          showSoftInputOnFocus={false} // keep the soft keyboard hidden
          autoFocus
          value={code}
          onChangeText={textChange}
          onEndEditing={endEditing}
        />
      </Stack>
      <Button opacity={0} h="0" w="0" m="0" p="0" />
    </>
  );
};

export default QuickBarcodeReader;
For design purposes I was using native-base, so please ignore the components themselves and just focus on how I handle the scanner with a few functions and hooks.
I successfully built an app for pickers and delivery drivers with this approach.
Happy scanning, cheers...
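The core of this keyboard-wedge approach can be reduced to a small, framework-free sketch. `createWedgeBuffer` below is a hypothetical helper (not part of any library) that accumulates keystrokes until the scanner's automatic Enter arrives:

```javascript
// Minimal sketch of keyboard-wedge handling: the scanner "types" the barcode
// characters and finishes with an Enter key press. createWedgeBuffer is a
// hypothetical helper, not a library API.
function createWedgeBuffer(onCode) {
  let buffer = "";
  return function handleKey(key) {
    if (key === "Enter") {
      if (buffer.length > 0) onCode(buffer); // a full barcode has arrived
      buffer = "";
    } else {
      buffer += key; // still receiving characters from the scanner
    }
  };
}
```

In a React Native component, onChangeText plus onEndEditing play the role of handleKey; the sketch just isolates the buffering logic.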
Related
I'm building a React Native (Expo) app that connects to a POS; part of the app is scanning items and adding them to the cart. I did it with expo-camera, which works great, but I've been tasked with upgrading it to work with dedicated scanner devices like the Android 10.0 CS20.
Currently the only idea I have is a TextInput for which I don't show the keyboard, triggering the action on input. The issue: I've figured out how to re-focus it, but a user can tap a product to change or delete it, which defocuses the input, and I have no clue how to handle that.
Relevant code
const handleScan = (text: string) => {
  handleBarCodeScanned({ type: 'scanner', data: text });
  // re-focus so the next scan lands in the input
  setTimeout(() => {
    textInput.current?.focus();
  }, 1000);
};

<ScannerCamera
  handleBarCodeScanned={handleBarCodeScanned}
  flashmode={flashmode}
  scannerActive={scannerActive}
/>
<SlidingWindow
  setProducts={setScannedProducts}
  modal={modal}
  setModal={setModal}
>
  <TextInput
    style={{ width: 100, height: 50, backgroundColor: 'red' }}
    ref={textInput}
    value={scan}
    onChangeText={text => handleScan(text)}
    showSoftInputOnFocus={false}
    autoFocus={true}
  />
</SlidingWindow>
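One way to handle the defocus problem described above is to re-focus from the input's onBlur handler. Below is a sketch with a hypothetical makeRefocusHandler helper (not a library API); the scheduler parameter exists only so the behavior can be tested, and defaults to setTimeout:

```javascript
// Sketch of a refocus-on-blur helper (hypothetical, not a library API).
// Pass the input's ref; attach the returned handler to onBlur so the
// hidden input reclaims focus after a tap elsewhere steals it.
function makeRefocusHandler(inputRef, schedule = fn => setTimeout(fn, 0)) {
  return function onBlur() {
    schedule(() => {
      inputRef.current?.focus(); // reclaim focus for the scanner input
    });
  };
}
```

Usage would be `<TextInput onBlur={makeRefocusHandler(textInput)} ... />`; deferring via setTimeout gives the tapped element (e.g. a delete button) a chance to receive its press event before focus returns.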
I want to prevent screenshots and screen recording only on certain screens: the login screen, to stop users from capturing passwords and login information, and the payment screens, because of the sensitive information collected there.
Most of the packages for preventing screen capture set it for all screens.
I have implemented the one mentioned here, but it prevented screen capture on all screens, not only the screen in focus.
I have also tried
npm i react-native-screenshot-prevent
with the same result.
When using Expo, to prevent screenshots or screen recording only on the screen(s) currently in focus, you can combine useIsFocused from React Navigation with allowScreenCaptureAsync and preventScreenCaptureAsync from expo-screen-capture.
Below is an example, as seen here: https://dev.to/one/react-native-prevent-screen-capture-on-selected-screens-19f6
import React, { useEffect } from 'react';
import * as ScreenCapture from 'expo-screen-capture';
import { useIsFocused } from '@react-navigation/native';

export default function LoginScreen() {
  const isFocused = useIsFocused();

  // Toggle capture protection as a side effect whenever focus changes,
  // rather than calling async functions during render.
  useEffect(() => {
    if (isFocused) {
      ScreenCapture.preventScreenCaptureAsync();
    } else {
      ScreenCapture.allowScreenCaptureAsync();
    }
  }, [isFocused]);

  return (
    <View style={styles.container}>
      ...
    </View>
  );
}
Shaking a physical device can be tricky, especially when the device and the cable are poorly connected (because of a problem with the charging socket or the cable).
It has happened to me several times that the connection was lost when I shook the device.
Is there any way to simulate the shake gesture on physical devices?
I was struggling with the same problem and found these solutions:
------------------FOR ANDROID DEVICES------------------
after running your app on a physical device or emulator, run this command to show the dev menu:
adb shell input keyevent KEYCODE_MENU
--------------------FOR IOS DEVICES-------------------------
you can easily add the shake gesture to AssistiveTouch; while your app is running, tap it and the debug menu will be shown. If you wonder how to activate it, check the link.
--------FOR BOTH FROM INSIDE THE CODES---------
import React from 'react';
import {
  AppRegistry,
  View,
  PanResponder,
  NativeModules,
} from 'react-native';

const DevMenuTrigger = ({children}) => {
  const {DevMenu} = NativeModules;
  const panResponder = PanResponder.create({
    onStartShouldSetPanResponder: (evt, gestureState) => {
      // A three-finger touch opens the dev menu
      if (gestureState.numberActiveTouches === 3) {
        DevMenu.show();
      }
      return false;
    },
  });
  return (
    <View style={{flex: 1}} {...panResponder.panHandlers}>
      {children}
    </View>
  );
};

AppRegistry.registerComponent('myApp', () => () => (
  <DevMenuTrigger>
    <MyApp />
  </DevMenuTrigger>
));
With Android you can simply type "d" in your Metro terminal and it will open the developer menu in your app.
It might also work for iOS, but I haven't tried it.
I have created an app and added a splash screen. The app takes 1 second to load on the Android emulator; however, after publishing it to the store, it takes 4 seconds to load.
This is pretty annoying for such a simple app.
I was thinking it was because the _loadResourcesAsync function loads pictures, so I commented those lines out, but nothing changed.
Any recommendations for speeding up my app launch?
Here you can find my App.js:
import React from 'react';
import { Platform, StatusBar, StyleSheet, View } from 'react-native';
import { AppLoading, Asset } from 'expo';
import AppNavigator from './navigation/AppNavigator';

export default class App extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      isLoadingComplete: false,
    };
  }

  render() {
    if (!this.state.isLoadingComplete && !this.props.skipLoadingScreen) {
      return (
        <AppLoading
          startAsync={this._loadResourcesAsync}
          onError={this._handleLoadingError}
          onFinish={this._handleFinishLoading}
        />
      );
    } else {
      return (
        <View style={styles.container}>
          {Platform.OS === 'ios' && <StatusBar barStyle="default" />}
          <AppNavigator />
        </View>
      );
    }
  }

  _loadResourcesAsync = async () => {
    return Promise.all([
      Asset.loadAsync([
        // require('./assets/images/big_bottle.png'),
        // require('./assets/images/bottle.png'),
        // require('./assets/images/coffee.png'),
        // require('./assets/images/juice.png'),
        // require('./assets/images/menu.png'),
        // require('./assets/images/tea.png'),
        // require('./assets/images/water-glass.png'),
      ]),
    ]);
  };

  _handleLoadingError = error => {
    // In this case, you might want to report the error to your error
    // reporting service, for example Sentry
    console.warn(error);
  };

  _handleFinishLoading = () => {
    this.setState({ isLoadingComplete: true });
  };
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#fff',
  },
});
In my experience, hybrid apps in general tend to launch more slowly.
Based on my tests, even a completely blank Expo app can take anywhere from 1-4 seconds to launch, depending on device performance. This is the time from the moment the user taps (opens) the app to the moment the splash screen is hidden (and the first app screen is visible).
What you can do to speed it up is a really broad topic. Here are two recommendations that helped a lot on a project of mine:
Cache your assets.
Bundling assets into your binary will provide for the best user experience as your assets will be available immediately. Instead of having to make a network request to the CDN to fetch your published assets, your app will fetch them from the local disk resulting in a faster, more efficient loading experience.
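Asset bundling is configured in app.json; a minimal sketch follows (the glob pattern is illustrative, adjust it to your project's asset folders):

```json
{
  "expo": {
    "assetBundlePatterns": [
      "assets/images/*"
    ]
  }
}
```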
Always load a cached version of your app first.
You can configure the "updates" section in your app.json to always start with the cached version of your app (fallbackToCacheTimeout: 0) and keep trying to fetch the update in the background (at which point it will be saved into the cache for the next app load).
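As a sketch, the corresponding app.json fragment might look like this (only the "updates" section is shown):

```json
{
  "expo": {
    "updates": {
      "fallbackToCacheTimeout": 0
    }
  }
}
```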
Unfortunately, as of today, it looks like we're somewhat limited in what we can do to further improve the initial loading time of a blank app. There isn't really a reliable way to know how long the 'check for updates' step takes, nor the general React Native initialization time or the 'load all the JS' time.
There is a feature request for tools to improve startup time; it would be great if the Expo team introduced something like that in the future.
I want to develop an app where a user can select different gamepacks to install on their Android or iOS device. A gamepack will consist of:
one or more JSON files
images
sounds
Right now I'm not really concerned whether these are transferred individually or in a zip file (though a zip file would certainly be preferred). Of course, the device will need to be connected to the Internet to get the current list of gamepacks and to download the ones the user chooses. Once the gamepacks are downloaded to the phone, the user will not need an Internet connection to play, as they will have all the files.
How do I go about implementing this in my project?
How do I download the files? Would I use the react-native-fetch-blob library and save them to a specific location? That library refers to saving as "cache" rather than permanently, so I am not sure it's the correct solution. The specific section I am looking at on the library page is "Use Specific File Path". But because it is cache, should I be looking for something more suited to long-term storage? The page does say files will not be deleted, so I am a bit confused about the difference between permanent storage and cache in this case.
Once the files are downloaded, would I then be able to open the images and display them, open the sound files and play them, and open the JSON files and process them?
Check out React Native FS, specifically the documentation on downloadFile:
https://github.com/johanneslumpe/react-native-fs#downloadfileoptions-downloadfileoptions--jobid-number-promise-promisedownloadresult-
Here's a working example:
import React, { Component } from 'react';
import {
  AppRegistry,
  Text,
  View,
  Image,
} from 'react-native';
import RNFS from 'react-native-fs';

export default class downloadFile extends Component {
  constructor() {
    super();
    this.state = {
      isDone: false,
    };
    this.onDownloadImagePress = this.onDownloadImagePress.bind(this);
  }

  onDownloadImagePress() {
    RNFS.downloadFile({
      fromUrl: 'https://facebook.github.io/react-native/img/header_logo.png',
      toFile: `${RNFS.DocumentDirectoryPath}/react-native.png`,
    }).promise.then((r) => {
      this.setState({ isDone: true });
    });
  }

  render() {
    const preview = this.state.isDone ? (
      <View>
        <Image
          style={{
            width: 100,
            height: 100,
            backgroundColor: 'black',
          }}
          source={{
            uri: `file://${RNFS.DocumentDirectoryPath}/react-native.png`,
            scale: 1,
          }}
        />
        <Text>{`file://${RNFS.DocumentDirectoryPath}/react-native.png`}</Text>
      </View>
    ) : null;

    return (
      <View>
        <Text onPress={this.onDownloadImagePress}>Download Image</Text>
        {preview}
      </View>
    );
  }
}

AppRegistry.registerComponent('downloadFile', () => downloadFile);
It's important to know that the height and width must be set on the Image: React Native cannot infer the dimensions of a network or local-file image, so without an explicit size it will not be visible.