I'm building a React Native (Expo) app that connects to a POS. Part of the app scans items and adds them to the cart. I implemented this with expo-camera, which works great, but I've been tasked with making it work on dedicated scanner devices like the Android 10.0 CS20.
Currently the only idea I have is a TextInput that never shows the keyboard and triggers the action on input. The issue: I figured out how to re-focus it after a scan, but a user can tap a product to change or delete it, which defocuses the input, and I haven't found a reliable way to handle that.
Relevant code
const handleScan = (text: string) => {
  handleBarCodeScanned({ type: 'scanner', data: text });
  setTimeout(() => {
    textInput.current?.focus();
  }, 1000);
};
<ScannerCamera
  handleBarCodeScanned={handleBarCodeScanned}
  flashmode={flashmode}
  scannerActive={scannerActive}
/>
<SlidingWindow
  setProducts={setScannedProducts}
  modal={modal}
  setModal={setModal}
>
  <TextInput
    style={{ width: 100, height: 50, backgroundColor: 'red' }}
    ref={textInput}
    value={scan}
    onChangeText={(text) => handleScan(text)}
    showSoftInputOnFocus={false}
    autoFocus={true}
  />
</SlidingWindow>
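My working assumption is that the CS20 behaves like a keyboard wedge: it types the code into whatever has focus and then sends Enter. Based on that, one direction I've experimented with (a sketch, not something I'm confident in) is grabbing focus back whenever the input blurs:
<TextInput
  ref={textInput}
  value={scan}
  onChangeText={handleScan}
  showSoftInputOnFocus={false}
  autoFocus={true}
  // When a tap on a product steals focus, take it back after a short
  // delay so the tap itself still goes through first.
  onBlur={() => setTimeout(() => textInput.current?.focus(), 100)}
/>
but the hard-coded delay feels fragile, which is why I'm asking.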
Related
I have a Barcode Scanner peripheral attached to my phone (see attached image) and cannot seem to find a way to access the readings of the scanner. The phone is reading the barcodes as there is an emulator app within the phone which displays the numeric values of barcodes once scanned.
My question is, how can the Barcode peripheral be accessed? Similar to how the Camera is accessed via the NPM package RNCamera (React Native).
Source - https://www.amazon.com/MUNBYN-Ergonomic-Warehouse-Inventory-Management/dp/B0885ZY3DV
For React Native you can use this library for barcode scanning.
Library link
https://github.com/react-native-camera/react-native-camera
Here is an explanation for React Native:
https://medium.com/@dinukadilshanfernando/implementing-a-barcode-scanner-by-using-react-native-camera-b170de4b7f51
MUNBYN Barcode Scanner Device Integration for React Native
I have experience with the same brand of barcode scanner, and the attached scanner actually acts like a keyboard. When it scans, it types the text and presses Enter automatically. After I realized that, I created a component for myself.
The idea is that you have to focus an input: after scanning, the text has to land somewhere, so the quickest way is to focus an input and catch the scanned information inside it.
You also need to handle the "editing end" case after scanning, because the scanner presses Enter automatically. That's all you really need to know.
import React, { useEffect, useState, useRef } from "react";
import { Input, Button, Stack } from "native-base";

const QuickBarcodeReader = ({ getSingleCode, getArrayOfCodes, endEditingCb, inputStyle }) => {
  const [code, setCode] = useState("");
  const [codeList, setCodeList] = useState([]);
  const inputRef = useRef(null);

  // Report the full list of scanned codes whenever a new one is added.
  useEffect(() => {
    if (codeList.length > 0) getArrayOfCodes?.(codeList);
  }, [codeList]);

  // The scanner presses Enter after typing a code, which fires onEndEditing:
  // store the code, re-focus the input for the next scan, and reset the field.
  const endEditing = () => {
    setCodeList(list => [...list, code]);
    inputRef?.current?.focus();
    endEditingCb?.();
    setCode("");
  };

  const textChange = text => {
    setCode(text);
    getSingleCode?.(text);
  };

  return (
    <>
      <Stack w="full">
        {/* Transparent overlay stacked above the input so the user
            cannot tap into it directly; only the scanner writes to it. */}
        <Stack
          style={{ position: "absolute", width: "100%", height: "100%", backgroundColor: "transparent", zIndex: 2 }}
        />
        <Input
          ref={inputRef}
          w="full"
          {...inputStyle}
          placeholder={codeList[codeList.length - 1]}
          showSoftInputOnFocus={false}
          autoFocus
          value={code}
          onChangeText={textChange}
          onEndEditing={endEditing}
        />
      </Stack>
      {/* Invisible zero-size button (a harmless target for stray presses). */}
      <Button opacity={0} h="0" w="0" m="0" p="0" />
    </>
  );
};

export default QuickBarcodeReader;
For design purposes I was using native-base, so please ignore the components; just focus on how I handle the scanner with a few functions and hooks.
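For illustration, here is a minimal usage sketch (the screen name and handlers are hypothetical, not from the real app):
import React from "react";
import QuickBarcodeReader from "./QuickBarcodeReader";

const ScanScreen = () => (
  <QuickBarcodeReader
    getSingleCode={text => console.log("currently typed:", text)}
    getArrayOfCodes={codes => console.log("all scanned codes:", codes)}
    endEditingCb={() => console.log("one scan committed")}
  />
);

export default ScanScreen;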
I successfully built an app for pickers and delivery drivers that's now on the market.
Happy scanning, cheers...
Shaking a physical device can be hard sometimes, especially when the device and the cable have a poor connection (because of a problem with the charging socket or cable).
It has happened to me several times: when I wanted to shake the device, my connection was lost.
Is there any way to simulate the shake gesture on physical devices?
I was struggling with the same problem and found these solutions:
------------------FOR ANDROID DEVICES------------------
After running your app on a physical device or emulator, run this command to see the dev menu:
adb shell input keyevent KEYCODE_MENU
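If your adb version doesn't accept the symbolic name, the numeric keycode does the same thing (82 is KEYCODE_MENU):
adb shell input keyevent 82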
--------------------FOR IOS DEVICES-------------------------
You can easily add the shake gesture to AssistiveTouch; while your app is running, tap it and the debug menu will be shown. If you wonder how to activate it, check the link.
--------FOR BOTH FROM INSIDE THE CODES---------
import React from 'react';
import {
  AppRegistry,
  View,
  PanResponder,
  NativeModules,
} from 'react-native';

const DevMenuTrigger = ({children}) => {
  // NativeModules.DevMenu is only available in development builds.
  const {DevMenu} = NativeModules;
  const panResponder = PanResponder.create({
    onStartShouldSetPanResponder: (evt, gestureState) => {
      // Open the dev menu on a three-finger touch.
      if (gestureState.numberActiveTouches === 3) {
        DevMenu.show();
      }
      return false;
    },
  });
  return (
    <View style={{flex: 1}} {...panResponder.panHandlers}>
      {children}
    </View>
  );
};

AppRegistry.registerComponent('myApp', () => () => (
  <DevMenuTrigger>
    <MyApp />
  </DevMenuTrigger>
));
With Android you can simply type "d" in your Metro terminal and it will open the developer menu in your app.
It might also work for iOS, but I haven't tried it.
I'm hoping this is a known problem as I can't provide much to go on to get help solving it.
I'm using react-navigation for my app with the following setup:
AppContainer
Splash Screen
Setup Screen (A stack navigator)
Main screen (A stack navigator)
When my app starts it goes to the splash screen, which decides whether this is the first run and, depending on this decision, calls this.props.navigation.navigate() with either the main screen or the setup screen.
So far this all works and is fairly standard.
When my app is launched for the first time the user is sent to the setup screen, where they navigate through a series of screens entering data and selecting a next button to proceed. This is where the problems occur. My first screen simply has some text and a next button (a regular Button component) which, when clicked, calls this.props.navigation.push('nextviewname', {data: data}) to move to the next view.
The next view contains a TextInput as well as back and next buttons, which is where I'm having problems. When I reach this screen after freshly installing a release version of my app onto my Android X phone, none of the inputs work. I can't:
Click next or back
Click the back arrow in the top left that is part of the header
Click the text input (the cursor does briefly show up in the text input but the keyboard never appears)
Click the hardware back key
On very rare occasions some of the above does work (e.g. the text input will sometimes respond), and sometimes I'll even make it to the next step of my setup, but it's rare that I make it all the way through.
Weirdly this all works when I'm debugging my app on my phone.
Update: I've just tested on an Android 9 emulator and I'm getting the same issue.
Update 2: This is getting weird. When the app is in a broken state I can still bring up the React Native debug menu, but when I click Toggle Inspector nothing happens (i.e. I don't get the inspector UI). It's looking like this is somehow breaking everything.
Has anyone seen/solved this issue before? At the moment it's effectively made my app useless.
Update 3: Some code to hopefully make things clearer:
const SetupUser = createStackNavigator(
{
SetupUser: WelcomeScreen,
SetupName: UserName,
SetupCurrentWeight: CurrentWeight,
SetupGoalWeight: GoalWeight,
SetupGoalDate: GoalDate,
Summary: Summary,
LogWeight: LogWeight,
},
{
defaultNavigationOptions: {
headerStyle: {
backgroundColor: '#001830',
},
headerTintColor: '#fff',
headerTitleStyle: {
fontWeight: 'bold',
},
},
},
);
const MainApp = createStackNavigator(
{
LogWeight: LogWeight,
LogWeightSummary: LogWeightSummary,
},
{
defaultNavigationOptions: {
headerStyle: {
backgroundColor: '#001830',
},
headerTintColor: '#fff',
headerTitleStyle: {
fontWeight: 'bold',
},
},
},
);
export default createAppContainer(
createSwitchNavigator(
{
MainApp: MainApp,
SplashScreen: SplashScreen,
SetupUser: SetupUser,
},
{
initialRouteName: 'SplashScreen',
},
),
);
In getting this code snippet together (I've removed the tab navigator, as the error is still there even without it), I think I've managed to track down the source of the issue; however, I'm still not sure how to fix it. The first view loaded is the splash screen, which looks like this:
export default class SplashScreen extends React.Component {
constructor(props) {
super(props);
GoBinder.verifyDatabase()
.then(() => GoBinder.getLastUser())
.then(user => {
this.props.navigation.navigate(
user.user == null ? 'SetupUser' : 'MainApp',
);
})
.catch(error => {
GoBinder.toast('Error while checking initial user state: ' + error);
});
}
render() {
return (
<View style={styles.container}>
<ActivityIndicator />
<StatusBar barStyle="default" />
</View>
);
}
}
In the above code, GoBinder.verifyDatabase() and GoBinder.getLastUser() are calls to native code which perform some database operations to get things set up and check whether there are any existing users. I believe the issue is that this.props.navigation.navigate is firing too quickly, which causes react-navigation to load the next screen but get messed up in the process.
I've tried following the suggestions in this post https://www.novemberfive.co/blog/react-performance-navigation-animations about moving blocking code into InteractionManager.runAfterInteractions under componentDidMount, however this made no difference. However, if I manually trigger the move to the new screen using a button, everything works correctly, so it's really looking like the act of programmatically changing screens is messing things up.
Update 4:
Looks like I jumped the gun a bit; it's still freezing up a fair bit, so I'm moving my answer into an update:
So I'm not sure if this is the best solution, but I have found a workaround. I am now calling my database code in InteractionManager.runAfterInteractions() as suggested in https://www.novemberfive.co/blog/react-performance-navigation-animations. I am then calling setState with the result of the DB functions to store the result. Finally, I am using a callback on setState to trigger the actual navigation. My complete working code is:
componentDidMount() {
  // Defer the DB work until navigation/mount animations have finished.
  InteractionManager.runAfterInteractions(() => {
    GoBinder.verifyDatabase()
      .then(() => GoBinder.getLastUser())
      .then(user => {
        // Store the user, then navigate from the setState callback so the
        // transition only starts once the state update has been committed.
        this.setState({ user: user.user }, () => {
          this.props.navigation.navigate(
            this.state.user == null ? 'SetupUser' : 'MainApp',
          );
        });
      })
      .catch(error => {
        GoBinder.toast('Error while checking initial user state: ' + error);
      });
  });
}
Thanks for your help
I am working on some end-to-end tests for a React Native (version 0.60) application, which are run via Appium.
I have a couple of buttons wrapped in a SafeAreaView to avoid problems with the latest iOS devices (e.g. iPhone X, iPad Pro, etc...). This is the key part of the component's render() function:
const Buttons = (
<StickyContainer visible={isSizeSelected} width={width} style={containerStyle}>
{showAddToWishlist && (
<Button
outline
fixedWidth
uppercase
tx="product.addToWishlist"
onPress={() => this.onPressAddTo(ProductAction.AddToWishlist)}
icon="heartBlack"
margin={margin}
showSpinner={isAddingToWishlist}
/>
)}
{showAddToShoppingBag && (
<Button
primary
fixedWidth
uppercase
tx="product.addToCart"
onPress={() => this.onPressAddTo(ProductAction.AddToShoppingBag)}
showSpinner={isAddingToShoppingBag}
{...setTestId("sizeOverlayAddToCartButton")}
/>
)}
</StickyContainer>
)
return <SafeAreaView forceInset={{ bottom: "never" }}>{Buttons}</SafeAreaView>
As you can see, the accessibility IDs are set through the setTestId() function, which does nothing more than this:
const getPlatformTestId = (id: string) => {
if (IS_IOS) {
return {
testID: id
}
}
return {
accessibilityLabel: id,
accessible: true
}
}
export const setTestId = (id: string) => {
return getPlatformTestId(id)
}
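For reference, the lookup on the Appium side looks roughly like this; I'm showing WebdriverIO's accessibility-id selector as an example (other Appium clients use a different syntax):
// WebdriverIO: "~" selects by accessibility id (testID on iOS)
const addToCart = await $("~sizeOverlayAddToCartButton");
await addToCart.click();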
Now the problem: if I run the app and search in Appium for the ID sizeOverlayAddToCartButton, nothing is found. If I remove the <SafeAreaView> and return Buttons directly, the ID is found without any problem.
It's also interesting that if I use the Accessibility Inspector app (part of Xcode) instead of Appium, the ID is always found, whether or not I use the <SafeAreaView>.
Does anyone know why this is not working? I can't find any compatibility issue online.
I have created an app and added a splash screen. It takes 1 second to load the app on the Android emulator; however, after I published the app to the store, it takes 4 seconds to load.
This is pretty annoying for such a simple app.
I was thinking it was because of the _loadResourcesAsync function loading pictures, so I commented out those lines, but nothing changed.
Any recommendations to speed up my app launch?
Here you can find my App.js:
import React from 'react';
import { Platform, StatusBar, StyleSheet, View } from 'react-native';
import { AppLoading, Asset } from 'expo';
import AppNavigator from './navigation/AppNavigator';
export default class App extends React.Component {
constructor(props) {
super(props);
this.state = {
isLoadingComplete: false,
};
}
render() {
if (!this.state.isLoadingComplete && !this.props.skipLoadingScreen) {
return (
<AppLoading
startAsync={this._loadResourcesAsync}
onError={this._handleLoadingError}
onFinish={this._handleFinishLoading}
/>
);
} else {
return (
<View style={styles.container}>
{Platform.OS === 'ios' && <StatusBar barStyle="default" />}
<AppNavigator />
</View>
);
}
}
_loadResourcesAsync = async () => {
return Promise.all([
Asset.loadAsync([
// require('./assets/images/big_bottle.png'),
// require('./assets/images/bottle.png'),
// require('./assets/images/coffee.png'),
// require('./assets/images/juice.png'),
// require('./assets/images/menu.png'),
// require('./assets/images/tea.png'),
// require('./assets/images/water-glass.png'),
]),
]);
};
_handleLoadingError = error => {
// In this case, you might want to report the error to your error
// reporting service, for example Sentry
console.warn(error);
};
_handleFinishLoading = () => {
this.setState({ isLoadingComplete: true });
};
}
const styles = StyleSheet.create({
container: {
flex: 1,
backgroundColor: '#fff',
},
});
Based on my experience, hybrid apps, in general, tend to launch slower.
Based on my tests, even a completely blank Expo app can take anywhere from 1-4s to launch, depending on device performance. This is the time from the moment the user taps (opens) the app to the moment the splash screen is hidden (and the first app screen is visible).
What you can do to speed it up is a really broad topic. Here are two recommendations that helped a lot on a project of mine:
Cache your assets.
Bundling assets into your binary will provide the best user experience, as your assets will be available immediately. Instead of making a network request to the CDN to fetch your published assets, your app will fetch them from the local disk, resulting in a faster, more efficient loading experience.
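With Expo, bundling is configured through assetBundlePatterns in app.json. A minimal sketch (the glob below assumes your images live under assets/images; adjust as needed):
{
  "expo": {
    "assetBundlePatterns": [
      "assets/images/*"
    ]
  }
}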
Always load a cached version of your app first.
You can configure the "updates" section in your app.json to always start with the cached version of your app (fallbackToCacheTimeout: 0) and keep trying to fetch the update in the background (at which point it will be saved into the cache for the next app load).
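For example, in app.json (a minimal sketch showing only the relevant key):
{
  "expo": {
    "updates": {
      "fallbackToCacheTimeout": 0
    }
  }
}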
Unfortunately, as of today, it looks like we're somewhat limited in what we can do to further improve the initial loading time of a blank app. There isn't really a reliable way to know how long the 'check for updates' process takes, the general React Native initialization time, or the 'load all the JS' time.
There is a feature request for tools to improve the startup time; it would be great if the Expo team introduced something like that in the future.