Shaking a physical device can be awkward, especially when the device and the cable are barely holding a connection (because of a worn charging socket or cable).
It has happened to me several times: when I shook the device, the connection dropped.
Is there any way to simulate the shake gesture on physical devices?
I was struggling with the same problem and found these solutions:
------------------FOR ANDROID DEVICES------------------
After running your app on a physical device or emulator, run this command to open the dev menu:
adb shell input keyevent KEYCODE_MENU
--------------------FOR IOS DEVICES-------------------------
You can easily add the shake gesture to AssistiveTouch; while your app is running, tap it and the debug menu will be shown. If you wonder how to activate it, check the link.
--------FOR BOTH, FROM INSIDE THE CODE---------
import React from 'react';
import {
  AppRegistry,
  View,
  PanResponder,
  NativeModules,
} from 'react-native';

// Wraps the app and opens the dev menu on a three-finger touch.
const DevMenuTrigger = ({children}) => {
  const {DevMenu} = NativeModules;
  const panResponder = PanResponder.create({
    onStartShouldSetPanResponder: (evt, gestureState) => {
      if (gestureState.numberActiveTouches === 3) {
        DevMenu.show();
      }
      return false;
    },
  });
  return (
    <View style={{flex: 1}} {...panResponder.panHandlers}>
      {children}
    </View>
  );
};

const Root = () => (
  <DevMenuTrigger>
    <MyApp />
  </DevMenuTrigger>
);

AppRegistry.registerComponent('myApp', () => Root);
On Android you can simply press "d" in the Metro terminal and it will open the developer menu in your app.
It might also work for iOS, but I haven't tried it.
I am building a React Native (Expo) app that connects to a POS; part of the app scans items and adds them to the cart. I did this with expo-camera, which works great, but I have been asked to make it work with dedicated scanner devices like the Android 10.0 CS20.
Currently the only idea I have is a TextInput that never shows the keyboard and triggers the action on input. The issue: I figured out how to re-focus it, but a user can tap a product to change or delete it, which de-focuses the input, and I have no clue how to handle that.
Relevant code
const handleScan = (text: string) => {
  handleBarCodeScanned({ type: 'scanner', data: text });
  setTimeout(() => {
    textInput.current?.focus();
  }, 1000);
};

<ScannerCamera
  handleBarCodeScanned={handleBarCodeScanned}
  flashmode={flashmode}
  scannerActive={scannerActive}
/>

<SlidingWindow
  setProducts={setScannedProducts}
  modal={modal}
  setModal={setModal}
>
  <TextInput
    style={{ width: 100, height: 50, backgroundColor: 'red' }}
    ref={textInput || null}
    value={scan}
    onChangeText={(text) => handleScan(text)}
    showSoftInputOnFocus={false}
    autoFocus={true}
  />
</SlidingWindow>
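One possible way to handle the de-focus problem described above is to grab focus back whenever the hidden input loses it. This is only a sketch, assuming textInput is the ref from the snippet; the 250 ms delay is an arbitrary choice so the tapped product row can handle its own press first:
<TextInput
  ref={textInput}
  value={scan}
  showSoftInputOnFocus={false}
  autoFocus={true}
  onChangeText={(text) => handleScan(text)}
  // If a tap on a product steals focus, take it back shortly afterwards
  // so the hardware scanner keeps feeding this input.
  onBlur={() => {
    setTimeout(() => {
      textInput.current?.focus();
    }, 250);
  }}
/>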
I have a Barcode Scanner peripheral attached to my phone (see attached image) and cannot seem to find a way to access the readings of the scanner. The phone is reading the barcodes as there is an emulator app within the phone which displays the numeric values of barcodes once scanned.
My question is, how can the Barcode peripheral be accessed? Similar to how the Camera is accessed via the NPM package RNCamera (React Native).
Source - https://www.amazon.com/MUNBYN-Ergonomic-Warehouse-Inventory-Management/dp/B0885ZY3DV
For React Native you can use this library for barcode scanning.
Library link
https://github.com/react-native-camera/react-native-camera
Here is an explanation for React Native:
https://medium.com/@dinukadilshanfernando/implementing-a-barcode-scanner-by-using-react-native-camera-b170de4b7f51
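As a rough, hedged sketch of the library's barcode callback (not taken from the linked article), reading barcodes with react-native-camera can look like this:
import React from 'react';
import { RNCamera } from 'react-native-camera';

// Minimal sketch: render the camera and log every barcode it reads.
const BarcodeScannerScreen = () => (
  <RNCamera
    style={{ flex: 1 }}
    captureAudio={false}
    onBarCodeRead={({ type, data }) => {
      console.log(`Scanned ${type}: ${data}`);
    }}
  />
);

export default BarcodeScannerScreen;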
MUNBYN Barcode Scanner Device Integration for React Native
I have experience with the same brand of barcode scanner, and the attached scanner device actually acts like a keyboard. When it scans, it types some text and presses Enter automatically. After I realized that, I created a component for myself.
The idea is that you have to keep an input focused. After a scan, the text has to land somewhere, so the quickest way is to focus an input and capture the scanned information there.
You also need to handle the "editing ended" case after each scan, because the scanner presses Enter automatically. That's really all you need to know.
import React, { useEffect, useState, useRef } from "react";
import { Center, Spinner, Input, VStack, Button, Stack } from "native-base";

const QuickBarcodeReader = ({ getSingleCode, getArrayOfCodes, endEditingCb, inputStyle }) => {
  const [code, setCode] = useState("");
  const [codeList, setCodesList] = useState([]);
  const inputRef = useRef(null);

  // Report the full list of scanned codes every time a new one is added.
  useEffect(() => {
    if (codeList.length > 0) getArrayOfCodes?.(codeList);
  }, [codeList]);

  // The scanner presses Enter after each scan, which ends editing:
  // store the code, re-focus the input and clear it for the next scan.
  const endEditing = () => {
    setCodesList(i => [...i, code]);
    inputRef?.current?.focus();
    endEditingCb?.();
    setCode("");
  };

  const textChange = text => {
    setCode(text);
    getSingleCode?.(text);
  };

  return (
    <>
      <Stack w="full">
        {/* Transparent layer above the input so taps on it don't interfere. */}
        <Stack
          style={{ position: "absolute", width: "100%", height: "100%", backgroundColor: "transparent", zIndex: 2 }}
        />
        <Input
          ref={inputRef}
          w="full"
          {...inputStyle}
          placeholder={codeList[codeList.length - 1]}
          showSoftInputOnFocus={false}
          autoFocus
          value={code}
          onChangeText={textChange}
          onEndEditing={endEditing}
        />
      </Stack>
      <Button opacity={0} h="0" w="0" m="0" p="0" />
    </>
  );
};

export default QuickBarcodeReader;
For design purposes I was using native-base, so please ignore the specific components; just focus on how I handle the scanner with a few functions and hooks.
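As a hypothetical usage example (the screen and handler names below are mine, not part of the component above), it can be dropped into a screen like this:
import React from 'react';
import QuickBarcodeReader from './QuickBarcodeReader';

// Hypothetical screen: log each scan and the growing list of scans.
const ReceivingScreen = () => (
  <QuickBarcodeReader
    getSingleCode={code => console.log('current scan:', code)}
    getArrayOfCodes={codes => console.log('all scans so far:', codes)}
    endEditingCb={() => console.log('scan finished')}
  />
);

export default ReceivingScreen;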
I successfully created an app for pickers and delivery guys on the market.
Happy scanning, cheers...
I have created an app and added a splash screen. It takes 1 second to load the app on the Android emulator; however, after I publish the app to the store, it takes 4 seconds to load.
This is pretty annoying for such a simple app.
I was thinking it was because of the _loadResourcesAsync function loading pictures, so I commented those lines out, but nothing changed.
Any recommendations to speed up my app launch?
Here you can find my App.js:
import React from 'react';
import { Platform, StatusBar, StyleSheet, View } from 'react-native';
import { AppLoading, Asset } from 'expo';
import AppNavigator from './navigation/AppNavigator';
export default class App extends React.Component {
constructor(props) {
super(props);
this.state = {
isLoadingComplete: false,
};
}
render() {
if (!this.state.isLoadingComplete && !this.props.skipLoadingScreen) {
return (
<AppLoading
startAsync={this._loadResourcesAsync}
onError={this._handleLoadingError}
onFinish={this._handleFinishLoading}
/>
);
} else {
return (
<View style={styles.container}>
{Platform.OS === 'ios' && <StatusBar barStyle="default" />}
<AppNavigator />
</View>
);
}
}
_loadResourcesAsync = async () => {
return Promise.all([
Asset.loadAsync([
// require('./assets/images/big_bottle.png'),
// require('./assets/images/bottle.png'),
// require('./assets/images/coffee.png'),
// require('./assets/images/juice.png'),
// require('./assets/images/menu.png'),
// require('./assets/images/tea.png'),
// require('./assets/images/water-glass.png'),
]),
]);
};
_handleLoadingError = error => {
// In this case, you might want to report the error to your error
// reporting service, for example Sentry
console.warn(error);
};
_handleFinishLoading = () => {
this.setState({ isLoadingComplete: true });
};
}
const styles = StyleSheet.create({
container: {
flex: 1,
backgroundColor: '#fff',
},
});
Based on my experience, hybrid apps, in general, tend to launch slower.
Based on my tests, even a completely blank Expo app, based on the device performance, can take anywhere from 1-4s to launch. This is the time it takes from the moment the user taps (opens) the app to the time the splash screen gets hidden (and the first app screen is visible).
What you can do to speed it up is a really broad topic. Here are two recommendations that helped a lot on a project of mine:
Cache your assets.
Bundling assets into your binary will provide for the best user experience as your assets will be available immediately. Instead of having to make a network request to the CDN to fetch your published assets, your app will fetch them from the local disk resulting in a faster, more efficient loading experience.
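A minimal app.json sketch of that, assuming the images live under ./assets/images as in the question:
{
  "expo": {
    "assetBundlePatterns": [
      "assets/images/*"
    ]
  }
}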
Always load a cached version of your app first.
You can configure the "updates" section in your app.json to always start with the cached version of your app first (fallbackToCacheTimeout: 0) and keep trying to fetch the update in the background (at which point it will be saved into the cache for the next app load).
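The corresponding app.json section is roughly:
{
  "expo": {
    "updates": {
      "fallbackToCacheTimeout": 0
    }
  }
}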
Unfortunately, as of today, we are somewhat limited in what we can do to improve the initial loading time of a blank app any further. There isn't really a reliable way to know how long the 'check for updates' step, the general React Native initialization, or the 'load all the JS' step takes.
There is a feature request for tooling to improve startup time; it would be great if the Expo team introduced something like that in the future.
I am using BrowserStack's Automate API with the selenium-webdriver Node package to programmatically take screenshots on different browsers and devices.
The desktop screenshots work fine: we take a screenshot, scroll, take another screenshot, and so on until the end of the page is reached. Then some code stitches the screenshots together.
On Android devices there is a problem: BrowserStack takes screenshots that seem to include the space where the browser's bottom bar is, but it comes out as white space.
As per the BrowserStack documentation, this is the method I am using to take the screenshot:
const fs = require('fs');

// Take a screenshot and write it to ./screenshots/<filename> as a PNG.
webdriver.WebDriver.prototype.saveScreenshot = (filename) => {
  return driver.takeScreenshot().then((data) => {
    fs.writeFile(
      `${__dirname}/screenshots/${filename}`,
      data.replace(/^data:image\/png;base64,/, ''),
      'base64',
      (err) => {
        if (err) {
          throw err;
        }
      },
    );
  });
};
I can successfully enter kiosk mode with desktop Chrome by adding a --kiosk argument to the chromeOptions object, but it has no effect in Chrome on a mobile device.
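For reference, a minimal sketch of how that flag can be passed with selenium-webdriver's ChromeOptions (the BrowserStack credentials and device capabilities are left out):
const { Builder } = require('selenium-webdriver');
const chrome = require('selenium-webdriver/chrome');

// --kiosk works for desktop Chrome sessions, but as described above it is
// ignored by Chrome on Android devices.
const options = new chrome.Options().addArguments('--kiosk');

const driver = new Builder()
  .usingServer('https://hub-cloud.browserstack.com/wd/hub')
  .forBrowser('chrome')
  .setChromeOptions(options)
  // .withCapabilities({ ... }) would carry the BrowserStack user, key and device.
  .build();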
I have also tried executing the script document.documentElement.requestFullScreen through the driver, with no luck.
Is it possible to enter kiosk mode on Android Chrome programmatically with selenium-webdriver?
Is there another way to reliably and programmatically hide the URL bar?
We are encountering a very bizarre scenario with react-navigation in our React Native application. It is only observed on Android (in the emulator and on physical devices, in debug as well as release builds); everything works fine on iOS.
Context
We have an existing native application, and decided to implement some new screens in React Native as an experiment to see whether it would benefit our development lifecycle.
Our native app has a sidebar menu, and we added a new menu item, that when selected, takes the user into the React Native portion. They can of course navigate back out whenever they want, and later go back into that React Native portion.
Observed problem (Only occurs in Android)
We have identified it relates to the react-navigation library, but we don't know what we're doing wrong.
When the app is first loaded, the user can select the new menu item and the React Native app loads fine, showing its initial route page and with the StackNavigator working fine.
If the user returns to the native portion (either via the back key or by selecting a different option from the sidebar menu) and later goes back into the React Native portion, the StackNavigator content doesn't display. Other React components outside the StackNavigator still render. We know the contained components mount, because some of them make API calls and we see those endpoints being queried; they just don't render.
Reloading within the emulator will render the app properly again until we navigate out of React Native and then return.
Oddly enough: If we turn on remote JS debugging, it suddenly all works fine.
So our question:
Can anyone spot what we might be missing in how we are using the StackNavigator that keeps it from rendering properly? Again: it works fine when the JS debugger is on, which makes us think it is not a logic problem but perhaps a timing condition or some subtle configuration. Or should we just ditch react-navigation and switch to a different navigation library?
Simple reproduction of the issue
Our package.json is:
{
"dependencies": {
"react": "16.0.0",
"react-native": "0.50.4",
"react-navigation": "1.5.2"
}
}
Our React Native entry page (index.js) is:
import * as React from 'react';
import { AppRegistry, Text, View } from 'react-native';
import { StackNavigator } from 'react-navigation';
import TestPage from './TestPage';
AppRegistry.registerComponent('MyApp', () => MyApp);
class MyApp extends React.Component {
  render() {
    return (
      <View style={{flex: 1}}>
        <Text>'This text always shows up fine on Android, even on reentry to React application'</Text>
        <AccountNavigator />
      </View>
    );
  }
}

const AccountNavigator = StackNavigator(
  {
    FirstPage: {
      screen: TestPage,
      navigationOptions: ({ navigation }) => ({
        title: 'Test View'
      })
    }
  },
  {
    initialRouteName: 'FirstPage'
  }
);
The simple test page (TestPage.js) is just:
import * as React from 'react';
import { Text, View } from 'react-native';
export default class TestPage extends React.Component {
render() {
return (
<View style={{flex:1, alignItems: 'center', justifyContent: 'center'}}>
<Text>'If you can read this, then the app is on first load. But return to the native portion and then try to come back to React Native and you will not see me.'</Text>
</View>
);
}
}
Turns out it was a layout setting issue. In our native code, within our React Activity layout XML we had:
<com.facebook.react.ReactRootView
    android:id="@+id/ReactRootView"
    android:layout_width="match_parent"
    android:layout_height="wrap_content" />
and the issue was the "wrap_content" height, which caused the StackNavigator content to render 1 pixel high. No idea why it only happened on re-entry and not on the first load, nor why attaching the JS debugger made the issue disappear.
Changing layout_height to "match_parent" resolved the issue altogether.
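For completeness, the corrected layout, with only layout_height changed from the snippet above:
<com.facebook.react.ReactRootView
    android:id="@+id/ReactRootView"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />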