I'm having a problem getting a variable after the function runs.
I have to read a QR code and then put it in a text field, but when I try to read it I get the default variable text, and I only get the right QR code when I try to read a second time.
Here is the QR reader function:
Future<void> scanQR() async {
  String barcodeScanRes;
  // Platform messages may fail, so we use a try/catch PlatformException.
  try {
    barcodeScanRes = await FlutterBarcodeScanner.scanBarcode(
        '#ff6666', 'Cancel', true, ScanMode.QR);
    print(barcodeScanRes);
  } on PlatformException {
    barcodeScanRes = 'Failed to get platform version.';
  }
  if (!mounted) return;
  setState(() {
    _scanBarcode = barcodeScanRes;
  });
}
Here is the button that runs the function and then puts the QR code into the controller:
onPressed: () {
  scanQR();
  setState(() {
    _codprodController.text = _scanBarcode;
  });
},
// on the first try it shows "unknown", which is the _scanBarcode default text, instead of the QR code I read
I guess the problem is that "_codprodController.text = _scanBarcode;" is not running after scanQR() but together with it, and I don't know how to fix it.
Try the below. scanQR() is asynchronous, so await it before reading _scanBarcode:
onPressed: () async {
  await scanQR();
  setState(() {
    _codprodController.text = _scanBarcode;
  });
},
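As an alternative (a sketch, not part of the original answer), scanQR() can return the scanned value directly, so the button does not depend on _scanBarcode having been updated first:

// Sketch: return the scan result instead of only storing it in state.
Future<String> scanQR() async {
  String barcodeScanRes;
  try {
    barcodeScanRes = await FlutterBarcodeScanner.scanBarcode(
        '#ff6666', 'Cancel', true, ScanMode.QR);
  } on PlatformException {
    barcodeScanRes = 'Failed to get platform version.';
  }
  return barcodeScanRes;
}

// In the button handler:
onPressed: () async {
  final code = await scanQR();
  if (!mounted) return;
  setState(() {
    _scanBarcode = code;
    _codprodController.text = code;
  });
},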
Related
I am using this package for scanning barcodes. It works fine, but after a successful scan I want to close the camera. I searched for a solution, but nothing worked in my case. Here is my code.
try {
  FlutterBarcodeScanner.getBarcodeStreamReceiver(
          '#ff6666', 'Cancel', true, ScanMode.BARCODE)!
      .listen((data) {
    print(data);
  });
} on PlatformException {
  // barcodeScanRes = 'Failed to get platform version.';
}
}
You can close the scan page by adding these lines to your code:
if (barcodeScanRes != null) {
  print(barcodeScanRes);
  // this will send your scan result to the previous page,
  // or you can navigate to another page after a successful scan
  Navigator.pop(context, barcodeScanRes);
}
String barcodeScanRes; // put this variable in the stateful widget

Future<void> scanQR() async {
  try {
    barcodeScanRes = await FlutterBarcodeScanner.scanBarcode(
        '#ff6666', 'Cancel', true, ScanMode.QR);
    // add these lines to close the scanner or navigate to another page
    if (barcodeScanRes != null) {
      print(barcodeScanRes);
      Navigator.pop(context, barcodeScanRes);
    }
  } on PlatformException {
    barcodeScanRes = 'Failed to get platform version.';
  }
  if (!mounted) return;
  setState(() {
    _scanBarcode = barcodeScanRes;
  });
}
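On the page that opened the scanner, you can receive the popped value from the Navigator future. A minimal sketch (ScanPage here is a hypothetical widget name for the scanner screen):

// Await the value passed to Navigator.pop on the scanner page.
final scanned = await Navigator.push<String>(
  context,
  MaterialPageRoute(builder: (_) => ScanPage()), // ScanPage is hypothetical
);
if (scanned != null) {
  _codprodController.text = scanned;
}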
Or, if you want to pause the camera after a scan, you can use this package: https://pub.dev/packages/qr_code_scanner
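A minimal sketch of that approach, assuming the qr_code_scanner package's QRView/QRViewController API:

import 'package:flutter/material.dart';
import 'package:qr_code_scanner/qr_code_scanner.dart';

// Sketch: pause the camera as soon as a code is scanned.
final GlobalKey qrKey = GlobalKey(debugLabel: 'QR');
QRViewController? controller;

Widget buildScanner(BuildContext context) {
  return QRView(
    key: qrKey,
    onQRViewCreated: (QRViewController ctrl) {
      controller = ctrl;
      ctrl.scannedDataStream.listen((scanData) {
        ctrl.pauseCamera(); // stop scanning after the first result
        Navigator.pop(context, scanData.code);
      });
    },
  );
}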
Hi all, I have an app with a feature that allows the user to take a picture during a task.
Recently, when asking for camera permissions for the first time, the device is not showing the native alert but rather deferring to my secondary alert, which is supposed to be used if the user denied or changed their permission settings after the first attempt.
My understanding is that when a device is asked for the first time, iOS will show its native permissions alert.
I have this in my Info.plist:
<key>NSCameraUsageDescription</key>
<string>Allow access to your camera to take pictures of your dog!</string>
Code for when the user taps the camera button in the app (solution at the bottom):
function takePhoto() {
  check(
    Platform.select({
      ios: PERMISSIONS.IOS.CAMERA,
      android: PERMISSIONS.ANDROID.CAMERA,
    })
  )
    .then(async (response) => {
      if (response == 'unavailable') {
        alert('Camera is not available on this device');
        return;
      }
      const userId = auth().currentUser.uid;
      let image = null;
      const mapAPI = new MapAPI(firestore(), userId, storage);
      const showSettingsAlert = (title, message) => {
        Alert.alert(
          title,
          message,
          [
            {
              text: translations['settings.goto'],
              onPress: async () => {
                Linking.openSettings();
              },
            },
            {
              text: translations['cancel'],
              onPress: () => {
                console.log('Cancel Pressed');
              },
              style: 'cancel',
            },
          ],
          { cancelable: true }
        );
      };
      if (response != RESULTS.GRANTED) {
        showSettingsAlert(
          translations['permissions.cameratitle'],
          translations['permissions.cameradesc']
        );
        return;
      }
I've been trying to tackle this for a while and appreciate any help. Thanks!
You are testing for response != RESULTS.GRANTED. So, if it is “not determined” (the status prior to asking the user’s permissions), that would result in your alert. We generally show our custom alert if the status is “denied” or “restricted”, rather than not “granted”.
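A sketch of that flow with react-native-permissions (check, request, PERMISSIONS, and RESULTS come from that library; showSettingsAlert and translations are the question's own helpers):

import { Platform } from 'react-native';
import { check, request, PERMISSIONS, RESULTS } from 'react-native-permissions';

const cameraPermission = Platform.select({
  ios: PERMISSIONS.IOS.CAMERA,
  android: PERMISSIONS.ANDROID.CAMERA,
});

async function ensureCameraPermission() {
  const status = await check(cameraPermission);
  if (status === RESULTS.GRANTED) return true;
  if (status === RESULTS.DENIED) {
    // Not asked yet (iOS "not determined") -> this triggers the native prompt.
    const requested = await request(cameraPermission);
    return requested === RESULTS.GRANTED;
  }
  if (status === RESULTS.BLOCKED) {
    // Denied permanently -> only now fall back to the custom settings alert.
    showSettingsAlert(
      translations['permissions.cameratitle'],
      translations['permissions.cameradesc']
    );
  }
  return false;
}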
I realized, with help from another Stack Overflow user, that I had not requested access to the camera before showing the custom alert.
Here is the solution that worked in my case.
const takePhoto = async () => {
  const imageProps = {
    mediaType: 'photo',
    width: 400,
    height: 400,
    writeTempFile: true,
  };
  const userId = auth().currentUser.uid;
  let image = null;
  const mapAPI = new MapAPI(firestore(), userId, storage);
  try {
    image = await ImageCropPicker.openCamera(imageProps);
  } catch (error) {
    console.log(error);
    const response = await check(
      Platform.select({
        ios: PERMISSIONS.IOS.CAMERA,
        android: PERMISSIONS.ANDROID.CAMERA,
      })
    );
    if (response !== RESULTS.GRANTED && response !== RESULTS.UNAVAILABLE) {
      showSettingsAlert(
        translations['permissions.cameratitle'],
        translations['permissions.cameradesc']
      );
    }
  }
  if (!image) {
    return;
  }
I'm using the react-native-audio-recorder-player library in my project to record audio.
When I start recording everything works fine, but the application closes when I call the function to stop recording the audio, audioRecorderPlayer.stopRecorder().
It works perfectly on iOS (emulator and real device) and on the Android emulator, but it doesn't work on a real Android device.
What could be the cause of this error, and how could it be resolved?
"react-native": "0.63.3",
"react-native-audio-recorder-player": "^2.6.0-rc3",
Hook for control:
const [record, setRecord] = useState(false);
Start recording function:
async function handleRecord() {
  try {
    const permissions = await getPermissions();
    if (permissions) {
      const path = Platform.select({
        ios: `audio-${new Date().getTime()}.m4a`,
        android: `sdcard/audio-${new Date().getTime()}.mp3`
      });
      const audioSet = {
        AudioEncoderAndroid: AudioEncoderAndroidType.AAC,
        AudioSourceAndroid: AudioSourceAndroidType.MIC,
        AVEncoderAudioQualityKeyIOS: AVEncoderAudioQualityIOSType.high,
        AVNumberOfChannelsKeyIOS: 2,
        AVFormatIDKeyIOS: AVEncodingOption.aac
      };
      const result = await audioRecorderPlayer.startRecorder(path, audioSet);
      setRecord(true);
      audioRecorderPlayer.addRecordBackListener(e => {
        return;
      });
    }
  } catch (err) {
  }
}
Stop recording function:
async function onStopRecord() {
  try {
    const result = await audioRecorderPlayer.stopRecorder();
    audioRecorderPlayer.removeRecordBackListener();
    setRecord(false);
    setFiles([
      ...files,
      {
        uri: result,
        type: "audio/mpeg",
        name: `audio-${new Date().getTime()}.mp3`
      }
    ]);
  } catch (error) {
    Alert.alert(
      "Error",
      String(error) // Alert.alert expects a string message
    );
  }
}
expo-camera: "^8.0.0"
sdkVersion: "36.0.0"
Hello people, when I try:
import { Camera } from 'expo-camera';
...
const cameraIsAvailable = await Camera.isAvailableAsync()
const availablesCameraTypes = await Camera.getAvailableCameraTypesAsync()
console.log("cameraIsAvailable: ", cameraIsAvailable)
console.log("availablesCameraTypes: ", availablesCameraTypes)
I get the following errors:
expo-camera.isAvailableAsync is not available on android, are you sure you've linked all the native dependencies properly?
The method or property expo-camera.getAvailableCameraTypesAsync is not available on android, are you sure you've linked all the native dependencies properly?
The problem just disappears when I remove:
state = {
  ...
  cameraType: Camera.Constants.Type.front,
};
...
<Camera
  type={this.state.cameraType}
  flashMode={flashMode}
  style={styles.preview}
  ref={camera => this.camera = camera}
/>
and replace it with:
state = {
  ...
  cameraType: Camera.Constants.Type.back,
};
and then I change "cameraType" like this:
componentDidMount = () => {
  this.props.navigation.addListener('didFocus', async () => {
    await setTimeout(() => {
      this.setState({ cameraType: Camera.Constants.Type.front })
    }, 100)
  });
}
It seems it's a bug in expo-camera...
So when I try to call these methods:
const cameraIsAvailable = await Camera.isAvailableAsync()
const availablesCameraTypes = await Camera.getAvailableCameraTypesAsync()
I get the following errors: expo-camera.isAvailableAsync and expo-camera.getAvailableCameraTypesAsync are not available on Android.
The methods you're trying to use, Camera.isAvailableAsync and Camera.getAvailableCameraTypesAsync, are marked in the documentation as Web only, so calling them will only work, well, on Web.
In code run in a React Native context (as opposed to a browser context), just check permissions and you should be good to go!
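A minimal sketch of that, assuming the expo-camera ~8.x API used with SDK 36:

import { Camera } from 'expo-camera';

// Request (or confirm) camera permission instead of the Web-only helpers.
async function ensureCameraPermission() {
  const { status } = await Camera.requestPermissionsAsync();
  return status === 'granted';
}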
I am working with Flutter speech-to-text functionality. After I get the text, it is sent in parts.
I am using the speech_recognition plugin: I speak a voice command and the text is sent to an API I created, but the problem is that it sends the text to the API in parts.
Example:
Actual text: What time it is
But sometimes the string goes out twice (two API calls),
or sometimes only a few words are sent at a time.
_speechRecognition.setAvailabilityHandler((bool result) => setState(() {
      _isAvailable = result;
      print('Availability handler was called');
    }));

_speechRecognition.setRecognitionStartedHandler(
  () => setState(() {
    _isListening = true;
    print('recognition start handler was called');
  }),
);

_speechRecognition.setRecognitionResultHandler(
  (String speech) => setState(() {
    resultText = speech;
    print('set result handler was called. This is the result handler: $resultText');
    if (_isListening == false && resultText != '') {
      _handleSubmitted(speech);
    }
  }),
);

_speechRecognition.setRecognitionCompleteHandler(() {
  setState(() {
    print('set recognition complete handler was called');
    _isListening = false;
  });
});

_speechRecognition.activate().then(
  (result) => setState(() {
    _isAvailable = result;
    print('set activate handler was called');
  }),
);
}
What I have done to resolve this issue in my project: I added a new variable to hold the result of speech recognition, and in the complete handler I pass that result to the API call.
Example:
String transcription = '';

_speech.setRecognitionResultHandler(onRecognitionResult);
_speech.setRecognitionCompleteHandler(onRecognitionComplete);

void onRecognitionResult(String text) {
  setState(() {
    transcription = text;
  });
}

void onRecognitionComplete() {
  setState(() {
    LogUtils.d("result.....$transcription");
    _handleSubmitted(transcription);
  });
}
Instead of making the API call in onRecognitionResult, use onRecognitionComplete.