On a specific device (Samsung Galaxy S9 with Android 9), when I try to open the camera through ExponentImagePicker, I get the following error:
Error: Call to function 'ExponentImagePicker.launchCameraAsync' has been rejected.
Caused by: kotlin.UninitializedPropertyAccessException: lateinit property cameraLauncher has not been initialized
On an Android 9 emulator it works, and it also works on emulators with newer API versions.
This was working previously, but it seems to have stopped after updating React Native and other libraries.
Anything I can do about it?
Code:
import * as ImagePicker from 'expo-image-picker';
const MediaSelector: React.FC<Props> = (props) => {
const open = async () => {
const permissions = await ImagePicker.requestCameraPermissionsAsync();
if (!permissions.granted) return Alert.alert("permission denied!");
const config: ImagePicker.ImagePickerOptions = {
mediaTypes: ImagePicker.MediaTypeOptions.Images,
allowsEditing: true,
allowsMultipleSelection: false,
exif: false,
aspect: [1, 1],
}
try {
const result = await ImagePicker.launchCameraAsync(config);
} catch (error) {
console.log(error)
Alert.alert("error!")
return
}
}
return <Pressable style={styles.container} onPress={open}>
<ImageView img={props.image}/>
</Pressable>
}
versions:
"react": "18.0.0",
"expo-image-picker": "~13.3.1",
"react-native": "0.69.6",
I had the same issue, and for some reason calling getCameraPermissionsAsync() first fixed it - whereas requestCameraPermissionsAsync() on its own would cause launchCameraAsync() to be rejected on Android devices.
See the following:
let permissionResult = await ImagePicker.getCameraPermissionsAsync();
if (permissionResult.status !== 'granted') {
  permissionResult = await ImagePicker.requestCameraPermissionsAsync();
}
if (permissionResult.status !== 'granted') {
  alert("You must turn on camera permissions to record a video.");
} else {
  let result = await ImagePicker.launchCameraAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.Videos,
    allowsEditing: true,
    aspect: [3, 4],
  });
}
Here is my code:
const saveImg = async (base64Img: string, success: Function, fail:Function) => {
const isAndroid = Platform.OS === "android"
const isIos = Platform.OS === 'ios'
const dirs = isIos? RNFS.LibraryDirectoryPath : RNFS.ExternalDirectoryPath;
const certificateTitle = 'certificate-'+((Math.random() * 10000000) | 0)
const downloadDest = `${dirs}/${certificateTitle}.png`;
const imageDatas = base64Img.split('data:image/png;base64,');
const imageData = imageDatas[1];
try{
await RNFetchBlob.config({
addAndroidDownloads:{
notification:true,
description:'certificate',
mime:'image/png',
title:certificateTitle +'.png',
path:downloadDest
}
}).fs.writeFile(downloadDest, imageData, 'base64')
if (isAndroid) {
} else {
RNFetchBlob.ios.previewDocument(downloadDest);
}
success()
}catch(error:any){
console.log(error)
fail()
}
}
I get this error:
undefined is not an object (near '...}).fs.writeFile(downloadD...')
at node_modules/react-native-webview/lib/WebView.android.js:207:16 in _this.onMessage
When I hit the download button and this runs, I get the error mentioned above.
I used to get the download done with the code modification below, but I really need to show the download feedback on both Android and iOS.
This works (but without notification)
await RNFetchBlob.fs.writeFile(downloadDest, imageData, 'base64')
I am using Expo.
I discovered that RNFetchBlob does not work with Expo. To solve it, I used the following libraries:
expo-file-system, expo-media-library, expo-image-picker, expo-notifications
This is the code to convert, save, and show a notification for the image in the "Expo way":
import * as FileSystem from 'expo-file-system';
import * as MediaLibrary from 'expo-media-library';
import * as ImagePicker from 'expo-image-picker';
import * as Notifications from 'expo-notifications';
const saveImg = async (base64Img: string, success: Function, fail:Function) => {
const imageDatas = base64Img.split('data:image/png;base64,');
const imageData = imageDatas[1];
try {
const certificateName = 'certificate-'+((Math.random() * 10000000) | 0) + ".png"
const certificatePathInFileSystem = FileSystem.documentDirectory +certificateName ;
await FileSystem.writeAsStringAsync(certificatePathInFileSystem, imageData, {
encoding: FileSystem.EncodingType.Base64,
});
await MediaLibrary.saveToLibraryAsync(certificatePathInFileSystem);
Notifications.setNotificationHandler({
handleNotification: async () => ({
shouldShowAlert: true,
shouldPlaySound: false,
shouldSetBadge: true,
}),
});
await Notifications.scheduleNotificationAsync({
content: {
title: certificateName +' saved !',
body: "Click to show the certificate",
},
trigger: null,
});
setCertificatePath(certificatePathInFileSystem)
success()
} catch (e) {
console.error(e);
fail()
}
}
In order to open the image gallery when the notification is clicked, I used this code:
useEffect(()=>{
if(certificatePath){
Notifications.addNotificationResponseReceivedListener( async (event )=> {
await ImagePicker.launchImageLibraryAsync({
mediaTypes: ImagePicker.MediaTypeOptions.All,
allowsEditing: true,
})
})
}
},[certificatePath])
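One detail the saveImg code above does not show: MediaLibrary.saveToLibraryAsync generally requires the media library permission to be granted first. A minimal, hedged sketch of requesting it (the helper name is just illustrative):
import * as MediaLibrary from 'expo-media-library';
// Illustrative helper (not from the original answer): ask for media library
// access before calling MediaLibrary.saveToLibraryAsync.
const ensureMediaLibraryPermission = async (): Promise<boolean> => {
  const { granted } = await MediaLibrary.requestPermissionsAsync();
  return granted;
};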
Try calling fetch after creating the RNFetchBlob.config object.
If you just want to display an image and not store it, you can show the image as follows (https://reactnative.dev/docs/next/images#uri-data-images):
<Image
style={{
width: 51,
height: 51,
resizeMode: 'contain'
}}
source={{
uri: 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAADMAAAAzCAYAAAA6oTAqAAAAEXRFWHRTb2Z0d2FyZQBwbmdjcnVzaEB1SfMAAABQSURBVGje7dSxCQBACARB+2/ab8BEeQNhFi6WSYzYLYudDQYGBgYGBgYGBgYGBgYGBgZmcvDqYGBgmhivGQYGBgYGBgYGBgYGBgYGBgbmQw+P/eMrC5UTVAAAAABJRU5ErkJggg=='
}}
/>
Call fetch on the config object:
try {
  const fetchConfig = RNFetchBlob.config({
    addAndroidDownloads: {
      notification: true,
      description: 'certificate',
      mime: 'image/png',
      title: certificateTitle + '.png',
      path: downloadDest
    }
  })
  // fetch() expects an HTTP method and a URL; the Android download notification is tied to this call
  await fetchConfig.fetch('GET', 'your.domain.com')
  await RNFetchBlob.fs.writeFile(downloadDest, imageData, 'base64')
  if (isAndroid) {
  } else {
    RNFetchBlob.ios.previewDocument(downloadDest);
  }
  success()
} catch (error: any) {
  console.log(error)
  fail()
}
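As far as I know, RNFetchBlob's addAndroidDownloads notification applies to downloads performed through fetch (optionally handed to the system DownloadManager), not to fs.writeFile. A hedged sketch of that variant, where the URL is only a placeholder:
// Hedged sketch: download via fetch so Android shows the download notification.
const res = await RNFetchBlob.config({
  addAndroidDownloads: {
    useDownloadManager: true, // hand the transfer to Android's DownloadManager
    notification: true,
    title: certificateTitle + '.png',
    description: 'certificate',
    mime: 'image/png',
    path: downloadDest,
  },
}).fetch('GET', 'https://your.domain.com/certificate.png');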
I am trying to connect the react-native-webrtc library to my React Native project. I can navigate to the screen normally, but when I press the button to start the call I receive the following error:
[Unhandled promise rejection: TypeError: undefined is not an object (evaluating 'navigator.mediaDevices.enumerateDevices')]
This was tested on both an Android emulator and an Android phone. Both had the same issue.
const startLocalStream = async () => {
// isFront will determine if the initial camera should face user or environment
const isFront = true;
let devices = await navigator.mediaDevices.enumerateDevices(); //this is where I get error
console.log(devices);
const facing = isFront ? 'front' : 'environment';
const videoSourceId = devices.find(device => device.kind === 'videoinput' && device.facing === facing);
const facingMode = isFront ? 'user' : 'environment';
const constraints = {
audio: true,
video: {
mandatory: {
minWidth: 500, // Provide your own width, height and frame rate here
minHeight: 300,
minFrameRate: 30,
},
facingMode,
optional: videoSourceId ? [{ sourceId: videoSourceId }] : [],
},
};
const newStream = await mediaDevices.getUserMedia(constraints);
setLocalStream(newStream);
};
const startCall = async id => {
const localPC = new RTCPeerConnection(configuration);
localPC.addStream(localStream);
const roomRef = await db.collection('rooms').doc(id);
const callerCandidatesCollection = roomRef.collection('callerCandidates');
localPC.onicecandidate = e => {
if (!e.candidate) {
console.log('Got final candidate!');
return;
}
callerCandidatesCollection.add(e.candidate.toJSON());
};
localPC.onaddstream = e => {
if (e.stream && remoteStream !== e.stream) {
console.log('RemotePC received the stream call', e.stream);
setRemoteStream(e.stream);
}
};
const offer = await localPC.createOffer();
await localPC.setLocalDescription(offer);
const roomWithOffer = { offer };
await roomRef.set(roomWithOffer);
roomRef.onSnapshot(async snapshot => {
const data = snapshot.data();
if (!localPC.currentRemoteDescription && data.answer) {
const rtcSessionDescription = new RTCSessionDescription(data.answer);
await localPC.setRemoteDescription(rtcSessionDescription);
}
});
roomRef.collection('calleeCandidates').onSnapshot(snapshot => {
snapshot.docChanges().forEach(async change => {
if (change.type === 'added') {
let data = change.doc.data();
await localPC.addIceCandidate(new RTCIceCandidate(data));
}
});
});
setCachedLocalPC(localPC);
};
How can I fix it? I am using the official react-native-webrtc library.
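For context, React Native does not provide the browser's navigator.mediaDevices global out of the box; react-native-webrtc ships its own mediaDevices export (and a registerGlobals() helper). A minimal sketch of enumerating devices through the library export rather than the global, with device filtering omitted:
import { mediaDevices, registerGlobals } from 'react-native-webrtc';
// Option A: call enumerateDevices on the library's own export
const listDevices = async () => {
  const devices = await mediaDevices.enumerateDevices();
  console.log(devices);
  return devices;
};
// Option B: register the WebRTC globals once at startup so that
// navigator.mediaDevices.enumerateDevices() becomes available
registerGlobals();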
Thank you for visiting this post.
I am having trouble getting the user's location in React Native.
I have searched for quite some time and read other people's posts about this function.
It seems like this function has several problems.
Also, the React Native Expo documentation seems to be outdated.
https://docs.expo.dev/versions/latest/sdk/location/
I use the code from there, under the Usage section.
import React, { useState, useEffect } from 'react';
import { Platform, Text, View, StyleSheet } from 'react-native';
import * as Location from 'expo-location';
export default function App() {
const [location, setLocation] = useState(null);
const [errorMsg, setErrorMsg] = useState("");
useEffect(async () => {
(async () => {
let { status } = await Location.requestForegroundPermissionsAsync();
console.log(status);
if (status !== 'granted') {
console.log("denied")
setErrorMsg('Permission to access location was denied');
return;
}
console.log("status", status); // <=== always says "granted"
let location = await Location.getCurrentPositionAsync({
accuracy: Location.Accuracy.Highest,
maximumAge: 10000,
timeout: 5000
});
console.log({ location }) // <== never reach here.
setLocation(location);
setErrorMsg('No ERROR');
})();
}, []);
let text = 'Waiting..';
if (errorMsg) {
text = errorMsg;
} else if (location) {
text = JSON.stringify(location);
}
return (
<View>
<Text>{text}</Text>
</View>
);
}
I have seen one post saying I have to pass the accuracy and maximumAge arguments to getCurrentPositionAsync, instead of the empty {} object provided in the Expo docs.
But it still does not work. Since getCurrentPositionAsync hangs, the screen keeps displaying "Waiting...".
And of course, the Android emulator is set up correctly, I believe, since I do see the status log and it says "granted".
Thank you in advance so much for your help and reading my post.
"expo": "~43.0.2",
"expo-location": "~13.0.4",
"expo-status-bar": "~1.1.0",
"react": "17.0.1",
"react-dom": "17.0.1",
"react-native": "0.64.3",
"react-native-web": "0.17.1"
You can't pass an async function directly to useEffect, and therefore you can't await at the top level of the effect.
Have you tried the other way around?
const [position, setPosition] = useState(false)
// useEffect
Location.getCurrentPositionAsync({
accuracy: Location.Accuracy.Highest,
maximumAge: 10000,
timeout: 5000
})
.then(res => setPosition(res))
.catch(e => console.log(e))
I've also had better luck without the fat-arrow IIFE; you should try:
(async function () {
let { status } = await Location.requestForegroundPermissionsAsync();
console.log(status);
if (status !== 'granted') {
console.log("denied")
setErrorMsg('Permission to access location was denied');
return;
}
console.log("status", status); // <=== always says "granted"
let location = await Location.getCurrentPositionAsync({
accuracy: Location.Accuracy.Highest,
maximumAge: 10000,
timeout: 5000
});
console.log({ location }) // <== never reach here.
setLocation(location);
setErrorMsg('No ERROR');
})();
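Another common workaround, shown here only as a minimal sketch reusing the question's setLocation and setErrorMsg state setters, is to declare an async helper inside the effect and call it, so the effect callback itself stays synchronous:
useEffect(() => {
  const loadLocation = async () => {
    // Ask for permission first, then fetch the position.
    const { status } = await Location.requestForegroundPermissionsAsync();
    if (status !== 'granted') {
      setErrorMsg('Permission to access location was denied');
      return;
    }
    const location = await Location.getCurrentPositionAsync({
      accuracy: Location.Accuracy.Highest,
    });
    setLocation(location);
  };
  loadLocation();
}, []);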
I'm trying to send a push notification to an Android device using the Expo push notification tool (it works for iOS), but it doesn't work. I have added a Firebase project with a separate Android app and the google-services.json, and ran expo push:android:upload --api-key . Is there something missing?
I'm using a managed workflow
SDK 39
This is the code I have used, and it works perfectly for iOS.
Getting the Expo push notification token:
async function registerForPushNotificationsAsync() {
let token;
if (Constants.isDevice) {
const { status: existingStatus } = await Permissions.getAsync(Permissions.NOTIFICATIONS);
let finalStatus = existingStatus;
if (existingStatus !== 'granted') {
const { status } = await Permissions.askAsync(Permissions.NOTIFICATIONS);
finalStatus = status;
}
if (finalStatus !== 'granted') {
return;
}
token = (await Notifications.getExpoPushTokenAsync()).data;
let uid;
firebase.auth().signInAnonymously()
.then(() => {
uid = firebase.auth().currentUser.uid;
})
.then(() => {
db.collection('users').doc(uid).get()
.then(function(doc) {
if (doc.exists) {
db.collection('users').doc(uid).update({
expoPushToken: token,
'last use date': new Date().toISOString().slice(0,10),
'region': Localization.region
})
} else {
db.collection('users').doc(uid).set({
expoPushToken: token,
'last use date': new Date().toISOString().slice(0,10),
'region': Localization.region
})
}
})
})
.catch(() => console.log('ERROR'))
} else {
alert('Must use physical device for Push Notifications');
}
if (Platform.OS === 'android') {
Notifications.setNotificationChannelAsync('default', {
name: 'default',
importance: Notifications.AndroidImportance.MAX,
vibrationPattern: [0, 250, 250, 250],
lightColor: '#FF231F7C',
});
}
return token;
}
app.json
"android": {
"useNextNotificationsApi": true,
"googleServicesFile": "./google-services.json",
"adaptiveIcon": {
"foregroundImage": "./assets/adaptive-icon.png",
"backgroundColor": "#FFFFFF"
},
"package": "com.NAME.NAME",
"versionCode": 3
},
I use the react-native-audio-recorder-player library in my project to record audio.
When I start recording everything works fine, but the application closes when I call the function to stop recording the audio, audioRecorderPlayer.stopRecorder().
It works perfectly on iOS (emulator and real device) and on the Android emulator, but it doesn't work on a real Android device.
What could be the cause of this error and how could it be resolved?
"react-native": "0.63.3",
"react-native-audio-recorder-player": "^2.6.0-rc3",
hooks for control
const [record, setRecord] = useState(false);
start record function
async function handleRecord() {
try {
const permissions = await getPermissions();
if (permissions) {
const path = Platform.select({
ios: `audio-${new Date().getTime()}.m4a`,
android: `sdcard/audio-${new Date().getTime()}.mp3`
});
const audioSet = {
AudioEncoderAndroid: AudioEncoderAndroidType.AAC,
AudioSourceAndroid: AudioSourceAndroidType.MIC,
AVEncoderAudioQualityKeyIOS: AVEncoderAudioQualityIOSType.high,
AVNumberOfChannelsKeyIOS: 2,
AVFormatIDKeyIOS: AVEncodingOption.aac
};
const result = await audioRecorderPlayer.startRecorder(path, audioSet);
setRecord(true);
audioRecorderPlayer.addRecordBackListener(e => {
return;
});
}
} catch (err) {
}
}
stop record function
async function onStopRecord() {
try {
const result = await audioRecorderPlayer.stopRecorder();
audioRecorderPlayer.removeRecordBackListener();
setRecord(false);
setFiles([
...files,
{
uri: result,
type: "audio/mpeg",
name: `audio-${new Date().getTime()}.mp3`
}
]);
}catch(error){
Alert.alert(
"Erro",
error
);
}
}