I used react-native-fs to download an image from the server and then used openCropper() from react-native-image-crop-picker to crop it. It works well on iOS; however, on Android the app crashes without any error message or alert.
Has anyone run into this issue? Any guidance is appreciated. Thanks, everyone.
const onCropImage = async () => {
  setViewImageModal(false);
  const uri = `${RNFS.DocumentDirectoryPath}/${fileName}`;
  let options = {
    fromUrl: viewImage && viewImage[0].url,
    toFile: uri,
  };
  await RNFS.downloadFile(options).promise;
  // App crashes when I call .openCropper() on Android
  ImageCropPicker.openCropper({
    path: uri,
    width: 300,
    height: 400,
    cropping: true,
    freeStyleCropEnabled: true,
  })
    .then((image) => {
      if (image) {
        const temp = image.path.split("/");
        const imageName = temp[temp.length - 1];
        navigation.navigate("EditHostScreen", {
          host,
          type: "hosts",
          info: { uri: image.path, typeImage: image.mime, name: imageName },
        });
        const time = setTimeout(() => {
          ImageCropPicker.clean();
          if (uri) {
            RNFS.unlink(uri);
          }
        }, 100000);
        clearTimeout(time);
      }
    })
    .catch((err) => {
      // console.log(err);
    });
};
You need to add the Android-specific path value. On Android, the native cropper typically expects a file:// URI, while iOS accepts the bare file path.
import { Platform } from 'react-native';

ImageCropPicker.openCropper({
  path: Platform.OS === "android" ? 'file://' + uri : uri,
  width: 300,
  height: 400,
  cropping: true,
  freeStyleCropEnabled: true,
})
Here is my code:
const saveImg = async (base64Img: string, success: Function, fail: Function) => {
  const isAndroid = Platform.OS === 'android';
  const isIos = Platform.OS === 'ios';
  const dirs = isIos ? RNFS.LibraryDirectoryPath : RNFS.ExternalDirectoryPath;
  const certificateTitle = 'certificate-' + ((Math.random() * 10000000) | 0);
  const downloadDest = `${dirs}/${certificateTitle}.png`;
  const imageDatas = base64Img.split('data:image/png;base64,');
  const imageData = imageDatas[1];
  try {
    await RNFetchBlob.config({
      addAndroidDownloads: {
        notification: true,
        description: 'certificate',
        mime: 'image/png',
        title: certificateTitle + '.png',
        path: downloadDest
      }
    }).fs.writeFile(downloadDest, imageData, 'base64');
    if (isAndroid) {
    } else {
      RNFetchBlob.ios.previewDocument(downloadDest);
    }
    success();
  } catch (error: any) {
    console.log(error);
    fail();
  }
};
I get this error:
undefined is not an object (near '...}).fs.writeFile(downloadD...')
at node_modules/react-native-webview/lib/WebView.android.js:207:16 in _this.onMessage
When I hit the download button and this runs, I get the error above.
I was able to get the download working with the modification below, but I really need to show the download feedback on both Android and iOS.
This works (but without a notification):
await RNFetchBlob.fs.writeFile(downloadDest, imageData, 'base64')
I am using Expo.
I discovered that RNFetchBlob does not work with Expo. To solve it, I used the following libraries:
expo-file-system, expo-media-library, expo-image-picker, expo-notifications
This is the code to convert the image, save it, and show the notification the "Expo way":
import * as FileSystem from 'expo-file-system';
import * as MediaLibrary from 'expo-media-library';
import * as ImagePicker from 'expo-image-picker';
import * as Notifications from 'expo-notifications';

const saveImg = async (base64Img: string, success: Function, fail: Function) => {
  const imageDatas = base64Img.split('data:image/png;base64,');
  const imageData = imageDatas[1];
  try {
    const certificateName = 'certificate-' + ((Math.random() * 10000000) | 0) + '.png';
    const certificatePathInFileSystem = FileSystem.documentDirectory + certificateName;
    await FileSystem.writeAsStringAsync(certificatePathInFileSystem, imageData, {
      encoding: FileSystem.EncodingType.Base64,
    });
    await MediaLibrary.saveToLibraryAsync(certificatePathInFileSystem);
    Notifications.setNotificationHandler({
      handleNotification: async () => ({
        shouldShowAlert: true,
        shouldPlaySound: false,
        shouldSetBadge: true,
      }),
    });
    await Notifications.scheduleNotificationAsync({
      content: {
        title: certificateName + ' saved !',
        body: 'Click to show the certificate',
      },
      trigger: null,
    });
    setCertificatePath(certificatePathInFileSystem);
    success();
  } catch (e) {
    console.error(e);
    fail();
  }
};
To open the image gallery when the notification is clicked, I used this code:
useEffect(() => {
  if (certificatePath) {
    Notifications.addNotificationResponseReceivedListener(async (event) => {
      await ImagePicker.launchImageLibraryAsync({
        mediaTypes: ImagePicker.MediaTypeOptions.All,
        allowsEditing: true,
      });
    });
  }
}, [certificatePath]);
Try calling fetch after creating the RNFetchBlob config.
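A minimal sketch of that idea, assuming the certificate is also reachable over HTTP (the URL below is a placeholder; certificateTitle and downloadDest come from the question's code). As far as I know, the addAndroidDownloads notification only shows when the file is fetched through the Android download manager, not when it is written with fs.writeFile:

// Sketch: download via the Android download manager so the system notification appears.
// 'https://example.com/certificate.png' is a placeholder URL.
const res = await RNFetchBlob.config({
  addAndroidDownloads: {
    useDownloadManager: true, // needed for the notification/path options below to take effect
    notification: true,
    description: 'certificate',
    mime: 'image/png',
    title: certificateTitle + '.png',
    path: downloadDest,
  },
}).fetch('GET', 'https://example.com/certificate.png');
console.log('saved to', res.path());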
If you just want to display the image and not store it, you can show it as follows (https://reactnative.dev/docs/next/images#uri-data-images):
<Image
  style={{
    width: 51,
    height: 51,
    resizeMode: 'contain'
  }}
  source={{
    uri: 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAADMAAAAzCAYAAAA6oTAqAAAAEXRFWHRTb2Z0d2FyZQBwbmdjcnVzaEB1SfMAAABQSURBVGje7dSxCQBACARB+2/ab8BEeQNhFi6WSYzYLYudDQYGBgYGBgYGBgYGBgYGBgZmcvDqYGBgmhivGQYGBgYGBgYGBgYGBgYGBgbmQw+P/eMrC5UTVAAAAABJRU5ErkJggg=='
  }}
/>
Call fetch on the config object:
try {
  const fetchConfig = RNFetchBlob.config({
    addAndroidDownloads: {
      notification: true,
      description: 'certificate',
      mime: 'image/png',
      title: certificateTitle + '.png',
      path: downloadDest
    }
  });
  await fetchConfig.fetch('GET', 'your.domain.com');
  await RNFetchBlob.fs.writeFile(downloadDest, imageData, 'base64');
  if (isAndroid) {
  } else {
    RNFetchBlob.ios.previewDocument(downloadDest);
  }
  success();
} catch (error: any) {
  console.log(error);
  fail();
}
I am trying to connect the react-native-webrtc library to my React Native project. I can navigate to the screen normally, but when I press the button to start the call I receive the following error:
[Unhandled promise rejection: TypeError: undefined is not an object (evaluating 'navigator.mediaDevices.enumerateDevices')]
This was tested on both an Android emulator and an Android phone; both had the same issue.
const startLocalStream = async () => {
  // isFront will determine if the initial camera should face user or environment
  const isFront = true;
  let devices = await navigator.mediaDevices.enumerateDevices(); // this is where I get the error
  console.log(devices);
  const facing = isFront ? 'front' : 'environment';
  const videoSourceId = devices.find(
    (device) => device.kind === 'videoinput' && device.facing === facing
  );
  const facingMode = isFront ? 'user' : 'environment';
  const constraints = {
    audio: true,
    video: {
      mandatory: {
        minWidth: 500, // Provide your own width, height and frame rate here
        minHeight: 300,
        minFrameRate: 30,
      },
      facingMode,
      optional: videoSourceId ? [{ sourceId: videoSourceId }] : [],
    },
  };
  const newStream = await mediaDevices.getUserMedia(constraints);
  setLocalStream(newStream);
};
const startCall = async (id) => {
  const localPC = new RTCPeerConnection(configuration);
  localPC.addStream(localStream);
  const roomRef = await db.collection('rooms').doc(id);
  const callerCandidatesCollection = roomRef.collection('callerCandidates');
  localPC.onicecandidate = (e) => {
    if (!e.candidate) {
      console.log('Got final candidate!');
      return;
    }
    callerCandidatesCollection.add(e.candidate.toJSON());
  };
  localPC.onaddstream = (e) => {
    if (e.stream && remoteStream !== e.stream) {
      console.log('RemotePC received the stream call', e.stream);
      setRemoteStream(e.stream);
    }
  };
  const offer = await localPC.createOffer();
  await localPC.setLocalDescription(offer);
  const roomWithOffer = { offer };
  await roomRef.set(roomWithOffer);
  roomRef.onSnapshot(async (snapshot) => {
    const data = snapshot.data();
    if (!localPC.currentRemoteDescription && data.answer) {
      const rtcSessionDescription = new RTCSessionDescription(data.answer);
      await localPC.setRemoteDescription(rtcSessionDescription);
    }
  });
  roomRef.collection('calleeCandidates').onSnapshot((snapshot) => {
    snapshot.docChanges().forEach(async (change) => {
      if (change.type === 'added') {
        let data = change.doc.data();
        await localPC.addIceCandidate(new RTCIceCandidate(data));
      }
    });
  });
  setCachedLocalPC(localPC);
};
How can I fix this? I am using the official React Native WebRTC library.
Can someone tell me what I'm doing wrong? I keep getting a 400 Bad Request error. I can't figure out how to send the image; I tried sending the path, the filename, and the MIME type, but it's not working. This is my request:
const [image, setImage] = useState(null);
const [filename, setFileName] = useState(null);

const sendpic = async () => {
  await ImagePicker.openCamera({
    mediaType: 'photo',
    width: 300,
    height: 400,
    cropping: false,
  }).then(image => {
    setImage(image['path']);
    const paths = image['path'];
    const filename = paths.substring(paths.lastIndexOf('/') + 1);
    setFileName(filename);
    console.log(filename);
    console.log(image);
    const data = new FormData();
    data.append('image', filename);
    data.append('title', '3aslemajiti');
    const headers = {
      Accept: 'application/json',
      'Content-Type': 'multipart/form-data',
    };
    try {
      const response = axios.post('http://192.168.1.19:8000/Sends/', data, { headers: headers });
      alert('yess!!!!!');
    } catch (error) {
      // handle error
      alert(error.message);
    }
  });
};
And this is my model:
from django.db import models

# Create your models here.
class Send(models.Model):
    title = models.CharField(max_length=255)
    image = models.ImageField(default='null')

    def __str__(self):
        return self.title
How do I write the request so that it is accepted by the server?
data.append('image', {
  uri: filename, // uri should be the full file path of the image, not just the file name
  name: 'test.jpg',
  type: 'image/jpeg'
});
The image upload format should be like this, and please check that the file URI is correct.
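As a rough sketch of how that could be wired into the code from the question (assuming image is the result of ImagePicker.openCamera and that Platform and axios are imported as in the question; the file:// handling mirrors the next answer):

// Sketch: send the file itself, built from the camera result (image.path / image.mime)
const paths = image.path;
const name = paths.substring(paths.lastIndexOf('/') + 1);
const data = new FormData();
data.append('title', '3aslemajiti');
data.append('image', {
  uri: Platform.OS === 'android' ? paths : paths.replace('file://', ''),
  name: name,
  type: image.mime, // e.g. 'image/jpeg'
});
await axios.post('http://192.168.1.19:8000/Sends/', data, {
  headers: { Accept: 'application/json', 'Content-Type': 'multipart/form-data' },
});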
"uri": "file:///Users/user/Library/Developer/CoreSimulator/Devices/33198C8D-55D3-4555-B9B5-DC1A61761AAF/data/Containers/Data/Application/B5067299-1CD2-4000-8935-59B59ED447F6/tmp/871EB6D5-2408-4A10-8DE7-EE52B1855ECD.jpg"
The image URI should be a full file path like that. The request should look like this:
const data = new FormData();
data.append("uploadFile", {
  name: filename,
  type: filetype,
  uri:
    Platform.OS === "android"
      ? fileuri
      : fileuri.replace("file://", "")
});
var url = uploadDoc;
axios.post(url, data, {
  headers: {
    "Content-Type": "multipart/form-data",
    Accept: "application/json",
    Authorization: authToken
  }
})
  .then((res) => {
  })
  .catch((err) => {
  });
I'm using the react-native-pdf library to show it.
Its frame is okay, because I've set its background to blue.
I'm doing an RNFetchBlob.fs.exists(path) check just before, and the value is true, but nothing appears in the PDF viewer.
Any idea? I have no log or anything that tells me there is a problem.
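For reference, the existence check mentioned above is roughly this (a sketch; filePath is the path returned by the download further down):

// Sketch: confirm the downloaded file is actually on disk before rendering it
const fileExists = await RNFetchBlob.fs.exists(filePath);
console.log('PDF exists on disk:', fileExists); // logs true, yet the viewer stays empty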
Here is how I show the PDF:
<Pdf
  source={{ uri: filePath }}
  style={{ backgroundColor: 'grey', flex: 1 }}
/>
const request = RNFetchBlob
  .config(this.getConfig(title, ext))
  .fetch('GET', uri, header);

this.setState(
  () => {
    return { request: request };
  },
  () => {
    this.state.request.then((res) => {
      if (Platform.OS === 'android') {
        const path = res.path();
        const infos = res.info();
        console.log(infos);
        if (path !== undefined) {
          this.setState(() => {
            return {
              filePath: path
            };
          });
        }
      }
    });
  }
);
The file path is returned by res.path().
The config variable is made like this:
addAndroidDownloads: {
  useDownloadManager: true,
  notification: false,
  path: filePath,
  description: `${title}`,
  mediaScannable: true,
}
I get the error in the question when I try to call the native camera in my Ionic 3 project. In the console I see this error:
Native : tried calling Camera.getPicture, but Cordova is not available. Make sure to include cordova.js or run in a device / simulator.
Here is the code that I used to call the native camera.
This is the code in my problem.html:
<button class="logoCamera" ion-button (click)="presentActionSheet()">
  <ion-icon name="camera"></ion-icon>
</button>
This is the code in my problem.ts:
import { File } from '@ionic-native/file';
import { Transfer, TransferObject } from '@ionic-native/transfer';
import { FilePath } from '@ionic-native/file-path';
import { Camera } from '@ionic-native/camera';
public presentActionSheet() {
  let actionSheet = this.actionSheetCtrl.create({
    title: 'Select Image',
    buttons: [
      {
        text: 'Load from Library',
        handler: () => {
          this.takePicture(this.camera.PictureSourceType.PHOTOLIBRARY);
        }
      },
      {
        text: 'Use Camera',
        handler: () => {
          this.takePicture(this.camera.PictureSourceType.CAMERA);
        }
      },
      {
        text: 'Cancel',
        role: 'cancel'
      }
    ]
  });
  actionSheet.present();
}

public takePicture(sourceType) {
  // Create option for the Camera dialog
  var options = {
    quality: 100,
    sourceType: sourceType,
    saveToPhotoAlbum: false,
    correctOrientation: true
  };
  // Get the data of an image
  this.camera.getPicture(options).then((imagePath) => {
    // special handling for android lib
    if (this.platform.is('android') && sourceType === this.camera.PictureSourceType.PHOTOLIBRARY) {
      this.filePath.resolveNativePath(imagePath)
        .then(filePath => {
          let correctPath = filePath.substr(0, filePath.lastIndexOf('/') + 1);
          let currentName = imagePath.substring(imagePath.lastIndexOf('/') + 1, imagePath.lastIndexOf('?'));
          this.copyFileToLocalDir(correctPath, currentName, this.createFileName());
        });
    } else {
      var currentName = imagePath.substr(imagePath.lastIndexOf('/') + 1);
      var correctPath = imagePath.substr(0, imagePath.lastIndexOf('/') + 1);
      this.copyFileToLocalDir(correctPath, currentName, this.createFileName());
    }
  }, (err) => {
    this.presentToast('Error while selecting Image.');
  });
}

// Create a new name for image
private createFileName() {
  var d = new Date(),
    n = d.getTime(),
    newFileName = n + ".jpg";
  return newFileName;
}

// copy image to local folder
private copyFileToLocalDir(namePath, currentName, newFileName) {
  this.file.copyFile(namePath, currentName, cordova.file.dataDirectory, newFileName).then(success => {
    this.lastImage = newFileName;
  }, error => {
    this.presentToast('Error while storing file.');
  });
}

private presentToast(text) {
  let toast = this.toastCtrl.create({
    message: text,
    duration: 3000,
    position: 'middle'
  });
  toast.present();
}

public pathForImage(img) {
  if (img === null) {
    return '';
  } else {
    return cordova.file.dataDirectory + img;
  }
}

public uploadImage() {
  // destination URL
  var url = "";
  // file to upload
  var targetPath = this.pathForImage(this.lastImage);
  // file name only
  var filename = this.lastImage;
  var options = {
    fileKey: "file",
    fileName: filename,
    chunkedMode: false,
    mimeType: "multipart/form-data",
    params: { 'fileName': filename }
  };
  const fileTransfer: TransferObject = this.transfer.create();
  this.loading = this.loadingCtrl.create({
    content: 'Uploading...',
  });
  this.loading.present();
  // use FileTransfer to upload image
  fileTransfer.upload(targetPath, url, options).then(data => {
    this.loading.dismissAll();
    this.presentToast('Image successful uploaded.');
  }, err => {
    this.loading.dismissAll();
    this.presentToast('Error while uploading file.');
  });
}
When I run ionic serve, everything is smooth: no error, nothing.
But when I click my button to access the native camera, the error shows up. Please help me figure out the problem; I have checked a lot of sites and none of them solved my question.
After I tried running ionic cordova run ios --simulator, errors came out, but I am pretty sure those errors did not exist before I ran that command.
How can I solve this problem?
The error message is pretty accurate here:
Native : tried calling Camera.getPicture, but Cordova is not available. Make sure to include cordova.js or run in a device / simulator.
Running ionic serve does not include cordova.js nor does it run your application in a simulator or on a device which is why you get the error. You can fix it either by running your application on the device or simulator:
ionic cordova run android --device
ionic cordova run ios --simulator
Or by adding the browser platform:
cordova platform add browser
And running the browser platform:
ionic cordova run browser