I'm uploading files with react-native-image-crop-picker. Small files work fine, but when I try to upload a large video/image file I get a network error.
Note: the backend is working fine; I can upload the same file using Postman.
It only happens with large files. Files smaller than 1 MB upload successfully.
Code
import { Platform } from 'react-native';
import ImagePicker from 'react-native-image-crop-picker';
import axios from 'axios';

function uploadFile() {
  ImagePicker.openCamera({
    mediaType: 'any',
  }).then(file => {
    const body = new FormData();
    body.append('vurl', {
      name: file.fileName,
      type: file.mime,
      uri: Platform.OS === 'ios' ? file.path.replace('file://', '') : file.path,
    });
    axios({
      method: 'post',
      url: 'Server url',
      data: body,
      headers: {
        Accept: 'application/json',
        'Content-Type': 'multipart/form-data',
      },
    })
      .then(res => {
        console.log(JSON.stringify(res));
      })
      .catch(err => {
        console.log(err.response);
      });
  });
}
// calling here
<TouchableOpacity onPress={uploadFile}>
  <Text>upload file</Text>
</TouchableOpacity>
Check whether you have set client_max_body_size on your backend server.
For Nginx: /etc/nginx/proxy.conf
client_max_body_size 100M;
For more: Increasing client_max_body_size in Nginx conf on AWS Elastic Beanstalk
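As a sketch, the directive looks like this in an Nginx config (the 100M value is just an example; set it above your largest expected upload):

```nginx
# Sketch only: raise the allowed request body size.
# This directive can live in the http, server, or location context.
client_max_body_size 100M;
```

Nginx returns 413 Request Entity Too Large when the limit is exceeded, which a mobile HTTP client may surface simply as a generic network error.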
Also make sure you don't have a default timeout defined in your code, like:
axios.defaults.timeout = x
https://www.npmjs.com/package/react-native-axios#config-defaults
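As a minimal sketch, this is the shape of an axios-style request config with the client-side timeout explicitly disabled (the URL is a placeholder; axios treats `timeout: 0` as "no timeout", which is also its default):

```javascript
// Sketch: a request config for a large upload.
// timeout: 0 means the HTTP client itself will not abort
// a slow upload; large files then fail only on server limits.
const uploadConfig = {
  method: 'post',
  url: 'https://example.com/upload', // placeholder URL
  timeout: 0,
};

console.log(uploadConfig.timeout); // 0
```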
I based my suggestions directly on the react-native-image-crop-picker example app repo:
Set a max height and width for what you're recording on camera. Example:
ImagePicker.openCamera({
  mediaType: 'any',
  cropping: cropping,
  width: 500,
  height: 500,
}).then(file => {
Compress the quality of the image you are picking:
ImagePicker.openPicker({
  mediaType: 'any',
  compressImageMaxWidth: 1000,
  compressImageMaxHeight: 1000,
  compressImageQuality: 1, // you can set any value between 0 and 1, e.g. 0.6
})
  .then(file => { ...
Set a max size that should not be exceeded, e.g. 1 MB (whatever number you want), and check it when you get the response, before handling the file further. Again, file.size comes from the response documented in the link above.
ImagePicker.openCamera({
  mediaType: 'any',
}).then(file => {
  if (file.size > xxx) {
    // don't forget to import Alert from react-native
    Alert.alert('File size too large!');
  }
});
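The size check above can also be factored into a small reusable guard; a sketch (the 1 MB cap and the function name are my own, not from the library):

```javascript
// Hypothetical helper: reject files above a configurable byte limit.
const MAX_UPLOAD_BYTES = 1 * 1024 * 1024; // assumed 1 MB cap

function isFileTooLarge(file, maxBytes = MAX_UPLOAD_BYTES) {
  // file.size is reported in bytes by react-native-image-crop-picker
  return typeof file.size === 'number' && file.size > maxBytes;
}

console.log(isFileTooLarge({ size: 2 * 1024 * 1024 })); // true
console.log(isFileTooLarge({ size: 500 * 1024 }));      // false
```

Keeping the check pure makes it trivial to unit-test without mocking the picker.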
Related
My API is a very basic Python script using Sanic; I have an SSL certificate from Let's Encrypt, and the server runs on an AWS Lightsail instance. Both Postman and my React Native application run on a separate Windows 11 machine.
Here's the relevant API code:
from os import abort
from sanic import Sanic
from sanic.response import json

app = Sanic(name="APIbuilding")

ssl = {
    "cert": "/home/ec2-user/keys/fullchain.pem",
    "key": "/home/ec2-user/keys/privkey.pem",
}

@app.route('/loginRequest')
async def confirm(request):
    email = str(request.args.get("email"))
    password = str(request.args.get("password"))
    return json({'hello': 'world'}, headers={"Access-Control-Allow-Origin": "*", "Access-Control-Allow-Methods": "*"})

if __name__ == '__main__':
    app.run(host='123.456.789.875', port=8443, ssl=ssl)
My request in Postman works:
https://www.example.com/loginRequest/?email=example@mail.com&password=password
Returns
{
"hello": "world"
}
And I can see the correct logs in my server:
[2022-10-18 16:22:07 +0000] - (sanic.access)[INFO][123.123.123.123:49339]: POST https://www.example.com/loginRequest/?email=example@mail.com&password=password 200 30
I can see the parameters sent, and I can read the values in the server. However, I have not been able to replicate this using React Native (or Android Studio, for that matter).
My React Native code is as follows:
import React from 'react';
import { Button, View } from 'react-native';

const App = () => {
  return (
    <View style={{ flex: 1, justifyContent: "center", alignItems: "center" }}>
      <Button
        onPress={() =>
          fetch('https://www.example.com/loginRequest/', {
            method: 'POST',
            headers: {
              Accept: 'application/json',
              'Content-Type': 'application/json'
            },
            body: {
              "email": "example@mail.com",
              "password": "password"
            }
          })
        }
        title="POST to API"
      />
    </View>
  );
}

export default App;
On the Android Emulator, I see the correct view, which is a plain button with the text "POST TO API". When I click on it, I see activity in the server. There is a connection. However, it doesn't send the body of the request. The log looks like this:
[2022-10-18 16:31:52 +0000] - (sanic.access)[INFO][123.123.123.123:49339]: POST https://www.example.com/loginRequest/ 200 30
Note that there are no parameters in the log this time. I also get the same result when using JSON.stringify to build the body of the fetch request.
The API expects the credentials as query parameters, so something like this should work:
fetch(`https://www.example.com/loginRequest/?email=${email}&password=${password}`, {
  method: 'POST',
  headers: {
    'Accept': 'application/json',
    'Content-Type': 'application/json'
  }
})
On the backend side you read the parameters from the URL, as far as I understand (https://sanic.dev/en/guide/basics/request.html#parameters), while on the React Native side you pass them in the request body. You need to pass your params as part of the URL, or change your backend handler.
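A small sketch of building that URL safely; URLSearchParams percent-encodes characters such as @, so the values survive the trip intact (the function name is mine):

```javascript
// Hypothetical helper: put credentials into the query string,
// percent-encoding any special characters along the way.
function buildLoginUrl(baseUrl, email, password) {
  const params = new URLSearchParams({ email, password });
  return `${baseUrl}?${params.toString()}`;
}

console.log(buildLoginUrl('https://www.example.com/loginRequest/', 'example@mail.com', 'password'));
// https://www.example.com/loginRequest/?email=example%40mail.com&password=password
```

That said, sending a password in a query string means it can end up in server access logs (as the Sanic log above shows), so moving the credentials into a request body server-side is worth considering.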
I've run into a very strange issue. We have document-scanning functionality in our app, and the scan gives me a base64-encoded image of the photo. Everything works on iOS, but when I try to send the picture on Android I get xhr.status 0 and an error. Stranger still, when I start debug mode and enable network inspection in react-native-debugger, the picture is sent without errors. I also tried the release version of the app installed on my device, and still got an error with status 0.
XHR request
export const uploadXHRImage = (url: string, data: IDataUploadImage) => {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.onreadystatechange = () => {
      if (xhr.readyState === 4) {
        if (xhr.status === 200) {
          resolve('Image successfully uploaded to S3');
        } else {
          reject(localize('failedUploadImage'));
        }
      }
    };
    xhr.ontimeout = () => reject(localize('timeoutUploadImage'));
    xhr.timeout = UPLOAD_IMAGE_TIMEOUT;
    xhr.open('PUT', url);
    xhr.setRequestHeader('Content-Type', data.type);
    xhr.send(data);
  });
};
Add to the headers:
"Content-Type": 'application/json',
"Connection": "close",
I found the answer: Android cannot send an image via XHR unless the image is first saved as a file in the cache or another directory. Android also needs the file:// prefix before the path. Example:
saveImage = (image: string) => {
  if (IS_IOS) {
    return `data:image/jpg;base64,${image}`;
  }
  const tempFileDirectory = `${fs.CachesDirectoryPath}`;
  const tempFilePath = `${tempFileDirectory}/${uuidv4()}.jpg`;
  fs.writeFile(tempFilePath, image, 'base64');
  return `file://${tempFilePath}`;
};
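The platform branching above can also be expressed as a pure helper that is easy to unit-test without touching the filesystem (names and paths below are mine, for illustration only):

```javascript
// Hypothetical helper: pick an uploadable image source per platform.
// iOS accepts a base64 data URI directly; Android wants a file:// path
// to a real file on disk (written beforehand, e.g. to the cache dir).
function imageSourceForUpload(platform, base64, tempFilePath) {
  if (platform === 'ios') {
    return `data:image/jpg;base64,${base64}`;
  }
  return `file://${tempFilePath}`;
}

console.log(imageSourceForUpload('android', 'QWJj...', '/cache/abc.jpg'));
// file:///cache/abc.jpg
```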
I want to convert a local image into a file so I can send it to an API. How can this be done with react-native-fetch-blob?
This is what I've tried so far, but it gives a warning that it cannot find the path to the image, and nothing is shown via alert.
RNFetchBlob.fs.readFile('../../assets/imgs/profile.jpg', 'base64')
  .then((data) => {
    // handle the data ..
    alert(data);
  })
According to https://github.com/joltup/rn-fetch-blob#user-content-upload-example--dropbox-files-upload-api, here is an example that uploads to Dropbox:
RNFetchBlob.fetch('POST', 'https://content.dropboxapi.com/2/files/upload', {
  // dropbox upload headers
  Authorization: "Bearer access-token...",
  'Dropbox-API-Arg': JSON.stringify({
    path: '/img-from-react-native.png',
    mode: 'add',
    autorename: true,
    mute: false
  }),
  'Content-Type': 'application/octet-stream',
  // Change BASE64 encoded data to a file path with prefix `RNFetchBlob-file://`.
  // Or simply wrap the file path with RNFetchBlob.wrap().
}, RNFetchBlob.wrap(PATH_TO_THE_FILE))
  .then((res) => {
    console.log(res.text())
  })
  .catch((err) => {
    // error handling ..
  })
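The header object in that call can be assembled by a small helper; a sketch (the helper name is mine; the header keys follow the Dropbox upload example above, and Dropbox-API-Arg must be a JSON string, not an object):

```javascript
// Hypothetical helper: build headers for a Dropbox /2/files/upload call.
function dropboxUploadHeaders(accessToken, remotePath) {
  return {
    Authorization: `Bearer ${accessToken}`,
    'Dropbox-API-Arg': JSON.stringify({
      path: remotePath,
      mode: 'add',
      autorename: true,
      mute: false,
    }),
    'Content-Type': 'application/octet-stream',
  };
}

const headers = dropboxUploadHeaders('access-token...', '/img-from-react-native.png');
console.log(JSON.parse(headers['Dropbox-API-Arg']).path); // /img-from-react-native.png
```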
I'm having an ugly issue that only affects my Expo app on Android.
I'm trying to upload a base64 image taken with Expo ImagePicker to Firebase Storage by passing the image value, via an HTTP request made with axios, to a Firebase Cloud Function, which returns the URL of the saved image. That URL then goes into Firestore, but that's outside the scope of my question, I think.
My current implementation works flawlessly on iOS (I can get as many URLs as I want, and they upload quite quickly), but on Android I can only upload 2 images in a row. When I try a third time, the app freezes on reaching the axios/fetch statement and gives no clue about what happened. The console looks just as it did before the third attempt, and the app or simulator freezes.
Here you can see this behaviour in a 2-minute video:
https://youtu.be/w66iXnKDmdo
When I began working on this bug I was using fetch instead of axios. At that time I could upload only one image, and had to close and reopen the app to upload one more. Now with axios I can upload 2 instead of one, but the problem persists.
This is how I implemented the code:
const imageBase64 = 'QWEpqw0293k01...'
This is how I upload the image to Firebase Cloud Storage:
export const savePhoto = (imageBase64) => {
  const db = firebase.firestore();
  const docRef = db.collection('Comidas').doc();
  return () => {
    uploadImageToFirestore(imageBase64)
      .then(imageUrl => {
        console.log('image-url: ', imageUrl);
        docRef.set({ imagen: { uri: imageUrl } });
      })
      .catch(err => console.log('error: ', err));
  };
};
I made a helper function that makes the HTTP request reusable:
import axios from 'axios';

export const uploadImageToFirestore = (imageBase64) => {
  // <--- here is where it gets frozen the third time
  // <--- console.log() is called three times, but not axios
  return axios({
    method: 'post',
    url: 'https://us-central1-menuapp-9feb4.cloudfunctions.net/almacenamientoImagen',
    data: {
      image: imageBase64
    },
  })
    .then(res => res.data.imageUrl)
    .catch(err => console.log('error while uploading base64: ', err));
};
This invokes the following Firebase Cloud Function:
exports = module.exports = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    const body = req.body;
    console.log('image: ', body.image);
    fs.writeFileSync("/tmp/uploaded-image.jpg", body.image, "base64", err => {
      console.log(err);
      return res.status(500).json({ error: err });
    });
    const bucket = gcs.bucket("myapp.appspot.com");
    const uuid = UUID();
    bucket.upload(
      "/tmp/uploaded-image.jpg",
      {
        uploadType: "media",
        destination: "/comidas/" + uuid + ".jpg",
        metadata: {
          metadata: {
            contentType: "image/jpeg",
            firebaseStorageDownloadTokens: uuid
          }
        }
      },
      (err, file) => {
        if (!err) {
          console.log('url: ', {
            imageUrl:
              "https://firebasestorage.googleapis.com/v0/b/" +
              bucket.name +
              "/o/" +
              encodeURIComponent(file.name) +
              "?alt=media&token=" +
              uuid
          });
          res.status(201).json({
            imageUrl:
              "https://firebasestorage.googleapis.com/v0/b/" +
              bucket.name +
              "/o/" +
              encodeURIComponent(file.name) +
              "?alt=media&token=" +
              uuid
          });
        } else {
          console.log(err);
          res.status(500).json({ error: err });
        }
      }
    );
  });
});
I know axios isn't being called because there is no log and no record of the Firebase Cloud Function executing.
I expect this code to upload as many images as the user needs, not just 2 per app session as it does now.
How can I solve this?
I'm developing an app with PhoneGap/Cordova 2.5.0, making AJAX calls with jQuery 1.8.2 to retrieve data from an external server. I make a lot of requests, and I can see my app's cache growing, which is not great...
I've tested many things, like:
$.ajaxSetup({
  cache: false,
  headers: {
    "Cache-Control": "no-cache"
  }
});
OR / AND
var ajaxRequests = {}; // Limit one AJAX call per "data_id" to prevent duplicate calls

if (ajaxRequests[data_id] === undefined) {
  ajaxRequests[data_id] = $.ajax({
    type: 'GET',
    dataType: 'xml' + data_id,
    url: url,
    data: {
      _: new Date().getTime() + Math.random()
    },
    async: true,
    timeout: (data_count >= 2 ? data_count * 800 : 2000),
    cache: false,
    headers: {
      "Cache-Control": "no-cache"
    }
  })
    .done(function(data, textStatus, jqXHR) { ... })
    .fail(function(jqXHR, textStatus, errorThrown) { ... })
    .always(function(jqXHR, textStatus) { delete ajaxRequests[data_id]; });
}
If I let my app run for a couple of hours, I can see the cache grow from about 160 kB to about 30 MB in Settings > Apps > MyApp > Cache (on both the AVD and a real device).
So, did I misunderstand something about the cache in Settings, or did I forget something?
Please let me know if you need any other information; sorry for my English, and thanks in advance for your help.
Best regards
Alex
Clear the cache:
// clear cache
super.clearCache();
super.loadUrl("file:///android_asset/www/index.html");
Source:
Adding a splash screen and clearing cache with PhoneGap and Android