React-Native <Image source={url}> - Path to Cached Image - android

Maybe my question will sound foolish, but here it is...
How can we get the path to a cached image (on both iOS and Android)?
Here is my use case: I present a view in my app that lists images from the web. I get an array of URLs from the Google Custom Search API based on the user's keywords...
<FlatList
  style={{ width: '100%' }}
  data={this.state.suggestions}
  numColumns={2}
  renderItem={(img) => {
    return (
      <TouchableOpacity onPress={() => this.selectImageHandler(img.item)}>
        <Image source={{ uri: img.item.link }} />
      </TouchableOpacity>
    )
  }}
  keyExtractor={(item, index) => index.toString()}
/>
The result is a two-column grid of the suggested images.
The user then taps an image to select it, and the app needs to store that image in its own folder (PictureDir/myappfolder/ on Android, DocumentDir/myappfolder/ on iOS)...
What I am doing right now is that when the image is selected, I download it again:
selectImageHandler = (img) => {
  // (... pre code ...)
  RNFS.downloadFile({
    fromUrl: img.link, // e.g. "http://www.url.to/the/picture.png"
    toFile: uri, // e.g. "file://" + PictureDir + "/myAppFolder/picturename.jpg"
  }).promise.then(res => {
    // (... post code ...)
  });
}
It works fine! But it takes a while, because it downloads the image again.
I feel this is doing the work twice, since the image was already downloaded and stored in the cache in order to be displayed.
So here comes my question again: is there a way to know where the image was stored in the cache, so that when the user presses the image to save it, the app doesn't download it again but instead moves it from the cache folder to the app's folder?
Am I making any sense? Or is re-downloading the right approach?
Thanks for your help!

One way to avoid re-downloading images might be to take control of the download away from the <Image> component. Basically, you can download the remote image with the RNFS.downloadFile method and then supply the local URI (the toFile value) as the image source. This requires a bit more work, of course, since we need to create a wrapper component, but this approach also gives us control over what is shown while the image is loading.
For example:
import React, { useState, useLayoutEffect } from 'react';
import { Image } from 'react-native';
import RNFS from 'react-native-fs';
import { URL } from 'react-native-url-polyfill';

const CACHE_DIR = RNFS.DocumentDirectoryPath; // Cross-platform directory

function ImageCard({ imgUrl }) {
  const [cachedImgUrl, setCachedImgUrl] = useState(null);
  const [isImageLoading, setIsImageLoading] = useState(true);

  useLayoutEffect(() => {
    const getCachedImageUrl = async () => {
      try {
        const basename = new URL(imgUrl).pathname.split('/').pop();
        const localImgUrl = `file://${CACHE_DIR}/${basename}`;
        if (await RNFS.exists(localImgUrl)) {
          setCachedImgUrl(localImgUrl);
        } else {
          const download = RNFS.downloadFile({
            fromUrl: imgUrl,
            toFile: localImgUrl,
          });
          const downloadResult = await download.promise;
          if (downloadResult.statusCode === 200) {
            setCachedImgUrl(localImgUrl);
          }
        }
      } catch (err) {
        // handle error
      } finally {
        setIsImageLoading(false);
      }
    };
    getCachedImageUrl();
  }, [imgUrl]);

  if (isImageLoading || !cachedImgUrl) {
    // A better idea would be to return some <Loader /> component or a
    // placeholder (e.g. a skeleton animation). This is just an example.
    return null;
  }

  return (
    <Image source={{ uri: cachedImgUrl }} />
  );
}
The <ImageCard /> component replaces the plain <Image /> component in the <FlatList /> and downloads each remote image only once.
The code above is simplified: it assumes that the image names are unique enough to serve as identifiers on the file system, and that the image URLs don't include any search parameters, etc. Please be cautious and adapt the code to your needs before using it directly.
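For example, the FlatList from the question could render the wrapper instead of the plain <Image /> (a sketch that assumes the ImageCard component above is in scope):
<FlatList
  style={{ width: '100%' }}
  data={this.state.suggestions}
  numColumns={2}
  renderItem={(img) => (
    <TouchableOpacity onPress={() => this.selectImageHandler(img.item)}>
      {/* ImageCard downloads and caches the file once; selectImageHandler
          can later move it from CACHE_DIR instead of downloading it again */}
      <ImageCard imgUrl={img.item.link} />
    </TouchableOpacity>
  )}
  keyExtractor={(item, index) => index.toString()}
/>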

Related

React-Native NFC Reader returns: The NFC tag's type is not supported

I am trying to create an Android app that will be used to make a payment with an Ingenico terminal. I added react-native-nfc-manager (https://github.com/whitedogg13/react-native-nfc-manager) to my project and followed the v2-ios+android-write-ndef example. I also enabled NFC on my phone. I bought an NFC USB reader from Amazon, installed it, and got a Windows app called GoToTags that connected successfully to the USB module.
I fired up my app and tested the NFC button.
The scanner beeps (so the NFC technology was loaded), but GoToTags sends back a message ("The NFC tag's type is not supported") that looks like this:
{"Uid":null,"ReadOnly":false,"DataLength":null,"CanMakeReadOnly":false,
"Formatted":false,"Records":null,"TagTech":null,"MaxDataLength":null,
"Exception":"The NFC tag's type is not supported."}
I am unsure what I am doing wrong. I followed the instructions to the letter and also watched and followed a YouTube tutorial:
https://www.youtube.com/watch?v=Kx22B6OH3Oc
The only difference between my code and the code in the YouTube video is that I am using an Android phone instead of an iPhone.
This is my code:
import React from 'react';
import {
  View,
  Text,
  TouchableOpacity,
} from 'react-native';
import NfcManager, {Ndef, NfcTech} from 'react-native-nfc-manager';

function buildUrlPayload(valueToWrite) {
  return Ndef.encodeMessage([
    Ndef.uriRecord(valueToWrite),
  ]);
}

class AppV2Ndef extends React.Component {
  componentDidMount() {
    NfcManager.start();
  }

  componentWillUnmount() {
    this._cleanUp();
  }

  render() {
    return (
      <View style={{padding: 20}}>
        <Text>NFC Demo</Text>
        <TouchableOpacity
          style={{padding: 10, width: 200, margin: 20, borderWidth: 1, borderColor: 'black'}}
          onPress={this._testNdef}
        >
          <Text>Test Ndef</Text>
        </TouchableOpacity>
        <TouchableOpacity
          style={{padding: 10, width: 200, margin: 20, borderWidth: 1, borderColor: 'black'}}
          onPress={this._cleanUp}
        >
          <Text>Cancel Test</Text>
        </TouchableOpacity>
      </View>
    )
  }

  _cleanUp = () => {
    NfcManager.cancelTechnologyRequest().catch(() => 0);
  }

  _testNdef = async () => {
    try {
      let resp = await NfcManager.requestTechnology(NfcTech.Ndef, {
        alertMessage: 'Ready to write some NFC tags!'
      });
      console.warn(resp);
      let ndef = await NfcManager.getNdefMessage();
      console.warn(ndef);
      let bytes = buildUrlPayload('https://www.google.com');
      await NfcManager.writeNdefMessage(bytes);
      console.warn('successfully write ndef');
      await NfcManager.setAlertMessageIOS('I got your tag!');
      this._cleanUp();
    } catch (ex) {
      console.warn('ex', ex);
      this._cleanUp();
    }
  }
}

export default AppV2Ndef;
What I need to do is send a 4-digit code from the phone to the Ingenico terminal over NFC. But before I go ahead and make that possible, I just want to get NFC working first. Sending https://google.com to the card reader would be a good first step. But so far, no luck.
What am I missing? It seems pretty straightforward, no?

How do I hand over an image file in React Native as a URI, since require just returns a number?

I am looking for a solution to hand over an image URI to a native module on Android.
The problem is that when I require an image in React Native, it just returns a number.
So I'd need to load the URI directly, or some sort of string that can be converted to a file and then to a URI on Android.
EDIT:
My images are loaded in a separate file like this:
{ sticker1: { uri: require('./Sticker_1.png'), path: './Sticker_1.png' } },
Afterwards I display the items in a FlatList like this:
renderItem(item) {
  const itemIndex = item.index;
  const itemKey = `sticker${itemIndex + 1}`;
  const sticker = item.item[itemKey];
  return (
    <TouchableOpacity
      onPress={() => this.handOverSticker()}
      key={itemKey}
    >
      <Image source={sticker.uri} style={styles.previewImage} key={itemKey} />
    </TouchableOpacity>
  );
}
So far everything works.
The handOverSticker function looks like this:
handOverSticker(sticker) {
  this.props.handleUseSticker(sticker);
}
The other component then passes the URI on to a React Native module on Android and does other stuff. The problem is that the URI it receives is just the number returned by require.
Thank you!
constructor(props) {
  super(props);
  let imgUrl = props.image ? { uri: props.image } : require("../assets/images/image.jpg");
  this.state = { image: imgUrl };
}
In the Image's source prop, place this code:
source={this.state.image}
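If what you need on the native side is an actual URI for a bundled asset, React Native's Image.resolveAssetSource can expand the number returned by require into source metadata. A minimal sketch (the sticker file comes from the question; the native module call at the end is purely hypothetical):
import { Image, NativeModules } from 'react-native';

// require() returns an opaque asset ID (a number)...
const assetId = require('./Sticker_1.png');

// ...which resolveAssetSource expands into { uri, width, height, scale }.
// In a debug build the uri points at the Metro packager (http://...);
// in a release build it may be a bundled resource name rather than a
// file path, so the native side has to be prepared for both forms.
const resolved = Image.resolveAssetSource(assetId);
console.log(resolved.uri);

// Hypothetical native module call using the resolved uri:
// NativeModules.StickerModule.useSticker(resolved.uri);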

Storing/Moving photo to a separate app folder in the phone

I am building a react-native app that takes a photo or chooses one from the gallery, and the photo should be moved to a separate folder called {appName} on the phone. I am storing all the images locally, so it is critical that the image ends up in that folder.
For example: when you upload a photo to Instagram or WhatsApp, the app creates a separate folder on the phone called Instagram/WhatsApp and stores all of its photos in that folder.
I am using react-native to build the app, and an image picker to take a photo or choose one from the gallery.
I am currently trying it on an iPhone simulator, but I want it to work on both iPhone and Android.
I tried using 'react-native-fs' and 'react-native-fetch-blob', but it's not working correctly:
const dirPictures = `${RNFS.DocumentDirectoryPath}/hazelnut`;
//const dirPictures = `${RNFS.PicturesDirectoryPath}/hazelnut`;
const newImageName = `${moment().format('DDMMYY_HHmmSSS')}.jpg`;
const newFilepath = `${dirPictures}/${newImageName}`;
const imageMoved = await this.moveAttachment(this.state.image.uri, newFilepath);
moveAttachment = async (filePath, newFilepath) => {
  return new Promise((resolve, reject) => {
    RNFS.mkdir(dirPictures)
      .then(() => {
        RNFS.moveFile(filePath, newFilepath)
          .then(() => {
            console.log('FILE MOVED', filePath, newFilepath);
            resolve(true);
          })
      })
      .catch(err => {
        console.log('mkdir error', err);
        reject(err);
      });
  });
};
The image is moved, but it is not the behavior I want. Using the code above, it is moved to the document directory, but I want to move it to the photo directory: if I go into Photos, I should be able to see the app folder with all the images.
If I use const dirPictures = ${RNFS.PicturesDirectoryPath}/hazelnut, then I get: You don’t have permission to save the file “hazelnut” in the folder “undefined”.
I think you should use this for Android:
const dirPictures = `${RNFS.DocumentDirectoryPath}/hazelnut`;
And this for iOS:
const dirPictures = `${RNFS.LibraryDirectoryPath}/hazelnut`;
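If you want a single constant that works on both platforms, here is a minimal sketch using React Native's Platform module (the 'hazelnut' folder name comes from the question, and the directory choice simply follows the suggestion above):
import { Platform } from 'react-native';
import RNFS from 'react-native-fs';

// Pick the platform-specific base directory suggested above,
// then append the app's own folder name.
const baseDir = Platform.select({
  android: RNFS.DocumentDirectoryPath,
  ios: RNFS.LibraryDirectoryPath,
});
const dirPictures = `${baseDir}/hazelnut`;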

How to get the thumbnail of an Android gallery picture, in NativeScript?

I understand that Android automatically creates a thumbnail for every picture taken by the camera. I need to be able to display that thumbnail.
I'm using the nativescript-imagepicker plugin to select images. The plugin returns only the size and src of the selected image(s), for instance:
'/storage/emulated/0/DCIM/DSCF2060.jpg'
How could I use this src to retrieve the corresponding thumbnail (is it even possible)?
The Android API is very confusing to me (not to mention the Java), so any help will be greatly appreciated.
Use ThumbnailUtils
// Imports assumed for a NativeScript Core project using tns-core-modules:
import * as imagepicker from "nativescript-imagepicker";
import { fromNativeSource } from "tns-core-modules/image-source";
import { layout } from "tns-core-modules/utils/utils";

export function onGetImageButtonTap(args) {
    let context = imagepicker.create({
        mode: "single"
    });
    context
        .authorize()
        .then(function () {
            return context.present();
        })
        .then(function (selection) {
            selection.forEach(function (selected) {
                const size = layout.toDevicePixels(96);
                const bitmap = android.media.ThumbnailUtils.extractThumbnail(
                    android.graphics.BitmapFactory.decodeFile(selected.android),
                    size, size);
                args.object.page.getViewById("thumbnailImg").src = fromNativeSource(bitmap);
            });
        }).catch(function (e) {
            console.log(e);
        });
}
Playground Sample

PDF thumbnail generation for Firebase

I am developing an Android app that manages PDFs. I know that Cloudinary allows users to upload PDFs and automatically generate thumbnails for the uploaded PDF (see here). Does Firebase Storage or Cloud Firestore offer a similar feature? If not, any recommended third-party tool for this task? Thanks!
Intro
ImageMagick comes pre-installed in the Cloud Functions environment, but Ghostscript does not (at least, not right now). Ghostscript is what's needed to generate images from PDFs.
First, download a copy of the Ghostscript executable from here (Linux 64-bit AGPL release) and store it in the root of your functions directory. You may want to rename the folder/executable for shorter path references. It will be deployed when you deploy your function(s).
Second, as seen in this repo, npm install --save https://github.com/sina-masnadi/node-gs/tarball/master. This is a wrapper you'll need to communicate with the Ghostscript executable from within your function.
Third, you may want to look over the Ghostscript / ImageMagick docs to customize things even further.
Sample cloud function
Finally, here is a function that I just got working for my functions file structure. You'll need to write better validation and tweak it for your setup, but know that the meat of this will work.
const admin = require('firebase-admin');
const functions = require('firebase-functions');
const fs = require('fs');
const os = require('os');
const path = require('path');
const write = require('fs-writefile-promise');
const spawn = require('child-process-promise').spawn;
const mkdirp = require('mkdirp-promise');
const gs = require('gs');

const gs_exec_path = path.join(__dirname, '../../../ghostscript/./gs-923-linux-x86_64');

try { admin.initializeApp(functions.config().firebase); } catch(e) {}

/*
  Callable https function that takes a base 64 string of a pdf (MAX
  10mb), uploads a thumbnail of it to firebase storage, and returns its
  download url.
*/
module.exports = functions.https.onCall((data, context) => {
  if (!data.b64str) { throw Error('missing base 64 string of pdf!'); }

  const b64pdf = data.b64str.split(';base64,').pop();
  const pg = typeof data.pg === 'number' ? data.pg : 1; // 1-based
  const max_wd = typeof data.max_wd === 'number' ? data.max_wd : 200;
  const max_ht = typeof data.max_ht === 'number' ? data.max_ht : 200;
  const st_fname = typeof data.fname === 'string' ? data.fname : 'whatever.jpg';

  const bucket = admin.storage().bucket();
  const tmp_dir = os.tmpdir();
  const tmp_pdf = path.join(tmp_dir, 'tmp.pdf');
  const tmp_png = path.join(tmp_dir, 'doesntmatter.png');
  const tmp_thumb = path.join(tmp_dir, st_fname.split('/').pop());
  const st_thumb = st_fname;

  /* create tmp directory to write tmp files to... */
  return mkdirp(tmp_dir).then(() => {
    /* create a temp pdf of the base 64 pdf */
    return write(tmp_pdf, b64pdf, {encoding: 'base64'});
  }).then(() => {
    /* let ghostscript make a png of a page in the pdf */
    return new Promise((resolve, reject) => {
      gs().batch().nopause()
        .option(`-dFirstPage=${pg}`)
        .option(`-dLastPage=${pg}`)
        .executablePath(gs_exec_path)
        .device('png16m')
        .output(tmp_png)
        .input(tmp_pdf)
        .exec(err => err ? reject(err) : resolve());
    });
  }).then(() => {
    /* make a thumbnail for the png generated by ghostscript via imagemagick */
    var args = [tmp_png, '-thumbnail', `${max_wd}x${max_ht}>`, tmp_thumb];
    return spawn('convert', args, {capture: ['stdout', 'stderr']});
  }).then(() => {
    /* upload tmp_thumb to storage. */
    return bucket.upload(tmp_thumb, { destination: st_thumb });
  }).then(() => {
    /* get storage url for the uploaded thumbnail */
    return bucket.file(st_thumb).getSignedUrl({
      action: 'read',
      expires: '03-01-2500'
    });
  }).then(result => {
    /* clean up temp files and respond w/ download url */
    fs.unlinkSync(tmp_pdf);
    fs.unlinkSync(tmp_png);
    fs.unlinkSync(tmp_thumb);
    return result[0];
  });
});
Sample invocation from client
var fn = firebase.functions().httpsCallable('https_fn_name');
fn({
  b64str: 'base64 string for pdf here....',
  pg: 2, // optional; which page do you want a thumbnail of? 1 is the first.
  fname: 'path/to/file/in/storage.jpg', // optional but recommended; .png if you like
  max_wd: 300, // optional, max thumbnail width
  max_ht: 300  // optional, max thumbnail height
}).then(res => console.log(res));
If you can't generate a signed URL...
You'll need to add the "Cloud Functions Service Agent" role to whatever service account this function is using. You can add the role to your service account in the Cloud Console.
