How to fetch a random image from the camera roll using React Native? - Android

I have a requirement where I need to get a random image every time I click a button. I don't want the camera-roll picker with images to come up; instead, a random image should be selected from the camera folder and displayed in the image view.
I have followed the official Facebook CameraRoll tutorial. Please find the code below:
_handleButtonPress = () => {
  CameraRoll.getPhotos({
    first: 20,
    assetType: 'Photos',
  })
    .then(r => {
      this.setState({ photos: r.edges });
    })
    .catch((err) => {
    });
};
But this code selects the most recently taken images and displays them in the picker, instead of randomly selecting an image URI and displaying it in the image view. Any help is appreciated.
Regards,
Sharath

You essentially have the photos and all the necessary metadata once you set the state: this.setState({ photos: r.edges })
All you have to do is pick a random image from there. Here's how I did it:
import React, { Component } from 'react';
import {
  StyleSheet,
  View,
  Image,
  CameraRoll,
  Button
} from 'react-native';

export default class App extends Component {
  constructor(props) {
    super(props)
    this.state = {
      img: null
    }
  }

  getRandomImage = () => {
    const fetchParams = {
      first: 25,
    }
    CameraRoll.getPhotos(fetchParams)
      .then(data => {
        const assets = data.edges
        const images = assets.map((asset) => asset.node.image)
        // pick a random index into the fetched images
        const random = Math.floor(Math.random() * images.length)
        this.setState({
          img: images[random]
        })
      })
      .catch(err => console.log(err))
  }

  render() {
    return (
      <View style={styles.container}>
        { this.state.img ?
          <Image
            style={styles.image}
            source={{ uri: this.state.img.uri }}
          />
          : null
        }
        <Button title="Get Random Image from CameraRoll" onPress={this.getRandomImage}/>
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#F5FCFF',
  },
  image: {
    width: '100%',
    height: '75%',
    margin: 10,
  }
});
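One Android-specific caveat worth adding: CameraRoll.getPhotos only returns results once the storage permission has been granted. A minimal sketch, assuming the built-in PermissionsAndroid module and the READ_EXTERNAL_STORAGE permission (the helper name is my own), which you could call at the start of getRandomImage:

import { PermissionsAndroid, Platform } from 'react-native';

// Ask for storage access before reading the camera roll (Android only).
async function ensureStoragePermission() {
  if (Platform.OS !== 'android') {
    return true;
  }
  const result = await PermissionsAndroid.request(
    PermissionsAndroid.PERMISSIONS.READ_EXTERNAL_STORAGE,
    {
      title: 'Camera roll access',
      message: 'The app needs to read your photos to pick a random one.',
      buttonPositive: 'OK',
    }
  );
  return result === PermissionsAndroid.RESULTS.GRANTED;
}

Calling this before CameraRoll.getPhotos avoids the fetch silently returning nothing on a fresh install.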

Related

Camera is not running

I'm trying to make a prototype application that, over and over:
1. records a video with the camera for x seconds
2. displays that video
For this I use the Camera component from expo-camera and the Video component from expo-av, with two views.
In my code I use the stateSequence state property and the sequencer() function, which alternately displays the view with the Camera component (which films for x seconds) and the view with the Video component (which plays the recording back).
sequencer() is triggered with setInterval(this.sequencer, 10000) in componentWillMount().
I can switch alternately from the View with the Camera component to the View with the Video component.
To record a video with the Camera component I use recordAsync(), but I get the following error:
Unhandled promise rejection: Error: Camera is not running
I'm using an Android phone for my tests.
Can you help me?
This is my code:
import React, { Component } from 'react';
import { StyleSheet, Text, View, TouchableOpacity } from 'react-native';
import * as Permissions from 'expo-permissions';
import { Camera } from 'expo-camera';
import { Video } from 'expo-av';

export default class SequenceViewer extends Component {
  constructor(props) {
    super(props);
    this.state = {
      stateSequence: "SHOOT",
      hasCameraPermission: null,
      cameraIsRecording: false,
      type: Camera.Constants.Type.front,
    }
    this.recordVideo = this.recordVideo.bind(this)
  }

  sequencer = () => {
    if (this.state.stateSequence === "WATCH") {
      this.setState({ stateSequence: "SHOOT" })
      this.recordVideo(); // Error message: Camera is not running
    } else {
      this.setState({ stateSequence: "WATCH" })
    }
  }

  async componentWillMount() {
    let rollStatus = await Permissions.askAsync(Permissions.CAMERA_ROLL);
    let cameraResponse = await Permissions.askAsync(Permissions.CAMERA);
    if (rollStatus.status === 'granted') {
      if (cameraResponse.status === 'granted') {
        let audioResponse = await Permissions.askAsync(Permissions.AUDIO_RECORDING);
        if (audioResponse.status === 'granted') {
          this.setState({ permissionsGranted: true });
          setInterval(this.sequencer, 10000);
        }
      }
    }
  }

  recordVideo = async () => {
    if (this.state.cameraIsRecording) {
      this.setState({ cameraIsRecording: false })
      this.camera.stopRecording();
    } else {
      this.setState({ cameraIsRecording: true })
      if (this.camera) {
        // recordAsync takes an options object
        let data = await this.camera.recordAsync({ quality: '480p', maxDuration: 5, mute: true });
        this.setState({ recVideoUri: data.uri });
      }
    }
  };

  render() {
    const { hasCameraPermission } = this.state;
    if (this.state.stateSequence === "WATCH") {
      return (
        <View style={styles.container}>
          <Video
            source={{ uri: this.state.recVideoUri }}
            rate={1.0}
            volume={1.0}
            isMuted={false}
            resizeMode="cover"
            shouldPlay
            isLooping
            style={{ width: 300, height: 300 }}
            ref={(ref) => { this.player = ref }}
          />
        </View>
      )
    } else {
      return (
        <View style={{ flex: 1 }}>
          <Camera style={{ flex: 1 }} type={this.state.type} ref={ref => { this.camera = ref; }} />
        </View>
      )
    }
  }
}

const styles = StyleSheet.create({
  viewerText: {
    fontSize: 20,
    fontWeight: 'bold',
  },
  container: {
    flex: 1,
    backgroundColor: '#fff',
    alignItems: 'center',
    justifyContent: 'center',
  },
});
Thank you
I had the same problem. My solution: the camera type has to be "back" by default, and you can switch to "front" afterwards:
componentDidMount = () => {
  this.props.navigation.addListener('didFocus', async () => {
    await setTimeout(() => {
      this.setState({ cameraType: Camera.Constants.Type.front })
    }, 100)
  });
}
I was getting the "Camera is not running" error when I changed screens. For functional components, instead of withNavigationFocus(Camera), use this approach:
import React from 'react';
import { View } from 'react-native';
import { RNCamera } from 'react-native-camera';
import { useIsFocused } from '@react-navigation/native';

function MyComponent() {
  const isFocused = useIsFocused();
  return (
    <View>
      {/* only mount the camera while this screen is focused */}
      { isFocused && <RNCamera /> }
    </View>
  );
}
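A likely cause of the original error is that recordAsync() is called right after setState() switches back to the SHOOT view, before the freshly mounted Camera is actually running. Here is a minimal sketch (my own suggestion, not from the answers above, using expo-camera's onCameraReady callback and a hypothetical onRecorded prop) that only starts recording once the camera reports it is ready:

import React, { useRef } from 'react';
import { View } from 'react-native';
import { Camera } from 'expo-camera';

// Sketch: only start recording after the camera signals it is running.
export default function ReadyCamera({ onRecorded }) {
  const cameraRef = useRef(null);

  const startRecording = async () => {
    // recordAsync rejects with "Camera is not running" if called too early.
    const video = await cameraRef.current.recordAsync({ maxDuration: 5, mute: true });
    onRecorded(video.uri);
  };

  return (
    <View style={{ flex: 1 }}>
      <Camera
        style={{ flex: 1 }}
        ref={cameraRef}
        onCameraReady={startRecording}
      />
    </View>
  );
}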

Share image on WhatsApp from a React Native Android app

I am currently working on a react-native photo-sharing app for Android. I used the native Share method, but it only shares a message and title; there is no option to share an image.
Looking through many questions here, I couldn't find any straightforward way.
Please provide help.
This is the message I am getting: Share awesome status on whatsapp using Khela #imageurl. Download #urltoplaystore
To share an image in React Native, you are right that you need to use Share from the react-native library itself. As for what is needed for an image, the answer is really simple: you just need to use a Base64 image.
Check out a working Snack: snack.expo.io/#abrahamcalf/share-image
Here is the code:
import * as React from 'react';
import {
  Text,
  View,
  StyleSheet,
  Image,
  Share,
  TouchableOpacity,
} from 'react-native';

export default class App extends React.Component {
  state = {
    // truncated Base64 data URI
    cat: 'data:image/jpeg;base64,some-encoded-stuff',
  };

  handleSharePress = () => {
    Share.share({
      title: 'Share',
      message: 'My amazing cat 😻',
      url: this.state.cat,
    });
  };

  render() {
    return (
      <View style={styles.container}>
        <Image source={{ uri: this.state.cat }} style={styles.img} />
        <TouchableOpacity onPress={this.handleSharePress}>
          <Text>Share Image</Text>
        </TouchableOpacity>
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'space-around',
    alignItems: 'center',
  },
  img: {
    width: 200,
    height: 300,
  },
});
If you want to try something else, probably more complex, I recommend checking out the react-native-share library from the React Native Community.
import React, { Component } from 'react';
import { TouchableOpacity } from 'react-native';
import Icon from 'react-native-vector-icons/Feather';
import Share from 'react-native-share';
import RNFetchBlob from 'rn-fetch-blob';
import { moderateScale } from 'react-native-size-matters';

const fs = RNFetchBlob.fs;

class ProductDetail extends Component {
  constructor(props) {
    super(props);
    // product details and image info are assumed to be passed in via props
    this.state = {
      productDetails: props.productDetails,
      images: props.images,
    };
  }

  shareTheProductDetails(imagesPath) {
    let { productDetails } = this.state;
    let imagePath = null;
    RNFetchBlob.config({
      fileCache: true,
    })
      .fetch('GET', imagesPath.image)
      // the image is now downloaded to the device's storage
      .then((resp) => {
        // the image path; you can use it directly with the Image component
        imagePath = resp.path();
        return resp.readFile('base64');
      })
      .then((base64Data) => {
        // here's the base64-encoded image
        var imageUrl = 'data:image/png;base64,' + base64Data;
        let shareImage = {
          title: productDetails.product_name, // string
          message:
            'Description ' +
            productDetails.product_description +
            ' http://beparr.com/', // string
          url: imageUrl,
          // urls: [imageUrl, imageUrl], // e.g. 'http://img.gemejo.com/product/8c/099/cf53b3a6008136ef0882197d5f5.jpg'
        };
        Share.open(shareImage)
          .then((res) => {
            console.log(res);
          })
          .catch((err) => {
            err && console.log(err);
          });
        // remove the file from storage
        return fs.unlink(imagePath);
      });
  }

  render() {
    return (
      <TouchableOpacity
        style={{
          borderWidth: 0,
          left: 5,
          top: 2,
        }}
        onPress={() => this.shareTheProductDetails(this.state.images)}>
        <Icon
          style={{
            left: moderateScale(10),
          }}
          name="share-2"
          color="#000"
          size={20}
        />
      </TouchableOpacity>
    );
  }
}

export default ProductDetail;

React Native image uploader not uploading the images

I am developing an app that contains an image uploader. I found some code on the internet that can upload an image, and it partially works: when I press the choose-image button, it opens the gallery and I am able to select and crop an image. But after cropping, when I select OK, nothing happens. It just shows the image and does not upload it to Firebase storage. Here is my code.
import React, { Component } from 'react';
import {
  AppRegistry,
  StyleSheet,
  Text,
  View,
  Button,
  Image,
  ActivityIndicator,
  TouchableOpacity
} from 'react-native';
import * as firebase from 'firebase';
import RNFetchBlob from 'react-native-fetch-blob';
import ImagePicker from 'react-native-image-crop-picker';

export default class RNF extends Component {
  constructor (props) {
    super(props);
    this.state = {
      loading: false,
      dp: null
    };
  }

  openPicker () {
    this.setState({ loading: true });
    const Blob = RNFetchBlob.polyfill.Blob;
    const fs = RNFetchBlob.fs;
    window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
    window.Blob = Blob;
    // const { uid } = this.state.user
    const uid = '12345';
    ImagePicker.openPicker({
      width: 300,
      height: 300,
      cropping: true,
      mediaType: 'photo'
    }).then(image => {
      const imagePath = image.path;
      let uploadBlob = null;
      const imageRef = firebase.storage().ref(uid).child('dp.jpg');
      let mime = 'image/jpg';
      fs.readFile(imagePath, 'base64')
        .then((data) => {
          // console.log(data);
          return Blob.build(data, { type: `${mime};BASE64` });
        })
        .then((blob) => {
          uploadBlob = blob;
          return imageRef.put(blob, { contentType: mime });
        })
        .then(() => {
          uploadBlob.close();
          return imageRef.getDownloadURL();
        })
        .then((url) => {
          let userData = {};
          // userData[dpNo] = url
          // firebase.database().ref('users').child(uid).update({ ...userData})
          let obj = {};
          obj['loading'] = false;
          obj['dp'] = url;
          this.setState(obj);
        })
        .catch((error) => {
          console.log(error);
        });
    })
    .catch((error) => {
      console.log(error);
    });
  }

  render () {
    const dpr = this.state.dp ? (
      <TouchableOpacity onPress={ () => this.openPicker() }>
        <Image
          style={{ width: 100, height: 100, margin: 5 }}
          source={{ uri: this.state.dp }}
        />
      </TouchableOpacity>
    ) : (
      <Button
        onPress={ () => this.openPicker() }
        title={ 'Change Picture' }
      />
    );
    const dps = this.state.loading ? (
      <ActivityIndicator animating={this.state.loading} />
    ) : (
      <View style={styles.container}>
        <View style={{ flexDirection: 'row' }}>
          { dpr }
        </View>
      </View>
    );
    return (
      <View style={styles.container}>
        { dps }
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#F5FCFF'
  },
  welcome: {
    fontSize: 20,
    textAlign: 'center',
    margin: 10
  },
  instructions: {
    textAlign: 'center',
    color: '#333333',
    marginBottom: 5
  }
});

AppRegistry.registerComponent('RNF', () => RNF);
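One thing worth trying here (a sketch of an alternative upload path, not a confirmed fix for this exact code) is to skip the rn-fetch-blob Blob polyfill and let fetch build the blob from the cropped image's local path. This assumes a React Native version whose fetch can read local file:// URIs and the v8-style Firebase JS SDK already imported above; the helper name is my own:

// Sketch: upload a locally stored image (e.g. ImagePicker's image.path) to Firebase Storage.
async function uploadImage(localPath, uid) {
  // Let fetch + blob() read the file instead of the rn-fetch-blob polyfill.
  const uri = localPath.startsWith('file://') ? localPath : 'file://' + localPath;
  const response = await fetch(uri);
  const blob = await response.blob();

  const imageRef = firebase.storage().ref(uid).child('dp.jpg');
  await imageRef.put(blob, { contentType: 'image/jpeg' });

  // Resolve to the public download URL so it can be shown in an <Image>.
  return imageRef.getDownloadURL();
}

Adding a console.log inside the first .then of the original chain would also show quickly whether the readFile/Blob.build step ever resolves.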

I want to use the data from a JSON file as the label for my radio button (react native)

I have an app that reads a JSON file. In one tab I have it coming out as a list; on another tab, I want it to show the items from the file as the labels for the radio buttons I found here:
https://www.npmjs.com/package/react-native-radio-buttons-group
Here's the code that I have for the tab of my app that pulls names of medications from a JSON file:
import React, { Component } from "react";
import {
  View,
  Text,
  StyleSheet,
  TouchableHighlight,
  FlatList
} from "react-native";
import { Icon } from 'native-base';
import RadioGroup from 'react-native-radio-buttons-group';
//import MultipleChoice from 'react-native-multiple-choice'

class LikesTab extends Component {
  _onSelect = (item) => {
    console.log(item);
  };

  onPress = data => this.setState({ data });

  constructor(props) {
    super(props);
    this.state = {
      data: []
    }
  }

  // SETTING THE STATE, MAKING AN EMPTY ARRAY WHICH WE FILL
  // state = {
  //   data: []
  // };

  componentWillMount() {
    this.fetchData();
  }

  // Getting the data
  fetchData = async () => {
    const response = await fetch("https://api.myjson.com/bins/s5iii");
    const json = await response.json();
    this.setState({ data: json.results });
  };

  // var customData = require('./customData.json');

  // Setting what is shown
  render() {
    return (
      <View style={{ marginVertical: 10, backgroundColor: "#E7E7E7" }}>
        <FlatList
          data={this.state.data}
          keyExtractor={(x, i) => i.toString()}
          renderItem={({ item }) =>
            <Text>
              {`${item.name.first} ${item.name.last}`}
            </Text>}
        />
      </View>
    );
  }
}

export default LikesTab;

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'center',
    justifyContent: 'center'
  }
});
Here's my tab for the radio button
import React, { Component } from 'react';
import { Text, View, StyleSheet } from 'react-native';
import RadioGroup from 'react-native-radio-buttons-group';

export default class AddMediaTab extends Component {
  componentWillMount() {
    this.fetchData();
  }

  // Getting the data
  fetchData = async () => {
    const response = await fetch("https://api.myjson.com/bins/s5iii");
    const json = await response.json();
    this.setState({ data: json.results });
  };

  state = {
    data: [
      {
        label: ' ',
      }
    ]
  };

  // update state
  onPress = data => this.setState({ data });

  render() {
    let selectedButton = this.state.data.find(e => e.selected == true);
    selectedButton = selectedButton ? selectedButton.value : this.state.data[0].label;
    return (
      <View style={styles.container}>
        <Text style={styles.valueText}>
          Value = {selectedButton}
        </Text>
        <RadioGroup radioButtons={this.state.data} onPress={this.onPress} />
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'center',
    justifyContent: 'center',
  },
  valueText: {
    fontSize: 18,
    marginBottom: 50,
  },
});
What is your selectedButton variable? I think Value = {selectedButton} is the issue. If selectedButton evaluates to a string, fine, but it looks like it is an object in your case. Array.find() returns the first element that satisfies the condition, or undefined if none do. If that variable is undefined while you're waiting for your API call to return something, that could cause an issue as well.
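Related to that: the objects in json.results don't have the label/value/selected shape that your AddMediaTab render and the radioButtons prop of react-native-radio-buttons-group read, so setting them straight into state leaves the group without labels. A minimal sketch (assuming the same endpoint and the name fields used in LikesTab) of a fetchData replacement that maps the response into that shape:

// Sketch: turn the fetched entries into radio-button descriptors.
fetchData = async () => {
  const response = await fetch("https://api.myjson.com/bins/s5iii");
  const json = await response.json();

  const radioButtons = json.results.map((item, index) => ({
    // label is what the radio button displays
    label: `${item.name.first} ${item.name.last}`,
    value: `${item.name.first} ${item.name.last}`,
    // pre-select the first entry so selectedButton is never undefined
    selected: index === 0,
  }));

  this.setState({ data: radioButtons });
};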

How to make a React Native input which gives validation state feedback to the user [Valid, Pristine, Error, Editing]

I would like to have an input that gives feedback continuously as the user types and when it loses focus. The feedback will be a border around the input:
1. Green: when valid
2. Amber: when typing and in an error state (green when valid)
3. Red: when in an error state and unfocused
4. Nothing: when the input is pristine (not touched and empty)
What is the best way to achieve this?
Ideally this will work on both iOS and Android.
TextInput has two callback props that will be useful here: onBlur and onChangeText.
To dynamically set the style on the TextInput, you can bind a state variable to borderColor like below:
<TextInput
  onBlur={ () => this.onBlur() }
  onChangeText={ (text) => this.onChange(text) }
  style={{ borderColor: this.state.inputBorder, height: 70, backgroundColor: "#ededed", borderWidth: 1 }} />
Then pass the text from the onChangeText callback through a regex or pattern matcher to implement whatever validation you need.
I've set up a working project here that checks for whitespace and raises the error states you want. You can edit it to be more specific to your needs, but the basic premise stays the same. The code for the working project is below:
'use strict';
var React = require('react-native');
var {
  AppRegistry,
  StyleSheet,
  Text,
  View,
  TextInput
} = React;

var SampleApp = React.createClass({
  getInitialState: function() {
    return {
      inputBorder: '#ededed',
      defaultVal: ''
    }
  },
  onBlur: function() {
    console.log('this.state.defaultVal', this.state.defaultVal)
    if (this.state.defaultVal.indexOf(' ') >= 0) {
      this.setState({
        inputBorder: 'red'
      })
    }
  },
  onChange: function(text) {
    this.setState({
      defaultVal: text
    })
    if (text.indexOf(' ') >= 0) {
      this.setState({
        inputBorder: '#FFC200'
      })
    } else {
      this.setState({
        inputBorder: 'green'
      })
    }
  },
  render: function() {
    return (
      <View style={styles.container}>
        <View style={{marginTop: 100}}>
          <TextInput
            onBlur={ () => this.onBlur() }
            onChangeText={ (text) => this.onChange(text) }
            style={{ height: 70, backgroundColor: "#ededed", borderWidth: 1, borderColor: this.state.inputBorder }} />
        </View>
        <View style={{marginTop: 30}}>
          <TextInput
            style={{ height: 70, backgroundColor: "#ededed" }} />
        </View>
      </View>
    );
  }
});

var styles = StyleSheet.create({
  container: {
    flex: 1,
  }
});

AppRegistry.registerComponent('SampleApp', () => SampleApp);
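The snippet above uses the old require('react-native') / React.createClass style. For reference, a minimal modern sketch of the same idea (my own rewrite, with a hypothetical isValid check) that covers all four states: pristine, amber while editing, green when valid, red on blur with an error:

import React, { useState } from 'react';
import { TextInput, View } from 'react-native';

// Hypothetical validator: "valid" here just means non-empty with no spaces.
const isValid = (text) => text.length > 0 && !text.includes(' ');

export default function ValidatedInput() {
  const [value, setValue] = useState('');
  const [touched, setTouched] = useState(false);   // the user has typed something
  const [focused, setFocused] = useState(false);

  let borderColor = 'transparent';                  // pristine: no feedback
  if (touched) {
    if (isValid(value)) borderColor = 'green';      // valid
    else borderColor = focused ? 'orange' : 'red';  // editing vs. unfocused error
  }

  return (
    <View>
      <TextInput
        value={value}
        onChangeText={(text) => { setValue(text); setTouched(true); }}
        onFocus={() => setFocused(true)}
        onBlur={() => setFocused(false)}
        style={{ height: 50, borderWidth: 2, borderColor, backgroundColor: '#ededed' }}
      />
    </View>
  );
}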
