I'm working on an app where one part of the process is shooting a video, then uploading it. I'm using react-native-video to display the preview after the user has finished recording, and react-native-camera for the capturing process. I also use react-navigation to move between screens.
Currently I can get to the preview screen and set the video component's source uri from Redux. However, there is no player to be seen. The uri is in the format "file:///path/video.mp4", so the file should be in the app cache as intended.
First the user is presented with a camera, where s/he can capture the video.
const recordVideo = async () => {
  if (camera) {
    const data = await camera.current.recordAsync() // resolves once stopRecording() is called
    if (data) {
      dispatch(saveVideo(data)) // data contains the URI
      navigation.navigate(CONFIRM)
    }
  }
}
When stopRecording() is called, the promise resolves and the video's URI is dispatched to Redux. Afterwards we navigate to the "confirmation screen", where the user can preview the video and choose whether to shoot another or go with this one.
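For reference, the confirm screen renders the player roughly like this (trimmed down; the Redux selector path here is simplified, not my exact store shape):

// Confirm screen (simplified) – reads the recorded video's URI from Redux
import React from 'react'
import Video from 'react-native-video'
import { useSelector } from 'react-redux'

const ConfirmScreen = () => {
  // e.g. "file:///path/video.mp4"; the actual selector path differs in my app
  const uri = useSelector(state => state.capture.videoUri)

  return (
    <Video
      source={{ uri }}
      style={{ width: '100%', height: 300 }}
      controls
      onError={e => console.warn('video error', e)}
    />
  )
}

export default ConfirmScreen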
My problem is that I can't get the preview video to play at all. I've tried pretty much everything I can think of by now, and I'm getting tired of something so seemingly simple being this difficult. The video has played a few times for no apparent reason, so the player itself isn't broken. At best I've gotten the preview to show once, but after going back and shooting another video, there's no preview anymore. The confirm screen also loads photos (taken in the same way: camera -> confirm) without issue; it's only video that fails. The video component's onError handler gives me this: {"error": {"extra": -2147483648, "what": 1}}, which seems like gibberish.
PS. yes, I've read through every related post here without finding a proper solution.
Use ExoPlayer
Instead of using the older MediaPlayer on Android, try the more modern ExoPlayer. If you're on React Native 0.60+, you can specify this in your react-native.config.js like so:
module.exports = {
  dependencies: {
    "react-native-video": {
      platforms: {
        android: {
          sourceDir: "../node_modules/react-native-video/android-exoplayer"
        }
      }
    }
  }
};
I was experiencing the same issue and this solution worked for us. You'll need to rebuild the Android app after changing the config for the new sourceDir to take effect. Note that we only support Android 5+, so I'm not sure whether this works on older devices.
https://github.com/react-native-video/react-native-video/issues/1747#issuecomment-572512595
Related
I want to run some code when a photo is captured, but there isn't an obvious way to do this. I set tapPhoto to true so that the plugin captures a picture when the user taps the screen, and that seems to work since it triggers the camera sound. My issue is that there does not seem to be a way to run code once the capture happens, so that I can close the camera, retrieve the image data, etc. This is my code:
angular.module('myApp').factory('photoService', function(...){
  var cameraOptions = {
    camera: CameraPreview.CAMERA_DIRECTION.BACK,
    tapPhoto: true
  }

  CameraPreview.startCamera(cameraOptions);

  // onCapture isn't a real part of the plugin; this is just the code I want
  // to run each time the user taps the phone screen
  CameraPreview.onCapture = function (imgData) {
    showPhoto(imgData);
    CameraPreview.stopCamera();
  }
});
I saw online that takePicture() is basically what I want, but apparently it has to be called once before it works on subsequent calls (https://github.com/cordova-plugin-camera-preview/cordova-plugin-camera-preview/issues/364). I don't want to do that, because my app essentially opens the camera, lets the user take a single photo, and then closes the camera, so taking twice as many pictures as necessary isn't ideal. Is there another way to achieve the functionality I want?
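For reference, the takePicture() flow I'd otherwise use looks roughly like this (sketched from the plugin's README, not code I've verified; the linked issue is why I'm avoiding it):

// Open the camera, then capture once on demand and close it again
CameraPreview.startCamera({ camera: CameraPreview.CAMERA_DIRECTION.BACK });

function captureOnce() {
  CameraPreview.takePicture({ width: 640, height: 640, quality: 85 }, function (base64PictureData) {
    showPhoto('data:image/jpeg;base64,' + base64PictureData);
    CameraPreview.stopCamera();
  });
}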
I'm having trouble solving this bug and hope someone can guide me. I created the screen-share function on Android with WebRTC (EglBase). It works perfectly fine until the user stops sharing and then shares again; after that, the participant's screen is black and the content can't be seen.
I have already checked that the SurfaceViewRenderer is assigned correctly. I believe the problem may be in how I implemented the init() and release() functions.
In this function, I init the screen share video
EglBase root = EglBase.create();
screenShare.init(root.getEglBaseContext(), null);
screenShare.setZOrderMediaOverlay(true);
remoteParticipant.setView(screenShareVideoView);
In this function, I release the screen share
screenShare.clearImage();
screenShare.release();
screenShare.setVisibility(View.INVISIBLE);
Maybe I'm re-initializing the view after release() incorrectly.
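For what it's worth, this is the pairing I think it should be (only a sketch: it assumes screenShare is a SurfaceViewRenderer, keeps a single EglBase alive for the whole call instead of creating a new one on every init, and tracks init state so init() and release() always pair up):

// Sketch: one EglBase for the call's lifetime, plus a guard against double init/release
private EglBase rootEglBase = EglBase.create(); // created once, not per share session
private boolean rendererInitialized = false;

private void initScreenShare() {
    if (!rendererInitialized) {
        screenShare.init(rootEglBase.getEglBaseContext(), null);
        screenShare.setZOrderMediaOverlay(true);
        rendererInitialized = true;
    }
    screenShare.setVisibility(View.VISIBLE);
    remoteParticipant.setView(screenShareVideoView);
}

private void releaseScreenShare() {
    if (rendererInitialized) {
        screenShare.clearImage();
        screenShare.release();
        rendererInitialized = false;
    }
    screenShare.setVisibility(View.INVISIBLE);
}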
I am trying to create a playlist component that keeps a this.state.playlist of multiple mp4 video files, with the Video tag's source set to this.state.playlist[0]. Ideally, when one video finishes, this.state.playlist shifts the first item off and pushes it to the back, effectively moving on to the next video. However, the next video seems to start at the playback position where the previous one ended, rather than from the beginning.
So for instance: if I have video A (1 minute 10 seconds), video B (1 minute 40 seconds), and video C (2 minutes 10 seconds), then video A plays normally, video B starts at 1:10, and video C starts at 1:40. Here is an example Expo Snack / online REPL if you would like to test the code:
https://snack.expo.io/By7nzgmtW
You can imagine this is an especially big issue for videos of the same length, since the player seemingly just skips over them entirely (this has been happening to me, and I've been struggling for the past couple of days to figure out why the player skips some videos).
Also note: in my own code (I would share it, but it's quite lengthy, and I think the Snack/REPL above is much more illustrative of the issue) I use the require syntax because the video files are local, so please point out any differences between the require/uri syntax in your solutions.
Also, if you're going to link me to a specific portion of the docs, I would highly appreciate an example Snack/REPL, ideally using my code as a basis, as I've read through the docs for the AV/Video stuff quite a few times without figuring out a solution.
Thank you.
Here is the full code from the repl/expo.snack if you don't want to go to the link:
import React, { Component } from 'react';
import { View } from 'react-native';
import { Video } from 'expo';

export default class App extends Component {
  constructor(props) {
    super(props);
    this.state = {
      playlist: [
        { src: 'http://wsiatest.bitballoon.com/videotrack.mp4', name: 'dancers' },
        { src: 'https://mohammadhunan.herokuapp.com/coding.mp4', name: 'portfolio' },
        { src: 'http://www.html5videoplayer.net/videos/toystory.mp4', name: 'toys' }
      ]
    };
  }

  videoUpdated(playbackStatus) {
    if (playbackStatus.didJustFinish) {
      this.playNext();
    }
  }

  playNext() {
    // Move the first item of the playlist to the back (on a copy, since
    // state shouldn't be mutated directly)
    console.log('video did just finish');
    const playlist = this.state.playlist.slice();
    playlist.push(playlist.shift());
    this.setState({ playlist });
  }

  render() {
    return (
      <View>
        <Video
          onPlaybackStatusUpdate={(playbackStatus) => this.videoUpdated(playbackStatus)}
          resizeMode="cover"
          source={{ uri: this.state.playlist[0].src }}
          useNativeControls
          rate={1.0}
          volume={1.0}
          muted={false}
          style={{ height: 180, width: 240 }}
        />
      </View>
    );
  }
}
Edit:
Some extra info:
I used the create-react-native-app generator
Expo version 20.0.0
React version 16.0.0-alpha.12
React Native version ^0.47.0
This is an Android app I'm building (though this is most likely irrelevant, as the code is pretty much interchangeable with iOS)
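In case it helps, the one workaround I'm currently experimenting with is keying the player to the current item so React remounts a fresh player for each video (just a sketch, and I haven't verified it against SDK 20; shouldPlay is my addition so the remounted player starts on its own):

// Sketch: the key forces React to unmount the old player (and its playback
// position) and mount a fresh one whenever the playlist rotates
<Video
  key={this.state.playlist[0].name}
  onPlaybackStatusUpdate={(playbackStatus) => this.videoUpdated(playbackStatus)}
  resizeMode="cover"
  source={{ uri: this.state.playlist[0].src }}
  useNativeControls
  shouldPlay
  style={{ height: 180, width: 240 }}
/>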
I have working code on desktop browsers that support the getUserMedia API: I can correctly see a video preview of my webcam in the videoPreview div. However, when running on an Android device, this same code takes a picture with my front camera when I accept the sharing prompt in Chrome, and then the preview stays frozen on that first frame.
navigator.getMedia = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia);

navigator.getMedia(
  // constraints
  { video: true, audio: false },
  // success callback
  function (mediaStream) {
    var video = document.getElementById('videoPreview');
    video.src = window.URL.createObjectURL(mediaStream);
    video.play();
  },
  // handle error
  function (error) {
    console.log(error);
  }
);
For those encountering the same problem: I fixed it by adding the autoplay attribute to my <video> tag.
I was stuck on this for a while; I hope this helps someone else.
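For completeness, the working setup looks roughly like this (the srcObject line is my addition for newer browsers, where createObjectURL(stream) is deprecated):

<!-- autoplay keeps Chrome for Android from freezing on the first frame -->
<video id="videoPreview" autoplay></video>

navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then(function (mediaStream) {
    var video = document.getElementById('videoPreview');
    video.srcObject = mediaStream; // replaces video.src = URL.createObjectURL(mediaStream)
  })
  .catch(function (error) {
    console.log(error);
  });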
A colleague and I hit the same issue today: previously working code stopped working and the camera was frozen. Surprisingly (or not), rebooting the device fixed it.
I am developing an app using PhoneGap for Android. It consists of around 25 images and a similar number of mp3 files, each no longer than 10 seconds. The requirement is that whenever an image is shown on screen, its related mp3 file should play. I am using the jQTouch swipe action to move from one page to another. I am facing the following problems, both on the emulator and on a real device (Samsung Galaxy 3).
After 15-20 images, sound stops playing on both the emulator and the Galaxy 3. In logcat I get the following error:
ERROR/AudioFlinger(34): not enough memory for AudioTrack size=49216
I am using the following code to play the mp3 files:
if (mp3file != null) {
  mp3file.stop();
  mp3file = null;
}
mp3file = new Media("/android_asset/www/name_of_file.mp3",
  // success callback
  function () {
    console.log("playAudio():Audio Success");
  },
  // error callback
  function (err) {
  },
  // status callback
  function (status) {
  });
mp3file.play();
I think the error is caused by the PhoneGap API's audio manager objects for each mp3 file remaining in memory.
I want to know how to destroy a media object created in JavaScript. You can play, stop, and pause, but is there any method in PhoneGap for Android to destroy it, so that it does not remain in memory after it has finished playing?
Another problem I am facing relates to the left-swipe action in jQTouch for viewing the next image. If I am currently viewing image1 and swipe left to view image2, image2 is shown briefly, then image1 reappears for a moment, and then image2 is shown again. In short, the transition from image1 to image2 is not smooth and flickers. However, if I go from image2 back to image1 with a right swipe, the transition is smooth.
Thanks
I got some help from another developer on this. The trick is to declare mp3file outside the function, then call stopRecord() on it (even though you're not recording anything). I tested this with a simple button playing the sound over and over, and it worked 75 times before I got tired of playing with it.
Let me know if it works for you.
var mp3file;

function playAudio() {
  if (mp3file != null) {
    mp3file.stop();
    mp3file.stopRecord();
    mp3file = null;
  }
  mp3file = new Media("/android_asset/www/yourmp3.mp3",
    function () {
      console.log("playAudio():Audio Success");
    },
    function (err) {
    },
    function (status) {
    });
  mp3file.play();
}
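One more thing worth checking (my addition, not part of the trick above): newer PhoneGap/Cordova releases document a release() method on Media that frees the underlying Android audio resources, which sounds like exactly the "destroy" being asked about. If your version has it, the cleanup becomes:

if (mp3file != null) {
  mp3file.stop();
  mp3file.release(); // frees the native Android audio resources backing this Media object
  mp3file = null;
}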