I am creating an application in which the user has to record a video no longer than 15 seconds. How can I check the duration while capturing video in PhoneGap? The user should not be able to record more than 15 seconds.
I don't want to use the code below:
navigator.device.capture.captureVideo(function(mediaFiles) {
    mediaFiles[0].getFormatData(function(data) {
        if (data.duration > 15) {
            alert('Your video is longer than the allowed 15 seconds.');
        }
    });
}, function(error) { alert('An error occurred'); }, null);
This only checks the duration after the video has been captured.
Is it possible to stop recording automatically once the user reaches the 15-second limit?
Cordova has an option to handle this: you need to set the duration in the capture options.
navigator.device.capture.captureVideo(captureVideoSuccess, captureErrorVideo, {
    destinationType: destinationType.FILE_URL,
    duration: 15
});
The duration is given in seconds. When the camera reaches 15 seconds of recording, it closes automatically and calls captureVideoSuccess, where you can put your own logic.
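For example, a minimal sketch of what the success and error callbacks might look like (the callback names come from the snippet above; the logging is just placeholder logic):

function captureVideoSuccess(mediaFiles) {
    // mediaFiles is an array of MediaFile objects; with a single
    // recording, the clip is the first entry
    var clip = mediaFiles[0];
    console.log('Recorded clip at: ' + clip.fullPath);
    // ...your own logic here, e.g. preview or upload the file
}

function captureErrorVideo(error) {
    console.log('Capture failed, code: ' + error.code);
}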
Check the Cordova Media Capture plugin documentation for the other capture options.
This was posted a long time ago, but I'll answer anyway since I had a similar problem (with audio rather than video). What you can do on Android is set a timeout with a function that stops the recording.
Here's my code for audio recording with timeout after 10 seconds:
var src = "tmprecording.amr";
var mediaRec = new Media(src,
    function() { // success
        console.log("recordAudio(): Audio Success");
    },
    function(err) { // error
        console.log("recordAudio(): Audio Error: " + err.code);
    });

// Start recording
mediaRec.startRecord();

// Stop the recording automatically after 10 seconds
var timeout = setTimeout(function() {
    mediaRec.stopRecord();
    mediaRec.release();
    mediaRec = null;
}, 10000);
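If you also give the user a manual stop button, you would want to cancel the pending timeout so stopRecord() isn't called on an already-stopped recorder. A minimal sketch of that (the stopButton element id is an assumption):

document.getElementById('stopButton').addEventListener('click', function() {
    clearTimeout(timeout); // cancel the automatic stop
    if (mediaRec) {
        mediaRec.stopRecord();
        mediaRec.release();
        mediaRec = null;
    }
});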
Does that help?
I have an Android application that sends the camera stream through a WebView using PeerJS (WebRTC); the web application in the browser receives the video and plays it.
Things are working, but the video on the web is too slow, and the image freezes for some time before the next frame arrives.
Is there a way to lower the resolution? Or buffer the video on the web application? Or could something be wrong with my implementation?
Android WebView code:
initVideo = function(videoSourceValue) {
    var video = document.querySelector('video');
    navigator.getUserMedia({
        video: {
            optional: [{ sourceId: videoSourceValue }]
        }
    }, function(stream) {
        video.src = window.URL.createObjectURL(stream);
        $('#peerId').text("calling : " + SERVER_PEER_ID);
        var mediaConnection = peer.call(SERVER_PEER_ID, stream);
        mediaConnection.on('stream', function(remoteStream) {
            // Show stream in some video/canvas element.
        });
    }, function(e) {
        console.log('failed', e);
    });
}
Web part:
function getVideoStream() {
    PEER.on('call', function(call) {
        // Note: getUserMedia returns nothing, so assigning its result
        // to a variable (as in the original) has no effect
        navigator.getUserMedia({video: true}, function(stream) {
            call.answer(stream); // Answer the call with an A/V stream.
            call.on('stream', onReceiveStream);
        }, function(err) {
            console.log('Failed to get local stream', err);
        });
    });
}
function onReceiveStream(stream) {
    console.log('received stream');
    $('video').prop('src', window.URL.createObjectURL(stream));
}
Thanks
Update 1
I tried adding {reliable: true}, but I'm still having the same issue.
I'm also sending location data to the server, and it seems that the video stream and the location data are sent together periodically (the chart on the web showing speed and the video update at the same time), but the frame rate is too slow.
When you establish the video/audio stream, you can specify some constraints:
var videoOptions = (isCordova) ? {audio: true, video: true} :
    {
        audio: true,
        video: {
            mandatory: {
                maxWidth: 640,
                maxHeight: 360,
                // maxAspectRatio: 4/3,
                // maxFrameRate: 1
            },
            quality: 7,
            width: { ideal: 320 },
            height: { ideal: 240 }
        }
    };
navigator.getUserMedia(videoOptions, function (stream) {
In the code above, if you are on a device (Android/iOS) you don't get to choose, but you can control it in the browser. A quality of 5 is the level the video driver author deemed an acceptable trade-off between quality and bandwidth. Limiting the dimensions of the picture helps too.
See this link for more details: https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia
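As a side note, newer browsers expose the promise-based navigator.mediaDevices.getUserMedia, where resolution and frame rate can be capped directly; a minimal sketch (the exact caps are only examples):

navigator.mediaDevices.getUserMedia({
    audio: true,
    video: {
        width: { max: 640 },     // cap the resolution
        height: { max: 360 },
        frameRate: { max: 15 }   // cap the frame rate to save bandwidth
    }
}).then(function(stream) {
    document.querySelector('video').srcObject = stream;
}).catch(function(err) {
    console.log('getUserMedia failed', err);
});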
My issue was completely unrelated to bandwidth: I simply didn't put autoplay on the video tag, so the video was only refreshing when a redraw happened.
Thanks a lot for your answers; they really give insight into how things work in WebRTC.
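In other words, the fix was just the autoplay attribute on the tag; muted and playsinline are often also needed on mobile browsers, though that part is an assumption about your setup:

<video autoplay muted playsinline></video>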
How does Media.release() work? Looking at the docs, it feels like you have to use it like this:
MediaService.loadMedia('sounds/connection-error.wav').then(function(media) {
    media.play();
    media.release();
});
But I have googled enough to know that this is wrong. We have to explicitly release the core instances on Android.
But how do I do that? If I have 8 views in my app and I play a sound file on each of those views, does that count as 8 core instances being used? And can I go back to, say, view number 1 and play the sound associated with that view again? If so, would that count as a 9th instance?
Calling media.release() straight away, as above, does not play any sound at all.
The most common way to play sounds with the Cordova Media plugin is the following:
function playAudio(src) {
    // HTML5 Audio
    if (typeof Audio != "undefined") {
        new Audio(src).play();
    // PhoneGap Media
    } else if (typeof device != "undefined") {
        // Android needs the search path explicitly specified
        if (device.platform == 'Android') {
            src = '/android_asset/www/' + src;
        }
        var mediaRes = new Media(src,
            function onSuccess() {
                // release the media resource once finished playing
                mediaRes.release();
            },
            function onError(e) {
                console.log("error playing sound: " + JSON.stringify(e));
            });
        mediaRes.play();
    } else {
        console.log("no sound API to play: " + src);
    }
}
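Usage is then a single call, for example (the path is illustrative):

// Plays via HTML5 Audio in the browser, via the Media plugin on device
playAudio('audio/beep.mp3');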
I want to play a sound when the user clicks on an image. I'm using the SoundManager plugin.
Here you can check what I'm doing:
soundManager.setup({
    url: 'swf',
    onready: function() {
        soundCarro = soundManager.createSound({
            id: 'soundCarro',
            url: 'sound/carro_camona.mp3'
        });
        soundMoto = soundManager.createSound({
            id: 'soundMoto',
            url: 'sound/moto_camona.mp3'
        });
        soundNautico = soundManager.createSound({
            id: 'soundNautico',
            url: 'sound/nautico_camona.mp3'
        });
    }
});
As you can see, I create 3 sound objects (soundCarro, soundMoto and soundNautico).
Here is the element that triggers the scrollTo function when the user clicks on it:
<img id="ca" class="ca logo" onclick="scrollTo('secao-carros',true);" src="images/svg/CAhome.svg">
Here you can see the scrollTo function:
function scrollTo(target, sound) {
    if (sound) {
        var delay = 0;
        var duration = 750; // declared here to avoid an implicit global
        if (target == 'secao-carros') {
            soundCarro.play();
            delay = 700;
        }
        if (target == 'secao-motos') {
            soundMoto.play();
            delay = 900;
        }
        if (target == 'secao-nauticos') {
            soundNautico.play();
            delay = 850;
            duration = 2000;
        }
        $('html, body').delay(delay).animate({ scrollTop: $('#' + target).offset().top }, { duration: duration });
    } else {
        $('html, body').animate({ scrollTop: $('#' + target).offset().top }, { duration: 750 });
    }
}
As you can see, in this function I play the corresponding sound object (e.g. soundNautico.play();).
The issue is that on iPad and Android devices the sound plays with a big delay, but in desktop browsers it works perfectly!
How can I prevent this delay?
That's because desktop browsers preload the audio, so when you call .play() the browser can begin playing immediately: it has already buffered some (if not all) of the audio.
I know that iOS and most other mobile browsers will (A) only allow you to play a single audio file at a time, (B) not allow you to cache audio at all, and (C) only allow you to initialize/play an audio object if the user physically initiated the action on the same call stack. So you're basically out of luck if you don't want the delay.
Edit: You could add an event listener to the audio and only trigger the scroll after the audio has buffered. However, you cannot load more than one sound at a time.
audio.addEventListener("canplay", function(e) {
    doScroll();
    audio.play();
}, false);
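A common workaround for restriction (C) is to "unlock" the audio inside the first real user gesture so that later programmatic .play() calls are allowed. A minimal sketch of the idea, reusing the audio element from above (whether this helps depends on the browser, so treat it as something to test):

// Play and immediately pause inside a user-initiated touch handler;
// many mobile browsers then allow later play() calls on the same element
document.addEventListener('touchstart', function unlock() {
    audio.play();
    audio.pause();
    audio.currentTime = 0;
    document.removeEventListener('touchstart', unlock);
}, false);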
I need to play a "correct" sound if the user answers the question correctly and a "wrong" sound if not. The sound plays at first, but by the fifth correct answer (the fourth "correct" sound had already played) no sound can be heard. It seems the sound can be played at most 4 times.
What could be causing this?
Update:
I added media.release() in my stopAudio(); now the sound can be heard 6 times (an improvement), but that still doesn't suit my case (many questions need the same sound effect).
JS file:
// Audio player
var my_media = null;
var mediaTimer = null;

// Play audio
function playAudio(src) {
    // Create Media object from src
    my_media = new Media(src, null, soundCB);
    my_media.play();
}

function soundCB(err) {
    console.log("playAudio(): Audio Error: " + err);
}

// Stop audio
function stopAudio() {
    if (my_media) {
        alert("AQA");
        my_media.stop();
        my_media.release();
    }
    clearInterval(mediaTimer);
    mediaTimer = null;
}
///// question part /////
$("#answer").live('tap', function(event) {
    if (buttonAns == true) {
        $this = $(this);
        buttonAns = false;
        var choice = $this.attr("class");
        if ((correctAnsNo == 0 && choice == "answer0") || (correctAnsNo == 1 && choice == "answer1") || (correctAnsNo == 2 && choice == "answer2")) {
            playAudio('/android_asset/www/audio/fall.mp3');    // falling sound
            correct = parseInt(correct) + 2;
            playAudio('/android_asset/www/audio/correct.mp3'); // dingdong
        } else {
            playAudio('/android_asset/www/audio/wrong1.wav');
        }
        if (quesNo < wordsNo) {
            setTimeout(function() {
                buttonAns = true;
                db.transaction(queryQuesDB, errorCB);
            }, 1800);
        } else {
            endLevel();
        }
    } else {
        event.preventDefault();
    }
});
Android has a finite amount of resources for playing sounds, yet you create a new Media object each time you play a file. You need to release the sounds you are not using. If you need to play the same sounds multiple times, consider creating separate Media objects and keeping references to them.
Read up on media.release().
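A minimal sketch of that idea: create each Media object once, reuse it for every playback, and release everything when the screen is left (getSound and releaseSounds are assumed helper names; the paths come from the question):

var sounds = {};

function getSound(name, path) {
    // Create each Media object once and cache it
    if (!sounds[name]) {
        sounds[name] = new Media(path, null, function(err) {
            console.log("sound error: " + JSON.stringify(err));
        });
    }
    return sounds[name];
}

// e.g. getSound('correct', '/android_asset/www/audio/correct.mp3').play();

function releaseSounds() {
    // Free the underlying native players when leaving the quiz screen
    for (var name in sounds) {
        sounds[name].release();
    }
    sounds = {};
}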
I had the same problem. The solution I used:
if (media) {
    media.stop();
    media.release();
}
media = new Media(...);
Here's my trick: I release it once it has finished playing or an error has occurred.
function playAudio(url) {
    try {
        var my_media = new Media(url,
            // success callback
            function () {
                my_media.release();
            },
            // error callback
            function (err) {
                my_media.release();
            });
        // Play audio
        my_media.play();
    } catch (e) {
        alert(e.message);
    }
}
Hello there,
I am new to PhoneGap. I am trying to record an audio clip and upload it to a server. I am working with PhoneGap + jQuery Mobile + Android. Can anyone show me a good way to do this, with a small example? Basically I have a form with a Record button, from which the user can record an audio clip and publish it, so I need to upload the recorded file to the server when the form is submitted. I tried the PhoneGap Media and File APIs for recording and uploading but couldn't succeed.
I am using the following function for recording:
function recordAudio() {
    var src = "myrecording.mp3";
    var mediaRec = new Media(src, onSuccess, onError);

    // Record audio
    mediaRec.startRecord();

    // Stop recording after 10 sec
    var recTime = 0;
    var recInterval = setInterval(function() {
        recTime = recTime + 1;
        setAudioPosition(recTime + " sec");
        if (recTime >= 10) {
            clearInterval(recInterval);
            mediaRec.stopRecord();
        }
    }, 1000);
}
Now I need to upload this recorded file to the server. I am testing on the emulator.
Kind regards,
Jaya
The default location of recorded audio on Android is "mnt/sdcard/myrecording.wav".
To upload, use the FileTransfer object:
var ft = new FileTransfer();
ft.upload("mnt/sdcard/myrecording.wav", "http://www.website.com/upload.php", win, fail);
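For completeness, a minimal sketch of the win/fail callbacks and upload options that snippet assumes (upload.php and the "file" field name are placeholders for your server-side handler):

var options = new FileUploadOptions();
options.fileKey = "file";             // form field name the server expects
options.fileName = "myrecording.wav";
options.mimeType = "audio/wav";

function win(result) {
    console.log("Upload done, bytes sent: " + result.bytesSent);
}

function fail(error) {
    console.log("Upload failed, code: " + error.code);
}

var ft = new FileTransfer();
ft.upload("mnt/sdcard/myrecording.wav", encodeURI("http://www.website.com/upload.php"), win, fail, options);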
Why don't you try the captureAudio function directly?
function captureAudio() {
    // Launch device audio recorder
    navigator.device.capture.captureAudio(captureSuccess, captureError);
}
Check the PhoneGap documentation for details.
In the captureSuccess callback, you can use the FileTransfer object to upload your file.
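A minimal sketch of that wiring, taking the recorded file's path from the MediaFile and handing it to FileTransfer (the server URL is a placeholder and the logging callbacks are assumptions):

function captureSuccess(mediaFiles) {
    var path = mediaFiles[0].fullPath; // path of the recorded clip
    var ft = new FileTransfer();
    ft.upload(path, encodeURI("http://www.website.com/upload.php"),
        function(result) { console.log("uploaded " + result.bytesSent + " bytes"); },
        function(error) { console.log("upload error " + error.code); });
}

function captureError(error) {
    console.log("capture error " + error.code);
}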