HTML5 video does not load a local video in Ionic on Android

I've got a problem with some code in Ionic.
I am developing an app for iOS and Android that needs access to videos.
The videos are private and specific to the app, which is why I need to store them in the assets folder.
Here is the problem: sometimes on Android a video cannot be loaded from that folder and I get a GET error; the app retries 30 times to fetch the video and then stops.
As a result, whenever I play the video I get an error saying: Uncaught (in promise): The play() request was interrupted by a call to pause().
(Screenshots: the failed GET requests from the app and the resulting error message.)
Here is my HTML5 code:
<ion-row class="ion-justify-content-center ion-align-items-center">
  <video class="ion-no-margin" #video *ngIf="exercise?.video"
    playsinline loop preload="auto" [ngClass]="{'preview-video': !isExerciseStarted}"
    tappable (click)="onClickStartVideo()" [poster]="exercise.thumbnail">
    <source [src]="exercise?.video" type='video/mp4'>
  </video>
</ion-row>
And here is the TypeScript:
public onClickStartVideo() {
  const video = (<HTMLVideoElement>this.video.nativeElement);
  if (this.isVideoPlaying) {
    video.currentTime = 0;
  } else {
    video.currentTime = 0;
    video.play();
  }
  this.isVideoPlaying = true;
}
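Since play() returns a promise in recent WebViews, I also tried guarding the call like this (just a sketch of what I attempted, so it may well not be the right fix):

public onClickStartVideo() {
  const video = (<HTMLVideoElement>this.video.nativeElement);
  video.currentTime = 0;
  if (!this.isVideoPlaying) {
    // play() returns a promise; catching the rejection at least avoids the
    // unhandled "interrupted by a call to pause()" error in the console.
    video.play().catch(err => console.error('play() failed:', err));
  }
  this.isVideoPlaying = true;
}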
Thank you for your help.

Related

HTML Camera API in Android webview

I'm using the following HTML code. I'm able to get a video stream on my desktop, but I get a grey play button in the Android WebView app. I'm serving this over an HTTPS connection.
Please guide me, as I'm new to both of these code snippets.
HTML
<div id="video-container">
<video id="camera-stream" width="500" autoplay></video>
</div>
Script.js
window.onload = function() {
  navigator.getUserMedia = (navigator.getUserMedia ||
                            navigator.webkitGetUserMedia ||
                            navigator.mozGetUserMedia ||
                            navigator.msGetUserMedia);
  if (navigator.getUserMedia) {
    navigator.getUserMedia({ video: true },
      function(localMediaStream) {
        var vid = document.getElementById('camera-stream');
        vid.srcObject = localMediaStream;
      },
      function(err) {
        console.log('The following error occurred when trying to use getUserMedia: ' + err);
      }
    );
  } else {
    alert('Sorry, your browser does not support getUserMedia');
  }
};
(Screenshots: the working camera stream in desktop Chrome, and the grey play button in the phone's WebView.)
I know this is an old thread, but I recently ran into the same issue, and only after a lot of attempts did I finally find the solution.
If you see this play button, the permissions should all be set correctly.
It seems that the WebView is waiting for a user interaction to start the stream, but tapping on the icon does not start the video (I don't know how the user is supposed to "approve" the streaming).
The solution is to change a setting of the WebView in your web app:
webView.settings.mediaPlaybackRequiresUserGesture = false;
This way the stream starts correctly without any user interaction.
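As a side note, navigator.getUserMedia is deprecated; where the WebView supports it, the promise-based navigator.mediaDevices.getUserMedia can be used instead. A minimal sketch, assuming the same camera-stream element as above:

window.onload = function() {
  // Promise-based replacement for the deprecated navigator.getUserMedia.
  navigator.mediaDevices.getUserMedia({ video: true })
    .then(function(stream) {
      var vid = document.getElementById('camera-stream');
      vid.srcObject = stream;
    })
    .catch(function(err) {
      console.log('getUserMedia error: ' + err);
    });
};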

Cordova Media plugin - Load stream after hours

For a long time I have been trying to solve a problem with the Cordova Media plugin...
All I want is to play an internet radio stream on a button press (play).
This is my code:
app.js
var mmedia;

function playAudio(url) {
  mmedia = new Media(url,
    function () {
      console.log("playAudio():Audio Success");
    },
    function (err) {
      console.log("playAudio():Audio Error: " + err);
    }
  );
  // Play audio
  mmedia.play();
}
index.html
<input type="button" value="Play" onclick="playAudio('http://streaming.tdiradio.com:9000/');" />
But all I get is silence... and then a few hours later the radio just starts playing,
hours after the app was closed (I didn't release the media, though).
I'm using Android 4.4.2 on my phone.
On Ripple it works just fine, without even a second of delay.
What is this about? And what is the best way to solve it?
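To narrow this down, one thing worth trying (a sketch based on the plugin's documented fourth constructor argument, the status callback) is logging which state the Media object actually reaches after play() is called:

var mmedia;

function playAudio(url) {
  mmedia = new Media(url,
    function () { console.log("playAudio():Audio Success"); },
    function (err) { console.log("playAudio():Audio Error: " + JSON.stringify(err)); },
    function (status) {
      // 1 = starting, 2 = running, 3 = paused, 4 = stopped
      console.log("playAudio(): status changed to " + status);
    }
  );
  mmedia.play();
}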

Creating a custom receiver with the Google Cast - Media Player Library

I would like to add media player functionality to my custom receiver.
On the Google developer website I found a description of how to implement a sender and a styled media receiver application.
I have built this sample and it works fine: I can cast an MP3 file hosted on Google Drive to my Chromecast device.
Now I have implemented a custom receiver (see attachment) which should be able to play a URL referring to an m3u8 file. For this I am using the Media Player Library, as suggested by Google.
<body>
  <div>
    <p id='text'> </p>
    <video id='vid'> </video>
  </div>
  <script type="text/javascript" src="https://www.gstatic.com/cast/sdk/libs/receiver/2.0.0/cast_receiver.js"></script>
  <script type="text/javascript" src="https://www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js"></script>
  <script type="text/javascript">
    // If you set ?Debug=true in the URL (e.g. when using a different App ID in
    // the developer console), include debugging information.
    if (window.location.href.indexOf('Debug=true') != -1) {
      cast.receiver.logger.setLevelValue(cast.receiver.LoggerLevel.DEBUG);
      cast.player.api.setLoggerLevel(cast.player.api.LoggerLevel.DEBUG);
    }

    console.log("mediaElement set");
    var mediaElement = document.getElementById('vid');

    // Create the media manager. This will handle all media messages by default.
    window.mediaManager = new cast.receiver.MediaManager(mediaElement);

    // Remember the default value for the receiver onLoad, so this sample can play
    // non-adaptive media as well.
    window.defaultOnLoad = mediaManager.onLoad;
    mediaManager.onLoad = function (event) {
      // The Media Player Library requires that you call player unload between
      // different invocations.
      if (window.player !== null) {
        player.unload(); // Must unload before starting again.
        window.player = null;
      }
      // This trivial parser is by no means best practice; it shows how to access
      // event data, and uses a string search of the suffix rather than looking
      // at the MIME type, which would be better. In practice, you will know what
      // content you are serving while writing your player.
      if (event.data['media'] && event.data['media']['contentId']) {
        console.log('Starting media application');
        var t = document.getElementById('text');
        t.innerHTML = event.data['media'];
        console.log("EventData: " + event.data);
        console.log("EventData-Media: " + event.data['media']);
        console.log("EventData-ContentID: " + event.data['media']['contentId']);

        var url = event.data['media']['contentId'];
        console.log("URL: " + url);

        // Create the Host - much of your interaction with the library uses the
        // Host and methods you provide to it.
        window.host = new cast.player.api.Host(
            {'mediaElement': mediaElement, 'url': url});

        var ext = url.substring(url.lastIndexOf('.'), url.length);
        var initStart = event.data['media']['currentTime'] || 0;
        var autoplay = event.data['autoplay'] || true;
        var protocol = null;

        mediaElement.autoplay = autoplay; // Make sure autoplay gets set
        protocol = cast.player.api.CreateHlsStreamingProtocol(host);

        host.onError = function(errorCode) {
          console.log("Fatal Error - " + errorCode);
          if (window.player) {
            window.player.unload();
            window.player = null;
          }
        };

        // If you need cookies, then set withCredentials = true and also set any
        // header information you need. If you don't need them, there can be some
        // unexpected effects from setting this value.
        // host.updateSegmentRequestInfo = function(requestInfo) {
        //   requestInfo.withCredentials = true;
        // };

        console.log("we have protocol " + ext);
        if (protocol !== null) {
          console.log("Starting Media Player Library");
          window.player = new cast.player.api.Player(host);
          window.player.load(protocol, initStart);
        } else {
          window.defaultOnLoad(event); // do the default process
        }
      }
    }

    window.player = null;
    console.log('Application is ready, starting system');
    window.castReceiverManager = cast.receiver.CastReceiverManager.getInstance();
    castReceiverManager.start();
  </script>
</body>
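One thing I noticed while reviewing this: the code computes ext but then always creates the HLS protocol. A variant I was considering (just a sketch using the MPL protocol factory functions; I haven't verified that it changes anything) would pick the protocol by extension:

// Sketch: choose the streaming protocol from the URL suffix instead of
// always assuming HLS; protocol stays null for anything else, so
// window.defaultOnLoad(event) still handles non-adaptive media.
var protocol = null;
if (ext === '.m3u8') {
  protocol = cast.player.api.CreateHlsStreamingProtocol(host);
} else if (ext === '.ism') {
  protocol = cast.player.api.CreateSmoothStreamingProtocol(host);
} else if (ext === '.mpd') {
  protocol = cast.player.api.CreateDashStreamingProtocol(host);
}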
I've figured out that it's only possible to cast .m3u8, .ism and .mpd files with the Media Player Library. So I created an m3u8 file as follows, hosted it on Google Drive, and tried to cast it to my custom receiver.
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=173952
https://www.googledrive.com/host/0B1x31lLRAxTMRndtNkhSWVdGLVE
But it doesn't work. I used the Google Cast Developer Console to debug the custom receiver. When executing the
window.player.load(protocol, initStart);
command, I get a FATAL ERROR on the console.
I think the problem is in the custom receiver code, because the sender application from the Google documentation works fine with the styled media receiver.
Does anyone know this problem or see an issue in the custom receiver code? Does anyone have an idea how I could debug the styled media player? It would be much easier if I could see which messages are exchanged with the styled media player, but when I activate the debugging I can't see the exchanged messages.
If you turn on the debugging, you can see the message exchanges (see here, under the Debugging section). There is a full-fledged receiver sample project on our GitHub repo as well.

HTML5 video in WebView on Samsung S4 (Android 4.4.2) issue

I have a weird issue playing videos embedded in video tags in my PhoneGap 3.3 app on a Samsung S4 with Android 4.4.2.
I have the following HTML code:
<video id='myvideo' autoplay=1 preload='auto'></video>
The videos are started via JavaScript (with jQuery):
$('#myvideo').attr('src', 'http://...jpg');
$('#myvideo').attr('poster', 'http://...mp4');
$('#myvideo')[0].play();
The video does not start as expected.
The problem only appeared after upgrading from Android 4.2 to Android 4.4.2. The issue is not present on other smartphones running Android 4.4.2 (e.g. Nexus 4, 5).
Add this setting to your WebView settings:
webSettings.setMediaPlaybackRequiresUserGesture(false);
Here is my JS code to play the video automatically and loop on it (you can remove this part if you only want to play it once):
// here is my code to find my video tag
var myVideo = document.getElementsByTagName('video')[0];

// if a video tag is found (this js is in a separate js file) we play the video
if (typeof myVideo !== "undefined") {
  // setInterval lets me check every 40 milliseconds whether the video has ended;
  // if it has paused, we rewind the video and play it again
  setInterval(function() {
    if (myVideo.readyState != 0) {
      if (myVideo.paused) {
        myVideo.currentTime = 0;
        myVideo.play();
      }
    }
  }, 40);
}
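As an alternative to polling with setInterval, listening for the standard 'ended' event should also restart playback (a sketch; I have not verified it on the S4 specifically):

var myVideo = document.getElementsByTagName('video')[0];
if (typeof myVideo !== "undefined") {
  // Rewind and replay as soon as the video finishes, no polling needed.
  myVideo.addEventListener('ended', function() {
    myVideo.currentTime = 0;
    myVideo.play();
  });
}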
I think your .mp4 and .jpg are backwards.

Media.play not working synchronously in Android 4.3

I'm creating a smartphone app that is essentially a talking phrasebook, using App Framework 2 and Cordova 2.9, testing on a Samsung Galaxy S4. I found that after playing about 25-30 sound clips the sound stopped playing. The problem was that audio resources on Android are finite and were not being released after issuing media.play. I added a console log for each successful play, and this showed successful execution even when it had run out of resources.
So I added media.stop and media.release after the media.play (as discussed in several other questions on this forum and elsewhere) and found the sound stopped working completely... sigh.
On a hunch, I added an alert statement between the media.play and media.stop, and now sounds can be played as often as I want, as long as I click OK when the alert displays. I also noticed the console log entries were no longer being made at all. So it looks like the command after media.play is executed immediately rather than waiting for the sound to finish playing. Thus I issue media.play and, before it can play, the media.stop and media.release are executed, so I get no sound. This seems to be confirmed by the fact that the alert was displayed before the sound clip had finished playing, and tapping OK terminated the playback, presumably because the media.stop and media.release were then executed.
Am I misunderstanding how a synchronous call is supposed to work? Any suggestions to fix this would be much appreciated. Here's my JS code:
<!-- Get the web root path -->
<script type="text/javascript">
  function getWebRoot() {
    "use strict";
    var path = window.location.href;
    path = path.substring(0, path.lastIndexOf('/'));
    return path;
  }
</script>
<!-- /Get the web root path -->

<!-- Play sound using cordova -->
<script type="text/javascript">
  function soundclip() {
    var my_media = new Media(getWebRoot() + "/test.ogg", function() {
      console.log("my media found so stop and release after play");
    });
    my_media.play();
    if (my_media) {
      alert("my media found alert before stop and release");
      my_media.stop();
      my_media.release();
    } // End if my_media
  } // End soundclip
</script>
<!-- /Play sound using cordova -->
And here's the HTML:
<input type="button" onClick="soundclip()" value=" Play Ay Caramba 1">
Thanks for any suggestions, this is driving me nuts !!
