The code below shows how I am using audioplayers without audio_service:
playFromFile(List textList, seek) async {
  for (int i = seek; i < textList.length; i++) {
    setState(() {
      newSeek = i;
    });
    itemScrollController.scrollTo(
        index: i,
        duration: const Duration(seconds: 2),
        curve: Curves.easeInOutCubic);
    String text = textList[i].join(' ');
    final bytes = await getAudio(text);
    await audioPlugin.playBytes(bytes);
    while (audioPlugin.state == PlayerState.PLAYING && !playFresh) {
      await Future.delayed(const Duration(seconds: 1));
      if (audioPlugin.state == PlayerState.PLAYING) {
        audioPlugin.onPlayerCompletion.listen((onDone) async {
          audioPlugin.state = PlayerState.COMPLETED;
          await audioPlugin.release();
        });
      }
      if (audioPlugin.state == PlayerState.COMPLETED) {
        await audioPlugin.release();
        break;
      }
    }
    if (playFresh) break;
  }
}
I want to implement this code using audio_service so I can play audio in the background. How do I implement this with audio_service so it will keep playing in the background? Please help, as I am new to Flutter and have been unable to solve this for days.
The solution is quite simple: implement this using separate layers.
UI layer - your UI, where the play button is clicked.
Intermediate layer, or audio service layer (the audio service class) - from your UI layer you call this class's play method.
Final layer (the actual audio player methods) - the layer where the actual audio package methods reside. You call this layer from your intermediate layer.
Summary - UI layer play button -> audio service play method -> audio player play method
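A minimal sketch of that layering, assuming audio_service ^0.18 together with audioplayers (the class name MyAudioHandler, the notification channel values, and MyApp as the root widget are illustrative, not a drop-in implementation):

import 'package:audio_service/audio_service.dart';
import 'package:audioplayers/audioplayers.dart';
import 'package:flutter/material.dart';

// Final layer: the handler owns the actual player. Because it is registered
// with audio_service, playback keeps running in the background.
class MyAudioHandler extends BaseAudioHandler {
  final AudioPlayer _player = AudioPlayer();

  @override
  Future<void> play() async {
    // Move your existing audioplayers logic here instead of the widget.
    await _player.resume();
    playbackState.add(playbackState.value.copyWith(playing: true));
  }

  @override
  Future<void> pause() async {
    await _player.pause();
    playbackState.add(playbackState.value.copyWith(playing: false));
  }
}

// Intermediate layer: register the handler once, e.g. in main().
late final AudioHandler audioHandler;

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  audioHandler = await AudioService.init(
    builder: () => MyAudioHandler(),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.example.app.audio',
      androidNotificationChannelName: 'Audio playback',
    ),
  );
  runApp(const MyApp()); // MyApp is your root widget
}

// UI layer: the play button only talks to the handler, e.g.
// onPressed: () => audioHandler.play()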
The app plays some background audio (a wav file) in an endless loop. Audio playback was flawless using the just_audio package, but after switching to audioplayers, all looped audio now contains a short gap of silence before it restarts at the beginning. The question now is: how do I get rid of the gap?
The code to load and play the files is quite simple:
Future<void> loadAmbient() async {
  _audioAmbient = AudioPlayer();
  await _audioAmbient!.setSource(AssetSource('audio/ambient.wav'));
  await _audioAmbient!.setReleaseMode(ReleaseMode.loop);
}

void playAmbient() async {
  if (PlayerState.playing != _audioAmbient?.state) {
    await _audioAmbient?.seek(Duration.zero);
    _audioAmbient?.resume();
  }
}
According to the audioplayers docs, this seems to be a known issue:
The Getting Started docs state:
Note: there are caveats when looping audio without gaps. Depending on the file format and platform, when audioplayers uses the native implementation of the "looping" feature, there will be gaps between plays, which might not be noticeable for non-continuous SFX but will definitely be noticeable for looping songs. Please check out the Gapless Loop section on our Troubleshooting Guide for more details.
Following the link to the Troubleshooting Guide (https://github.com/bluefireteam/audioplayers/blob/main/troubleshooting.md) reveals:
Gapless Looping
Depending on the file format and platform, when audioplayers uses the native implementation of the "looping" feature, there will be gaps between plays, which might not be noticeable for non-continuous SFX but will definitely be noticeable for looping songs.
TODO(luan): break down alternatives here, low latency mode, audio pool, gapless_audioplayer, ocarina, etc
Interesting is the "Depending on the file format and platform, ..." part, which sounds as if gapless loops could somehow be doable. The question is: how?
Any ideas on how to get audio to loop flawlessly (i.e. without a gap) are very much appreciated. Thank you.
You can pass this parameter:
_audioAmbient = AudioPlayer(mode: PlayerMode.LOW_LATENCY);
If that does not solve it, you can try something else like https://pub.dev/packages/ocarina
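Note that this is the pre-1.0 audioplayers API. On audioplayers 1.x or newer (which the setSource / AssetSource calls in the question suggest), the equivalent, to the best of my knowledge, is setPlayerMode:

_audioAmbient = AudioPlayer();
await _audioAmbient!.setPlayerMode(PlayerMode.lowLatency);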
I fixed it this way:
I created two players, set the asset path on both of them, and created a method that starts the second player shortly before the first completes.
import 'dart:async';

import 'package:just_audio/just_audio.dart' as audio;

audio.AudioPlayer player1 = audio.AudioPlayer();
audio.AudioPlayer player2 = audio.AudioPlayer();
bool isFirstPlayerActive = true;
StreamSubscription? subscription;

audio.AudioPlayer getActivePlayer() {
  return isFirstPlayerActive ? player1 : player2;
}

void setPlayerAsset(String asset) async {
  if (isFirstPlayerActive) {
    await player1.setAsset("assets/$asset");
    await player2.setAsset("assets/$asset");
  } else {
    await player2.setAsset("assets/$asset");
    await player1.setAsset("assets/$asset");
  }
  subscription?.cancel();
  _loopPlayer(getActivePlayer(), isFirstPlayerActive ? player2 : player1);
}
// Note: the parameters shadow the top-level player1/player2 on purpose:
// "player1" here is the currently active player, "player2" the standby one.
void _loopPlayer(audio.AudioPlayer player1, audio.AudioPlayer player2) async {
  Duration? duration = await player1.durationFuture;
  subscription = player1.positionStream.listen((event) {
    // Start the standby player ~110 ms before the end to mask the gap.
    if (duration != null &&
        event.inMilliseconds >= duration.inMilliseconds - 110) {
      log('finished');
      player2.play();
      isFirstPlayerActive = !isFirstPlayerActive;
      StreamSubscription? tempSubscription;
      tempSubscription = player1.playerStateStream.listen((event) {
        if (event.processingState == audio.ProcessingState.completed) {
          log('completed');
          // Rewind and pause the finished player so it is ready for the next pass.
          player1.seek(const Duration(seconds: 0));
          player1.pause();
          tempSubscription?.cancel();
        }
      });
      subscription?.cancel();
      _loopPlayer(player2, player1);
    }
  });
}
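A hypothetical usage of the above, assuming the file lives under assets/ and is declared in pubspec.yaml:

setPlayerAsset('audio/ambient.wav');
getActivePlayer().play();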
My app downloads audio files to the device and has an audio player which plays that audio.
So I am trying to add a song to my playlist while audio is playing. I am using the Flutter package just_audio: ^0.9.7. I query all the songs from a specific folder on the device and use these two functions in initState:
Future<List<AudioSource>> _getList(List<SongModel> something) async {
  List<AudioSource> _list = [];
  something.forEach((element) {
    _list.add(
      AudioSource.uri(
        Uri.directory(element.data.toString()),
        tag: MediaItem(
          id: element.id.toString(),
          title: element.title.toString(),
        ),
      ),
    );
    print(element.data);
  });
  return _list;
}
Future<void> settingAudioPlayer() async {
  List<SongModel> something = [];
  something =
      await OnAudioQuery().querySongs(sortType: SongSortType.DATE_ADDED);
  something
      .removeWhere((element) => !element.data.contains('StrikleDustiness'));
  //print('${something[0].data}');
  var items = await _getList(something);
  await MainScreen.player.setAudioSource(
    ConcatenatingAudioSource(useLazyPreparation: true, children: items),
  );
}
By doing this I am able to play songs whenever I restart the app. But I want to add a song to the playlist whenever I download a new song while an existing song is playing, and I want the newly downloaded song to be next in the queue.
Sorry for the bad English, and also I am new to Flutter.
Thanks.
EDIT: I figured it out. I just had to create a separate list containing all the audio sources.
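For reference, a minimal sketch of that idea, assuming you keep a named reference to the ConcatenatingAudioSource (addDownloadedSong is an illustrative helper, not part of just_audio):

// Keep a reference to the playlist instead of building it inline.
final playlist = ConcatenatingAudioSource(useLazyPreparation: true, children: items);
await MainScreen.player.setAudioSource(playlist);

// Call this after a download finishes; insert() updates the queue
// without interrupting the item that is currently playing.
Future<void> addDownloadedSong(SongModel song) async {
  final source = AudioSource.uri(
    Uri.file(song.data), // assuming SongModel.data is a plain file path
    tag: MediaItem(id: song.id.toString(), title: song.title),
  );
  final current = MainScreen.player.currentIndex;
  // Put the new song right after the current one, or at the start if idle.
  await playlist.insert((current ?? -1) + 1, source);
}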
Guys, I'm building a music app in Flutter and I have a problem which I don't know how to resolve, so it would be great if someone could help me out on this.
The thing is: I have songs that are stored on CloudFront, and they work perfectly:
play(song, context) async {
  int result = await audioPlayer.play(song); // path to the song
  setState(() {
    isPlaying = true;
  });
  // int seek = await audioPlayer.seek(Duration(minutes: 10, seconds: 10));
  if (result == 1) {
    return 'success';
  }
}
but when I give it another path, from Podbean, it will not start playing: "https://mcdn.podbean.com/mf/play/9i7qz8/Japji_Sahib_Pauri_3.mp3" (this link plays fine in a browser).
The error message shows after some time on the initial play of the podcast.
Any idea what I can do about this problem?
Thanks in advance.
I use navigator.mediaDevices.enumerateDevices to retrieve the list of all video devices (element.kind === 'videoinput') and then call navigator.mediaDevices.getUserMedia(constraints) to rotate video devices (using deviceId as a constraint). Everything works fine on Windows Chrome / Firefox, but on Android phones (tried Samsung, Asus, Huawei with Android 8/9) this call fails for the back camera with NotReadableError / "Could not start video source" (Chrome) or AbortError / "Starting video failed" (Firefox).
Strangely, the same code works fine on iOS / Safari.
Also, this only happens when a WebRTC call is present in the browser. If there is no call, I can select any video device.
Also, if I select the back camera first and try to establish the call, it does not work; I get a similar error.
I know it's far-fetched, but maybe someone has had the same or a similar issue?
All browser versions are up to date.
[UPDATE - code snippet and log]
switchCamera() {
  try {
    if (this.localStream) {
      const tracks = this.localStream.getTracks();
      console.log('switchCamera stopping this.localStream tracks', tracks);
      tracks.forEach((track: MediaStreamTrack) => {
        console.log('switchCamera stopping track', track);
        track.stop();
      });
      console.log('switchCamera stop stream');
    }
    const constraints = {
      audio: true,
      video: { facingMode: this.faceCamera ? 'environment' : 'face' }
    };
    this.faceCamera = !this.faceCamera;
    console.log('switchCamera constraints: ', constraints);
    navigator.mediaDevices.getUserMedia(constraints)
      .then(stream => {
        console.log('getUserMedia:', stream);
        this.logText('got stream');
        this.localVideo.srcObject = stream;
        const videoTracks = stream.getVideoTracks();
        const audioTracks = stream.getAudioTracks();
        console.log('videoTracks', videoTracks);
        if (videoTracks.length > 0) {
          console.log(`Using video device: ${videoTracks[0].label}`);
        }
        const videoTrack = videoTracks[0];
        const audioTrack = audioTracks[0];
        console.log('Replacing track for pc', videoTrack, audioTrack);
        const pc = this.session.sessionDescriptionHandler.peerConnection;
        const videoSender = pc.getSenders().find(s => {
          return s.track && s.track.kind === videoTrack.kind;
        });
        const audioSender = pc.getSenders().find(s => {
          return s.track && s.track.kind === audioTrack.kind;
        });
        if (videoSender) {
          console.log('videoSender.replaceTrack', videoTrack);
          videoSender.replaceTrack(videoTrack);
        }
        if (audioSender) {
          console.log('audioSender.replaceTrack', audioTrack);
          audioSender.replaceTrack(audioTrack);
        }
      })
      .catch(e => {
        console.log('getUserMedia error:', e.name, e.code, e.message);
      });
  } catch (e) {
    window.alert(e);
  }
}
This is from the Chrome remote device debug log: the error is "NotReadableError" / "Could not start video source", which means that the underlying device handle could not be obtained by Chrome.
Again, Safari/iOS works fine.
For mobile devices, there is a dedicated way to select between the front and back cameras:
VideoFacingMode - https://www.w3.org/TR/mediacapture-streams/#dom-videofacingmodeenum
TL;DR
window.navigator.mediaDevices.enumerateDevices().then(devices => {
  if (devices.filter(device => device.kind === 'videoinput').length > 1) {
    navigator.mediaDevices.getUserMedia({video: {facingMode: 'user' /*'environment'*/}}).then(console.log.bind(this))
  }
})
It works for mobile Safari, Chrome and FF.
NOTE
Remember to stop the previous video track before calling getUserMedia with video again; otherwise, you will get an exception.
OK, so I narrowed it down to calling navigator.mediaDevices.getUserMedia() in ngOnInit() (this is an Angular app).
Even if I remove all the code in the .then() handler function, the effect is the same.
Only removing this call solves the issue.
I am not sure yet why it behaves this way; I will investigate more thoroughly and update.
To switch between the front and back cameras on mobile, you need to stop the previous stream before opening a new one:
if (videoIn.srcObject) {
  videoIn.srcObject.getTracks().forEach((track) => {
    track.stop();
  });
}
How does Media.release() work? Looking at the docs, it feels like you have to use it like this:
MediaService.loadMedia('sounds/connection-error.wav').then(function(media){
  media.play();
  media.release();
});
But I have googled enough to know that this is wrong. We have to explicitly release the core instances on Android.
But how do we do that? If I have 8 views in my app and I play a sound file on each of those views, does that count as 8 core instances being used? And can I go back to, say, view number 1 and play the sound associated with that view again? If so, would that count as a 9th instance?
Calling media.release() straight away, as above, does not play any sound at all.
The most common way to play sounds using the Cordova Media plugin is the following:
function playAudio(src) {
  // HTML5 Audio
  if (typeof Audio != "undefined") {
    new Audio(src).play();
  // Phonegap media
  } else if (typeof device != "undefined") {
    // Android needs the search path explicitly specified
    if (device.platform == 'Android') {
      src = '/android_asset/www/' + src;
    }
    var mediaRes = new Media(src,
      function onSuccess() {
        // release the media resource once finished playing
        mediaRes.release();
      },
      function onError(e) {
        console.log("error playing sound: " + JSON.stringify(e));
      });
    mediaRes.play();
  } else {
    console.log("no sound API to play: " + src);
  }
}