I am new to Flutter and I want to play an audio file from a URL, with play, pause and seek buttons, and also show a notification in the player.
You can play a URL in just_audio like this:
import 'package:just_audio/just_audio.dart';

final player = AudioPlayer();
// Load the audio from a URL, then control playback:
await player.setUrl('https://example.com/song.mp3');
player.play();
player.pause();
player.seek(Duration(seconds: 143));
To add notification support, the easiest way is to add just_audio_background. You need to change the above code slightly so that instead of calling setUrl, you now do this:
await player.setAudioSource(AudioSource.uri(
  Uri.parse('https://example.com/song.mp3'),
  tag: MediaItem(
    id: 'Some unique ID',
    title: 'Song title',
    album: 'Song album',
    artUri: Uri.parse('https://example.com/art.jpg'),
  ),
));
Now once that song starts playing, the supplied metadata will also be shown in the notification.
just_audio_background must also be initialised in your main:
Future<void> main() async {
  await JustAudioBackground.init(/* See API for options */);
  runApp(MyApp());
}
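For reference, a typical set of options looks like this (the channel id is a placeholder; see the just_audio_background API docs for the full list):

await JustAudioBackground.init(
  androidNotificationChannelId: 'com.example.app.channel.audio',
  androidNotificationChannelName: 'Audio playback',
  androidNotificationOngoing: true,
);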
And don't forget to follow the platform-specific setup instructions for each plugin:
just_audio
just_audio_background
Note that just_audio_background uses the audio_service plugin under the hood, so if your app has more complex requirements, you could use that plugin directly.
If you have questions about how to build the actual UI, you can create a separate question on that, or you can look at the above two links because each plugin includes an example app which demonstrates how to link it all up in a UI.
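As a starting point, here is a minimal sketch of what such a UI could look like, driven by just_audio's streams; the widget, its name, and the layout are illustrative assumptions, not code from either plugin:

import 'package:flutter/material.dart';
import 'package:just_audio/just_audio.dart';

// Minimal play/pause/seek UI driven by the player's streams.
class MiniPlayer extends StatelessWidget {
  final AudioPlayer player;
  const MiniPlayer({super.key, required this.player});

  @override
  Widget build(BuildContext context) {
    return Row(
      children: [
        // Toggle between play and pause based on the current player state.
        StreamBuilder<PlayerState>(
          stream: player.playerStateStream,
          builder: (context, snapshot) {
            final playing = snapshot.data?.playing ?? false;
            return IconButton(
              icon: Icon(playing ? Icons.pause : Icons.play_arrow),
              onPressed: playing ? player.pause : player.play,
            );
          },
        ),
        // Seek bar bound to the position stream.
        Expanded(
          child: StreamBuilder<Duration>(
            stream: player.positionStream,
            builder: (context, snapshot) {
              final position = snapshot.data ?? Duration.zero;
              final duration = player.duration ?? Duration.zero;
              return Slider(
                max: duration.inMilliseconds.toDouble(),
                value: position.inMilliseconds
                    .clamp(0, duration.inMilliseconds)
                    .toDouble(),
                onChanged: (value) =>
                    player.seek(Duration(milliseconds: value.round())),
              );
            },
          ),
        ),
      ],
    );
  }
}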
2022 Update
After trying the above answers, I was getting this error:
Error: Expected a value of type 'Source', but got one of type 'string'
Here is how I managed to solve it:
Declare a variable of type Source:
AudioPlayer myAudioPlayer = AudioPlayer();
late Source audioUrl;
Then initialize it in initState:
audioUrl = UrlSource('https://dummyurl.com/audio1.mp3');
and then you can use it to play audio like this:
myAudioPlayer.play(audioUrl);
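Putting it together, here is a minimal sketch of play, pause, resume and seek with the post-1.0 audioplayers API (the URL is a placeholder):

import 'package:audioplayers/audioplayers.dart';

final AudioPlayer myAudioPlayer = AudioPlayer();

Future<void> demo() async {
  final Source audioUrl = UrlSource('https://dummyurl.com/audio1.mp3');
  await myAudioPlayer.play(audioUrl); // start playback
  await myAudioPlayer.pause(); // pause
  await myAudioPlayer.resume(); // resume from the same position
  await myAudioPlayer.seek(Duration(seconds: 10)); // jump to 0:10
}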
Open the AndroidManifest.xml file and enable the internet permission and usesCleartextTraffic:
android\app\src\main\AndroidManifest.xml
Add the following two lines to enable the internet permission and usesCleartextTraffic.
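For reference, these are the standard manifest entries: the internet permission goes inside the <manifest> element, and usesCleartextTraffic is an attribute on the existing <application> element (the rest of the manifest stays unchanged):

<uses-permission android:name="android.permission.INTERNET" />
<application android:usesCleartextTraffic="true">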
You can use the audioplayers package.
Define an AudioPlayer instance like this:
AudioPlayer audioPlayer = AudioPlayer();
Play audio as shown below:
play() async {
  int result = await audioPlayer.play(url); // the pre-1.0 audioplayers API returns an int result code
  if (result == 1) {
    // success
  }
}
You can pause and stop like this
await audioPlayer.pause();
await audioPlayer.stop();
This is how you can play local audio
playLocal() async {
  int result = await audioPlayer.play(localPath, isLocal: true);
}
I recommend using the audio_service package. It is highly customizable with notifications, and you can also play audio in the background or in sleep mode.
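For reference, here is a minimal sketch of how audio_service is wired up; the handler class, the channel values, and the use of just_audio inside it are illustrative assumptions, reduced to the essentials:

import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';

// A bare-bones handler that forwards transport controls to just_audio.
class MyAudioHandler extends BaseAudioHandler with SeekHandler {
  final _player = AudioPlayer();

  @override
  Future<void> play() => _player.play();

  @override
  Future<void> pause() => _player.pause();

  @override
  Future<void> seek(Duration position) => _player.seek(position);
}

Future<void> main() async {
  final handler = await AudioService.init(
    builder: () => MyAudioHandler(),
    config: AudioServiceConfig(
      androidNotificationChannelId: 'com.example.app.channel.audio',
      androidNotificationChannelName: 'Audio playback',
    ),
  );
  // runApp(...) would follow here, with the handler exposed to your UI.
}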
I'm using the firebase_messaging package in my Flutter app and it works perfectly for everything, except that when the app is in the background and I get a notification, it only opens the app and never does what I ask it to do after opening the app. Here is my code:
Future<void> _firebaseMessagingBackgroundHandler(RemoteMessage message) async {
  // If you're going to use other Firebase services in the background, such as Firestore,
  // make sure you call `initializeApp` before using other Firebase services.
  await Firebase.initializeApp();
  print("Handling a background message: ${message.messageId}");
  final dynamic data = message.data;
  print("data is $data");
  if (data.containsKey('request_id')) {
    print("we are inside ..!");
    RegExp exp = RegExp(r'\d+');
    var id = int.parse(exp.firstMatch(data['request_id']!)!.group(0)!);
    OpenRequestsController openRequestsController = Get.isRegistered<OpenRequestsController>() ? Get.find() : Get.put(OpenRequestsController());
    OpenRequest matchingRequest = openRequestsController.openRequestsList.firstWhere((request) => request.id == id);
    print("the request from the list is $matchingRequest");
    openRequestsController.openRequestDetails(matchingRequest, false);
  }
}
What happens here is that it tries to run the whole function when the message is received, not when the notification is clicked, and of course it fails because the app is not already running in the foreground.
For opening the application and moving to the desired screen, you need to listen to onMessageOpenedApp in the initState of your first stateful widget; this is what runs when the app is opened by clicking a notification. The code should look like this:
FirebaseMessaging.onMessageOpenedApp.listen((message) {
  Get.to(() => const MainScreen()); // add logic here
});
It is better if you assign identifiers for the type of screen you want to open when clicking a particular notification (set while sending the notification) and implement an if-else or switch statement here based on the message type, as sketched below.
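A minimal sketch of that idea; the 'type' key and the screen widgets are assumptions about your notification payload, not part of the firebase_messaging API:

FirebaseMessaging.onMessageOpenedApp.listen((message) {
  // 'type' is a custom key you would include in the notification's data payload.
  switch (message.data['type']) {
    case 'open_request':
      Get.to(() => const OpenRequestScreen()); // hypothetical screen
      break;
    default:
      Get.to(() => const MainScreen());
  }
});

Note that if the app was terminated rather than backgrounded, the tapped notification is delivered through FirebaseMessaging.instance.getInitialMessage() instead, so the same routing logic should be applied there too.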
Thanks
Maybe you can give me a hand. I've been looking everywhere for information in order to solve this problem of mine, but nothing: I can't use the player with Cordova...
In practice I can't load mp3 files, whether local or remote; I always get this error: "Error: buffer is either not set or not loaded", and the files are there at the given path.
I tried using Tone's oscillator to check whether I was having trouble loading Tone itself, but that works fine.
Could it depend on some particular permission that I might be missing?
For example, using the Cordova media plugin I can play the audio, but I need to use Tone.js.
Do you have any ideas what this might depend on, or what I could try to do?
Even this simple example does not work after deviceready:
const player = new Tone.Player("https://tonejs.github.io/audio/berklee/gong_1.mp3").toDestination();
Tone.loaded().then(() => {
  player.start();
});
If you have already worked with Tone and Cordova, give me hope that it can be used in some way, and let me know if you have any ideas. Thanks in advance!
--- UPDATE:
Currently, when the app starts, I do the following (note that this is a port of code developed for a website, where everything loads and works correctly):
let myLoader_File = function functionOne(test) {
  var androidFixPath = cordova.file.applicationDirectory;
  console.log("Entered function");
  return new Promise((resolve, reject) => {
    let CaricatorePlayers = new Tone.Players({
      Beep: androidFixPath + "sample/Beep.mp3",
      Clave: androidFixPath + "sample/Clave.mp3",
    }).toDestination();
    resolve(
      CaricatorePlayers
      // "ok, all the files have been loaded!"
    );
    reject("an error occurred!");
  });
};
function onDeviceReady() {
  $('#fade-wrapper-flash').fadeOut(1000);
  $('#fade-wrapper').fadeIn();
  // attach a click listener to a start button
  document.querySelector('#fade-wrapper').addEventListener('click', async () => {
    await Tone.start();
    console.log('audio is ready');
    myLoader_File().then((res) => {
      console.log(`The function received with value ${res}`);
      MultiPlayers = res;
      console.log(MultiPlayers);
      try {
        // Play
        MultiPlayers.player("Beep").start();
        $('#fade-wrapper').fadeOut();
        preLoad();
      } catch (error) {
        console.log(error);
        console.log("ERROR, RELOADING"); // was: "IN ERRORE RICARICO"
      }
    }).catch((error) => {
      console.log(`Handling error as we received ${error}`);
      console.log("ERROR, RELOADING"); // was: "IN ERRORE RICARICO"
      $('#fade-wrapper').fadeIn();
    });
  });
}
I posted an answer, here, which also solves this problem.
I don't know if this solution is the only possible way, or the best, but in order to access the folders locally I made these changes to the call: using an XMLHttpRequest object, which can be used to request data from a web server, I got a Blob object, which I then passed to the Tone player.
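A minimal sketch of that approach, assuming the same Cordova file paths as above (the function name is illustrative):

// Fetch a local file as a Blob via XMLHttpRequest, then hand a temporary
// object URL to Tone.Player instead of the raw file path.
function loadPlayerFromLocalFile(path) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', path, true);
    xhr.responseType = 'blob';
    xhr.onload = () => {
      const blobUrl = URL.createObjectURL(xhr.response);
      // Resolve only once the player's buffer has actually loaded.
      const player = new Tone.Player(blobUrl, () => resolve(player)).toDestination();
    };
    xhr.onerror = () => reject(new Error('could not load ' + path));
    xhr.send();
  });
}

Usage would then look like:
loadPlayerFromLocalFile(cordova.file.applicationDirectory + "sample/Beep.mp3")
  .then((player) => player.start());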
I'm currently attempting to implement an audio player in React Native using Expo's Audio module. Everything works as expected on Android, but on iOS the onPlaybackStatusUpdate callback does not appear to fire. I've pasted the relevant snippet below.
import { Audio, INTERRUPTION_MODE_IOS_DO_NOT_MIX, INTERRUPTION_MODE_ANDROID_DO_NOT_MIX } from 'expo-av';

Audio.setAudioModeAsync({
  playsInSilentModeIOS: true,
  interruptionModeIOS: INTERRUPTION_MODE_IOS_DO_NOT_MIX,
  interruptionModeAndroid: INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
});
const { sound } = await Audio.Sound.createAsync({ uri });
await sound.setProgressUpdateIntervalAsync(100);
sound.setOnPlaybackStatusUpdate((data) => {
  if (data.isLoaded) {
    setCurrentPosition(data.positionMillis);
    setIsPlaying(data.isPlaying);
    setClipDuration(data.durationMillis || 0);
    if (data.didJustFinish && !data.isLooping) {
      sound.setPositionAsync(0);
      options.onPlaybackComplete();
    }
  }
});
On iOS, the callback is simply never called, so no statuses etc. are set.
I'm using the current position to update a slider higher up in my component tree. Again, this all works exactly as expected on Android. Any idea what could be happening here?
I am using the Flutter video_player plugin for camera video streaming. The camera stream comes from ESP32Cam hardware.
The ESP32Cam streams video over the network using the HTTP protocol, in MJPEG format.
Verified in the VLC media player; the codec information is as follows:
Codec: Motion JPEG Video (MJPG)
Decoded format: Planar 4:2:2 YUV full scale
What configuration is required in the video_player plugin to stream this video?
Here is my Flutter code for the streaming initialization:
late VideoPlayerController _controller;

_controller = VideoPlayerController.network(
  //'https://www.sample-videos.com/video123/mp4/720/big_buck_bunny_720p_20mb.mp4'
  "http://192.168.216.40",
  // formatHint: VideoFormat.hls,
)..initialize().then((_) {
    print("Streaming initialized...");
    // Ensure the first frame is shown after the video is initialized, even before the play button has been pressed.
  });
Then I am using the VideoPlayer widget inside my Container widget:
Container(
  child: VideoPlayer(_controller),
),
ESP32Cam setup for reference link
You can use a live-streaming plugin:
without null safety: https://pub.dev/packages/mjpeg
with null safety: https://pub.dev/packages/flutter_mjpeg
You can add the plugin from the above answer to your pubspec.yaml file and do the following steps (a complete sketch follows below):
Add a new dummy variable bool isRunning = true; in State<YourStateName>.
Put the dummy variable in Mjpeg's isLive parameter -> child: Mjpeg(isLive: isRunning, stream: 'replace with your local streaming URL')
You can use this link for more information: https://youtu.be/2OjO6K5QuYs
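Putting those steps together, a minimal sketch using flutter_mjpeg (the widget name and the stream URL are placeholders; point the stream at your ESP32Cam address):

import 'package:flutter/material.dart';
import 'package:flutter_mjpeg/flutter_mjpeg.dart';

class StreamPage extends StatefulWidget {
  const StreamPage({super.key});

  @override
  State<StreamPage> createState() => _StreamPageState();
}

class _StreamPageState extends State<StreamPage> {
  bool isRunning = true; // the "dummy variable" from the steps above

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(
        child: Mjpeg(
          isLive: isRunning,
          stream: 'http://192.168.216.40', // replace with your streaming URL
        ),
      ),
    );
  }
}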
I have an Android app with a CameraActivity which periodically runs a tflite classifier on image frames from the preview stream. The implementation of the camera and tflite works great on the Android side and gives a good FPS.
I want to show this CameraActivity in my Flutter app as a screen. The Flutter app already has all the frontend and UI implemented.
I've already tried using the official Flutter camera plugin to implement the same thing with camera.startImageStream, but I was unable to get a matching FPS, and the camera preview lags when calling the tflite model asynchronously through a MethodChannel.
I also came across AndroidView, which embeds an Android view in the widget hierarchy, but the docs say it is an expensive operation and should be avoided when a Flutter equivalent is possible.
Is there a way to write a plugin for showing the CameraActivity (i.e. the UI) on the Flutter end, similar to the way a MethodChannel is used for exchanging data between Flutter and native code? Or if there's another possible way of achieving this, please let me know.
Thanks in advance!
// Flutter side
import 'dart:io' show Platform;
import 'package:flutter/services.dart';

Future<void> showNativeView() async {
  if (Platform.isAndroid) { // check if the platform is Android
    var methodChannel = MethodChannel("methodchannelname"); // create a method channel by name
    await methodChannel.invokeMethod('showNativeCameraView'); // invoke the named method
  }
}
Container(
  child: InkWell(
    onTap: () {
      showNativeView(); // call to open the native Android activity
    },
    child: Center(
      child: Text("takeimage"),
    ),
  ),
)
// Android side
// inside MainActivity's onCreate
val channel = MethodChannel(flutterView, "methodchannelname") // same channel name as on the Flutter side
channel.setMethodCallHandler { call, result ->
    if (call.method == "showNativeCameraView") { // same method name as on the Flutter side
        // The native camera activity code can be added in the android folder of the Flutter application.
        val intent = Intent(this, ActivityCamera::class.java)
        startActivity(intent)
        result.success(true)
    } else {
        result.notImplemented()
    }
}
This link will help you further:
https://medium.com/@Chetan/flutter-communicate-with-android-activity-or-ios-viewcontroller-through-method-channel-c11704429cd0