I have written a Dart web app that retrieves .mp3 files from a server and plays them back; I am trying to write a mobile version using Flutter. I know dart:web_audio is the main option for a web app, but Flutter can't find it in my SDK. I know it's there, because I can compile the following to JavaScript:
import 'dart:html';
import 'dart:convert';
import 'dart:web_audio';
AudioContext audioContext;
main() async {
  audioContext = new AudioContext();
  var ul = (querySelector('#songs') as UListElement);
  var signal = await HttpRequest.getString('http://10.0.0.6:8000/api/filelist');
  // Map json = JSON.decode(signal);
  // for (Map file in json['songs']) {
  print("signal: $signal");
  Map json = JSON.decode(signal);
  for (Map file in json['songs']) {
    var li = new LIElement()
      ..appendText(file['title']);
    var button = new ButtonElement();
    button.setAttribute("id", "#${file['file']}");
    button.appendText("Play");
    li.append(button);
    new Song(button, file['file']);
    ul.append(li);
  }
}

class Song {
  ButtonElement button;
  bool _playing = false;
  // AudioContext _audioContext;
  AudioBufferSourceNode _source;
  String title;

  Song(this.button, this.title) {
    button..onClick.listen((e) => _toggle());
  }

  _toggle() {
    _playing = !_playing;
    _playing ? _start() : _stop();
  }

  _start() {
    return HttpRequest
        .request("http://10.0.0.6:8000/music/$title", responseType: "arraybuffer")
        .then((HttpRequest httpRequest) {
      return audioContext
          .decodeAudioData(httpRequest.response)
          .then((AudioBuffer buffer) {
        _source = audioContext.createBufferSource();
        _source.buffer = buffer;
        _source.connectNode(audioContext.destination);
        _source.start(0);
        button.text = "Stop";
        _source.onEnded.listen((e) {
          _playing = false;
          button.text = "Play";
        });
      });
    });
  }

  _stop() {
    _source.stop(0);
    button.text = "Play";
  }
}
How would I rewrite the dart:web_audio parts of my code for a Flutter app? Can Flutter access MediaPlayer? And if so, how would I refer to it in pubspec.yaml?
As raju-bitter noted above, Flutter used to provide some built-in audio wrappers in its core engine, but those have since been removed: https://github.com/flutter/flutter/issues/1364.
Apps built with Flutter are just iOS or Android apps, so anything the underlying iOS/Android can do is reachable from Flutter by writing some Java or Obj-C code, following the hello_services model (https://github.com/flutter/flutter/tree/master/examples/hello_services). This model is documented at https://flutter.io/platform-services. It's not nearly as easy as we'd like it to be yet; many improvements are coming soon.
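For reference, the modern form of that model is a platform channel. Below is a minimal Dart-side sketch, not part of the original answer: the channel name, method name, and argument key are made up for illustration, and the matching Java/Obj-C handler (which would drive Android's MediaPlayer or iOS's AVAudioPlayer) is assumed to be registered under the same channel name in the host project.
import 'package:flutter/services.dart';

// Hypothetical channel name; it only has to match the one the
// native (Java/Obj-C) side registers.
const MethodChannel _audioChannel = MethodChannel('example.app/audio');

Future<void> playRemoteMp3(String url) async {
  // Hands the URL to the native side, which can feed it to
  // MediaPlayer on Android or AVAudioPlayer on iOS.
  await _audioChannel.invokeMethod('play', <String, String>{'url': url});
}
Nothing needs to be declared in pubspec.yaml for this; the native code lives in the project's android/ and ios/ folders rather than in a package.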
I know it's 4 years late, but I have found the audioplayers package, which can be used as follows:
import 'package:audioplayers/audio_cache.dart';
import 'package:audioplayers/audioplayers.dart';

// Call this function from an event
void playRemoteFile() {
  AudioPlayer player = new AudioPlayer();
  player.play("https://luan.xyz/files/audio/ambient_c_motion.mp3");
}
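Note that in audioplayers 1.0 and later the API changed: remote URLs are wrapped in a UrlSource. A rough equivalent under the newer API (a sketch, assuming audioplayers >= 1.0) looks like this:
import 'package:audioplayers/audioplayers.dart';

Future<void> playRemoteFile() async {
  final player = AudioPlayer();
  // In the 1.x API, play() takes a Source; UrlSource streams from a URL.
  await player.play(UrlSource('https://luan.xyz/files/audio/ambient_c_motion.mp3'));
}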
Related
I have a small MAUI app I'm testing with. I'm trying to read a file that was part of the deployment. I have the code below, which works great in a Windows deploy of the MAUI app but crashes on Android. What is the proper cross-platform way to do this?
// TODO get from service or xml
var path = AppDomain.CurrentDomain.BaseDirectory;
//var path = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetEntryAssembly().Location);
var fullpath = Path.Combine(path, "Services\\questions.json");
var json = File.ReadAllText(fullpath);
MAUI has a new way to access files included with the app: MauiAsset.
Described in the blog post Announcing .NET MAUI Preview 4, under Raw Assets:
.NET MAUI now makes it very easy to add other assets to your project and reference them directly while retaining platform-native performance. For example, if you want to display a static HTML file in a WebView you can add the file to your project and annotate it as a MauiAsset in the properties.
<MauiAsset Include="Resources\Raw\index.html" />
Tip: you can also use wildcards to enable all files in a directory:
... Include="Resources\Raw\*" ...
Then you can use it in your application by filename.
<WebView Source="index.html" />
UPDATE
However, the MauiAsset feature apparently still needs improvement:
open issue - MauiAsset is very hard to use.
There we learn that for now:
Set BuildAction in each file's properties to MauiAsset.
That is, it's not recommended to use the "wildcard" approach at this time. Set that build action on each file in Solution Explorer / your project / the file.
Accessing on Windows requires a workaround:
#if WINDOWS
var stream = await Microsoft.Maui.Essentials.FileSystem.OpenAppPackageFileAsync("Assets/" + filePath);
#else
var stream = await Microsoft.Maui.Essentials.FileSystem.OpenAppPackageFileAsync(filePath);
#endif
NOTE: This will be simplified at some point; follow that issue to see progress.
UPDATE
The current MAUI template is missing some platform-specific flags. For now, add your own flag to identify when the code is running on Windows:
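One way to do that (an assumption about your project setup, not something from the original answer) is to define the symbol in the .csproj only for the Windows target:
<!-- Hypothetical .csproj addition: defines WINDOWS only when building the
     Windows TargetFramework, so the #if WINDOWS branches compile as intended. -->
<PropertyGroup Condition="$(TargetFramework.Contains('-windows'))">
  <DefineConstants>$(DefineConstants);WINDOWS</DefineConstants>
</PropertyGroup>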
Complete example in ToolmakerSteve - repo MauiSOAnswers. See MauiAssetPage.
MauiAssetPage.xaml:
<ContentPage xmlns="http://schemas.microsoft.com/dotnet/2021/maui"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             x:Class="MauiTests.MauiAssetPage">
    <ContentPage.Content>
        <!-- By the time Maui is released, this is all you will need. -->
        <!-- The Init code-behind won't be needed. -->
        <WebView x:Name="MyWebView" Source="TestWeb.html" />
    </ContentPage.Content>
</ContentPage>
MauiAssetPage.xaml.cs:
using Microsoft.Maui.Controls;
using System.Threading.Tasks;

namespace MauiTests
{
    public partial class MauiAssetPage : ContentPage
    {
        public MauiAssetPage()
        {
            InitializeComponent();
            Device.BeginInvokeOnMainThread(async () =>
            {
                await InitAsync();
            });
        }

        private async Task InitAsync()
        {
            string filePath = "TestWeb.html";
#if WINDOWS
            var stream = await Microsoft.Maui.Essentials.FileSystem.OpenAppPackageFileAsync("Assets/" + filePath);
#else
            var stream = await Microsoft.Maui.Essentials.FileSystem.OpenAppPackageFileAsync(filePath);
#endif
            if (stream != null)
            {
                string s = (new System.IO.StreamReader(stream)).ReadToEnd();
                this.MyWebView.Source = new HtmlWebViewSource { Html = s };
            }
        }
    }
}
TestWeb.html:
(whatever html you want)
In Solution Explorer, add TestWeb.html to your project. In its Properties, select Build Action = MauiAsset.
I tried looking for a solution to this for months. I ended up hosting the file online, then creating a method to download the file at runtime:
public async Task DownloadFile(string fileName)
{
    if (File.Exists(FileSystem.Current.AppDataDirectory + $"/{fileName}"))
    {
        return;
    }
    else
    {
        try
        {
            NetworkAccess networkAccess = Connectivity.Current.NetworkAccess;
            if (networkAccess == NetworkAccess.Internet)
            {
                await Task.Run(() =>
                {
                    var uri = new Uri($"https://myhostedfile.com/{fileName}");
                    WebClient webClient = new WebClient();
                    webClient.DownloadFileCompleted += new AsyncCompletedEventHandler(DownloadFileCallback2); // checking if download is complete
                    webClient.DownloadProgressChanged += new DownloadProgressChangedEventHandler(MaintainProgress); // event handler to check download progress
                    webClient.DownloadFileAsync(uri, FileSystem.Current.AppDataDirectory + $"/{fileName}");
                });
            }
            else
            {
                await Shell.Current.DisplayAlert("No Internet", "Failed to get some files from the internet, confirm if your internet is " +
                    "working", "OK");
            }
        }
        catch (Exception)
        {
            await Shell.Current.DisplayAlert("Error", "Failed to get some files from the internet, confirm if your internet is " +
                "working", "OK");
        }
    }
}
Then you can access your file URL using:
string filePath = FileSystem.Current.AppDataDirectory + "/myfile.pdf";
I'm trying to call the Google Speech-to-Text API, but it always returns a null result. I got the implementation hint from this answer:
Using gcloud speech api for real-time speech recognition in dart, flutter
I'm using the flutter_sound package (https://pub.dev/packages/flutter_sound) to record audio and then send the base64-encoded audio to the Speech API.
Code for recording audio
String path = await flutterSound.startRecorder(
  Platform.isIOS ? 'ios.' : 'android.aac',
  androidEncoder: AndroidEncoder.AAC,
  sampleRate: 16000,
  numChannels: 1,
  androidAudioSource: AndroidAudioSource.MIC,
);
print('startRecorder: $path');
The audio file android.aac (with the .aac extension) is generated successfully by the code above.
The code below is used to send the audio data to the Speech API:
final _credentials = new ServiceAccountCredentials.fromJson(r'''
{
"type": "service_account",
"project_id": "",
"private_key_id": "",
....
''');
final _SCOPES = const [SpeechApi.CloudPlatformScope];
void convert() async {
  clientViaServiceAccount(_credentials, _SCOPES).then((http_client) {
    var speech = new SpeechApi(http_client);
    try {
      String myPath = _path;
      _readFileByte(myPath).then((bytesData) async {
        String audioString = base64.encode(bytesData);
        print('audioString: $audioString');
        String audioStringSample = "";
        RecognizeRequest r = RecognizeRequest();
        RecognitionAudio audio =
            RecognitionAudio.fromJson({'content': audioString});
        r.audio = audio;
        RecognitionConfig config = RecognitionConfig.fromJson({
          'languageCode': 'en-US',
          'encoding': 'LINEAR16',
          'sampleRateHertz': 16000,
        });
        r.config = config;
        speech.speech.recognize(r).then((results) {
          for (var result in results.results) {
            print(result.alternatives[0].transcript);
          }
        });
      });
    } catch (e) {
      // if path invalid or not able to read
      print(e);
    }
  });
}
Future<Uint8List> _readFileByte(String filePath) async {
  Uri myUri = Uri.parse(filePath);
  File audioFile = File.fromUri(myUri);
  Uint8List bytes;
  await audioFile.readAsBytes().then((value) {
    bytes = Uint8List.fromList(value);
    print('reading of bytes is completed');
  }).catchError((onError) {
    print('Exception Error while reading audio from path:' +
        onError.toString());
  });
  return bytes;
}
The above code works perfectly with audioStringSample (find sample audio content here: https://gist.github.com/DazWilkin/34d628b998b4266be818ffb3efd688aa), but when I pass my own audio, i.e. audioString, the result is always null. Am I doing anything wrong here?
P.S.: I've also tried the different encodings listed in the Speech API reference (https://cloud.google.com/speech-to-text/docs/encoding) but remained unsuccessful.
The problem lay in the recorder library. The recorder that resolved the problem:
https://pub.dev/packages/flutter_audio_recorder
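For reference, a rough sketch of recording a 16 kHz WAV file with that package (based on the flutter_audio_recorder README; parameter names may differ between versions, so treat this as an assumption rather than a verified snippet):
import 'package:flutter_audio_recorder/flutter_audio_recorder.dart';

Future<String> recordWav(String path) async {
  // WAV at 16 kHz matches the LINEAR16 RecognitionConfig used in the question.
  final recorder = FlutterAudioRecorder(path,
      audioFormat: AudioFormat.WAV, sampleRate: 16000);
  await recorder.initialized;
  await recorder.start();
  await Future.delayed(const Duration(seconds: 5)); // record for a few seconds
  final recording = await recorder.stop();
  return recording.path;
}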
I recently ran into this exact problem as well, and I think the issue lies with the encoding of the file. I'm using v2.0.3 of flutter_sound, and the default file type after recording is AAC; however, according to https://cloud.google.com/speech-to-text/docs/encoding, the only acceptable file types are FLAC, AMR, WAV, and a few others.
I was using https://pub.dev/packages/google_speech, and the preset encoding is
'encoding' : 'LINEAR16',
which explains why the WAV file worked.
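In other words, the RecognitionConfig has to describe the bytes actually recorded. Using the same googleapis classes as the question (a sketch restating the question's own values), a 16-bit PCM WAV recorded at 16 kHz pairs with:
// An .aac file matches none of the supported encodings, which would
// explain the empty result; the config below assumes a 16-bit PCM
// .wav recorded at 16 kHz.
final config = RecognitionConfig.fromJson({
  'languageCode': 'en-US',
  'encoding': 'LINEAR16',     // must match the file's actual codec
  'sampleRateHertz': 16000,   // must match the recorder's sample rate
});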
I am building a mobile app for body measurement from photos I capture. How can I use OpenCV for this? How do I integrate OpenCV with the Ionic framework? Kindly help.
Essentially you can pull in a flavor of the opencv.js framework. The way I've done it is by pulling down some reference like https://docs.opencv.org/3.4.1/opencv.js, and then hosting it somewhere (in case opencv moves it on you). Then include that script in the Ionic project. Be careful how you do that though. It is a big file, so it could take the app longer to load. Some options I've used:
Local Asset
Store the js file in the local assets and include it via a script tag in index.html. If the Ionic app is deployed as a native app, then this asset is already in the app and fairly fast to load.
<script src="assets/js/opencv.js" async></script>
Dynamically load the file (example below)
async ionViewDidLoad() {
  let loadingScreen = this.loadingCtrl.create({ content: "Loading Scripts. Please Wait..." });
  // Show loading screen & load scripts
  try {
    await loadingScreen.present();
    await this.loadScript();
  }
  catch (error) {
    this.errorMessage = "We had some trouble loading scripts...";
  }
  finally {
    loadingScreen && loadingScreen.dismiss();
  }
}

public loadScript(): Promise<any> {
  return new Promise((resolve, reject) => {
    var isFound = false;
    var scripts = document.getElementsByTagName("script");
    for (var i = 0; i < scripts.length; ++i) {
      if (scripts[i].getAttribute('src') != null &&
          scripts[i].getAttribute('src').includes("opencv")) {
        isFound = true;
        return resolve();
      }
    }
    if (!isFound) {
      var dynamicScripts = ["https://docs.opencv.org/3.4.1/opencv.js"];
      for (var i = 0; i < dynamicScripts.length; i++) {
        let scriptNode = document.createElement('script');
        scriptNode.src = dynamicScripts[i];
        scriptNode.type = 'text/javascript';
        scriptNode.async = false;
        scriptNode.charset = 'utf-8';
        document.getElementsByTagName('head')[0].appendChild(scriptNode);
        scriptNode.onload = resolve;
      }
    }
  });
}
I am trying to run a local server for a Xamarin.Forms WebView. This is to get around CORS and so the HTML can be structured like a normal page. This works for UWP and iOS, but Android always comes up with ERR_CONNECTION_REFUSED. Some further details/things I have tried:
The app is running its own server, so it is not a case of trying to access a server on a separate device.
Internet permission is enabled.
The paths to the files do exist; otherwise the web server would fail to start.
Link to the local server NuGet package: https://github.com/unosquare/embedio
Below is an outline of the code I'm using. In practice, I'm using a custom renderer, injecting JavaScript to access platform features, etc., but this should simplify it:
The class that creates and starts the WebServer with EmbedIO:
public class LocalWebServer : IDisposable
{
    public static string Url = "http://localhost:8787/";
    private readonly string _filePath;
    private WebServer _server;

    public LocalWebServer(string filePath)
    {
        _filePath = filePath;
    }

    public void StartWebServer()
    {
        _server = new WebServer(Url);
        _server.RegisterModule(new LocalSessionModule());
        _server.RegisterModule(new StaticFilesModule(_filePath));
        _server.Module<StaticFilesModule>().UseRamCache = true;
        _server.Module<StaticFilesModule>().DefaultExtension = ".html";
        _server.Module<StaticFilesModule>().DefaultDocument = "index.html";
        _server.Module<StaticFilesModule>().UseGzip = false;

        Task.Factory.StartNew(async () =>
        {
            Debug.WriteLine("Starting Server");
            await _server.RunAsync();
        });
    }

    public void Dispose()
    {
        _server?.Dispose();
    }
}
Code which starts the server and displays the webview:
// Field for the WebView shown as the page content (declaration added for completeness)
WebView _webView;

public App()
{
    InitializeComponent();

    // Create and display a WebView
    _webView = new WebView();
    MainPage = new ContentPage()
    {
        Content = _webView,
    };
}

protected override async void OnStart()
{
    // Service which can initialize the app for first-time use, and stores
    // the folder location for the html page on each platform
    var htmlService = DependencyService.Get<IHandleHtmlContentService>();

    // Local webserver
    var localWebServer = new LocalWebServer(htmlService.DirectoryPath);

    // This is just a function that loads the html content from the
    // bundle resource or assets into a folder. Will only really
    // matter during the first time the App boots up.
    await htmlService.InitializeHtmlContent();

    // Start the webserver
    localWebServer.StartWebServer();

    // Navigate to the webserver
    _webView.Source = LocalWebServer.Url;
}
I've been bashing my head against this for a while, so any help would be appreciated. If you need any more details, let me know.
It turns out Android has no concept of "localhost" (at least from what I can tell). Instead, I need to find the IP address of my device. I have done this with the following code:
public class LocalWebServer : IDisposable
{
    public readonly string Url;
    ...
    public LocalWebServer(string filePath)
    {
        _filePath = filePath;
        Url = "http://" + GetLocalIpAddress() + ":8787/";
    }
    ...
    private static string GetLocalIpAddress()
    {
        var listener = new TcpListener(IPAddress.Loopback, 0);
        try
        {
            listener.Start();
            return ((IPEndPoint)listener.LocalEndpoint).Address.ToString();
        }
        finally
        {
            listener.Stop();
        }
    }
}
Code was found on this Xamarin Forums post: https://forums.xamarin.com/discussion/42345/simple-android-http-listener-not-working
It looks like a pretty common problem, but I still can't find any solution.
The goal is to play some AAC stream in Adobe AIR mobile app.
I've used a transcoder from https://code.google.com/p/project-thunder-snow/. It adds FLV headers to AAC data so the stream can be played through a standard AS3 NetStream object (as I understand it). It works fine on Windows and Mac, but the same app launched on Android or iOS produces neither sound nor any error. I've figured out that the transcoder works fine; the cause lies in the different characteristics of the NetStreams.
So, is there any solution or, at least, any documentation describing the difference between NetStream on PC and on mobile platforms?
Okay... I might have a clue for you. I tried running the NBAAC code you linked in the other question from inside Flash CS5 and got this error:
onChannelReady : MetaData
Error: Error #2067: The ExternalInterface is not available in this container. ExternalInterface requires Internet Explorer ActiveX, Firefox, Mozilla 1.7.5 and greater, or other browsers that support NPRuntime.
So it seems the player is designed to be used only inside an HTML page and uses ExternalInterface (JavaScript) to get metadata from the browser into the SWF.
I removed the ExternalInterface stuff and it played OK in Flash Player and Device Central without a browser/HTML. Try this code in your AIR app (I don't have AIR installed anymore to confirm...):
package
{
import com.thebitstream.flv.CodecFactory;
import com.thebitstream.flv.Transcoder;
import com.thebitstream.flv.codec.*;
import com.thebitstream.ice.*;
import com.thebitstream.nsv.*;
import flash.display.Sprite;
import flash.events.Event;
import flash.events.IOErrorEvent;
import flash.events.ProgressEvent;
import flash.events.SecurityErrorEvent;
//import flash.external.externalInterface;
import flash.media.SoundTransform;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.NetStreamAppendBytesAction;
import flash.net.URLRequest;
import flash.net.URLRequestMethod;
import flash.net.URLStream;
import flash.utils.ByteArray;
import flash.utils.setTimeout;
[SWF (width="250",height="250")]
public dynamic class NBAAC_AIR_v2 extends Sprite
{
public var request:URLRequest;
public var transcoder :Transcoder;
public var transport :NetConnection;
public var transportStream :NetStream;
public var serverConnection:URLStream;
//public var host:String=" "; //now not used in line... request=new URLRequest(resource);
public var resource:String="http://aacplus-ac-64.timlradio.co.uk/;"; //test station
public function NBAAC_AIR_v2 ()
{
super();
/*
if(loaderInfo.parameters.host)
host=loaderInfo.parameters.host;
if(loaderInfo.parameters.resource)
resource=loaderInfo.parameters.resource;
*/
//CodecFactory.ImportCodec(MetaData);
CodecFactory.ImportCodec(AAC);
CodecFactory.ImportCodec(AACP);
transcoder = new Transcoder();
transcoder.addEventListener(CodecEvent.STREAM_READY,onChannelReady);
transcoder.addEventListener(StreamDataEvent.DATA, onTag);
transcoder.initiate();
transcoder.loadCodec("AAC");
transport = new NetConnection();
transport.connect(null);
flash.utils.setTimeout(boot,500);
}
public function boot():void
{
//externalInterface.addCallback('nbaac.setVolume', onVolume);
//externalInterface.addCallback('nbaac.togglePause', onTogglePause);
//externalInterface.addCallback('nbaac.setBuffer', setBufferLength);
//externalInterface.addCallback('nbaac.getBuffer', getBufferLength);
//externalInterface.addCallback('nbaac.getTime', getTime);
/*
var meta:Object={};
meta.uri="";
meta.StreamTitle="";
*/
//externalInterface.call('logit','start up');
transportStream = new NetStream(transport);
transportStream.bufferTime=2;
transportStream.client = this;
transportStream.soundTransform=new SoundTransform(.5,0);
transportStream.play(null);
transportStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
var headerTag:ByteArray=transcoder.createHeader(false,true);
transportStream.appendBytes(headerTag);
headerTag.clear();
//transcoder.readMetaObject(meta,0);
serverConnection=new URLStream();
serverConnection.addEventListener(ProgressEvent.PROGRESS,loaded);
serverConnection.addEventListener(IOErrorEvent.IO_ERROR, onIo);
serverConnection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, onNoPolicy);
serverConnection.addEventListener(Event.CLOSE, onClose);
request=new URLRequest(resource); //removed "host" from url
//request.requestHeaders=[new URLRequestHeader("GET",resource+" HTTP/1.0")];
//request.requestHeaders=[new URLRequestHeader("Icy-MetaData","1")];
request.method=URLRequestMethod.GET;
serverConnection.load(request);
}
private function getTime() :Number
{ return transportStream.time; }
private function getBufferLength() :Number
{ return transportStream.bufferLength; }
private function setBufferLength (val:Number) :void
{ transportStream.bufferTime = val; }
private function onTogglePause():void
{ transportStream.togglePause(); }
private function onVolume (val:Number) :void
{ transportStream.soundTransform = new SoundTransform(val, 0); }
private function onIo(pe:IOErrorEvent) :void
{ /* externalInterface.call('logit','IOErrorEvent') */ }
private function onTag(sde:StreamDataEvent) :void
{
sde.tag.position=0;
transportStream.appendBytes(sde.tag);
}
private function onChannelReady(ce:CodecEvent) :void
{ trace('onChannelReady :',ce.codec.type); }
private function onClose(e:Event):void
{ /* externalInterface.call('logit','onClose') */ }
public function onMetaData(e:Object):void
{ /* externalInterface.call('logit','onMetaData') */ }
private function loaded(e:ProgressEvent):void
{
var chunk:ByteArray=new ByteArray();
while(serverConnection.bytesAvailable)
{ chunk.writeByte( serverConnection.readByte() ); }
chunk.position=0;
transcoder.addRawData( chunk, 0, "AAC" );
}
private function onNoPolicy(se:SecurityErrorEvent):void
{ /* externalInterface.call('logit','SecurityErrorEvent'+host+resource+'<br />'+se.text); */ }
}
}