Is it possible to play multiple audio files simultaneously in Android / iOS / Flutter?

I am developing an app for kids to learn to compose music, similar to a drum pad machine.
Is it possible to play multiple audio files simultaneously, with the minimum possible delay between them (as in Audacity), on Android and iOS?
I have already checked nearly all the related questions on Stack Overflow (and Google too). But the posts are quite old (2016, 2017, ...), and at the time it seemed difficult to play sounds simultaneously. Maybe now, in 2019, it is easier.
As far as I know, it is possible to use SoundPool (but it is limited to 1 MB per sound, and I need more than that) and MediaPlayer. About MediaPlayer, I could not find much information or many tutorials.
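For reference, the SoundPool approach looks roughly like this (a minimal sketch assuming API 21+; the stream count and the R.raw resource names are placeholders):

    import android.content.Context;
    import android.media.AudioAttributes;
    import android.media.SoundPool;

    // Sketch: preload short samples, then trigger them so they overlap.
    public class DrumPad {
        private final SoundPool pool;
        private final int kickId;
        private final int snareId;

        public DrumPad(Context context) {
            AudioAttributes attrs = new AudioAttributes.Builder()
                    .setUsage(AudioAttributes.USAGE_GAME)
                    .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
                    .build();
            pool = new SoundPool.Builder()
                    .setMaxStreams(8) // up to 8 sounds playing at once
                    .setAudioAttributes(attrs)
                    .build();
            // Samples are decoded into memory up front, hence the size limit.
            kickId = pool.load(context, R.raw.kick, 1);   // placeholder resources
            snareId = pool.load(context, R.raw.snare, 1);
        }

        public void playKick()  { pool.play(kickId, 1f, 1f, 1, 0, 1f); }
        public void playSnare() { pool.play(snareId, 1f, 1f, 1, 0, 1f); }
    }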
There is also the new Flutter framework. Is it possible to do this in Flutter? That would be great, since the same code could run on both Android and iOS.

For Flutter, you should try this package: https://pub.dev/packages/audioplayers
It supports playing multiple audio files, preloading them, and playing them with as little delay as possible.


Android VLC Embed vs Android VideoView

=== BACKGROUND SUMMARY ===
At the moment, we are using the Android VideoView to perform video playback. Everything seemed to be working great until we encountered live streaming.
VideoView tends to have a 10-15 second delay from the live stream, even within a local network (LAN).
While attempting to solve this issue, we came across VLC Embed for Android. After searching the Internet, it seems there isn't any article comparing the pros and cons of Android VLC Embed vs. Android VideoView.
=== QUESTION ===
What are the advantages (pros) and disadvantages (cons) of using Android VLC Embed vs. Android VideoView?
Is VLC Embed stable?
Is there anything I should be careful about when switching from the existing VideoView to VLC?
Thank you all in advance.
My view may not be very professional, but it reflects what I've experienced so far.
First, the Android VideoView is good since it comes with the Android SDK, so it does not require an external library. But it has some limitations. For example, as far as I know, it doesn't support the MMS and MMSH protocols, among others. That is not the case for the Android VLC SDK: it is complete and supports almost all the media formats I know of so far.
It just increases your APK size; on my side, that's the only disadvantage.
Is the Android VLC SDK stable? Yes, it's stable and maintained by a huge community.
Is there anything I should be careful about when switching from the existing VideoView to VLC?
You should keep your sources the same and take care of the aspect ratio.
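For comparison, the stock VideoView baseline is only a few lines (a minimal sketch; the stream URL is a placeholder):

    import android.net.Uri;
    import android.widget.VideoView;

    // Minimal sketch: the built-in VideoView needs no external library.
    public class VideoViewPlayback {
        public static void start(VideoView videoView) {
            videoView.setVideoURI(Uri.parse("http://example.com/stream.mp4")); // placeholder URL
            videoView.setOnPreparedListener(mp -> mp.start()); // start once prepared
        }
    }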
What's the advantage (pros) and disadvantage (cons) of using Android VLC Embed vs. Android VideoView?
Advantages:
More features. VLC supports almost all media formats and hardware decoding; audio tracks, subtitles, and chapters are also supported.
More integrated, simpler logic. You can easily get media information and cache it. The playback engine proactively notifies you of state changes and events; you just register a player event listener (see the sketch after this list).
Disadvantages:
APK file size increase. If both arm64-v8a and armeabi-v7a are supported, it adds more than 30 MB.
Multiple instances are not perfect. For example, playing two videos at the same time is a hassle.
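Registering for player events looks roughly like this (a sketch against the libVLC 3.x Android bindings; the exact event constants can vary between versions):

    import org.videolan.libvlc.MediaPlayer;

    // Sketch: react to LibVLC playback state changes via an event listener.
    public class PlayerEvents {
        public static void register(MediaPlayer mediaPlayer) {
            mediaPlayer.setEventListener(new MediaPlayer.EventListener() {
                @Override
                public void onEvent(MediaPlayer.Event event) {
                    switch (event.type) {
                        case MediaPlayer.Event.Playing:
                            // playback has started
                            break;
                        case MediaPlayer.Event.Buffering:
                            // event.getBuffering() reports the buffered percentage
                            break;
                        case MediaPlayer.Event.EndReached:
                            // the media finished playing
                            break;
                    }
                }
            });
        }
    }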
Is VLC Embed stable?
Stable. I have been using the VLC library in my Android app since VLC 2.0.x (now 3.0.x). It runs steadily from Android 5.1 to Android 8.0. A small number of 4K H.265 videos do not play back normally, but that can be handled by displaying "Cannot play".
Anything I should be careful about when switching from the existing VideoView to VLC?
To use LibVLC on Android, the Medialibrary (org.videolan.medialibrary) is also required. You also need to pay attention to licensing: VLC for Android is licensed under the GPLv3, which may be a concern if your project uses a different license.
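As a starting point, a minimal LibVLC setup in place of a VideoView looks roughly like this (a sketch against the libVLC 3.x Android bindings; the stream URL is a placeholder):

    import android.content.Context;
    import android.net.Uri;
    import java.util.ArrayList;
    import org.videolan.libvlc.LibVLC;
    import org.videolan.libvlc.Media;
    import org.videolan.libvlc.MediaPlayer;
    import org.videolan.libvlc.util.VLCVideoLayout;

    // Sketch: play a network stream into a VLCVideoLayout taken from the layout XML.
    public class VlcPlayback {
        public static MediaPlayer start(Context context, VLCVideoLayout videoLayout) {
            LibVLC libVlc = new LibVLC(context, new ArrayList<String>());
            MediaPlayer player = new MediaPlayer(libVlc);
            player.attachViews(videoLayout, null, false, false);

            Media media = new Media(libVlc, Uri.parse("rtsp://example.com/live")); // placeholder URL
            player.setMedia(media);
            media.release(); // the player holds its own reference
            player.play();
            return player;
        }
    }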

How to buffer and play video on Android and iOS like Netflix and Iflix

I have a requirement to develop Android and iOS mobile apps that allow subscribers to view movies the way Netflix and Iflix do.
I would like to know if this can be achieved with the built-in video playing classes or widgets on the Android and iOS platforms, or if we will need a library or SDK for this.
I came across this URL on how to stream video in Android apps. Would this approach suffice for this requirement?
https://code.tutsplus.com/tutorials/streaming-video-in-android-apps--cms-19888
Netflix and similar systems use ABR (adaptive bitrate streaming) to deliver video to mobile devices. ABR allows the client device or player to download the video in chunks, e.g. 10-second chunks, and to select the next chunk from the bitrate most appropriate to the current network conditions. See here for an example:
https://stackoverflow.com/a/42365034/334402
There are several ABR protocols, but the two most common at this time are HLS and DASH. HLS must be used to deliver video streams to iOS devices due to Apple's guidelines (at this time, and for video over 10 minutes that may be accessed over a mobile network; the guidelines can change over time). DASH is probably more common on Android devices, although HLS can be supported on Android too.
Most Android players can now handle ABR. The Android ExoPlayer is a good example; it is very widely used and supports ABR natively:
https://github.com/google/ExoPlayer
Take a look at the Developer Guide (linked from the repository above at the time of writing), which shows how to include ExoPlayer in your app.
On iOS, the native player supports ABR via HLS.
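On the Android side, a minimal ExoPlayer HLS setup looks roughly like this (a sketch against the ExoPlayer 2.x API current at the time of writing; the manifest URL and user agent are placeholders, and DASH works the same way via DashMediaSource):

    import android.content.Context;
    import android.net.Uri;
    import com.google.android.exoplayer2.ExoPlayerFactory;
    import com.google.android.exoplayer2.SimpleExoPlayer;
    import com.google.android.exoplayer2.source.hls.HlsMediaSource;
    import com.google.android.exoplayer2.ui.PlayerView;
    import com.google.android.exoplayer2.upstream.DataSource;
    import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;

    // Sketch: play an HLS (ABR) stream; ExoPlayer switches bitrate variants on its own.
    public class AbrPlayback {
        public static SimpleExoPlayer start(Context context, PlayerView playerView) {
            SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(context);
            playerView.setPlayer(player);

            DataSource.Factory dataSourceFactory =
                    new DefaultDataSourceFactory(context, "example-user-agent"); // placeholder
            HlsMediaSource source = new HlsMediaSource.Factory(dataSourceFactory)
                    .createMediaSource(Uri.parse("https://example.com/master.m3u8")); // placeholder

            player.prepare(source);
            player.setPlayWhenReady(true); // start as soon as enough is buffered
            return player;
        }
    }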

Lowest-latency way to stream live video to iOS and Android

I'm making an app that needs to send a video feed from a single source to a server, where it can be accessed by desktop browsers and mobile apps.
So far, I've been using Adobe Media Server 5 with a live RTMP stream. This gives me about a 2.5-second delay in desktop browsers. It gives me no native support for iOS, but leaves me the option of using AIR to export the app for iOS, which produces a minimum 5-6 second delay.
The iOS docs strongly recommend HTTP Live Streaming, which segments the stream into chunks and serves them using a dynamic playlist in a .m3u8 file. Doing this produces a 15+ second delay in desktop browsers and on mobile devices. A Google search seemed to confirm that this is to be expected from HLS.
I need a maximum delay of 2-4 seconds across all devices, if possible. I've gotten poor results with Wowza, but am open to revisiting it. FFmpeg seems inefficient, but I'm open to that as well if someone has had good results with it. Anybody have any suggestions? Thanks in advance.
I haven't even begun to find the most efficient way to stream to Android, so any help in that department would be much appreciated.
EDIT: Just to be clear, my plan is to make an iOS app, whether it's written natively or in AIR. The same goes for Android, but I've yet to start on that.
In the iOS browser, HLS is the only way to serve live video. The absolute lowest latency comes from using 2-second segments with a 2-segment window in the manifest. That gives you 4 seconds of latency on the client, plus another 2 to 4 on the server. There is no way to do better without writing an app. A playlist along those lines is sketched below.
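To illustrate, such a live media playlist would look something like this (illustrative only; the sequence number and segment names are placeholders):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:2
    #EXT-X-MEDIA-SEQUENCE:742
    #EXTINF:2.0,
    segment742.ts
    #EXTINF:2.0,
    segment743.ts

The absence of an EXT-X-ENDLIST tag marks the stream as live; the server keeps rotating new segments through the two-entry window.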
A 15-second delay for HLS streams is pretty good; to get lower latency, you need a different streaming protocol.
RTP/RTSP will give you the lowest latency and is typically used for VoIP and video conferencing, but you will find it very difficult to use across the many mobile and WiFi networks out there (some of them unintentionally block RTP).
If you can write an iOS app that supports RTMP, that is the easiest way to go, and it should work on Android too (though only old Android versions support Flash/RTMP natively). Be aware that decoding in software will result in poor battery life. There are other iOS apps that don't use HLS for streaming, but I think you need to limit them to your own service (not a generic video player).
Also, please remember that higher latency brings higher video quality, less buffering, a better user experience, and so on, so don't reduce latency unnecessarily.

Playing multiple tracks of audio online

My team and I are nearly done developing a music application for iPhone and Android that allows users to create their own music by playing and overlapping sampled sounds (up to 16 at a time). We are looking for a way to let users share these songs by embedding an audio player on our website that will (as the Android and iPhone applications already do) take the songs, which are expressed as strings encoding pitch, duration, start time, and instrument, and convert them into a single playable audio file (any format).
We have looked into SoundManager 2 and Web Audio, and have run into the same problem with both: stopping sounds creates beeping or popping artifacts that cannot be removed. Does anyone know of another framework or API we should look into? A little googling also made sfplay stand out, but there isn't much documentation on it. Any other suggestions?
Thanks!
There are still a lot of problems with JavaScript/HTML5 audio. Web Audio is very powerful but not very portable; SoundManager 2 is less powerful but more portable. You should not be hearing clicks or discontinuities with either library unless you are doing something wrong (a common cause is cutting a sound off abruptly at a non-zero sample value instead of ramping its gain down first), but there are problems with synchronization and so on with browser-based audio.
That's why most people doing "serious" audio on the web use Java applets or Flash, which won't work on mobile devices.

Android sound app: SDK or NDK?

I've been experimenting with making an Android app for the past week or two and have reached a point where I need to decide on a general plan of action for the code I'm writing.
I started with SoundPool: easy to use, but not very flexible. I went on to AudioTrack: it seems good, but slow.
So now I'm looking at the NDK.
Does the NDK have direct access to AudioTrack? Or to something else?
What is the general consensus on this kind of thing?
My guess is to make the UI in Java and the "sound engine" in C++.
I'd like to get started on the right track before I write too much.
Edit:
It is a music studio app that plays and manipulates WAV files from the SD card, as well as doing real-time sound synthesis.
The Android SDK doesn't offer a fully featured audio processing solution like Core Audio on iOS. The available features are exposed through the OpenSL ES interface. The availability of any given audio processing feature depends on the device manufacturer's Android flavor and configuration, so you cannot rely on them.
To wit, the infamous Android fragmentation problem is even bigger in audio.
Even worse, if a device reports an audio feature as available, it may still not work properly. For example: the audio playback rate on a Samsung Galaxy Note III.
Real-time sound synthesis? None of the Android SDK's features are designed for that.
The best way is to do the UI in Java and handle the sound in C++, as sketched below.
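That split can be as thin as a Java shim that forwards calls across JNI to the native engine. A hypothetical sketch (the library and method names are made up; the implementations would live in C++ behind the NDK):

    // Hypothetical sketch of the Java-side shim for a native sound engine.
    public class SoundEngine {
        static {
            // Loads libsoundengine.so, built with the NDK (the name is made up).
            System.loadLibrary("soundengine");
        }

        // Declared here, implemented in C++ (e.g. on top of OpenSL ES),
        // and called from the Java UI.
        public native void start(int sampleRate, int framesPerBuffer);
        public native void playSample(String path);
        public native void stop();
    }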
There are some "audio engine" offerings on the web, but many of them are just wrappers around the same thing.
As a cofounder of Superpowered, allow me to recommend the Superpowered Audio SDK, a complete audio processing solution designed for real-time use and the highest performance, with no dependency on Android's audio offerings, that runs on all devices:
http://superpowered.com/
There are a ton of audio latency issues in Android, and there's really not much that can be done about them. It seems that ICS (4.0) may have brought some improvements, from what I've read.
You could subscribe to andraudio; you would actually be better off directing Android audio questions to their mailing list than to Stack Overflow:
http://music.columbia.edu/mailman/listinfo/andraudio
