I am the author of a MIDI-based musical application. In my current app I generate a .midi file with a small library that I wrote and play it with MediaPlayer, and that's enough for that app. However, in a future app I plan to have more interactivity, and that's where I would probably need a streaming API.
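For context, here is a minimal sketch of that approach: a hard-coded, format-0 Standard MIDI File containing a single middle-C quarter note, written out and played back through MediaPlayer (the file path and note values are just placeholders):

```java
import android.media.MediaPlayer;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class OneNoteMidi {
    public static void writeAndPlay(File file) throws IOException {
        byte[] smf = {
            'M', 'T', 'h', 'd', 0, 0, 0, 6,   // header chunk, 6 data bytes
            0, 0,                             // format 0
            0, 1,                             // one track
            0, 96,                            // 96 ticks per quarter note
            'M', 'T', 'r', 'k', 0, 0, 0, 12,  // track chunk, 12 bytes of events
            0, (byte) 0x90, 60, 96,           // t=0: note on, middle C, velocity 96
            96, (byte) 0x80, 60, 0,           // 96 ticks later: note off
            0, (byte) 0xFF, 0x2F, 0           // end-of-track meta event
        };
        try (FileOutputStream out = new FileOutputStream(file)) {
            out.write(smf);
        }
        MediaPlayer player = new MediaPlayer();    // Sonivox renders the MIDI
        player.setDataSource(file.getAbsolutePath());
        player.prepare();
        player.start();                            // remember to release() later
    }
}
```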
As far as I know, Android lacks APIs for a realtime MIDI synth (at least official ones). But I can still see some apps that use MIDI in quite advanced ways. The question is: how? Do they use the NDK to access Sonivox directly, or are there unofficial APIs for that after all? Any ideas?
Also, I'm very interested in whether Google is planning to improve MIDI support in future versions of Android (in case anybody from Google sees this :))
Thanks.
You should check out libpd, which is a native port of Pure Data for both Android and iOS. It gives you access to the system's MIDI drivers while still letting you prototype your software with very high-level tools.
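For example, once a patch is loaded and audio is running, driving its MIDI input from Java is just a couple of calls. This is a minimal sketch using libpd's PdBase bindings; it assumes your patch contains a [notein] object and that audio has already been initialized through libpd's Android glue:

```java
import org.puredata.core.PdBase;

public class PdNotes {
    // PdBase routes MIDI messages to [notein]/[ctlin]/etc. inside the patch
    public static void noteOn(int channel, int pitch, int velocity) {
        PdBase.sendNoteOn(channel, pitch, velocity);
    }

    // MIDI convention: a note-on with velocity 0 acts as a note-off
    public static void noteOff(int channel, int pitch) {
        PdBase.sendNoteOn(channel, pitch, 0);
    }
}
```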
Java has significant latency, so I think this should be done with the NDK. Check this question; it has a couple of hints. This was also reported as an Android issue (NDK support for low-latency audio), so there might be some tips or info there too.
This is a simple but great sample application that successfully streams MIDI on Android: https://github.com/billthefarmer/mididriver
You will have to put your MIDI messages together manually, though (the example creates two MIDI messages, one to start a note and one to stop it). You can refer to the MIDI specification to further control the MIDI channels. The problem is that the default sound fonts on Android sound so bad.
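To illustrate assembling those messages by hand: here is a sketch based on the sample's MidiDriver class and its write(byte[]) call. The status bytes come straight from the MIDI 1.0 specification, but the method names may differ between versions of the library.

```java
import org.billthefarmer.mididriver.MidiDriver;

public class NotePlayer {
    private final MidiDriver midi = new MidiDriver();

    public void begin() {
        midi.start();   // boots the built-in synth
    }

    // Status 0x90 = note on, channel 0; data bytes are note number and velocity
    public void playNote(int note, int velocity) {
        midi.write(new byte[]{(byte) 0x90, (byte) note, (byte) velocity});
    }

    // Status 0x80 = note off, channel 0
    public void stopNote(int note) {
        midi.write(new byte[]{(byte) 0x80, (byte) note, (byte) 0});
    }

    public void end() {
        midi.stop();
    }
}
```

To address a different channel, OR the channel number (0-15) into the low nibble of the status byte, e.g. (byte) (0x90 | channel).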
I'm developing a mobile app to record audio, save it to a file, and then send it to a server. I'm currently using SmartFace.io, a cross-platform tool for creating both Android and iOS mobile apps. I've been researching but can't find the audio capabilities of the platform; the online API documentation doesn't include specifics, and the media items have no detailed info.
I'm not a beginner; SmartFace looks good, but I can't find any info regarding what I need to do. I'm also not sure how many people are actually using it.
What I've done:
- Tried using PhoneGap but couldn't make it work; a coworker with more PhoneGap experience struggled until one project worked, and we discovered that some libraries and versions collide
- Tried samples posted here, but as some other users reported, they didn't work
- Also tried the now-dead MoSync, but the C code provided on its equally dead forum doesn't work (it says the platform is not supported)
- I know Appcelerator has working samples, but it's my last choice
- Found working projects for Android Studio, but we are still trying to avoid doing specific work for each platform/app/IDE/framework
Thanks in advance
Smartface App Studio offers lots of ready-to-use components and libraries.
However, in the current version it is not possible to record audio samples.
For more details about the features and the roadmap, please check the links below:
http://www.smartface.io/developer/guides/
http://docs.smartface.io/
http://www.smartface.io/roadmap/
I'm currently working on a project which requires video from Android and iOS devices to be streamed live to our server.
I've been researching this for a while and have come across libraries that are dead, expensive, or dead expensive.
The only viable solution I've found so far is using Adobe Flash Builder, but quite frankly it is not very nice, for multiple reasons.
I would love to be able to do this natively for both platforms, but this is not a very juicy project for my employers, so they are reluctant to spend any cash on expensive libraries.
Are there any free/cheap libraries that fit the bill? Is there some other way for me to do this natively? The technology itself is negotiable; we are currently focused on RTMP since we are using Flash Builder, but as long as the video is streamed up to a server, they don't particularly care what protocol is used.
Thanks for your help in advance, let me know if the way I've asked this question is not up to scratch.
I can speak for Android. In my application I'm using the JavaCV library, specifically FFmpegFrameRecorder, which can work with the RTMP protocol. My application works with a RED5 server. Among the drawbacks I would mention the large number of native libraries it pulls in.
See my answer, where I described that I used Android Studio with JavaCV and FFmpeg.
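The core of it looks roughly like this. This is only a sketch: the RTMP URL and frame size are placeholders, package names differ between JavaCV versions, and converting camera preview buffers into Frame objects is omitted.

```java
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;

public class RtmpPublisher {
    private FFmpegFrameRecorder recorder;

    public void start() throws FFmpegFrameRecorder.Exception {
        // Ingest URL on the RED5 (or any RTMP) server (placeholder)
        recorder = new FFmpegFrameRecorder("rtmp://example.com/live/stream", 640, 480, 1);
        recorder.setFormat("flv");   // RTMP carries an FLV container
        recorder.setFrameRate(30);
        recorder.start();            // opens the connection to the server
    }

    // Call with each captured frame (e.g. converted from a camera preview buffer)
    public void push(Frame frame) throws FFmpegFrameRecorder.Exception {
        recorder.record(frame);
    }

    public void stop() throws FFmpegFrameRecorder.Exception {
        recorder.stop();
        recorder.release();
    }
}
```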
I'm just learning mobile web development and thinking about a task:
Is there a way to make a video stream between iOS, Android, and a browser? What architecture and technology should it use? I already read the SO question Peer-to-Peer video from iOS to Android? but there is nothing about browsers.
If it can't be p2p and cross-platform at the same time, I thought I should use a Red5 server or similar, or XMPP.
So I'm asking for your advice and opinions here. Any information would be valuable.
Yes, you can!
There is a new technology pushed by Google: WebRTC.
It stands for "Web Real-Time Communication" and is an open-source project funded by Google.
It also supports native Android/iPhone applications.
I am working on it and have had, say, 60% success.
Video clarity is good, but audio is choppy.
You can find the source code Here
Discussion with the community is Here
You can see a live demo Here
NOTE:
This is an ongoing project and is not stable yet; the Google team is still working on it. Currently it works on the latest Chrome, Firefox, and Opera. IE does not support it yet.
Yes, the open-source solution would be WebRTC; please check it out on the official website: webrtc.org
I am trying to write a metronome application in Python, and I intend to publish the application for Android and iOS. I have found a few cross-platform frameworks like Kivy, but their audio support is lacking. More specifically, I need very precise audio timing and I can't rely on thread timing or events. I want to write audio data directly to the device's audio output, or create a MIDI file that can be played on the fly. The problem is, I cannot find any suitable framework for this task.
I know that many games have been written for Android in Python, and those games have excellent and precise sound timing. I need help finding either:
- a way to create and play MIDI files on the fly on Android with Python, or
- a Python framework for Android with a suitable audio API that lets me write sound directly to an audio device, or at least play audio with very accurate timing.
Thanks!
I'm looking for the same thing, and I too am looking at Kivy. The possible solution I can see for audio is hooking in a 3rd-party library as a "recipe" in Kivy.
There is aubio, which apparently can be compiled for iOS/Android (see the Stack Overflow question regarding this), but I believe you have to provide your own audio source for it, which could potentially be handled by the audiostream subproject in Kivy.
Kivy/audiostream appears to import the core libpd project, so you can use the libpd Python bindings. I think this is the path of least resistance, but I had issues when trying to run the examples.
Both of these approaches could work, I think, but both need some effort before you can start using them.
Since Android (officially) supports HLS starting with 3.0, I've looked at different ways to include HLS in my app:
1. use a lib like NexStreaming's (pretty expensive)
2. use an HTML5 player (HLS not working on some 2.3 devices)
3. utilize plugin players like Vitamio
My problem is that possibility number 3 works best, but my client doesn't want the users to see that a plugin is used.
Is there a way to include another APK / install it without prompting the user?
Or maybe someone has a completely different idea for playing HLS on 2.x.
Vitamio is free (as in no charge) for personal/commercial use so far...
Contact vov.io and buy a commercial license for Vitamio; then you can bundle it in your APK. It's still going to be way cheaper than NexStreaming.
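And on the 3.0+ devices where the official HLS support applies, the stock VideoView can usually play the playlist URL directly, so Vitamio only has to cover the 2.x fallback. A minimal sketch (the URL is a placeholder):

```java
import android.media.MediaPlayer;
import android.net.Uri;
import android.widget.VideoView;

public class HlsPlayback {
    // On Android 3.0+ the platform media stack handles the .m3u8 natively
    public static void play(final VideoView view) {
        view.setVideoURI(Uri.parse("http://example.com/live/playlist.m3u8"));
        view.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                view.start();   // begin playback once the stream is prepared
            }
        });
    }
}
```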