I've been contemplating (re)building an app for iPad for some time, where I would use Objective-C and DSMI to send MIDI signals to a host computer. That's not a bad option (I mean, except for actually writing the app).
Now I'm contemplating perhaps developing the app for Android tablets (TBA).
In Java, what options are available for MIDI message communication? I'm quite familiar with javax.sound.midi, but then I would need a virtual MIDI port to send messages to the host.
On the other hand, if the app were done in Adobe AIR, what options would I have available for communicating with MIDI?
Obviously another option is to send/receive messages over a TCP/IP socket to a Java host, and talk that way, but it sounds a tad cumbersome... or perhaps not? DSMI does use a host program, after all.
javax.sound.midi is not available on Android.
The only access to MIDI functionality on Android is through the JetPlayer class, and it's very much designed for use in games. Android will play a MIDI file, but it's unclear what code path it uses; there's no way to write to the MIDI hardware from memory.
In one app I've made, I needed to play notes dynamically based on GUI/user interaction, and I ended up having to use samples and pitch filters to create the notes.
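For illustration, here's a minimal sketch of that sample-plus-pitch trick using Android's SoundPool (the resource ID is a placeholder; note that SoundPool's rate parameter only spans 0.5 to 2.0, i.e. one octave down or up from the recorded sample):

    import android.content.Context;
    import android.media.AudioManager;
    import android.media.SoundPool;

    public class NotePlayer {
        private final SoundPool pool = new SoundPool(8, AudioManager.STREAM_MUSIC, 0);
        private final int sampleId;

        // Load one reference sample (e.g. a recorded C4 note) from res/raw.
        public NotePlayer(Context ctx, int rawResId) {
            sampleId = pool.load(ctx, rawResId, 1);
        }

        // Play the sample transposed by the given number of semitones.
        public void playNote(int semitones) {
            float rate = (float) Math.pow(2.0, semitones / 12.0);
            pool.play(sampleId, 1f, 1f, 1, 0, rate);
        }
    }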
Sounds to me like what you need to do is port DSMI to Android. It's open source, and the iPhone library looks pretty simple; it shouldn't be difficult to port over.
EDIT:
After thinking about this for a second: you wouldn't gain anything by using javax.sound.midi or whatever MIDI functionality exists in AIR anyway. All you need to do is pass MIDI messages over a network link to another device, which is responsible for communicating with the actual MIDI synth. This is exactly what DSMI does. Porting the iPhone libdsmi to Android is what you need to do; it's only 2 source files and 2 headers. One handles the MIDI message format, which is very simple and can pretty much be converted line by line to Java, and the other handles the network connection to the DSMI server, which will need to be rewritten to use Android semantics for creating network connections.
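To give a feel for the port, here's a rough hedged sketch of the Java side: packing a three-byte MIDI message and pushing it over UDP to a DSMI host. The port number is a placeholder, and the real packet layout should be copied from the libdsmi sources:

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class DsmiClient {
        private static final int DSMI_PORT = 9000; // placeholder; use libdsmi's real port
        private final DatagramSocket socket;
        private final InetAddress host;

        public DsmiClient(String hostName) throws IOException {
            socket = new DatagramSocket();
            host = InetAddress.getByName(hostName);
        }

        // Send one three-byte MIDI message, e.g. note-on: 0x90, note, velocity.
        public void sendMidi(int status, int data1, int data2) throws IOException {
            byte[] msg = { (byte) status, (byte) data1, (byte) data2 };
            socket.send(new DatagramPacket(msg, msg.length, host, DSMI_PORT));
        }
    }

Usage would be something like new DsmiClient("192.168.1.5").sendMidi(0x90, 60, 100) for a middle-C note-on.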
There are plenty of good sources on how to receive audio data on mobile phones. I find (almost) none on streaming audio data TO a server in a standardized way (i.e., using a standard protocol such as RTP).
What is the standard way to stream audio data from Android or iOS to a server?
Some additional info:
The closest solutions I found are:
"Objective c: Send audio data in rtp packet via socket", where the accepted answer does send audio data, but not inside RTP
Using ffserver, which can listen on a port and receive streamed audio for further processing, but it has been discontinued.
I could write all of that functionality myself, i.e., wrap the audio data in RTP, RTSP, RTMP, whatever, and write a server that receives the stream, decodes it, and does the processing, but that's days of work for something that seems to be a standard task and thus should already exist.
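For a sense of scale, the RTP part is mostly prepending a 12-byte header per RFC 3550; a rough Java sketch, where the payload type and SSRC are placeholder values:

    import java.nio.ByteBuffer;

    public class RtpPacketizer {
        private short seq = 0;
        private final int ssrc = 0x12345678;  // placeholder stream identifier
        private final int payloadType;        // e.g. a dynamic type from 96-127

        public RtpPacketizer(int payloadType) {
            this.payloadType = payloadType;
        }

        // Prepend a 12-byte RTP header (RFC 3550) to one chunk of encoded audio.
        public byte[] wrap(byte[] audio, int timestamp) {
            ByteBuffer buf = ByteBuffer.allocate(12 + audio.length); // big-endian by default
            buf.put((byte) 0x80);                  // V=2, no padding/extension/CSRC
            buf.put((byte) (payloadType & 0x7F));  // marker bit clear
            buf.putShort(seq++);
            buf.putInt(timestamp);
            buf.putInt(ssrc);
            buf.put(audio);
            return buf.array();
        }
    }

The real work, of course, is everything around it: RTCP, session setup, jitter handling, and the receiving server.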
Furthermore, I firmly believe that you shouldn't reinvent the wheel. If there's a good, established API for something, from Apple, Google, or a third party, writing the functionality myself is a bad idea, particularly when networking, and thus security, is involved.
I think I figured out at least one standard way. One likely answer to my question is simply: SIP.
There are (even native) interfaces for SIP on both iOS and Android, there are (many) ready-made servers for it, and very likely there are libraries or command-line clients to run on the server side for the further processing.
I haven't tried it, but this looks very standardised and is likely widely used for such problems.
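For example, on Android the built-in android.net.sip API can place an audio call to a server-side SIP endpoint. A hedged sketch (the account details and peer URI are placeholders, and not every device supports this API; check SipManager.isApiSupported first):

    import android.content.Context;
    import android.net.sip.SipAudioCall;
    import android.net.sip.SipManager;
    import android.net.sip.SipProfile;

    public class SipAudioSender {
        // Place an outgoing SIP audio call whose microphone audio is then
        // available to whatever server-side endpoint answers the call.
        public static SipAudioCall startCall(Context ctx) throws Exception {
            SipManager manager = SipManager.newInstance(ctx); // null if unsupported
            SipProfile me = new SipProfile.Builder("alice", "sip.example.com")
                    .setPassword("secret")
                    .build();
            manager.open(me); // register the local profile

            SipAudioCall.Listener listener = new SipAudioCall.Listener() {
                @Override
                public void onCallEstablished(SipAudioCall call) {
                    call.startAudio(); // begin streaming microphone audio
                }
            };
            // A hypothetical server-side endpoint that records/processes the audio.
            return manager.makeAudioCall(me.getUriString(),
                    "sip:recorder@sip.example.com", listener, 30);
        }
    }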
I have been wondering how to capture audio input over USB on Android.
My scenario is to receive audio from external hardware and play the received audio through an Android app. This transmission is to be done over USB.
Is there any way to do this using the Android SDK / Android NDK?
Any suggestion would be helpful.
Work done so far: I am able to interact with the hardware using the CDC class, and I am also able to play some random, noisy audio through USB in my app. However, I am neither able to get clear sound with that approach, nor is the transmission of the audio consistent.
Thanks.
Regards, Vivek
Most modern Android devices can act as a USB host, so you can connect e.g. a USB microphone for capturing audio. Android also contains support for the USB audio class; use that to get access to the audio on the device.
Since you have already experimented with the Communication Device Class (CDC), you are aware of Android's USB host functionality. Now you need to ensure your peripheral implements the USB audio class (the audio source part) and make your app use the audio class to obtain the audio. This is explained pretty well here, so it does not make sense to copy all that information into this post. If you are already using the audio class, that page may explain some of the issues you have (e.g. using the wrong format).
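As a starting point, here is a minimal sketch of locating a USB audio-class interface with Android's USB host API; it simply scans the attached devices for an interface whose class is UsbConstants.USB_CLASS_AUDIO (requesting permission and claiming the interface are omitted):

    import android.content.Context;
    import android.hardware.usb.UsbConstants;
    import android.hardware.usb.UsbDevice;
    import android.hardware.usb.UsbInterface;
    import android.hardware.usb.UsbManager;

    public class UsbAudioFinder {
        // Return the first audio-class interface found on any attached device.
        public static UsbInterface findAudioInterface(Context ctx) {
            UsbManager manager = (UsbManager) ctx.getSystemService(Context.USB_SERVICE);
            for (UsbDevice device : manager.getDeviceList().values()) {
                for (int i = 0; i < device.getInterfaceCount(); i++) {
                    UsbInterface intf = device.getInterface(i);
                    if (intf.getInterfaceClass() == UsbConstants.USB_CLASS_AUDIO) {
                        return intf; // audio control or audio streaming interface
                    }
                }
            }
            return null; // no audio-class interface attached
        }
    }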
The USB audio class specifications can be found on the USB.org website. The problem with those is that the audio class is pretty large, and Android probably does not support everything.
I have a question about whether I can remotely control the VLC video player (play, pause, sound, maybe some video streaming, cam streaming) between my computer and my mobile phone.
Here is my plan:
1. VLC player on Mac OS
2. writing a TCP server (C++)
3. writing a client on the Android mobile phone side (considering C++ in order to reuse it on Android/iOS)
4. writing an application on Android with simple buttons that can remotely control the player...
Can this solution work properly?
Some additional questions:
1. Can such a solution work over a WAN (the Internet), not only a LAN (TCP socket communication)?
2. VLC has, under Preferences > Interface > Main Interfaces, the options RC, Lua HTTP, Lua Telnet, etc. (what are these for?)
3. I saw some applications on the Google Play Store that communicate via Lua HTTP?
I would prefer writing my own server/client + protocol for communication; this is for a university lower-degree project.
So my question is: if I write such a server, will it be possible to integrate it with VLC somehow, like adding it under Preferences > Interfaces, or should it be a separate program, or can it be written as a plugin or add-on?
In summary, I need some direction on which solution will provide the most seamless interaction with VLC while still having my own server, client, and protocol, so that the project isn't too easy (I saw in the documentation that there are apparently simple commands for VLC over the HTTP protocol, which I assume would allow for easy interaction with VLC).
I am also thinking about extending the project by enabling mouse-move control on Mac OS / Windows. What would I need for that?
The last part would be to enable streaming video to the phone, and maybe in the opposite direction, from the phone to the VLC player. Webcam capture streaming from the phone to VLC, and the opposite, from the MacBook to the phone, would also be an interesting addition.
Thanks for any help.
PLEASE: if this question is too long, concentrate on answering whether it is possible to do, and whether it can be integrated seamlessly enough that the end user doesn't have to spend many hours on configuration...
The best solution from my point of view:
- a preference screen for my plugin embedded in the VLC player settings
- a configurable TCP host/port (maybe using the current host's IP on the local network)
- on the mobile side, detecting and connecting to this host:port with the client, and it just works...
1. VLC player on Mac OS
2. writing a TCP server (C++)
3. writing a client on the Android mobile phone side (considering C++ in order to reuse it on Android/iOS)
4. writing an application on Android with simple buttons that can remotely control the player...
Can this solution work properly?
Yes, it is possible, and it works perfectly. VLC has a built-in server, so you do not need another server app to control it; you just write a client-side app for Android, Windows, or iOS. However, if you still want to write a server app, you can do so (I don't recommend it), but obviously the communication delay between the client app and VLC will be higher than usual.
1. Can such a solution work over a WAN (the Internet), not only a LAN (TCP socket communication)?
2. VLC has, under Preferences > Interface > Main Interfaces, the options RC, Lua HTTP, Lua Telnet, etc. (what are these for?)
3. I saw some applications on the Google Play Store that communicate via Lua HTTP?
Yes, it should be possible, but I haven't tried it myself.
I would prefer writing my own server/client + protocol for communication; this is for a university lower-degree project. So my question is: if I write such a server, will it be possible to integrate it with VLC somehow, like adding it under Preferences > Interfaces, or should it be a separate program, or can it be written as a plugin or add-on?
As I said, you can write your own server app, and you can integrate that server with VLC's built-in server (the web interface). Again, this method is not recommended.
If you still want to write a server app instead of integrating with VLC's web interface, map keyboard shortcuts: for example, receive a stop request from your client app and raise a keyboard 'S' key event from your server app ('S' is the shortcut for VLC's stop command; for more VLC shortcut keys, refer here).
VLC supports both transcoding and streaming. I suggest you write only the client app and integrate it with the VLC web interface; that is the best method. (For more info, there are many such apps on the Play Store; try any of them, or refer to the VLC forum.)
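To show what integrating the client with the VLC web interface amounts to, here is a hedged Java sketch that sends a pause command to VLC's Lua HTTP interface. The host, port, and password are placeholders; enable the interface under Preferences > Interface > Main Interfaces and set a password first (on Android you would use android.util.Base64 instead of java.util.Base64):

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    public class VlcHttpClient {
        // Send one command to VLC's Lua HTTP interface, e.g. "pl_pause".
        static void sendCommand(String host, int port, String password, String command)
                throws IOException {
            URL url = new URL("http://" + host + ":" + port
                    + "/requests/status.xml?command=" + command);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // VLC uses HTTP Basic auth with an empty user name.
            String auth = Base64.getEncoder().encodeToString((":" + password).getBytes());
            conn.setRequestProperty("Authorization", "Basic " + auth);
            if (conn.getResponseCode() != 200) {
                throw new IOException("VLC returned HTTP " + conn.getResponseCode());
            }
            conn.disconnect();
        }

        public static void main(String[] args) throws IOException {
            sendCommand("192.168.1.10", 8080, "secret", "pl_pause"); // toggle pause
        }
    }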
I am trying to create an Opus-based multicast server for an audio project I am working on; it will run on the ODROID-X (http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=g133999328931). At the moment I am unsure where to start on creating a multicast server for Linux or Android using the Opus codec. This is the first multicast audio server I will have built from scratch, so any pointers would be greatly appreciated.
Also, making the stream accessible and playable through a web page would be ideal, so that a specific app on the client side would not be needed.
Apparently Icecast does a lot of what you're looking for. It's open source (GPL) and supports Opus streams in the Ogg container format; you could have a peek at it for some general software-architecture ideas. My SoundWire Android app (with a Win/Linux server) does Opus streaming with low latency, but its network protocols are custom... I don't know of any established open protocols that can do low latency (by my definition, a one-second delay is not low latency).
My approach was to build a conventional network server that sets up a normal unicast UDP socket for each client. Avoid TCP if you want low latency; you'll then have to deal with the datagram nature of UDP in some way. With Opus, the amount of data streamed per client is not excessive. I use multicast only for discovery (to auto-locate a server).
I suggest you start with some open-source server code and adapt it to your needs, bring in Opus (which is very easy to integrate), and choose a container format such as Ogg if it's suitable (search for "Ogg Opus"). If you want browser compatibility, you'll more or less be implementing part of a web server (HTTP etc.) and will have to give up on your low-latency goals.
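A minimal sketch of the per-client unicast fan-out described above, assuming the Opus frames are already encoded elsewhere (e.g. via a JNI wrapper around libopus, which is not shown):

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.net.InetSocketAddress;
    import java.net.SocketAddress;
    import java.net.SocketException;
    import java.util.List;
    import java.util.concurrent.CopyOnWriteArrayList;

    public class OpusFanOut {
        private final DatagramSocket socket;
        private final List<SocketAddress> clients = new CopyOnWriteArrayList<>();

        public OpusFanOut() throws SocketException {
            socket = new DatagramSocket();
        }

        // Register a client discovered elsewhere (e.g. via a multicast beacon).
        public void addClient(InetAddress addr, int port) {
            clients.add(new InetSocketAddress(addr, port));
        }

        // Send one already-encoded Opus frame to every registered client.
        public void sendFrame(byte[] opusFrame) throws IOException {
            for (SocketAddress client : clients) {
                socket.send(new DatagramPacket(opusFrame, opusFrame.length, client));
            }
        }
    }

You would still need sequence numbers or timestamps in the payload so clients can cope with loss and reordering, which is exactly the part the datagram nature of UDP makes you handle yourself.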
As a general note, pending a reply to my comment: you will be disappointed to learn that multicast is pretty much useless here. Outside of some unusual configurations, which you will probably not encounter in the real world, multicast doesn't work over the Internet, as most routers are not configured to pass it. It's really only usable on local networks.
As far as making it accessible through a web page, you're pretty much out of luck. There is no native browser support for multicast, support for Opus is not widespread either, and most of the standard methods of extending the browser's capabilities (e.g., JavaScript and Flash) can't really help you much. You might be able to implement it in a Java applet, but the number of user agents with working Java installations is rapidly shrinking (particularly after the recent Java exploit), and the resulting applet may end up requiring elevated privileges to use multicast anyway.
Sending files over Bluetooth on Android is easy. I am fully aware of all the solutions that use the built-in Android system programs to handle the transfer.
But I've seen some apps that show transfer progress bit by bit without the system apps, and I am wondering how that can be achieved.
With the normal Android Bluetooth API, that seems a bit restricted, I think.
I was googling, reading, and browsing for countless hours, but all I was able to find was how to use intents and whatnot to send the file, which essentially uses the system app to send files...
I tried coding with an RFCOMM socket and was able to write the file's bytes to the stream, but that didn't work: the other end didn't recognize it as a valid data stream, and after some research it seems there is a lot more to the protocol for initiating a proper transfer than just writing to the stream.
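If both ends run your own app, though, you can define your own trivial framing over RFCOMM and report progress yourself; you only hit the problem you describe when the receiving end is the system's file-transfer service, which expects its own protocol. A hedged sketch (the UUID is a hypothetical app-specific value; the receiving app must listen on the same UUID with listenUsingRfcommWithServiceRecord):

    import android.bluetooth.BluetoothDevice;
    import android.bluetooth.BluetoothSocket;
    import java.io.DataOutputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.util.UUID;

    public class BtFileSender {
        // Hypothetical app-specific UUID; both apps must agree on it.
        private static final UUID APP_UUID =
                UUID.fromString("8ce255c0-200a-11e0-ac64-0800200c9a66");

        public interface ProgressListener {
            void onProgress(long sent, long total);
        }

        // Send a file in chunks over RFCOMM, reporting progress after each chunk.
        public static void sendFile(BluetoothDevice device, File file,
                                    ProgressListener listener) throws IOException {
            BluetoothSocket socket = device.createRfcommSocketToServiceRecord(APP_UUID);
            socket.connect();
            try (InputStream in = new FileInputStream(file);
                 DataOutputStream out = new DataOutputStream(socket.getOutputStream())) {
                out.writeLong(file.length()); // trivial custom framing: length prefix
                byte[] buf = new byte[4096];
                long sent = 0;
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    sent += n;
                    listener.onProgress(sent, file.length());
                }
            } finally {
                socket.close();
            }
        }
    }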