Implementing your own Android Bluetooth file transfer progress - android

Sending files over Bluetooth in Android is easy. I am fully aware of all the solutions that use the built-in Android system programs to handle the transfer.
But I've seen some apps that show transfer progress bit by bit without system apps, and I am wondering how that can be achieved.
The normal Android Bluetooth API seems a bit restricted for this, I think.
I've been googling, reading, and browsing for countless hours, but all I was able to find was how to use intents and the like to send the file, which essentially uses a system app to send it...
I tried coding with an RFCOMM socket and was able to write the file's bytes to the stream, but that didn't work: the other end didn't recognize it as a valid data stream. After some research it seems there is a lot more to the protocol than just writing to the stream; a proper transfer has to be initiated first.
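That extra protocol is OBEX, which the system Object Push transfer runs on top of RFCOMM. The receiver expects OBEX packets (CONNECT, then PUT requests carrying headers and body chunks), not raw file bytes, and since a large file is split across many PUT packets, counting the chunks you have written is exactly where a byte-by-byte progress bar comes from. As a rough, untested sketch of the packet layout (header IDs from the OBEX spec; `buildPutPacket` and the constants are names I made up):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

/** Sketch of a single OBEX PUT request packet, the unit the Object Push
 *  Profile sends over RFCOMM. A real transfer also needs an OBEX CONNECT
 *  first, and splits large bodies across multiple PUTs. */
public class ObexPut {
    static final byte OPCODE_PUT_FINAL = (byte) 0x82; // PUT with final bit set
    static final byte HDR_NAME = 0x01;                // UTF-16BE, null-terminated
    static final byte HDR_LENGTH = (byte) 0xC3;       // 4-byte object length
    static final byte HDR_END_OF_BODY = 0x49;         // last chunk of file data

    static byte[] buildPutPacket(String name, byte[] body) {
        ByteArrayOutputStream headers = new ByteArrayOutputStream();
        // Name header: id + 2-byte header length + UTF-16BE name + null terminator
        byte[] nameBytes = (name + "\0").getBytes(StandardCharsets.UTF_16BE);
        headers.write(HDR_NAME);
        writeShort(headers, 3 + nameBytes.length);
        headers.write(nameBytes, 0, nameBytes.length);
        // Length header: id + 4-byte big-endian object size (no length field)
        headers.write(HDR_LENGTH);
        int len = body.length;
        headers.write(len >>> 24); headers.write(len >>> 16);
        headers.write(len >>> 8);  headers.write(len);
        // End-of-Body header: id + 2-byte header length + payload chunk
        headers.write(HDR_END_OF_BODY);
        writeShort(headers, 3 + body.length);
        headers.write(body, 0, body.length);

        byte[] h = headers.toByteArray();
        ByteArrayOutputStream packet = new ByteArrayOutputStream();
        packet.write(OPCODE_PUT_FINAL);
        writeShort(packet, 3 + h.length); // total packet length incl. this prefix
        packet.write(h, 0, h.length);
        return packet.toByteArray();
    }

    private static void writeShort(ByteArrayOutputStream out, int v) {
        out.write(v >>> 8);
        out.write(v);
    }
}
```

You would write packets like this to the RFCOMM socket's output stream and wait for the peer's OBEX response code between chunks, updating your progress bar per chunk sent.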

Related

Standard way of streaming audio data from a mobile device to a server

There are plenty of good sources on how to receive audio data on mobile phones. I find (almost) none on streaming audio data TO a server in a standardized way (i.e., using a standard protocol such as RTP).
What is the standard way to stream audio data from Android or iOS to a server?
Some additional info:
The closest solutions I found are:
Objective c: Send audio data in rtp packet via socket, where the accepted answer does send audio data, but not inside RTP
Using ffserver, which can listen to a port and receive streamed audio for further processing. But it has been discontinued.
I could write all of that functionality myself, i.e., wrap the audio data in RTP, RTSP, RTMP, or whatever, and write a server that receives the stream, decodes it, and does the processing, but that's days of work for something that seems to be a standard task and thus should already exist.
Furthermore, I firmly believe that you shouldn't reinvent the wheel. If there's a good, established API for something, whether from Apple, Google, or a third party, writing the functionality myself is a bad idea, particularly when networking, and thus security, is involved.
I think I figured it out, or at least one standard way. One likely answer to my question is simply: SIP.
There are (even native) SIP interfaces on both iOS and Android, there are (many) ready-made servers for it, and very likely there exist libraries or command-line clients to run on the server side for the further processing.
I haven't tried it, but this looks very standardized and is likely widely used for such problems.
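For what it's worth, the "wrap the audio data in RTP" part the question mentions is the smallest piece of the work: an RTP packet is just a 12-byte fixed header (RFC 3550) in front of each encoded audio frame. A minimal sketch, assuming a dynamic payload type of 96 and a caller-chosen SSRC (the real values come from session/SDP negotiation):

```java
import java.nio.ByteBuffer;

/** Minimal RTP packetizer per RFC 3550: 12-byte header + payload.
 *  Payload type 96 and the SSRC are placeholders chosen by the caller. */
public class RtpPacketizer {
    private int seq = 0;
    private long timestamp = 0;
    private final int ssrc;
    private final int payloadType;
    private final int samplesPerFrame;

    public RtpPacketizer(int ssrc, int payloadType, int samplesPerFrame) {
        this.ssrc = ssrc;
        this.payloadType = payloadType;
        this.samplesPerFrame = samplesPerFrame;
    }

    public byte[] packetize(byte[] audioFrame) {
        ByteBuffer buf = ByteBuffer.allocate(12 + audioFrame.length);
        buf.put((byte) 0x80);                 // V=2, no padding/extension/CSRC
        buf.put((byte) (payloadType & 0x7F)); // marker=0, 7-bit payload type
        buf.putShort((short) seq);            // 16-bit sequence number
        buf.putInt((int) timestamp);          // 32-bit media timestamp
        buf.putInt(ssrc);                     // stream source identifier
        buf.put(audioFrame);
        seq = (seq + 1) & 0xFFFF;
        timestamp += samplesPerFrame;         // advances in sample-rate units
        return buf.array();
    }
}
```

The hard parts the question rightly wants to avoid rewriting are the signalling, jitter handling, and server side, which is exactly what SIP stacks and media servers provide.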

Streaming using libvlc and libstreaming

I'd like to get advice on how to do the following, if possible:
I've been given 3 Android devices, and I'd like to stream from one to another (maybe backwards too) at a time, but also save the stream on a third platform (a PC, say, to have a lot of space) for later processing. I'd like to make this PC a "server" where I receive a stream from device A, save it, and forward it to device B. I also want this type of connection between devices A-C and B-C at once. This is the idea in a nutshell.
What I have now is that I can stream device A's camera to device B using libstreaming, with libvlc to receive it.
Is it possible to achieve such a system, and if so, how difficult is it?
Thanks in advance for any kind of reply.
If you set up the stream as RTSP you can have multiple subscribers, so one device could subscribe and record to video, and the other could do whatever. They would then all get the feed at the same time, with no need for extra routing. This can all be done with libVLC, and it's not too difficult. You'll have to find examples online for the server and client. It only gets tricky if you want to stream data from memory directly (using the imem module on the server side) or write data to memory (using smem on the client side), but there are examples for this too.

Creating an OPUS codec based multicast server (android/linux)

I am trying to create an Opus-based multicast server for an audio project that I am working on. It will be running on the ODROID-X (http://www.hardkernel.com/renewal_2011/products/prdt_info.php?g_code=g133999328931). At the moment I am unsure where to start in creating a multicast server on Linux or Android using the Opus codec. This is the first multicast audio server I will have written from scratch, so any pointers would be greatly appreciated.
Making it accessible and playable through a web page would also be ideal, so that a specific app on the client side would not be needed.
Apparently Icecast does a lot of what you're looking for. It's open source (GPL) and supports Opus streams in the Ogg container format, so you could have a peek for some general software-architecture ideas. My SoundWire Android app (with a Win/Linux server) does Opus streaming with low latency, but the network protocols are custom... I don't know of any established open protocols that can do low latency (by my definition a 1-second delay is not low latency).
My approach was to build a conventional network server that sets up a normal unicast UDP socket for each client. Avoid TCP if you want low latency; you'll then have to deal with the datagram nature of UDP in some way. With Opus the amount of data streamed per client is not excessive. I use multicast only for discovery (to auto-locate a server).
I suggest you start with some open source server code and adapt it to your needs: bring in Opus, which is very easy to integrate, and choose a container format such as Ogg if it's suitable (search for "Ogg Opus"). If you want browser compatibility then you'll more or less be implementing part of a web server (HTTP etc.) and will have to give up on your low-latency goals.
As a general note, pending a reply to my comment: You will be disappointed to learn that multicast is pretty much useless. Outside of some unusual configurations which you will probably not encounter in the real world, multicast doesn't work over the Internet, as most routers are not configured to pass it. It's really only usable over local networks.
As far as making it accessible through a web page, you're pretty much out of luck. There is no native browser support for multicast, nor is support for Opus widespread, and most of the standard methods of extending the browser's capabilities (e.g., JavaScript and Flash) can't really help you much either. You might be able to implement it in a Java applet, but the number of user agents with working Java installations is rapidly shrinking (particularly after the recent Java exploit), and the resulting applet may end up requiring elevated privileges to use multicast anyway.

Read an audio stream during (GSM) phone call

Is it possible to read an audio stream during (GSM) phone call? I would like to write an encoding application, and I do not want to go with SIP&VoIP. Thank you.
This will be phone- and OS-dependent. There are several apps that claim to record call audio (Total Recall, Record My Call on Android), but they generally seem to record via the microphone, meaning the far-end sound is poor.
I don't believe either the Apple or Android APIs support access to the raw voice stream today.
Something to be aware of: in many places it is not legal to do this without informing the other party (i.e., the person on the other end of the call whose voice stream you are planning to 'capture'). This may not be relevant to your particular plans, but it is worth mentioning anyway.
If you have the option of doing the work in the network or on a PABX then you can create a basic (if not very efficient) solution by simply creating a three way (or conference) call.

MIDI on Android: Java and/or AIR libraries

I've been contemplating (re)building an app on iPad for some time, where I would use Objective-C and DSMI to send MIDI signals to a host computer. This is not bad (I mean, except for actually writing the app).
Now I'm contemplating perhaps developing the app for Android tablets (TBA).
In Java, what options are available for MIDI message communication? I'm quite familiar with javax.sound.midi, but then I would need a virtual MIDI port to send messages to the host.
On the other hand, if the app were done in Adobe AIR, what options would I have available for communicating with MIDI?
Obviously another option is to send/receive messages over a TCP/IP socket to a Java host, and talk that way, but it sounds a tad cumbersome... or perhaps not? DSMI does use a host program, after all.
javax.sound.midi is not available on Android.
The only access to MIDI functionality on Android is through the JetPlayer class, and it's very much designed for use in games. Android will play a MIDI file, but it's unclear what code path it uses; there's no way to write to the MIDI hardware from memory.
In one app I made, I needed to play notes dynamically based on GUI/user interaction, and I ended up having to use samples and pitch filters to create the notes.
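The sample-plus-pitch-shift trick boils down to equal-temperament math: to sound MIDI note n from one recorded sample of a base note, resample (change the playback rate) by 2^((n - base)/12). A small sketch of the arithmetic (class and method names are mine):

```java
/** Equal-temperament helpers for the sample + pitch-shift approach. */
public class PitchShift {
    /** Playback-rate multiplier that turns baseNote into targetNote. */
    public static double rateFor(int baseNote, int targetNote) {
        return Math.pow(2.0, (targetNote - baseNote) / 12.0);
    }

    /** Frequency of a MIDI note number (A4 = note 69 = 440 Hz). */
    public static double frequency(int note) {
        return 440.0 * Math.pow(2.0, (note - 69) / 12.0);
    }
}
```

So playing a sample of middle C (note 60) at 2x rate yields the C an octave up (note 72); in practice you keep the shift within a few semitones of the base sample to avoid audible artifacts.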
Sounds to me like what you need to do is port DSMI to Android. It's open source, and the iPhone library looks pretty simple, so it shouldn't be difficult to port over.
EDIT:
After thinking about this for a second: you wouldn't gain anything by using javax.sound.midi, or whatever MIDI functionality exists in AIR, anyway. All you need to do is pass MIDI messages over a network link to another device that is responsible for communicating with the actual MIDI synth. This is exactly what DSMI does. Porting the iPhone libdsmi to Android is what you need to do; it's only 2 source files and 2 headers. One handles the MIDI message format, which is very simple and can pretty much be converted to Java line by line; the other handles the network connection to the DSMI server, which will need to be rewritten to use Android semantics for creating network connections.
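The "MIDI message format" half really is that simple: a standard channel-voice message is 3 bytes, which maps trivially onto any socket. A sketch of the byte layout (DSMI's actual wire format may add its own framing around these bytes; the class is illustrative, not from libdsmi):

```java
/** Standard 3-byte MIDI channel-voice messages, ready to send over a socket. */
public class MidiMessageBytes {
    public static byte[] noteOn(int channel, int note, int velocity) {
        return new byte[] {
            (byte) (0x90 | (channel & 0x0F)), // status: note-on + channel 0..15
            (byte) (note & 0x7F),             // key number, 0..127
            (byte) (velocity & 0x7F)          // velocity, 0..127
        };
    }

    public static byte[] noteOff(int channel, int note) {
        return new byte[] {
            (byte) (0x80 | (channel & 0x0F)), // status: note-off + channel
            (byte) (note & 0x7F),
            (byte) 0x40                       // default release velocity
        };
    }
}
```

The host on the other end of the link just forwards these bytes to its MIDI output port.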
