Access voice stream before sending it through the network - android

I'm fairly new to Android and I was wondering: is there a way to access and edit the voice stream before it is sent over the mobile operator's network during a call? I guess the stream is in binary format? Thanks in advance

Related

How do I access voice data in a phone call?

I'm a newbie in the Android and telecom world.
I'm trying to access the voice stream in a phone call and encrypt it before sending it to the network.
Is there a way to access the voice stream in Android?

Streaming audio from an Android device to another

How would I go about streaming audio from one device to another over the internet? I'm aware of sending basic data using Java sockets, but I'm wondering how to:
Start streaming midway through a file (say, during the middle of a song)
What format is needed for the data being sent? MediaPlayer can take a URL as a data source, so how should the audio be represented when being sent from the server side?
Thanks
Having implemented a music streaming app, I can share a little with you.
If you want to stream and use the Android MediaPlayer class, MP3 or OGG is your best bet for a format.
If your architecture is client-server, i.e. a real server on the Internet serving streams to Android devices, then just stream MP3 or OGG bytes over HTTP. Just point MediaPlayer to a URL on your server.
If your architecture is peer-to-peer with your own custom socket code, you can create a "proxy HTTP" server that listens on localhost on a dedicated thread. You point your MediaPlayer instance to your local in-process socket server (e.g. http://localhost:54321/MyStream.mp3). Then you have to implement code to parse the HTTP GET request from MediaPlayer, then proxy the stream bytes between your custom P2P socket protocol and the listeners connected to your local HTTP server. A lot of radio streaming apps do exactly this so as to parse the ICECAST metadata from the MP3 stream. Here's the code I use for my radio streaming app that does this.
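To make the local-proxy idea concrete, here is a minimal sketch in plain Java (no Android specifics) of a localhost HTTP server that consumes the GET request MediaPlayer sends and streams bytes back. The `LocalStreamProxy` class name and the placeholder `InputStream` source are my own; real code would need proper request parsing, support for multiple connections, Range/ICY headers, and error handling.

```java
import java.io.*;
import java.net.*;

// Minimal sketch of a localhost "proxy HTTP" server. MediaPlayer would be
// pointed at http://localhost:<port>/stream.mp3; the bytes come from some
// InputStream (e.g. your custom P2P socket stream).
class LocalStreamProxy implements Runnable {
    private final ServerSocket serverSocket;
    private final InputStream source; // placeholder for your P2P stream

    LocalStreamProxy(InputStream source) throws IOException {
        this.serverSocket = new ServerSocket(0); // bind any free local port
        this.source = source;
    }

    int getPort() {
        return serverSocket.getLocalPort();
    }

    @Override
    public void run() {
        try (Socket client = serverSocket.accept()) {
            // Consume (and ignore) the HTTP GET request headers.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(client.getInputStream(), "US-ASCII"));
            String line;
            while ((line = in.readLine()) != null && !line.isEmpty()) {
                // a real proxy would parse the request line and Range header
            }

            OutputStream out = client.getOutputStream();
            out.write(("HTTP/1.1 200 OK\r\n"
                    + "Content-Type: audio/mpeg\r\n"
                    + "Connection: close\r\n\r\n").getBytes("US-ASCII"));

            // Proxy the stream bytes to the client until the source ends.
            byte[] buf = new byte[4096];
            int n;
            while ((n = source.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            out.flush();
        } catch (IOException ignored) {
            // sketch: a real proxy would log and recover
        }
    }
}
```

Binding to port 0 lets the OS pick a free port, which you then interpolate into the URL handed to MediaPlayer.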
For the "start midway through the file" scenario, you might find my MP3 stream reader class useful. It wraps an InputStream (file, socket stream, etc.) and syncs to the next valid frame from wherever you started. Just call read_next_chunk to get the next block of audio and its format. MediaPlayer might do most of this heavy lifting for you, so this might not be needed.
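For reference, syncing to an MP3 frame boundary comes down to scanning for the frame sync word: a 0xFF byte followed by a byte whose top three bits are set. Below is a simplified sketch of that scan (not the class mentioned above, and names are my own); real code should also validate the rest of the 4-byte header (version, layer, bitrate index) to rule out false syncs inside audio data.

```java
import java.io.*;

// Sketch: scan a stream for the next MP3 frame sync (11 set bits:
// 0xFF followed by a byte with its top 3 bits set), leaving the stream
// positioned at the start of the header. The PushbackInputStream must be
// constructed with a pushback buffer of at least 2 bytes.
class Mp3Sync {
    /** Returns the number of bytes skipped before the sync word, or -1 if none found. */
    static int skipToNextFrame(PushbackInputStream in) throws IOException {
        int skipped = 0;
        int prev = in.read();
        int cur;
        while (prev != -1 && (cur = in.read()) != -1) {
            if (prev == 0xFF && (cur & 0xE0) == 0xE0) {
                in.unread(cur);
                in.unread(prev); // put the header bytes back for the caller
                return skipped;
            }
            prev = cur;
            skipped++;
        }
        return -1; // end of stream, no sync found
    }
}
```

After the call, the caller can read the full 4-byte header, compute the frame length from the bitrate and sample-rate fields, and hand whole frames to the player.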

Get voice stream during android call

I have a Wi-Fi connection between a Java desktop app and an Android app. During a call, I need to transfer the other party's voice to the desktop side and also transfer my voice from the desktop's microphone to the other party. How can I do it? How can I get the input and output streams of the call?
You cannot handle GSM call data (you can neither send it over the uplink nor receive it over the downlink). I think you can try SIP calls if that meets your requirement.

android RTP send and receive program

I am new to Android programming and I need some ideas about RTP programming on Android. Questions:
1. How to capture the microphone audio data on an Android device?
2. How to construct an RTP packet from the captured microphone audio data without using an API?
3. How to transmit the RTP packet to another Android device?
4. How to play a received RTP packet on Android?
For transmitting and receiving RTP packets, I would suggest looking into the jlibrtp library. Basically you initialize it with 2 DatagramSockets (one for sending RTP data and one for receiving RTCP data), define a payload type, add a recipient, and send byte arrays. I believe it handles the RTP timestamps by itself, but you have to make sure your payload is already formatted per the RFC recommendations.
Here is an example of how you would set up an RTP session
Answers for your questions..
1. Use the Android media APIs: AudioRecord for recording the voice data and AudioTrack for playing the voice data, both in PCM format.
2. Go through this link
3. You have to use SIP for transmitting packets.
4. Go through this link
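On question 2, the fixed part of an RTP packet is simple enough to build by hand from RFC 3550: a 12-byte header followed by the payload, which can then be sent as a UDP datagram (the jlibrtp answer above likewise sits on DatagramSockets). A hedged sketch, with class and method names of my own choosing:

```java
import java.nio.ByteBuffer;

// Sketch of building a raw RTP packet per RFC 3550: a fixed 12-byte header
// followed by the payload bytes (e.g. one encoded audio frame). This omits
// padding, extensions, and CSRC lists, which the first byte declares absent.
class RtpPacketBuilder {
    static byte[] build(int payloadType, int sequence, long timestamp,
                        long ssrc, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length); // big-endian
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F)); // M=0, 7-bit payload type
        buf.putShort((short) sequence);       // sequence number (wraps at 2^16)
        buf.putInt((int) timestamp);          // media timestamp (sample clock)
        buf.putInt((int) ssrc);               // synchronization source id
        buf.put(payload);
        return buf.array();
    }
}
```

Sending is then just `socket.send(new DatagramPacket(pkt, pkt.length, address, port))`, and the receiver strips the first 12 bytes (after checking the version bits) before handing the payload to the decoder.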

Adobe AIR for mobile: Using Bluetooth audio as "Microphone"

I'm developing an AIR for Android application, and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish to publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?
I think one possible solution would be using NetConnection.send() instead of NetStream.publish().
You should get the sound data from your BT microphone. I am not sure if you can get it using AIR. You may need to use an Android service that gets the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc.
When you get some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or send it to other Flash clients if it is a chat app.
Other than that, I can't find a way to have a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.
