I have an Android client app that records audio from the mic and sends it over a socket. Now I want to build another app that creates a ServerSocket, receives that audio, and plays it.
Can anybody guide me on that?
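For reference, here is a minimal sketch of both ends, assuming uncompressed 16-bit PCM mono at 8 kHz over TCP on a hypothetical port (error handling and threading are omitted, and the recording app needs the RECORD_AUDIO and INTERNET permissions). The client captures from the mic with AudioRecord and writes raw frames to the socket:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.OutputStream;
import java.net.Socket;

public class MicStreamer {
    private static final int SAMPLE_RATE = 8000; // hypothetical; must match the receiver

    public void stream(String host, int port) throws Exception {
        int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize);
        Socket socket = new Socket(host, port);
        OutputStream out = socket.getOutputStream();
        byte[] buffer = new byte[bufSize];
        recorder.startRecording();
        try {
            // Runs until the receiver closes the connection, which makes write() throw.
            while (true) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) out.write(buffer, 0, read);
            }
        } finally {
            recorder.stop();
            recorder.release();
            socket.close();
        }
    }
}
```

The second app accepts one connection on a ServerSocket and plays whatever PCM arrives with AudioTrack:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class AudioReceiver {
    private static final int SAMPLE_RATE = 8000; // must match the sender

    public void listen(int port) throws Exception {
        int bufSize = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                bufSize, AudioTrack.MODE_STREAM);
        ServerSocket server = new ServerSocket(port);
        Socket client = server.accept(); // blocks until the recording app connects
        InputStream in = client.getInputStream();
        byte[] buffer = new byte[bufSize];
        track.play();
        int read;
        while ((read = in.read(buffer)) != -1) {
            track.write(buffer, 0, read); // play PCM as it arrives
        }
        track.stop();
        track.release();
        server.close();
    }
}
```

For anything beyond a demo you would probably compress the audio (e.g. AMR) and consider UDP, since TCP retransmissions add latency to live audio.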
I'm working on a protocol to transmit data over voice calls on Android. Audio files would be played into the voice call without any involvement from the user. Is this possible programmatically?
I'm working on an entry phone. The entry phone sends me voice and video over the RTSP protocol, so I can simply get voice and video from its camera on my device. But I have no idea how to send voice back to that device. Is there anything that would help me send and receive audio at the same time (something like a call)?
If I understand correctly, you want video calls, just like the sipdroid app. It is an open-source project; look at the VideoCamera.java class in that project:
http://code.google.com/p/sipdroid/
I'm developing an AIR for Android application and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish to publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?
I think one possible solution would be to use NetConnection.send() instead of NetStream.publish().
1. Get the sound data from your BT microphone. I am not sure you can do that with AIR alone; you may need an Android service that captures the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc. (a sketch of such a service appears after this answer).
2. When you have some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
3. Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
4. Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or send it to other Flash clients if it is a chat app.
Other than that, I can't find a way to have a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.
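As a rough illustration of steps 1 and 2 above, the capture service could look something like the sketch below. The local UDP port, the class names, and the use of raw PCM are all my assumptions, and the Bluetooth SCO routing calls behave differently across devices; it also needs the RECORD_AUDIO, BLUETOOTH, and MODIFY_AUDIO_SETTINGS permissions.

```java
import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.IBinder;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class BtMicForwardService extends Service {
    private static final int SAMPLE_RATE = 8000; // SCO links carry 8 kHz audio
    private static final int AIR_PORT = 7070;    // hypothetical port the AIR app reads
    private volatile boolean running;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        running = true;
        new Thread(this::forward).start();
        return START_STICKY;
    }

    private void forward() {
        AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
        // Route the input through the Bluetooth headset. SCO setup is asynchronous,
        // so a real app should wait for ACTION_SCO_AUDIO_STATE_UPDATED before recording.
        am.startBluetoothSco();
        am.setBluetoothScoOn(true);
        int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize);
        try (DatagramSocket socket = new DatagramSocket()) {
            InetAddress loopback = InetAddress.getByName("127.0.0.1");
            byte[] buffer = new byte[bufSize];
            recorder.startRecording();
            while (running) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) {
                    // Raw PCM; encode to Speex/Nellymoser here or in the AIR app.
                    socket.send(new DatagramPacket(buffer, read, loopback, AIR_PORT));
                }
            }
        } catch (Exception e) {
            // logging omitted for brevity
        } finally {
            recorder.stop();
            recorder.release();
            am.stopBluetoothSco();
        }
    }

    @Override
    public void onDestroy() { running = false; }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}
```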
How can I make the device sound a beep when a socket connection has been established? In other words, how do I play a notification tone when an event occurs with the Android SDK?
If you are the one opening the socket, use MediaPlayer, SoundPool, ToneGenerator, AudioTrack, or something similar to play back a beep.
If you are trying to arrange for beeps when other applications open sockets, that is not possible without firmware modifications.
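For the first case, a minimal sketch using ToneGenerator right after the connection succeeds (host and port are placeholders, and this must run off the main thread, since Android forbids network I/O on the UI thread):

```java
import android.media.AudioManager;
import android.media.ToneGenerator;
import java.net.Socket;

public class BeepOnConnect {
    public static Socket connectWithBeep(String host, int port) throws Exception {
        Socket socket = new Socket(host, port); // throws if the connection fails
        // Short beep on the notification stream at ~80% volume.
        ToneGenerator tone = new ToneGenerator(AudioManager.STREAM_NOTIFICATION, 80);
        tone.startTone(ToneGenerator.TONE_PROP_BEEP, 150);
        Thread.sleep(200); // crude: let the tone finish before releasing the resources
        tone.release();
        return socket;
    }
}
```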
General video players connect to the media server through unicast,
but I need a player that can receive the media stream via multicast/broadcast.
Scenario:
Media Server ---> AP --(multicast/broadcast video stream)--> player (Android phone)
Is there any Android SDK support for this function?
Or is there any solution that avoids developing a software codec and an RTP stack?
James.
Here is a post about Android and multicast support: How to receive Multicast packets on Android
The question about a multicast video-streaming protocol is a separate issue. There should be nothing Android-specific required (assuming that receiving multicast data is all you need from Android).
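One Android-specific detail is worth showing: on many devices you must hold a WifiManager.MulticastLock, or multicast packets are filtered out before they reach your socket. A minimal receive loop, with a hypothetical group address and port, assuming the CHANGE_WIFI_MULTICAST_STATE permission:

```java
import android.content.Context;
import android.net.wifi.WifiManager;
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

public class MulticastReceiver {
    private static final String GROUP = "239.0.0.1"; // hypothetical; use your server's group
    private static final int PORT = 5004;            // hypothetical RTP port

    public void receive(Context context) throws Exception {
        WifiManager wifi = (WifiManager) context.getApplicationContext()
                .getSystemService(Context.WIFI_SERVICE);
        // Without this lock, many handsets drop multicast traffic to save power.
        WifiManager.MulticastLock lock = wifi.createMulticastLock("video-stream");
        lock.acquire();
        try (MulticastSocket socket = new MulticastSocket(PORT)) {
            socket.joinGroup(InetAddress.getByName(GROUP));
            byte[] buffer = new byte[2048];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet); // blocks until a datagram arrives
                // Hand packet.getData() / packet.getLength() to your RTP depacketizer.
            }
        } finally {
            lock.release();
        }
    }
}
```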
Getting the new codec to show up as a video-playing app in Android is a separate issue. See this question:
How to add a new video codec to Android?