I'm working with an entry phone. The entry phone sends voice and video to me over the RTSP protocol, so I can easily receive both from its camera on my device. But I have no idea how to send voice back to that device. Is there anything that would help me send and receive audio at the same time (something like a call)?
If I understand correctly, you want video calls, just like the sipdroid app. It is an open-source project; look at the VideoCamera.java class in that project.
http://code.google.com/p/sipdroid/
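If sipdroid's code is too much to digest at once, the core Android-side idea is simply to run capture and playback at the same time: an AudioRecord thread for your outgoing voice and an AudioTrack thread for the incoming audio. Here is a minimal sketch; the PacketTransport interface is a placeholder for whatever RTP/UDP plumbing you end up using, not a real API:

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

public class DuplexAudio {
    private static final int RATE = 8000; // 8 kHz mono PCM, typical for intercom audio
    private volatile boolean running = true;

    // Uplink: read PCM from the microphone and hand it to your own transport.
    public void startSending(final PacketTransport transport) {
        new Thread(new Runnable() {
            public void run() {
                int min = AudioRecord.getMinBufferSize(RATE,
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
                AudioRecord rec = new AudioRecord(
                        MediaRecorder.AudioSource.VOICE_COMMUNICATION,
                        RATE, AudioFormat.CHANNEL_IN_MONO,
                        AudioFormat.ENCODING_PCM_16BIT, min);
                byte[] buf = new byte[min];
                rec.startRecording();
                while (running) {
                    int n = rec.read(buf, 0, buf.length);
                    if (n > 0) transport.send(buf, n); // e.g. wrap in RTP, send over UDP
                }
                rec.stop();
                rec.release();
            }
        }).start();
    }

    // Downlink: play PCM received from the entry phone.
    public void startPlaying(final PacketTransport transport) {
        new Thread(new Runnable() {
            public void run() {
                int min = AudioTrack.getMinBufferSize(RATE,
                        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
                AudioTrack track = new AudioTrack(AudioManager.STREAM_VOICE_CALL, RATE,
                        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                        min, AudioTrack.MODE_STREAM);
                track.play();
                byte[] buf = new byte[min];
                while (running) {
                    int n = transport.receive(buf); // blocks until a packet arrives
                    if (n > 0) track.write(buf, 0, n);
                }
                track.stop();
                track.release();
            }
        }).start();
    }

    public void stop() { running = false; }

    // Placeholder for your RTP/UDP layer; not a real Android API.
    public interface PacketTransport {
        void send(byte[] data, int len);
        int receive(byte[] data);
    }
}
```

The transport half (packaging the PCM into RTP and negotiating the backchannel with the entry phone) is exactly the part sipdroid implements, which is why reusing it is the easier route.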
Related
Hi, I'm coding an Android app which handles telephony calls. For an answering-machine feature, I need to play a pre-recorded voice message before recording the call. Is there any way to play recorded audio to the caller, like an answering machine, in Android?
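One caveat up front: stock Android has no public API to inject audio into the call uplink, so whether the remote caller actually hears the greeting is device/ROM dependent. As a sketch of the playback half only, assuming a greeting bundled as a raw resource (R.raw.greeting is a hypothetical name):

```java
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.AudioManager;
import android.media.MediaPlayer;
import java.io.IOException;

public class GreetingPlayer {
    // Plays a bundled greeting on the voice-call stream while a call is active.
    // R.raw.greeting is a hypothetical resource; ship your own recording.
    public static MediaPlayer play(Context ctx) throws IOException {
        AudioManager am = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);
        am.setMode(AudioManager.MODE_IN_CALL);
        am.setSpeakerphoneOn(false);

        MediaPlayer mp = new MediaPlayer();
        // The stream type must be set before prepare().
        mp.setAudioStreamType(AudioManager.STREAM_VOICE_CALL);
        AssetFileDescriptor afd = ctx.getResources().openRawResourceFd(R.raw.greeting);
        mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
        afd.close();
        mp.prepare();
        mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            public void onCompletion(MediaPlayer player) {
                player.release();
                // Start your call-recording logic here.
            }
        });
        mp.start();
        return mp;
    }
}
```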
I have a device (an ESP32) that has a Bluetooth module. I found a library for it that acts as an A2DP sink, and it works perfectly with the Spotify app. My objective is to send small packets of data to it from my Android phone (device name, notifications, etc.) while the Spotify app is streaming media to it. What would be the best way of doing that? I tried to approach this with the MediaSession.setMetadata method, but it seems it only works with actual media playback.
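One approach that works alongside A2DP is to open a classic Bluetooth RFCOMM (Serial Port Profile) channel to the same device; ESP-IDF can run an SPP server next to an A2DP sink, though this sketch assumes your ESP32 firmware actually registers one. The Android side might look like this:

```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.IOException;
import java.io.OutputStream;
import java.util.UUID;

public class Esp32Link {
    // Well-known Serial Port Profile UUID.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    // Opens an RFCOMM channel to the already-paired ESP32 and sends one packet.
    // This runs independently of the A2DP stream Spotify is using.
    public static void sendPacket(String macAddress, byte[] payload) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        BluetoothDevice device = adapter.getRemoteDevice(macAddress);
        BluetoothSocket socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
        adapter.cancelDiscovery(); // an active discovery slows down or breaks connect()
        try {
            socket.connect();
            OutputStream out = socket.getOutputStream();
            out.write(payload);
            out.flush();
        } finally {
            socket.close();
        }
    }
}
```

The other common route is a BLE GATT characteristic, if your firmware exposes one; either way the data channel is separate from the A2DP media stream.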
I have an IP camera from Apexis, model J011ws, and I'm developing an app to control it. I've been trying for some days to send audio to the camera, but I don't know how to do this. I'm using PhoneGap, and I haven't found anything helpful on the internet.
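Apexis/Foscam-style cameras typically accept talk-back audio through a CGI endpoint on their built-in HTTP server, but the exact path, parameters, and audio encoding are firmware specific, so check the Apexis CGI SDK document for your model. Since PhoneGap can't stream raw bytes by itself, you would wrap something like this Java sketch in a native plugin; the /audiostream.cgi path and its parameters here are assumptions to verify, not a documented API:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class CameraTalk {
    // Pushes raw audio bytes to the camera over HTTP.
    public static void pushAudio(String host, String user, String pwd, byte[] audio)
            throws IOException {
        // Hypothetical endpoint: verify the real path/params in the Apexis CGI SDK.
        URL url = new URL("http://" + host + "/audiostream.cgi?user=" + user + "&pwd=" + pwd);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setChunkedStreamingMode(0); // stream without buffering the whole payload
        OutputStream out = conn.getOutputStream();
        out.write(audio);
        out.flush();
        out.close();
        conn.getResponseCode(); // force the request; check for 200 OK
        conn.disconnect();
    }
}
```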
I'm developing an AIR for Android application and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish to publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?
I think one possible solution would be to use NetConnection.send() instead of NetStream.publish().
1. Get the sound data from your BT microphone. I am not sure you can do that with AIR alone; you may need an Android service that captures the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc.
2. Once you have some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
3. Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
4. Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or forward it to other Flash clients if it is a chat app.
Other than that, I can't find a way to get a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.
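As a rough sketch of step 1, an Android service could bring up the Bluetooth SCO link, capture PCM with AudioRecord, and push each buffer to the AIR app over a local UDP port. The port number 50005 and the choice of UDP here are arbitrary assumptions, and the Speex/Nellymoser encoding from step 2 would happen before the send:

```java
import android.content.Context;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class BtMicForwarder {
    private volatile boolean running = true;

    // Routes input to the BT headset via SCO, then forwards raw PCM to a local
    // UDP port where the AIR app listens (50005 is an arbitrary choice).
    public void start(Context ctx) throws Exception {
        AudioManager am = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);
        am.startBluetoothSco();           // bring up the SCO audio link
        am.setBluetoothScoOn(true);

        int rate = 8000;                  // SCO links carry 8 kHz mono audio
        int min = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        final AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                rate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, min);
        final DatagramSocket sock = new DatagramSocket();
        final InetAddress self = InetAddress.getByName("127.0.0.1");
        final byte[] buf = new byte[min];

        rec.startRecording();
        new Thread(new Runnable() {
            public void run() {
                try {
                    while (running) {
                        int n = rec.read(buf, 0, buf.length);
                        if (n > 0) {
                            // Encode to Speex/Nellymoser here before sending, if needed.
                            sock.send(new DatagramPacket(buf, n, self, 50005));
                        }
                    }
                } catch (Exception ignored) {
                } finally {
                    rec.stop();
                    rec.release();
                    sock.close();
                }
            }
        }).start();
    }

    public void stop() { running = false; }
}
```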
General video players connect to the media server via unicast,
but I need a player that can receive the media stream via multicast/broadcast.
scenario:
Media Server ---> AP --(multicast/broadcast video stream)--> player(android phone)
Is there any Android SDK support for this function?
Or is there any solution that doesn't require developing a software codec and an RTP stack?
James.
Here is a post about Android and multicast support: How to receive Multicast packets on Android
The question about a multicast video streaming protocol is a separate issue; there should be nothing Android-specific required, assuming that receiving multicast data is all you need from Android.
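In short, the Android-specific pieces from that post are a WifiManager.MulticastLock (most Wi-Fi drivers drop multicast frames without it) and a plain MulticastSocket. A minimal receive loop might look like this, with the group address and port as placeholders for whatever your media server uses:

```java
import android.content.Context;
import android.net.wifi.WifiManager;
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

public class MulticastReceiver {
    // Receives UDP packets sent to a multicast group. The group and port are
    // placeholders for whatever your media server announces.
    public static void receive(Context ctx, String group, int port) throws Exception {
        WifiManager wifi = (WifiManager) ctx.getSystemService(Context.WIFI_SERVICE);
        // Without this lock, most Android Wi-Fi drivers filter out multicast frames.
        WifiManager.MulticastLock lock = wifi.createMulticastLock("video-mcast");
        lock.acquire();
        try {
            MulticastSocket sock = new MulticastSocket(port);
            sock.joinGroup(InetAddress.getByName(group));
            byte[] buf = new byte[2048];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                sock.receive(packet);
                // Hand packet.getData()/getLength() to your RTP depacketizer/decoder.
            }
        } finally {
            lock.release();
        }
    }
}
```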
Getting a new codec to show up as a video-playing app in Android is yet another issue. See this question:
How to add a new video codec to Android?