I'm building a video chat application, and I'm wondering whether it is possible to make/receive phone calls during a video conversation using OpenTok.
The main point is to let the phone caller and the video companion hear each other, some kind of conference.
I've read the OpenTok documentation for Android and iOS developers and didn't find anything helpful.
I've tried to test this, and there is no sound in the video conversation while a call is active. It looks like they use the same audio input and output.
I know this is a bad question, but I don't know how to phrase a proper query for Google or the documentation.
There is no such connectivity right now, so you cannot mix an OpenTok session and a regular mobile phone call. You may be interested in inspecting connectivity with SIP networks:
https://tokbox.com/developer/guides/sip/
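If SIP interconnect fits your case, the bridging is done with a server-side REST call rather than in the mobile SDKs. A rough sketch in Java, assuming the dial endpoint and JSON body described in the guide above (verify both there); the API key, project JWT, session ID, client token and SIP URI are all placeholders:

    // Sketch: bridge a SIP/PSTN endpoint into an OpenTok session via the SIP
    // interconnect REST call. Endpoint path, auth header and body shape are taken
    // from the guide linked above and should be double-checked against it.
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class OpenTokSipDial {
        public static void main(String[] args) throws Exception {
            String apiKey = "YOUR_API_KEY";          // placeholder
            String projectJwt = "YOUR_PROJECT_JWT";  // placeholder auth token
            String body = "{"
                    + "\"sessionId\":\"YOUR_SESSION_ID\","             // session the SIP leg joins
                    + "\"token\":\"YOUR_CLIENT_TOKEN\","               // token for the SIP participant
                    + "\"sip\":{\"uri\":\"sip:+15551234567@sip.example.com\"}" // placeholder gateway URI
                    + "}";

            URL url = new URL("https://api.opentok.com/v2/project/" + apiKey + "/dial");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setRequestProperty("X-OPENTOK-AUTH", projectJwt);
            conn.setDoOutput(true);
            try (OutputStream os = conn.getOutputStream()) {
                os.write(body.getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("Dial request returned HTTP " + conn.getResponseCode());
        }
    }

With that in place, the phone participant is mixed into the session by the SIP gateway rather than by the mobile device, which should sidestep the shared audio input/output problem you observed.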
Related
Let's imagine I want to build an app for video conferences or video meetings that is Chromecast-compatible.
So I can use my Android device to join a meeting, and then just click something like "cast to device" and get the audio and video on my TV.
I also noticed that Skype, Zoom, Discord and similar apps have no built-in Chromecast support. So maybe it is impossible?
I tried to find something about this but found nothing useful. I found the Zoom SDK and the Chromecast SDK, but I don't see what I should actually cast. As I understand it, Chromecast can only send media content to the receiver, and the Zoom SDK does not actually provide any media-like links for a conversation (video conference).
So I want to know what my steps are to build my own Chromecast-compatible meeting app, and what I might want to use to develop an app like that.
Or maybe someone knows how I can use the Zoom SDK to achieve what I want?
When a sender device connects to a Chromecast, it has a limited set of "commands" it can send to the device.
There is no immediate way to send data other than those messages from the sender to the device; there is a 'custom' message, but that too only carries stringified JSON.
The way this would have to work is to set up a stream of the screen/app you want to display and send a LOAD message to the Chromecast, which would then connect back to the sender (or to a third-party server the video can be streamed from) and play it.
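A minimal sketch of that LOAD step with the Google Cast sender framework on Android; the HLS URL is a placeholder that your app or server would have to back with an actual stream of the meeting:

    // Sketch: once the user has picked a Chromecast and a Cast session exists,
    // ask the default media receiver to LOAD a live stream. Producing the
    // meeting stream at the placeholder URL is the hard part and is not shown.
    import android.content.Context;
    import com.google.android.gms.cast.MediaInfo;
    import com.google.android.gms.cast.MediaLoadRequestData;
    import com.google.android.gms.cast.framework.CastContext;
    import com.google.android.gms.cast.framework.CastSession;
    import com.google.android.gms.cast.framework.media.RemoteMediaClient;

    public class MeetingCaster {
        public void castMeetingStream(Context context) {
            CastSession session = CastContext.getSharedInstance(context)
                    .getSessionManager()
                    .getCurrentCastSession();
            if (session == null) return; // no Chromecast selected yet

            MediaInfo mediaInfo = new MediaInfo.Builder("https://example.com/meeting/live.m3u8") // placeholder
                    .setStreamType(MediaInfo.STREAM_TYPE_LIVE)
                    .setContentType("application/x-mpegURL")
                    .build();

            RemoteMediaClient client = session.getRemoteMediaClient();
            if (client != null) {
                // This is the LOAD message: the Chromecast pulls the stream
                // itself from the given URL and renders it on the TV.
                client.load(new MediaLoadRequestData.Builder()
                        .setMediaInfo(mediaInfo)
                        .setAutoplay(true)
                        .build());
            }
        }
    }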
I'm not too confident this is going to work as intended though - there will be a significant delay between the sender and the Chromecast, and since the Chromecast has no mic, you will also have to use the mic of the sender and deal with the acoustic feedback.
I want to do a conference call using the Android SIP API. Is that possible? Can someone give a working example, please? Also, are there any limitations of using that library, e.g. can it work on 3G or 4G?
The current Android SIP API does not support creating conferences. However, the functionality could be implemented on a SIP Application Server (AS).
In theory, it would be possible to use FACs (Feature Access Codes) at the server side.
Example:
You are in a call with +1234567890 and want to add +1234567890 to the call.
By using the makeAudioCall API with a supported FAC in the peerProfileUri, let's say "tel:*111#+1234567890#+1234567890#", a capable AS could place the second call and create a conference.
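A sketch of placing that FAC-encoded call with android.net.sip; whether the "*111#...#" string actually creates a conference is entirely up to the SIP Application Server, and the FAC format here is hypothetical:

    // Sketch: dial a hypothetical conference FAC with the Android SIP API.
    // The AS, not the phone, is expected to bridge the call legs together.
    import android.content.Context;
    import android.net.sip.SipAudioCall;
    import android.net.sip.SipManager;
    import android.net.sip.SipProfile;

    public class FacConferenceCall {
        public void dialConference(Context context, SipProfile localProfile) throws Exception {
            SipManager sipManager = SipManager.newInstance(context);

            // Hypothetical Feature Access Code understood by the server:
            // *111# followed by the numbers to bridge into one conference.
            String peerUri = "tel:*111#+1234567890#+1234567890#";

            SipAudioCall.Listener listener = new SipAudioCall.Listener() {
                @Override
                public void onCallEstablished(SipAudioCall call) {
                    call.startAudio(); // from here on, the AS mixes the audio
                }
            };

            // 30-second timeout for establishing the call
            sipManager.makeAudioCall(localProfile.getUriString(), peerUri, listener, 30);
        }
    }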
Also, are there any limitations of using that library, e.g. can it work on 3G or 4G?
There are no limitations; it can work on 3G/4G as long as there is enough bandwidth for the audio. Audio mixing is normally done at the server side, so each participant sends and receives audio as in a regular call (no additional streams).
Is it possible in Android to make a phone call or a SIP call and play a sound file after the call is established? Another option that would be OK for me is that, after the call is established, the TTS engine reads some text so that the person on the other side can hear it.
Is this possible?
Thanks!
If you mean played locally (i.e. only you can hear it), then sure. That should work without any special tricks.
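A minimal sketch of that local-playback case, routing a sound file onto the in-call audio stream; the file path is a placeholder:

    // Sketch: play a sound file on the voice-call stream while a call is up.
    // Only the local user hears it; nothing is injected into the uplink.
    import android.media.AudioManager;
    import android.media.MediaPlayer;

    public class InCallLocalPlayer {
        private MediaPlayer player;

        public void play(String path) throws Exception {
            player = new MediaPlayer();
            player.setAudioStreamType(AudioManager.STREAM_VOICE_CALL); // in-call audio routing
            player.setDataSource(path); // e.g. "/sdcard/prompt.mp3" (placeholder)
            player.setOnCompletionListener(mp -> stop());
            player.prepare();
            player.start();
        }

        public void stop() {
            if (player != null) {
                player.release();
                player = null;
            }
        }
    }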
If you mean injecting audio into the uplink so that the other party can hear it, then no - at least not during a normal voice call. Perhaps it would be possible during a SIP call if you implement the whole SIP stack yourself and generate the audio packets in your app. I'm not really familiar with how SIP calls work, so I can't say whether that would work or not.
I'm developing an AIR for Android application, and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish to publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?
I think one possible solution would be to use NetConnection.send() instead of NetStream.publish().
You need to get the sound data from your BT microphone. I am not sure you can get it with AIR alone; you may need an Android service that captures the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc. (see the sketch after these steps).
When you get some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or send it to other Flash clients if it is a chat app.
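A sketch of the Android side of the first step: turn on Bluetooth SCO so the headset mic is used, record raw PCM, and push 20 ms frames to a local UDP port the AIR app can read from. The encoding step (Speex/Nellymoser) is left out, and the port number and frame size are arbitrary choices for illustration:

    // Sketch: capture audio from a Bluetooth headset and forward PCM frames
    // over loopback UDP to the AIR app. Run this from an Android service.
    import android.content.Context;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class BtMicForwarder {
        private volatile boolean running;

        public void start(Context context) throws Exception {
            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            am.startBluetoothSco();       // bring up the SCO audio link
            am.setBluetoothScoOn(true);   // route mic input through the BT headset

            int rate = 8000;              // SCO audio is narrowband
            int bufSize = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    rate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);

            DatagramSocket socket = new DatagramSocket();
            InetAddress local = InetAddress.getByName("127.0.0.1");
            byte[] frame = new byte[320]; // 20 ms of 16-bit mono at 8 kHz

            running = true;
            recorder.startRecording();
            while (running) {
                int read = recorder.read(frame, 0, frame.length);
                if (read > 0) {
                    // The AIR app listens on this (arbitrary) port and relays
                    // the data to the server with NetConnection.send().
                    socket.send(new DatagramPacket(frame, read, local, 7474));
                }
            }
            recorder.stop();
            recorder.release();
            socket.close();
            am.setBluetoothScoOn(false);
            am.stopBluetoothSco();
        }

        public void stop() { running = false; }
    }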
Other than that, I can't find a way to have a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.
I want to add a Push To Talk feature to my application for communication within my team. Besides this, I also need some kind of text messaging. But I want it to be able to work over GPRS. I found that the SIP API can be used for making voice calls, but it says that it requires Wi-Fi. I want it to run on Wi-Fi as well as GPRS.
Can somebody give me some idea where to start from?
Push To Talk in SIP is just a regular call, with RTP doing the tricky floor control.
There's usually a media server involved broadcasting the voice bursts to all participants to save on the scarce upload bandwidth. The server usually has a public address simplifying NAT traversal for participants.
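As a toy illustration of that relay idea, here is a bare UDP server that remembers everyone who has sent it audio and forwards each incoming packet to all other participants; a real deployment would use RTP and proper floor control, and the port number is arbitrary:

    // Toy sketch of a talk-burst relay: learn participants from incoming
    // packets and broadcast each packet to everyone except the sender.
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.SocketAddress;
    import java.util.Set;
    import java.util.concurrent.CopyOnWriteArraySet;

    public class PttRelayServer {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(5004); // arbitrary port
            Set<SocketAddress> participants = new CopyOnWriteArraySet<>();
            byte[] buf = new byte[2048];

            while (true) {
                DatagramPacket in = new DatagramPacket(buf, buf.length);
                socket.receive(in);
                participants.add(in.getSocketAddress()); // learn new senders

                // Broadcast the voice burst to everyone except the sender.
                for (SocketAddress p : participants) {
                    if (!p.equals(in.getSocketAddress())) {
                        socket.send(new DatagramPacket(in.getData(), in.getLength(), p));
                    }
                }
            }
        }
    }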
But if you are rolling your own, and don't need interoperability with other SIP services or IMS, and the whole thing resembles instant messaging more than phone calls, XMPP might be a simpler option.
I'm not sure about the Android aspect, but apart from the new, built-in SIP support which might be limited on purpose, there's always the SIP stack from SIPDroid, right?