I am implementing a WebRTC Android application that communicates with two servers: a signaling server (via WebSockets) and a Kurento media server.
The time sequence is:

1. Clients subscribe to the signaling server.
2. Every client sends an sdpOffer.
3. Every client sends all the iceCandidates it creates.
4. The signaling server sends the iceCandidates to every client.
5. The signaling server sends the clients the sdpAnswers, which originate from the Kurento media server.
My mobile clients subscribe to the server and send their sdpOffers. After ICE trickling, the signaling server sends back the sdpAnswers from the Kurento media server. So I receive the startCommunication messages, and the remote description is set successfully with the sdpAnswer from the Kurento media server. I have checked all my SDP and ICE packets; they are all as expected, and all WebRTC-related callbacks succeed. After setting the remote description on the PeerConnection, onAddStream() is invoked and I get the videoTrack. But the video is not rendering.
My onAddStream:
override fun onAddStream(p0: MediaStream?) {
    super.onAddStream(p0)
    // Attach the first remote video track to the remote renderer
    p0?.videoTracks?.get(0)?.addSink(remote_view)
    Log.d(TAG, "on add stream " + p0?.videoTracks?.size)
    Log.d(TAG, App.rtcClient.peerConnection?.iceConnectionState().toString())
    Log.d(TAG, App.rtcClient.peerConnection?.iceGatheringState().toString())
}
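For comparison, here is a variant of the callback that holds the track reference and attaches the sink on the UI thread (onAddStream fires on a WebRTC signaling thread; remoteVideoTrack is a hypothetical field, and this is a debugging sketch rather than a confirmed fix):

override fun onAddStream(p0: MediaStream?) {
    super.onAddStream(p0)
    val track = p0?.videoTracks?.firstOrNull() ?: return
    remoteVideoTrack = track // hypothetical field: keeps the track reachable
    // onAddStream runs on a WebRTC thread; attach the sink on the UI thread
    Handler(Looper.getMainLooper()).post {
        track.addSink(remote_view)
    }
}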
Init of the remote view:
fun initSurfaceView(view: SurfaceViewRenderer) = view.run {
    setMirror(true)
    setEnableHardwareScaler(true)
    init(rootEglBase.eglBaseContext, null)
}
....
App.rtcClient.initSurfaceView(remote_view)
XML of the remote view:
<org.webrtc.SurfaceViewRenderer
    android:id="@+id/remote_view"
    android:layout_width="0dp"
    android:layout_height="0dp"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent"
    app:layout_constraintTop_toBottomOf="@id/local_view" />
I am not getting any errors, and my local stream renders perfectly. So I am debugging the app without knowing where to look next, because it is not clear where this problem comes from. An almost identical question exists on Stack Overflow, but it has no answers and is five years old. If you need any additional info, please comment.
Update: Exactly the same behavior exists in the iOS version too - no video rendering.
In my Android WebRTC client, to send DTMF tones I use code like this:
val audioSource = peerConnectionFactory.createAudioSource(MediaConstraints())
val localAudioTrack = peerConnectionFactory.createAudioTrack("audio", audioSource)
peerConnection?.addTrack(localAudioTrack)

// Grab the first RtpSender and try to insert the tones
peerConnection?.senders?.firstOrNull()?.let {
    it.dtmf()?.insertDtmf(code, 400, 50)
}
But it seems the tone does not reach the peer, and there is an error message in logcat, no matter what device I use:

dtmf_sender.cc E (line 126): InsertDtmf is called on DtmfSender that can't send DTMF.

Why could this happen?
There are multiple reasons why this could happen; one of them is that the other party in the WebRTC connection does not support the RTP telephone-event payload.
Also, check this example: https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_API/Using_DTMF (I haven't tried it myself, though.)
Pay attention to this:
"Note, however, that although it's possible to send DTMF using WebRTC, there is currently no way to detect or receive incoming DTMF. WebRTC currently ignores these payloads; this is because WebRTC's DTMF support is primarily intended for use with legacy telephone services"
I have found that the problem is that the peers are not yet connected via RTP: WebRTC has not yet found a suitable route, based on the ICE candidates gathered, to pass audio traffic. As soon as a route is constructed, traffic flows and the sender becomes ready to send DTMF tones.

To be sure that the peers are ready to transceive media, you can watch the connection state in PeerConnection.Observer.onIceConnectionChange() and obtain the sender once the state becomes CONNECTED, as in the sketch below.
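A minimal sketch of that approach, assuming peerConnection and code are the same objects as in the question:

override fun onIceConnectionChange(state: PeerConnection.IceConnectionState?) {
    if (state == PeerConnection.IceConnectionState.CONNECTED) {
        // A media route now exists, so the audio sender can emit DTMF
        peerConnection?.senders
            ?.firstOrNull { it.track()?.kind() == "audio" }
            ?.dtmf()
            ?.takeIf { it.canInsertDtmf() }
            ?.insertDtmf(code, 400, 50)
    }
}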
I have enabled stream management on both sides, client and server. I have two users, A and B, both online. User A suddenly loses his connection, but he still appears online to user B as well as on the server. During that time user B keeps sending messages to user A. Those messages are not lost, but when user A comes back online he receives them only after 2-3 minutes. I see the message stanzas in offline storage and the delivery receipts in SM storage. The same issue occurs in one-to-one chat and in MUC Light.

Do I need to customize any MongooseIM modules? Please explain why users receive delayed messages after losing their connection. Is it possible to switch from SM storage to offline storage (MAM)? I found the same issue described at this link (https://www.ejabberd.im/faq/tcp), but I am not losing my messages, just receiving them with a delay.
I use the smack-4.2 library in my Android app, and the following code enables stream management on the XMPPTCPConnection:
static {
    XMPPTCPConnection.setUseStreamManagementDefault(true);
    XMPPTCPConnection.setUseStreamManagementResumptionDefault(true);
}
Here's my ejabberd.cfg entry for the mod_stream_management module:
{mod_stream_management, [
    % default 100
    % size of a buffer of unacked messages
    % {buffer_max, 100}

    % default 1 - server sends the ack request after each stanza
    % {ack_freq, 1}

    % default: 600 seconds
    % {resume_timeout, 600}
]},
I have also enabled the following modules in my config file:
%% Only archives for c2c messages, good performance.
{mod_mam_odbc_user, [pm]},
{mod_mam_cache_user, [pm]},
% {mod_mam_mnesia_dirty_prefs, [pm]},
% {mod_mam_odbc_arch, [pm, no_writer]},
{mod_mam_odbc_async_pool_writer, [pm]},
{mod_mam, []}
I found a partial solution here: smack connect to xmpp server with previous stream id, but it does not work on the MongooseIM 2.0 server.

Thanks in advance.
I'm assuming below that user A, when they reconnect, is not using Stream Resumption (as defined by XEP-0198: Stream Management) and is merely starting a new session.
This means that on the server side there's still a dangling process waiting for Stream Resumption to happen. When user A is already reconnected to the server, the dangling process times out (which takes resume_timeout seconds) and sends the messages it had stored for delivery in the outgoing message buffer.
If you don't like this behaviour, you can do one of these:
a) (not advised) disable Stream Management and send a Message Archive Management query (that is, use mod_mam) each time you establish a new connection to the server, to get the most up-to-date conversation state

b) leave Stream Management enabled, but use Stream Resumption whenever possible; that is, always try to resume the previous session unless you don't have the previous session ID or the server rejects the resumption request; ideally you would also use Message Archive Management (see the sketch after this list)

c) use Delayed Delivery, aka mod_offline, but risk that in some rare cases, if you use multiple devices, the messages might be sent to the wrong device; for example, if you have a phone and a laptop, it might happen that your messages reach the laptop but never the phone
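For option (b), here is a rough sketch using Smack 4.2, the library mentioned in the question; the MAM fallback is only indicated by a comment, since the exact query depends on your setup:

// Enable XEP-0198 with resumption on this XMPPTCPConnection
connection.setUseStreamManagement(true)
connection.setUseStreamManagementResumption(true)

connection.connect()
connection.login()

if (connection.streamWasResumed()) {
    // Resumption succeeded: the server replays any unacked stanzas itself
} else {
    // Fresh session: the old server process may still hold buffered messages
    // until resume_timeout fires. Run a MAM (XEP-0313) query here to fetch
    // the conversation state instead of waiting for the timeout.
}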
Did you try using mod_ping, configured in the ejabberd.cfg file?
{mod_ping, [{send_pings, true}]},
For more details please follow this link: mod_ping
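On the client side, Smack's PingManager (XEP-0199) complements server pings; a small sketch, assuming connection is your XMPPTCPConnection:

// Detect dead TCP connections from the client side as well
val pingManager = PingManager.getInstanceFor(connection)
pingManager.setPingInterval(60) // seconds; 0 disables automatic pings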
First Case:

I am using the PSI client on Ubuntu 16.04 LTS, and my ejabberd version is 16.03. I am facing message loss issues, so I have gone through this link on stream management: http://xmpp.org/extensions/xep-0198.html
When I send the following request:
<enable xmlns='urn:xmpp:sm:3' resume='true'/>
the server's response is OK for me, i.e.:
<enabled xmlns="urn:xmpp:sm:3" id="g2gCbQAAAANELTVoA2IAAAW+YgAMmKxiAAnx/A==" max="300" resume="true"/>
After some chatting with the other user, when I send the following stream resumption request:
<resume xmlns='urn:xmpp:sm:3'
        h='0'
        previd='g2gCbQAAAANELTVoA2IAAAW+YgAMmKxiAAnx/A=='/>
I got the following error:
<failed xmlns="urn:xmpp:sm:3"><unexpected-request xmlns="urn:ietf:params:xml:ns:xmpp-stanzas"/></failed>
I have tried everything: disconnecting the network, killing the application, and taking the user offline. But stream resumption is not working. I do not understand the problem; please help me.
Second Case:
When I use the following configuration in ejabberd.yml:
listen:
  -
    port: 5222
    module: ejabberd_c2s
    resend_on_timeout: if_offline
    stream_management: true
And I start chatting after enabling stream management. Then, in the cases of a network disconnect or an application kill, all my messages are stored in the offline queue (if I am not able to reconnect within 300 seconds). In this process no message is lost.

But my problem is that this works only for mobile (the ejabberd_c2s module). Web/BOSH (the ejabberd_http module) does not support stream management. How can I use stream management for BOSH or the web?
Following this tutorial, I have successfully created a GLSurfaceView that displays my local video in my Android app. I am using Pristine's Gradle build scripts to build the native WebRTC code. The web app works as expected in Chrome.

I have established a connection to my Node.js server via WebSockets to join a pre-existing conversation. Kurento is being used to handle rooms. I believe I need to create an SDP offer at this point to begin sending and receiving video between peers. (To begin, I simply want the video from the Android device to appear in the web interface.)
However, if I create a PeerConnection, add my local media stream (created with PeerConnectionFactory.createLocalMediaStream), and then call createOffer(), it fails. The SDPObserver listening to my connection gets its onCreateFailure called with the message "CreateOffer called with invalid media streams." Looking at the native code, it appears that the streams do not have unique IDs (despite the fact that I have only created one stream).
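For reference, a simplified sketch of the setup described above (all identifiers are placeholders); given the error text, one thing worth checking is whether the label passed to createLocalMediaStream is unique:

// Simplified sketch; "factory", the tracks, and the observers are placeholders.
// The error suggests the stream label/ID must not collide with another stream.
val localStream = factory.createLocalMediaStream("ARDAMS-" + UUID.randomUUID())
localStream.addTrack(localVideoTrack)
localStream.addTrack(localAudioTrack)

val peerConnection = factory.createPeerConnection(iceServers, constraints, pcObserver)
peerConnection?.addStream(localStream)
peerConnection?.createOffer(sdpObserver, MediaConstraints())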
I've been trying to figure this out for a day now and don't seem to be making any progress. Any suggestions?
Thanks in advance!
I am new to the WebRTC native framework. I was able to get the WebRTC source and run the demo Android application from http://andrii.sergiienko.me/?go=all/building-webrtc-demo-for-android/. I was able to send/receive audio and video between two Android devices on the same local network.
Is there any way to send a small JSON payload over this peer connection? I tried looking for it in the source and only found support for sending video and audio.
Thank you.
You are looking for WebRTC DataChannels.

WebRTC's RTCDataChannel API is used to transfer data directly from one peer to another. This is great for sending data between two peers (e.g. browsers) for activities like communication, gaming, file transfer, and a slew of other uses.

It is an alternative to WebSockets: no central server is involved, transmission is usually faster, and there is no bottleneck. You can of course mitigate a failure of the P2P transfer by having a hook that falls back to WebSocket-based communication if the P2P data channel fails.
Code for reference:

The handlers below are called when you wish to send a message or when a message is received (including error handling). This code should run in both browsers. Instead of sending "Hello World!", you just send your JSON string.
var peerConnection = new RTCPeerConnection();

// Establish your peer connection using your signaling channel here

var dataChannel =
    peerConnection.createDataChannel("myLabel", dataChannelOptions);

dataChannel.onerror = function (error) {
    console.log("Data Channel Error:", error);
};

dataChannel.onmessage = function (event) {
    console.log("Got Data Channel Message:", event.data);
};

dataChannel.onopen = function () {
    dataChannel.send("Hello World!");
};

dataChannel.onclose = function () {
    console.log("The Data Channel is Closed");
};
The dataChannel object is created from an already established peer connection. It can be created before or after signaling happens. You then pass in a label to distinguish this channel from others and a set of optional configuration settings:
var dataChannelOptions = {
    ordered: false, // do not guarantee order
    maxRetransmitTime: 3000, // in milliseconds (renamed maxPacketLifeTime in the current spec)
};
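Since the question is about an Android native client, here is a rough Kotlin equivalent using the org.webrtc classes; peerConnection is assumed to be an already-configured org.webrtc.PeerConnection:

import android.util.Log
import org.webrtc.DataChannel
import java.nio.ByteBuffer

val init = DataChannel.Init().apply {
    ordered = false            // do not guarantee order
    maxRetransmitTimeMs = 3000 // in milliseconds
}
val channel = peerConnection.createDataChannel("myLabel", init)

channel.registerObserver(object : DataChannel.Observer {
    override fun onBufferedAmountChange(previousAmount: Long) {}

    override fun onStateChange() {
        if (channel.state() == DataChannel.State.OPEN) {
            // Send a JSON string as a UTF-8 text frame (binary = false)
            val bytes = ByteBuffer.wrap("""{"hello":"world"}""".toByteArray())
            channel.send(DataChannel.Buffer(bytes, false))
        }
    }

    override fun onMessage(buffer: DataChannel.Buffer) {
        val data = ByteArray(buffer.data.remaining())
        buffer.data.get(data)
        Log.d("DataChannel", "Got message: ${String(data)}")
    }
})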
For more details check out the links provided.
Examples:

Link to the official documentation: DataChannels in WebRTC

Link to a file transfer example using WebRTC Data Channels: File Transfer

Link to popular use cases for WebRTC Data Channels: Use Cases