I have successfully integrated video calling using WebRTC on Android with this dependency:
implementation 'org.webrtc:google-webrtc:1.0.21217'
In the app, if video capture is stopped for the remote video stream, new frames also stop arriving on the receiver side. Now I want to detect that the remote stream has stopped, so I can display a message on the receiver side.
Is this possible with WebRTC events? Can anyone guide me with code or a link?
Thanks in advance :)
So you want to know whether the video has been stopped from the Android client or from the remote client? Your question is a bit confusing. If you stop the video on your local device, can't you also send a message to the receiver that the video was stopped, in the same event where you actually stop the video? If you mean that you stop the remote video, get black frames instead of video, and want to show an image instead, again, can't you just send a message on the socket (I'm assuming you are using a socket) when you stop the video? Also, how do you stop the video stream? As far as I am aware, you can do
stream.videoTracks.get(0).setEnabled(false)
on the Android side; I assume you are referring to this. If you give me some details I would gladly help :)
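A minimal sketch of that idea, assuming you keep a signaling socket open next to the PeerConnection (the SignalingChannel interface and the "video-state" message format here are my own invention, not part of WebRTC):

import org.json.JSONException;
import org.json.JSONObject;
import org.webrtc.MediaStream;
import org.webrtc.VideoTrack;

public class VideoToggleHelper {

    private final MediaStream localStream;
    private final SignalingChannel signaling; // hypothetical wrapper around your socket

    public VideoToggleHelper(MediaStream localStream, SignalingChannel signaling) {
        this.localStream = localStream;
        this.signaling = signaling;
    }

    // Disable (or re-enable) the local video track and tell the remote peer,
    // so the receiver can show a "video paused" message instead of a frozen frame.
    public void setVideoEnabled(boolean enabled) {
        if (!localStream.videoTracks.isEmpty()) {
            VideoTrack track = localStream.videoTracks.get(0);
            track.setEnabled(enabled); // remote side stops receiving real frames
        }
        try {
            JSONObject msg = new JSONObject();
            msg.put("type", "video-state");
            msg.put("enabled", enabled);
            signaling.send(msg.toString());
        } catch (JSONException e) {
            e.printStackTrace();
        }
    }

    // Hypothetical signaling abstraction; replace with your own socket code.
    public interface SignalingChannel {
        void send(String message);
    }
}

On the receiving side, handle the "video-state" message in your socket listener and show or hide the "video paused" overlay accordingly.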
Related
I am planning to create a video calling app for iOS in Swift. For the project I decided to use Agora.io as the SDK. I wanted to know whether I can implement Picture-in-Picture mode.
My use case is: a user wants to chat with someone while he is on a video call, either with someone else or with the same person he is currently calling. He should be able to put the remote video preview into Picture-in-Picture mode and send and receive chat messages.
I want to stream audio from a server to an Android app. I found some solutions, but my scenario is quite different.
I have an audio file uploaded to the server. I want that audio to start playing at a specific time and stop once the audio completes.
Now let's say a user opens the Android app one minute after that audio's start time; he must then hear the audio from one minute in, and he must not be able to seek backward or forward.
Please guide me on how to do this and which technologies I should use.
Would it be better to use Google Firebase or some realtime server, or would simple hosting be a good option?
Thanks
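One possible approach, sketched below under the assumption that your server exposes the file's URL and the broadcast's scheduled start time (audioUrl and startTimeMillis are placeholders, and the device clock is assumed to be in sync with the server), is to compute the elapsed offset on the client and seek straight to it:

import android.media.MediaPlayer;
import java.io.IOException;

public class ScheduledAudioPlayer {

    // audioUrl and startTimeMillis are assumptions: the URL of the uploaded
    // file and the broadcast's scheduled start time, fetched from your server.
    public static void playFromLivePosition(String audioUrl, long startTimeMillis)
            throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(audioUrl);
        player.prepare(); // blocking; prefer prepareAsync() in production

        long offsetMillis = System.currentTimeMillis() - startTimeMillis;
        if (offsetMillis < 0 || offsetMillis >= player.getDuration()) {
            // Broadcast has not started yet, or is already over.
            player.release();
            return;
        }

        // Jump to the "live" position; do not expose any seek bar in the UI,
        // so the listener cannot move backward or forward.
        player.seekTo((int) offsetMillis);
        player.start();

        // Stop automatically when the file ends.
        player.setOnCompletionListener(MediaPlayer::release);
    }
}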
I am developing an Android application for video calling using RTSP/RTMP and LibStreaming. When I initiate a call from my app, I cannot tell whether the other end is ringing or has answered my call.
I would like to know how to set a ringback tone for an outgoing call. I need to play it until the other user answers or rejects the call.
For video calling, you can use WebRTC; it's a good library.
You can check the demo.
Using RTSP will be difficult for you. If you want one-way live streaming, then RTSP is fine.
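For the ringback part of the question, one common approach, sketched here with a bundled sound resource (R.raw.ringback is an assumption; ship your own file), is to loop a tone locally and stop it when your signaling layer reports that the callee answered or rejected:

import android.content.Context;
import android.media.MediaPlayer;

public class RingbackPlayer {

    private MediaPlayer player;

    // R.raw.ringback is a placeholder: bundle your own ringback sound.
    public void start(Context context) {
        player = MediaPlayer.create(context, R.raw.ringback);
        player.setLooping(true); // keep ringing until the callee reacts
        player.start();
    }

    // Call this from your signaling handler when the remote side
    // answers or rejects the call.
    public void stop() {
        if (player != null) {
            player.stop();
            player.release();
            player = null;
        }
    }
}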
I am writing an Android app that is supposed to play an audio file when a call from a specific number is in progress. I tried many approaches, but all went in vain:
A separate thread
A listener on the telephony service
Starting a service in parallel
Can anyone please help me with how to proceed?
Update:
I am able to play an MP3 file when the call is received, and I can play it loudly on the speaker. But however loud I play it, the calling party cannot hear it. Is there any way I can push the speaker stream into the call stream?
From the API doc here:
Note: You can play back the audio data only to the standard output device. Currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound files in the conversation audio during a call.
So from this and lots of other SO answers, we can conclude that we cannot inject audio into the conversation during a call.
But a long time ago, from personal experience with one handset, I saw an AudioTrack playing while a call was established, and both the voices were heard at the same time. So I think this depends on the handset: if the handset allows it, you can play.
You can also try something experimentally: play the audio over a different route (speakerphone or Bluetooth), as in the sketch below.
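A rough sketch of that experiment; there is no guarantee the remote party hears anything, since the behavior is entirely handset-dependent:

import android.content.Context;
import android.media.AudioManager;
import android.media.MediaPlayer;

public class AudioRouteExperiment {

    // Route a clip out of the loudspeaker while a call is active, hoping the
    // handset's microphone picks it up. The caller should release() the
    // returned player and restore the previous audio mode afterwards.
    public static MediaPlayer playOverSpeakerphone(Context context, int audioResId) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        am.setMode(AudioManager.MODE_IN_CALL);
        am.setSpeakerphoneOn(true); // force audio out of the loudspeaker

        MediaPlayer player = MediaPlayer.create(context, audioResId);
        player.start();
        return player;
    }
}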
Another option is to build your own custom Android build.
I am making a live audio streaming application. Previously, I recorded voice/audio for a certain time and pressed the stop button to stop the recording.
Then, after clicking the stop button, I sent the recorded audio to the server over the internet; here I am using socket programming. It works perfectly, but it is not a live streaming application.
I have heard about changing the code to send the audio in chunks to make it live: it sends the audio to the server byte by byte. However, so far I could not find any reliable tutorial or simple source code showing how to do this. Does anyone know where I could find a reference for this method?
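The usual pattern is to skip the intermediate file entirely: read raw PCM chunks from AudioRecord and write each chunk to the socket as soon as it is read. A minimal sketch, assuming a server that accepts raw 16-bit mono PCM over TCP (host and port are placeholders) and that the RECORD_AUDIO permission has already been granted:

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;

public class MicStreamer {

    private static final int SAMPLE_RATE = 44100;
    private volatile boolean streaming;

    // host/port are assumptions: your own streaming server that reads
    // raw 16-bit mono PCM straight off the TCP connection.
    public void stream(String host, int port) throws IOException {
        int bufferSize = AudioRecord.getMinBufferSize(
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSize);

        Socket socket = new Socket(host, port);
        OutputStream out = socket.getOutputStream();
        byte[] buffer = new byte[bufferSize];

        streaming = true;
        recorder.startRecording();
        try {
            // Instead of saving the whole file and sending it at the end,
            // push each chunk to the server as soon as it is read from the
            // microphone -- this is what makes the stream "live".
            while (streaming) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) {
                    out.write(buffer, 0, read);
                }
            }
        } finally {
            recorder.stop();
            recorder.release();
            socket.close();
        }
    }

    public void stop() {
        streaming = false;
    }
}

Run stream() on a background thread (it blocks in the read loop), and call stop() from the UI when the user ends the broadcast.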