Video call using Session Initiation Protocol in Android

Is it possible to make a video call using Session Initiation Protocol in Android?
I found SipAudioCall for audio calling, but could not find a similar class for video calling.

It is possible; however, video streaming is not implemented in the built-in SIP API.
Consider using a third-party library.
See also:
Android video calls using android's sip
Video Call through SIP in Android

Related

How to use Ant Media Server with Flutter using the REST API

I want to create a streaming app like TikTok using Flutter.
I am using Ant Media Server, and I am using its REST API to create a room
and to create a stream, and that part is working.
The problem is that I do not know what to do next.
Do I have to set up signaling using STUN servers? If so, how can I do that?
And is there a library that can play the stream, like OBS Studio or VLC?
Do I have to use WebSockets?
Please help!
I think you're confusing live streaming with video chat. Both are important video platforms, but their use scenarios and architectures are totally different.
TikTok is a mobile app where:
Most of it is VoD files and social connections.
Some of it is live streaming, like twitch.tv or YouTube Live.
A small part is video chat, only when two users are talking in an RTC room.
For VoD, it's essentially HTTP(S) file delivery, served by a CDN, or by Nginx if you want to build it yourself.
For the details on live streaming and video chat, please read here.

WebRTC onAddStream not called in native APIs

I am working with a video-call plugin for an app. Everything works fine on the web, but on mobile using React Native there is only support for onAddStream, not ontrack. The call sets up fine, but during renegotiation, when converting an audio call to a video call, video from mobile to web is added successfully while video from web to mobile is not, because the onAddStream event is not fired when tracks are added to an existing stream.
A workaround given in this link and this one is to return a new stream every time with new track labels. If I do this and use addStream instead of addTrack, will it raise compatibility issues with other browsers? Also, since the peer connection is between the browser and Janus, or between mobile and Janus, does this need to be implemented in Janus or in the other peer? Is there any other workaround, or has someone used renegotiation with the older APIs, i.e. onaddstream?
If this is the only option, can someone guide me on how to change the labels for tracks and streams, either directly or by editing the SDP? The same solution is mentioned in this answer, but I am unable to find any relevant code.
Looking forward to your answers.
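The relabeling workaround described above can be sketched in plain Java. This is only an illustration of the idea, not code from any of the linked answers: before handing the remote description to the legacy onAddStream-based peer, rewrite the stream identifier in every standard a=msid SDP line so the stream appears brand new on renegotiation. The class and method names (SdpRelabel, relabelStreams) are hypothetical.

```java
import java.util.UUID;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SdpRelabel {
    // Matches "a=msid:<streamId> <trackId>" lines in an SDP blob.
    private static final Pattern MSID = Pattern.compile("a=msid:(\\S+) (\\S+)");

    /**
     * Replaces every msid stream id with a fresh one so a legacy
     * onAddStream-based peer sees a brand-new stream on renegotiation.
     * The original track ids are preserved.
     */
    public static String relabelStreams(String sdp) {
        String newStreamId = "relabel-" + UUID.randomUUID();
        Matcher m = MSID.matcher(sdp);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // Keep the track id (group 2), swap only the stream id.
            m.appendReplacement(out, "a=msid:" + newStreamId + " " + m.group(2));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

Whether the rewrite belongs on the Janus side or on the mobile side depends on which peer relies on onAddStream; the munging has to happen before that peer's setRemoteDescription.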

Google Cast SDK RemoteMediaPlayer web video casting e.g. YouTube

I am new to Google Cast development. I've successfully developed an app that can cast online media (e.g. this mp3) using the RemoteMediaPlayer class.
Now I have tried to cast a YouTube video (this video) using the same technique, but it won't start casting. The result callback of RemoteMediaPlayer.load() says that the operation was not successful.
So, is it possible to cast a YouTube video (or videos from another streaming platform, such as Vimeo) using the RemoteMediaPlayer class?
You need to have a link to the media, and the media must be in one of the formats that is playable on Chromecast (see our documentation for the supported formats); otherwise it will not be possible. For YouTube specifically, you don't have access to the mp4 files (at least not through the YouTube APIs), so you cannot play YouTube content in your app's receiver; some have tried using the embedded iframe, but that is not always successful and it is limited.
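The distinction above (direct media link in a playable format vs. a watch page) can be made concrete with a small helper that maps a media URL to the content type you would pass when building the media item to load. This is a hedged sketch: the extension-to-type table is a partial assumption based on the Cast supported-media list, and the class name CastMediaCheck is hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

public class CastMediaCheck {
    // Partial map of extensions Chromecast can play natively; see the
    // Google Cast "Supported Media" documentation for the full list.
    private static final Map<String, String> CONTENT_TYPES = new HashMap<>();
    static {
        CONTENT_TYPES.put("mp4", "video/mp4");
        CONTENT_TYPES.put("webm", "video/webm");
        CONTENT_TYPES.put("mp3", "audio/mpeg");
        CONTENT_TYPES.put("aac", "audio/aac");
        CONTENT_TYPES.put("m3u8", "application/x-mpegurl");
    }

    /** Returns the content type for a direct media URL, or null if not castable. */
    public static String contentTypeFor(String url) {
        int dot = url.lastIndexOf('.');
        if (dot < 0) return null;
        String ext = url.substring(dot + 1).toLowerCase();
        return CONTENT_TYPES.get(ext);
    }
}
```

A YouTube watch URL has no direct media extension, so the helper rejects it, which matches the behavior described above: the receiver never gets a playable media link.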

How does the Twitch-application stream to Android?

I'm currently working on a project to develop an e-sport streaming calendar for a company. The app works fine, but the problem is that Twitch only streams Flash, and on Android that doesn't quite work after Google's decision to remove Flash support. HTTP Live Streaming isn't very well supported either, so the group is currently at a dead end.
My question is therefore: how does the Twitch application stream to Android?
It works on Android devices that don't support Flash or HLS, so there must be another way to do it.
My guess is that it probably already uses HLS or RTSP (RTMP + RTSP is the most common scenario) inside its Flash client, and the Android app is merely another stream-client implementation.
As for HLS, it doesn't need any kind of native support to work on Android; it's plain HTTP, and you can even write your own implementation if you want. Android's native MediaPlayer API already provides an implementation, and the same goes for RTMP + RTSP.
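To illustrate that HLS really is just plain HTTP, here is a minimal sketch (nowhere near a full player) of the first step a hand-rolled client would take: pulling the media-segment URIs out of an .m3u8 playlist, each of which is then fetched with an ordinary HTTP GET. The class name HlsPlaylist is hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

public class HlsPlaylist {
    /**
     * Extracts media segment URIs from an HLS media playlist.
     * In an .m3u8 file, every non-blank line that does not start
     * with '#' is a segment URI; '#'-lines are tags or comments.
     */
    public static List<String> segmentUris(String m3u8) {
        List<String> uris = new ArrayList<>();
        for (String line : m3u8.split("\n")) {
            String trimmed = line.trim();
            if (!trimmed.isEmpty() && !trimmed.startsWith("#")) {
                uris.add(trimmed);
            }
        }
        return uris;
    }
}
```

A real client would also honor tags such as #EXTINF (segment duration) and #EXT-X-ENDLIST, and refetch the playlist periodically for live streams.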
So, as for your problem, there are two ways I can think of to solve it:
1. Get a router that supports packet sniffing (perhaps one with OpenWrt flashed and tcpdump installed), and reverse-engineer the URL and protocol the Twitch Android client uses, then use them in your app.
Pros: no dependency on the Twitch app itself.
Cons: harder to pull off; may break if Twitch changes its internal protocol.
2. Reverse-engineer the Intent the Twitch app passes to its video-player Activity, and build one of your own to let the user open the player and watch the stream.
A tool you may find useful: https://play.google.com/store/apps/details?id=uk.co.ashtonbrsc.android.intentintercept
Pros: more reliable and more consistent.
Cons: may not work if the Intent is private, and it depends on the user having the Twitch app installed.
UPDATE:
I just found out that the Twitch website works in Android's native browser, too. It seems to use the <video> tag from the HTML5 standard. So the simplest solution could be to wrap that stream page in a WebView, but that's not great for the user experience.
Alternatively, you could write server-side code that accepts a stream-page URL as a parameter and returns the <video> tag, using a regular expression, XPath, or an XML parser library to extract it for the client. The client app can then set up a WebView with just that <video> tag inside it. This approach prevents your app from breaking if Twitch changes its page structure.
Also, if you'd rather not use a WebView, you can extract the src attribute of that <video> tag and play it with Android's native MediaPlayer API.
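The server-side extraction step described above can be sketched in plain Java. This is the quick-and-dirty regex variant; as noted above, an XPath or HTML-parser approach is more robust for real pages. The class name VideoTagExtractor and the assumption that the page carries a literal src attribute are mine.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VideoTagExtractor {
    // Naive pattern for a <video ... src="..."> tag; real pages may need
    // a proper HTML parser, since attributes can be quoted differently
    // or injected by script after page load.
    private static final Pattern VIDEO_SRC =
            Pattern.compile("<video[^>]*\\bsrc=\"([^\"]+)\"[^>]*>", Pattern.CASE_INSENSITIVE);

    /** Returns the src of the first <video> tag in the page, or null if absent. */
    public static String extractSrc(String html) {
        Matcher m = VIDEO_SRC.matcher(html);
        return m.find() ? m.group(1) : null;
    }
}
```

The extracted URL is what you would hand to a WebView snippet or to MediaPlayer, per the two options above.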

Why does the native Android MediaPlayer not support HLS when the web browser does?

I'm trying to implement an Android app that plays a video stream over the internet, but I get an error during playback. It is rejected with error code (1, -4), which looks like the format is not supported.
I tried the same URL in the Android web browser and it was able to read it; the video played nicely. So the question is: why can the web browser decode the stream but MediaPlayer cannot? What possible solutions can I use?
Thanks for any advice.
Without knowing your test device, my best guess is that the browser supports Flash, which it is using to stream your HLS. Native Android has a lot of problems with streaming HLS.
Check out the following link for more information:
http://www.jwplayer.com/blog/the-pain-of-live-streaming-on-android/
