I searched many documents but didn't find an exact solution to my problem. I want to implement audio calling and screen sharing in a native Android application using WebRTC, without any third-party SDK.
I found one demo example, AppRTC, but it supports only audio calls. How can I implement screen sharing too?
This answer may be irrelevant for the OP, since the question is very old.
Anyway, for anyone in the future searching for something similar, check this commit in the WebRTC repo. It adds a screen capturer for Android.
I am really new to WebRTC. What I need is to implement an app-to-app voice calling (not video calling) feature in my Android app. I want to connect random pairs of my app's users over WebRTC on Android. I implemented the AppRTC SDK in Android Studio and made an app. With this app I can create or join a room, and then establish a peer-to-peer connection (voice call). It's 1-to-1 calling within the same room. But how do I implement random calling? I just want to know how I can achieve it. Thank you.
I've been playing with WebRTC quite a lot recently and created a video-chat roulette as a sample for our WebRTC wrapper; you might want to take a look at it: https://github.com/netguru/videochatguru-android
WebRTC can be problematic in many cases, as it lacks good documentation. Hope it helps.
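Neither answer covers the random-pairing part directly. A common approach is a waiting queue on the signaling server: the first caller waits, the next caller is paired with them, and both are sent the same room id to run the usual AppRTC-style offer/answer flow. A minimal sketch in plain Java (class and method names are illustrative, not part of any WebRTC API):

```java
import java.util.ArrayDeque;
import java.util.Optional;
import java.util.Queue;

// Pairs callers in arrival order; each matched pair shares a generated room id.
public class Matchmaker {
    private final Queue<String> waiting = new ArrayDeque<>();
    private int roomCounter = 0;

    // Returns a room id once a partner is available, or empty if the
    // caller must wait. In a real server, the waiting peer would also be
    // notified of the room id over the signaling channel.
    public synchronized Optional<String> requestMatch(String userId) {
        String partner = waiting.poll();
        if (partner == null || partner.equals(userId)) {
            waiting.add(userId);
            return Optional.empty();
        }
        return Optional.of("room-" + (roomCounter++));
    }
}
```

Both peers then join the returned room exactly as in the 1-to-1 case; the only new piece is the server-side pairing.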
Use EasyRTC, which is built on WebRTC. I have personally used it in one of our projects for audio/video and chat communication. See this link: https://demo.easyrtc.com/demos/index.html
There is only one challenge I am facing right now: how to make it work on iOS.
Recently I've been struggling a lot with WebRTC. I was able to build a very simple WebRTC web application based on the WebRTC codelab, which consists of a simple signaling server (basically step 8 in the codelab tutorial).
My next target is to build a native Android application that does the same thing: making video calls with the web application through the same simple signaling server. I am very new to WebRTC and could not find any good tutorial or guide for building a simple native Android application.
I've searched for similar questions on Stack Overflow, but most of them are outdated and do not provide the answers I need.
I'm hoping anyone in the Stack Overflow community who knows a good source or tutorial for building a simple, basic native WebRTC Android application can share it. Thank you so much.
I suggest you build the AppRTCMobile target in WebRTC (see https://webrtc.org/native-code/android for details on how to build, etc.), then deploy your own instance of AppRTC (https://github.com/webrtc/apprtc) if you want full control over the signaling. Otherwise you can just use the one publicly available at https://appr.tc.
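For context, the signaling server in the codelab does nothing WebRTC-specific: it just relays opaque payloads (SDP offers/answers and ICE candidates) between the two peers, so the same server works for web and Android clients. A toy sketch of that relay logic in plain Java (names are illustrative; a real server would do this over WebSockets):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// An in-memory signaling relay: each client registers a callback, and
// messages addressed to a peer are forwarded verbatim. The server never
// inspects the SDP or ICE payloads it carries.
public class SignalingRelay {
    private final Map<String, Consumer<String>> clients = new HashMap<>();

    public synchronized void register(String clientId, Consumer<String> onMessage) {
        clients.put(clientId, onMessage);
    }

    // Forwards a raw SDP/ICE payload to the target peer, if connected.
    public synchronized boolean send(String toClientId, String payload) {
        Consumer<String> target = clients.get(toClientId);
        if (target == null) {
            return false;
        }
        target.accept(payload);
        return true;
    }
}
```

The point is that the Android app only needs a transport to exchange these opaque strings with the web peer; all the actual media negotiation happens inside the WebRTC stack on each client.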
As per the question title, is there a way to implement an Android live wallpaper using CocoonJS? Either as a standalone app or as an additional service of an existing app, it doesn't matter. I can't seem to find any evidence at all that this is possible, although there doesn't seem to be any technical reason against it.
I've been developing games with CocoonJS and I do not know of any API call to set a wallpaper.
Here you can see what the CocoonJS JavaScript API lets you do:
http://doc.ludei.com/3.0.5/
At the moment you cannot add any functionality beyond what this API provides.
Nevertheless, there are plans to make it possible to write your own plugins in the future :)
Regards
I'm building a music app for Android. I have added the C library into Eclipse alongside my app. The library is built for Android, but it is written in C. I'm unsure how to call the C code from native Android (Java) code. If anyone can help me wire the library's code into my app, it would be greatly appreciated! Thank you!
I would assume Spotify's main objective in releasing this API is to identify talent. People strongest in the combination of expressing interest and producing results will come to their attention.
In the case of Android, you could, for example, compile your own Android OS build, create JNI calls between Android and the native library, and produce a showcase scenario of music usage for Spotify to consider.
If they like what they see/hear they might want to engage you in some way.
As a sample: one big Spotify feature I would suggest would be to allow end users to contribute Wikipedia-like annotations about artists/songs. People out there are immensely knowledgeable about all sorts of music represented on Spotify and could contribute lots of artist/song info that, if compiled smartly, could become an awesome asset for Spotify and its suppliers.
A question would be what incentives could be offered to such contributors, since Spotify or the music company behind a song would most likely have to assume ownership of the contributed info.
Lots of ideas are possible for developing services that make Spotify an even richer offering.
I am trying to work on an app that streams videos from a website, so you watch those videos in the app rather than on the website. How would you do this? Can someone point me to a tutorial or explain it to me?
Well, it depends on what framework you are developing for. You need to provide more info if you want concrete answers.
For example, if you are using the native SDK to develop for the PlayBook (C/C++), info regarding streaming video can be found here:
https://bdsc.webapps.blackberry.com/native/documentation/video_playback_overview_1935223_11.html
It may not cover streaming specifically, but it's a start.
If you are developing an AIR application (Flex), then that's a library question. Just off the top of my head, you would probably use the NetStream class. Documentation can be found here:
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/NetStream.html
Again, more info is needed to give you a good answer.