I am working on the Linphone Android source code, and I am trying to use the call state CallIncomingEarlyMedia to show video in CallIncomingActivity. Does anyone have an example of how to do this?
I have tried using acceptEarlyMediaWithParams() instead of acceptCallWithParams() in LinphoneManager.java, but it doesn't show the video.
Any ideas?
We are developing an app using AppRTC. Audio and video calls from iOS to iOS and from Android to Android work fine, but whenever we call from Android to iOS or from iOS to Android, nothing happens after the call is accepted.
We have tried using the same video codec (H264) on both Android and iOS, but the issue persists.
Any assistance in this matter is highly appreciated.
There are a couple of things you can do to solve this issue:
Check whether you are using https://apprtc.appspot.com instead of https://appr.tc; you should use https://appr.tc for the latest AppRTC.
Make sure you use the "H264 Baseline" or "H264 High" video codec on the Android side, as iOS supports only H264.
Keep PeerConnectionClient.java on Android in sync with the AppRTC code on GitHub.
Use the latest AppRTC code on both iOS and Android.
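The codec point above is where Android-to-iOS calls usually break: if the Android offer lists VP8 first, negotiation can fail against an iOS endpoint that only handles H264. The AppRTC demo addresses this by reordering payload types in the SDP (see preferCodec() in PeerConnectionClient.java). Here is a rough, self-contained sketch of that same SDP-munging idea; the class and method names below are my own, not part of the AppRTC sources:

```java
// Sketch of codec preference by SDP munging, in the spirit of the AppRTC
// demo's PeerConnectionClient.preferCodec(). Names here are illustrative.
public class SdpCodecPreference {

    // Move the H264 payload type to the front of the m=video line so that
    // H264 is offered first during negotiation.
    public static String preferH264(String sdp) {
        String[] lines = sdp.split("\r\n");

        // Find the H264 payload type from an rtpmap line,
        // e.g. "a=rtpmap:100 H264/90000" -> "100".
        String h264Payload = null;
        for (String line : lines) {
            if (line.startsWith("a=rtpmap:") && line.contains("H264/")) {
                h264Payload = line.substring("a=rtpmap:".length(), line.indexOf(' '));
                break;
            }
        }
        if (h264Payload == null) {
            return sdp; // no H264 offered; leave the SDP untouched
        }

        StringBuilder out = new StringBuilder();
        for (String line : lines) {
            if (line.startsWith("m=video ")) {
                // "m=video 9 UDP/TLS/RTP/SAVPF 96 100" -> "... 100 96"
                String[] parts = line.split(" ");
                StringBuilder rebuilt = new StringBuilder();
                rebuilt.append(parts[0]).append(' ')   // "m=video"
                       .append(parts[1]).append(' ')   // port
                       .append(parts[2]).append(' ')   // transport profile
                       .append(h264Payload);
                for (int i = 3; i < parts.length; i++) {
                    if (!parts[i].equals(h264Payload)) {
                        rebuilt.append(' ').append(parts[i]);
                    }
                }
                out.append(rebuilt).append("\r\n");
            } else {
                out.append(line).append("\r\n");
            }
        }
        return out.toString();
    }
}
```

You would apply a transformation like this to the local session description before setting it, which is exactly when the AppRTC demo applies its own version.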
I am new to QPython, and all I need to do is play a video (stored on the Android tablet) and wait until the video finishes playing.
I found the mediaPlay and mediaIsPlaying functions in the androidhelper library, but I don't know where to start with them.
Sorry, as I said, I am new, so I would appreciate the basics, like how to get the source file directory; a one-line example would be fine.
Also, if there is a better library for this, please let me know.
Thanks
I don't think there is a dedicated QPython API for this, but there is an a8player.py in the scripts directory after you install QPython, which may show how to call a player app's API.
BTW, you could post the requirement on the official Facebook page; the dev team is very nice.
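For what it's worth, the androidhelper (SL4A) facade does expose the mediaPlay and mediaIsPlaying calls the question mentions, so a minimal poll-until-done approach could look like the sketch below. The video path is a made-up example, and only the wait loop itself can run off-device:

```python
# A minimal sketch: poll mediaIsPlaying until playback stops. The
# androidhelper usage at the bottom is device-only; the file path is made up.
import time

def wait_until_finished(is_playing, poll_seconds=0.5):
    """Block until is_playing() reports that playback has stopped."""
    while is_playing():
        time.sleep(poll_seconds)

# On the tablet it would be wired up roughly like this:
#
#     import androidhelper
#     droid = androidhelper.Android()
#     droid.mediaPlay('file:///sdcard/Movies/demo.mp4')
#     wait_until_finished(lambda: droid.mediaIsPlaying().result)
```

Separating the wait loop from the facade call keeps the polling logic testable without a device.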
I'm using PhoneGap for my app, and I want to play sound in it. I used Howler.js, but that doesn't work on Android 4 for some reason, so I decided to try the Cordova Media plugin, but I can't get that to work either: I always get error code 1. I've tried putting /android_asset/www/ before the source and using .wav files instead of .mp3 files, but nothing works. Hopefully someone can tell me why it doesn't work and how to fix it.
Thanks in advance!
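For context, a frequent cause of error code 1 with this plugin is a source path the native player cannot resolve: on Android, files bundled under www/ need the /android_asset/www/ prefix mentioned above, while absolute paths and full URLs should be left alone. A hedged sketch of that prefixing logic (the helper name is hypothetical, not part of cordova-plugin-media; device.platform comes from cordova-plugin-device):

```javascript
// Hypothetical helper: prefix bundled-asset paths on Android only.
// Absolute paths and full URLs (http://, file://, etc.) are left unchanged.
function resolveMediaSrc(src, platform) {
  var isRelative = src.charAt(0) !== '/' && src.indexOf('://') === -1;
  if (platform === 'Android' && isRelative) {
    return '/android_asset/www/' + src;
  }
  return src;
}

// Device-only usage with the plugin's Media constructor:
//   var media = new Media(resolveMediaSrc('sounds/beep.mp3', device.platform),
//                         onSuccess, onError);
```

If the prefixed path still fails, it is worth double-checking that the file actually ships inside the APK's assets/www folder.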
I am developing an Android app for recording video and sending it to a server. The video must be recorded only while the Record button is touched. I googled and found the library linked below, but I don't know how to use it in my project; this is the first time I am going to use a library in a project. I have enabled it in my project properties, but I don't know how to use its functionality in my app. Please help me or suggest a tutorial.
https://github.com/sourab-sharma/TouchToRecord
I just unchecked the Library option in Properties and used it as a package in my project. It's really a silly question, but this worked great.
Any suggestions for live streaming on Android?
In my app I need to broadcast live video from Android to the web, like apps such as Qik, Justin.tv, Ustream.tv and Bambuser.
Those are built on ffmpeg.
If there is any open source project like this, or anyone has already done it,
please tell me the project name or send me a copy of the code.
Email:liangyingshuang#gmail.com
Thanks.
Bambuser has open-sourced their version of ffmpeg that compiles on Android. That should get you started.
see: http://bambuser.com/opensource
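Once you have an ffmpeg build that runs on Android, broadcasting is essentially a matter of encoding to H.264/AAC and pushing the result over RTMP. A minimal sketch of the kind of invocation involved (the server URL and stream key are placeholders, and a real app would feed camera frames rather than a file):

```sh
# Placeholder endpoint; substitute your own RTMP server and stream key.
ffmpeg -re -i input.mp4 -c:v libx264 -preset veryfast -c:a aac -f flv \
    rtmp://live.example.com/app/streamkey
```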
The best thing is SpyDroid: http://code.google.com/p/spydroid-ipcamera/
It works the same way as Qik.