I have spent three weeks searching for how to create a live video call feature for my Android app (using Android Studio), but I can't find exactly what I'm looking for. I don't want to use something like QuickBlox or Sinch, because this is my final year project and I have to do it programmatically. I found that WebRTC can be used on Android, but unfortunately I don't understand how to use it.
So please, can anyone help me with this?
This is unfortunately something that's still hard. The team is working on it though: see https://bugs.chromium.org/p/webrtc/issues/detail?id=6328 for progress.
There's also https://bugs.chromium.org/p/webrtc/issues/detail?id=6804 that has resulted in this bot archiving .aar builds: https://build.chromium.org/p/client.webrtc.fyi/builders/Android%20Archive which should make it easier to consume the library.
I have been trying for ages to find a way to access Android's native date picker and time picker in Unity via script. Most of the methods need a plugin or add-ons that cost money. I am looking for a way to do this without plugins, but I would not be against the idea of using one.
You may try this package, which I have not used myself, but I hope it will work. Please try the demo included with the package.
Android Native Dialogs and Functions Plugin
It may be possible using AndroidJavaObject and AndroidJavaClass, but I assume you have already tried that route.
I just crossed this bridge myself: I needed background audio, and the only way to make that happen was with a native foreground service and the native media APIs.
I have successfully wrapped up that side of the project now and have moved on to accomplishing the same functionality on Apple (much harder so far...).
I highly recommend that you just bite the bullet and write your own plugin, as I had to. It was well worth it, and it's also completely free. The time invested is really worth the functionality it opens up for your current and future projects.
This was by far the best YouTube tutorial to get you started (just install Android Studio first) -> https://www.youtube.com/watch?v=bmNMugkOQBI
Followed by: https://www.youtube.com/watch?v=eNKSXzWOnlI
Once you have your plugin created, it should just be a matter of adding one Java method that 'sends' a string back to a given C# method; see the sketch below.
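As a rough illustration only (the class name, GameObject name, and C# method name below are made up, not taken from my actual plugin), the Java side could look something like this: show the native DatePickerDialog on the Unity activity and hand the chosen date back to C# with UnityPlayer.UnitySendMessage:

    import android.app.Activity;
    import android.app.DatePickerDialog;
    import android.widget.DatePicker;

    import com.unity3d.player.UnityPlayer;

    import java.util.Calendar;

    public class DatePickerPlugin {

        // Hypothetical entry point, called from C# via AndroidJavaClass/CallStatic.
        // "DateListener" must be a GameObject in the Unity scene with a script
        // that has a method OnDatePicked(string date).
        public static void showDatePicker() {
            final Activity activity = UnityPlayer.currentActivity;
            activity.runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    Calendar now = Calendar.getInstance();
                    new DatePickerDialog(activity,
                            new DatePickerDialog.OnDateSetListener() {
                                @Override
                                public void onDateSet(DatePicker view, int year, int month, int day) {
                                    String date = year + "-" + (month + 1) + "-" + day;
                                    // Hands the string back to the C# method on the listener object.
                                    UnityPlayer.UnitySendMessage("DateListener", "OnDatePicked", date);
                                }
                            },
                            now.get(Calendar.YEAR), now.get(Calendar.MONTH),
                            now.get(Calendar.DAY_OF_MONTH)).show();
                }
            });
        }
    }

On the C# side you would then call it with something like new AndroidJavaClass("DatePickerPlugin").CallStatic("showDatePicker") and receive the string in OnDatePicked(string date) on the listener GameObject.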
It took me a solid week to figure it all out, but if you have any questions I can answer them in the comments :) Just ask.
I have a few questions about WebRTC on Android. I'm new to both Android and WebRTC, but I have also done a lot of research on WebRTC for Android. Still, I have a few questions (some because I'm new, and some because I only partly understand it).
I'm trying to make an Android app that will communicate directly between a web browser (first choice is Chrome) and an Android device (peer-to-peer, you could say). After a lot of research I found that WebRTC looks like a good fit for me. Would you advise something else, or do you think it's okay? (I am also going to write a plugin for Chrome.)
Nearly every document says 'Android does not directly support WebRTC'. So I need something that provides WebRTC on Android. What is it? Is it native Android code that I have to write? Is it a native (NDK) library that I have to include in my project? Or is it a Java library? Or should I go for Cordova/Crosswalk or something like that? I researched all of these but didn't find anything that really helps. Yes, there are documents about it, but not enough.
Some documents say I need Chromium. But why, and how? They show lots of Linux terminal commands without a single line of Java, C, or C++ code, and even some of the terminal commands and links they give do not work.
I read/found/tried these things as a result of my research:
Apache cordova
Crosswalk
http://www.webrtc.org/
https://github.com/webrtc
http://webrtc.github.io/samples/
http://simonguest.com/2013/08/06/bui...t-for-android/
http://orcaman.blogspot.com.tr/2014/...tc-source.html
https://github.com/pchab/ProjectRTC
https://github.com/pchab/AndroidRTC
and some more...
In a nutshell, I need help. Please lend me a hand (because I'm really quite lost and have tried to do my best).
Thank you.
As others have suggested, I recommend checking out g.co/webrtc. As I understand it, your goal is to make Android connect to a web browser using WebRTC. There are two (three) ways you can achieve that.
You can just use Chrome, Opera, or Firefox for Android. All of these browsers support WebRTC, which allows you to use the same code for your web app as for your Android app. With the new Add to Home screen support, as well as support for push notifications from web apps on Android, this could be a very good solution for you.
You can use the Android native WebRTC library, available from WebRTC.org. As mentioned in my article, I recommend using the pristine.io compiled library, available from Maven Central (a minimal usage sketch is included at the end of this answer).
If you can limit your application to Lollipop, you can use a WebView, which supports WebRTC now, IIRC. I don't know much about it, though.
And the best resource for getting help is discuss-webrtc. It's a lot more active than StackOverflow.
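To make option 2 a bit more concrete, here is a minimal, hedged sketch of setting up a PeerConnection with the Java classes from that library. The exact signature of initializeAndroidGlobals() has changed between libjingle/WebRTC releases, so treat the arguments below as an assumption to check against the version you pull in; signaling (exchanging SDP and ICE candidates with the browser) is left out entirely:

    import android.content.Context;

    import org.webrtc.MediaConstraints;
    import org.webrtc.PeerConnection;
    import org.webrtc.PeerConnectionFactory;

    import java.util.ArrayList;
    import java.util.List;

    public class PeerConnectionSketch {

        // Creates a PeerConnection against a public STUN server.
        // NOTE: initializeAndroidGlobals() has taken different parameters in
        // different releases; verify it against the library version you use.
        public static PeerConnection create(Context context, PeerConnection.Observer observer) {
            PeerConnectionFactory.initializeAndroidGlobals(context, true, true, true);
            PeerConnectionFactory factory = new PeerConnectionFactory();

            List<PeerConnection.IceServer> iceServers = new ArrayList<PeerConnection.IceServer>();
            iceServers.add(new PeerConnection.IceServer("stun:stun.l.google.com:19302"));

            MediaConstraints constraints = new MediaConstraints();
            return factory.createPeerConnection(iceServers, constraints, observer);
        }
    }

From there you add local audio/video tracks, call createOffer() or createAnswer(), and exchange the resulting SDP and ICE candidates with the browser over your own signaling channel (for example WebSockets); WebRTC does not provide signaling for you.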
I am following up on this article: TarsosDSP with Android
I am trying to implement an Android application that reads MP3 files and processes them using WEKA.
TarsosDSP seems to be a good step in the right direction, especially since the Berkeley guys seem to have implemented a fork for Android.
When I tried downloading their source code here: TarsosDSPAndroid Source Code
I still found a lot of references to javax.sound, which is kind of counter-productive.
So is something mixed up with their uploaded source code or am I looking in the wrong place?
Perhaps some background on what I am trying to accomplish overall:
I am writing an Android app that will read the entire MP3 library and, using WEKA and pre-loaded test groups, classify each song into the appropriate genre.
The part that reads the MP3 library is all done, and so is the classification using WEKA; now I am stuck joining them up. What seemed to work fine using jAudio in a Java project doesn't work on Android because of the dependency on javax.sound, so I am trying to bypass that with a different library that works on Android.
Thanks in advance!
-Alex
Version 2.0 of TarsosDSP supports Android out of the box. There are no more dependencies on javax.sound.*, which makes it a lot easier to work with on Android. There is even a TarsosDSP Android JAR file that can be included in your project directly.
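For illustration, a minimal sketch of wiring TarsosDSP's Android dispatcher to a feature extractor (MFCCs here, since those are a common input for WEKA classifiers) might look roughly like this. The package and class names follow the 2.x be.tarsos.dsp releases, but double-check them against the JAR you include; this example captures from the microphone (RECORD_AUDIO permission required), whereas reading MP3 files would go through one of the file/pipe-based dispatchers instead:

    import be.tarsos.dsp.AudioDispatcher;
    import be.tarsos.dsp.AudioEvent;
    import be.tarsos.dsp.AudioProcessor;
    import be.tarsos.dsp.io.android.AudioDispatcherFactory;
    import be.tarsos.dsp.mfcc.MFCC;

    public class MfccSketch {

        // Streams microphone audio through TarsosDSP and collects MFCC frames,
        // which could then be fed into a WEKA classifier.
        public static void start() {
            final int sampleRate = 22050;
            final int bufferSize = 1024;
            final int overlap = 512;

            AudioDispatcher dispatcher =
                    AudioDispatcherFactory.fromDefaultMicrophone(sampleRate, bufferSize, overlap);

            final MFCC mfcc = new MFCC(bufferSize, sampleRate, 13, 40, 300, 3000);
            dispatcher.addAudioProcessor(mfcc);
            dispatcher.addAudioProcessor(new AudioProcessor() {
                @Override
                public boolean process(AudioEvent audioEvent) {
                    float[] coefficients = mfcc.getMFCC(); // one MFCC vector per frame
                    // TODO: hand the coefficients to the WEKA model here.
                    return true;
                }

                @Override
                public void processingFinished() {
                }
            });

            new Thread(dispatcher, "audio-dispatcher").start();
        }
    }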
I would like to edit an audio file. As Java doesn't support voice libraries (to my knowledge), I would like to use the JUCE library for this.
From some resources found on Google, I came to know that we can do it using the Introjucer.
But I couldn't find proper tutorials for making Android projects using the Introjucer. Can anyone help me out with this? Please correct me if I've misinterpreted any concept.
The most useful library for audio editing is Ringdroid: https://code.google.com/p/ringdroid/
Android Audio reference.
http://developer.android.com/reference/android/media/AudioTrack.html is the Android API for handling audio at the lowest level (a small playback sketch follows at the end of this answer).
Second, check out this -> Getting started with programmatic audio, if you don't want to use Ringdroid.
JUCE Hello World tutorial: http://jucevst.wordpress.com/2011/08/17/hello-world-with-juce-actually-making-something/
JUCE beginner tutorial: http://www.rawmaterialsoftware.com/viewtopic.php?f=13&t=10953#p61988
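To give a feel for the AudioTrack API linked above, here is a small self-contained sketch (my own illustration, not taken from any of the linked tutorials) that generates one second of a 440 Hz sine tone and plays it as raw PCM:

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class ToneSketch {

        // Generates one second of a 440 Hz sine wave and plays it as 16-bit PCM.
        public static void playTone() {
            final int sampleRate = 44100;
            short[] samples = new short[sampleRate];
            for (int i = 0; i < samples.length; i++) {
                double t = (double) i / sampleRate;
                samples[i] = (short) (Math.sin(2 * Math.PI * 440 * t) * Short.MAX_VALUE * 0.5);
            }

            int minBuffer = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    Math.max(minBuffer, samples.length * 2), AudioTrack.MODE_STREAMING);

            track.play();
            track.write(samples, 0, samples.length);
            track.stop();
            track.release();
        }
    }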
For practice and fun I want to build an Android app that is able to stream audio from one device to another. It will be a simple baby monitor app.
I've tried using GStreamer, but had some trouble including the binaries and building the Eclipse project. So now I am looking for alternatives. Does anyone know a simple one? Or is there even something in the Android APIs I can use? Please note: the difficult part is not receiving a stream, but providing one...
Thanks a lot in advance!
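Not a complete answer, but as a rough sketch of what 'providing' a stream with only the stock Android APIs could look like: capture raw PCM with AudioRecord and push the buffers over a socket to the other device, which plays them back with AudioTrack. The host and port below are placeholders, there is no compression or error handling, and the RECORD_AUDIO permission is required:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class MicStreamerSketch {

        // Captures raw PCM from the microphone and sends it as UDP packets.
        // Call this from a background thread; interrupt the thread to stop.
        public static void streamMicrophone(String receiverHost, int receiverPort) throws Exception {
            final int sampleRate = 16000;
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

            DatagramSocket socket = new DatagramSocket();
            InetAddress address = InetAddress.getByName(receiverHost);
            byte[] buffer = new byte[bufferSize];

            recorder.startRecording();
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    int read = recorder.read(buffer, 0, buffer.length);
                    if (read > 0) {
                        socket.send(new DatagramPacket(buffer, read, address, receiverPort));
                    }
                }
            } finally {
                recorder.stop();
                recorder.release();
                socket.close();
            }
        }
    }

The receiving side would do the mirror image with a DatagramSocket and an AudioTrack in streaming mode; for anything beyond a toy you would want a compressed codec and a proper transport such as RTP instead of raw PCM over UDP.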