I'm going through the WebRTC tutorial and trying to set up their sample code.
The problem is, I do not have a webcam. I quickly realized that my Android phone can be used as a webcam with an app called IP Webcam, which transmits the video over Wi-Fi to my computer (where it can be viewed at a special address).
Is there any way for this to be recognized and used by getUserMedia?
I'm using this sample code from the tutorial:
https://codelabs.developers.google.com/codelabs/webrtc-web/#3
When I run it, I get this error:
code: 8 message: "Requested device not found" name: "NotFoundError"
Is there any way to make WebRTC aware of the connected phone?
I'm using Windows 7, the latest Chrome, and serving the sample code via XAMPP (if any of that matters).
I'm using libVLC in my Android app to stream video over rtsp from the camera that I'm connected to over WiFi.
In general, streaming works fine, but there seems to be a streaming problem if I'm connected to the camera over WiFi (which provides no internet) and also have mobile data turned on. I use bindProcessToNetwork to make sure the streaming goes over my WiFi network. On some devices (like a Huawei Mate 10 with Android 9) the streaming works fine (it seems to use WiFi and ignore the fact that mobile data is on), but on other devices (like a Samsung Note 10 with Android 10), when I use the new networking API, VLC seems to try to connect via mobile data, and only after that fails for a while does it decide to use my camera's WiFi (despite the fact that I used bindProcessToNetwork).
I get this error log:
VLC-std: Unable to determine our source address: This computer has an invalid IP address: 0.0.0.0
Surprisingly, it works fine if I connect to my WiFi from the system settings...
I found some comments saying that media streaming is done in a separate process which ignores the bindProcessToNetwork call, but on some devices (and Android versions) it seems to work and on others it does not.
I already asked this question on the Videolan forum, but with no luck.
Is there a way to force libVLC to stream using a specified network?
I don't think LibVLC can do this, and it's a bit outside the scope of a multimedia framework.
I would handle this on the app side if I were you, using something like How do I connect to a specific Wi-Fi network in Android programmatically?
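If you go down that route, here is a minimal sketch (my own assumption of how the app side could look, not something from libVLC) that requests the camera's Wi-Fi network with ConnectivityManager and binds the process to it before starting playback:

import android.content.Context;
import android.net.ConnectivityManager;
import android.net.Network;
import android.net.NetworkCapabilities;
import android.net.NetworkRequest;

public class WifiBinder {
    // Ask for a Wi-Fi network (even one without internet access) and bind the
    // whole process to it, so sockets created afterwards use that network.
    public static void bindToWifi(Context context) {
        final ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);

        NetworkRequest request = new NetworkRequest.Builder()
                .addTransportType(NetworkCapabilities.TRANSPORT_WIFI)
                // Deliberately not requiring NET_CAPABILITY_INTERNET,
                // since the camera's access point has no internet.
                .build();

        cm.requestNetwork(request, new ConnectivityManager.NetworkCallback() {
            @Override
            public void onAvailable(Network network) {
                cm.bindProcessToNetwork(network);
                // Start (or restart) the libVLC RTSP playback only after this point.
            }
        });
    }
}

Whether a bound process actually covers libVLC's native sockets is exactly the device-dependent behaviour described in the question, so treat this as a starting point rather than a guaranteed fix.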
I'm trying to port a web application to a native Android application using Cordova. It's fairly simple, primarily just sending MIDI messages to a connected device. I know the Web MIDI API is only supported in recent versions of WebKit on Android, and I have been testing on 5.1. I've managed to prove that the basics work by running the original web version in Chrome on the device; it works fine.
The problem when running in Cordova is that the messages themselves are not sent for some reason: no error, they just don't get there. I know the API is working, as a separate part of the application lists the connected devices and presents a dropdown list to choose from; this works fine and recognises the connected MIDI device. However, when I send messages they don't have the desired effect on the MIDI device. They are SysEx messages, which I believe need an additional permission, android.webkit.resource.MIDI_SYSEX; is it possible that this is enabled in Chrome but not in the Cordova application? I've tried adding this permission to ./config.xml and ./platform/android/AndroidManifest.xml, but to no avail; it doesn't seem to have any effect, and doesn't even show up as an additional permission when installed.
Based on various searches, I've also tried installing the Crosswalk plugin, but couldn't get that to work at all, not even the device listing.
Any thoughts welcome.
The problem you're facing is that you won't even be prompted for MIDI sysex permission unless you meet certain criteria: you either have to be accessing your Web MIDI code via localhost, or over an https URL. Sysex is potentially harmful, so this has been set as a minimum security requirement.
I had it working on Android by opening a URL to my dev PC (using a self-signed SSL cert on WAMP). It gives the security prompt for sysex and then works as expected, so Chrome on Android works for sure. Crosswalk Cordova, however, I'm not so sure about.
I've tried running a little webserver in my Cordova app (on Android), starting the webserver on 127.0.0.1:8080 and then connecting to it using Chrome (separately, on the same device). It feels tantalizingly close, but I need it to run in my app!
My attempts to load an iframe with the webserver's URL (http://127.0.0.1:8080) have failed; it's just not found. There's no security error, so it doesn't seem to be related to whitelisting, although I need to look into that further to be sure.
It seems that the webserver plugin is running successfully, but is not visible from within the app.
You should have a play with this, and see if it gets you anywhere...
Or perhaps you'll find another one that is visible from within the app itself.
The alternative approach is to use a socket server to connect to your computer, and have the midi devices connected to it. Not exactly portable though!
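One more Cordova-side detail, offered as an assumption on my part rather than something I've verified with Crosswalk: android.webkit.resource.MIDI_SYSEX is not a manifest permission but a WebView resource that the host app grants at runtime, which would explain why config.xml and AndroidManifest.xml entries have no effect. If the WebView build exposes Web MIDI at all (which is not guaranteed), the grant would come from a WebChromeClient, roughly like this:

import android.webkit.PermissionRequest;
import android.webkit.WebChromeClient;

// Sketch only: grant the Web MIDI sysex resource when a page requests it.
// Only do this for content you trust; sysex access is gated for a reason.
public class MidiChromeClient extends WebChromeClient {
    @Override
    public void onPermissionRequest(final PermissionRequest request) {
        for (String resource : request.getResources()) {
            // RESOURCE_MIDI_SYSEX is the "android.webkit.resource.MIDI_SYSEX"
            // string mentioned in the question.
            if (PermissionRequest.RESOURCE_MIDI_SYSEX.equals(resource)) {
                request.grant(new String[] { PermissionRequest.RESOURCE_MIDI_SYSEX });
                return;
            }
        }
        request.deny();
    }
}

You would still have to attach this to the WebView Cordova actually uses (for example via webView.setWebChromeClient(new MidiChromeClient())), which may mean hooking into the Cordova engine rather than a plain WebView.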
I have an embedded device which has WiFi and audio capabilities. I want my Android phone to communicate with it using RTP. I have already tried mobile-to-mobile RTP audio communication, and that works like a charm. But now I want to try one side Android and the other side this embedded device. Could anyone point me to the source code (a simple one, since the embedded device has little RAM) of a basic RTP implementation?
I am using ffmpeg on a Windows PC to simulate the embedded device. When I "join()" the RTP stream from my Android, I get the error 0x64, wrong protocol type.
If I can get a basic audio handshake between the devices, that would be great.
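For context, the mobile-to-mobile setup mentioned above is presumably based on android.net.rtp (that package is where a join() call on an audio stream lives); a minimal sketch of that flow, with placeholder addresses, port, and codec:

import android.net.rtp.AudioCodec;
import android.net.rtp.AudioGroup;
import android.net.rtp.AudioStream;

import java.net.InetAddress;

public class RtpAudioDemo {
    // Minimal android.net.rtp flow: bind a local stream, pick a codec,
    // point it at the remote peer, and join an AudioGroup to start audio.
    // Requires the RECORD_AUDIO and INTERNET permissions.
    public static void start(InetAddress localAddress, InetAddress remoteAddress, int remotePort)
            throws Exception {
        AudioStream stream = new AudioStream(localAddress);
        stream.setCodec(AudioCodec.PCMU);        // G.711 u-law, RTP payload type 0
        stream.associate(remoteAddress, remotePort);

        AudioGroup group = new AudioGroup();
        group.setMode(AudioGroup.MODE_NORMAL);
        stream.join(group);                      // audio starts flowing here
    }
}

The non-Android side (ffmpeg now, the embedded device later) has to send and receive plain RTP with the same codec and payload type; a mismatch there is one possible cause of protocol-type errors like the one above.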
So I'm trying to hook up a Kinect to an Android tablet using any means necessary. I would preferably like to avoid a Windows machine or an Arduino board in the middle.
The method I've already tried is to have a C# program (the Kinect SDK uses C#) communicate with the Android device. I tried to figure out how to send a message over USB, and decided to do port forwarding. This worked, but was slower than I would like.
I guess the question is: can I connect it to Android as a USB device or accessory and communicate via JNI?
In theory you should be able to use OpenNI for ARM. I've seen Hirotaka's demo of OpenNI running on Linaro Android, but using an Asus Xtion Pro sensor and a Panda board.
Hirotaka also posted notes on his setup.
A quick YouTube search reveals examples with Kinect and Android tablets.
Side note: I don't understand why you're trying to use C#: you'll be writing Android applications in Java, and OpenNI has a Java wrapper.
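Since the Java wrapper came up: here is a rough sketch of reading a depth frame with the OpenNI 2 Java bindings (org.openni). The class and method names are from OpenNI 2 and may differ in older OpenNI releases, so treat it as an outline rather than tested code:

import org.openni.Device;
import org.openni.OpenNI;
import org.openni.SensorType;
import org.openni.VideoFrameRef;
import org.openni.VideoStream;

public class DepthReader {
    public static void main(String[] args) {
        OpenNI.initialize();                      // load OpenNI and its drivers
        Device device = Device.open();            // first available sensor (Kinect, Xtion, ...)
        VideoStream depth = VideoStream.create(device, SensorType.DEPTH);
        depth.start();

        VideoFrameRef frame = depth.readFrame();  // blocking read of one depth frame
        System.out.println("Depth frame: " + frame.getWidth() + "x" + frame.getHeight());
        frame.release();

        depth.stop();
        device.close();
        OpenNI.shutdown();
    }
}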
Can LoadRunner do load testing for a mobile web application on Android?
Here are two tutorials which may help with recording the network traffic of Android (iOS, Windows Mobile, etc.) devices using HP LoadRunner 9.5x:
1) Using the Android SDK emulator:
http://www.perftesting.co.uk/recording-and-performance-testing-android-applications-with-hp-loadrunner-vugen/
2) Using a real "hard" device (combined with DNS spoofing to send the network traffic through the HP LoadRunner Virtual User Generator):
http://www.perftesting.co.uk/performance-testing-iphone-ios-android/
Yes. There are potentially multiple ways of capturing the conversation, from using an Android simulator running on a PC to using VUGEN as an HTTP man-in-the-middle proxy to record the conversation passed from an Android device through VUGEN and on to the final target.
I used JMeter for this purpose.
The Load Test Mobile Apps article provides clear steps.