I'm trying to develop a project like PTTDroid, that is, a Push-To-Talk or Walkie-Talkie application.
The issue is that that app can't use 3G to access the web, so I've decided to use a Node.js server and implement an Android client to communicate with it. I tried to build a multi-platform project with PhoneGap, but the problem is that for audio recording you can't access the buffer; you can only start, stop or pause the recording process, not send data while capturing. So my question is: is it possible to stream audio captured in real time with the native Android functions (the AudioRecord class) to a Node.js server via Socket.IO or something similar?
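On the native side, what I have in mind is roughly the capture-and-send loop below. The server address, port and sample rate are just placeholders I picked, and the app would also need the RECORD_AUDIO permission:

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class MicStreamer {
        // Placeholder server address/port; the real values would come from my Node.js setup.
        private static final String SERVER = "192.168.1.10";
        private static final int PORT = 5005;
        private static final int SAMPLE_RATE = 16000;

        public void stream() throws Exception {
            int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, bufSize);
            DatagramSocket socket = new DatagramSocket();
            InetAddress server = InetAddress.getByName(SERVER);
            byte[] buffer = new byte[bufSize];
            recorder.startRecording();
            try {
                while (!Thread.interrupted()) {
                    // Raw PCM straight from the microphone, available while capturing.
                    int read = recorder.read(buffer, 0, buffer.length);
                    if (read > 0) {
                        socket.send(new DatagramPacket(buffer, read, server, PORT));
                    }
                }
            } finally {
                recorder.stop();
                recorder.release();
                socket.close();
            }
        }
    }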
I discovered this project, Asimi JS, but I don't know if someone else knows a better way to do what I want.
Thank you very much for your help!
It is certainly possible to do, but a standard Node.js HTTP server would not be advisable because it uses TCP. You want UDP as the transport layer for audio, since it is faster and the small amount of packet loss that can occur will most likely not be a problem.
To be completely honest with you, it sounds like you need to write a few demo applications on the native platforms, so do not use PhoneGap. You need the native platforms in order to access things such as the microphone and to stream over UDP.
When you have a demo working, you can go on and try another platform afterwards, but start with a simple setup instead of trying to do it all at once - if it were that easy, someone else would have done it before you.
Let me recommend a simple UDP server in whatever language you are most comfortable with (Node.js, Java, C, C++, C#). Let the UDP server receive the content and save it into a file that you can then play back on a desktop computer to verify the result. As a simple client, build one on either Android or iOS, and stream a file that you have already recorded and included in the app. When you have this setup working, you can try to capture the microphone, then do a user interface, then support multiple phones, then build a server which records the conversations, then build a user database, and so on and so forth. But start with a prototype of your main feature.
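A receive-and-save server along those lines can be very small. Here is a sketch in plain Java; the port number and the raw-PCM assumption are arbitrary choices for the demo:

    import java.io.FileOutputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    public class UdpAudioSink {
        public static void main(String[] args) throws Exception {
            int port = 5005;                    // arbitrary port for this sketch
            byte[] buffer = new byte[2048];     // must be >= the sender's packet size
            try (DatagramSocket socket = new DatagramSocket(port);
                 FileOutputStream out = new FileOutputStream("capture.raw")) {
                System.out.println("Listening on UDP port " + port);
                while (true) {
                    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    socket.receive(packet);     // blocks until a datagram arrives
                    // Append the payload to the file for later playback.
                    out.write(packet.getData(), 0, packet.getLength());
                }
            }
        }
    }

If the client sends raw 16-bit PCM, the resulting capture.raw can be imported into an audio editor as raw data to verify that the stream arrived intact.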
I've finally figured out and solved my problem (at least that's what I think)... First of all I created a server to send and receive UDP packets with DatagramSocket. After that, to achieve communication between server and client while connected over 3G, I needed a static port and IP on the server side; that's why my server couldn't connect to the client before. On a mobile data connection the client's IP and port are not always the same, so you have to keep the same socket open the whole time if you want to send and receive. The server, on the other hand, has to store the address and port of the client at the moment of connection.
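A minimal sketch of that idea, assuming a single client and an arbitrary port 5005, could look like this:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class PttRelayServer {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(5005); // fixed, publicly reachable port
            byte[] buffer = new byte[2048];
            InetAddress clientAddress = null;                 // learned from the first packet
            int clientPort = -1;
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);
                if (clientAddress == null) {
                    // Store the address/port the packet actually came from (the NAT mapping),
                    // not whatever the client thinks its local address is.
                    clientAddress = packet.getAddress();
                    clientPort = packet.getPort();
                }
                // Send back over the SAME socket so the NAT keeps the mapping open both ways.
                DatagramPacket reply = new DatagramPacket(
                        packet.getData(), packet.getLength(), clientAddress, clientPort);
                socket.send(reply);
            }
        }
    }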
Thank you very much for your help, ExxKA.
Related
I am working on a project that sends notifications to a web application when some phone sensors change their values rapidly. I am new to Android, and even though I know how to get the sensor values on my phone, I'm not sure how to transmit them to my PC server. I will look into sockets for network/Bluetooth transmission, but is there any way to send them via the Internet?
Basically, I want my application to work like this: I connect to my Android application on the phone, I shake the phone a bit, and then on my web application on the PC I get some notifications. I was looking around and I saw people working with a Python server?
How should I proceed here? Which method should I use?
Use HTTP/HTTPS on the Android client to send to your server, as that will be the easiest to program and the most reliable way to get data off the phone regardless of network type. See HttpURLConnection for details. Just do a POST with your data to the web address.
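A bare-bones POST with HttpURLConnection might look like the sketch below. The endpoint URL and the JSON layout are made up for illustration, and on Android this must run off the UI thread:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class SensorUploader {
        // Hypothetical endpoint; replace with your server's address.
        private static final String ENDPOINT = "http://192.168.1.10:8000/sensor";

        /** Posts one sensor reading as JSON. Call from a background thread. */
        public static int postReading(float x, float y, float z) throws Exception {
            byte[] body = String.format("{\"x\":%f,\"y\":%f,\"z\":%f}", x, y, z)
                    .getBytes(StandardCharsets.UTF_8);
            HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT).openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body);                    // write the request body
            }
            int status = conn.getResponseCode();    // forces the request to be sent
            conn.disconnect();
            return status;
        }
    }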
If you don't want to construct a simple PHP, ASPX, or other web service on top of a readily available HTTP server, you can use a variety of HTTP server modules for Python, SimpleHTTPServer for example.
I am developing an Android app for voice chat, using WebRTC on the client side and Node.js as the server. I have successfully been able to stream voice between two peers and used the Node.js server for signalling. But this method has a huge problem: because WebRTC connects the peers directly, a peer connected directly to 200 peers uses a lot of the device's CPU and bandwidth, and I want 500 or more peers to be able to voice chat without consuming much bandwidth and device CPU. To reduce the CPU load and bandwidth usage, I thought of creating a streaming link directly to the Node.js server and streaming from there to the other peers, so that each peer has a single link through which it communicates with the others. I want to know if there is a Node.js module capable of linking with Android's libjingle_peerconnection. I have tried node-webrtc and it does not work with a recent libjingle_peerconnection.
An Android device will not be able to directly connect to hundreds of WebRTC peers; this simply requires too many resources.
You want to look at a media server, like Kurento. Kurento runs on a server and can forward WebRTC media streams from one client to many other clients in the manner you describe. You have to write the signalling layer specific to your application, which you can do in Node.js similarly to the two-client case.
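To give an idea of the shape of such a setup, here is a rough sketch using Kurento's Java client; the media server address is an assumption, and ICE candidate forwarding, error handling and session cleanup are omitted. The same structure can be written against Kurento from Node.js:

    import org.kurento.client.KurentoClient;
    import org.kurento.client.MediaPipeline;
    import org.kurento.client.WebRtcEndpoint;

    public class OneToManyAudio {
        private final KurentoClient kurento =
                KurentoClient.create("ws://localhost:8888/kurento"); // assumed KMS address
        private MediaPipeline pipeline;
        private WebRtcEndpoint presenter;

        /** The speaking client sends its SDP offer over your signalling channel; return the answer. */
        public String onPresenterOffer(String sdpOffer) {
            pipeline = kurento.createMediaPipeline();
            presenter = new WebRtcEndpoint.Builder(pipeline).build();
            String sdpAnswer = presenter.processOffer(sdpOffer);
            presenter.gatherCandidates();
            return sdpAnswer;
        }

        /** Each listener gets its own endpoint; the media server fans the stream out. */
        public String onViewerOffer(String sdpOffer) {
            WebRtcEndpoint viewer = new WebRtcEndpoint.Builder(pipeline).build();
            presenter.connect(viewer);  // server-side routing: one uplink, many downlinks
            String sdpAnswer = viewer.processOffer(sdpOffer);
            viewer.gatherCandidates();
            return sdpAnswer;
        }
    }

The point of this layout is that each phone keeps exactly one PeerConnection (to the media server), so the fan-out cost moves off the device and onto the server.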
I would like to implement a voice chat (even just primitive support for it) without the hassle of implementing the whole VoIP stack. I would also like to make it available on both iOS and Android.
I have been searching for a way to do this, and it just seems to me that using Socket.IO would be something to start with. Is it possible to use Socket.IO as a signalling server (to discover who is available, and who to start a voice chat with), and then establish a peer-to-peer connection between two devices in order to transfer audio data?
NOTE: I would prefer peer-to-peer, because I would like to avoid overloading the server with such an enormous amount of data, but I have heard that iOS and Android cannot establish a P2P connection with one another. If that's the case, is it possible to have Socket.IO receive and transmit the audio data between devices, i.e. use a socket.io server as an intermediary for sending the audio?
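For the relay fallback, what I imagine on the Android side is something like the sketch below, using the Socket.IO Java client (io.socket:socket.io-client). The server URL and the "audio" event name are just placeholders I made up; the Node.js side would simply rebroadcast that event to the other client:

    import io.socket.client.IO;
    import io.socket.client.Socket;

    public class AudioRelayClient {
        private Socket socket;

        public void connect() throws Exception {
            socket = IO.socket("http://example.com:3000"); // hypothetical relay server
            // Play back whatever audio chunks the server relays from the other device.
            socket.on("audio", args -> {
                byte[] chunk = (byte[]) args[0];
                // feed the chunk into an AudioTrack here
            });
            socket.connect();
        }

        /** Send one chunk of recorded audio; the server forwards it to the other client. */
        public void sendChunk(byte[] pcm) {
            socket.emit("audio", (Object) pcm);
        }
    }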
I found the following helpful links (which are probably based on socket.io), but they don't have any reference to mobile devices.
http://peerjs.com/
https://simplewebrtc.com/
Any help is greatly appreciated. Alternatives to socket.io are also welcome, if there are any.
I want to create an app that can send and receive files such as text and images between multiple users on Android without any Internet connection, just by creating a local server on one device and using it as a hotspot for the data transmission.
Any suggestions?
You can do this by using socket programming.
https://docs.oracle.com/javase/tutorial/networking/sockets/
In a nutshell, the device which you would like to make the server should create a server socket and keep listening on a port. All client devices can create a client socket, connect to the server socket and transmit data.
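A minimal sketch of that setup in plain Java could look like this (port 9000 and the file names are arbitrary); the client simply streams the file in fixed-size chunks:

    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class FileTransferDemo {
        /** Run this on the "server" device (the hotspot owner). */
        public static void receive() throws Exception {
            try (ServerSocket server = new ServerSocket(9000);   // arbitrary port
                 Socket client = server.accept();                // wait for one client
                 InputStream in = client.getInputStream();
                 FileOutputStream out = new FileOutputStream("received.bin")) {
                byte[] chunk = new byte[4096];
                int read;
                while ((read = in.read(chunk)) != -1) {          // read chunk by chunk
                    out.write(chunk, 0, read);
                }
            }
        }

        /** Run this on the client device; serverIp is the hotspot owner's address. */
        public static void send(String serverIp, String path) throws Exception {
            try (Socket socket = new Socket(serverIp, 9000);
                 OutputStream out = socket.getOutputStream();
                 FileInputStream in = new FileInputStream(path)) {
                byte[] chunk = new byte[4096];
                int read;
                while ((read = in.read(chunk)) != -1) {          // send the file part by part
                    out.write(chunk, 0, read);
                }
            }
        }
    }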
You can refer to the sample app "Bluetooth Chat" which is available in the Android SDK samples. You can use the same concept to transfer files as chunks of bytes (part by part).
Good luck!!
Start with the basics (https://en.wikipedia.org/wiki/Communications_protocol).
Think about what you want to achieve.
Choose your protocol or technology (HTTP, FTP, REST, UPnP or a thousand others).
Then move on to implementation. You are likely to use some existing library.
I downloaded the open-source code of WebRTC. Inside it I found the WebRTCDemo test project for Android. I am able to generate the APK, but when I install it my devices are not able to communicate with each other...
Steps of what I am doing:
1. In the application settings -> HostId I put the IP address of the other Android device and press the start call button, but the problem is that nothing happens on the other side.
My questions are:
1. To communicate with the other device, do I have to set up any server?
2. Can anyone explain how it works in the case of Android devices?
Please help me.
Thank You
Krishna.
Yes, you most definitely do need a 'messaging server'. Your task here is to relay the SDP from one client to the other. The SDP includes the ICE candidates, which basically tell a client how to directly 'reach' the other (an IP address + port combination). Once both clients have exchanged these 'handshake' signals, they can start transferring their streams peer to peer.
Now, the implementation of this server is completely left to you. Since it is decoupled from the rest of the WebRTC API, you can safely resort to any technology to make sure that these signalling messages are exchanged between the two clients. And once you have successfully established a PeerConnection, you can from there on even use the DataChannels to renegotiate.
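To make the idea concrete, here is a deliberately minimal relay sketch in plain Java that accepts exactly two clients and forwards every line (an SDP offer/answer or ICE candidate serialized as a single line) from one to the other. Real apps usually do this over WebSockets or Socket.IO, but the role of the server is the same:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class SignallingRelay {
        public static void main(String[] args) throws Exception {
            try (ServerSocket server = new ServerSocket(8080)) {
                Socket a = server.accept();     // first client
                Socket b = server.accept();     // second client
                Thread t1 = pump(a, b);
                Thread t2 = pump(b, a);
                t1.join();
                t2.join();
            }
        }

        /** Forward every line read from one socket to the other, unchanged. */
        private static Thread pump(Socket from, Socket to) {
            Thread t = new Thread(() -> {
                try (BufferedReader in = new BufferedReader(
                             new InputStreamReader(from.getInputStream()));
                     PrintWriter out = new PrintWriter(to.getOutputStream(), true)) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        out.println(line);      // relay the signalling message
                    }
                } catch (Exception ignored) {
                }
            });
            t.start();
            return t;
        }
    }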
To sum things up,
Yes, you do need a server to relay the messages between the two clients.
Since this is independent of the WebRTC implementation, you can resort to any technology of your choice.