I have an IP camera from Apexis, model J011ws, and I'm developing an app to control it. For several days I've been trying to send audio to the camera, but I don't know how to do it. I'm using PhoneGap, and I haven't found anything helpful on the internet.
I have a music player and need to add synchronized playback across devices. For example, if two or more users are running my music player and want to play the same song on all devices, they just connect to the same network and can play the music on all devices, with complete control of every device from a single one.
Would anyone explain to me which approach is best, how I can share audio from one Android device to another in sync, and what the steps are to do so?
Points I know about Wi-Fi P2P:
create connection
create socket for sharing
share a complete file
Points I want to know:
How can I share the file without storing it on the other device's storage? (see the sketch after this list)
How can I play the sound on both devices at the same position (in sync)?
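On the first point: if the sender writes raw audio to the socket, the receiver can feed the bytes straight into an AudioTrack and never touch storage. A minimal sketch of the receiving side, assuming the sender transmits raw 16-bit mono PCM at 44.1 kHz over the Wi-Fi P2P socket (the format is an assumption; any format both sides agree on works):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;
    import java.io.InputStream;
    import java.net.Socket;

    public class StreamPlayer {
        // Play PCM audio straight from the socket; nothing is written to storage.
        public static void play(Socket socket) throws Exception {
            int sampleRate = 44100; // assumed: 16-bit mono PCM at 44.1 kHz
            int bufSize = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bufSize, AudioTrack.MODE_STREAM);
            track.play();
            InputStream in = socket.getInputStream();
            byte[] buffer = new byte[bufSize];
            int read;
            while ((read = in.read(buffer)) != -1) {
                track.write(buffer, 0, read); // blocking write paces the playback
            }
            track.stop();
            track.release();
        }
    }

For the second point, one common approach (not the only one) is to have the controlling device send a "start at position X at wall-clock time T" message and have every device wait until T before starting, after first agreeing on a shared clock offset.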
And beyond Wi-Fi P2P, I should say that I don't know anything about WebRTC, for example:
How does it work?
How do I set up a connection with it?
Does it always require an internet connection?
Is the same application required on both devices to create a connection between them?
I don't know whether this is helpful for you or not, but have a look at the link; maybe you'll find some useful info.
For WebRTC:
https://webrtc.org/native-code/android/
This link will help you learn about WebRTC: how it works and how to set it up.
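To give a first taste of the Android side, here is a minimal sketch assuming the org.webrtc library distributed from the site above (class names per its current API; versions differ). It only creates a local audio track; a real connection also needs a PeerConnection plus a signaling channel to exchange SDP offers/answers and ICE candidates. The signaling channel is usually a server on the internet, but it can be anything both devices can reach, and the two ends only have to speak the same signaling protocol, not be the same app:

    import android.content.Context;
    import org.webrtc.AudioSource;
    import org.webrtc.AudioTrack;
    import org.webrtc.MediaConstraints;
    import org.webrtc.PeerConnectionFactory;

    public class WebRtcAudioSetup {
        // Create a local audio track ready to be added to a PeerConnection.
        public static AudioTrack createLocalAudioTrack(Context appContext) {
            PeerConnectionFactory.initialize(
                    PeerConnectionFactory.InitializationOptions.builder(appContext)
                            .createInitializationOptions());
            PeerConnectionFactory factory =
                    PeerConnectionFactory.builder().createPeerConnectionFactory();
            AudioSource source = factory.createAudioSource(new MediaConstraints());
            AudioTrack track = factory.createAudioTrack("audio0", source);
            track.setEnabled(true);
            return track;
        }
    }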
I am trying to develop a program that connects a Raspberry Pi and an Android application over the public internet (i.e. they are on different networks: a user in NY is on Android over LTE, while the Raspberry Pi is connected to Wi-Fi in Oregon).
So far, I have succeeded in connecting them via the Pusher service over a TCP connection for controlling GPIO pins.
However, I just can't figure out how to set up live streaming from the Raspberry Pi (Pi camera).
I tried YouTube live streaming; however, I had to enable AdSense, and to enable that I have to reach over 4,000 watch hours, which is an overwhelming effort for a small project...
Ideas I had in mind:
Periodically upload photos taken by the Raspberry Pi to Amazon S3 and download them to Android, making them look like a video.
Build a web server that hosts the live stream, and get a static IP.
If there's a service that hosts live video streaming, please do let me know.
Any help would be greatly appreciated! Thanks in advance!
You can use NGINX with the RTMP module as the server; please look here: https://github.com/arut/nginx-rtmp-module
Also, if you want to play your RTMP stream on Android, you need an RTMP/RTSP client in your app. You can find one here: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
So there is no need to pay anything.
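As an alternative to a dedicated RTMP client on the phone, one route (my assumption, not part of the answer above) is to enable nginx-rtmp's HLS output (the hls on; directive in the RTMP application block), because Android's stock VideoView can play HLS over plain HTTP. A sketch, with a placeholder server URL:

    import android.net.Uri;
    import android.widget.VideoView;

    public class HlsPlayback {
        // Call from an Activity once the layout is inflated. The URL is a
        // placeholder; nginx-rtmp serves the .m3u8 from its configured hls_path.
        public static void playStream(VideoView videoView) {
            videoView.setVideoURI(Uri.parse("http://your-server:8080/hls/pistream.m3u8"));
            videoView.setOnPreparedListener(mp -> videoView.start());
        }
    }

The Pi would publish into the RTMP application (for example with ffmpeg pointed at rtmp://your-server/live/pistream), and nginx-rtmp repackages it for viewers.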
I'm trying to develop a project like PTTDroid, i.e. a push-to-talk or walkie-talkie application.
The issue is that in that app you can't use 3G to access the web, so I've decided to use a Node.js server and implement an Android client to communicate with it. I tried to do a multi-platform project using PhoneGap; the problem is that for audio recording you can't access the buffer, you can only start, stop, or pause the recording process, not send data while capturing. So my question is: is it possible to stream audio captured in real time with native Android functions (the AudioRecord class) to a Node.js server via Socket.IO or similar?
I discovered the project Asimi JS, but I don't know whether someone knows a better way to do what I want.
Thank you very much for your help!
It is certainly possible, but a standard Node.js HTTP server would not be advisable, as it uses TCP. You want UDP as the transport layer for audio, since it is faster, and the small amount of packet loss that can occur will most likely not be a problem.
To be completely honest, it sounds like you need to write a few demo applications on the native platforms, so do not use PhoneGap. You need the native platforms in order to access things such as the microphone and to stream over UDP.
When you have a demo working, you can go on and try another platform afterwards, but start with a simple setup instead of trying to do it all at once; if it were that easy, someone else would have done it before you.
Let me recommend a simple UDP server in whatever language you are most comfortable with (Node.js, Java, C, C++, C#). Let the UDP server receive and save the content into a file that you can then play back on a desktop computer to verify the result. As a simple client, build one on either Android or iOS, and stream a file that you have already recorded and included in the app. When you have this setup working, you can try to capture the microphone, then do a user interface, then support multiple phones, then build a server that records the conversations, then build a user database, and so on and so forth. But start with a prototype of your main feature.
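That first demo server is a few lines in Java (port and file name below are placeholders; the payload is written as-is, so if the client sends raw 16-bit PCM you can import the file into an audio editor to verify it):

    import java.io.FileOutputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    public class UdpAudioDump {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket(5005);
                 FileOutputStream out = new FileOutputStream("capture.raw")) {
                byte[] buf = new byte[2048];
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                while (true) {
                    socket.receive(packet); // blocks until a datagram arrives
                    out.write(packet.getData(), 0, packet.getLength());
                }
            }
        }
    }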
I've finally figured out and solved my problem (at least I think so). First of all, I created a server to send and receive UDP packets via DatagramSocket. Then, to achieve communication between server and client while connected over 3G, I needed a static port and IP; that's why my server couldn't connect to the client. On a mobile data connection, the user's IP and port are not always the same, and you have to keep the same socket open at all times if you want to send and receive. The server, on the other hand, has to store the client's address and port at the moment of connection.
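Roughly, the pattern is (a sketch; the port is a placeholder): the server sits on a fixed, publicly reachable port, learns the client's NAT-assigned address and port from the client's first datagram, and keeps replying through that same mapping, while the client keeps its socket open so the mapping does not expire:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class UdpRegistryServer {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(5005);  // fixed public port
            byte[] buf = new byte[2048];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);                       // client's first packet
            InetAddress clientAddr = packet.getAddress(); // address as seen after NAT
            int clientPort = packet.getPort();
            // Later data can be pushed back through the same NAT mapping:
            byte[] reply = "ack".getBytes();
            socket.send(new DatagramPacket(reply, reply.length, clientAddr, clientPort));
        }
    }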
Thank you very much for your help, ExxKA.
I'm working with an entry phone. The entry phone sends me voice and video over the RTSP protocol, so I can simply get voice and video from its camera on my device. But I have no idea how to send voice to that device. Is there anything that would help me send and receive audio at the same time (something like a call)?
If I understand correctly, you want video calls, just like the Sipdroid app. It is an open-source project; look at the VideoCamera.java class in this project.
http://code.google.com/p/sipdroid/
I'm developing an AIR for Android application, and am currently sending audio to FMS servers via the standard NetStream/Microphone options. I (ignorantly) assumed that attaching a Bluetooth device would be pretty simple, and that connecting it would make it show up as a native "Microphone". Unfortunately, it does not.
I don't think it is even possible to use NetStream.publish to publish raw bytes, so the only hope is that there's a way to use NativeProcess + Java to create a native microphone "handle" that AIR can pick up on.
Has anyone run into this issue?
I think one possible solution would be using NetConnection.send() instead of NetStream.publish().
You should get the sound data from your BT microphone. I am not sure you can get it using AIR. You may need an Android service that grabs the sound data and feeds your AIR app via a file, a UDP port, an invoke, etc. (see the sketch after these steps).
When you get some sound data, encode it so Flash can play it (Speex, Nellymoser, etc.). You can do the encoding in your Android service as well.
Whenever your AIR app receives sound data, send it to your streaming server via NetConnection.send().
Extend your streaming server to process the sound data it receives. You can embed it into an FLV stream, or send it to other Flash clients if it is a chat app.
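The Android-service half of the first step could look roughly like this sketch, assuming the headset supports SCO and with the local port as a placeholder (a real app would encode before forwarding, and would run this off the main thread):

    import android.content.Context;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class BtMicForwarder {
        // Route mic input through the Bluetooth headset via SCO and forward
        // raw PCM to the AIR app listening on a local UDP port.
        public static void run(Context ctx) throws Exception {
            AudioManager am = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);
            am.startBluetoothSco();
            am.setBluetoothScoOn(true);

            int rate = 8000; // SCO links are narrowband
            int bufSize = AudioRecord.getMinBufferSize(rate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufSize);
            DatagramSocket socket = new DatagramSocket();
            InetAddress local = InetAddress.getByName("127.0.0.1");

            rec.startRecording();
            byte[] buf = new byte[bufSize];
            while (!Thread.interrupted()) {
                int n = rec.read(buf, 0, buf.length);
                if (n > 0) socket.send(new DatagramPacket(buf, n, local, 7000));
            }
            rec.stop();
            rec.release();
            am.setBluetoothScoOn(false);
            am.stopBluetoothSco();
            socket.close();
        }
    }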
Other than that, I can't find a way to have a "microphone handle" for your BT microphone. I once thought of creating a virtual device on Android, but I couldn't find any solution.