I am trying to develop a program that connects a Raspberry Pi and an Android application over the public internet (i.e., they are on different networks: the user in NY is on Android over LTE, while the Raspberry Pi is connected to Wi-Fi in Oregon).
So far, I have succeeded in connecting them through the Pusher service over TCP to control the GPIO pins.
However, I just can't figure out how to set up live streaming from the Raspberry Pi (Pi camera).
I tried YouTube live streaming; however, I had to enable AdSense, and to enable that I would have to reach over 4,000 watch hours, which is an overwhelming effort for such a small project...
Ideas I had in mind:
Periodically upload photos taken by the Raspberry Pi to Amazon S3 and download them to Android, making it look like a video.
Build a web server that hosts the live stream and get a static IP.
If there's a service that hosts live video streaming, please do let me know...
Any help would help me greatly! Thanks in advance!
You can use NGINX with the RTMP module; please look here: https://github.com/arut/nginx-rtmp-module
Also, if you want to play your RTMP stream on Android, you need an RTMP/RTSP client in your app. You can find one here: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java
So there is no need to pay for anything.
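If you would rather not pull an RTMP client library into the app, note that nginx-rtmp-module can also repackage the incoming RTMP feed as HLS (its hls option), which the stock Android VideoView can play directly. A minimal sketch, assuming a hypothetical server address and stream name:

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class PiStreamActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        // Hypothetical URL: nginx-rtmp-module serving the Pi camera feed as HLS.
        Uri hls = Uri.parse("http://your-server.example.com:8080/hls/picam.m3u8");
        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(hls);
        videoView.start(); // buffers and begins playback
    }
}
```

Keep in mind HLS adds noticeable latency (often 10-30 seconds); if you need near-real-time feedback alongside your GPIO control, the RTMP/RTSP client route above is the better fit.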
I have a music player and need to add synchronized playback with other phones. For example, if two or more users are using my music player and want to play the same song on all devices, they should just connect over the same network and be able to play the music on all devices, with complete control of all devices from a single device.
Would anyone explain which approach is best, how I can share audio from one Android device to another in sync, and what the steps are to do so?
Points I know about Wi-Fi P2P:
create a connection
create a socket for sharing
share a complete file
Points I want to know:
How can I share a file without storing it in the other device's storage?
How can I play the sound on both devices at the same position (in sync)?
And beyond Wi-Fi P2P, I should say that I don't know anything about WebRTC, e.g.:
How does it work?
How do I set up a connection for this?
Does it always require an internet connection?
Is the same application required on both devices to create a connection between them?
I don't know whether this is helpful for you or not; just look at the links, maybe you will get some useful info...
For WebRTC:
https://webrtc.org/native-code/android/
This link will help you learn about WebRTC: how it works and how to set it up.
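For the "play at the same position" part specifically, a common low-tech approach (independent of WebRTC) is to have the controlling device send a small "play position P at wall-clock time T" message over the Wi-Fi P2P socket, and have every device start playback when its clock reaches T. A rough sketch, assuming both devices already have the audio file and roughly synchronized clocks (names and wire format are hypothetical):

```java
import android.media.MediaPlayer;
import java.io.DataInputStream;
import java.net.Socket;

public class SyncedPlayback {
    /**
     * Reads a "play command" from the controlling device over an
     * already-connected Wi-Fi P2P socket, then starts local playback
     * at the agreed moment. Hypothetical wire format:
     * long startAtMillis, then int positionMillis.
     */
    public static void awaitAndPlay(Socket socket, MediaPlayer player) throws Exception {
        DataInputStream in = new DataInputStream(socket.getInputStream());
        long startAtMillis = in.readLong(); // wall-clock time to start at
        int positionMillis = in.readInt();  // song position to start from

        player.seekTo(positionMillis);      // pre-seek so start() is instant

        long wait = startAtMillis - System.currentTimeMillis();
        if (wait > 0) {
            Thread.sleep(wait);             // in a real app, use a Handler
        }
        player.start();                     // all devices start together
    }
}
```

Device clocks drift, so for tight sync you would first exchange an NTP-style clock offset over the same socket and correct T by it.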
I'm trying to create a mic app to use inside a conference, as real mics for the audience. We will connect Android devices to a Wi-Fi LAN, and anybody can trigger the mic from their app to say something to the others. The audio data will go to a Java server program inside the LAN and from there to the speakers.
I don't know how to get this done. Can someone help me?
Thanks in advance
I think the easiest solution is WebRTC.
You can use WebRTC inside your Android program; it can handle the microphone/video itself without any redundant code, plugins, or libraries.
You can get started via https://webrtc.org/native-code/android/
WebRTC requires the use of three IETF NAT traversal standards to address these issues:
Interactive Connectivity Establishment (ICE) – RFC 5245
Session Traversal Utilities for NAT (STUN) – RFC 5389
Traversal Using Relays around NAT (TURN) – RFC 5766
So you don't need to handle client-to-client packet transport yourself, even when the traffic has to cross internet gateways.
There are also a bunch of SO threads you can read and use (a sketch of the raw-socket approach follows the list):
1. Stream Live Android Audio to Server
2. Streaming voice between Android Phones over WiFi
3. Android. How to record the microphone over audio stream?
4. An extensive project: Streaming audio from microphone to Android device
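To give a flavour of what those threads do without WebRTC, here is a bare-bones sketch of the raw approach for your LAN case: capture PCM from the microphone with AudioRecord and push it over a TCP socket to the Java server. The host, port, and framing are hypothetical, the app needs the RECORD_AUDIO permission, and WebRTC would replace all of this plumbing while adding echo cancellation and codecs:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.OutputStream;
import java.net.Socket;

public class MicStreamer {
    /** Streams raw 16-bit mono PCM from the mic to host:port until interrupted. */
    public static void streamMic(String host, int port) throws Exception {
        int sampleRate = 16000;
        int minBuf = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf);

        try (Socket socket = new Socket(host, port)) {
            OutputStream out = socket.getOutputStream();
            byte[] buffer = new byte[minBuf];
            recorder.startRecording();
            while (!Thread.currentThread().isInterrupted()) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) {
                    out.write(buffer, 0, read); // raw PCM frames to the server
                }
            }
        } finally {
            recorder.stop();
            recorder.release();
        }
    }
}
```

The server side just reads the same byte stream and hands it to the audio output; the linked threads cover that half.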
I am building an Android application which will stream video calls from one device to another Android device. For that I am using the Wowza video streaming API (Media Engine). With it I am able to stream video from the Android app to the web, but is device-to-device video streaming possible?
If you are planning to develop all the infrastructure yourself, then these are the points to be evaluated and decided on.
What Technology is used
WebRTC is the technology used to support video calling. WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via APIs (check out the details at https://webrtc.org/). It was open-sourced by Google in 2011 and allows real-time communication between two browsers/mobiles.
Concepts involved
1. Data streams and Hardware
WebRTC helps in setting up and identifying hardware (microphone, camera, and speakers) and in identifying the network with a STUN server (explained under point 4 below). For mobiles this hardware comes built in.
2. Audio/Video Codecs
Google has open-sourced the audio/video codecs required for these features: generally G.711 for phone audio (though this still varies in specific cases), and VP8 and VP9 for video.
3. Peer Discovery
To make a call, the other party's address is generally required, but on the internet most IPs are dynamic. To solve this, a server needs to keep track of who is online; this can be done using XMPP, SIP, or some custom protocol. So for anyone to receive a call, the caller has to check with the server, or the other way around.
4. STUN Server
Once signalling (peer discovery) is done, a STUN server is required. This server helps each device determine its external IP address, as well as whether the two (or more) devices can talk to each other directly.
5. TURN Server
If a peer-to-peer session is not possible, then a TURN server is required. The TURN server basically relays the bits for you through open holes in the firewalls between the two clients. This is needed because of asymmetric firewalls and the limited possibility of punching holes on different ports.
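To make points 4 and 5 concrete: in the official Android WebRTC library (org.webrtc), the STUN/TURN servers are handed to the PeerConnection as ICE server entries. A minimal sketch; the TURN URL and credentials are placeholders:

```java
import org.webrtc.PeerConnection;
import java.util.ArrayList;
import java.util.List;

public class IceConfig {
    /** Builds an RTCConfiguration with one STUN and one (placeholder) TURN server. */
    public static PeerConnection.RTCConfiguration build() {
        List<PeerConnection.IceServer> iceServers = new ArrayList<>();

        // STUN: lets each peer discover its external IP address (point 4).
        iceServers.add(PeerConnection.IceServer
                .builder("stun:stun.l.google.com:19302")
                .createIceServer());

        // TURN (placeholder credentials): relays media when a direct
        // peer-to-peer path is impossible (point 5).
        iceServers.add(PeerConnection.IceServer
                .builder("turn:turn.example.com:3478")
                .setUsername("user")
                .setPassword("secret")
                .createIceServer());

        return new PeerConnection.RTCConfiguration(iceServers);
    }
}
```

You pass this configuration to PeerConnectionFactory.createPeerConnection(...) together with your observer; the signalling from point 3 still has to be built separately.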
Or else you can use providers like SINCH, who already handle and configure these basic requirements, so you only need to concentrate on the mobile front end.
Check out the SINCH ANDROID SAMPLE as well.
I am developing an Android app for voice chat, using WebRTC on the client side and node.js as the server. I have successfully been able to stream voice between two peers, using the node.js server for signalling. But this method has a huge problem: because WebRTC connects the peers directly, when a peer is connected directly to 200 peers it uses a lot of the device's CPU and bandwidth, and I want 500 or more peers to be able to voice chat without consuming much bandwidth or device CPU. To reduce the CPU load and bandwidth usage, I thought of creating a streaming link directly with the node.js server and streaming from there to the other peers, so that each peer has a single link through which it communicates with the others. I want to know if there is a node.js module capable of linking with Android's libjingle_peerconnection. I have tried node-webrtc, and it does not work with recent libjingle_peerconnection.
An Android device will not be able to directly connect to hundreds of WebRTC peers; this simply requires too many resources.
You want to look at a media server, like Kurento. Kurento will run on a server and be able to send WebRTC media streams from one client to many other clients in the manner you describe. You have to write the signalling layer specific to your application, which you can do in node.js similarly to the two-client case.
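Kurento ships an official node.js client (kurento-client on npm) as well as a Java one; I can't speak to its compatibility with every libjingle_peerconnection build, but the media topology looks the same in either client. A sketch in Java of the one-to-many wiring, assuming a reachable Kurento Media Server and SDP offers arriving over your own signalling:

```java
import org.kurento.client.KurentoClient;
import org.kurento.client.MediaPipeline;
import org.kurento.client.WebRtcEndpoint;
import java.util.List;

public class OneToManyAudio {
    /** Fans one speaker's WebRTC stream out to many listeners via the server. */
    public static void wireUp(String kmsUri, String speakerOffer,
                              List<String> listenerOffers) {
        KurentoClient kurento = KurentoClient.create(kmsUri); // e.g. ws://host:8888/kurento
        MediaPipeline pipeline = kurento.createMediaPipeline();

        // One endpoint for the speaking peer; its SDP answer goes back
        // over your node.js signalling channel.
        WebRtcEndpoint speaker = new WebRtcEndpoint.Builder(pipeline).build();
        String speakerAnswer = speaker.processOffer(speakerOffer);
        speaker.gatherCandidates();

        // One endpoint per listener: the server does the fan-out, so each
        // phone keeps exactly one connection instead of hundreds.
        for (String offer : listenerOffers) {
            WebRtcEndpoint listener = new WebRtcEndpoint.Builder(pipeline).build();
            String answer = listener.processOffer(offer);
            listener.gatherCandidates();
            speaker.connect(listener); // media flows speaker -> listener
            // ...return each SDP answer to its peer over signalling as well.
        }
    }
}
```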
I am curious to know if there is a way I can make my Android device act as an RTSP server and stream video to any other capable device.
I have no understanding of RTSP servers or the other protocols that need to be followed to make this happen.
Any help is appreciated.
Thanks,
SKU
There is nothing to worry about with this problem. I have seen a solution here:
http://techsplurge.com/5080/get-vlc-media-player-for-android-with-unofficial/ Take a read and you'll get your answer.
AFAIK, the architecture you should adopt is to put a remote machine in place.
On this machine, you must have a media server installed: Flash Media Server, Red5, etc. It will host the application that receives the incoming stream and outputs it.
Then you can stream from the device (the one you called the "server") to the remotely hosted application (e.g., www.remoteserver.com:1234/myApp). To play it back, just feed that URL to a media widget (http://developer.android.com/reference/android/widget/VideoView.html); fortunately, Android natively supports the RTSP protocol via that VideoView widget.
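For the playback side just described, the VideoView part is only a few lines; the RTSP URL below is a placeholder for wherever the hosted application exposes the stream:

```java
import android.net.Uri;
import android.widget.VideoView;

public class RtspPlayback {
    /** Points an existing VideoView at an RTSP stream; the URL is a placeholder. */
    public static void play(VideoView view) {
        view.setVideoURI(Uri.parse("rtsp://www.remoteserver.com:1234/myApp"));
        view.setOnErrorListener((mp, what, extra) -> {
            // RTSP support varies by device and codec; handle failure here.
            return true; // true = error consumed, no default dialog
        });
        view.start();
    }
}
```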
Hope it helps.
I am trying to implement something similar. One suggestion I received: when you make your device an RTSP server and someone wants to stream video from it, that is peer-to-peer streaming. In this case it is better to involve a server in between, which acts as an encoder/decoder. When another client sends an RTSP SETUP request, it still comes to your device; but when you are ready to stream video, you send the payload (using RTP) to the server, and the server in turn sends it to the requesting device. That way you can avoid the firewall problems possible in a LAN or with routers (in the case of Wi-Fi networks).
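A minimal sketch of that relay idea in plain Java (the port and the viewer's "HELLO" registration handshake are hypothetical; a real server would parse RTSP/RTP properly and track sessions):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketAddress;

public class RtpRelay {
    public static void main(String[] args) throws Exception {
        // The camera device sends RTP packets here; the viewer registers here too.
        DatagramSocket socket = new DatagramSocket(5004);
        SocketAddress viewer = null;

        byte[] buf = new byte[2048];
        while (true) {
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);

            // Hypothetical handshake: the viewer announces itself with "HELLO".
            if (packet.getLength() == 5
                    && "HELLO".equals(new String(packet.getData(), 0, 5))) {
                viewer = packet.getSocketAddress();
                continue;
            }

            // Everything else is treated as RTP payload from the camera device
            // and forwarded verbatim to the registered viewer.
            if (viewer != null) {
                packet.setSocketAddress(viewer);
                socket.send(packet);
            }
        }
    }
}
```

Because both devices only ever send outbound packets to the relay's public address, neither side needs port forwarding on its own router.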