Android live video streaming from one device to another

I am building an Android application that will stream video calls from one device to another Android device. For that I am using the Wowza video streaming API (Media Engine). With it I am able to stream video from the Android app to the web, but is device-to-device video streaming possible?

If you are planning to develop all the infrastructure yourself, these are the points to evaluate and decide on.
What Technology is used
WebRTC is the technology used to support video calling. WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via APIs. Check out the WebRTC details here. It was open-sourced by Google in 2011 and allows real-time communication between two browsers or mobile devices.
Concepts involved
1. Data streams and Hardware
WebRTC helps with setting up and identifying hardware (microphone, camera, and speakers) and with identifying the network path via a STUN server (What is a STUN server). On mobiles this hardware is built in.
2. Audio Video CODECS
Google has open-sourced the audio/video codecs required for these features. Typically this is G.711 audio for phones (it still varies in specific cases), and VP8 or VP9 for video.
3. Peer Discovery
To make a call, the other party's address is required. On the internet most IPs are dynamic, so a server needs to keep track of who is online. This can be done using XMPP, SIP, or a custom protocol. For anyone to receive a call, the caller must be able to look the callee up with the server, or the other way around.
4. STUN Server
Once signalling (peer discovery) is done, a STUN server is required. This server helps each device determine its external IP address, and whether two or more devices can talk to each other directly.
5. TURN Server
If a peer-to-peer session is not possible, a TURN server is required. The TURN server essentially shifts the bits for you through holes punched in the firewalls between the two clients. This is needed because of asymmetric firewalls and the varying possibility of punching holes on different ports. A minimal sketch of wiring STUN and TURN into an Android peer connection follows below.
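For illustration, here is a minimal sketch (not a complete implementation) of configuring STUN and TURN servers on an Android PeerConnection using the org.webrtc library. The STUN URL is Google's public server; the TURN URL and credentials are placeholders you would replace with your own.

```java
import android.content.Context;
import java.util.ArrayList;
import java.util.List;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;

public class PeerConnectionHelper {

    public static PeerConnection create(Context context,
                                        PeerConnection.Observer observer) {
        // One-time initialization of the native WebRTC libraries.
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions.builder(context)
                        .createInitializationOptions());
        PeerConnectionFactory factory =
                PeerConnectionFactory.builder().createPeerConnectionFactory();

        // STUN discovers the device's external address (point 4);
        // TURN relays media when no direct path exists (point 5).
        List<PeerConnection.IceServer> iceServers = new ArrayList<>();
        iceServers.add(PeerConnection.IceServer
                .builder("stun:stun.l.google.com:19302")
                .createIceServer());
        iceServers.add(PeerConnection.IceServer
                .builder("turn:turn.example.com:3478") // placeholder TURN server
                .setUsername("user")
                .setPassword("pass")
                .createIceServer());

        PeerConnection.RTCConfiguration config =
                new PeerConnection.RTCConfiguration(iceServers);
        // The observer receives ICE candidates and remote streams; you relay
        // the candidates to the other device over your signalling channel (point 3).
        return factory.createPeerConnection(config, observer);
    }
}
```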
Alternatively, you can use a provider like SINCH, which already handles and configures these basic requirements, so you only need to concentrate on the mobile front end.
Check out SINCH ANDROID SAMPLE as well

Related

Streaming the audio from a microphone through a LAN Wi-Fi network

I'm trying to create a mic app to use inside a conference as real mics for the audience. We will connect Android devices to a Wi-Fi LAN, and anybody could trigger the mic from their app to say something to the others. The data will go to a Java server program inside the LAN and from there to the speakers.
I don't know how to get this done. Can someone help me?
Thanks in advance
I think the easiest solution is WebRTC.
You can use WebRTC inside your Android program; it can handle the microphone and video itself without any redundant code, plugins, or libraries.
You can use it via https://webrtc.org/native-code/android/
WebRTC standards require the use of three IETF NAT traversal standards to address these issues:
1. Interactive Connectivity Establishment (ICE) – RFC 5245
2. Session Traversal Utilities for NAT (STUN) – RFC 5389
3. Traversal Using Relays around NAT (TURN) – RFC 5766
So you don't need to handle client-to-client packet transport yourself if you want to go through internet gateways.
There are also a bunch of SO threads you can read and draw on:
1. Stream Live Android Audio to Server
2. Streaming voice between Android Phones over WiFi
3. Android. How to record the microphone over audio stream?
4. An extensive project: Streaming audio from microphone to Android device
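If you do want to hand-roll it on a LAN rather than use WebRTC, the threads above boil down to something like this sketch: capture raw PCM from the microphone with AudioRecord and push each chunk over UDP to the Java server. The server address, port, and sample rate here are placeholder values, and the app needs the RECORD_AUDIO permission in its manifest.

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class MicStreamer {
    // Placeholder LAN server address/port and sample rate for this sketch.
    private static final String SERVER_HOST = "192.168.1.10";
    private static final int SERVER_PORT = 50005;
    private static final int SAMPLE_RATE = 16000;

    public void stream() throws Exception {
        int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize);
        DatagramSocket socket = new DatagramSocket();
        InetAddress server = InetAddress.getByName(SERVER_HOST);
        byte[] buffer = new byte[bufSize];

        recorder.startRecording();
        try {
            while (!Thread.interrupted()) {
                // Read a chunk of raw PCM from the mic and push it to the
                // Java server on the LAN as one UDP datagram.
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) {
                    socket.send(new DatagramPacket(buffer, read, server, SERVER_PORT));
                }
            }
        } finally {
            recorder.stop();
            recorder.release();
            socket.close();
        }
    }
}
```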

Server software for handling multiple mp3/ogg streams

I'm looking for advice regarding an aspect of a project I'm working on.
I'm developing a demo Android app for a not-for-profit that specialises in services for the vision impaired. The plan is that, among other things, the app will enable users to stream this organisation's specialised audiobooks.
For the sake of demo'ing/development I need to establish some sort of server which will, pending a request from a device running the app:
directly transfer certain xml/html index files to the android phone (no streaming necessary)
stream .ogg and .mp3 audio files to the device
serve more than one client device at a time
start a stream from a specific point within an mp3/ogg file, pending a request from the phone app
I've had a look at Icecast as an mp3/ogg streaming solution, but my knowledge of servers is a bit limited (I've only ever done some basic work in Flask). Would I need to run this in tandem with something that can generically serve files / handle requests?
I'm basically just looking for a good solution/tool to implement this with. The server side doesn't need to be completely fleshed out, just fit the bill above, as my focus is on the phone-app side for now. For the sake of a demo, something straightforward and well documented would suit best.
You don't need a special server for this. Any HTTP server that supports range requests will be fine; this includes Nginx, Apache, etc. There is no need to spoon-feed media data to clients from the server. The streaming and buffering aspect is handled automatically on the client, through the TCP window size and by outright closing the connection and connecting again later as needed.
Icecast is meant for radio-style streams where everyone listening hears the same thing at roughly the same time. Since that's not an aspect you want, stick to any normal HTTP server.
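On the app side this means the stock MediaPlayer is enough; seeking into the middle of a file is typically served by an HTTP Range request under the hood. A minimal sketch, assuming a placeholder URL on any range-capable server:

```java
import android.media.MediaPlayer;

public class AudiobookPlayer {
    // Placeholder URL; any HTTP server with Range support (Nginx, Apache) works.
    private static final String BOOK_URL = "http://example.org/books/chapter1.mp3";

    public MediaPlayer playFrom(int startMs) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(BOOK_URL); // plain HTTP; no streaming server needed
        player.prepare();               // buffers enough data to begin playback
        player.seekTo(startMs);         // typically served via an HTTP Range request
        player.start();
        return player;
    }
}
```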

How to create direct link between Node.js and android libjingle_peerconnection

I am developing an Android app for voice chat, using WebRTC on the client side and node.js as the server. I have successfully been able to stream voice between two peers, using the node.js server for signalling. But this method has a huge problem: because WebRTC connects the peers directly, a peer connected directly to 200 peers will use a lot of the device's CPU and bandwidth, and I want 500 or more peers to be able to voice chat without consuming much bandwidth and device CPU. To reduce the load on CPU and bandwidth, I thought of creating a streaming link directly with the node.js server and streaming from there to the other peers, so that each peer has a single link through which it communicates with the others. I want to know if there is a node.js module capable of linking with Android's libjingle_peerconnection. I have tried node-webrtc and it does not work with recent libjingle_peerconnection.
An Android device will not be able to directly connect to hundreds of WebRTC peers; this simply requires too many resources.
You want to look at a media server, like Kurento. Kurento runs on a server and can fan out WebRTC media streams from one client to many other clients in the manner you describe. You still have to write the signalling layer specific to your application, which you can do in node.js just as in the two-client case.
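For illustration, a minimal sketch using Kurento's Java client (Kurento also provides a node.js client); the media-server address is a placeholder, and the SDP/ICE exchange still flows over your own signalling layer:

```java
import org.kurento.client.KurentoClient;
import org.kurento.client.MediaPipeline;
import org.kurento.client.WebRtcEndpoint;

public class OneToManyAudio {
    public static void main(String[] args) {
        // Placeholder media-server address for this sketch.
        KurentoClient kurento = KurentoClient.create("ws://localhost:8888/kurento");
        MediaPipeline pipeline = kurento.createMediaPipeline();

        // The speaking phone uploads a single stream to the server...
        WebRtcEndpoint presenter = new WebRtcEndpoint.Builder(pipeline).build();

        // ...and each listener gets its own endpoint fed from that stream, so
        // the phone holds one connection regardless of audience size.
        WebRtcEndpoint viewer = new WebRtcEndpoint.Builder(pipeline).build();
        presenter.connect(viewer);

        // SDP offers/answers and ICE candidates still travel through your own
        // signalling layer (e.g. the existing node.js server).
    }
}
```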

initiating a rtsp connection over cellular with android

The majority of SIM accounts are public dynamic. Most if not all cellular providers do not allow incoming connections to public dynamic IP addresses (on 3G anyway, maybe not 4G/LTE).
The issue of connecting is not one of dynamic IPs, but rather of blocked incoming ports.
So, if I wanted to stream video from an Android phone on demand (based on information gleaned from this conversation: Streaming video from Android camera to server), what would be the chain of events to properly initiate a connection?
My idea of this (roughly):
The app on the Android phone initiates, and keeps open, some sort of connection to a media server (Wowza or something).
At some point, when the server wants video from the phone, it uses the open connection to request a video stream.
The Android phone pushes an RTSP stream to the server.
Is this correct, and if so, what type of connection should I use as the permanent control connection? Also, is it possible to push RTSP, or would I have to do something else?
Thanks!
I know this is an old question, but if anybody else is searching for something similar, the following is now available:
http://developer.android.com/guide/google/gcm/index.html
This essentially allows a message to be sent from a server to an app on an Android device (it replaces C2DM which did a similar thing).
Update
Google GCM has now been replaced in turn by Google Firebase Cloud Messaging:
https://firebase.google.com/docs/cloud-messaging/
Using a cloud-based app messaging service like this, the steps would be:
Add a message subscription service to your app (e.g. Firebase)
The app registers with the cloud messaging service when it starts up
When the server wants video from the phone (as noted in the question above), the server sends a message to the app
The app opens a connection to the streaming server and starts streaming video to it (see the sketch below)
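As a rough sketch of those last two steps, assuming an FCM data message with a hypothetical "command" key (the key name and the stream-opening method are made up for this example):

```java
import com.google.firebase.messaging.FirebaseMessagingService;
import com.google.firebase.messaging.RemoteMessage;

public class WakeUpService extends FirebaseMessagingService {
    @Override
    public void onMessageReceived(RemoteMessage message) {
        // "start_stream" is a hypothetical command key chosen for this sketch.
        if ("start_stream".equals(message.getData().get("command"))) {
            openOutgoingStream();
        }
    }

    private void openOutgoingStream() {
        // Connect *out* to the media server (e.g. Wowza) and start pushing
        // the camera stream; the phone never accepts an inbound connection.
    }
}
```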
Note: a comment on this answer pointed out that this approach does not allow an incoming connection from the server to the Android phone.
This, in fact, is not how streaming from a phone typically works. The phone actually makes an 'outgoing' connection to a streaming server, to which it then streams the video. Other devices wanting to see the video then stream it from there.
There are several reasons why this is the preferred approach. A key one is that supporting a quality streaming service that will play back on most common devices, browsers, OSes, etc. requires transcoding the video into multiple bit rates (and even multiple encodings in some cases), plus packaging and serving it in the appropriate streaming format. Doing all of this on the mobile device would be very compute- and storage-intensive.

RTSP Stack implementation

I am working on getting an RTSP URL to work on Android.
I have used MediaPlayer and VideoView to play the RTSP URL, but both of them fail to play on the G1 device and the emulator. The error is PVMFFailure in PLAYER_INIT.
So I thought of implementing an RTSP client from scratch. I have developed a small application that exchanges all the messages (DESCRIBE, SETUP, PLAY, TEARDOWN). But the problem is that after PLAY, I should receive the RTP packets on the client_port given in the Transport header, yet tcpdump shows the following messages. I think the UDP packets are arriving but are not being received by the application.
11:38:50.213394 IP ew-in-f177.google.com.6970 > 192.168.1.2.6970: UDP, length 444
11:38:50.213451 IP 192.168.1.2 > ew-in-f177.google.com: ICMP 192.168.1.2 udp port 6970 unreachable, length 36
Please let me know how to solve this issue.
OpenCORE, the multimedia engine in Android, is rather stringent about sticking to various standards. Some things that may work in the wild (e.g., MP3 over RTSP) are not supported because they are not in the respective spec. Leastways, that's what we have been told.
Hence, it may be that the media you are trying to stream is slightly out of spec, or the server is slightly out of spec. You may wish to try to find some RTSP stream known to work on Android, get your player working right using it, then focus on getting it to consume your own streams.
Some handsets (I don't know whether Android falls into this category or not) include a firewall on the device. In order to receive packets you may have to perform a hole punch, i.e. send a dummy packet to the server. This creates an allow rule on the firewall running on the local device, and it also creates the appropriate NAT mapping if necessary. A sketch of this is below.
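A minimal sketch of that hole punch, assuming the RTP client_port from the capture above (6970) and whatever server address and port your SETUP exchange negotiated:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class HolePunch {
    public static DatagramSocket punch(String serverHost, int serverPort)
            throws Exception {
        // Bind to the same local port the RTP packets are expected on
        // (6970 in the capture above).
        DatagramSocket socket = new DatagramSocket(6970);
        byte[] dummy = new byte[] {0};
        // One outgoing datagram creates the firewall allow rule / NAT mapping.
        socket.send(new DatagramPacket(dummy, dummy.length,
                InetAddress.getByName(serverHost), serverPort));
        // Keep this socket open and hand it to the RTP receive loop; closing
        // it may tear down the mapping that was just created.
        return socket;
    }
}
```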
Is the device on Wi-Fi or on the carrier's network (which carrier)?
