I am curious to know if there is a way to make my Android device act as an RTSP server and stream video to any other compatible device.
I have no understanding of RTSP servers or the other protocols that need to be followed to make this happen.
Any help is appreciated.
Thanks,
SKU
There is nothing to worry about with this problem. I saw a solution here:
http://techsplurge.com/5080/get-vlc-media-player-for-android-with-unofficial/ Take a read and get your answer.
AFAIK, the architecture you should adopt is to put a remote machine in place.
On this machine, you must have a server installed (Flash Media Server, Red5, etc.). It will host the application that receives the incoming stream and outputs it.
Then, you can stream from a device (the one you called the "server") to the remotely hosted application (e.g. www.remoteserver.com:1234/myApp). To play that stream, just put the URL into a media container (http://developer.android.com/reference/android/widget/VideoView.html); Android natively supports the RTSP protocol via that VideoView container.
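For illustration, a minimal sketch of the playback side, assuming a placeholder RTSP URL like the one above:

```java
// Minimal sketch: playing an RTSP stream in a VideoView inside an Activity.
// The URL is a placeholder for whatever your hosted application exposes.
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class RtspPlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        VideoView videoView = new VideoView(this);
        setContentView(videoView);

        // Android hands RTSP URIs to the platform media stack automatically.
        videoView.setVideoURI(Uri.parse("rtsp://www.remoteserver.com:1234/myApp"));
        videoView.setMediaController(new MediaController(this));
        videoView.setOnPreparedListener(mp -> videoView.start());
    }
}
```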
Hope it helps.
I am trying to implement something similar. One suggestion I received was: when you make your device an RTSP server and someone wants to stream video from it, that is peer-to-peer streaming. In this case, it is better to involve a server in between, which acts as an encoder/decoder. When another client sends an RTSP SETUP request, it still comes to your device, but when you are ready to stream the video, you send the payload (using RTP) to the server, and the server in turn sends it to the requesting device. That way you can avoid the firewall problems that are possible on LANs or behind routers (in the case of Wi-Fi networks).
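A rough sketch of that forwarding step, with the relay host, port, and packet contents all as placeholder values:

```java
// Sketch: instead of answering the RTSP client directly with RTP, push each
// payload packet to a relay server, which forwards it to the requesting
// device. RELAY_HOST and RELAY_PORT are placeholders for your own setup.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class RtpRelaySender {
    private static final String RELAY_HOST = "relay.example.com"; // placeholder
    private static final int RELAY_PORT = 5004;                   // placeholder

    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket();
        InetAddress relay = InetAddress.getByName(RELAY_HOST);

        byte[] rtpPacket = buildDummyRtpPacket(); // stand-in for encoder output
        socket.send(new DatagramPacket(rtpPacket, rtpPacket.length, relay, RELAY_PORT));
        socket.close();
    }

    // Stand-in for whatever actually produces your RTP payloads.
    private static byte[] buildDummyRtpPacket() {
        return new byte[12]; // an empty 12-byte RTP header
    }
}
```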
I'm looking for advice regarding an aspect of a project I'm working on.
I'm developing a demo Android app for a not-for-profit that specialises in services for the vision impaired. The plan is that, among other things, the app will enable users to stream this organisation's specialised audiobooks.
For the sake of demoing/development, I need to establish some sort of server which will, pending a request from a device running the app:
directly transfer certain xml/html index files to the android phone (no streaming necessary)
stream .ogg and .mp3 audio files to the device
serve more than one client device at a time
start a stream from a specific point within an mp3/ogg file, pending a request from the phone app
I've had a look at Icecast as an mp3/ogg streaming solution, but my knowledge of servers is a bit limited (I've only ever done some basic work in Flask). Would I need to run it in tandem with something that can generically serve files and handle requests?
I'm basically just looking for a good solution/tool to implement this with. The server side doesn't need to be completely fleshed out, just to fit the bill above, as my focus for now is developing the phone-app side. For the sake of a demo, something straightforward and well documented would suit best.
You don't need a special server for this. Any HTTP server that supports range requests will be fine, which includes Nginx, Apache, etc. There is no need to spoon-feed media data to clients from the server; streaming and buffering are handled automatically on the client, through the TCP window size and by outright closing the connection and reconnecting again later as needed.
Icecast is meant for radio-style streams, where everyone listening hears the same thing at roughly the same time. Since that's not an aspect you want, stick to any normal HTTP server.
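As a minimal sketch of the client side, assuming a placeholder URL served by any range-capable HTTP server, Android's MediaPlayer handles the buffering and ranged seeks itself:

```java
// Sketch: streaming an mp3 from a plain HTTP server and starting playback
// from a specific point. MediaPlayer issues HTTP range requests under the
// hood when you seek, so no special streaming server is required.
// The URL passed in is a placeholder.
import android.media.MediaPlayer;

public class HttpAudioStreamDemo {
    public static MediaPlayer playFromOffset(String url, final int startMs) throws Exception {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(url); // e.g. "http://myserver.example/audiobooks/chapter1.mp3"
        player.setOnPreparedListener(mp -> {
            mp.seekTo(startMs);  // jump to the requested position
            mp.start();          // buffering continues over plain HTTP
        });
        player.prepareAsync();   // prepare without blocking the UI thread
        return player;
    }
}
```

Seeking before playback starts is what covers the "start a stream from a specific point" requirement above, with no server-side logic at all.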
I'm working on a project where I stream video over Wi-Fi Direct from one Android device to another using RTP over UDP.
Essentially, the first Android device hosts an RTSP server and listens for connections. Once the client Android device connects to it (over Wi-Fi Direct) and starts listening for packets, the first device begins to stream the video content.
I'm aware that RTP packet headers carry a 32-bit timestamp at bits 32-63 (bytes 4-7). But I do not know how to access the contents of a packet and pull out just that segment of its header.
Currently, I am using libvlc to play the streamed video on the device, but I would like to be able to measure the latency between the two devices, either by extracting the timestamps from the packets on arrival or some other way (maybe VLC can help?).
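For what it's worth, a minimal sketch of reading that field straight off a DatagramSocket (the port is a placeholder for whatever the SETUP exchange negotiated):

```java
// Sketch: receive RTP packets on a DatagramSocket and pull out the 32-bit
// timestamp, which lives at bytes 4-7 of the RTP header (bits 32-63).
import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class RtpTimestampReader {
    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(5004); // placeholder client_port
        byte[] buffer = new byte[2048];

        while (true) {
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            socket.receive(packet);
            byte[] data = packet.getData();

            // RTP timestamp: bytes 4-7 of the header, big-endian, unsigned.
            long timestamp = ((data[4] & 0xFFL) << 24)
                           | ((data[5] & 0xFFL) << 16)
                           | ((data[6] & 0xFFL) << 8)
                           |  (data[7] & 0xFFL);
            System.out.println("RTP timestamp: " + timestamp);
        }
    }
}
```

Note that the RTP timestamp counts ticks of the media clock (90 kHz for most video payloads), not wall-clock time, so measuring latency between two devices really needs the NTP mapping carried in RTCP sender reports.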
Edit: I am going to try to learn from the code posted here, but I still welcome any replies.
Edit 2: So I'm trying to go simpler. I've instead made a client activity that connects to the remote host and reads packets using Android's DatagramSocket API. However, my server activity doesn't start serving even after the client activity says it is connected. I'm not sure what needs to be done to let the server know there is a client ready to be served; the MediaPlayer and VLC APIs were both able to start streaming video once they connected. What am I missing? Do I need to do more than DatagramSocket.connect(ipaddress, port)?
I'm trying to develop a project like PTTDroid, I mean a Push-To-Talk or Walkie-Talkie application.
The issue is that in this app you can't use 3G to access the web, so I've decided to use a Node.js server and implement an Android client to communicate with it. I tried to do a multiplatform project using PhoneGap, but the problem is that for audio recording you can't access the buffer; you can only start, stop, or pause the recording process, not send data while capturing. So my question is: is it possible to stream audio captured in real time via the native Android functions (the AudioRecord class) to a Node.js server over Socket.IO or similar?
I discovered this project, Asimi JS, but I don't know whether someone knows a better way to do what I want.
Thank you very much for your help!
It is certainly possible, but a standard Node.js HTTP server would not be advisable, as it uses TCP. You want UDP as the transport layer for audio, since it will be faster, and the small amount of packet loss that can occur will most likely not be a problem.
To be completely honest with you, it sounds like you need to write a few demo applications on the native platforms, so do not use PhoneGap. You need the native platforms in order to access things such as the microphone and to stream over UDP.
When you have a demo working, you can go on and try another platform afterwards, but start with a simple setup instead of trying to do it all at once; if it were that easy, someone else would have done it before you.
Let me recommend a simple UDP server in whatever language you are most comfortable with (Node.js, Java, C, C++, C#). Let the UDP server receive the content and save it into a file that you can then play back on a desktop computer to verify the result. As a simple client, build one on either Android or iOS, and stream a file that you have already recorded and included in the app. When you have this setup working, you can try to capture the microphone (see the sketch below), then do a user interface, then support multiple phones, then build a server which records the conversations, then build a user database, and so on and so forth. But start with a prototype of your main feature.
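A minimal sketch of that microphone-capture step, assuming placeholder host, port, and sample-rate values and the RECORD_AUDIO permission:

```java
// Sketch: capture PCM from the microphone with AudioRecord and push each
// buffer to a UDP server. SERVER_HOST and SERVER_PORT are placeholders.
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class MicUdpStreamer {
    private static final String SERVER_HOST = "myserver.example"; // placeholder
    private static final int SERVER_PORT = 9999;                  // placeholder
    private static final int SAMPLE_RATE = 16000;                 // placeholder

    private volatile boolean running = true;

    public void stream() throws Exception {
        int bufSize = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufSize);

        DatagramSocket socket = new DatagramSocket();
        InetAddress server = InetAddress.getByName(SERVER_HOST);
        byte[] buffer = new byte[bufSize];

        recorder.startRecording();
        while (running) {
            int read = recorder.read(buffer, 0, buffer.length);
            if (read > 0) {
                // One raw PCM chunk per datagram; a real app would add framing.
                socket.send(new DatagramPacket(buffer, read, server, SERVER_PORT));
            }
        }
        recorder.stop();
        recorder.release();
        socket.close();
    }
}
```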
I've finally discovered and solved my problem (at least that's what I think). First of all, I created a server to send and receive UDP packets via DatagramSocket. After that, to achieve communication between server and client while connected over 3G, I needed a static port and IP; that's why my server couldn't connect with the client. On a data connection, the user's IP and port are not always the same, and you have to keep the same socket open at all times if you want to send and receive. On the other hand, the server has to store the address and port of the client at the moment of connection.
Thank you very much for your help ExxKA
The majority of SIM accounts are public dynamic. Most if not all cellular providers do not allow incoming connections to public dynamic IP addresses (3G anyway; maybe not 4G/LTE).
The issue of connecting is not one of dynamic IPs, but rather of blocked incoming ports.
So, if I wanted to stream video from an Android phone on demand (based on information gleaned from this conversation: Streaming video from Android camera to server), what would be the chain of events to properly initiate a connection?
My idea of this (roughly):
The app on the Android phone initiates, and keeps open, some sort of connection to a media server (Wowza or something).
At some point, when the server wants video from the phone, it uses the open connection to request a video stream.
The Android phone pushes an RTSP stream to the server.
Is this correct, and if so, what type of connection should I use as the permanent control connection? Also, is it possible to push RTSP, or would I have to do something else?
Thanks!
I know this is an old question, but if anybody else is searching for something similar, the following is now available:
http://developer.android.com/guide/google/gcm/index.html
This essentially allows a message to be sent from a server to an app on an Android device (it replaces C2DM which did a similar thing).
Update
Google GCM has now in turn been replaced by Google Firebase Cloud Messaging:
https://firebase.google.com/docs/cloud-messaging/
Using a cloud-based app messaging service like this, the steps would be:
Add a message subscription service to your app (e.g. Firebase)
The App registers with the cloud messaging service when it starts up
When the server wants video from the phone (as noted in the questions above) the server sends a message to the app
The app opens a connection to the streaming server and starts to stream video to it (see the sketch below).
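As a hedged sketch of steps 3 and 4, a FirebaseMessagingService subclass can react to the push; the "action" key and its value are assumptions your own server would define, not part of the FCM API:

```java
// Sketch of steps 3-4 above: react to a "start streaming" push from the
// server. Only the FCM callback itself is standard; the data keys are
// placeholders.
import android.util.Log;
import com.google.firebase.messaging.FirebaseMessagingService;
import com.google.firebase.messaging.RemoteMessage;

public class PushListenerService extends FirebaseMessagingService {
    @Override
    public void onMessageReceived(RemoteMessage message) {
        // Data payloads arrive as a simple key/value map.
        String action = message.getData().get("action"); // placeholder key
        if ("start_stream".equals(action)) {
            // Here the app would open its *outgoing* connection to the
            // streaming server and start pushing video (e.g. via a Service).
            Log.i("PushListener", "Server requested a stream; starting upload");
        }
    }
}
```

The service also has to be declared in the app manifest with the com.google.firebase.MESSAGING_EVENT intent filter so FCM can deliver messages to it.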
Note: there is a comment below about how this approach does not allow an incoming connection from the server to the Android phone.
This, in fact, is not how streaming from a phone typically works. The phone actually makes an 'outgoing' connection to a streaming server, which it then streams the video to. Other devices wanting to see the video then stream it from there.
There are several reasons why this is the preferred approach. One of the key ones is that supporting a quality streaming service that will play back on most common devices, browsers, OSs, etc. requires transcoding the video into multiple bit rates (and in some cases even multiple encodings), then packaging and serving it in the appropriate streaming packaging format. Doing all this on the mobile device would be very compute and storage intensive.
I am working on getting an RTSP URL to play on Android.
I have used MediaPlayer and VideoView to play the RTSP URL, but both of them fail to play on a G1 device and in the emulator. The error is PVMFFailure in PLAYER_INIT.
So I thought of implementing an RTSP client from scratch. I have developed a small application that exchanges all the messages (DESCRIBE, SETUP, PLAY, TEARDOWN). The problem is that after PLAY, I should receive the RTP packets at the client_port mentioned in the Transport header, but tcpdump shows the following messages. I think the UDP packets are arriving but are not being received by the application.
11:38:50.213394 IP ew-in-f177.google.com.6970 > 192.168.1.2.6970: UDP, length 444
11:38:50.213451 IP 192.168.1.2 > ew-in-f177.google.com: ICMP 192.168.1.2 udp port 6970 unreachable, length 36
Please let me know how to solve this issue.
OpenCORE, the multimedia engine in Android, is rather stringent about sticking to various standards. Some things that may work in the wild (e.g., MP3 over RTSP) are not supported because they are not in the respective spec. Leastways, that's what we have been told.
Hence, it may be that the media you are trying to stream is slightly out of spec, or the server is slightly out of spec. You may wish to try to find some RTSP stream known to work on Android, get your player working right using it, then focus on getting it to consume your own streams.
Some handsets (I don't know whether Android falls into this category or not) include a firewall on the device. In order to receive packets, you may have to perform a hole punch, i.e. send a dummy packet to the server. This will create an allow rule on a firewall running on the local device, and it will also create the appropriate NAT mapping if necessary.
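A minimal sketch of that hole punch, using the port from the dump above as an example:

```java
// Sketch: send a throwaway datagram from the same local port you expect RTP
// on (6970 in the tcpdump above), so the device firewall / NAT creates a
// mapping that lets the server's packets back in.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpHolePunch {
    public static DatagramSocket punch(String serverHost, int serverPort,
                                       int localPort) throws Exception {
        DatagramSocket socket = new DatagramSocket(localPort); // e.g. 6970
        InetAddress server = InetAddress.getByName(serverHost);

        byte[] dummy = new byte[] {0}; // content is irrelevant
        socket.send(new DatagramPacket(dummy, dummy.length, server, serverPort));

        // Keep this socket open and receive the RTP packets on it.
        return socket;
    }
}
```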
Is the device on Wi-Fi or on the carrier's network (and which carrier)?